Effective assessment feedback needs motive, opportunity and means

Posted by John Kleeman

Assessment feedback, whether it relates to a question, a topic or an entire assessment, has tremendous value – but only if learners make practical use of it! I’d like to share some solid approaches to feedback and explain how a couple of Questionmark reports can help you put them into practice.

From Andrew Morgan (quoted in Valerie Shute’s excellent ETS research report on feedback), we get the concept that to be effective and useful, feedback needs the following:

  • Motive – the learner wants to follow the feedback
  • Opportunity – the learner has it in time to use it: it’s not given too late for action
  • Means – the learner is able to use it: the feedback is actionable

Another good way to think about feedback comes from Dr Steve Draper of the University of Glasgow School of Psychology in his presentation at eAssessment Scotland in 2012:

“There is no point in giving feedback to a learner unless the learner acts on it: does something concrete and differently because of it”.

Feedback needs to be processed to be useful

Feedback that the learner doesn’t read isn’t valuable.

Feedback that the learner reads but doesn’t process isn’t valuable.

You must get the learner to evaluate the feedback and adjust their thinking as a result, i.e. process the feedback and do something with it.

I’ve been thinking about how you can apply these concepts when presenting an assessment score as feedback, using the Questionmark Coaching report.

Most learners are motivated to use the score they achieve in a test as feedback; they want to get a higher score next time. And if they can take a test again, they have the opportunity to use the feedback. But a score on its own is just a number. How can you help learners use their scores as catalysts for action?

Clearly, a score is more valuable if it can be compared to something, and there are three obvious comparisons (illustrated in the short sketch after this list):

  • Ipsative, comparing score to previous attempts: have you done better than last time?
  • Criterion referenced, comparing score to a benchmark: have you reached the desired standard?
  • Normative, comparing score to how others do: how do you compare to your peers?
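If it helps to see the idea concretely, here is a minimal, hypothetical Python sketch (not Questionmark code, and the function name and data are invented) showing how the three comparisons might be derived from a learner’s score, their earlier attempts, a benchmark and a set of peer scores:

```python
# Illustrative sketch only: the three score comparisons described above,
# computed from made-up numbers.

from statistics import mean

def describe_score(score, previous_scores, benchmark, peer_scores):
    """Return simple feedback lines for ipsative, criterion-referenced,
    and normative comparisons of a single assessment score."""
    lines = []

    # Ipsative: compare against the learner's own previous attempts.
    if previous_scores:
        best_so_far = max(previous_scores)
        lines.append(f"Previous best {best_so_far}%: "
                     + ("improved" if score > best_so_far else "not yet improved"))

    # Criterion-referenced: compare against a fixed benchmark.
    lines.append(f"Benchmark {benchmark}%: "
                 + ("reached" if score >= benchmark else "not reached"))

    # Normative: compare against the average of a peer group.
    peer_avg = mean(peer_scores)
    lines.append(f"Peer average {peer_avg:.0f}%: "
                 + ("above" if score > peer_avg else "at or below"))

    return lines

# Example with invented data.
for line in describe_score(score=72,
                           previous_scores=[55, 64],
                           benchmark=70,
                           peer_scores=[60, 75, 80, 68]):
    print(line)
```

The point of the sketch is simply that each comparison turns a bare number into something the learner can act on: beat your previous best, reach the benchmark, or catch up with your peers.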

Questionmark’s Transcript report lets learners view all their results and see how they improve or change over time. And Questionmark’s Coaching report includes the concept of benchmarks – you can set a benchmark for the assessment and for each topic. What you may not know is that the Coaching report also allows you to compare a score against the average of a group of other test-takers. You can define the group by a variety of demographics and then display how the participant’s score compares with that group’s average. This screenshot shows how to set this up:

Setting a comparison in the Coaching report

Giving learners information about how they compare to others can be a powerful motivator; I encourage you to explore this capability of the Questionmark Coaching report.

For more on general concepts of processing feedback, see Steve Draper’s interesting page here. Questionmark users can see more information on the Coaching report comparison options on the Questionmark support site. And if you want to hear more from me about assessment feedback, I’ll be speaking about it at the Questionmark user conference in March.

One Response to “Effective assessment feedback needs motive, opportunity and means”

  1. Dick Bacon says:

    Another powerful motivator for action is feedback based upon a careful analysis of what led to lost marks, presented immediately along with a further try at the question. This is probably discipline- and situation-dependent. It is particularly, but I am sure not exclusively, appropriate for numerical problems in the sciences and where values can be randomised. By using such schemes in low-stakes assessments, collaboration at the problem level can be encouraged and several tries can be offered (with reduced marks) if required.
