Evidence that topic feedback correlates with improved learning

Posted by John Kleeman

It seems obvious that topic feedback helps learners, but it’s great to see some evidence!

Here is a summary of a paper, “Student Engagement with Topic-based Facilitative Feedback on e-Assessments” (see here for full paper) by John Dermo and Liz Carpenter of the University of Bradford, presented at the 2013 International Computer Assisted Assessment conference.

Dermo and Carpenter delivered a formative assessment in Questionmark Perception over a period of three years to 300 students on an undergraduate biology module. All learners were required to take the assessment once and were allowed to re-take it as many times as they wanted; most took it several times. The assessment didn’t give question-level feedback, but it gave topic feedback on the 11 main topic areas covered by the module.

The intention was for students to use the topic feedback in their revision and study to diagnose weaknesses in their learning: the comments provided could direct their further study. Students were encouraged to incorporate this feedback into their study planners and to take the test repeatedly, the expectation being that students who engage with their feedback and are “mindful” of their learning will benefit most.

Here is an example end-of-test feedback screen.

Assessment Feedback screen showing topic feedback

As you can see, learners achieved “Distinction”, “Merit”, “Pass” or “Fail” for each topic. They were also given a topic score and some guidance on how to improve. The authors then correlated time spent on the tests, questions answered, and the distribution of test-taking over time with each student’s score on the end-of-module summative exam. They found a correlation between taking the test and doing well on the exam. For example, the correlation coefficient between the number of attempts on the formative assessment and the score on the summative exam was 0.29 (Spearman rank-order correlation, p < 0.01).
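To make the statistic concrete, here is a minimal pure-Python sketch of the Spearman rank-order correlation the authors report: rank both variables, then take the Pearson correlation of the ranks. The attempt counts and exam scores below are invented for illustration, not the paper's data.

```python
# Spearman rank-order correlation, implemented from scratch for illustration.
# The attempts/exam_scores data below is hypothetical, not from the study.

def average_ranks(values):
    """Rank values from 1..n, giving tied values the average of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: formative-test attempts vs. summative exam score.
attempts = [1, 6, 3, 9, 2, 5, 4, 7]
exam_scores = [48, 71, 55, 80, 60, 66, 52, 74]
rho = spearman(attempts, exam_scores)
```

A rho of 0.29 as in the paper indicates a modest but real positive association: students who attempted the formative test more tended to rank higher on the exam.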

You can see some of their results below, with learners divided into top, middle and bottom scoring groups on the summative exam. The top scoring group answered more questions, spent more time on the test, and spread their effort over a longer period of time.

Clustered bar charts showing differences between top middle and bottom scoring groups on the dependent variables time, attempts, and distribution
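The top/middle/bottom split described above is a simple tercile grouping on exam score; a sketch of how it might be done (with invented student IDs and scores) is:

```python
# Tercile grouping on summative exam score, as in the charted analysis.
# Student IDs and scores are hypothetical.

def tercile_groups(students):
    """Split (student_id, exam_score) pairs into bottom, middle and top thirds."""
    ranked = sorted(students, key=lambda s: s[1])
    n = len(ranked)
    cut1, cut2 = n // 3, 2 * n // 3
    return {
        "bottom": ranked[:cut1],
        "middle": ranked[cut1:cut2],
        "top": ranked[cut2:],
    }

students = [("s1", 42), ("s2", 77), ("s3", 58), ("s4", 91), ("s5", 49), ("s6", 66)]
groups = tercile_groups(students)
```

Each group's mean time, attempts and distribution can then be compared, as the bar charts do.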

The researchers also surveyed the learners, 82% of whom agreed or strongly agreed that “I found the comments and feedback useful”. Many students also drew attention to the fact that the assessment and feedback let them focus their revision time on the topics that needed most attention, for example one student said:

“It showed clearly areas for further improvement and where more work was needed.”

There could be other reasons why learners who spent time on the formative assessments did well on the summative exam: they might, for instance, have been more diligent in other ways too. So this research demonstrates correlation, not cause and effect. However, it does provide evidence that topic feedback helps improve learning by telling learners which areas are weak and need more work. This seems likely to apply to the world of work as well as to higher education.

How Topic Feedback can give Compliance Assessments Business Value

Posted by John Kleeman

If you need to prove compliance with regulatory requirements, should your training and assessments focus on compliance needs? Or should you train and assess primarily to improve skills that benefit your business, and meet compliance needs along the way?

I recently interviewed Frederick Stroebel and Mark Julius from a large South African financial services company, Sanlam Personal Finance, for the SAP blog. Sanlam have used Questionmark Perception for more than a decade and combine it with SAP HR and Learning software. You can see the full interview on the SAP site. Their view was that compliance and business-related needs must be combined:

“We deliver assessments both for compliance and e-learning. It’s a combination of business requirements and legislation. We predominantly started it off thinking that the purpose would be for business reasons, but as soon as the business realized the value for regulatory compliance, we received more and more requests for that purpose.”

One of the key ways in which Sanlam use assessment results to improve learning is topic feedback, which identifies topics that may be weak points for the participant.

“We set up our assessments so that at the end, the computer gives the participant a summary of the topics and what the score was per topic, so the participant can immediately see where they need further facilitation as well.”

“It is also valuable in providing feedback to the learner, where a facilitator sits with the learner. The facilitator can immediately determine from the coaching report where exactly the learner needs to go for re-training. We have done extremely well in terms of increasing our overall pass mark and per topic scores by using topic feedback. For example, for brokers and advisers, there’s an initial assessment they do, and because questions are in different topics, once they’ve taken the assessment, the facilitator can immediately see which type of training that person must go on.”

To illustrate, here is part of a Coaching Report that shows a participant has scored 80% in one topic (well above what is needed for competency) and 58% in another (slightly above what is needed).

Questionmark Perception coaching report
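A per-topic competency check like the one in the coaching report above can be sketched as follows. The 55% competency threshold, the 15-point "well above" margin, and the topic names are all assumptions for illustration, not Sanlam's or Questionmark's actual values.

```python
# Hypothetical sketch of per-topic competency feedback, as in a coaching report.
# The thresholds and topic names are assumed, not real configuration values.

COMPETENCY = 55   # assumed per-topic competency pass mark (%)
WELL_ABOVE = 15   # assumed margin for "well above competency"

def topic_status(score):
    """Classify a percentage score relative to the competency threshold."""
    if score >= COMPETENCY + WELL_ABOVE:
        return "well above competency"
    if score >= COMPETENCY:
        return "above competency"
    return "needs re-training"

def coaching_report(topic_scores):
    """topic_scores: dict of topic name -> percentage score."""
    return {topic: (score, topic_status(score)) for topic, score in topic_scores.items()}

report = coaching_report({"Product knowledge": 80, "Regulation": 58})
```

With these assumed cutoffs, the 80% topic reads as well above competency and the 58% topic as only slightly above it, matching the report shown; a facilitator would direct re-training toward any topic flagged below the threshold.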

Topic feedback is a great way of getting value from assessments and I hope Sanlam’s experience and insight can help you.