Join a UK briefing on academic assessment

Posted by Chloe Mendonca

There are just a few weeks to go until the UK Academic Briefings at the University of Bradford and University of Southampton.

These complimentary morning briefings bring people and ideas together.

For those working in academic-related roles, here is an opportunity to get up to date with recent and future developments in online testing.

The agenda will cover ways to:

  • enhance student learning through e-assessment
  • overcome key obstacles surrounding e-assessment in higher education
  • mitigate the threat of cheating and fraud within online exams

If you’re new to online assessment or are thinking of implementing it within your institution, attend a briefing to see Questionmark technologies in action
and speak with assessment experts about potential applications for online surveys, quizzes, tests and exams.

Register for the date and location that suits you best.

Where’s the evidence for assessment?

Posted by John Kleeman

I’m always on the lookout for hard evidence that assessment improves learning outcomes, and I’m indebted to Denise Whitelock, Lester Gilbert and Veronica Gale for alerting me to some powerful evidence in their research report at the 2011 CAA Conference. This Australian study looked at more than 1500 students taking part in an applied maths course and showed that taking formative quizzes during a course improved learning outcomes.

The study was conducted by two economics lecturers, Dr Simon Angus and Judith Watson, and is titled “Does regular online testing enhance student learning in the numerical sciences? Robust evidence from a large data set”. It was published in 2009 in the peer-reviewed British Journal of Educational Technology, Vol. 40, No. 2, pp. 255–272.

Angus and Watson introduced a series of four online, formative quizzes into the course and wanted to determine whether students who took the quizzes learned more and did better on the final exam than those who didn’t. What makes the study interesting is their use of ordinary least squares (OLS) regression. This statistical technique allowed them to estimate the effects of several factors at once, isolating the effect of taking the quizzes from the students’ previous mathematical experience, their gender and their general level of effort, and so to determine which factors most influenced the final exam score.
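To make the idea concrete, here is a minimal sketch of that kind of multi-factor OLS analysis. The data below is entirely synthetic and the effect sizes are assumed for illustration only; it is not the authors’ dataset or code, just a demonstration of how regression can separate the quiz effect from other influences on the final exam score.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1500

# Hypothetical predictors (illustrative values, not the study's data):
midterm = rng.normal(60, 12, n)        # midterm exam score
took_quizzes = rng.integers(0, 2, n)   # 1 if the student took the online quizzes
prior_maths = rng.integers(0, 2, n)    # 1 if strong prior maths background
effort = rng.normal(10, 3, n)          # proxy for general level of effort

# Simulate final exam scores with assumed "true" effects plus noise
final = (0.6 * midterm + 5.0 * took_quizzes + 4.0 * prior_maths
         + 0.8 * effort + rng.normal(0, 8, n))

# Ordinary least squares: fit all factors simultaneously, so each
# coefficient is the estimated effect of that factor holding the
# others constant
X = np.column_stack([np.ones(n), midterm, took_quizzes, prior_maths, effort])
beta, *_ = np.linalg.lstsq(X, final, rcond=None)

for name, b in zip(["intercept", "midterm", "took_quizzes",
                    "prior_maths", "effort"], beta):
    print(f"{name:>13}: {b:6.2f}")
```

With a sample of this size, the estimated coefficient on `took_quizzes` recovers the assumed effect quite closely, which is the point of the technique: the quiz effect is estimated net of ability and effort rather than confounded with them.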

You can see a summary of their findings in the graph below, which shows the estimated coefficients for four of the main factors, all of which had a statistical significance of p < 0.01.

[Graph: Factors associated with final exam score]

You can see from this graph that the biggest factor associated with final exam success was how well students had done in the midterm exam, i.e. how well they were doing in the course generally. But students who took the four online quizzes learned from them and did significantly better on the final exam. The estimated effect of taking the quizzes was broadly the same size as the effect of prior maths education: a sizeable and statistically significant contribution.

We know intuitively that formative quizzes help learning, but it’s good to see statistical evidence that – to quote the authors – “exposure to a regular (low mark) online quiz instrument has a significant and positive effect on student learning as measured by an end of semester examination”.