Managing a complex testing environment with a QMWISe-based dashboard

Posted by Chloe Mendonca

Earlier this week I spoke with Gerard Folkerts of Wageningen University, who will be delivering a presentation at the upcoming Questionmark 2013 European Users Conference in Barcelona, November 10 – 12.

In their case study presentation, Gerard and his co-presenter, Gerrit Heida, will share their experience using the QMWISe API to create a dashboard for their Questionmark Perception installation.

Could you tell me about Wageningen University and how you use Questionmark assessments?

Wageningen University is the only university in the Netherlands to focus specifically on the themes of ‘healthy food and the living environment’ — and for the eighth year in a row we’ve been awarded the title of “Best University in the Netherlands”.

We use Questionmark for summative and formative assessments. We run about 10 to 20 summative assessments each period, with up to 500 participants taking tests simultaneously. Formative assessments are also available to students. In addition to the web-based assessments, we have a setup for running tests in a secure environment when they require the use of Microsoft Office, SAS, SPSS and similar applications.

Could you tell me a little more about this secure setup?

The secure test environment turns a regular PC room into a test center. Each computer inside the room is cut off from communication with the outside world. With our in-house developed tools we are able to deliver the necessary documents for the assessment to each participant. Such an assessment could be writing a report based on data analysis within SPSS. At the end of the assessment, all of the participants' work is stored in a central location.
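The in-house tools themselves aren't public, but the general pattern Gerard describes (pushing the assessment documents out to each locked-down workstation, then sweeping the finished work back to one central location) is easy to sketch. Everything below, from hostnames to mount points and paths, is a hypothetical illustration rather than Wageningen's actual tooling:

    import shutil
    from pathlib import Path

    # Hypothetical room layout: thirty PCs whose exam shares are mounted locally.
    # None of these names or paths come from Wageningen's setup.
    WORKSTATIONS = [f"pcroom1-{n:02d}" for n in range(1, 31)]
    MATERIALS = Path("/srv/assessments/spss_report/materials")  # documents to hand out
    RESULTS = Path("/srv/assessments/spss_report/results")      # central collection point

    def distribute_materials():
        """Copy the assessment documents (data sets, instructions) to every workstation."""
        for host in WORKSTATIONS:
            target = Path(f"/mnt/{host}/exam")  # assumed mount point per locked-down PC
            target.mkdir(parents=True, exist_ok=True)
            for item in MATERIALS.iterdir():
                if item.is_file():
                    shutil.copy(item, target / item.name)

    def collect_work():
        """After the assessment, sweep each participant's work into the central share."""
        RESULTS.mkdir(parents=True, exist_ok=True)
        for host in WORKSTATIONS:
            source = Path(f"/mnt/{host}/exam/work")
            if source.exists():
                shutil.copytree(source, RESULTS / host, dirs_exist_ok=True)

    if __name__ == "__main__":
        distribute_materials()
        # ... the assessment runs in the locked-down room ...
        collect_work()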

How do you effectively manage numerous participants taking simultaneous tests?

First of all, we have set up a number of procedures to ensure a stable testing environment. For this we have separated the environments needed for question development, formative testing and summative testing. Multiple administrators work in the same environment and can schedule assessments to start simultaneously. With the Questionmark Web Integration Services environment (QMWISe), we created a dashboard that gives a real-time view of the number of assessments scheduled and the number of participants enrolled in them.

How has this helped to control your environment?

Using the standard QMWISe API makes it possible to build a dashboard which shows this data, and with the information already there, it is easy to do a basic health check on the assessments. For technical support it is essential to predict how much load to expect on your QMP farm, so the dashboard has helped us get our QMP environment under control.
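As a rough illustration of what such a dashboard involves: QMWISe is a SOAP web service, so a dashboard can poll it periodically and render the counts. The sketch below uses Python with the zeep SOAP client. The WSDL URL, operation names and field names are assumptions made for illustration, not the documented QMWISe interface; consult the QMWISe documentation for the operations your Perception version actually exposes. Real QMWISe calls also carry a security header, which is omitted here for brevity:

    from zeep import Client

    # Hypothetical WSDL location; adjust for your own Perception installation.
    WSDL = "https://perception.example.edu/QMWISe/QMWISe.asmx?WSDL"

    client = Client(WSDL)

    def dashboard_snapshot():
        """Collect what a scheduling dashboard needs: the assessments that are
        scheduled and how many participants are enrolled in each one.
        Operation and field names below are assumptions, not the documented
        QMWISe interface."""
        rows = []
        for schedule in client.service.GetScheduleList():  # hypothetical operation
            participants = client.service.GetParticipantListBySchedule(  # hypothetical
                Schedule_ID=schedule.Schedule_ID
            )
            rows.append({
                "assessment": schedule.Assessment_ID,
                "starts": schedule.Schedule_Starts,
                "enrolled": len(participants),
            })
        return rows

    for row in dashboard_snapshot():
        print(f"{row['starts']}  {row['assessment']}  {row['enrolled']} participants")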

How do you hope that your session will benefit others?

I hope that my session will give some insight into the possibilities of QMWISe. Once you understand how it works, it is not that hard to add custom modules to the Questionmark Perception environment. I think the API is undervalued.

What are you looking forward to most about the Users Conference?

Meeting with other Questionmark users and learning about the roadmap of Questionmark Perception. And of course a visit to Barcelona…

Check out the conference program and click here to register. We hope to see you there!

Evidence that topic feedback correlates with improved learning

Posted by John Kleeman

It seems obvious that topic feedback helps learners, but it’s great to see some evidence!

Here is a summary of a paper, “Student Engagement with Topic-based Facilitative Feedback on e-Assessments” (see here for the full paper), by John Dermo and Liz Carpenter of the University of Bradford, presented at the 2013 International Computer Assisted Assessment conference.

Dermo and Carpenter delivered a formative assessment in Questionmark Perception over a period of three years to 300 students on an undergraduate biology module. All learners were required to take the assessment once and were allowed to re-take it as many times as they wanted; most took the test several times. The assessment didn't give question-level feedback, but it gave topic feedback on the 11 main topic areas covered by the module.

The intention was for students to use the topic feedback as part of their revision and study to diagnose weaknesses in their learning: the comments provided might direct students in their learning. The students were encouraged to incorporate this feedback into their study planners and to take the test repeatedly, the expectation being that students who engage with their feedback and are “mindful” of their learning will benefit most.

Here is an example end-of-test feedback screen.

Assessment Feedback screen showing topic feedback

As you can see, learners achieved “Distinction”, “Merit”, “Pass” or “Fail” for each topic. They were also given a topic score and some guidance on how to improve. The authors then correlated time spent on the tests, questions answered and the distribution of test-taking over time with each student's score on the end-of-module summative exam. They found a correlation between taking the test and doing well on the exam. For example, the correlation coefficient between the number of attempts on the formative assessment and the score on the summative exam was 0.29 (Spearman rank-order correlation, p < 0.01).
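If you want to run the same kind of analysis on your own assessment data, a Spearman rank-order correlation takes only a few lines. The numbers below are made-up illustrative data, not the Bradford dataset:

    from scipy.stats import spearmanr

    # Made-up illustrative data, one entry per student (NOT the study's dataset).
    formative_attempts = [1, 2, 2, 3, 4, 5, 5, 6, 8, 10]    # attempts on the formative test
    summative_scores = [52, 48, 61, 58, 65, 60, 72, 70, 68, 81]  # end-of-module exam %

    # Spearman correlates the *ranks* of the two variables, so it tolerates
    # non-linear but monotonic relationships.
    rho, p_value = spearmanr(formative_attempts, summative_scores)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")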

You can see some of their results below, with learners divided into a top, middle and bottom scoring group on the summative exam. This shows that the top scoring group answered more questions, spent more time on the test, and spread the effort over a longer period of time.

Clustered bar charts showing differences between top middle and bottom scoring groups on the dependent variables time, attempts, and distribution

The researchers also surveyed the learners, 82% of whom agreed or strongly agreed with the statement “I found the comments and feedback useful”. Many students also drew attention to the fact that the assessment and feedback let them focus their revision time on the topics that needed the most attention. For example, one student said:

“It showed clearly areas for further improvement and where more work was needed.”

There could be other reasons why learners who spent time on the formative assessments did well on the summative exam: they might, for instance, have been more diligent in other ways. So this research offers proof of correlation, not proof of cause and effect. However, it does provide evidence that topic feedback is useful and valuable in improving learning, by telling learners which areas they are weak in and need to work on more. This seems likely to apply to the world of work as well as to higher education.

Assessment types and their uses: summative assessments

Posted by Julie Delazyn

To use assessments effectively, it’s important to understand their context and uses within the learning process.

Over the past few weeks I have written about diagnostic assessments, formative assessments and needs assessments. My last post in this series is about summative assessments.

Typical uses:

  • Measuring or certifying knowledge, skills and aptitudes (KSAs)
  • Providing a quantitative grade and making a judgment about a person’s knowledge, skills and achievement
  • Determining whether the examinee meets the predetermined standard for specialized expertise
  • Determining a participant’s level of performance at a particular time

Types:

  • Licensing exams
  • Certification tests
  • Pre-employment tests
  • Academic entrance exams
  • Post-course tests
  • Exams during study

Stakes:
Medium, High


Example:

Summative assessments are easy to explain: they sum up the knowledge or skills of the person taking the test. This type of assessment provides a quantitative grade and makes a judgment about a person's knowledge, skills and achievement. A typical example would be a certification exam that a technician must pass in order to install or repair a particular piece of machinery. In passing the certification exam, a candidate proves his or her understanding of the machinery.

For more details about assessments and their uses, check out the white paper Assessments Through the Learning Process. You can download it free here after logging in. Another good source for testing and assessment terms is our glossary.