Timing is Everything: Using psychology research to make your assessments more effective

Posted by Julie Delazyn

Over the past year or so, John Kleeman has been sharing new evidence from cognitive psychology indicating that when it comes to making your assessments more effective, timing is crucial.

One of the strongest research findings is that learning is far more effective if it is spaced out over time instead of being delivered all at once. There is evidence that giving regular quizzes helps people by:

  • encouraging them to space out their study in preparation for the quizzes
  • helping them learn simply by taking the quizzes and seeing the feedback

Psychology research also shows that taking a quiz or test directly helps you retain information: The more you practice retrieving something, the more likely you are to retain it for the long term.

This SlideShare presentation will help you learn about psychology research and how you can apply it to improve your use of assessments.

Using QR Codes – Start to Finish

Posted By Doug Peterson

QR codes. They’re everywhere these days!

Questionmark staff have already written some great blogs on QR codes – Using QR Codes to Direct Smartphone Users to Assessments (which includes a link to a fantastic PowerPoint slideshow on using QR codes) and QR Codes for Surveys: A Perfect Fit, so I don’t need to repeat how useful they are in this blog. What I will do is show you just how easy it is to set up a QR code that launches an assessment, and how easy it is for a participant to launch an assessment from a mobile device using a QR code.

In the video below I use Authoring Manager to generate a URL that launches a simple quiz I created last December. It’s an open access quiz, and I’ve added “&name=holiday” to the end of the URL string so that the user doesn’t have to provide a name at the Open Access portal: they’ll be taken straight into the quiz.
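If you’d rather build such a URL programmatically than edit the string by hand, the standard library can append the parameter safely. This is a minimal sketch; the base URL below is illustrative only (your real launch URL comes from Authoring Manager), and the `add_name_param` helper is my own, not part of any Questionmark tool.

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def add_name_param(url: str, name: str) -> str:
    """Append a name parameter so open-access participants skip the name prompt."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query, keep_blank_values=True)
    query.append(("name", name))
    return urlunsplit(parts._replace(query=urlencode(query)))

# Hypothetical open-access launch URL for illustration
base = "https://example.questionmark.com/delivery/open.php?session=1234"
print(add_name_param(base, "holiday"))
# https://example.questionmark.com/delivery/open.php?session=1234&name=holiday
```

Using a URL parser rather than string concatenation keeps the query string well-formed whether or not the URL already has parameters.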

Watch the video to learn how to turn the URL into a QR code graphic, and what the experience is like for a user accessing the quiz via the QR code!

It’s one thing to train, another to prove training

Posted by John Kleeman

I recently published an interview in the SAP Community Network with Simone Buchwald, who was the product manager of SAP’s Learning Solution (LMS) when it first came out and who now works in the SAP ecosystem as a consultant at EPI-USE. She shared some information on assessments within SAP and trends in compliance that I thought readers of this blog might enjoy. Here is an extract from the interview:


Why do people use assessment in the SAP Learning Solution (LSO)?

Assessments play a vital role in LSO as they broadly do in every LMS, especially as a big focus for LSO use is compliance management. Assessments are a crucial part of a compliance solution, because organizations have to test and document that people understand what they are supposed to learn.

I remember when I was at SAP and we made contact with Questionmark – we immediately decided that we would not extend the LSO test authoring system much, but would instead suggest Questionmark as the assessment partner tool for LSO because of its strong capabilities.


What is the best way to document training for compliance purposes? Is it okay just to record that someone has been through training, or do you need to assess to check understanding?

I see a trend in both regulated and non-regulated environments: it is not enough just to track the completion of training; you also need to validate understanding of the learning.

So I think it is important to check understanding with assessments, both for tracking purposes and in case an auditor wants to see the records. It is one thing to do something, and another to prove it. Nowadays this even extends beyond the core regulatory use cases to courses where the company wants to track both completion and understanding, for example in sales training, to prove the value of the learning to the company.


What trends do you see in compliance; how are things changing?

A lot of the compliance processes that in the past were mandatory only in compliance-driven businesses, like pharmaceuticals, chemicals and mining, are now expanding into other industries and customers. For instance, we see needs for mandatory training on certain business processes, sometimes imposed by regulation but sometimes imposed by the business itself. For example, when organizations roll out equal opportunities training, the company may decide to apply the same process as for compliance-required training: checking who has completed it and who has done the assessments and been validated. Often this is driven by internal business needs, not external regulators.

People are seeing the advantage of conducting learning and assessments not just when the regulator requires it, but for their own business purposes.


You can see the full interview here.

Alignment, impact and measurement with the A-model

Posted by Julie Delazyn

It’s essential for learning and other important change initiatives, including assessment and evaluation, to align with an organization’s strategic goals. That means organizations and individuals need to clarify the goals, objectives and human performance issues of their work, and design systematic assessment programs to evaluate progress.

The A-model, developed by Dr. Bruce C. Aaron, offers a framework for helping organizations focus on their goals, promote continuous learning and adjust performance as needed. This framework, or map, holds the many elements of human performance in place, from the original business problem or issue through program design and evaluation.

We have many resources for helping you understand the A-model and apply it to the specific needs of your organization:

  • White Paper: Alignment, Impact and Measurement with the A-model (free with registration)
  • A podcast with Bruce Aaron
  • Other perspectives on the A-model
  • This SlideShare presentation!

Improving course evaluations and boosting participation

Posted by Jim Farrell

Course and session evaluations are popular assessments for improving course and instructor quality at conferences and in learning programs all over the world.

One of our major goals over the past few years has been to make it easier to create and deliver course evaluations – and to help organizations glean more meaningful, actionable results from them.

Back in 2010, when we added the ability to author course evaluation surveys in Questionmark Live, we included question libraries to draw from in creating surveys. These libraries cover four topics: Demographics, Instructor, Course Materials and Facilities; you can either write your own questions or choose some from the libraries.

More recently, we’ve been exploring the use of QR codes to increase course evaluation response rates by taking participants directly to online surveys via their mobile devices. Go here and here for more details about the benefits of using QR codes.

What about the results of course evaluations? We now have four course evaluation reports in Questionmark Analytics: Course Summary, Instructor Summary, Class Summary and Class Detail. These reports allow you to drill into progressively more detail about your course evaluation results: you can start at the course level and work your way down to an instructor/question level of detail. Each report also has visual cues that make performance obvious at a glance.

In the example below, the Course Summary report compares evaluation results across courses. It is most useful for managers and supervisors who want to compare different courses within an organization.

If you are a customer looking to improve your course evaluations, you can click here to read our Course Evaluation Best Practices Guide. Anyone who hasn’t used Questionmark Live can sign up for a free trial via our Tryouts and Downloads page.

How you can improve assessment accessibility

Posted by Noel Thethy

Over the next few weeks I’ll be releasing several “How To” blog posts that I hope will provide some insight into the features and functionality available to Questionmark users.

In particular I’ll be looking at:

•    How to ensure your questions are as accessible as possible
•    How to tweak Questionmark’s accessibility features to suit your specific needs

This series will consist of several video demonstrations and some explanations of basic concepts.

I’ll be covering topics like:

•    Using media and images in a question in an accessible way
•    Ensuring question text is appropriately accessible
•    Customizing the font size changer
•    Customizing the contrast changer

Before we go on this journey I’d like to remind you of some other posts about accessibility that have already appeared on this blog. They include:

•    Assessment Accessibility: A View from the Inside
•    Assessment Accessibility in Questionmark Perception Version 5

I hope you will join me as I take a look at making assessments accessible.