Assessment good practice: 6 tips from Santa Fe

Posted by John Kleeman

Last week’s Questionmark Conference in beautiful Santa Fe was a great opportunity to meet and learn from Questionmark customers and some of the world’s leading experts in online assessment. Here are six tips I heard which I hope will interest Questionmark blog readers.

  • Tip 1. Set up topic structures that are useful to report on. Define topic structures at the lowest probable reporting level so you can use topic scores to look at performance by topic in an actionable way. Make topic names meaningful to the business as elements of knowledge or skill or competency. For example, it’s often worth looking at average topic scores across a cohort of participants. If one or two topic areas are weaker than others, this likely shows an area that people do not understand or where your training is weak. In general, when designing assessments, the more you think about the decisions you will make as a result of your assessments, the more valuable your assessments will be.
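As a minimal sketch of this kind of topic-level reporting, the snippet below averages per-topic scores across a cohort and flags weak topics. The topic names, scores, and the 65% threshold are all illustrative assumptions, not real Questionmark data or defaults.

```python
from statistics import mean

# Hypothetical topic scores (percent correct) for a cohort of participants;
# the topic names and numbers are invented for illustration.
cohort_scores = {
    "Data privacy basics": [82, 75, 90, 78],
    "Incident reporting": [55, 48, 60, 52],
    "Password hygiene": [88, 91, 85, 79],
}

THRESHOLD = 65  # assumed cut-off: flag topics whose cohort average falls below this

for topic, scores in cohort_scores.items():
    avg = mean(scores)
    flag = "  <-- review training/content" if avg < THRESHOLD else ""
    print(f"{topic}: {avg:.1f}{flag}")
```

A report like this turns raw scores into the kind of actionable decision the tip describes: a low cohort average on one topic points at a training gap rather than at individual participants.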


  • Tip 2. Get SMEs to evaluate questions checking task performance before using them. If your questions are being used to check task or job performance, you need to make sure they actually do so. One presenter explained how they send out a survey to SMEs (subject matter experts), asking about the quality and essentiality of each proposed question on a 4-point scale. The quality question asks whether the question is written well enough to accurately assess knowledge. The essentiality question asks how important the question is to meet assessment objectives. Questions with low quality should be modified or rewritten, and questions should only be included in the assessment if they score well in essentiality.
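The SME review process above can be sketched as a simple aggregation. The ratings, field names, and cut-off values below are assumptions for illustration; the presenter's actual survey design was not specified beyond the two 4-point scales.

```python
from statistics import mean

# Illustrative SME ratings on a 4-point scale (4 = best). Three SMEs rated
# each question for quality and essentiality; data and cut-offs are invented.
ratings = {
    "Q1": {"quality": [4, 3, 4], "essentiality": [4, 4, 3]},
    "Q2": {"quality": [2, 2, 3], "essentiality": [4, 3, 4]},
    "Q3": {"quality": [4, 4, 3], "essentiality": [1, 2, 2]},
}

QUALITY_CUT = 3.0       # assumed: below this, rewrite the question
ESSENTIALITY_CUT = 3.0  # assumed: below this, exclude from the assessment

for qid, r in ratings.items():
    q, e = mean(r["quality"]), mean(r["essentiality"])
    if e < ESSENTIALITY_CUT:
        verdict = "exclude (low essentiality)"
    elif q < QUALITY_CUT:
        verdict = "rewrite (low quality)"
    else:
        verdict = "include"
    print(f"{qid}: quality={q:.2f} essentiality={e:.2f} -> {verdict}")
```

Checking essentiality first reflects the tip: a beautifully written question that does not serve the assessment objectives should still be dropped.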


  • Tip 3. If using observational assessments, keep a close watch on observer reliability. A lot of organizations use Questionmark to deliver observational assessments, which rely on a person grading a participant on performance. To make this most effective, if you have many observers/graders, put in effort to ensure all graders grade similarly (i.e. improve inter-rater reliability). If you don’t do this, there is a risk that scores will vary unfairly due to different ratings by different graders. To do this, first define the rubric or scoring rules very clearly; second, give good training to graders, with practice sessions and exemplars; and finally, monitor the grading in as close to real time as possible to catch any drift.


  • Tip 4. Randomization helps security. It’s definitely helpful to shuffle choices in most questions and shuffle question order in most assessments. It’s also common to select questions from an item bank at random. From a security perspective, this makes it harder for participants to collude or pass information to one another. Also, if a participant does leak questions, the fact that each sees a randomly different test can sometimes identify the leaker due to the unique randomization he/she received.


  • Tip 5. Balance your retake policy against your item bank size. If you select at random from an item bank and allow participants to retake assessments, it’s important to have an item bank large enough that the items differ each time someone retakes. A small item bank is a security weakness because it makes content exposure more damaging. [Dialog box showing that when you randomly select 5 questions from a topic, you can tick “Avoid previously delivered”.] It’s helpful to use the Questionmark setting that ensures that when selecting questions at random, provided enough questions are available, you avoid selecting those previously delivered to the same participant.
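The selection behaviour described in Tips 4 and 5 can be sketched as follows. This is an illustrative model of "random selection, avoiding previously delivered items", not Questionmark's actual implementation; the fallback to reuse when the bank is too small is an assumption.

```python
import random

def select_questions(bank, n, already_seen, rng=random):
    """Pick n questions at random, avoiding those the participant has already
    seen when the bank allows it; otherwise top up with previously seen items."""
    fresh = [q for q in bank if q not in already_seen]
    if len(fresh) >= n:
        return rng.sample(fresh, n)
    # Not enough unseen items: use every fresh one, then reuse seen items
    reused = rng.sample([q for q in bank if q in already_seen], n - len(fresh))
    return fresh + reused

bank = [f"Q{i}" for i in range(1, 21)]   # 20-item bank (illustrative)
seen = {"Q1", "Q2", "Q3", "Q4", "Q5"}    # delivered on a previous attempt
retake = select_questions(bank, 5, seen)
print(retake)  # five questions, none previously delivered
```

The sketch makes the trade-off concrete: with a 20-item bank and 5 questions per attempt, a participant exhausts the unseen pool after a few retakes, which is exactly why bank size has to be balanced against the retake policy.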


  • Tip 6: Enter into signed agreements with authors and administrators. There is a big benefit in signing formal, written security agreements with authors and administrators. This communicates very clearly that assessments and related content must be kept confidential. People who leak data, whether inadvertently or deliberately, sometimes claim they did not understand the consequences; a security agreement the person has signed makes that excuse much harder to make and encourages good security.

Thank you to everyone who attended the conference and especially to those who shared good practice.

As a final point, I loved this motto (adapted from Socrates’ famous saying), courtesy of Ted Stille at the US State Department, Diplomatic Security Service:

“The unexamined course is not worth giving”.

This might have several meanings, but the two that strike me are that any training needs an exam or other measurement to make it worthwhile, but also that you should think hard and evaluate well any training activity to ensure that your time and the time of your participants are well spent.

After I shared this lovely motto on Twitter, I was reminded by testing guru Bill Coscarelli of one of his maxims:

“You don’t get what you want, you get what you test.”

I hope you enjoyed these tips. You can find more best practice guidance in our white papers and eBooks.

Internet assessment software pioneer Paul Roberts to retire

Posted by John Kleeman

We think of the Internet as being very young, but one of the pioneers in using the Internet for assessments is about to retire. Paul Roberts, the developer of the world’s first commercial Internet assessment software, is retiring in March. I thought readers might like to hear some of his story.

Paul was employee number three at Questionmark, joining us as a software developer in 1989 when the company was still working out of my home in London.

During the 1990s, our main products ran on DOS and Windows. When we started hearing about the new ideas of HTML and the web, we realized that the Internet could make computerized assessment so much easier. Prior to the Internet, testing people at a distance required a specialized network or sending floppy disks in the mail (yes people really did this!). The idea that participants could connect to the questions and return their results over the Internet was compelling. With me as product manager, tester and documenter for our new product — and Paul as lead (and only!) developer — he wrote the first version of our Internet testing product QM Web, which we released in 1995.

QM Web manual cover

QM Web became widely used by universities and corporations who wanted to deliver quizzes and tests over the Internet. Later in the nineties, learning from the lessons of QM Web, we developed Questionmark Perception, our enterprise-level Internet assessment management system still widely used today. Paul architected Questionmark Perception and for many years was our lead developer on its assessment delivery engine.

One of Paul’s key innovations in developing Questionmark Perception was the use of XML to store questions. XML (eXtensible Markup Language) is a way of encoding data that is both human-readable and machine-readable. In 1997, Paul implemented QML (Question Markup Language) as an early application of this concept. QML allowed questions to be described independently of computer platforms. To quote Paul at the time:

“When we were developing our latest application, we really felt that we didn’t want to go down the route of designing yet another proprietary format that would restrict future developments for both us and the rest of the industry. We’re very familiar with the problems of transporting questions from platform to platform because we’ve been doing it for years with DOS, Windows, Macintosh and now the Web. With this in mind, we created a language that can describe questions and answers in tests, independently of the way they are presented. This makes it extremely powerful because QML now enables the same question database to be presented no matter what computer platform or operating system is chosen.”
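To make the idea concrete, here is a short sketch of parsing a QML-style question with Python's standard library. The element and attribute names below are invented for illustration and do not reproduce the actual QML schema.

```python
import xml.etree.ElementTree as ET

# An illustrative, QML-style question definition. The tag and attribute names
# are assumptions for demonstration, not the real QML format.
qml_like = """
<QUESTION TYPE="MC" DESCRIPTION="Capital cities">
  <CONTENT>What is the capital of New Mexico?</CONTENT>
  <ANSWER ID="A">Albuquerque</ANSWER>
  <ANSWER ID="B" CORRECT="true">Santa Fe</ANSWER>
  <ANSWER ID="C">Taos</ANSWER>
</QUESTION>
"""

q = ET.fromstring(qml_like)
print(q.get("DESCRIPTION"), "-", q.findtext("CONTENT"))
for ans in q.findall("ANSWER"):
    marker = "*" if ans.get("CORRECT") == "true" else " "
    print(f" {marker} {ans.get('ID')}. {ans.text}")
```

Because the markup describes the question's content and logic rather than its presentation, any delivery platform with an XML parser can render it, which is the platform independence the quote describes.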

Questionmark Perception and Questionmark OnDemand still use QML as their native format, so that every single question delivered by Questionmark technology has QML as its core. QML was very influential in the design of the version 1 IMS Question & Test Interoperability specification (IMS QTI), which was led by Questionmark CEO Eric Shepherd and to which Paul was a major contributor. Paul also worked on other industry standards efforts including AICC, xAPI and ADL SCORM.

Over the years, many other technology innovators and leaders have joined Questionmark, and we have a thriving product development team. Most members of our team have had the opportunity to learn from Paul over the years, and Paul’s legacy is in safe hands: Questionmark will continue to break new frontiers in computerizing assessments. I am sure you will join me in wishing Paul well in his personal journey post-retirement.

Next Generation Assessment Technology & Exciting Events Driving the Conference Agenda

Now that we have the program in place for Questionmark Conference 2017, I’m eager to highlight a few sessions that you will have a chance to attend in Santa Fe, New Mexico March 21-24.

Before the conference gets rolling, there are two full-day workshops available on Tuesday, March 21.

Here’s a peek at the agenda. You can explore the entire list of sessions here: Conference Program.

Questionmark Features & Functions

Case Studies

Best Practices

Networking Events

We have some fantastic networking events planned as well.

We’re kicking off the conference with our signature dessert reception. The next day, you will have a chance to enjoy Santa Fe’s enchanting downtown and dine with a group of fellow assessment professionals. It all culminates in our final event: Meow Wolf’s House of Eternal Return, an exciting immersive multimedia art experience. Watch out for surprise acts! We look forward to an evening of eating, networking and celebrating!

Questions? Email us; we’re happy to help!

How to manage compliance in a highly regulated world? [30-minute webinar]

Posted by Chloe Mendonca

If your industry demands compliance, then your people need compliance training and certifications. Learning and training records are almost always reviewed during regulatory audits to check that employees have received the required training and that their competencies or certifications are up to date and valid. Regularly assessing employee knowledge and competencies ensures you’re always ready for an audit, and prepared if something goes wrong.

Did you know…

Source: Brandon Hall Group

Perhaps you already understand the value assessments can bring, but need to convince your management team. Or perhaps you’d like a better view of how to use assessments most effectively to ensure compliance. You’re not alone.

Join us for a 30-minute webinar on Thursday March 9, 2017 to:

  • discuss the critical role assessments play in compliance learning
  • explore the benefits of using assessments before, during and after training
  • find out 7 ways assessments fortify compliance
  • get best practice tips for ensuring valid and reliable assessments

We also have several other webinars you may be interested in:

  • How to write high-quality test items – March 4, 2017

If you’d like best practice tips to improve your test items and ensure they produce fair, valid and reliable results, then sign up for this 30-minute webinar.

  • Introduction to Questionmark’s Assessment Management System – Various dates and times

Get an overview of Questionmark’s features and functions in this live demo. We will look at the basics of authoring, delivering and reporting on surveys, quizzes, tests and exams.