Seven tips to recruit and manage SMEs for technology certification exams

Posted by John Kleeman

[repost from February 8, 2017]

How do you keep a certification exam up to date when the technology it is assessing is changing rapidly?

Certifications in new technologies like software-as-a-service and cloud solutions pose some specific challenges. The nature of the technology means that questions require very specialist knowledge to author. And because knowledge of the new technology is in short supply, the subject matter experts (SMEs) who can author and review new items will also be in high demand elsewhere in the organization.

Cloud offerings also change rapidly. It used to be that new technology releases came out every year or two, so if you were writing certification exams or other assessments to test knowledge and skill in them, you had plenty of notice and could plan an update cycle. But nowadays most technology organizations take an agile approach to development, with the motto “release early, release often”, and cloud delivery makes frequent, evolutionary releases – often monthly or quarterly – the norm.

So how can you keep an exam valid and reliable if the content you are assessing is changing rapidly?

Here are seven tips that could help – a few inspired by an excellent presentation by Cisco and Microsoft at the European Association of Test Publishers conference.

  1. Try to obtain item writing SMEs from product development. They will know what is coming and what is changing and will be in a good position to write accurate questions. 
  2. Also network for SMEs outside the organization – at technology conferences, via partners and resellers, on social media and/or via an online form on your certification website. A good source of SMEs will be existing certified people.
  3. Incentivize SMEs – what works best will depend on your organization, but consider free re-certifications, vouchers, discounts on conferences and books, and other incentives. Remember also that for many people working in technology, recognition and appreciation matter as much as financial incentives, so appreciate and recognize your SMEs visibly. For internal SMEs, send a thank-you letter to their manager acknowledging the effort.
  4. Focus your exam on underlying key knowledge and skills that will not become obsolete quickly. Work with your experts to avoid items likely to date, and seek to test fundamental concepts rather than version-specific features.
  5. When working with item writers, don’t be frightened to develop questions based on beta or planned functionality, but always check before questions go live in case the planned functionality hasn’t shipped yet (see the sketch after this list for one way to automate such a check).
  6. Since your item writers will likely be geographically spread, busy and tech-literate, use a good collaborative tool for item writing and item banking that allows easy online review and tracking of changes. (See https://www.questionmark.com/content/distributed-authoring-and-item-management for information on Questionmark’s authoring solution.)
  7. In technology, as in other areas, confidentiality and exam security are crucial to the integrity of the exam. Put a formal agreement in place with internal and external SMEs who author or review questions, reminding them not to pass the questions on to others, and involve your HR or legal department in drafting it so that it is enforceable.
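
One way to operationalize the go-live check in tip 5 is to tag each item with the functionality it depends on and gate publication on a list of released features. Here is a minimal Python sketch of that idea; the item fields, feature names and item bank are hypothetical, not part of any real authoring system:

```python
# Hypothetical go-live gate for tip 5: hold back any item that depends on
# beta or planned functionality until that functionality has shipped.
# The item fields, feature names and item list are illustrative only.

RELEASED_FEATURES = {"bulk-export", "sso-login"}  # features that have shipped

ITEMS = [
    {"id": "Q101", "depends_on": "sso-login"},    # released feature: can go live
    {"id": "Q102", "depends_on": "audit-trail"},  # still in beta: hold back
    {"id": "Q103", "depends_on": None},           # tests a fundamental concept
]

def publishable(item, released):
    """An item may go live if it has no feature dependency or its feature shipped."""
    return item["depends_on"] is None or item["depends_on"] in released

for item in ITEMS:
    status = "publish" if publishable(item, RELEASED_FEATURES) else "hold"
    print(f"{item['id']}: {status}")
```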

Certification of new technologies helps adoption and deployment and contributes to all stakeholders’ success. I hope these tips help you improve your assessment program.

A conversation on the value of asking good questions

Posted by Joan Phaup

I enjoyed a blog post by Andy Klee of Klee Associates about a recent conversation he’d had with our own John Kleeman, Questionmark’s chairman. Klee, whose organization provides JD Edwards and SAP training and consulting, showed a great deal of interest in how good questions and tests can improve learning outcomes.

Click here to follow their wide-ranging discussion, which covers such topics as the challenge of creating high-quality test questions, the correlation between performance on certification exams and future job performance, and trends in exam design and administration.

It’s great to see more and more people recognizing that asking questions adds value to learning. If you would like to read more on this subject, check out this paper by Dr Will Thalheimer of Work-Learning Research: The Learning Benefits of Questions.

Multiple choice quizzes help learning, especially with feedback

Posted by John Kleeman

I promised in an earlier blog entry to pass on my understanding of research in educational psychology about the unmediated or direct benefits of questioning, i.e., how answering questions helps people learn. I’ve recently read a 2008 paper by Butler and Roediger from the Memory Lab at Washington University in St. Louis, Missouri (see here for the 2008 paper and here for a 2010 review paper including the graph below), which includes some fascinating information on how multiple choice quizzes directly aid learning.

The researchers divided students randomly into four groups as follows (a toy sketch of this kind of random assignment appears after the list):

  • Study a subject, no quiz
  • Study a subject, take a multiple choice quiz, no feedback
  • Study a subject, take a multiple choice quiz, feedback after each question
  • Study a subject, take a multiple choice quiz, feedback at the end of the quiz
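
To make the design concrete, here is a toy Python sketch of random assignment into these four conditions; the participant IDs, group size and seed are made up, and this is not the researchers’ actual procedure:

```python
import random

# Toy illustration of random assignment into the four conditions above.
# Participant IDs, group size and seed are made up.
CONDITIONS = [
    "study only",
    "quiz, no feedback",
    "quiz, feedback after each question",
    "quiz, feedback at end of quiz",
]

def assign(participants, conditions, seed=None):
    """Shuffle participants, then deal them round-robin into equal groups."""
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    return {c: shuffled[i::len(conditions)] for i, c in enumerate(conditions)}

groups = assign([f"P{n:02d}" for n in range(1, 21)], CONDITIONS, seed=42)
for condition, members in groups.items():
    print(f"{condition}: {members}")
```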

They then tested all the groups a week later and got the results below.

[Chart: quizzes give better retention]

As you can see, the students who had taken a quiz (or test, as the authors describe it) got better results on average than those who hadn’t. This is expected: answering questions provides retrieval practice, which makes material easier to recall later and so helps learning. This is similar to results I’ve blogged about elsewhere.

However, what is interesting about this study is a danger specific to multiple choice quizzes: students may choose a wrong answer and so believe they have retrieved information which is in fact wrong. What the study showed was that giving feedback on the quiz improves learning further, as you can see in the graph above. Interestingly, although you might expect immediate feedback right after each question to be best, this wasn’t the case here. Quizzes with feedback delayed until the end of the assessment gave better results than those with feedback after each question. The authors suggest that a slight gap before giving the feedback allows the incorrect answer to dissipate and also provides spacing in time, both of which help learning.

To summarize my understanding of this research:

  • Giving a quiz after learning helps retention, as it provides retrieval practice
  • Giving feedback improves retention further, particularly in multiple choice quizzes, where there is a danger of learners choosing wrong answers and thinking they are right
  • Feedback works better at the end of the quiz than after each question (the sketch below shows one way to structure this)
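
As an illustration of the end-of-quiz pattern, here is a minimal console quiz sketch in Python that collects every answer first and only then reveals correctness; the question content and function names are my own placeholders, not from the research or any Questionmark product:

```python
# Minimal console quiz sketch: collect every answer first, then give all
# feedback at the end, in line with the delayed-feedback finding above.
# The question content here is an illustrative placeholder.

QUESTIONS = [
    {
        "prompt": "When is feedback on a multiple choice quiz most effective?",
        "choices": ["A) After each question", "B) At the end of the quiz"],
        "answer": "B",
        "explanation": "A short delay lets the wrong option fade before the correction arrives.",
    },
]

def run_quiz(questions):
    # Phase 1: ask every question, deliberately withholding feedback.
    responses = []
    for q in questions:
        print(q["prompt"])
        for choice in q["choices"]:
            print("  " + choice)
        responses.append(input("Your answer (letter): ").strip().upper())

    # Phase 2: only now reveal correctness and explanations.
    score = 0
    for q, given in zip(questions, responses):
        correct = given == q["answer"]
        score += int(correct)
        print()
        print("Correct!" if correct else "Incorrect. The answer is " + q["answer"] + ".")
        print(q["explanation"])
    print(f"\nScore: {score}/{len(questions)}")

if __name__ == "__main__":
    run_quiz(QUESTIONS)
```

Running it asks all the questions before scoring and explaining any of them, which is the delayed-feedback pattern the study favored.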

For more information on the research, see Professor Roediger’s publications page at http://psych.wustl.edu/memory/publications/.

One interesting issue this raises is that certification exams commonly give no feedback, both to keep the questions confidential and because certification aims at measuring rather than teaching. What this research shows is that if you want to help your successful and failing candidates learn, you could consider giving feedback in some form.

Here’s a question to allow you to practice retrieval on the subject of this blog:

Should you give feedback on multiple choice quizzes after each question or at the end of the assessment?

Certifying Usability Analysts

Posted by Joan Phaup

Human Factors International, Inc. (HFI), a company specializing in user-centered design, offers a certification program for Usability Analysts from many different parts of the world. HFI-Certified Usability Analysts must demonstrate their ability to create intuitive, easy-to-use Web sites and applications.

Candidates learn to design interfaces that effectively meet the needs of end users. Questionmark hosts the certification exams that candidates take after completing instructor-led HFI courses or studying on their own to master the principles of data gathering, task analysis and usability testing.

You can learn more about this program by reading our case study.