Item Development – Conducting the final editorial review

Posted by Austin Fossey

Once you have completed your content review and bias review, it is best to conduct a final editorial review.

You may have already conducted an editorial review before the content and bias reviews to cull items with obvious item-writing flaws or inappropriate item types. By the time you reach this second editorial review, your items should therefore need only minor edits.

This is the time to put the final polish on all of your items. If your content review committee and bias review committee were authorized to make changes to the items, go back and make sure that they followed your style guide and used correct grammar and spelling. Make sure they did not make any drastic changes that violate your test specifications, such as adding a fourth option to a multiple-choice item that should only have three options.

If you have the resources to do so, have professional editors review the items’ content. Ask the editors to identify issues with language, but review their suggestions rather than letting them make direct edits to the items. The editors may suggest changes that violate your style guide, they may not be familiar with language that is appropriate for your industry, or they may wish to make a change that would drastically alter the item content, so review each suggested change carefully to make sure it is appropriate.

As with other steps in the item development process, documentation and organization are key. Using item-writing software like that provided by Questionmark can help you track revisions, document changes, and confirm that every item has been reviewed.
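To illustrate the kind of bookkeeping involved, here is a minimal sketch in Python (purely hypothetical, and not a description of Questionmark's software) of recording revisions and sign-offs per item so that you can check every item has completed review:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Revision:
    """One change made to an item during review."""
    reviewer: str
    change: str
    when: date

@dataclass
class Item:
    """A test item moving through the review workflow."""
    item_id: str
    stem: str
    status: str = "drafted"  # e.g. content_review, bias_review, editorial_review, approved
    revisions: List[Revision] = field(default_factory=list)
    sign_offs: List[str] = field(default_factory=list)

    def log_revision(self, reviewer: str, change: str) -> None:
        self.revisions.append(Revision(reviewer, change, date.today()))

    def sign_off(self, stakeholder: str) -> None:
        self.sign_offs.append(stakeholder)

# Example: list items that still need final approval
bank = [Item("ITM-001", "Which of the following ...", status="editorial_review")]
awaiting = [i.item_id for i in bank if i.status != "approved"]
print(awaiting)  # ['ITM-001']
```

However you store this information, the point is the same: every edit should be attributable to a reviewer, and every item should carry an explicit record of who has signed it off.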

Do not approve items with a rubber stamp. If an item needs major content revisions, send it back to the item writers and begin the process again. Faulty items can undermine the validity of your assessment and can result in time-consuming challenges from participants. If you have planned ahead, you should have enough extra items to allow for some attrition while retaining enough items to meet your test specifications.
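As a rough illustration of that planning step (the numbers below are invented for the example, not taken from any real blueprint), you can work backwards from your expected attrition rate to decide how many items to commission in the first place:

```python
import math

def items_to_commission(required: int, expected_attrition: float) -> int:
    """How many items to write so that, after the expected attrition,
    enough survive to meet the test specifications."""
    return math.ceil(required / (1 - expected_attrition))

# A blueprint calling for 100 items with roughly 20% expected attrition
print(items_to_commission(100, 0.20))  # 125
```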

Finally, be sure that you have the appropriate stakeholders sign off on each item. Once the item passes this final editorial review, it should be locked down and considered ready to deliver to participants. Ideally, no changes should be made to items once they are in delivery, as this may impact how participants respond to the item and perform on the assessment. (Some organizations require senior executives to review and approve any requested changes to items that are already in delivery.)

When you are satisfied that the items are perfect, they are ready to be field tested. In the next post, I will talk about item try-outs, selecting a field test sample, assembling field test forms, and delivering the field test.

Check out our white paper: 5 Steps to Better Tests for best practice guidance and practical advice for the five key stages of test and exam development.

Austin Fossey will discuss test development at the 2015 Users Conference in Napa Valley, March 10-13. Register before Jan. 29 and save $100.

Proving compliance – not just attendance

Posted by John Kleeman

Many regulators require you to train employees – in financial services, pharmaceuticals, utilities, and in health & safety across all industries. You need to train them, and when you are audited or if something goes wrong, you need to be able to document that you did the training. To quote the US regulator OSHA: “Documentation can also supply an answer to one of the first questions an accident investigator will ask: ‘Was the injured employee trained to do the job?’”

Is it good enough to get the participant to sign something saying that they’ve attended the training or read the safety manual? An excellent blog series on the SafetyXChange says no:

Some companies ask their workers to sign a form after training sessions acknowledging that they understood the lesson and will put it into practice. Don’t let these forms lull you into a false sense of security. “Most workers will just sign these things without even reading them, let alone making sure that they understood everything you told them,” says a health and safety attorney in New York City. This is especially true if the training and instructions are complicated.

In the safety field, a US Appeals Court case ruled in 2005 (my underlining):

Merely having an individual sign a form acknowledging his responsibility to read the safety manual is insufficient to insure that the detailed instructions contained therein have actually been communicated.

Here are two good ways to show that someone not only attended the training but also understood it:

[Screenshot: workplace assessment on ladder use]

  1. Give employees a test or quiz at the end of the training to confirm that they understood it. This will also give them practice retrieving the information, which slows the forgetting curve (see Answering Questions directly helps you learn), and it will allow you to identify people who didn’t get the learning, as well as weak points in the class.

  2. For more practical skills, you might want to observe people to check that they understood the training and can put it into practice, or, in the safety world, demonstrate that they can do the job safely. For example, the screenshot above shows how a supervisor can use an iPad to check and log someone’s skill at using a ladder.

My view is that you want to give these kinds of tests for two reasons. First and most importantly, you want to prevent your employees from falling off ladders or making other mistakes. Second, if something does go awry, you want evidence that you’ve trained people well.

If you’re interested in this area you might check out a recent, busy discussion (registration required) on the LinkedIn Compliance Exchange forum. Paraphrasing some of the views there:

  • Yes, you should give a quiz, as it proves attendance – videoing the training is another option.
  • Yes, you should give a test; regulators, in particular the US FDIC, are increasingly demanding this.
  • No. The danger of a test is that you need to take action if scores are bad, which may give you a lot of work. It is safer not to ask the questions in case you don’t like the answers.
  • Yes, you should give a test, but it can be a very easy and simple one, to check basic understanding and prove attendance.
  • Yes, you should test; as well as confirming understanding, it will also highlight vulnerabilities in the training.

What do you think? Use the reply form below and contribute to the dialog.

Knowledge-Check Assessments within Software User Assistance and Documentation

Posted by John Kleeman

We’ve been advocating for our customers to embed knowledge checks within learning, and I’m glad to say that we have been doing this ourselves. As we say in the software industry when a company uses its own products, we’re eating our own dog food!

The evidence shows that you learn more if you study and take a quiz than if you just study and study, so we wanted to give this benefit to our users.

Questionmark has an extensive knowledge base of 600+ articles, which supplement our user manuals. The knowledge base requires registration to view, but here is an example article that is free for all to view. Our knowledge checks typically ask 3 to 5 questions and are randomized, so you get different questions if you come back to the page. We’ve put knowledge checks within the most popular articles, and since these have now been live for several months we can share some of the results:

  • On average, 13% of visitors to our knowledge base pages with embedded knowledge checks answer the questions and press Submit to see the results and feedback.
  • The response rate varies considerably with the subject matter, from 2% in a few technical articles to over 50% in a few where the knowledge check is especially appropriate.
  • About 60% of participants get a score of 75% or higher.

Here is some advice from our documentation team lead (Noel Thethy) and me on what we’ve learned about knowledge checks in user assistance:

  1. Focus knowledge checks on articles that teach things people want to retain for the long term. We found that few people clicked on the knowledge checks in areas like installation advice, where you just want to do something once, but there was more interest in articles that explained concepts.
  2. Embed knowledge checks in prominent locations within content so that people can see them easily.
  3. Align questions with key learning points.
  4. Ensure the vocabulary within a knowledge check is consistent with the information it pertains to.
  5. Provide meaningful feedback to correct misconceptions.
  6. Review the questions carefully before publishing (Questionmark Live is great for this).
  7. Plan for regular reviews to make sure the content remains valid as your software changes.
  8. Use references or a naming convention to ensure it is easy to associate knowledge checks with articles in reporting (see the sketch after this list).
  9. Unless you want to capture individual results, use a generic participant name to make filtering and reporting on results easier.
  10. Use the Assessment Overview report to get an overview of results and the Question Statistics or Item Analysis reports to identify which questions people are weak at; this may show that you need to improve your learning material.
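As a rough sketch of points 8 to 10 (the file name, column names and the "KC-" prefix below are assumptions for illustration, not the exact fields of a Questionmark export), a naming convention plus a generic participant name makes it easy to pull knowledge-check results out of an exported results file and summarise them per article:

```python
import csv
from collections import defaultdict

PREFIX = "KC-"                        # assumed convention: knowledge checks named "KC-<article id>"
GENERIC_PARTICIPANT = "kb_anonymous"  # assumed generic participant name for knowledge-base traffic

scores_by_article = defaultdict(list)
with open("results_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        if not row["assessment_name"].startswith(PREFIX):
            continue  # skip assessments that are not knowledge checks
        if row["participant"] != GENERIC_PARTICIPANT:
            continue  # keep only the generic participant used on knowledge-base pages
        scores_by_article[row["assessment_name"]].append(float(row["percentage_score"]))

for article, scores in sorted(scores_by_article.items()):
    print(f"{article}: {len(scores)} responses, mean score {sum(scores) / len(scores):.0f}%")
```

The same idea applies whatever reporting tool you use: if the knowledge-check names encode the article they belong to, and anonymous responses share one participant name, the results are trivial to filter and compare.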

I’d love to hear any questions or comments from anyone interested in knowledge checks in user assistance; feel free to email me at john@questionmark.com. To give you a flavour of how a knowledge check helps you practice retrieval of something you’ve learned, answer the three questions on this page to check your knowledge of some of the above concepts.