Knowledge-Check Assessments within Software User Assistance and Documentation

Posted by John Kleeman

We’ve been advocating for our customers to embed knowledge checks within learning, and I’m glad to say that we have been doing this ourselves. As we say in the software industry when a company uses its own products, we’re eating our own dog food!

The evidence shows that you learn more if you study and then take a quiz than if you simply study and restudy, so we wanted to give our users this benefit.

Questionmark has an extensive knowledge base of 600+ articles, which supplements our user manuals. The knowledge base requires registration to view, but here is an example knowledge base that is free for all to view. Our knowledge checks typically ask 3 to 5 questions and are randomized, so you get different questions when you return to the page. We’ve put knowledge checks within the most popular articles, and since these have now been live for several months we can share some of the results:

  • On average, 13% of visitors to our knowledge base pages with embedded knowledge checks answer the questions and press Submit to see the results and feedback.
  • The response rate varies considerably with the subject matter, from 2% in a few technical articles to over 50% in a few where the knowledge check is particularly appropriate.
  • About 60% of participants get a score of 75% or higher.

Here is some advice from our documentation team lead (Noel Thethy) and me on what we’ve learned about knowledge checks in user assistance:

  1. Focus knowledge checks on articles covering topics people want to learn for the long term. Few people clicked on knowledge checks in areas such as installation advice, where you just want to do something once, but there was more interest in articles that explain concepts.
  2. Embed knowledge checks in prominent locations within content so that people can see them easily.
  3. Align questions with key learning points.
  4. Ensure the vocabulary in a knowledge check is consistent with the content it accompanies.
  5. Provide meaningful feedback to correct misconceptions.
  6. Review the questions carefully before publishing (Questionmark Live is great for this).
  7. Plan for regular reviews to make sure the content remains valid as your software changes.
  8. Use references or a naming convention so that knowledge checks are easy to associate with their articles in reporting.
  9. Unless you want to capture individual results, use a generic participant name to make filtering and reporting on results easier.
  10. Use the Assessment Overview report to get an overview of results, and the Question Statistics or Item Analysis reports to identify the questions people answer poorly; this may show that you need to improve your learning material.
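Questionmark handles the randomization described above internally, but the underlying idea of drawing a few questions at random from a larger pool so that each page visit serves a different mix can be sketched in a few lines of Python. The question pool and function name here are purely illustrative, not Questionmark's API:

```python
import random

def pick_knowledge_check(question_pool, n_questions=3, rng=None):
    """Draw a random subset of questions; a fresh mix on each page visit."""
    rng = rng or random.Random()
    # sample() picks without replacement, so no question repeats in one quiz
    return rng.sample(question_pool, min(n_questions, len(question_pool)))

pool = [
    "Which report gives an overview of results?",
    "Why use a generic participant name?",
    "Where should knowledge checks be embedded?",
    "What should feedback do for a wrong answer?",
    "How often should questions be reviewed?",
]

quiz = pick_knowledge_check(pool, n_questions=3)
assert len(quiz) == 3 and len(set(quiz)) == 3  # three distinct questions
```

Sampling without replacement keeps each quiz free of duplicates, while repeated visits still see different subsets of the pool.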

I’d love to hear questions or comments from anyone interested in knowledge checks in user assistance; feel free to email me at john@questionmark.com. To give you a flavour of how a knowledge check helps you practise retrieval of something you’ve learned, answer the three questions on this page to check your knowledge of some of the concepts above.

One Response to “Knowledge-Check Assessments within Software User Assistance and Documentation”

  1. Howard Eisenberg says:

    Regarding the third question, you have to be clear on what level of “knowledge” you are trying to test. If you are keen to test only “recall,” then using the same vocabulary is good practice. But if you wish to help learners deeply understand the concepts learned, in other words promote true comprehension, then using different vocabulary is advisable. Lifting sentences from documents and manuals, blanking out a word or phrase, and offering choices to fill in the blank is a widespread practice. Unfortunately, it promotes only rote memorization, not comprehension. Without comprehension, application and problem-solving will not be achieved either.
