Six tips to increase content validity in competence tests and exams

Posted by John Kleeman

Content validity is one of the most important criteria on which to judge a test, exam or quiz. This blog post explains what content validity is, why it matters and how to increase it when using competence tests and exams within regulatory compliance and other work settings.

What is content validity?

An assessment has content validity if its content matches what it is intended to measure, i.e. it reflects the knowledge and skills required to do a job, or shows that the participant has sufficiently grasped the course content.

Content validity is often measured by having a group of subject matter experts (SMEs) verify that the test measures what it is supposed to measure.

Why does content validity matter?

If an assessment lacks content validity, then it isn’t actually testing what it sets out to test, or it misses important aspects of the job skills.

Would you want to fly in a plane where the pilot knows how to take off but not how to land? Obviously not! Assessments for airline pilots take into account all job functions, including landing in emergency scenarios.

Similarly, if you are testing your employees to ensure competence for regulatory compliance purposes, or before you let them sell your products, you need to ensure the tests have content validity – that is to say they cover the job skills required.

In addition to these common-sense reasons, if you use an assessment without content validity to make decisions about people, you could face a lawsuit. See this blog post, which describes a US lawsuit in which a court ruled that because a policing test didn’t match the job skills, it couldn’t fairly be used for promotion purposes.

How can you increase content validity?

Here are some tips to get you started. For a deeper dive, Questionmark has several white papers that will help, and I also recommend Shrock & Coscarelli’s excellent book “Criterion-Referenced Test Development”.

  1. Conduct a job task analysis (JTA). A JTA is a survey which asks experts in the job role what tasks are important and how often they are done. A JTA gives you the information to define assessment topics in terms of what the job needs. Questionmark has a JTA question type which makes it easy to deliver and report on JTAs.
  2. Define the topics in the test before authoring. Use an item bank to store questions, and define the topics carefully before you start writing the questions. See Know what your questions are about before you deliver the test for some more reasoning on this.
  3. Poll subject matter experts to check the content validity of an existing test. If you have an existing assessment and need to check its content validity, get a panel of SMEs to rate each question as “essential,” “useful, but not essential,” or “not necessary” to the performance of what is being measured. The more SMEs who agree that items are essential, the higher the content validity. See Understanding Assessment Validity- Content Validity for a way to do this within Questionmark software.
  4. Use item analysis reporting. Item analysis reports flag questions that don’t correlate well with the rest of the assessment. Questionmark has an easy-to-understand item analysis report that will flag potential questions for review. One reason a question might get flagged is that participants who do well on other questions do poorly on it – this could indicate the question lacks content validity.
  5. Involve Subject Matter Experts (SMEs). It might sound obvious, but the more you involve SMEs in your assessment development, the more content validity you are likely to get. Use an assessment management system which is easy for busy SMEs to use, and involve SMEs in writing and reviewing questions.
  6. Review and update tests frequently. Skills required for jobs change quickly with changing technology and changing regulations. Many workplace tests that were valid two years ago are not valid today. Use an item bank with a search facility to manage your questions, and review, update or retire questions that are no longer relevant.
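The three-category SME rating scheme in tip 3 is the basis of Lawshe's content validity ratio (CVR), a standard way to turn panel ratings into a number per question. Here is a minimal sketch in Python; the function name is my own for illustration and is not part of any Questionmark product:

```python
def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number of
    SMEs rating the item "essential" and N is the panel size.

    Ranges from -1 (no SME says essential) to +1 (every SME does);
    values close to +1 suggest the item supports content validity.
    """
    half = n_experts / 2
    return (n_essential - half) / half

# Example: 9 of 10 panel members rate a question "essential".
print(content_validity_ratio(9, 10))  # 0.8
```

Items with low or negative CVR are candidates for revision or removal; repeating the calculation per question gives a content-validity profile for the whole test.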

I hope this blog post reminds you why content validity matters and gives helpful tips to improve the content validity of your tests. If you are using a Learning Management System to create and deliver assessments, you may struggle to obtain and demonstrate content validity. If you want to see how Questionmark software can help manage your assessments, request a personalized demo today.


Need better assessments? Read our white papers

Posted by Joan Phaup

Assessments play a crucial role in learning, performance improvement and regulatory compliance — and Questionmark White Papers help you create better assessments, deliver them securely and get trustworthy results.

We invite you to download any of these papers with our compliments. Here are just five of the many available for you to choose from:

  • Assessments Through the Learning Process
    • This is a great place to start for people who are beginning to explore the possibility of using online assessments in education, training, certification or compliance. It explains how you can use assessments to improve learning and measurement, and it will point you to many additional resources.
  • The Role of Assessments in Mitigating Risk for Financial Services Organizations
    • Online assessments play a critical role in corporate compliance programs. This paper describes today’s regulatory landscape and explores the business benefits of using assessments for compliance. It also describes specific ways to use assessments within a compliance program, examines security and accessibility issues and offers role-specific best practice guidance for implementing legally defensible assessments.
  • Using Online Assessment for Compliance
    • This paper explains how assessments can help organizations document their effectiveness in delivering compliance training and – in addition – play a role in improving organizational performance. It describes the various ways in which organizations can use assessments to demonstrate that their employees understand regulatory standards.
  • Alignment, Impact and Measurement with the A-model
    • The A-model framework helps individuals and organizations clarify the goals, objectives and human performance issues of their work and design systematic assessment systems to evaluate progress towards their goals. Learn the ideas behind the A-model and how to implement them.
  • Embedded Assessments: Building Knowledge Checks, Surveys and Other Assessments into Learning Materials
    • The ability to place assessments within many different contexts — embedding them in wikis, blogs and portals, for instance — changes their potential uses and brings them directly into the learning process. This paper explains the value of embedded assessments to learners and instructors, shares examples of where such assessments can be used and explores how they might help shape the future of learning.


Knowledge-Check Assessments within Software User Assistance and Documentation

Posted by John Kleeman

We’ve been advocating for our customers to embed knowledge checks within learning, and I’m glad to say that we have been doing this ourselves. As we say in the software industry when a company uses its own products, we’re eating our own dog food!

The evidence shows that you learn more if you study and then take a quiz than if you simply study the material twice, so we wanted to give this benefit to our users.

Questionmark has an extensive knowledge base of 600+ articles, which supplement our user manuals. The knowledge base requires registration to view, but here is an example knowledge base article that is free for all to view. Our knowledge checks typically ask 3 to 5 questions and are randomized, so you get different questions when you return to the page. We’ve put knowledge checks within the most popular articles, and since these have now been live for several months, we can share some of the results:

  • On average, 13% of visitors to our knowledge base pages with embedded knowledge checks answer the questions and press Submit to see the results and feedback.
  • The response rate varies considerably depending on the subject matter, from 2% in a few technical articles to over 50% in a few where the knowledge check is very appropriate.
  • About 60% of participants get a score of 75% or higher.
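The randomized selection described above — a handful of questions drawn from an article's pool on each visit — can be sketched as simple sampling without replacement. This is a generic illustration of the idea, not Questionmark's actual implementation:

```python
import random

def pick_knowledge_check(pool: list[str], k: int = 3) -> list[str]:
    """Draw up to k distinct questions from an article's question pool,
    so repeat visitors usually see a different knowledge check."""
    return random.sample(pool, k=min(k, len(pool)))

questions = ["What is content validity?",
             "Who should rate items as essential?",
             "When should tests be reviewed?",
             "What does a JTA survey ask?",
             "What does item analysis flag?"]
print(pick_knowledge_check(questions))  # three questions, varying per run
```

Sampling without replacement guarantees no duplicate questions within one knowledge check, while the per-visit draw keeps the experience fresh for returning readers.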

Here is some advice from our documentation team lead (Noel Thethy) and me on what we’ve learned about knowledge checks in user assistance:

  1. Focus knowledge checks on articles that teach things people want to learn for the long term. We found few people clicked on the knowledge checks in areas like installation advice, where you just want to do something once, but there was more interest in articles that explained concepts.
  2. Embed knowledge checks in prominent locations within content so that people can see them easily.
  3. Align questions with key learning points.
  4. Ensure the vocabulary within a knowledge check is consistent with the information it pertains to.
  5. Provide meaningful feedback to correct misconceptions.
  6. Review the questions carefully before publishing (Questionmark Live is great for this).
  7. Plan for regular reviews to make sure the content remains valid as your software changes.
  8. Use references or a naming convention to ensure it is easy to associate knowledge checks with articles in reporting.
  9. Unless you want to capture individual results, use a generic participant name to make filtering and reporting on results easier.
  10. Use the Assessment Overview report to get an overview of results and the Question Statistics or Item Analysis reports to identify which questions people are weak at; this may show that you need to improve your learning material.
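The question-level statistics mentioned in tip 10 rest on a simple idea also used in item analysis: correlate each question with performance on the rest of the test, so questions that high scorers get wrong stand out for review. Here is a hypothetical sketch of such a discrimination index; it is not Questionmark's actual report logic, and the function name is my own:

```python
from statistics import mean, pstdev

def item_rest_correlation(item: list[int], totals: list[float]) -> float:
    """Correlate a 0/1 item score with each participant's score on the
    rest of the test (total minus the item). Low or negative values
    flag the question for review."""
    rest = [t - i for t, i in zip(totals, item)]
    mi, mr = mean(item), mean(rest)
    cov = mean((i - mi) * (r - mr) for i, r in zip(item, rest))
    denom = pstdev(item) * pstdev(rest)
    return cov / denom if denom else 0.0

# A question answered correctly by the strongest participants
# correlates highly with the rest of the test.
print(item_rest_correlation([1, 1, 1, 0, 0], [9, 8, 7, 4, 3]))

# A question the weakest participants get right while the strongest
# get it wrong yields a negative value - a flag for review.
print(item_rest_correlation([0, 0, 0, 1, 1], [9, 8, 7, 3, 2]))
```

Subtracting the item from the total before correlating avoids the item inflating its own correlation, which matters on short knowledge checks of 3 to 5 questions.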

I’d love to hear questions or comments from anyone interested in knowledge checks in user assistance; feel free to email me. To give you a flavour of how a knowledge check helps you practice retrieval on something you’ve learned, answer the three questions on this page to check your knowledge of some of the above concepts.