6 Tips for trustworthy compliance assessments

Posted by Chloe Mendonca

If you’re responsible for the development or management of compliance tests, you have a heavy responsibility on your shoulders. It’s up to you to ensure your tests are both valid and reliable. We’ve spoken about reliability and validity many times here on the Questionmark blog, and they really are two of the keys to ensuring your assessment results can be trusted. If your tests don’t measure what they’re designed to, or their content doesn’t reflect the required job knowledge, how can you make defensible decisions on the basis of the results?

This infographic shares six tips that, if you haven’t implemented them already, will help you develop trustworthy compliance assessments.

 

Click here to get a hi-res copy of this infographic.

To learn more about developing trustworthy assessments, check out the 26-page Questionmark White Paper “Assessment Results You Can Trust”.

The Ultimate Guide To Using Assessments for Compliance [eBook]

Posted by Julie Delazyn

With increasing regulatory requirements, compliance is becoming more and more of a priority for many organizations.

Without regular testing, how do you know what your employees know? And in the case of an audit or an emergency, is it good enough to have had the participant sign off saying that they’ve attended training and understand the content? Most organizations today see online assessments as a critical part of their compliance programs.

Download your complimentary copy of the eBook: Using Assessments for Regulatory Compliance to learn about the most useful applications of assessments in a compliance program and best practice recommendations for using them.

7 Strategies to Shrink Satisficing & Improve Survey Results

Posted by John Kleeman

My previous post, “Satisficing: Why it might as well be a four-letter word”, explained that satisficing on a survey is when someone answers survey questions adequately but not as well as they can. Typically, they just fill in answers without thinking too hard. As a commenter on the blog said: “Interesting! I have been guilty of this, didn’t even know it had a name!”

Examples of satisficing behavior are skipping questions or picking the first answer that makes some kind of sense. Satisficing is very common. As explained in the previous blog, some reasons for it are participants not being motivated to answer well, lacking the ability to answer well, finding the survey too hard, or simply becoming fatigued by a survey that is too long.

Satisficing is a significant cause of survey error, so here are 7 strategies for a survey author to reduce satisficing:

1. Keep surveys short. Even the keenest survey respondent will get tired in a long survey, and most of your respondents will probably not be keen. To get better results, make the survey as short as you possibly can.

2. Keep questions short and simple. A long and complex question is much more likely to get a poor-quality answer. Break complex questions down into shorter ones. Also, don’t ask about events that are difficult to remember. People’s memory of the past, and of when things happened, is surprisingly fragile; if you ask someone about events from weeks or months ago, many will not recall them well.

3. Avoid agree/disagree questions. Satisficing participants will most likely just agree with whatever statement you present. For more on the weaknesses of this kind of question, see my blog on the SAP community network: Strongly Disagree? Should you use Agree/Disagree in survey questions?

4. Similarly, remove “don’t know” options. If someone is trying to answer as quickly as possible, saying they don’t know is easy to do and lets them avoid thinking about the question.

5. Communicate the benefit of the survey to make participants want to answer well. You are doing the survey for a good reason. Make participants believe the survey will have positive benefits for them or their organization. Also make sure each question’s results are actionable: if the participant doesn’t feel that spending the time to give you a good answer will help you take some useful action, why should they bother?

6. Find ways to encourage participants to think as they answer. For example, include a request asking participants to deliberate carefully; it could remind them to pay attention. It can also be helpful to occasionally ask participants to justify their answers, perhaps by adding a text comment box after the question where they explain why they answered that way. Adding comment boxes is very easy to do in Questionmark software.

7. Put the most important questions early on. Some people will satisfice, and they are more likely to do so later in the survey. If you put the questions that matter most near the beginning, you are more likely to get good results from them.

There is a lot you can do to reduce satisficing and encourage people to give their best answers. I hope these strategies help you shrink the amount of satisficing your survey participants do, and in turn give you more accurate results.

5 Steps to Better Tests

Posted by Julie Delazyn

Creating fair, valid and reliable tests requires starting off right: with careful planning. With that foundation, you will save time and effort while producing tests that yield trustworthy results.

Five essential steps for producing high-quality tests:

1. Plan: What elements must you consider before crafting the first question? How do you identify key content areas?

2. Create: How do you write items that target higher cognitive levels and avoid bias and stereotyping?

3. Build: How should you build the test form and set accurate pass/fail scores?

4. Deliver: What methods can be implemented to protect test content and discourage cheating?

5. Evaluate: How do you use item-, topic-, and test-level data to assess reliability and improve quality?

Download this complimentary white paper full of best practices for test design, delivery and evaluation.

 

Free eBook: Using Assessments for Compliance

Posted by Chloe Mendonca

Every organisation needs to assess its workforce, whether to check competence, knowledge of company procedures, the law, or health and safety guidelines, or to test product knowledge. Assessments are the most reliable and cost-effective way of doing so.

Without regular testing, how do you know what your employees know? And in the case of an audit or an emergency, is it good enough to have had the participant sign off saying that they’ve attended training and understand the content?

With increasing regulatory requirements, compliance is becoming more and more of a priority for many organisations. However, due to the challenges of setting up an effective assessment program, many organisations aren’t doing enough to demonstrate compliance.

Questionmark has just published a new eBook, Using Assessments for Compliance*, which provides tips and recommendations for the various stages of assessment development.

The eBook covers:

  • The rationale for assessments in compliance
  • The business benefits
  • Specific applications of useful assessments within a compliance program
  • Best practice recommendations covering the entire assessment lifecycle
    • Planning
    • Deployment
    • Authoring
    • Delivery
    • Analytics

Click here to get your copy of the free eBook.*

*Available in a variety of formats (PDF, ePub, MOBI) for various eReaders.

The key to reliability and validity is authoring

John Kleeman HeadshotPosted by John Kleeman

In my earlier post I explained how reliability and validity are the keys to trustable assessment results. A reliable assessment is one that is consistent, and a valid assessment is one that measures what you need it to measure.

The key to validity and reliability starts with the authoring process. If you do not have a repeatable, defensible process for authoring questions and assessments, then however good the other parts of your process are, you will not have valid and reliable assessments.

The critical value that Questionmark brings is its structured authoring processes, which enable effective planning, authoring and reviewing of questions and assessments, and make them more likely to be valid.

Questionmark’s white paper “Assessment Results You Can Trust” suggests 18 key authoring measures for making trustable assessments – here are three of the most important.

Organize items in an item bank with topic structure

There are huge benefits to using an assessment management system with an item bank that structures items by hierarchical topics, as this facilitates:

  • An easy management view of all items and assessments under development
  • Mapping of topics to relevant organizational areas of importance
  • Clear references from items to topics
  • Use of the same item in multiple assessments
  • Simple addition of new items within a topic
  • Easy retiring of items when they are no longer needed
  • Version history maintained for legal defensibility
  • Search capabilities to identify questions that need updating when laws change or a product is retired

Some standalone e-Learning creation tools and some LMSs do not provide an item bank and require you to insert questions individually within each assessment. Such systems can work if you only have a handful of assessments or rarely need to update them, but anyone managing more than a few assessments needs an item bank to author effectively.
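
To make this concrete, here is a minimal sketch, in Python, of how an item bank with hierarchical topics, version history, retirement and search might be modelled. It is an illustration only, not Questionmark’s actual data model; names such as ItemBank, Item, topic_path and revise are assumptions made for the example.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class ItemVersion:
        # One revision of an item's wording, kept so past changes stay auditable.
        text: str
        edited_by: str
        edited_at: datetime = field(default_factory=datetime.now)

    @dataclass
    class Item:
        # A single question, reusable across any number of assessments.
        item_id: str
        topic_path: str                  # e.g. "Compliance/Data Protection"
        versions: List[ItemVersion] = field(default_factory=list)
        retired: bool = False

        def current_text(self) -> str:
            return self.versions[-1].text

        def revise(self, new_text: str, author: str) -> None:
            # Append rather than overwrite, preserving the version history.
            self.versions.append(ItemVersion(new_text, author))

    class ItemBank:
        # Items organised by hierarchical topic and shared by all assessments.
        def __init__(self) -> None:
            self.items: List[Item] = []

        def add(self, item: Item) -> None:
            self.items.append(item)

        def in_topic(self, topic_prefix: str) -> List[Item]:
            # Everything under a topic, including its sub-topics.
            return [i for i in self.items
                    if i.topic_path.startswith(topic_prefix) and not i.retired]

        def search(self, keyword: str) -> List[Item]:
            # Identify items to update when a law changes or a product is retired.
            return [i for i in self.items
                    if keyword.lower() in i.current_text().lower() and not i.retired]

        def retire(self, item_id: str) -> None:
            for i in self.items:
                if i.item_id == item_id:
                    i.retired = True

    bank = ItemBank()
    question = Item("Q001", "Compliance/Data Protection")
    question.revise("Which regulation governs personal data in the EU?", "sme_alice")
    bank.add(question)
    print([i.item_id for i in bank.search("personal data")])   # ['Q001']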

An authoring tool that subject matter experts can use directly

One of the critical factors in making successful items is to get effective input from subject matter experts (SMEs), as they are usually more knowledgeable and better able to construct and review questions than learning technology specialists or general trainers.

If you can use a system like Questionmark Live to harvest or “crowdsource” items from SMEs and have learning or assessment specialists review them, your items will be of better quality.

Easy collaboration for item reviewers to help make items more valid

Items will be more valid if they have been properly reviewed. They will also be more defensible if the past changes are auditable. A track-changes capability, like that shown in the example screenshot below, is invaluable to aid the review process. It allows authors to see what changes are being proposed and to check they make sense.

Screenshot of track changes functionality in Questionmark Live
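
As a generic illustration of the idea (and not how Questionmark Live implements it), a simple word-level diff is enough to show a reviewer exactly what wording change is being proposed for an item. The sketch below uses Python’s standard difflib module; the item texts are invented for the example.

    import difflib

    current = "Which regulation governs personal data in the EU?"
    proposed = "Which regulation governs the processing of personal data in the EU?"

    # Compare the two wordings word by word so the reviewer sees only what changed.
    for line in difflib.unified_diff(current.split(), proposed.split(),
                                     fromfile="current item", tofile="proposed item",
                                     lineterm=""):
        print(line)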

These three capabilities – having an item bank, having an authoring tool SMEs can access directly and allowing easy collaboration with “track changes” – are critical for obtaining reliable and valid, and therefore trustable, assessments.

For more information on how to make trustable assessments, see our white paper “Assessment Results You Can Trust”.
