How many errors can you spot in this survey question?

John KleemanPosted by John Kleeman

Tests and surveys are very different. In a test, you look to measure participant knowledge or skill; you know what answer you are looking for, and generally participants are motivated to answer well. In a survey, you look to measure participant attitude or recollection; you don’t know what answer you are looking for, and participants may be uninterested.

Writing good surveys is an important skill. If you’re interested in how to write good opinion and attitude surveys for training, learning, compliance and certification, based on research evidence, you might be interested in a webinar I gave titled “Designing Effective Surveys.” Click HERE for the webinar recording and slides.

In the meantime, here’s a sample survey question. How many errors can you spot in the question?

The material and presentation qualty at Questionmark webinars is always excellent. Strongly Agree Agree Slightly agree Neither agree nor disagree Disagree Strongly disagree

There are quite a few errors. Try to count them before you look at my explanation below!

I count seven errors:

  1. I am sure you spotted the misspelling of “quality”. If you misspell something in a survey question, it signals to the participant that you haven’t taken time and trouble writing your survey, so there is little incentive for them to spend time and trouble answering.
  2. It’s not usually sensible to use the word “always” in a survey question. Some participants may take the statement literally, and it’s much more likely that webinars are usually excellent than that every single one is excellent.
  3. The question is double-barreled. It’s asking about material AND presentation quality. They might be different. This really should be two questions to get a consistent answer.
  4. The “Agree” in “Strongly Agree” is capitalized but not in other places, e.g. “Slightly agree”. Capitalization should be equal in every part of the scale.

You can see these four errors highlighted below.

[Image: red marking highlighting the four errors above]

Is that all the errors? I count three more, making a total of seven:

  5. The scale should be balanced. Why is there a “Slightly agree” but not a “Slightly disagree”?
  6. This is a leading or “loaded” question, not a neutral one; it encourages you toward a positive answer. If you genuinely want to get people’s opinion in a survey question, you need to ask it without encouraging the participant to answer a particular way.
  7. Lastly, any agree/disagree question has acquiescence bias. Research evidence suggests that some participants are more likely to agree when answering survey questions, particularly those who are more junior or less educated, who may tend to think that what is asked of them might be true. It would be better to word this question to ask people to rate the webinars rather than agree with a statement about them.
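Two of the scale problems above, the unbalanced scale and the inconsistent capitalization, can even be caught automatically. Here is a minimal Python sketch; the function and its rules are my own illustration, not part of any survey tool:

```python
# Illustrative checks for two scale errors discussed above: an unbalanced
# agree/disagree scale and inconsistent capitalization. The rules here are
# simplified assumptions, not part of any real survey product.

def check_likert_scale(options):
    """Return a list of problems found in a list of answer options."""
    problems = []

    # Balance check: every "... agree" option should have a "... disagree"
    # mirror (the neutral midpoint is excluded from the comparison).
    polar = [o for o in options if not o.lower().startswith("neither")]
    agrees = {o.lower().replace("agree", "").strip()
              for o in polar if "disagree" not in o.lower()}
    disagrees = {o.lower().replace("disagree", "").strip()
                 for o in polar if "disagree" in o.lower()}
    if agrees != disagrees:
        problems.append("unbalanced scale: " + ", ".join(sorted(agrees ^ disagrees)))

    # Capitalization check: words after the first should be capitalized
    # the same way in every option.
    caps = {w.istitle() for o in options for w in o.split()[1:]}
    if len(caps) > 1:
        problems.append("inconsistent capitalization across options")

    return problems

scale = ["Strongly Agree", "Agree", "Slightly agree",
         "Neither agree nor disagree", "Disagree", "Strongly disagree"]
print(check_likert_scale(scale))
```

Run against the sample question’s scale, this flags the missing “Slightly disagree” and the stray capital in “Strongly Agree”; the leading wording and acquiescence bias, of course, still need a human reviewer.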

Did you get all of these? I hope you enjoyed this little exercise. If you did, I explain more about this and good survey practice in our Designing Effective Surveys webinar; click HERE for the webinar recording and slides.

Beyond Recall: Taking Competency Assessments to the Next Level

[Image: a pyramid showing Bloom’s levels: create, evaluate, analyze, apply, understand, remember/recall]

John KleemanPosted by John Kleeman

A lot of assessments focus on testing knowledge or facts. Questions that ask for recall of facts do have some value. They check someone’s knowledge and they help reduce the forgetting curve for new knowledge learned.

But for most jobs, knowledge is only a small part of the job requirements. As well as remembering or recalling information, people need to understand, apply, analyze, evaluate and create, as shown in Bloom’s revised taxonomy above. Most real-world jobs require many levels of the taxonomy, and if your assessments focus only on recalling knowledge, they may well not test job competence validly.

Evaluating includes exercising judgement, and using judgement is a critical factor in the competence required in many job roles. But many assessments don’t assess judgement, and this webinar will explain how you can do so.

There are many approaches to creating assessments that do more than test recall, including:

  • You can write objective questions which test understanding and application of knowledge, or analysis of situations. For example, you can present questions within real-life scenarios which require understanding the situation and working out how to apply knowledge and skills to answer it. It’s sometimes useful to use media such as video to bring the question closer to the performance environment.
  • You can use observational assessments, which allow an observer to watch someone perform a task and grade their performance. This allows assessment of practical skills as well as higher-level cognitive ones.
  • You can use simulations, which assess performance within a controlled environment closer to the real performance environment.
  • You can set up role-playing assessments, which are useful for customer service and other areas that require interpersonal skills.
  • You can assess people’s actual job performance, using 360 degree assessments or performance appraisal.

In our webinar, we will give an overview of these methods but will focus on a method which has always been used in pre-employment and which is increasingly being used in post-hire training, certification and compliance testing. This method is Situational Judgement Assessments: questions carefully written to assess someone’s ability to exercise judgement within the domain of their job role.

It’s not just CEOs who need to exercise judgement and make decisions; almost every job requires an element of judgement. Many costly errors in organizations are caused by a failure of judgement. Even if people have the appropriate skill, experience and knowledge, they need to use judgement to apply them successfully; otherwise failures occur or successful outcomes are missed.

Situational Judgement Assessments (SJAs) present a dilemma to the participant (using text or video) and ask them to choose options in response. The dilemma needs to be relevant to the job, i.e. one where using judgement is clearly linked to a needed domain of knowledge, skill or competency in the job role. And the scoring needs to be based on subject matter experts’ agreement that the judgement is the correct one to make.

[Diagram: context is defined (text or video) → a dilemma that needs judgement → the participant chooses from options → a score or evaluation is made]
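The structure just described can be sketched as a simple data model. This is purely illustrative; the field names and scoring key are my own assumptions, not part of any Questionmark product:

```python
from dataclasses import dataclass, field

# Illustrative sketch of an SJA item as described above: context, dilemma,
# response options and an SME-agreed scoring key. All names here are my
# own invention, not a real product API.

@dataclass
class SJAItem:
    context: str        # scene-setting text (or a reference to a video)
    dilemma: str        # the job-relevant situation requiring judgement
    options: list       # responses the participant can choose from
    sme_scores: dict = field(default_factory=dict)  # option -> agreed score

    def score(self, chosen_option: str) -> int:
        """Score a choice against the key agreed by subject matter experts."""
        return self.sme_scores.get(chosen_option, 0)

item = SJAItem(
    context="You supervise a night shift in a production plant.",
    dilemma="A colleague repeatedly skips a required safety check to save time.",
    options=["Ignore it",
             "Raise it privately with the colleague",
             "Report it through the safety procedure"],
    sme_scores={"Ignore it": 0,
                "Raise it privately with the colleague": 1,
                "Report it through the safety procedure": 2},
)
print(item.score("Report it through the safety procedure"))  # prints 2
```

The key design point is that the scoring key comes from subject matter expert agreement, not from a single author’s opinion.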

Situational Judgement Assessments can be a valid and reliable way of measuring judgement and can be presented in a standalone assessment or combined with other kinds of questions. If you’re interested in learning more, check out our webinar titled “Beyond Recall: Taking Competency Assessments to the Next Level.” You can download the webinar recording and slides HERE.

Q&A: Sue Martin and John Kleeman discuss steps to building a certification program

Posted by Zainab Fayaz

Certification programs are a vital way of recognizing knowledge, skills and professional expertise, but, during a time of digital transformation, how do you build a program that is sustainable and adaptable to the evolving needs of your organization, stakeholders and the market?

Questionmark Founder and Executive Director John Kleeman and certification expert and Business Transformation Consultant Sue Martin presented a webinar on how to build a certification program (you can view the webinar HERE). Before the webinar, we sat down with our experts to gain some insight into what they’d be covering during the session.

Tell us a bit about what you’ll be covering during the webinar:

Sue: During the webinar, we’ll be covering a range of things: from the conceptual steps of building a certification program, to the many projects that evolve from these, to the importance of outlining key steps from the very beginning of the process to create a comprehensive and cohesive certification program.

We will also talk about the value a certification program can add to an organization, not only in the short term but also for many years to come. It is important to remember “why” and “what” you are trying to achieve, and this webinar will provide detail on how the alignment of strategic goals and communication with stakeholders contributes to the success of an adaptable certification program.

John: We’ll be discussing a range of things during the webinar, but here are the ten easy steps that we’ll be describing:

  1. Business goals
  2. Scope
  3. Security
  4. Vendor evaluation
  5. Blueprint and test design
  6. Test development
  7. Pilot
  8. Communications
  9. Delivery
  10. Reporting and monitoring

What influenced the selection of the 10 steps you have identified for building a certification program?

John: Sue and I sat down to plan the webinar when we were together at the OEB conference in Berlin in December. Although we wanted to cover some of the obvious things like test design and development, we wanted to make sure people think first about the preparation and planning, for example getting organizational buy-in and working out how to market and communicate the program to stakeholders. So we’ll be focusing on what you need to do to make a successful program, as that will drive everything you do.

Although you’ll be covering the key steps for building a certification program during the webinar, can you share the three steps you find most important during the process?

Sue:
1. Planning:
The emphasis of the program’s work should be at the start, in the planning phase, especially in order to build a flexible program which will adapt to the needs of your audience and stakeholders as those needs change over time. In all of the individual project components, whether it be test creation, vendor evaluation or communications rollout, design and plan for the end goal. For example, when it comes to creating an exam, you plan for it right at the start of the project: you hit the ground running! It is not all about item writing but also the development of the project from the beginning, and if you don’t plan, this can lead to a lack of validity in the exam program and inconsistency over time.

2. Practical tips and tricks for approaching various elements of your program development: It is important to identify the target audience, their learning journey and how they learn. Knowing this, you can go forward and build a certification program that integrates and aligns with the learning process.

3. Scope: This is very important; setting the scope is a priority. Of course, in the greater scheme of things you’ll have a mission statement, which provides you with a strategic vision, but when it comes to the finer detail, such as knowing which countries to enter, the pricing structure or whether to offer remote proctoring, always keep in mind three things: the value contribution, the stakeholders, and asking yourselves “yes, but why?”, as this will help align with organizational objectives.

What can attendees take away from the webinar you’ll present?  

Sue: Those attending will learn the value and importance of planning and questioning everything from the start of the process. We’ll share advice on the importance of having a value statement for every part of the process and making sure that a certification program is really what you are looking for. By attending, you can walk away knowing the operational and strategic steps you must go through to build a program that is sustainable; think of it as a checklist!

John: If you’re starting a new certification program, I think this webinar will help guide you and help you create it more easily and more effectively. And if you already have a certification program and want to improve it, you’ll probably be doing a lot of what we suggest already, but I hope there’ll be something for everyone to take away and learn.

Want to know more?

If you’re interested in learning more about the steps to building a certification program that meets the needs of your organization and stakeholders, check out John and Sue’s webinar session, “Building a Certification Program in 10 Easy Steps.”

A little bit more about our two experts:

John Kleeman is Executive Director and Founder of Questionmark. He has a first-class degree from Trinity College, Cambridge, and is a Chartered Engineer and a Certified Information Privacy Professional/Europe (CIPP/E). John wrote the first version of the Questionmark assessment software system and then founded Questionmark in 1988 to market, develop and support it. John has been heavily involved in assessment software for 30 years and has also participated in several standards initiatives including IMS QTI, ISO 23988 and ISO 10667. John was recently elected to the Association of Test Publishers (ATP) Board of Directors.

Sue Martin is a trusted advisor to companies and institutions across Europe in the areas of workforce credentialing, learning strategies and certification. Her career prior to consulting included a role as Senior Global Certification Director for SAP and several regional and global management roles in the testing industry. She has also held several positions within industry institutions, such as Chair of the European Association of Test Publishers, and is currently a member of the Learning & Development Committee at BCS (British Computer Society).

Washington DC – OnDemand for Government Briefing Recap

Posted by Kristin Bernor

Last Thursday, May 17, Questionmark hosted a briefing in Washington, DC that powerfully presented the journey to delivering the Questionmark OnDemand for Government assessment management system to our government customers. We would like to thank the industry experts who made this possible, including speakers from the Department of State, FedRAMP, Microsoft and Schellman. In just 3 1/2 hours, they presented the comprehensive process that goes into bringing the Questionmark OnDemand for Government system to market. This new government community cloud-based service, dedicated to the needs of U.S. governmental and defense agencies, is currently wrapping up the onsite portion of the audit. The auditors will be finalizing their testing and review, culminating in an assessment report. The complete security package is expected to be available in July. Questionmark OnDemand for Government is designed to be compliant with FedRAMP and hosted in a FedRAMP-certified U.S. data center.

Highlights from the briefing included presentations from:

  • Eric Shepherd, CEO of Questionmark, hosted the event.
  • Ted Stille, Department of State, discussed the agency’s motivations and experience as project sponsor for Questionmark OnDemand for Government.
  • Stacy Poll and David Hunt, Public Sector Business Development Manager and Information Security Officer of Questionmark respectively, presented a system overview including demonstration screens, migration paths and detailed next steps to plan for implementation.
  • Christina McGhee, Schellman audit team, spoke about the 3PAO role in the FedRAMP authorization process.
  • Zaree Singer and Laurie Southerton, FedRAMP PMO Support, explained the FedRAMP ATO approval process.
  • Ganesh Shenbagaraman, Microsoft, discussed Microsoft Azure’s government cloud service.

This unique opportunity to learn about the OnDemand for Government assessment management system, meet with peers and other customers, and hear directly from FedRAMP and our 3PAO proved invaluable to attendees, who rated the briefing nearly 5 out of 5.

Please reach out to Stacy Poll at stacy.poll@questionmark.com for more information.

Special Briefing: Cloud-based Assessment Management for Government and Defense Agencies

Posted by Kristin Bernor

In just two weeks, the special briefing about Questionmark OnDemand for Government, a new cloud-based service dedicated to the needs of U.S. governmental and defense agencies, takes place. You don’t want to miss it!

Join us on Thursday, May 17th in Washington, DC, to learn about how this new service enables agencies to securely author, deliver and analyze assessments, and hear from dynamic speakers including:

  • Jono Poltrack, contributor to the Sharable Content Object Reference Model (SCORM) while at Advanced Distributed Learning (ADL)
  • Ted Stille, Department of State, will discuss the agency’s motivations and experience as project sponsor for Questionmark OnDemand for Government
  • Christina McGhee, Schellman audit team, will discuss the 3PAO role in the FedRAMP authorization process
  • Zaree Singer, FedRAMP PMO Support, will explain the FedRAMP ATO approval process
  • Ganesh Shenbagaraman, Microsoft, will discuss Microsoft Azure’s government cloud service

Space is limited, so register today!

Questionmark has finalized its FedRAMP System Security Plan and this plan, which documents our security systems and processes, is now being reviewed by an accredited FedRAMP Third Party Assessment Organization (3PAO); this means that we are officially in audit. Once this document has been audited it becomes part of the FedRAMP library for Security Officers to review and provide individual agencies with an “Authorization to Operate” (ATO). Note: Briefing attendees will be eligible to receive a pre-release copy of the FedRAMP System Security Plan.

Questionmark is widely deployed by U.S. governmental and defense agencies to author, deliver and report on high-stakes advancement exams, post-course tests for distance learning, job task analysis, medical training and education, competency testing, course evaluations and more. For government agencies currently using the on-premise installed Questionmark Perception, OnDemand for Government provides a cost-effective option to upgrade to a secure, best-in-class cloud-based assessment management system.

We look forward to seeing you in Washington for a morning of learning and networking!

Seven Ways Assessments Fortify Compliance

Posted by John Kleeman
Why do most of the world’s banks, pharmaceutical companies, utilities and other large companies use online assessments to test the competence of their employees?

It’s primarily because compliance fines around the world are high and assessments reduce the risk of regulatory compliance failures. Assessments also give protection to the organization in the event of an individual misstep, by proving that the organization had checked the individual’s knowledge of the rules before the mistake.

Here are seven reasons companies use assessments, based on my experience:

1. Regulators encourage assessments 

Some regulators require companies to test their workforce regularly. For example, the US FDIC says in its compliance manual:

“Once personnel have been trained on a particular subject, a compliance officer should periodically assess employees on their knowledge and comprehension of the subject matter”

And the European Securities and Markets Authority says in its guidelines for assessment of knowledge and competence:

“ongoing assessment will contain updated material and will test staff on their knowledge of, for example, regulatory changes, new products and services available on the market”

Other regulators focus more on companies ensuring that their workforce is competent, rather than specifying how companies ensure it, but most welcome clear evidence that personnel have been trained and have shown understanding of the training.

2. Assessments demonstrate commitment to your workforce and to regulators

Many compliance errors happen because managers pay lip service to following the rules but indicate by their behavior that they don’t mean it. If you assess all employees and managers regularly, and require additional training or sanctions for failing tests, it sends a clear message to your workforce that knowledge and observance of the rules is genuinely required.

Some regulators also take an organization’s commitment to compliance into account when setting the level of fines, and may reduce fines if there is credible evidence of compliance activities, of which assessments can be a useful part. For example, the German Federal Court recently ruled that fines should be lower if there is evidence of effective compliance management.

3. Assessments find problems early

Online assessments are one of the few ways in which a compliance team can touch all employees in an organization. You can see results by team, department, location or individual, identify who understands what, and focus on weak areas to improve. There is no better way to reach all employees.
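As a sketch of this kind of drill-down, the following groups assessment results by department and flags weak areas. All names, scores and the 70% threshold are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# Invented results used purely to illustrate drilling down by department.
results = [
    {"employee": "A", "department": "Trading", "score": 92},
    {"employee": "B", "department": "Trading", "score": 88},
    {"employee": "C", "department": "Operations", "score": 61},
    {"employee": "D", "department": "Operations", "score": 58},
]

# Group scores by department.
by_dept = defaultdict(list)
for r in results:
    by_dept[r["department"]].append(r["score"])

# Flag departments whose mean score is below an assumed 70% threshold.
weak_areas = sorted(d for d, scores in by_dept.items() if mean(scores) < 70)
print(weak_areas)  # departments to focus remediation on
```

The same grouping works equally well by team, location or individual; only the grouping key changes.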

4. Assessments document understanding after training

Many regulators require training to be documented. Giving someone an assessment after training doesn’t just confirm that they attended the course; it confirms they understood the training.

5. Assessments increase retention of knowledge and reduce forgetting

Can you remember everything you learned? Of course, none of us can!

There is good evidence that quizzes and tests increase retention and reduce forgetting. This is partly because people study for tests and so remind themselves of the knowledge they learned, which helps retain it. And it is partly because retrieving information in a quiz or test makes it easier to retrieve the same information in future, and so more likely to be applied in practice when needed.

6. By allowing testing out, assessments reduce the time and cost of compliance training

[Diagram: take test; if pass, skip training; otherwise do training]

Many organizations permit employees to “test out” of compliance training. People can take a test and if they demonstrate good enough knowledge, they don’t need to attend the training. This concentrates training resources and employee time on areas that are needed, and avoids demoralizing employees with boring compliance training repeating what they already know.
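The test-out flow described above amounts to a simple branch. Here is a minimal sketch; the 80% pass mark and the function name are assumptions for illustration, not a recommendation or a real API:

```python
# Minimal sketch of the "test out" flow described above. The pass mark
# and function name are illustrative assumptions, not a real API.

PASS_MARK = 80  # percent; an assumed threshold

def compliance_path(test_score: int) -> str:
    """Decide whether an employee can skip the compliance training."""
    if test_score >= PASS_MARK:
        return "skip training"    # knowledge already demonstrated
    return "attend training"      # then typically re-assess afterwards

print(compliance_path(92))  # prints skip training
print(compliance_path(55))  # prints attend training
```

In practice the threshold would be set per course, and those who fail would take the training and then be re-assessed.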

7. Assessments reduce human error which reduces the likelihood of a compliance mis-step

Many compliance failures arise from human error. Root cause analysis of human error suggests that a good proportion of errors are caused by people not understanding training, training being missing or people not following procedures. Assessments can pick up and prevent mistakes caused by people not understanding what they should do or how to follow procedures, and so reduce the risk of error.

 

If you are interested in learning more about how online assessments mitigate compliance risk, Questionmark is giving a webinar, “Seven Ways Assessments Fortify Compliance,” on April 11th. To register for this or our other free webinars, go to www.questionmark.com/questionmark_webinars.