Getting Results — The Questionmark Blog : Discussing assessment trends and best practices, tech tips, case studies and more.

2018 Questionmark Conference Recap – 2019 Announced

Posted by Kristin Bernor

Wow, just wow! The 2018 Questionmark Users Conference inspired both those who are new to Questionmark and longtime customers. Being able to share experiences and use cases was exhilarating. The three-day event was a deep dive into learning, fun and networking, and the enthusiasm was truly contagious. From the opening general session, where new features and functions were unveiled, to the trolley ride to the railroad museum, where guests were greeted with amazing local food and music, this year’s conference was without a doubt the preeminent users conference experience!

Amazing customer presenters were on hand to deliver testimonials to their peers on best practices and lessons learned to truly maximize the power of their assessments. We greatly appreciated case studies presented by our valuable customers from Progressive Insurance, American Institute of Certified Public Accountants, Southwest Gas, Intuitive Surgical, Rio Salado College and Caterpillar.

A lively panel discussion focused on transformation and growth, and on how assessments are integral to organizations succeeding in those areas. Panel participants included Andrew Dominguez of Southwest Gas, Tricia Allen of Polycom, Bernt Nilsen of Norsk Test and Dave “Lefty” Lefkowith of Louisiana Department of Education. Their invaluable insights into creating high-performing teams through assessments were an “aha moment” for many, and we are truly grateful for their participation.

Engaging sessions that delved into best practices for using assessments to identify knowledge gaps, improve performance and make informed, defensible decisions were widespread. Questionmark Founder and Executive Director John Kleeman presented a session based on a recent white paper, “Assessing for Situational Judgment”. Another well-attended session detailed how to extend your assessment management capabilities with Questionmark apps and left attendees excited to get back to the office and put it into practice.

Evenings provided many opportunities for networking, fine dining and fun. Small group “dine-arounds” — a long-time, popular tradition at our annual conference — gave us all a great chance to take in the sights as we strolled through beautiful, historic Savannah. Delegates and staff attending our Thursday evening reception, hosted at the Georgia State Railroad Museum, enjoyed delicious food, great music and networking.

The conference closed with the unveiling of next year’s Questionmark Conference dates and location. Drum roll please…

Save the date for Questionmark Conference 2019: join us in San Diego at the Hard Rock Hotel, February 26 – March 1, 2019, for even more learning, networking and fun!


Questionmark Recognized with Brandon Hall Award for Assessment Technology

Posted by Kristin Bernor

What an amazing way to start the year and what an honor! We were recognized by Brandon Hall Group with a “Best Advance in Assessment and Survey Technology” award as part of their Excellence in Technology Program. Questionmark was presented with this award at the Brandon Hall Group HCM Excellence Conference Awards Gala in Palm Beach Gardens, Florida on February 1, 2018.

“Questionmark goes beyond just capturing results and provides tools for analyzing, interpreting, and understanding the information that assessments are gathering. For organizations with regulatory, compliance, or certification needs, Questionmark provides unrivaled tools for testing and measuring results,” said David Wentworth, Principal Analyst at Brandon Hall Group.

The award recognizes those organizations that have successfully deployed programs, strategies, modalities, processes, systems, and tools that have achieved measurable results. Criteria used to judge applicants included value proposition, product innovation, unique differentiators, product demonstration and measurable results.

See the press release here.

Item Analysis for Beginners – When are very Easy or very Difficult Questions Useful?

Posted by John Kleeman

I’m running a session at the Questionmark user conference next month on Item Analysis for Beginners and thought I’d share the answer to an interesting question in this blog.

[Item analysis report fragment showing a question with a difficulty of 0.998 and a discrimination of 0.034]

When you run an Item Analysis report, one of the useful statistics you get for a question is its “p-value” or “item difficulty”. This is a number from 0 to 1; the higher the value, the easier the question. An easy question might have a p-value of 0.9 to 1.0, meaning 90% to 100% of participants answer the question correctly. A difficult question might have a p-value of 0.0 to 0.25, meaning 25% or fewer participants answer the question correctly. For example, the report fragment above shows a question with a p-value of 0.998, which means it is very easy and almost everyone gets it right.

Whether such questions are appropriate depends on the purpose of the assessment. Most participants will get difficult questions wrong and easy questions right, so in general, very easy and very difficult questions are less helpful than mid-range questions at discriminating between participants and therefore at using the assessment for measurement purposes.

Here are three reasons why you might decide to include very difficult questions in an assessment:

  1. Sometimes your test blueprint requires questions on a topic and the only ones you have available are difficult ones – if so, you need to use them until you can write more.
  2. If a job has high performance needs and you need to filter out a few participants from many, then very difficult questions can be useful. This might apply for example if you are selecting potential astronauts or special forces team members.
  3. If you need to assess a wide range of ability within a single assessment, then you may need some very difficult questions to be able to assess abilities within the top performing participants.

And here are five reasons why you might decide to include very easy questions in an assessment:

  1. Answering questions gives retrieval practice and helps participants remember things in future – so including easy questions still helps reduce people’s forgetting.
  2. In compliance or health and safety, you may choose to include basic questions that almost everyone gets right. This is because if someone gets it wrong, you want to know and be able to intervene.
  3. More broadly, sometimes a test blueprint requires you to cover topics that almost everyone knows and about which it’s not practical to write difficult questions.
  4. Easy questions at the start of an assessment can build confidence and reduce test anxiety. See my blog post Ten tips on reducing test anxiety for online test-takers for other ways to deal with test anxiety.
  5. If the purpose of your assessment is to measure someone’s ability to process information quickly and accurately at speed, then including many low difficulty questions that need to be answered in a short time might be appropriate.

If you want to learn more about Item Analysis, search this blog for other articles. You might also find the Questionmark user conference useful: as well as my session on Item Analysis, there are many other useful sessions, including ones on setting a cut-score in a fair, defensible way and on identifying knowledge gaps. The conference also offers the opportunity to learn from and network with other assessment practitioners – I look forward to seeing some of you there.

New White Paper Examines how to Assess for Situational Judgment

Posted by John Kleeman

Is exercising judgment a critical factor in the competence of the employees and contractors who serve your organization? If the answer is yes, as it most likely is, you may be interested in Questionmark’s white paper, published just this week, “Assessing for Situational Judgment”.

It’s not just CEOs who need to exercise judgment and make decisions; almost every job requires an element of judgment. Situational Judgment Assessments (SJAs) present a dilemma to the participant and ask them to choose among options in response.

Context is defined -> There is a dilemma that needs judgment -> The participant chooses from options -> A score or evaluation is made

Here is an example: 

You work as part of a technical support team that produces work internally for an organization. You have noticed that often work is not performed correctly or a step has been omitted from a procedure. You are aware that some individuals are more at fault than others as they do not make the effort to produce high quality results and they work in a disorganized way. What do you see as the most effective and the least effective responses to this situation?
A.  Explain to your team why these procedures are important and what the consequences are of not performing these correctly.
B.  Try to arrange for your team to observe another team in the organization who produce high quality work.
C.  Check your own work and that of everyone else in the team to make sure any errors are found.
D.  Suggest that the team tries many different ways to approach their work to see if they can find a method where fewer mistakes are made.

In this example, option C deals with errors but is time consuming and doesn’t address the behavior of team members. Option B is also reasonable but doesn’t deal with the issue immediately and may not address the team’s disorganized approach. Option D is asking a disorganized team to engage in a set of experiments that could increase rather than reduce errors in the work produced. This is likely to be the least effective of the options presented. Option A does require some confidence in dealing with potential pushback from the other team members, but is most likely to have a positive effect.
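An item in this “most effective / least effective” format can be scored in several ways. The sketch below shows one common scheme (an assumption for illustration, not Questionmark’s documented method): award a point for each pick that matches the keyed answer, so a participant choosing A as most effective and D as least effective scores full marks on the example above.

```python
def score_sja(response, key, points_per_match=1):
    """Score a most/least-effective SJA item.

    `response` and `key` are dicts like {"most": "A", "least": "D"}.
    This simple scheme awards a point for each pick matching the key;
    partial-credit variants might instead weight near-misses.
    """
    score = 0
    if response.get("most") == key["most"]:
        score += points_per_match
    if response.get("least") == key["least"]:
        score += points_per_match
    return score

# Key for the example item above, following the discussion: A is the
# most effective response and D the least effective.
example_key = {"most": "A", "least": "D"}
```

A participant who correctly identifies only the least effective option would score 1 of 2 under this scheme.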

You can see some more SJA examples at

SJA items assess judgment and variations can be used in pre-hire, post-hire training, for compliance and for certification. SJAs offer assessment programs the opportunity to move beyond assessments of what people know (knowledge of what) to assessments of how that knowledge will be applied in the workplace (knowledge of how).

Questionmark’s white paper was written in collaboration between Eugene Burke, a well-known advisor on talent, assessment and analytics, and myself. The white paper is aimed at:

  • Psychometricians, testing professionals, work psychologists and consultants who currently create SJAs for workplace use (pre-hire or post-hire) and want to consider using Questionmark technology for such use
  • Trainers, recruiters and compliance managers in corporations and government looking to use SJAs to evaluate personnel
  • High-tech and similar certification organizations looking to add SJAs to increase the performance realism and validity of their exams

The 40-page white paper includes sections on:

  • Why consider assessing for situational judgment
  • What is an SJA?
  • Pre-hire and helping employers and job applicants make better decisions
  • Post-hire and using SJAs in workforce training and development
  • SJAs in certification programs
  • SJAs in support of compliance programs
  • Constructing SJAs
  • Pitfalls to avoid
  • Leveraging technology to maximize the value of SJAs

Situational Judgment Assessments are an effective means of measuring judgment, and the white paper provides a rationale and blueprint for making it happen. The white paper is available free (with registration) from

I will also be presenting a session about SJAs in March at the Questionmark Conference 2018 in Savannah, Georgia – visit the conference website for more details.

Why Digital Badges are Win-Win-Win

Posted by John Kleeman

Digital badges are a validated, electronic measure of achievement that is becoming widely used in the workplace, in education and in certification. This article explains some of the reasons why they are genuinely a win for all involved.

Picture of a badge with the Questionmark logo on it saying "Data Security Proficient"

A little while ago, achievements were recognized with a paper certificate, which you could frame and put on your office wall. But with digital printing, paper certificates have become easy to forge, and with the Internet, a lot of offices are virtual. So nowadays achievement is often recognized with a digital certificate or badge. Digital badges use a picture (like the one above) to summarize what the badge is and are backed up with detailed information that can be verified online. Many people put badges on their LinkedIn or other social media accounts.
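The “detailed information that can be verified online” is typically published in a machine-readable form. Many badging platforms follow the Open Badges specification, in which each awarded badge is a hosted JSON “assertion”; the sketch below builds a minimal assertion in roughly the Open Badges 2.0 shape. The URLs and salt are illustrative, and this is not a description of Questionmark Badging’s internals.

```python
import hashlib

def badge_assertion(recipient_email, salt, badge_url, assertion_url, issued_on):
    """Build a minimal Open Badges 2.0-style assertion (illustrative).

    The recipient's email is salted and hashed so the assertion can be
    published openly without exposing the address; a verifier re-hashes
    the email they hold and compares it to the identity field.
    """
    digest = hashlib.sha256((recipient_email + salt).encode("utf-8")).hexdigest()
    return {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "id": assertion_url,                 # where this JSON is hosted
        "recipient": {
            "type": "email",
            "hashed": True,
            "salt": salt,
            "identity": "sha256$" + digest,
        },
        "badge": badge_url,                  # BadgeClass describing the badge
        "verification": {"type": "hosted"},  # verify by fetching the hosted id
        "issuedOn": issued_on,
    }
```

Because the assertion is hosted at a known URL and the recipient is identified by a salted hash, anyone the badge is shared with can confirm it online, which is what makes a digital badge harder to forge than a printed certificate.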

Digital badges can be used for many purposes including:

  • Giving a certificate on passing a test or exam
  • Showing completion of a course
  • Recognizing other accomplishments, major and minor
  • Signposting pathways to learning, with each step earning a badge

With Questionmark Badging, you can award digital badges when someone passes an assessment.

A key human need is to achieve goals and for others to value our achievements, and digital badges provide a mechanism to help satisfy these needs.

[Diagram: three circles showing Society, Organizations and Individuals]

Digital badges can be a win-win-win for society, organizations and individuals.


As technology changes and the world becomes more global, skill shortages are a significant issue to the global economy. According to the OECD, “in most countries, large shares of employers complain that they cannot find workers with the skills that their businesses require”.

Digital badges help reduce skill shortages by encouraging and documenting the acquisition of important skills and by recognizing competencies at a distance. They allow a society to increase citizens’ contributions and support the mobility of skills, which drives economic value.


The most tangible benefit of digital badges is to the organizations that issue them. Digital badges provide a great opportunity for companies, universities and colleges, and certification providers.

  • In the workplace, employers can issue badges to provide recognition of e-learning or instructor-led training and to show achievement of important competencies. Badges motivate employees to develop skills that make a business difference to the employer, and they recognize employees for their achievements. For many employees, being recognized for an achievement is as important as a pay rise or other more tangible benefit.
  • Universities and colleges can issue badges to students to show the module make-up in their courses, to encourage or showcase extra achievement or for short courses outside the usual curriculum.
  • Certification providers can give candidates a shareable badge, which encourages them to engage with the provider and promotes the certification brand. Digital badges also allow micro-credentials and delta credentials to deal with fast moving technology.


Lastly, for individuals, digital badges help develop skills and careers. Digital badges let individuals showcase their achievements, grow self-esteem through a sense of accomplishment and obtain recognition for what they do as an employee and/or as a citizen.


Digital badges make sense for society, for organizations and for individuals. The improved competence and communication about competence they encourage improves society. Organizations gain skills and a new currency to reward their workforce. And badge earners gain recognition and a sense of achievement. It is important that badge issuers set up a good process to ensure their badges are credible and sustainable measures of achievement, but if this is done, badges are truly win-win-win.

If you are interested in learning more about digital badges, you can see information on Questionmark’s solution here.

GDPR: 6 months to go

Posted by Jamie Armstrong

Anyone working with personal data, particularly in the European Union, will know that we are now just six months from “GDPR day” (as I have taken to calling it). On 25-May-2018, the EU General Data Protection Regulation (“GDPR”) will become applicable, ushering in a new privacy/data protection era with greater emphasis than ever on the rights of individuals when their personal data is used or stored by businesses and other organizations. In this blog post, I provide some general reminders about what the GDPR is and give some insight into Questionmark’s compliance preparations.

The GDPR replaces the current EU Data Protection Directive, which has been around for more than 20 years. To keep pace with technology advances and achieve greater uniformity on data protection, the EU began work on the GDPR over 5 years ago and finalized the text in April 2016. There then followed a period for regulators and other industry bodies to provide guidance on what the GDPR actually requires, to help organizations in their compliance efforts. Like all businesses that process EU personal data, whether based within the U.S., the EU or elsewhere, Questionmark has been busy in the months since the GDPR was finalized to ensure that our practices and policies align with GDPR expectations.

For example, we have recently made available revised versions of our EU OnDemand service and US OnDemand service terms and conditions with new GDPR clauses, so that our customers can be assured that their agreements with us meet data controller-data processor contract requirements. We have updated our privacy policy to make clearer what personal data we gather and how this is used when people visit and interact with our website. There is also a helpful Knowledge Base article on our website that describes the personal data Questionmark stores.


One of the most talked-about provisions of the GDPR is Article 35, which deals with data protection impact assessments, or “DPIAs.” Basically, there is a requirement that organizations acting as data controllers of personal data (meaning that they determine the purpose and means of the processing of that data) complete a prior assessment of the impacts of processing that data if the processing is likely to result in a high risk to the rights and freedoms of data subjects. Organizations will need to make a judgment call regarding whether a high risk exists to require that a DPIA be completed. There are scenarios in which a DPIA will definitely be required, such as when data controllers process special categories of personal data like racial origin and health information, and in other cases some organizations may decide it’s safer to complete a DPIA even if not absolutely necessary to comply with the GDPR.

The GDPR expects that data processors will help data controllers with DPIAs. Questionmark has therefore prepared an example draft DPIA template that may be used for completing an assessment of data processing within Questionmark OnDemand. The draft DPIA template is available for download now.

In the months before GDPR day we will see more guidance from the Article 29 Working Party and national data protection authorities to assist organizations with compliance. Questionmark is committed to helping our customers comply with the GDPR, and we’ll post more on this subject next year. We hope this update is useful in the meantime.

Important disclaimer: This blog is provided for general information and interest purposes only, is non-exhaustive and does not constitute legal advice. As such, the contents of this blog should not be relied on for any particular purpose, and you should seek the advice of your own legal counsel in considering GDPR requirements.