New White Paper Examines How to Assess for Situational Judgment

Posted by John Kleeman

Is exercising judgment a critical factor in the competence of the employees and contractors who serve your organization? If so, as is most likely the case, you may be interested in Questionmark's white paper on "Assessing for Situational Judgment", published this week.

It’s not just CEOs who need to exercise judgment and make decisions; almost every job requires an element of judgment. Situational Judgment Assessments (SJAs) present a dilemma to the participant and ask them to choose from a set of possible responses.


Context is defined -> There is a dilemma that needs judgment -> The participant chooses from options -> A score or evaluation is made

Here is an example: 

You work as part of a technical support team that produces work internally for an organization. You have noticed that often work is not performed correctly or a step has been omitted from a procedure. You are aware that some individuals are more at fault than others as they do not make the effort to produce high quality results and they work in a disorganized way. What do you see as the most effective and the least effective responses to this situation?
A.  Explain to your team why these procedures are important and what the consequences are of not performing these correctly.
B.  Try to arrange for your team to observe another team in the organization who produce high quality work.
C.  Check your own work and that of everyone else in the team to make sure any errors are found.
D.  Suggest that the team tries many different ways to approach their work to see if they can find a method where fewer mistakes are made.

In this example, option C deals with errors but is time-consuming and doesn’t address the behavior of team members. Option B is also reasonable but doesn’t deal with the issue immediately and may not address the team’s disorganized approach. Option D asks a disorganized team to engage in a set of experiments that could increase rather than reduce errors in the work produced; this is likely to be the least effective of the options presented. Option A does require some confidence in dealing with potential pushback from the other team members, but is most likely to have a positive effect.
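To make the mechanics concrete, a most/least-effective item like this one can be scored by comparing the participant's two picks against a key. The scoring scheme below (one point per correct pick) is a common, simple approach and a hypothetical sketch, not Questionmark's actual implementation:

```python
# Hypothetical scoring for a most/least-effective SJA item.
# Key, per the worked example: A is most effective, D is least effective.

def score_sja_item(picked_most, picked_least, key_most="A", key_least="D"):
    """Award one point for each pick that matches the key."""
    score = 0
    if picked_most == key_most:
        score += 1
    if picked_least == key_least:
        score += 1
    return score

print(score_sja_item("A", "D"))  # both picks match the key -> 2
print(score_sja_item("B", "D"))  # only the 'least effective' pick matches -> 1
```

Richer schemes exist (for example, partial credit when a pick is close to the keyed answer), but the principle of scoring against an expert-derived key is the same.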

You can see some more SJA examples at http://www.questionmark.com/go/example-sja.

SJA items assess judgment, and variations can be used in pre-hire selection, post-hire training, compliance and certification. SJAs offer assessment programs the opportunity to move beyond assessing what people know (knowledge of what) to assessing how that knowledge will be applied in the workplace (knowledge of how).

Questionmark’s white paper was written as a collaboration between Eugene Burke, a well-known advisor on talent, assessment and analytics, and myself. The white paper is aimed at:

  • Psychometricians, testing professionals, work psychologists and consultants who currently create SJAs for workplace use (pre-hire or post-hire) and want to consider using Questionmark technology for such use
  • Trainers, recruiters and compliance managers in corporations and government looking to use SJAs to evaluate personnel
  • High-tech or similar certification organizations looking to add SJAs to increase the performance realism and validity of their exams

The 40-page white paper includes sections on:

  • Why consider assessing for situational judgment
  • What is an SJA?
  • Pre-hire and helping employers and job applicants make better decisions
  • Post-hire and using SJAs in workforce training and development
  • SJAs in certification programs
  • SJAs in support of compliance programs
  • Constructing SJAs
  • Pitfalls to avoid
  • Leveraging technology to maximize the value of SJAs

Situational Judgment Assessments are an effective means of measuring judgment, and the white paper provides a rationale and blueprint to make it happen. The white paper is available free (with registration) from https://www.questionmark.com/sja-whitepaper.

I will also be presenting a session about SJAs in March at the Questionmark Conference 2018 in Savannah, Georgia – visit the conference website for more details.

Did your training work? Prove the value of your learning programs with results you can measure

Posted by Julie Delazyn

Quizzes, tests, and exams do much more than determine whether a learner passed a training course. These assessments, as well as surveys, play a crucial role in learning, performance improvement and regulatory compliance. Check out our most popular white paper: Assessments Through the Learning Process, which explores the varied and important roles assessments play before, during and after a learning experience.

It’s a great place to start exploring the possibility of using online assessments in education, training, certification or compliance. Learn more about the ways you can use assessments to improve learning and measurement. Download your complimentary copy today.


Item Development Tips For Defensible Assessments

Posted by Julie Delazyn

Whether you work with low-stakes assessments, small-scale classroom assessments or large-scale, high-stakes assessments, understanding and applying some basic principles of item development will greatly enhance the quality of your results.

What began as a popular 11-part blog series has morphed into a white paper: Managing Item Development for Large-Scale Assessment, which offers sound advice on how to organize and execute the item development steps that will help you create defensible assessments. You can download your complimentary copy of the white paper here: Managing Item Development for Large-Scale Assessment

5 Steps to Better Tests

Posted by Julie Delazyn

Creating fair, valid and reliable tests requires starting off right: with careful planning. Starting with that foundation, you will save time and effort while producing tests that yield trustworthy results.

Five essential steps for producing high-quality tests:

1. Plan: What elements must you consider before crafting the first question? How do you identify key content areas?

2. Create: How do you write items that demand deeper thinking while avoiding bias and stereotyping?

3. Build: How should you build the test form and set accurate pass/fail scores?

4. Deliver: What methods can be implemented to protect test content and discourage cheating?

5. Evaluate: How do you use item-, topic-, and test-level data to assess reliability and improve quality?

Download this complimentary white paper full of best practices for test design, delivery and evaluation.

 

Is Safe Harbor still safe for assessment data?

Posted by John Kleeman

A European legal authority last week advised that the Safe Harbor framework, which allows European organizations to send personal data to the US, should no longer be considered legal. I’d like to explain what this means and discuss the potential consequences for those delivering assessments and training in Europe.

What European data protection law says about transfers outside Europe

According to European data protection law, personal data such as assessment results or course completion data can only leave Europe if an adequate level of protection is guaranteed. All organizations with European participants must ensure that they follow strict rules if they allow personal data to be transferred outside Europe. Data controllers can be fined if they don’t comply.

A few countries, including Canada, are considered to have an adequate level of protection. But in order to send information to the United States and most other countries outside Europe, it’s necessary to ensure that each data processor who has access to the data guarantees its protection. This includes every processor and sub-processor with access to the data, including data centers, backup storage vendors and any organization that accesses the data for support or troubleshooting purposes. Even if data is hosted in Europe, the rules must still be followed if there is any access to it, or any copy of it, in the US.

There are two main ways in which US organizations can bind themselves to follow data protection rules and so be legitimate processors of European data: the EU Model Clauses or Safe Harbor.

EU Model Clauses

The EU Model Clauses are a standard set of contractual clauses, several pages long, which a data processor can sign with each data controller. Signing signifies a commitment to following EU data protection law when processing data. These clauses cannot be changed or negotiated in any way. Questionmark uses these EU Model Clauses with all our sub-processors for Questionmark OnDemand data to ensure that our customers will be compliant with EU data protection law.

Safe Harbor

An alternative to the EU Model Clauses in the US is Safe Harbor. Safe Harbor (formally, the US-EU Safe Harbor Framework) is run by the US Department of Commerce and allows US companies to certify that they will follow EU rules for EU data without needing to sign the EU Model Clauses. You certify once, and the certification applies to all your customers. It is very widely used, and most large US organizations in assessment and learning are Safe Harbor certified, including Questionmark’s US company, Questionmark Corporation. You can see a full list at http://safeharbor.export.gov/list.aspx.

There is some concern, particularly in Germany, that Safe Harbor is not well enough enforced, so some organizations like Questionmark also use the EU Model Clauses; Microsoft, for example, offers these for its cloud products. But Safe Harbor is widely used to ensure the legality and safety of European data sent to the US.

The legal threat to Safe Harbor

Last week, the advocate general of the Court of Justice of the European Union issued an opinion that the Safe Harbor scheme should no longer be legal. He argues that the widespread government surveillance by the US is incompatible with the privacy rights set out in the EU Data Protection Directive, so the whole of Safe Harbor should be invalidated. His opinion is not binding, but the court often follows the opinions of its advocates general, so there is a genuine threat that Safe Harbor could be suspended.

Negotiations on data protection are underway between the US and Europe, and it is likely that this will be resolved in some way. But there are significant differences in attitude on data protection between Europe and the US.  Much anger remains about Edward Snowden’s revelations about US surveillance, so the situation is hard to predict.

What can organizations do to protect themselves?

It’s likely that a deal will be found and that Safe Harbor will remain safe. And if it is ruled illegal, this is going to affect the whole technology sector, not just learning and assessment. But it’s a further argument to use a European vendor for assessment and learning needs and/or one who is familiar with and has their suppliers signed up to the EU Model Clauses.

For more information and background on data protection, see Questionmark’s white paper:  Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities. John Kleeman will also be presenting at the Questionmark Conference 2016: Shaping the Future of Assessment in Miami, April 12-15. Click here to register and learn more about this important learning event.

7 actionable steps for making your assessments more trustable

Posted by John Kleeman

Questionmark has recently published a white paper on trustable assessment,  and we blog about this topic frequently. See Reliability and validity are the keys to trust and The key to reliability and validity is authoring for some recent blog posts about the white paper.

But what can you do today if you want to make your assessments more trustable? Obviously you can read the white paper! But here are seven actionable steps that, if you’re not taking them already, you could adopt today, or at least reasonably quickly, to improve your assessments.

1. Organize questions in an item bank with topic structure

If you use Questionmark software, you are probably doing this already. Putting questions in an item bank structured by hierarchical topics gives you an easy management view of all questions and assessments under development. It lets you use the same question in multiple assessments, easily add and retire questions, and easily search them, for example to find the ones that need updating when laws change or a product is retired.
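As a minimal sketch of the idea (the names and structure are my own, not Questionmark's schema), an item bank can store each question once under a hierarchical topic, with assessments referencing question IDs rather than copying question text:

```python
# Hypothetical item bank: topics form a hierarchy via path-like names,
# and assessments reference question IDs rather than duplicating questions.

item_bank = {
    "Q1": {"topic": "Compliance/Data Protection", "text": "...", "retired": False},
    "Q2": {"topic": "Compliance/Data Protection", "text": "...", "retired": False},
    "Q3": {"topic": "Products/Widget v2", "text": "...", "retired": True},
}

assessments = {
    "New-hire exam": ["Q1", "Q3"],
    "Annual refresher": ["Q1", "Q2"],  # Q1 is reused across assessments
}

def find_questions(topic_prefix, include_retired=False):
    """Find questions under a topic subtree, e.g. when a law or product changes."""
    return [qid for qid, q in item_bank.items()
            if q["topic"].startswith(topic_prefix)
            and (include_retired or not q["retired"])]

print(find_questions("Compliance"))  # -> ['Q1', 'Q2']
```

Because assessments hold references, retiring or updating a question in one place takes effect everywhere it is used.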

2. Use questions that apply knowledge in the job context

It is better to ask questions that check how people can apply knowledge in the job context than just to find out whether they have specific knowledge. See my earlier post Test above knowledge: Use scenario questions for some tips on this. If you currently just test on knowledge and not on how to apply that knowledge, make today the day that you start to change!

3. Have your subject matter experts directly involved in authoring

Especially in an area where there is rapid change, you need subject matter experts directly involved in authoring and reviewing questions. Whether you use Questionmark Live or another system, start involving them.

4. Set a pass score fairly

Setting a pass score fairly is critical to being able to trust an assessment’s results. See Is a compliance test better with a higher pass score? and Standard Setting: A Keystone to Legal Defensibility for some starting points on setting good pass scores. And if you don’t think you’re following good practice, start to change.
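One widely used standard-setting approach is the modified Angoff method: subject matter experts estimate, for each item, the probability that a minimally competent candidate would answer it correctly, and the averaged estimates yield the cut score. A minimal sketch with invented ratings (not a full standard-setting procedure, which also involves discussion rounds and impact data):

```python
# Hypothetical modified-Angoff calculation: each row holds one SME's
# per-item probability estimates for a minimally competent candidate.
sme_ratings = [
    [0.70, 0.60, 0.80, 0.50],  # SME 1
    [0.65, 0.55, 0.75, 0.60],  # SME 2
    [0.75, 0.65, 0.85, 0.55],  # SME 3
]

n_items = len(sme_ratings[0])
# Sum each SME's ratings (their expected raw score for a borderline
# candidate), then average across SMEs to get the cut score.
per_sme_sums = [sum(ratings) for ratings in sme_ratings]
cut_score = sum(per_sme_sums) / len(sme_ratings)
pass_percent = 100 * cut_score / n_items

print(f"cut score: {cut_score:.2f} of {n_items} items ({pass_percent:.1f}%)")
```

The point is that the pass score is derived from expert judgment about borderline competence, not picked arbitrarily.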

5. Use topic scoring and feedback

As Austin Fossey explained in his ground-breaking post Is There Value in Reporting Subscores?, you do need to check whether it is sensible to report topic scores. But in most cases, topic scores and topic feedback can be very useful and actionable – they direct people to where there are problems or where improvement is needed.
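A topic score is simply the participant's score restricted to the items tagged with that topic. A minimal sketch with invented tags and responses:

```python
# Hypothetical per-topic scoring: each response records the item's topic
# and whether it was answered correctly.
responses = [
    {"topic": "Safety", "correct": True},
    {"topic": "Safety", "correct": False},
    {"topic": "Procedures", "correct": True},
    {"topic": "Procedures", "correct": True},
]

def topic_scores(responses):
    """Return percentage correct per topic, to direct feedback where needed."""
    totals, correct = {}, {}
    for r in responses:
        t = r["topic"]
        totals[t] = totals.get(t, 0) + 1
        correct[t] = correct.get(t, 0) + (1 if r["correct"] else 0)
    return {t: 100 * correct[t] / totals[t] for t in totals}

print(topic_scores(responses))  # e.g. Safety 50%, Procedures 100%
```

A participant scoring 50% on one topic and 100% on another gets far more actionable feedback than a single overall score, provided each topic has enough items to report reliably.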

6. Define a participant code of conduct

If people cheat, it makes assessment results much less trustable. As I explained in my post What is the best way to reduce cheating?, setting up a participant code of conduct (or honesty code) is an easy and effective way of reducing cheating. What can you do today to encourage your test takers to believe your program is fair and be on your side in reducing cheating?

7. Run item analysis and weed out poor items

This is something that all Questionmark users could do today. Run an item analysis report (it takes just a minute or two from our interfaces) and look at the questions flagged as needing review (usually amber or red). Review them to check their appropriateness, and either improve them or retire them from your pool.
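Two classical statistics behind such reports are item difficulty (the proportion answering correctly, often called the p-value) and item discrimination (the point-biserial correlation between an item score and the total score). Here is a minimal sketch with invented response data and flagging thresholds; it illustrates the statistics, not Questionmark's report logic:

```python
import math

# Rows are participants, columns are items (1 = correct, 0 = incorrect).
matrix = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],  # hypothetical response data
    [1, 1, 0, 1],
    [1, 0, 1, 1],
]

totals = [sum(row) for row in matrix]  # each participant's total score

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def item_stats(item_index):
    scores = [row[item_index] for row in matrix]
    p_value = sum(scores) / len(scores)        # difficulty: proportion correct
    discrimination = pearson(scores, totals)   # point-biserial vs. total score
    return p_value, discrimination

for i in range(len(matrix[0])):
    p, disc = item_stats(i)
    # Illustrative thresholds only; real programs set their own.
    flag = "REVIEW" if p < 0.3 or disc < 0.2 else "ok"
    print(f"item {i}: p={p:.2f}, discrimination={disc:+.2f} ({flag})")
```

Items that very few people answer correctly, or that the strongest candidates get wrong as often as the weakest, are exactly the ones an item analysis report flags for review.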

Questionmark item analysis report

 

Many of you will probably be doing all the above and more, but I hope that for some of you this post could be a spur to action to make your assessments more trustable. Why not start today?