New white paper: Assessment Results You Can Trust

Posted by John Kleeman

Questionmark published an important white paper about why trustable assessment results matter and about how an assessment management system like Questionmark’s can help you make your assessments valid and reliable — and therefore trustable.

The white paper, which I wrote together with Questionmark CEO Eric Shepherd, explains that trustable assessment results must be both valid (measuring what they are intended to measure) and reliable (measuring it consistently).

The paper draws upon the metaphor of a doctor using results from a blood test to diagnose an illness and then prescribe a remedy. Delays will occur if the doctor orders the wrong test, and serious consequences could result if the test’s results are untrustworthy. Using this metaphor, it is easy to understand the personnel and organizational risks that can stem from making decisions based on untrustworthy results. If you assess someone’s knowledge, skill or competence for health and safety or regulatory compliance purposes, you need to ensure that your assessment instrument is designed correctly and runs consistently.

Engaging subject matter experts to generate questions to measure the knowledge, skills and abilities required to perform essential tasks of the job is essential in creating the initial pool of questions. However, subject matter experts are not necessarily experts in writing good questions, so an effective authoring system requires a quality control process which allows assessment experts (e.g. instructional designers or psychometricians) to easily review and amend assessment items.

For assessments to be valid and reliable, it’s necessary to follow structured processes at each step from planning through authoring to delivery and reporting.

The white paper covers these six stages of the assessment process:

  • Planning assessment
  • Authoring items
  • Assembling assessment
  • Pilot and review
  • Delivery
  • Analyze results

Following the advice in the white paper and using the capabilities it describes will help you produce assessments that are more valid and reliable — and hence more trustable.
Modern organizations need their people to be competent.

Would you be comfortable in a high-rise building designed by an unqualified architect? Would you fly in a plane whose pilot hadn’t passed a flying test? Would you let someone operate a machine in your factory if they didn’t know what to do if something went wrong? Would you send a salesperson out on a call if they didn’t know what your products do? Can you demonstrate to a regulatory authority that your staff are competent and fit for their jobs if you do not have trustable assessments?

In all these cases and many more, it’s essential to have a reliable and valid test of competence. If you do not ensure that your workforce is qualified and competent, then you should not be surprised if your employees have accidents, cause your organization to be fined for regulatory infractions, give poor customer service or can’t repair systems effectively.

To download the white paper, click here.

John will be talking more about trustable assessments at our 2015 Users Conference in Napa next month. Register today for the full conference, but if you cannot make it, make sure to catch the live webcast.

Can’t travel to Napa? Be there virtually!

Much as we would like to see all of our customers at the annual Questionmark Users Conference, we know it is not always possible to get there.

Here’s what to do if you can’t join us in Napa Valley March 10 – 13: catch selected conference sessions via webcasts.

Register soon to experience these conference highlights online:

  • Opening General Session
  • Introduction to Item Development in Large-Scale Test Development
  • Hacking yourself first: Protecting your assessment data with penetration testing
  • Conference Keynote
  • Overview of Setting Performance Standards: Making the cognitive leap from scores to interpretations
  • Writing Performance-Based Test Items

Sign up for these complimentary conference webcasts (for customers only)

Attending the complete conference is by far the best way to learn about assessment-related best practices and how to make the most of Questionmark technologies.

There’s still time to register for the full conference, but if you can’t make it this time around, we certainly hope you will take advantage of these webcasts.


Standard Setting: A Keystone to Legal Defensibility

Posted by Austin Fossey

Since the last Questionmark Users Conference, I have heard several clients discuss new measures at their companies requiring them to provide evidence of the legal defensibility of their assessments. Legal defensibility and validity are closely intertwined, but they are not synonymous. An assessment can be legally defensible, yet still have flaws that impact its validity. The distinction between the two is often the difference between how you developed the instrument and how well you developed it.

Regardless of whether you are concerned with legal defensibility or validity, careful attention should be paid to the evaluative component of your assessment program. What if someone asks, “What does this score mean?” How do you answer? How do you justify your response? The answers to these questions affect how your stakeholders will interpret and use the results, and this may have consequences for your participants. Many factors go into supporting the legal defensibility and validity of assessment results, but one could argue that the keystone is the standard-setting process.

Standard setting is the process of dividing score scales so that scores can be interpreted and acted upon (AERA, APA, NCME, 2014). The dividing points between sections of the scale are called “cut scores,” and in criterion-referenced assessment, they typically correspond to performance levels that are defined a priori. These cut scores and their corresponding performance levels help test users make the cognitive leap from a participant’s response pattern to what can be a complex inference about the participant’s knowledge, skills, and abilities.
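To make the idea concrete, here is a minimal sketch of how cut scores divide a score scale into performance levels. The cut scores and labels are hypothetical examples, not values from any standard-setting study:

```python
# Hypothetical cut scores dividing a 0-100 score scale into performance
# levels, sorted from highest cut score to lowest. The lowest cut (0)
# catches every remaining score.
CUT_SCORES = [(85, "Distinction"), (70, "Pass"), (0, "Fail")]

def performance_level(score, cuts=CUT_SCORES):
    """Return the performance level for the highest cut score met."""
    for cut, label in cuts:
        if score >= cut:
            return label
    raise ValueError("score below all cut scores")

print(performance_level(72))  # → Pass
```

The mapping is deliberately simple; a real standard-setting study is about how those cut values are justified, not how they are applied.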

In their chapter in Educational Measurement (4th Ed.), Hambleton and Pitoniak explain that standard-setting studies need to consider many factors, and that they also can have major implications for participants and test users. For this reason, standard-setting studies are often rigorous, well-documented projects.

At this year’s Questionmark Users Conference, I will be delivering a session that introduces the basics of standard setting. We will discuss standard-setting methods for criterion-referenced and norm-referenced assessments, and we will touch on methods used in both large-scale assessments and in classroom settings. This will be a useful session for anyone who is working on documenting the legal defensibility of their assessment program or who is planning their first standard-setting study and wants to learn about different methods that are available. Participants are encouraged to bring their own questions and stories to share with the group.

Register today for the full conference, but if you cannot make it, make sure to catch the live webcast!

Napa Conference Overview & Special Live Webcast

Posted by Julie Delazyn

The Questionmark Users Conference in Napa Valley March 10-13 is officially less than a month away!

We are looking forward to seeing you there for three full days of learning and special events.

Attending the complete conference is by far the best way to learn about assessment-related best practices and how to make the most of Questionmark technologies. Participants will hear customer case studies and learn about the unique ways in which other organizations are using assessments. They will also weigh in on the product road map, connect with Questionmark staff and network with their peers while enjoying California’s beautiful wine country.

But here’s a great opportunity for Questionmark users who can’t travel to the conference this year: live webcasts of selected conference sessions.

These select sessions will be broadcast at no charge March 11-12:

  • Opening General Session
  • Introduction to Item Development in Large-Scale Test Development
  • Hacking yourself first: Protecting your assessment data with penetration testing
  • Conference Keynote
  • Overview of Setting Performance Standards: Making the cognitive leap from scores to interpretations
  • Writing Performance-Based Test Items

Make sure to sign up for conference webcasts

Aside from getting to take advantage of beautiful California wine country, there’s much more in store for conference participants:

Register today for the full conference, but if you cannot make it, make sure to catch the live webcast!

Delivering exams in Europe? What must you do for Data Protection?

Posted by John Kleeman

Regulators in Europe are increasingly active in data protection, and most European organizations are reviewing their suppliers to ensure data protection and security. If you are an awarding body, multinational corporation or publisher delivering tests and exams in Europe, what do you need to do to stay comfortably within European Union data protection laws?

There is a fundamentally different approach to personal privacy in Europe and in the USA. In the USA, there is often a cultural expectation that technology and market efficiency are pre-eminent; in Europe, the law requires technology to ensure privacy.

We all remember that in the US, citizens have a right to “life, liberty and the pursuit of happiness” and that in France, people have a right to “Liberty, Equality and Fraternity”. But in the 21st century, privacy is probably one of the strongest differentiators between the continents. In a world being transformed by technology, the EU data protection directive firmly states that computer systems are designed to serve man, not man to serve the computer. Data processing systems must respect the fundamental rights and freedoms of people, and in particular the right to privacy. Whether you think this is right or not, it is the law in Europe.

European governments are increasingly strengthening their laws on data protection and the penalties for not complying. So if you are delivering your exams in Europe, what do you need to do? The key responsibilities for data protection are held by what EU law calls the “Data Controller”. Most sponsors of assessments – awarding bodies, corporations delivering tests, publishers and educational institutions – are Data Controllers: they are responsible for protecting the data of the end user (the Data Subject) and ensuring that any processors and sub-processors follow the rules. The Data Controller will also be liable if anything goes wrong.

Data Subject – Data Controller – Data Processor – Sub-processor

Here is a summary of the key responsibilities of a Data Controller under EU law when delivering assessments:

1. Tell test takers what is being done with their data, including how you are ensuring the assessment is fair.

2. Obtain informed consent from your test takers, including about who will see their results.

3. Ensure that data is accurate, which in the assessment context likely means that assessments are reliable and valid.

4. Delete personal data when it is no longer needed.

5. Protect data against unauthorized destruction, loss, alteration and disclosure. If assessment results are lost, altered or disclosed without permission, you may be liable for penalties. You need to put in place technical and organizational measures, ensure that data is only disclosed appropriately, and ensure that any data processors follow the rules strictly.

6. Take care when transferring data outside Europe. If assessment results or other personal data are transferred outside Europe, you need to ensure that the EU rules are followed. This is particularly important because not all organizations outside Europe understand data protection, and they may inadvertently break the rules.

7. If your assessments collect “special” categories of data, such as racial or ethnic origin or health information, additional rules apply; get advice on how to ensure there is explicit consent from test takers.

8. People have a right to request the data that you hold on them, and in some countries this includes exam results and all the personal details you hold. Be prepared to receive such requests.

9. If the assessment is high stakes, ensure there is human review of automated decision making. Under the EU directive, technology serves man, not the other way round, and making decisions without human review is not always allowed.

10. Appoint a data protection officer and train your personnel.

11. Work with supervisory authorities (you have to register in some countries) and have a process to deal with data protection complaints.
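As an illustration of the responsibility to delete personal data when it is no longer needed, a periodic retention sweep might look like the following minimal sketch. The record structure and the two-year retention period are hypothetical examples, not requirements of EU law or features of any Questionmark product; the appropriate period depends on your own legal and business obligations:

```python
from datetime import datetime, timedelta

# Hypothetical retention period: keep assessment records for two years
# after completion, then remove them from the store.
RETENTION = timedelta(days=2 * 365)

def within_retention(records, now):
    """Return only the records still inside the retention period."""
    return [r for r in records if now - r["completed"] <= RETENTION]

records = [
    {"participant": "A", "completed": datetime(2015, 1, 10)},
    {"participant": "B", "completed": datetime(2012, 6, 1)},
]
kept = within_retention(records, now=datetime(2015, 2, 1))
print(len(kept))  # → 1 (participant B's record has expired)
```

In practice such a sweep would run as a scheduled job, and deletion would need to reach backups and any data processors as well.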

As a company established in both the EU and the US, Questionmark has a good understanding of data protection, and if you use Questionmark OnDemand, it helps you meet several of these responsibilities.

I hope this introduction and summary has been helpful. For more information on the requirements of data protection when delivering assessments, download our white paper (free with registration): Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities.

Q&A: Pre-hire, new-hire and ongoing assessments at Canon

Holly Groder

Posted by Julie Delazyn

Holly Groder and Mark Antonucci are training developers for Canon Information Technology Services, Inc. (Canon ITS). During their case study presentation at the Questionmark 2015 Users Conference in Napa Valley March 10-13, they will talk about Leveraging Questionmark’s Reports and Analytics Tools for Deeper Insight.

Their session will explore Canon’s use of assessments in hiring, training, continuing job skills assessment and company-wide information gathering via surveys.

I asked them recently about their case study:

Why did you start using Questionmark? 

The primary reason for seeking a new assessment tool was our desire to collect more information from our assessments, more quickly. Questionmark offered the flexibility of web-based question creation and built-in reports. Questionmark also offered the ability to add jump blocks and a variety of templates. The survey capabilities were just a bonus for us. We were able to streamline our survey process to one point of contact and eliminate an additional software program.

What kinds of assessments do you use?

Mark Antonucci

Our use is split among four business needs: pre-hire employment assessments, new-hire or cross-training assessments, continuing job knowledge assessments, and business information gathering (surveys).

How are you using those tools?

First, potential employees are required to participate in a technical knowledge assessment prior to an offer of employment. Once employment has been offered and accepted, the new employees are assessed throughout the new-hire training period. Annually, all call center agents participate in a job skills assessment unique to their department. And finally, all employees participate in various surveys ranging from interest in community events to feedback on peer performance.

What are you looking forward to at the conference?

We are interested in best practices, insight into psychometrics, and, most important, networking with other users.

Thank you Holly and Mark for taking time out of your busy schedules to discuss your session with us!

***

If you have not already done so, you still have a chance to attend this important learning event. Click here to register.
