Getting Assessment Results You Can Trust: White Paper

Posted by Julie Delazyn

Modern organizations need their people to be competent.

Would you be comfortable in a high-rise building designed by an unqualified architect? Would you fly in a plane whose pilot hadn’t passed a flying test? Would you send a salesperson out on a call if they didn’t know what your products do? Can you demonstrate to a regulatory authority that your staff are competent and fit for their jobs if you do not have trustworthy assessments?

In all these cases and many more, it’s essential to have a reliable and valid test of competence. If you do not ensure that your workforce is qualified and competent, then you should not be surprised if your employees have accidents, cause your organization to be fined for regulatory infractions, give poor customer service or fail to repair systems effectively.

The white paper, Assessment Results You Can Trust, explains that trustworthy assessment results must be both valid (measuring what you intend them to measure) and reliable (measuring it consistently).

For assessments to be valid and reliable, it’s necessary to follow structured processes at each step from planning through authoring to delivery and reporting.
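To make the reliability side concrete, internal consistency is often summarized with Cronbach’s alpha. Here is a minimal sketch; the function and the sample responses are illustrative, not taken from the white paper:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for rows of per-item scores, one row per respondent."""
    n_items = len(scores[0])

    def variance(xs):
        # Sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[i] for row in scores]) for i in range(n_items)]
    total_var = variance([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Five respondents, four items scored 0/1 (invented data)
responses = [
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
]
alpha = cronbach_alpha(responses)  # roughly 0.74 for this data
```

Values above about 0.7 are conventionally taken as acceptable internal consistency, though the right threshold depends on the stakes of the assessment.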

The white paper covers these six stages of the assessment process:

  • Planning assessment
  • Authoring items
  • Assembling assessment
  • Pilot and review
  • Delivery
  • Analyze results

Following the advice in the white paper and using the capabilities it describes will help you produce assessments that are more valid and reliable — and hence more trustable.

To download the complimentary white paper, click here.

Interested in finding out more about authoring assessments you can trust? Make sure to join April Barnum’s session: Authoring Assessments You Can Trust: What’s the Process? We look forward to seeing you in Miami next week at Questionmark Conference 2016!

Establishing a data-driven assessment strategy – A Q&A with Amazon

Posted by Julie Delazyn

Jason Sunseri is a Senior Program Manager – Learning Technology at Amazon. He will lead a discussion at Questionmark Conference 2016 in Miami about Creating a Global Knowledge and Skills Assessment Program for Amazon Sellers.


Jason’s session will look at how Amazon Seller Support and Questionmark OnDemand have partnered to deliver a world-class solution. Jason will illustrate how Amazon has used the OnDemand platform to deliver a robust, data-driven assessment strategy.

I recently asked him about his session:

Tell me about Amazon and its use of assessments:

Amazon Seller Support engages with the 2.5 million+ global sellers represented on the Amazon platform. Due to rapid global expansion across the platform, Amazon Seller Support needed to find a technology and assessment partner that could support both its knowledge and skill acquisition assessment strategies.

How does Amazon use data to drive strategy?

Assessments play a huge role at Amazon. We have really evolved into a data-driven culture and we use assessments in surveys and inside curriculum to assess training and performance, and to identify early issues and trends in order to tweak training content and fix errors.

What role does Questionmark play in that strategy?

We rely heavily on reports — Survey Matrix, Job Task Analysis and other report functions — to assess performance. We’re able to leverage the tool by having individual training centers analyze learning and training gaps and pass on those results. It allows us to see how and why a site is succeeding; where that behavior stems from — it’s really cool to see.

What are you looking forward to at the conference?

It’s Miami, so…the weather, for sure! In all seriousness, I look forward to learning about how other Questionmark users utilize the same tools and how their approach varies from ours.

Thank you, Jason, for taking time out of your busy schedule to discuss your session with us!

SAP to present their global certification program at London briefing

Posted by Chloe Mendonca

A key to SAP’s success is ensuring that the professional learning path of skilled SAP practitioners is continually supported – thereby making qualified experts on their cloud solutions readily available to customers, partners and consultants.

In a world where current knowledge and skills are more important than ever, SAP needed a way to verify that their cloud consultants around the world were keeping their knowledge and skills up to date with rapidly changing technology. A representative of the certification program at SAP comments:

“It became clear that a certification that lasted for two or three years didn’t cut it any longer – in all areas of the portfolio. Everything is evolving so quickly, and SAP has to always support current, validated knowledge.”

Best Practices from SAP

The move to the cloud required some fundamental changes to SAP’s existing certification program. What challenges did they face? What technologies are they using to ensure the security of the program? Join us on the 21st of October for a breakfast briefing in London, where Ralf Kirchgaessner, Manager of Global Certification at SAP, will discuss the answers to these questions. Ralf will explain how the SAP team planned the program, describe its benefits and share lessons learned.

Click here to learn more and register for this complimentary breakfast briefing. Seats are limited.

High-Stakes Assessments

The briefing will include a best-practice seminar on the types of technologies and techniques to consider using as part of your assessment program to securely create, deliver and report on high-stakes tests around the world. It will highlight technologies such as online invigilation, secure browsers and item banking tools that alleviate the testing centre burden and allow organisations and test publishers to securely administer trustable tests and exams and protect valuable assessment content.

What’s a breakfast briefing?

You can expect a morning of networking, best practice tips and live demonstrations of the newest assessment technologies. The event will include a complimentary breakfast at 8:45 a.m. followed by presentations and discussions until about 12:30 p.m.

Who should attend?

These gatherings are ideal for people involved in certification, compliance and/or risk management, and learning and development.

When? Where?

Wednesday 21st October at Microsoft’s Office in London, Victoria — 8:45 a.m. – 12:30 p.m.

Click here to learn more and register to attend

Agree or disagree? 10 tips for better surveys — part 3

Posted by John Kleeman

This is the third and last post in my “Agree or disagree” series on writing effective attitude surveys. In the first post I explained the process survey participants go through when answering questions and the concept of satisficing – where some participants give what they think is a satisfactory answer rather than stretching themselves to give the best answer.

In the second post I shared these five tips based on research evidence on question and survey design.

Tip #1 – Avoid Agree/Disagree questions

Tip #2 – Avoid Yes/No and True/False questions

Tip #3 – Each question should address one attitude only

Tip #4 – Minimize the difficulty of answering each question

Tip #5 – Randomize the responses if order is not important

Here are five more:

Tip #6 – Pretest your survey

Just as with tests and exams, you need to pretest or pilot your survey before it goes live. Participants may interpret questions differently than you intended. It’s important to get the language right so as to trigger in the participant the right judgement. Here are some good pre-testing methods:

  • Get a peer or expert to review the survey.
  • Pre-test with participants and measure the response time for each question (shown in some Questionmark reports). A longer response time could indicate a more confusing question.
  • Allow participants to comment on questions they think are confusing.
  • Follow up with your pretesting group by asking why they gave particular answers or what they thought you meant by your questions.
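The response-time check above can be automated once pretest data is exported. A rough sketch follows; the data layout and 1.5× threshold are illustrative choices, not a Questionmark export format:

```python
# Mean response time per question; flag questions whose mean is well above
# the overall mean, since unusually slow questions may be confusing.
times = {  # question id -> response times in seconds (invented pretest data)
    "Q1": [12, 15, 11, 14],
    "Q2": [45, 52, 48, 50],  # much slower: candidate for rewording
    "Q3": [10, 13, 12, 11],
}

means = {q: sum(t) / len(t) for q, t in times.items()}
overall = sum(means.values()) / len(means)
flagged = [q for q, m in means.items() if m > 1.5 * overall]  # -> ["Q2"]
```

Flagged questions are then candidates for the follow-up interviews described in the last bullet, rather than automatic rejects.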

Tip #7 – Make survey participants realize how useful the survey is

The more motivated a participant is, the more likely he or she is to answer optimally rather than just satisficing and choosing a good enough answer. To quote Professor Krosnick in his paper The Impact of Satisficing on Survey Data Quality:

“Motivation to optimize is likely to be greater among respondents who think that the survey in which they are participating is important and/or useful”

Ensure that you communicate the goal of the survey and make participants feel that completing it thoughtfully will benefit something they believe in or value.

Tip #8 – Don’t include a “don’t know” option

Including a “don’t know” option usually does not improve the accuracy of your survey. In most cases it reduces it. To those of us used to the precision of testing and assessment, this is surprising.

Part of the reason is that providing a “don’t know” or “no opinion” option allows participants to disengage from your survey and so diminishes useful responses. Also, people are better at guessing or estimating than they think they are, so they will tend to choose an appropriate answer if a “don’t know” option is not available. See this paper by Mondak and Davis, which illustrates this in the political field.

Tip #9 – Ask questions about the recent past only

The further back in time they are asked to remember, the less accurately participants will answer your questions. We all have a tendency to “telescope” the timing of events and imagine that things happened earlier or later than they did. If you can, ask about the last week or the last month, not about the last year or further back.

Tip #10 – Trends are good

Error can creep into survey results in many ways. Participants can misunderstand the question. They can fail to recall the right information. Their judgement can be influenced by social pressures. And they are limited by the choices available. But if you use the same questions over time with a similar population, you can be pretty sure that changes over time are meaningful.

For example, if you deliver an employee attitude survey with the same questions for two years running, then changes in the results to a question (if statistically significant) probably mean a change in employee attitudes. If you can use the same or similar questions over time and can identify trends or changes in results, such data can be very trustworthy.
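The “statistically significant” check in that example can be made concrete with a two-proportion z-test on, say, the share of employees answering “satisfied” in each annual run. This is one standard approach among several; the counts below are invented for illustration:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two sample proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Year 2: 150 of 200 satisfied; year 1: 120 of 200 satisfied (invented counts)
z = two_proportion_z(150, 200, 120, 200)
significant = abs(z) > 1.96  # ~5% two-tailed threshold
```

Here the change clears the threshold, so (given comparable populations and identical wording) it would be reasonable to read it as a real shift in attitude rather than noise.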

I hope you’ve found this series of articles useful. For more information on how Questionmark can help you create, deliver and report on surveys, see www.questionmark.com. I’ll also be presenting at Questionmark’s 2016 Conference: Shaping the Future of Assessment in Miami, April 12-15. Check out the conference page for more information.

Delivering exams in Europe? What must you do for Data Protection?

Posted by John Kleeman

Regulators in Europe are increasingly active in data protection, and most European organizations are reviewing their suppliers to ensure data protection and security. If you are an awarding body, multinational corporation or publisher delivering tests and exams in Europe, what do you need to do to stay comfortably within European Union data protection laws?

There is a fundamentally different approach to personal privacy between Europe and the USA. In the USA, there is often a cultural expectation that technology and market efficiency are pre-eminent, but in Europe, the law requires technology to ensure privacy.

We all remember that in the US, citizens have a right to “life, liberty and the pursuit of happiness” and that in France, people have a right to “Liberty, Equality and Fraternity”. But in the 21st century, privacy is probably one of the strongest differentiators between the continents. In a world being transformed by technology, the EU data protection directive firmly states that computer systems are designed to serve man, not man to serve the computer. Data processing systems must respect the fundamental rights and freedoms of people, in particular the right to privacy. Whether or not you think this is right, it is the law in Europe.

Increasingly, European governments are strengthening their laws on data protection and the penalties for not complying. So if you are delivering your exams in Europe, what do you need to do? The key responsibilities for data protection are held by what EU law calls the “Data Controller”. Most sponsors of assessments – awarding bodies, corporations delivering tests, publishers and educational institutions – are Data Controllers; they are responsible for protecting the data of the end user (the Data Subject) and for ensuring that any processors and sub-processors follow the rules. The Data Controller will also be liable if anything goes wrong.

Data Subject - Data Controller - Data Processor - Sub-processor

Here is a summary of the 12 responsibilities of a Data Controller under EU Law when delivering assessments:

1. Tell test takers what is being done with their data including how you are ensuring the assessment is fair.

2. Obtain informed consent from your test takers including relating to who will see their results.

3. Ensure that data is accurate, which in the assessment context likely means that assessments are reliable and valid.

4. Delete personal data when no longer needed.

5. Protect data against unauthorized destruction, loss, alteration and disclosure. If assessment results are lost, altered or disclosed without permission, you may be liable for penalties. You need to put in place technical and organizational measures, ensure that data is only disclosed appropriately and make sure that any data processors follow the rules strictly.

7. Take care transferring data outside Europe. You need to ensure that if assessment results or other personal data are transferred outside Europe, the EU rules are followed. This is particularly important because not all organizations outside Europe understand data protection, and so they may inadvertently break the rules.

8. If your assessments collect “special” categories of data, including racial or ethnic origin and health information, there are additional rules; get advice on how to ensure there is explicit consent from test takers.

9. People have a right to request data that you hold on them, and in some countries this includes exam results and all the personal details you hold on them. Be prepared to receive such requests.

10. If the assessment is high stakes, ensure there is a human review of automated decision making. Under the EU directive, technology serves man, not the other way round and taking decisions without human review is not always allowed.

11. Appoint a data protection officer and train your personnel.

12. Work with supervisory authorities (you have to register in some countries) and have a process to deal with data protection complaints.

As a company established in both the EU and the US, Questionmark has a good understanding of data protection, and if you use Questionmark OnDemand, several of these responsibilities are made easier.

I hope this introduction and summary has been helpful. For more information on the requirements of data protection when delivering assessments, download our white paper (free with registration): Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities.

Q&A: Pre-hire, new-hire and ongoing assessments at Canon

Holly Groder

Posted by Julie Delazyn

Holly Groder and Mark Antonucci are training developers for Canon Information Technology Services, Inc. (Canon ITS). During their case study presentation at the Questionmark 2015 Users Conference in Napa Valley, March 10-13, they will talk about Leveraging Questionmark’s Reports and Analytics Tools for Deeper Insight.

Their session will explore Canon’s use of assessments in hiring, training, continuing job skills assessment and company-wide information gathering via surveys.

I asked them recently about their case study:

Why did you start using Questionmark? 

The primary reason for seeking a new assessment tool was our desire to collect more information from our assessments, more quickly. Questionmark offered the flexibility of web-based question creation and built-in reports. Questionmark also offered the ability to add jump blocks and a variety of templates. The survey capabilities were just a bonus for us. We were able to streamline our survey process to one point of contact and eliminate an additional software program.

What kinds of assessments do you use?

Mark Antonucci

The principal function is divided among four business needs: pre-hire employment assessments, new-hire or cross-training assessments, continuing job knowledge assessments, and business information gathering (surveys).

How are you using those tools?

First, potential employees are required to participate in a technical knowledge assessment prior to an offer of employment. Once employment has been offered and accepted, the new employees are assessed throughout the new-hire training period. Annually, all call center agents participate in a job skills assessment unique to their department. And finally, all employees participate in various surveys ranging from interest in community events to feedback on peer performance.

What are you looking forward to at the conference?

We are interested in best practices, insight into psychometrics, and, most important, networking with other users.

Thank you, Holly and Mark, for taking time out of your busy schedules to discuss your session with us!

***

If you have not already done so, you still have a chance to attend this important learning event. Click here to register.
