How to Navigate Assessments through the GDPR Automated Decision-Making Rules
Posted by John Kleeman
The GDPR has received a lot of publicity for its onerous consent requirements, large fines and the obligation to notify data breaches. But other aspects of the GDPR also have implications for assessment users. To protect human rights, the GDPR imposes restrictions on letting machines make decisions about people, and these restrictions can apply when using computerized assessments. Here is how one of the recitals to the GDPR describes the principle:
“The data subject should have the right not to be subject to a decision … evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention.”
In some cases, it is actually illegal in the European Union to use a computerized test or exam to make a significant decision about a person. In other cases, it is permissible but you need to put in place specific measures. The assessment industry has always been very careful about reliability, validity and fairness of tests and exams, so these measures are navigable, but you need to follow the rules. The diagram below shows what is allowed, with or without protection measures in place, and what is forbidden.
When you are free from restriction
For many assessments, the GDPR rules will not impose any prohibitions, as shown by the green “Allowed” box in the diagram:
- If you are only making minor decisions from an assessment, you do not need to worry. For example, if you are delivering e-learning and you decide which learning path to take next based on an assessment, that is unlikely to significantly affect the assessment participant. But if the assessment affects significant things, like jobs, promotions or access to education, or has a legal effect, the restrictions will apply.
- Even if decisions made do have legal or significant effects, the GDPR only restricts solely automated decision-making. If humans are genuinely part of the decision process, for example with the ability to change the decision, this is not solely automated decision-making. This doesn’t mean that an assessment is okay if humans wrote the questions or set the pass score; it means that humans must review the results before making a decision about a person based on the test. For example, if a recruitment test screens someone automatically out of a job application process without a person intervening, the GDPR considers this to be solely automated decision-making. But if an employee fails a compliance test, and this is referred to a person who reviews the test results and other information and genuinely decides the action to take, that is not solely automated decision-making.
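The two tests above — does the decision have a legal or similarly significant effect, and is it solely automated — can be sketched as a simple check. This is an illustrative simplification only, not legal advice; the function name and boolean inputs are my own and do not come from the GDPR text:

```python
def gdpr_restrictions_apply(has_legal_or_significant_effect: bool,
                            human_genuinely_reviews: bool) -> bool:
    """Rough sketch of when the GDPR automated decision-making
    restrictions (Article 22) are likely to apply to an
    assessment-based decision.

    Both conditions must hold: the decision has a legal or similarly
    significant effect on the person, AND no human genuinely reviews
    the result before the decision is made (i.e. it is solely
    automated). A human merely writing the questions or setting the
    pass score does not count as review.
    """
    solely_automated = not human_genuinely_reviews
    return has_legal_or_significant_effect and solely_automated


# Adaptive e-learning path choice: minor effect, so not restricted
assert gdpr_restrictions_apply(False, False) is False

# Recruitment screening with no human in the loop: restricted
assert gdpr_restrictions_apply(True, False) is True

# Failed compliance test referred to a manager who genuinely decides:
# not solely automated, so not restricted
assert gdpr_restrictions_apply(True, True) is False
```

Real cases are of course judged on substance, not on two booleans — in particular, a human "rubber-stamping" an automated result without authority to change it would still count as solely automated.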
What to do if the restrictions apply
If the GDPR restrictions do apply, you have to work through the logic shown in the diagram to see whether you are permitted to make the decision at all. If you do not fall into one of the permitted cases, making the decision will be illegal under the GDPR (the red boxes). In other cases, automated decision-making is permitted, but you have to put measures in place (the yellow boxes). Here are some of the key measures a data controller (usually the assessment sponsor) may take if the yellow boxes apply, for example when using assessments to screen candidates for recruitment:
- Provide a route where test takers can appeal the assessment result and the decision and have a human review;
- Inform test takers that you are using automated decision making and what the consequences for them will be;
- Provide meaningful information about the logic involved. I suggest this might include publishing an explanation of how questions are created and reviewed, how the scoring works and, for a pass/fail test, how the pass score is arrived at fairly;
- Have mechanisms in place to ensure the ongoing quality and fairness of the test. The regulators are not precise about what you need to do, but one logically important measure would be to ensure that the question and test authoring process results in a demonstrably valid and reliable test. And to maintain validity and reliability, it is important to conduct regular item analysis and other reviews to ensure quality is maintained.
- Perform and document a Data Protection Impact Assessment (DPIA) to check that test takers’ rights and interests are being respected, if the assessment involves a systematic and extensive evaluation of personal aspects relating to the test taker or otherwise poses a high risk to their rights. Questionmark has produced a template for DPIAs which might help here – see www.questionmark.com/go/eu-od-dpiatemplate.
Although these measures might appear daunting at first sight, they could in fact help the quality of assessments. As I describe in my blog post What is the best way to reduce cheating?, providing information to test takers about how the test is created and scored, and why this is fair, can help reduce cheating by making the test taker less likely to rationalize that cheating is acceptable. And it is generally good practice to use an assessment as one piece of data alongside other criteria when making a decision about someone. The increased visibility and transparency of the assessment process that comes from following the requirements above could also encourage better practice in assessment, and so more reliable, valid and trustworthy assessments for all.
If you want to find out more about the rules, there is guidance available from the European Data Protection Board and from the UK Information Commissioner. Questionmark customers who have questions in this area are also welcome to contact me. You might also like to read Questionmark’s white paper “Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities” which you can download here.
This blog post includes my personal views only and is based on guidance currently available on the GDPR. This is a fluid area that is likely to develop over time, including through publication of additional regulator guidance and court decisions. This blog does not constitute legal advice.