Can you be GDPR compliant without testing your employees?

Posted by John Kleeman

The GDPR is a new extra-territorial data protection law that imposes obligations on anyone who processes the personal data of European residents. It impacts companies with employees in Europe, awarding bodies and test publishers who test candidates in Europe, universities and colleges with students in Europe, and many others. Many North American and other non-European organizations will need to comply.

See my earlier post How to use assessments for GDPR compliance for an introduction to the GDPR. The question this blog post addresses is whether it’s practical for a large organization to be compliant with the GDPR without giving tests and assessments to its employees.

I’d argue that most organizations with hundreds or thousands of employees will need to test their people on their policies and procedures for data protection and the GDPR. Putting it simply: if you don’t, and your people make mistakes, fines are likely to be higher.

Here are four things the GDPR law says (I’ve paraphrased the language and linked to the full text for those interested):


1. Organizations must take steps to ensure that everyone who works for them only processes personal data based on proper instructions. (Article 32.4)

2. Organizations must conduct awareness-raising and training of staff who process personal data (Article 39.1). This is extended to include “monitoring training” for some organizations in Article 47.2.

3. Organizations must put in place risk-based security measures to ensure confidentiality and integrity and must regularly test, assess and evaluate the effectiveness of these measures. (Article 32.1)

4. If you don’t follow the rules, you could be fined up to 20 million Euros or 4% of turnover. How well you’ve implemented the measures in article 32 (i.e. including those above) will impact how big these fines might be. (Article 83.2d)


So let’s join up the dots.

Firstly, a large company has to ensure that everyone who works for it only processes personal data based on proper instructions. Since “personal data”, “processing” and “instructions” each have particular legal meanings, training is needed to help people understand them. You could train without testing, but given that the concepts are not simple, it is sensible to test or otherwise check understanding.

A company is required to train its employees under Article 39, but for most companies the requirement in Article 32 is stronger. For most large organizations, the risk of employees making mistakes and the risk of insider threats to confidentiality and integrity are considerable, so you have to put training and other security measures in place to reduce them. Given that you must regularly assess and evaluate the effectiveness of these measures, it is hard to envisage an efficient way of doing so without testing your personnel. Delivering regular online tests or quizzes to your employees is the obvious way to check that training has been effective and that your people know, understand and can apply your processes and procedures.

Lastly, imagine your company makes a mistake and one of your employees causes a breach of personal data or commits another infraction under the GDPR. How are you going to show that you took all the steps you could to minimize the risk? An obvious question is whether you did your best to train that employee in good practice and in your processes and procedures. If you didn’t train, it’s hard to argue that you took the proper steps to be compliant. But even if you trained, a regulator will ask how you are evaluating the effectiveness of your training. As a regulator in another context has stated:

“where staff understanding has not been tested, it is hard for firms to judge how well the relevant training has been absorbed”

So yes, you can imagine a way in which a large company might manage to be compliant with the GDPR without testing employees. There are other ways of checking understanding, for example 1:1 interviews, but they are very time consuming and hard to roll out in time for May 2018. Or you may be lucky and have personnel who don’t make mistakes! But for most of us, testing our employees on knowledge of our processes and procedures under the GDPR will be wise.

Questionmark OnDemand is a trustable, easy to use and easy to deploy system for creating and delivering compliance tests and assessments to your personnel. For more information on using assessments to help ensure GDPR compliance visit this page of our website or register for our upcoming webinar on 29 June.

Internet assessment software pioneer Paul Roberts to retire

Posted by John Kleeman

We think of the Internet as being very young, but one of the pioneers in using the Internet for assessments is about to retire. Paul Roberts, the developer of the world’s first commercial Internet assessment software, is retiring in March. I thought readers might like to hear some of his story.

Paul was employee number three at Questionmark, joining us as a software developer in 1989 when the company was still working out of my home in London.

During the 1990s, our main products ran on DOS and Windows. When we started hearing about the new ideas of HTML and the web, we realized that the Internet could make computerized assessment so much easier. Before the Internet, testing people at a distance required a specialized network or sending floppy disks in the mail (yes, people really did this!). The idea that participants could connect to the questions and return their results over the Internet was compelling. With me as product manager, tester and documenter for our new product, and Paul as lead (and only!) developer, Paul wrote the first version of our Internet testing product, QM Web, which we released in 1995.

QM Web manual cover

QM Web became widely used by universities and corporations who wanted to deliver quizzes and tests over the Internet. Later in the nineties, learning from the lessons of QM Web, we developed Questionmark Perception, our enterprise-level Internet assessment management system still widely used today. Paul architected Questionmark Perception and for many years was our lead developer on its assessment delivery engine.

One of Paul’s key innovations in developing Questionmark Perception was the use of XML to store questions. XML (eXtensible Markup Language) is a way of encoding data that is both human-readable and machine-readable. In 1997, Paul implemented QML (Question Markup Language) as an early application of this concept. QML allowed questions to be described independently of computer platforms. To quote Paul at the time:

“When we were developing our latest application, we really felt that we didn’t want to go down the route of designing yet another proprietary format that would restrict future developments for both us and the rest of the industry. We’re very familiar with the problems of transporting questions from platform to platform because we’ve been doing it for years with DOS, Windows, Macintosh and now the Web. With this in mind, we created a language that can describe questions and answers in tests, independently of the way they are presented. This makes it extremely powerful because QML now enables the same question database to be presented no matter what computer platform is chosen or whatever the operating system.”
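
For a flavor of the idea, here is a loose sketch of a QML-style multiple-choice question. Element and attribute names varied across QML versions, so treat the markup below as illustrative rather than as authoritative QML:

    <QUESTION TYPE="MC" DESCRIPTION="Example geography question">
      <CONTENT TYPE="text/plain">Which city is the capital of France?</CONTENT>
      <ANSWER QTYPE="MC">
        <CHOICE ID="A"><CONTENT>Paris</CONTENT></CHOICE>
        <CHOICE ID="B"><CONTENT>Lyon</CONTENT></CHOICE>
        <CHOICE ID="C"><CONTENT>Marseille</CONTENT></CHOICE>
      </ANSWER>
      <OUTCOME ID="right" SCORE="1">
        <CONDITION>"A"</CONDITION>
        <CONTENT>Correct: Paris is the capital of France.</CONTENT>
      </OUTCOME>
      <OUTCOME ID="wrong" SCORE="0">
        <CONDITION>OTHER</CONDITION>
        <CONTENT>No: the capital of France is Paris.</CONTENT>
      </OUTCOME>
    </QUESTION>

The design point is the one Paul describes above: the question content and scoring logic are declared with nothing about fonts, screens or platforms, so any delivery engine can render and score the same question.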

Questionmark Perception and Questionmark OnDemand still use QML as their native format, so that every single question delivered by Questionmark technology has QML as its core. QML was very influential in the design of the version 1 IMS Question & Test Interoperability specification (IMS QTI), which was led by Questionmark CEO Eric Shepherd and to which Paul was a major contributor. Paul also worked on other industry standards efforts including AICC, xAPI and ADL SCORM.

Over the years, many other technology innovators and leaders have joined Questionmark, and we have a thriving product development team. Most members of our team have had the opportunity to learn from Paul, and his legacy is in safe hands: Questionmark will continue to break new ground in computerizing assessments. I am sure you will join me in wishing Paul well in his personal journey post-retirement.

Seven New Year’s Resolutions to Keep Your Assessments Safe

Posted by John Kleeman

Many blogs at this time of year seek to predict the year ahead, and many of them foresee more data breaches and security incidents in 2017.  But I’m a great believer that the best way to predict the future is to create or change it yourself. So if you want to reduce the chances of your assessment data security being breached in 2017, make some of the things you’ve talked about happen.

Here are some possible New Year’s resolutions that could help keep your assessments safe and secure.

1. Audit your user accounts. Go through each of your systems that hold or give access to assessment data, and check there are no accounts for ex-employees or ex-contractors. Make sure there are no generic or test accounts that do not belong to a current individual. Dormant accounts like this are a common route to a breach. Also check that no one who has changed role has the privileges of their old role.
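
If your systems can export account lists, a short script can do much of the checking for you. Here is a minimal sketch, assuming a hypothetical CSV export named accounts.csv with username, last_login and status columns; your own systems' export formats will differ:

    import csv
    from datetime import datetime, timedelta

    DORMANT_AFTER = timedelta(days=90)  # dormancy threshold is a policy choice
    today = datetime.now()

    # Hypothetical export format: username,last_login,status
    with open("accounts.csv") as f:
        for row in csv.DictReader(f):
            last_login = datetime.strptime(row["last_login"], "%Y-%m-%d")
            if row["status"] != "active":
                print(f"Review (non-active status): {row['username']}")
            elif today - last_login > DORMANT_AFTER:
                print(f"Review (no login in 90+ days): {row['username']}")

Cross-check anything the script flags against your HR records before deleting accounts.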

2. Run an incident response table-top practice exercise. This is a session where you gather those responsible for security, pretend there is a breach or other incident, and work through verbally how you’d deal with it as a team. You can do this in a couple of hours with good preparation, and it allows you to check your procedures and ensure people know what to do. It will often give useful insight into improving your preparedness. As Benjamin Franklin once said, “An ounce of prevention is worth a pound of cure.”

3. Start testing your personnel on security procedures. One of the biggest security risks for any organization is staff mistakes and accidents that compromise credentials or data. Security awareness training makes an important difference. And if you test your personnel on security after the training, you verify that people understand the training and you identify areas of weakness. This makes it more likely that your personnel become more aware and follow better security practices. If you have access to an online assessment tool like Questionmark, it’s very, very easy to do.

4. Review some of your key vendors. A risk for most organizations is weaknesses in suppliers or subcontractors that have access to your data. Ask suppliers to share information on their technical and organizational measures for security and what they are doing to ensure that your data is not breached. Any reputable organization will be willing and able to provide this under NDA. See 24 midsummer questions to ask your assessment software provider on this blog for some of the questions you can ask.

5. Conduct a restore test from backups. How do you know your backups work? Over the years, I’ve come across a few organizations and teams who’ve lost their data because their backups didn’t work. The only way to be sure is to test restoring from a backup and check the data is there, as sketched below. If you don’t already run restore tests, organize one in 2017 (ideally once a quarter, but once is better than not at all). You shouldn’t need to do this if you use a cloud service like Questionmark OnDemand, as the vendor should do it for you.
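
To make a restore test concrete: restore the backup to a scratch location, then verify that the restored files match the originals. Here is a minimal sketch for file-based backups; the paths are placeholders, and for database backups you would compare row counts or table checksums instead:

    import hashlib
    from pathlib import Path

    def file_hash(path):
        """Return the SHA-256 digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify_restore(source_dir, restored_dir):
        """List files whose restored copy is missing or differs from the source."""
        mismatches = []
        for src in Path(source_dir).rglob("*"):
            if src.is_file():
                restored = Path(restored_dir) / src.relative_to(source_dir)
                if not restored.exists() or file_hash(src) != file_hash(restored):
                    mismatches.append(str(src))
        return mismatches

    # Restore your backup to ./restore_test first, then:
    print(verify_restore("/data/assessments", "./restore_test") or "Restore OK")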

6. Run a pilot for online proctoring. Microsoft do it. SAP do it. Why shouldn’t you? If you run a certification program that uses physical test centers, consider whether online proctoring might work for you. Not only will it reduce the risk of proctors colluding with candidates to help them cheat, but it will also be a huge boon to your candidates, who will no longer need to travel to test centers.

7. Put in place a code of conduct for your participants. This is a simple thing to do and can make a big difference in reducing cheating by encouraging test-takers to stay honest. See Candidate Agreements: Establishing honor codes for test takers and What is the best way to reduce cheating? on this blog for tips on how and why to do this. If you are looking for inspiration, a famous code of conduct is that of the U.S. Military Academy at West Point, which simply says: “A cadet will not lie, cheat, steal, or tolerate those who do.” Of course you need to communicate and get buy-in for your code of conduct, but if you do, it can be very effective.

Many of you will already be doing all of these things, but if you’re not, I hope one or more of these resolutions help you improve your assessment security in 2017.

And here’s a bonus New Year’s resolution to consider. Questionmark Information Security Officer David Hunt and I are giving a session on Staying Ahead of Evolving Security Threats at the Questionmark conference in March in Santa Fe. Make a New Year’s resolution to come to the conference, and learn about security and assessment!

Online Proctoring: FAQs

Posted by John Kleeman

Online proctoring was a hot-button topic at Questionmark’s annual Users Conference. And though we’ve discussed the pros and cons in this blog and even offered an infographic comparing online versus test-center proctoring, many interesting questions arose during the Ensuring Exam Integrity with Online Proctoring session I presented with Steve Lay at Questionmark Conference 2016.

I’ve compiled a few of those questions and offered answers to them. For context and additional information, make sure to check out a shortened version of our presentation. If you have any questions you’d like to add to the list, comment below!

What control does the online proctor have on the exam?

With Questionmark solutions, the online proctor can:

  • Converse with the participant
  • Pause and resume the exam
  • Give extra time if needed
  • Terminate the exam

What does an online proctor do if he/she suspects cheating?

Usually the proctor will terminate the exam and file a report to the exam sponsor.

What happens if the exam is interrupted, e.g. by someone coming in to the room?

This depends on your security protocols. Some organizations may decide to terminate the exam and require another attempt. In some cases, if it seems to be an honest mistake, the organization may allow the proctor to use discretion and permit the exam to continue.

Which is more secure, online or face-to-face proctoring?

On balance, they are about equally secure.

Unfortunately, there has been a lot of corruption in face-to-face proctoring. Online proctoring makes it much harder for participant and proctor to collude, as there is no direct contact and all communication can be logged.

But if the proctors are honest, it is easier to detect cheating aids in a face-to-face environment than via a video link.

What kind of exams is online proctoring good for?

Online proctoring works well for exams where:

  • The stakes are high and so you need the security of a proctor
  • Participants are in many different places, making travel to test centers costly
  • Participants are computer literate – have and know how to use their own PCs
  • Exams take 2-3 hours or less

If your technology or subject area changes frequently, then online proctoring is particularly good because you can easily give more frequent exams, without requiring candidates to travel.

What kind of exams is online proctoring less good for?

Online proctoring is less appropriate for exams where:

  • Exams are long and participants need breaks
  • Participants are local and it’s easy to get them into one place to take the exam
  • Participants do not have access to their own PC and/or are not computer literate

How do you prepare for online proctoring?

Here are some preparation tasks:

  • Brief and communicate with your participants about online proctoring
  • Define clearly the computer requirements for participants
  • Agree what happens in the event of incidents – e.g. suspected cheating, exam interruptions
  • Agree what ID is acceptable for participants and whether ID information is going to be stored
  • Make a candidate agreement or honor code which sets out what you expect from people to encourage them to take the exam fairly

I hope these Q&As and the linked presentation are interesting. You can find out more about Questionmark’s online proctoring solution here.

Job Task Analysis Surveys Legally Required?

Posted by John Kleeman

I had a lot of positive feedback on my blog post Making your Assessment Valid: 5 Tips from Miami. There is clearly a lot of interest in how to ensure an assessment is valid, i.e. that it measures what it is supposed to measure.

If you are assessing for competence in a job role or for promotion into a job role, one critical step in making your assessment valid is to have a good, current analysis of what knowledge, skills and abilities are needed to do the job role. This is called a job task analysis (JTA), and the most common way of doing this analysis is to conduct a JTA Survey.

In a JTA survey, you ask people currently in the job role, or other experts, what tasks they do. A common practice is to survey them on how important each task is, how difficult it is and how often it is done. The resulting reports then guide the construction of the test blueprint, determining which topics to include in the test and how many questions to ask on each.
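
To illustrate how survey results can drive a blueprint, here is a minimal sketch in Python. It assumes a simple multiplicative criticality index across the three ratings; the task names, ratings and weighting formula are all illustrative, since real JTA studies use a variety of rating scales and weighting schemes:

    # Hypothetical mean ratings from a JTA survey, each on a 1-5 scale.
    tasks = {
        "Respond to incidents":  {"importance": 4.6, "difficulty": 3.8, "frequency": 3.1},
        "Write reports":         {"importance": 4.1, "difficulty": 2.9, "frequency": 2.2},
        "Supervise field staff": {"importance": 3.5, "difficulty": 2.1, "frequency": 4.4},
    }

    TOTAL_QUESTIONS = 60  # planned length of the exam

    # One common convention: criticality = importance x difficulty x frequency.
    weights = {name: r["importance"] * r["difficulty"] * r["frequency"]
               for name, r in tasks.items()}
    total = sum(weights.values())

    # Allocate questions to each topic in proportion to its criticality.
    # (Rounded counts may need a manual adjustment to sum exactly.)
    for name, weight in sorted(weights.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {round(TOTAL_QUESTIONS * weight / total)} questions")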

If you cannot show that your assessment matches the requirements of a job, then your assessment is not only invalid but likely unfair if you use it to select people for the job or to measure competence in it. And if you use an invalid assessment to select people for promotion or recruitment, you may face legal action from people you reject.

Not only is this common sense, but it was also confirmed by a recent US district court ruling against the Boston Police Department. In this case, sergeants who had been rejected for promotion to lieutenant following an exam sued on the grounds that the exam was unfair, and won.

The judge ruled that the exam was not sufficiently valid because it omitted many job skills crucial to a police lieutenant’s role, and so it was not fair to use it to select for the role (see news report).

The judge’s 82-page ruling sets out in detail why the exam was unfair. He references the Uniform Guidelines on Employee Selection Procedures, which state:

“There should be a job analysis which includes an analysis of the important work behavior(s) required for successful performance and their relative importance”

But the judge ruled that although a job analysis had been done, it had not been used properly in the test construction process. He said:

“When using a multiple choice exam, the developer must convert the job analysis result into a test plan to ensure a direct and strong relationship between the job analysis and the exam.”

However, in this case, the job analysis was not used sufficiently well to construct the exam. The judge went on to say:

“The Court cannot find, however, that the test plan ensured a strong relationship between the job analysis and the exam. … too many skills and abilities were missing from the … test outline.”

Crucially, he concluded:

“And a high score on the … exam simply was not a good indicator that a candidate would be a good lieutenant.”

Due to the pace of business change and technological advance, job roles are changing fast. Make sure you conduct regular JTAs of the roles in your organization and that your assessments match the most important job tasks. Find out more about Job Task Analysis here.

Making your Assessment Valid: 5 Tips from Miami

Posted by John Kleeman

A key reason people use Questionmark’s assessment management system is that it helps you make more valid assessments. To remind you, a valid assessment is one that genuinely measures what it is supposed to measure. Having an effective process to ensure your assessments are valid, reliable and trustable was an important topic at Questionmark Conference 2016 in Miami last week. Here is some advice I heard:

Reporting back from 3 days of learning and networking at Questionmark Conference 2016 in Miami

Tip 1: Everything starts from the purpose of your assessment. Define this clearly and document it well. A purpose that is not well defined or that does not align with the needs of your organization will result in a poor test. It is useful to have a formal process to kick off a new assessment to ensure the purpose is defined clearly and is aligned with business needs.

Tip 2: A Job Task Analysis survey is a great way of defining the topics/objectives for new-hire training assessments. One presenter at the conference sent out a survey to the top performing 50 percent of employees in a job role and asked questions on a series of potential job tasks. For each job task, he asked how difficult it is (complexity), how important it is (priority) and how often it is done (frequency). He then used the survey results to define the structure of knowledge assessments for new hires to ensure they aligned with needed job skills.

Tip 3: The best way to ensure that a workplace assessment starts and remains valid is continual involvement with Subject Matter Experts (SMEs). They help you ensure that the content of the assessment matches the content needed for the job and ensure this stays the case as the job changes. It’s worth investing in training your SMEs in item writing and item review. Foster a collaborative environment and build their confidence.

Tip 4: Allow your participants (test-takers) to feed back into the process. This will give you useful feedback to improve the questions and the validity of the assessment. It’s also an important part of being transparent and open in your assessment programme, which is useful because people are less likely to cheat if they feel that the process is well-intentioned. They are also less likely to complain about the results being unfair. For example, it’s useful to write an internal blog explaining why and how you create the assessments and to encourage feedback.

Lunch with a view at Questionmark Conference 2016 in Miami

Tip 5: As the item bank grows and as your assessment programme becomes more successful, make sure to manage the item bank and review items. Retire items that are no longer relevant or when they have been overexposed. This keeps the item bank useful, accurate and valid.
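
One way to make this review systematic is to flag items whose exposure counts or difficulty statistics fall outside agreed bounds and queue them for SME review. A minimal sketch, with hypothetical thresholds and field names drawn from typical item analysis reports:

    # Hypothetical item records: exposure count and difficulty (p-value,
    # the proportion of participants answering correctly).
    items = [
        {"id": "Q101", "exposures": 5200, "p_value": 0.92},
        {"id": "Q102", "exposures": 310,  "p_value": 0.55},
        {"id": "Q103", "exposures": 1800, "p_value": 0.18},
    ]

    MAX_EXPOSURES = 2000          # retirement threshold: a policy choice
    P_VALUE_RANGE = (0.25, 0.90)  # items outside this band deserve a look

    for item in items:
        if item["exposures"] > MAX_EXPOSURES:
            print(f"{item['id']}: overexposed, consider retiring")
        elif not P_VALUE_RANGE[0] <= item["p_value"] <= P_VALUE_RANGE[1]:
            print(f"{item['id']}: difficulty out of range, queue for SME review")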

There was lots more at the conference – excitement that Questionmark NextGen authoring is finally here, a live demo of our new easy-to-use Printing and Scanning solution … and lunch on the hotel terrace in the beautiful Miami spring sunshine, with Questionmark-branded sunglasses to keep cool.

There was a lot of buzz at the conference about documenting your assessment decisions and making sure your assessments validly measure job competence. There is increasing understanding that assessment is a process not a project, and also that to be used to measure competence or to select for a job role, an assessment must cover all important job tasks.

I hope these tips on making assessments valid are helpful. Click here for more information on Questionmark’s assessment management system.