The 12 responsibilities of a data controller, part 2

Posted by John Kleeman

In my post last week, I shared some information on six of the responsibilities of assessment sponsors acting as Data Controllers when delivering assessments in Europe:

1. Inform participants
2. Obtain informed consent
3. Ensure that data held is accurate
4. Delete personal data when it is no longer needed
5. Protect against unauthorized destruction, loss, alteration and disclosure
6. Contract with Data Processors responsibly

Here is a summary of the remaining responsibilities:

7. Take care transferring data out of Europe

You need to be careful about transferring assessment results outside of the European Economic Area (though Canada, Israel, New Zealand and Switzerland are considered safe by the EU). If transferring to another country, you should usually enter into a contract with the recipient based on the standard clauses known as the "EU Model Clauses" and perform due diligence on the recipient. You can also send data to the US if the US company follows the US government Safe Harbor rules, but German data protection authorities require further diligence beyond Safe Harbor.

8. If you collect “special” categories of data, get specialist advice


The Data Protection Directive defines "special" categories of data, covering data that reveals racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, as well as data concerning health or sex life. Many assessment sponsors will choose not to collect such information as part of assessments, but if you do collect this, for example to prove assessments are not biased, the rules need to be carefully followed. Note that some information may be obtained even if not specifically requested. For example, the names Singh and Cohen may be an indication of race or religious belief. This is one reason why getting informed consent from data subjects is important.

9. Deal with any subject access requests


Data protection law allows someone to request information you are holding on them as Data Controller, and if you receive such a request, you will need to review it and respond.

You will need to check specific country rules for how this works in detail. There are typically provisions to prevent people from gaining access to exam results in advance of their formal adjudication and publication.


10. If the assessment is high stakes, ensure there is review of any automated decision making

The EU Directive gives the right "to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data". You need to be careful that important decisions are made by a person, not just by a computer.

For high-stakes assessments, you should either include a human review prior to making a decision or include a human appeal process. In general, an assessment score should be treated as one piece of data about a person’s knowledge, skills and/or attitudes and you should thoroughly review the materials, scores and reports produced by your assessment software to ensure that appropriate decisions are made.

11. Appoint a data protection officer and train your staff

This is not required everywhere, but it is a sensible thing to do. Most Data Controllers established in Germany need to appoint a data protection officer, and all organizations are likely to find it helpful to identify an individual or team who understands the issues, owns data protection in the organization and ensures that the correct procedures are followed. One of the key duties of the data protection officer is to train employees on data protection.

I recommend (and it’s something we do ourselves within Questionmark) that all employees are tested annually on data security to help ensure knowledge and understanding.

12. Work with supervisory authorities and respond to complaints

In many jurisdictions you need to register with the supervisory authority. You must also provide a route for people to make complaints and respond to any complaints you receive.


If you want to learn more, then please read our free-to-download white paper: Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities [requires registration].

An easier approach to job task analysis: Q&A

Posted by Julie Delazyn

Part of the assessment development process is understanding what needs to be tested. When you are testing what someone needs to know in order for them to do their job well, subject matter experts can help you harvest evidence for your test items by observing people at work. That traditional, manual process can take a lot of time and money.

Questionmark’s new job task analysis (JTA) capabilities enable SMEs to harvest information straight from the person doing the job. These tools also offer an easier way to see the frequency, importance, difficulty and applicability of a task in order to know if it’s something that needs to be included in an assessment.

Now that JTA question authoring, assessment creation and reporting are available to users of Questionmark OnDemand and Questionmark Perception 5.7, I wanted to understand what makes this special and important. Questionmark Product Manager Jim Farrell, who has been working on the JTA question since its conception, was kind enough to speak to me about its value, why it was created, and how it can now benefit our customers.

Here is a snippet of our conversation:

So … first things first … what exactly IS job task analysis and how would our customers benefit from using it?

Job task analysis (JTA) is a survey you send out containing a list of tasks, each rated along several dimensions. Those dimensions are typically difficulty, importance, frequency and applicability. You want to find out things like this from the people who fill out the survey: Do they find the task difficult? Do they deem it important? And how frequently do they do it? When you correlate all this data, you'll quickly see which tasks are the most important to test on and collect information about.

We have a JTA question type in Questionmark Live where you can either build your task list and your dimensions or import your tasks through a simple import process, so if you have a spreadsheet with all of your tasks you can easily bring it in. You would then add those tasks to a survey and send it out to collect information. We also have two JTA reports that let you break down results by a single dimension, for example looking at just the difficulty of every task, or view a summary of all of your tasks across all the dimensions at one time as a snapshot.
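
To make that concrete, here is a minimal sketch (not Questionmark's own implementation) of how dimension ratings from a JTA survey could be aggregated to rank tasks. The task names, the 1-5 rating scale and the simple "priority" weighting are illustrative assumptions for the example.

```python
import pandas as pd

# Illustrative JTA responses: one row per SME rating of a task (1-5 scales assumed)
responses = pd.DataFrame([
    {"task": "Calibrate sensor", "difficulty": 4, "importance": 5, "frequency": 2},
    {"task": "Calibrate sensor", "difficulty": 3, "importance": 5, "frequency": 3},
    {"task": "File daily log",   "difficulty": 1, "importance": 2, "frequency": 5},
])

# Average each dimension per task, then rank tasks by a simple combined score
summary = responses.groupby("task").mean(numeric_only=True)
summary["priority"] = summary["importance"] * summary["frequency"] + summary["difficulty"]
print(summary.sort_values("priority", ascending=False))
```

The weighting here is only one possible way to combine dimensions; the point is that once the ratings are collected electronically, ranking tasks for inclusion in an assessment becomes a quick calculation rather than a manual review.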

That sounds very interesting and easy to use! I'm interested in how this question type actually came to be.

We initially developed the job task analysis survey for the US Navy. Prior to this, trainers would have to travel with paper and clipboards to submarines, battleships and aircraft carriers and watch sailors and others in the Navy do their jobs. We developed the JTA survey to help them collect this data more easily and a lot more quickly than they could before.

What do you think is most valuable and exciting about JTA?

To me, the value comes in the ease of creating the questions and sending them out. And I am probably most excited for our customers. Most customers probably harvest information by walking around with paper and a clipboard and watching people do their jobs. That's a very expensive and time-consuming task, so by being able to send this survey out directly to subject matter experts you're getting more authentic data, because you are getting it right from the SMEs rather than from someone observing the behavior.


It was fascinating for me to understand how JTA was created and how it works … Do you find this kind of question type interesting? How do you see yourself using it? Please share your thoughts below!

The 12 responsibilities of a data controller, part 1

Posted by John Kleeman

In my earlier post, Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities, I suggested there are 12 responsibilities of assessment sponsors acting as Data Controllers when delivering assessments in Europe.

Here is an outline of the first 6 of these:

1. Inform participants

A key principle of data protection is that you tell people what is being done with their data. At a minimum, you need to inform assessment participants of:

  • your identity and contact details
  • the purposes of the assessment and of any processing of its results
  • who will see the assessment results
  • the rights of the participant under data protection law to see data and correct inaccuracies
  • use of Internet “cookies” in delivering assessments

2. Obtain informed consent

It's usually recommended to get informed, explicit and recorded consent from everyone whose data you process. You can ask for consent on the first screen of an assessment or in a prior agreement with test-takers. Failure to gain informed consent can have consequences: for example, a Portuguese company was fined €20,000 for hiring a third party to assess the professional skills of its employees without notifying them or gaining consent.

3. Ensure that data held is accurate

You are required to ensure that data is accurate and up to date. In the assessment context, this might include ensuring that if you hold data about someone being certified or not certified, the data is accurate and up to date. It also likely means requiring your assessment itself to be accurate, i.e. created and delivered using appropriate procedures that ensure accuracy. See the Questionmark white papers, “Five Steps to Better Tests” and “Defensible Assessments: What You Need to Know”, for some guidance in this area. These papers are available from https://help.questionmark.com/content/white-papers.

Supervisory authorities can also issue penalties if you fail to maintain accurate data and this causes distress. For instance, a UK company was fined £50,000 in 2012 for mixing up two individuals' data and failing to correct it over a period of time.

4. Delete personal data when it is no longer needed

The regulations require that you do not keep data for longer than necessary and that the data you hold is relevant and not excessive. How long to keep assessment data will depend on the purpose of the assessment. An organization that delivers a formal certification program trusted by the community might want to keep assessment records for decades if those records contribute to the issuing of certificates. Other organizations that deliver casual quizzes to employees or stakeholders would likely choose to delete much sooner.

5. Protect against unauthorized destruction, loss, alteration and disclosure

This is a critical responsibility and one which typically requires the most effort and care from a Data Controller. You need to share assessment results only with those who are entitled to know about them and safeguard assessment data from being disclosed inappropriately, tampered with, lost or destroyed.

You are required to have in place "appropriate" organizational and technical measures commensurate with risk. Failure to put the appropriate measures in place can result in financial penalties. One UK organization was fined £150,000 in 2013 for failing to take appropriate technical security measures. If you use Questionmark OnDemand to deliver your assessments, many technical and organizational measures are taken care of for you. You will of course need to take care of any data once it leaves the Questionmark system, e.g. is downloaded to your systems.

6. Contract with Data Processors responsibly

As Data Controller, you are responsible for all the processing that your Data Processors and their Sub-Processors do. You need to contract appropriately with Data Processors, ensure they only process data under your instructions and check that they have appropriate technical and organizational measures. An organization was fined £250,000 in 2013 for failing to ensure that one of its processors safeguarded data properly. If you contract with Questionmark, we ensure that our data centres and other Sub-Processors comply with data protection law, and you should check that other suppliers you use also have this in place.

I hope this is helpful. I'll write about the other 6 responsibilities next week. If you want more details or want to find out about the other 6 before my next post (!), you can download our white paper.

Join us in London or Edinburgh for briefings on assessment security

Posted by Chloe Mendonca

This June, we're getting together with online invigilation leader ProctorU to deliver breakfast briefings in two UK cities.

The briefings, to be held in London on 17th June and in Edinburgh on 18th June, will focus on innovative technologies that make it possible to deliver high-stakes tests using almost any webcam and computer, anywhere in the world.

Online courses help organisations increase accessibility to their programs, but until recently, when it came time for an exam, students had to travel to a test centre. Now, if you can study remotely, it's equally feasible to take exams remotely, too. Using online invigilators or proctors is a practical solution for institutions and organisations worldwide – a means of providing secure testing at a distance.

The sessions, co-presented by Questionmark and ProctorU, will explain the basics of online invigilation, discuss proven strategies for alleviating the testing centre burden and explore how the "last mile" of high-stakes test delivery can meet the goals and needs of all stakeholders.

The breakfast briefings will include a complimentary breakfast at 8:45 a.m., followed by presentations and discussions until about 12:30 p.m.

These gatherings are ideal for educators, instructional designers and managers from academic institutions, businesses and other organisations.

The sessions offer an excellent way to learn about the newest online assessment technologies and services. They’re also a great opportunity to meet other assessment professionals in your area.

If you’re new to online assessment or online invigilation, this is an opportunity you don’t want to miss!

Reflections on the San Antonio Users Conference

Posted by Doug Peterson

I had the good fortune of attending the Questionmark Users Conference in San Antonio, Texas a couple of weeks ago.

As required by (personal) law, I visited the Hard Rock Café for dinner on my first night in town! And let me tell you, if you missed the fresh sushi at the Grand Hyatt’s Bar Rojo, you missed something pretty doggone special.

But more special than Hard Rock visits and heavenly sushi was the chance to interact with and learn from Questionmark users. Honestly, user conferences are a favorite part of my job. The energy, the camaraderie, the ideas – it all energizes me and helps keep me fired up!

We had a great session on Item Writing Techniques for Surveys, Quizzes and Tests. We had some wonderful conversations – I like for my sessions to be more of a conversation than a lecture – and I picked up some helpful tips and examples to work into my next presentation. For those of you who couldn’t make this session, it’s based on a couple of blog series. Check out the Writing Good Surveys series as well as the Item Writing Guide series. You’ll also want to check out Improving Multiple Choice Questions and Mastering Your Multiple Choice Questions for more thoughts on improving your multiple choice questions.

The other session I led was on using Captivate and Flash application simulations in training and assessments. As with my previous presentations on this topic, the room was packed and people were excited! During my years as a Questionmark customer, I was always impressed with the Adobe Captivate Simulation and Adobe Flash question types. I feel even more strongly about this since attending a webinar put on the other day by a fairly popular LMS. The process you have to go through to do a software simulation in one of their assessments is far too involved and complicated – it really drove home the simplicity of using the Captivate question type in Questionmark.

It really was great to see old friends and make new ones at the conference. I look forward to working with customers throughout the rest of 2014 and to seeing them again soon.

Ten tips on recommended assessment practice – from San Antonio, Texas

Posted by John Kleeman

One of the best parts of Questionmark user conferences is hearing about good practice from users and speakers. I shared nine tips after our conference in Barcelona, but Texas has to be bigger and better (!), so here are ten things I learned last week at our conference in San Antonio.

1. Document your decisions and processes. I met people in San Antonio who'd taken over programmes from colleagues. They valued all the documentation on decisions made before their time and sometimes wished for more. I encourage you to document the piloting you do, the rationale behind your question selection, item changes and cut scores. This will help future colleagues and also give you evidence if you need to justify or defend your programmes.

2. Pilot with non-masters as well as masters. Thanks to Melissa Fein for this tip. Some organizations pilot new questions and assessments just with "masters", for example the subject matter experts who helped compile them. It's much better if you can pilot with a wider sample and include participants who are not experts/masters. That way you get better item analysis data to review, and you will also get more useful comments about the items.

3. Think about the potential business value of OData. It’s easy to focus on the technology of OData, but it’s better to think about the business value of the dynamic data it can provide you. Our keynote speaker, Bryan Chapman, made a powerful case at the conference about getting past the technology. The real power is in working out what you can do with your assessment data once it’s free to connect with other business data. OData lets you link assessment and business data to help you solve business problems.
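
As an illustration only, here is a minimal Python sketch of pulling assessment results from an OData feed and joining them with business data from another system. The service URL, entity set name and field names are hypothetical placeholders rather than a documented Questionmark endpoint; only the OData query options ($filter, $select) are standard protocol features.

```python
import pandas as pd
import requests

ODATA_ROOT = "https://example.com/odata"  # hypothetical OData service root

def fetch_results(assessment_id: int) -> pd.DataFrame:
    """Fetch result rows for one assessment using standard OData query options."""
    resp = requests.get(
        f"{ODATA_ROOT}/Results",  # hypothetical entity set name
        params={
            "$filter": f"AssessmentId eq {assessment_id}",
            "$select": "ParticipantId,Score,CompletedDate",
        },
        headers={"Accept": "application/json"},
        timeout=30,
    )
    resp.raise_for_status()
    # Standard OData JSON responses return rows under the "value" key
    return pd.DataFrame(resp.json()["value"])

# Join assessment scores with data exported from another business system
scores = fetch_results(assessment_id=101)
sales = pd.read_csv("sales_by_employee.csv")  # illustrative business data file
combined = scores.merge(sales, left_on="ParticipantId", right_on="EmployeeId")
print(combined[["Score", "QuarterlySales"]].corr())
```

The interesting part is not the HTTP call but the last few lines: once assessment results sit alongside business metrics, questions like "do higher-scoring employees perform better?" become a simple join and correlation.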

4. Use item analysis to identify low-performing questions. The most frequent and easiest use of item analysis is to identify low-performing questions. Many Questionmark customers use it regularly to identify questions that are too easy, too hard or not sufficiently discriminating. Once you identify these questions, you modify them or remove them depending on what your review finds. This is an easy win and makes your assessments more trustworthy.
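
As a rough illustration of the statistics involved, here is a minimal sketch of classical item analysis, assuming you have a matrix of 0/1 item scores (participants by items). The flagging thresholds are common rules of thumb, not Questionmark's reporting logic.

```python
import numpy as np

def item_analysis(responses: np.ndarray):
    """Return per-item difficulty (p-value) and corrected item-total discrimination."""
    total = responses.sum(axis=1)
    difficulty = responses.mean(axis=0)  # proportion of participants answering correctly
    discrimination = np.array([
        # Correlate each item with the total score excluding that item
        np.corrcoef(responses[:, i], total - responses[:, i])[0, 1]
        for i in range(responses.shape[1])
    ])
    return difficulty, discrimination

# Illustrative data only: 200 participants, 10 items scored 0/1
responses = np.random.binomial(1, 0.7, size=(200, 10))
p_values, disc = item_analysis(responses)
for i, (p, r) in enumerate(zip(p_values, disc), start=1):
    if p > 0.95 or p < 0.30 or r < 0.20:  # rule-of-thumb review thresholds
        print(f"Item {i}: difficulty={p:.2f}, discrimination={r:.2f} - review")
```

Items that nearly everyone gets right (or wrong), or that correlate weakly with the rest of the test, are the candidates to modify or remove.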

5. Retention of learning is a challenge and assessments help. Many people shared that retention was a key challenge. How do you ensure your employees retain compliance training to use when they need it? How do you ensure your learners retain their learning beyond the final exam? There is a growing realization that using Questionmark assessments can significantly reduce the forgetting curve.

6. Use performance data to validate and improve your assessments. I spoke to a few people who were looking at improving their assessments and their selection procedure by tracking back and connecting admissions or onboarding assessments with later performance. This is a rich vein to mine.

7. Topic feedback and scores. Topic scores and feedback are actionable. If someone gets a single item wrong, it might just be a mistake or a misunderstanding, but if someone is weak in a whole topic area, you can direct them to remediation. Dividing assessments into topics and giving feedback and analysis by topic has proved hugely successful for many organizations.
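
For illustration, here is a minimal sketch of rolling item results up to topic level. The topic names and the 60% remediation threshold are assumptions made for the example, not a fixed rule.

```python
from collections import defaultdict

# Illustrative item results: each record carries a topic label and a 0/1 score
item_results = [
    {"topic": "Fire safety", "correct": 1},
    {"topic": "Fire safety", "correct": 0},
    {"topic": "Data handling", "correct": 0},
    {"topic": "Data handling", "correct": 0},
]

totals = defaultdict(lambda: [0, 0])  # topic -> [correct, attempted]
for result in item_results:
    totals[result["topic"]][0] += result["correct"]
    totals[result["topic"]][1] += 1

for topic, (correct, attempted) in totals.items():
    pct = 100 * correct / attempted
    if pct < 60:  # illustrative remediation threshold
        print(f"{topic}: {pct:.0f}% - consider directing the participant to remediation")
```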

8. Questionmark Community Spaces is a great place to get advice. Several users shared that they'd posed a question or problem in the forums there and got useful answers. Customers can access Community Spaces here.

9. The Open Assessment Platform is real. We promote Questionmark as the "Open Assessment Platform," allowing you to easily link Questionmark to other systems, and it's not just marketing! As one presenter said at the conference, "The beauty of using Questionmark is you can do it all yourself." If you have a need to build a system including assessments, check out the myriad ways in which Questionmark is open.

10. Think of your Questionmark assessments like a doctor thinks of a blood test. A doctor relies on a blood test to diagnose a patient. By using Questionmark’s trustable processes and technology, you can start to think of your assessments in a similar light, and rely on your assessments for business value.

I hope some of these tips might help you get more business value out of your assessments.
