Practice versus perfection – 2014 Users Conference

Posted by Austin Fossey

The Questionmark team has just returned from the 2014 Users Conference, where we had a wonderful time showing off our latest work, discussing assessment strategies with our customers, and learning from each other in a great selection of seminars put on by both Questionmark staff and our clients.

At this year’s conference, I field-tested two new presentations: Understanding Assessment Results and Principles of Psychometrics and Measurement Design. I got some great feedback from attendees, which will help me fine-tune them for the future, but these topics also started a lot of interesting conversations about what we as test developers would like to be doing and what we end up doing in practice.

A recurring theme of these conversations was that people sometimes felt aspects of their instruments could be improved, especially in terms of capturing evidence for a measurement or supporting the validity of the results. In some cases they had an idea of what they wanted to improve, but they either did not know which test development methods to apply or did not know how to convince their stakeholders and managers of the importance of specific initiatives. The concept of validity came up several times in these conversations—something we have touched on previously on this blog.

The ideals and realities of the assessment industry do not always align. For example, we may wish to do a construct validity study or an Angoff cut score meeting, but we may lack the resources, time, or stakeholder buy-in to engage in these activities.

I recognize how discouraging this can be for people who constantly want to improve the validity of their inferences, but I am excited to see so many people thinking critically about their assessment designs and searching for areas of improvement. Even if we cannot always implement every research study we are interested in, understanding the principles and best practices of good assessment design and interpretation can still guide our everyday work and help us to avoid invalid results. This blog is a good place to explore some of these principles, and so are Questionmark white papers and our Learning Café videos.

I look forward to continuing to work with (and learn from) our great client base throughout 2014 as we continue to advance our products. A special thanks to our attendees and presenters who joined us at the 2014 conference!

Heading home from San Antonio

Posted by Joan Phaup

Bryan Chapman

As we head back home from this week’s Questionmark Users Conference in San Antonio, it’s good to reflect on the connections people made with one another during discussions, focus groups, social events and a wide variety of presentations covering best practices, case studies and the features and functions of Questionmark technologies. Many thanks to all our presenters!

Bryan Chapman’s keynote on Transforming Open Data into Meaning and Action offered an expansive approach to a key theme of this year’s conference. Bryan described the tremendous power of OData while dispelling much of the mystery around it. He explained that OData can be exchanged in simple ways, such as using a URL or inserting a command line to create, read, update, and/or delete data items.
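
To make the URL-based approach concrete, here is a minimal sketch (not part of Bryan’s talk) of reading an OData feed with a simple HTTP GET. The service root, entity name and column names are hypothetical placeholders, and the exact JSON payload shape depends on the OData version your feed uses.

```python
# A minimal sketch of reading an OData feed over a plain URL, using Python and
# the requests library. The service root and entity name ("Assessments") are
# hypothetical placeholders, not the layout of any particular feed.
import requests

SERVICE_ROOT = "https://example.com/odata"  # hypothetical OData service root

# Standard OData query options: $select trims columns, $top limits rows.
params = {"$select": "AssessmentName,ParticipantCount", "$top": 10}

response = requests.get(
    f"{SERVICE_ROOT}/Assessments",
    params=params,
    headers={"Accept": "application/json"},
    timeout=30,
)
response.raise_for_status()

# In OData v4 JSON, result rows are returned under the "value" key; older
# versions wrap them differently, so adjust for the version of your feed.
for row in response.json()["value"]:
    print(row)
```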

It was interesting to see how focusing on the key indicators that have the biggest impact can produce easy-to-understand visual representations of what is happening within an organization. Among the many dashboards Bryan shared was one that showed the amount of safety training in relation to the incidence of on-the-job injuries.

No conference is complete without social events that nurture new friendships and cement long-established bonds. Yesterday ended with a visit to the Rio Cibolo Ranch outside the city, where we enjoyed a Texas-style meal, western music and all manner of ranch activities. Many of us got acquainted with some Texas Longhorn cattle, and the bravest folks of all took some lassoing lessons (snagging a mechanical calf, not a longhorn!).

Today’s breakouts and general session complete three intensive days of learning. Here’s wishing everyone a good journey home and continued connections in the year ahead.

Setting Your Data Free – 2014 Users Conference

Posted by Austin Fossey

The Questionmark Product Team is off to the 2014 Users Conference! We had a great time last night at the opening reception and are ready now to launch into the conference program.

A major theme this year is “setting your data free!” — so I wanted to give you a little taste of how this theme relates to my presentations on reporting and analytics.

As you know from my previous posts, we have implemented the OData API, which connects your raw assessment data (the same data driving Questionmark Analytics) to a whole ecosystem of business intelligence tools, custom dashboards, statistical packages, and even common desktop applications like Excel and web browsers. At this year’s conference, we will talk about how OData can be a tool for freeing those data for users of all types. Be sure to check out my OData session, where we will be running through examples using Excel with the PowerPivot add-in.
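
As a rough analogue of the Excel + PowerPivot walkthrough, the sketch below pulls an OData results feed into a pandas pivot table. The feed URL and column names (AssessmentName, GroupName, Score) are illustrative assumptions, not the actual feed schema.

```python
# A rough Python/pandas analogue of pulling OData results into a pivot table,
# similar in spirit to the Excel + PowerPivot demo. The feed URL and column
# names below are illustrative assumptions only.
import pandas as pd
import requests

FEED_URL = "https://example.com/odata/Results"  # hypothetical results feed

response = requests.get(FEED_URL, headers={"Accept": "application/json"}, timeout=30)
response.raise_for_status()
df = pd.DataFrame(response.json()["value"])  # OData v4 rows live under "value"

# Dashboard-style summary: mean score by assessment and participant group.
summary = df.pivot_table(index="AssessmentName", columns="GroupName",
                         values="Score", aggfunc="mean")
print(summary)
```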

But when we free our data, we want to make sure we are putting good-quality, meaningful data out there for our stakeholders so that they can make valid inferences about the participants and the assessments. I will be doing two presentations related to this topic. In one, we will talk about understanding assessment results, with a focus on the classical test theory model and its applications for evaluating assessment quality with item statistics. In the second presentation, we will talk about principles of psychometrics and measurement design, where we will discuss validity studies and how principled test development frameworks like evidence-centered design can help us build better assessments that produce actionable data.
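
As a small taste of the first presentation, here is a minimal sketch of two classical test theory item statistics, item difficulty (proportion correct) and item discrimination (corrected item-total correlation), computed on a made-up score matrix rather than real assessment data.

```python
# A minimal sketch of two classical test theory item statistics: item
# difficulty (proportion correct) and item discrimination (corrected
# item-total correlation). The score matrix is made-up illustrative data.
import numpy as np

# Rows = participants, columns = items; 1 = correct, 0 = incorrect.
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
])

difficulty = scores.mean(axis=0)  # p-value: proportion answering each item correctly

total = scores.sum(axis=1)
discrimination = []
for i in range(scores.shape[1]):
    rest = total - scores[:, i]  # total score excluding the item itself
    discrimination.append(np.corrcoef(scores[:, i], rest)[0, 1])

print("Difficulty:    ", np.round(difficulty, 2))
print("Discrimination:", np.round(discrimination, 2))
```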

I’m pleased to see everyone in San Antonio and expect to be talking a lot about how we can set data free to make a powerful impact for stakeholders!

Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities

Posted by John Kleeman

If you are a European or multinational company delivering assessments in Europe or an awarding body providing certification in Europe, then you likely have responsibilities as a Data Controller of assessment results and data under European law.

The European Data Protection Directive imposes an obligation on European countries to create national laws about collecting and controlling personal data. The Directive defines the role of “Data Controller” as the organization responsible for personal data and imposes strong responsibilities on that organization to process data according to the rules in the Directive. An assessment sponsor must follow the laws of the country in which it is established, and in some cases may also need to follow the laws of other countries.

To help assessment sponsors, we have written a white paper that explains your responsibilities as a Data Controller when assessing knowledge, skills and abilities. If you are testing around the world, this is material you need to pay attention to.

One concept the white paper explains is that if you sub-contract with other companies (“Data Processors”) to help deliver your assessments, then you as Data Controller are responsible for the actions of the Data Processors and their Sub-Processors under data protection law.

Diagram showing a Data Controller with two Data Processors. One Data Processor has two Sub-Processors

Regulators are increasingly active in enforcing data protection rules, so failing in one’s responsibilities can have significant financial and reputational consequences. For example, a UK company was fined £250,000 in 2013 after a data leak caused by a failure on the part of a Data Processor. Other companies have faced significant fines or other regulatory action as a result of losing data, failing to obtain informed consent or other data protection failures.

The white paper describes the twelve responsibilities of a Data Controller with regard to assessments, summarized as:

  1. Inform participants
  2. Obtain informed consent
  3. Ensure that data held is accurate
  4. Delete personal data when it is no longer needed
  5. Protect against unauthorized destruction, loss, alteration and disclosure
  6. Contract with Data Processors responsibly
  7. Take care transferring data out of Europe
  8. If you collect “special” categories of data, get specialist advice
  9. Deal with any subject access requests
  10. If the assessment is high stakes, ensure there is review of any automated decision making
  11. Appoint a data protection officer and train your staff
  12. Work with supervisory authorities and respond to complaints

If you use a third party to help deliver assessments, you need to ensure it will help you meet data protection rules. The white paper describes how Questionmark OnDemand can help in this respect.

As well as ensuring you follow the law and reduce the risk of regulatory action, there are benefits in being proactive about following your responsibilities as a Data Controller. You build confidence with your participants that the assessment is fair and that they can trust you as the assessment sponsor, which increases take-up and encourages an honest approach to taking assessments. You also increase data quality and data security, and you gain protection against inappropriate data leakage.

Download the White Paper:

The white paper is free to download [requires registration].