Conference Wrap Up: Tips, Advice & Pictures from Napa

Posted by Julie Delazyn

The Questionmark Users Conference is the most important learning event of the year. With more than a dozen sessions to attend and topics ranging from penetration testing to measuring and understanding your assessment results, a great deal of knowledge is packed into three days.

Assessment security was an important topic at the Conference in Napa. Questionmark Chairman John Kleeman took to the blog last week to lay out some security advice he heard from attendees. You can check out his blog post, Assessment Security: 5 Tips from Napa, to learn more.

Austin Fossey, a frequent blog contributor and our Psychometrician and Reporting and Analytics Manager, presented a number of sessions at the conference this year. According to Austin, “regardless of their individual roles or organizations’ goals, Questionmark users are first and foremost measurement professionals.” Impressed by our customers’ commitment to constantly look for ways to improve their measurements, validity, and impact for stakeholders, Austin wrote a blog post highlighting some of the examples that struck him. You can read more about the stories he heard from our customers about their assessment programs in his blog post: 2015 Users Conference – A Gathering of Measurement Professionals.

If you did not have a chance to attend the Conference in Napa, there is always next year! Look out for dates and a special location announcement on the blog. For those of you who attended and would like to relive some of the special moments spent in Napa, you can check out the pictures now on our Flickr page. The photos highlight moments from the conference as well as from our special evening event at Markham Winery.

2015 Users Conference – A Gathering of Measurement Professionals

Posted by Austin Fossey

We are back after another successful Questionmark Users Conference! This was the fifth of these conferences I have been to, and I think it was one of my favorites. Sure, you can’t go wrong with putting a bunch of like-minded confreres in the heart of wine country, but what I liked most were the stories I heard from our customers about their assessment programs.

Regardless of their individual roles or organizations’ goals, Questionmark users are first and foremost measurement professionals. I was impressed by our customers’ commitment to look for ways to constantly improve their measurements, validity, and impact for stakeholders.

Questionmark is, after all, just a tool, and as with any tool, the quality of the work produced depends on the skill of the craftsperson. It was encouraging to hear how our customers were using Questionmark along with other tools and research to iteratively improve their work.

For example, one client, knowing that the Cronbach’s Alpha reported by Questionmark is a theoretical lower bound of an assessment’s true reliability, shared that they were comparing it with other reliability coefficients appropriate for their homogeneous set of assessment scores. Using an alternative coefficient suited to their data scenario, they were able to defend the assertion that their assessments were probably more reliable than Cronbach’s Alpha indicated.
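
To make that comparison concrete, here is a minimal Python sketch; it is an illustration, not the client’s actual analysis or a Questionmark feature. It computes Cronbach’s Alpha alongside Guttman’s lambda-2, one alternative coefficient that is never lower than Alpha:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's Alpha for an (n_participants, k_items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def guttman_lambda2(scores: np.ndarray) -> float:
    """Guttman's lambda-2, a reliability lower bound that is never below Alpha."""
    k = scores.shape[1]
    cov = np.cov(scores, rowvar=False)          # k x k item covariance matrix
    total_var = cov.sum()
    item_vars = np.diag(cov)
    off_diag_sq = (cov ** 2).sum() - (item_vars ** 2).sum()
    return (total_var - item_vars.sum()
            + np.sqrt(k / (k - 1) * off_diag_sq)) / total_var

# Simulated scores: 200 participants, 5 items driven by a common ability
rng = np.random.default_rng(0)
ability = rng.normal(size=(200, 1))
scores = ability + rng.normal(scale=1.0, size=(200, 5))
print(f"alpha   = {cronbach_alpha(scores):.3f}")
print(f"lambda2 = {guttman_lambda2(scores):.3f}")  # lambda2 >= alpha
```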

Another client, having set a cut score in a fair and valid fashion using a modified Angoff method, found that their program nevertheless had a face-validity problem: their stakeholders maintained that the standard was set too high. Rather than rebuilding the entire assessment, the client discussed strategies for building a stronger validity argument to support the cut score, such as a replication study using a modified Angoff or another standard-setting method with an independent group of subject matter experts.
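
For readers unfamiliar with the Angoff approach, here is a minimal sketch of how a cut score is derived from judge ratings; the ratings below are invented for illustration:

```python
import numpy as np

# Hypothetical ratings: each judge estimates the probability that a minimally
# competent candidate answers each item correctly (rows = judges, cols = items).
ratings = np.array([
    [0.6, 0.8, 0.5, 0.9],   # judge 1
    [0.7, 0.7, 0.4, 0.8],   # judge 2
    [0.5, 0.9, 0.6, 0.9],   # judge 3
])

item_means = ratings.mean(axis=0)  # consensus expectation per item
cut_score = item_means.sum()       # expected raw score of a borderline candidate
print(f"Angoff cut score: {cut_score:.2f} out of {ratings.shape[1]}")
```

A replication study would repeat this exercise with an independent panel and compare the resulting cut scores.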

These were just a few of the examples of Questionmark users working to build the highest quality assessments that they can. These customers were thinking critically about their assessments, working iteratively with their test developers, exploring their data, and incorporating feedback from their stakeholders. These are hallmarks of good assessment, and I am excited to know that these measurement professionals are using Questionmark as one of their tools.

Assessment Security: 5 Tips from Napa

Posted by John Kleeman

Assessment security has been an important topic at Questionmark, and that was echoed at the Questionmark Users Conference in Napa last week. Here is some advice I heard from attendees:

  • Tip 1: It’s common to include an agreement page at the start of the assessment, where the participant agrees to follow the assessment rules, to keep exam content confidential and not to cheat. This discourages cheating by reducing people’s ability to rationalize that it’s okay to do so, and it removes the potential for someone to claim they didn’t know the rules.
  • Tip 2: It’s a good idea to have a formal agreement with SMEs in your organization who author or review questions, reminding them not to pass the questions to others. If they are employees, involve your HR and legal departments in drafting the agreements or notices. That way, if someone leaks content, you have HR and legal on board to deal with the disciplinary consequences.
  • Tip 3: It’s prudent to use the capabilities of Questionmark software to restrict access to the item bank by topic. Only give authors access to the parts they are working on, to avoid inadvertent or deliberate content leakage.
  • Tip 4: There is increasing interest in, and practical application of, hybrid testing strategies that combine proctored and unproctored tests, letting you focus anti-cheating resources where the risk is highest. For example, you might screen participants with quizzes, follow up with unproctored tests, and then give those who pass a proctored test (see the sketch after this list). Or you might deliver a series of short exams at periodic intervals to make it harder for people to have proxy test takers impersonate them. There is also a lot of interest in online proctoring, where people can take exams at home or in the office while being monitored by a remote proctor over video. This reduces travel time and is often more secure than face-to-face proctoring.
  • Tip 5: If your assessment system is on premise (behind the firewall), check regularly with your IT department that they are providing the security you need and that they are backing up your data. Most internal IT departments are hugely competent, but as people change jobs over time there is a risk that your IT department loses touch with what the assessment application is used for. One user shared how their IT team failed to make backups of the Questionmark database, so when the server failed, they lost their data and had to restart from scratch. I’m sure this particular issue won’t happen for others, but IT teams have a wide set of priorities, so it’s good to check in with them.
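
As a toy illustration of the risk-based routing described in Tip 4, here is a small Python sketch; the stage names, ordering and pass marks are invented for illustration rather than taken from any attendee’s program:

```python
from enum import Enum

class Stage(Enum):
    SCREENING_QUIZ = 1
    UNPROCTORED_TEST = 2
    PROCTORED_TEST = 3
    CERTIFIED = 4

# Hypothetical pass marks (proportion correct) for each stage
PASS_MARKS = {
    Stage.SCREENING_QUIZ: 0.6,
    Stage.UNPROCTORED_TEST: 0.7,
    Stage.PROCTORED_TEST: 0.7,
}

def next_stage(stage: Stage, score: float) -> Stage:
    """Advance one step only on a pass, so the expensive proctored test
    is reserved for participants who clear the cheaper stages first."""
    if stage is Stage.CERTIFIED or score < PASS_MARKS[stage]:
        return stage  # stay put: already certified, or retake/remediate
    return Stage(stage.value + 1)

print(next_stage(Stage.SCREENING_QUIZ, 0.75))  # Stage.UNPROCTORED_TEST
```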

There was lots more at the conference – iPads becoming mainstream for authoring and administration as well as delivery, people using OData to get access to Questionmark data, Questionmark being used to test the knowledge of soccer referees, and some good thinking on balancing questions at higher cognitive levels.

One thing that particularly interested me was anecdotal evidence that having an internal employee certification program reduces attrition. Certification makes employees feel more valued and more satisfied, and so less likely to leave for a new job elsewhere. A couple of attendees shared that their internal statistics showed this.

This mirrors external research I’ve seen – for example, the Aberdeen Group have published research suggesting that best-in-class organizations use assessments around twice as often as laggard organizations, and that first-year retention for best-in-class organizations is around 89% vs 76% for laggards.

For more information on security, download the white paper: Delivering Assessments Safely and Securely.

Learning, Networking and Wine-Tasting: Reporting from Napa

Posted by Julie Delazyn

It’s been an exciting two days at this week’s Questionmark Users Conference in Napa Valley. The Conference is our most important learning event: three days filled with discussions, focus groups and social events, as well as a wide variety of presentations covering best practices, case studies and the features and functions of Questionmark technologies.

This morning’s panel on enhancing assessment security while improving stakeholder experiences was an exciting discussion with industry leaders.

It was interesting to see how data can be transformed into information, knowledge and wisdom to assist with security, reliability and validity. With recent publicity around data security breaches at major corporations, and Questionmark’s focus on assessment security, it was a timely and informative session.

No conference is complete without social events that nurture new friendships and cement long-established bonds. Last night we divided up into small groups and dined around beautiful downtown Napa. Tonight, we head to one of the oldest wineries in Napa for a quintessential wine country evening: a winery tour followed by dinner in the barrel room.

Having select sessions webcast was especially interesting. Though networking and learning from one another are important aspects of attending the Conference in person, we loved having customers from around the world join us virtually and contribute to our live-blog.

Tomorrow’s breakouts and general session will complete three intensive days of learning.

Here’s wishing everyone a good journey home and continued connections in the year ahead.

Interact with your data: Looking forward to Napa

Posted by Steve Lay

It’s almost time for the Questionmark Users Conference, which this year is being held in Napa, California. As usual, there’s plenty on the program for delegates interested in integration matters!

At last year’s conference we talked a lot about OData for Analytics (which I have also written about here: What is OData, and why is it important?). OData is a data standard originally created by Microsoft but now firmly embedded in the open standards community through a technical committee at OASIS, which has taken on further development, resulting in the publication of the most recent version, OData 4.

This year we’ve built on our earlier work with the Results OData API to extend our adoption of OData to our delivery database, but there’s a difference. Whereas the Results OData API provides read-only access to data, the API exposed from our delivery system supports read and write actions, allowing third-party integrations to interact with your data during the delivery process.

Why would you want to do that?

Some assessment delivery processes involve actions that take place outside the Questionmark system. The most obvious example is essay grading. Although the rubrics (the rules for scoring) are encoded in the Questionmark database, it takes a human being outside the system to follow those rules and assign marks to the participant. We already have a simple scoring tool built directly into Enterprise Manager, but for more complex scoring scenarios you’ll want to integrate with external marking tools.

The new Delivery OData API provides access to the data you need, allowing you to read a participant’s answers and write back the scores using a simple Unscored -> Saved -> Scored workflow. When a score is placed in the final status, the participant’s result is updated and will appear with the new scores in future reports.
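
As a rough illustration of what such an integration might look like, here is a short Python sketch. The service root, entity set and field names, and the authentication scheme are assumptions made for illustration, not Questionmark’s documented API:

```python
import requests

BASE = "https://example.com/deliveryodata"  # hypothetical service root
AUTH = ("user", "password")                 # assumption: HTTP basic auth

def grade_essay(text: str) -> float:
    """Placeholder for the external marking logic (human or automated)."""
    return 4.0

# Read unscored essay answers; 'Answers', 'Status' and 'AnswerText' are invented names
resp = requests.get(f"{BASE}/Answers",
                    params={"$filter": "Status eq 'Unscored'"}, auth=AUTH)
resp.raise_for_status()

for answer in resp.json()["value"]:
    score = grade_essay(answer["AnswerText"])
    # Write the score back and advance the Unscored -> Saved -> Scored workflow
    update = {"Score": score, "Status": "Scored"}
    r = requests.patch(f"{BASE}/Answers({answer['Id']})", json=update, auth=AUTH)
    r.raise_for_status()
```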

I’ll be teaming up with Austin Fossey, our product owner for reporting, and Howard Eisenberg, our head of Solution Services, to talk at the conference about Extending Your Platform, where we’ll cover these topics. I’m also delighted that colleagues from Rio Salado College will be talking about their own scoring tool, built right on top of the Delivery OData API.

I look forward to meeting you in Napa, but if you can’t make it this year, don’t worry: some of the sessions will be live-streamed. Click here to register so that we can send you your login info and directions. And you can always follow along on social media by following and tweeting @Questionmark.

New white paper: Assessment Results You Can Trust

Posted by John Kleeman

Questionmark has published an important white paper about why trustable assessment results matter and how an assessment management system like Questionmark’s can help you make your assessments valid and reliable — and therefore trustable.

The white paper, which I wrote together with Questionmark CEO Eric Shepherd, explains that trustable assessment results must be both valid (measuring what you intend them to measure) and reliable (measuring it consistently).

The paper draws upon the metaphor of a doctor using results from a blood test to diagnose an illness and prescribe a remedy. Delays will occur if the doctor orders the wrong test, and serious consequences could result if the test’s results are untrustworthy. Using this metaphor, it is easy to understand the personnel and organizational risks that can stem from making decisions based on untrustworthy results. If you assess someone’s knowledge, skill or competence for health and safety or regulatory compliance purposes, you need to ensure that your assessment instrument is designed correctly and runs consistently.

Engaging subject matter experts to generate questions that measure the knowledge, skills and abilities required to perform the essential tasks of the job is key to creating the initial pool of questions. However, subject matter experts are not necessarily experts in writing good questions, so an effective authoring system requires a quality control process that allows assessment experts (e.g. instructional designers or psychometricians) to easily review and amend assessment items.

For assessments to be valid and reliable, it’s necessary to follow structured processes at each step from planning through authoring to delivery and reporting.

The white paper covers these six stages of the assessment process:

  • Planning assessment
  • Authoring items
  • Assembling assessment
  • Pilot and review
  • Delivery
  • Analyze results

Following the advice in the white paper and using the capabilities it describes will help you produce assessments that are more valid and reliable — and hence more trustable.

Modern organizations need their people to be competent.

Would you be comfortable in a high-rise building designed by an unqualified architect? Would you fly in a plane whose pilot hadn’t passed a flying test? Would you let someone operate a machine in your factory if they didn’t know what to do if something went wrong? Would you send a salesperson out on a call if they didn’t know what your products do? Can you demonstrate to a regulatory authority that your staff are competent and fit for their jobs if you do not have trustable assessments?

In all these cases and many more, it’s essential to have a reliable and valid test of competence. If you do not ensure that your workforce is qualified and competent, then you should not be surprised if your employees have accidents, cause your organization to be fined for regulatory infractions, give poor customer service or can’t repair systems effectively.

To download the white paper, click here.

John will be talking more about trustable assessments at our 2015 Users Conference in Napa next month. Register today for the full conference, but if you cannot make it, make sure to catch the live webcast.
