Are Job Task Analysis Surveys Legally Required?

Posted by John Kleeman

I had a lot of positive feedback on my blog post Making your Assessment Valid: 5 Tips from Miami. There is a lot of interest in how to ensure an assessment is valid, in other words that it measures what it is supposed to measure.

If you are assessing for competence in a job role or for promotion into a job role, one critical step in making your assessment valid is to have a good, current analysis of what knowledge, skills and abilities are needed to do the job role. This is called a job task analysis (JTA), and the most common way of doing this analysis is to conduct a JTA Survey.

In a JTA Survey, you ask people currently in the job role, or other experts, what tasks they do. A common practice is to survey them on how important each task is, how difficult it is and how often it is done. The resulting reports then guide the construction of the test blueprint: which topics to include in the test and how many questions to write for each.
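To make the idea concrete, here is a minimal sketch of how survey ratings might be turned into a blueprint. The tasks, ratings and weights below are invented for illustration; a real program would choose and justify its own weighting scheme.

```python
# Hypothetical JTA survey results: mean ratings (1-5) per task for
# importance, difficulty and frequency. All values are made up.
tasks = {
    "Interpret incident reports": {"importance": 4.6, "difficulty": 3.8, "frequency": 4.1},
    "Schedule shift rosters":     {"importance": 3.2, "difficulty": 2.1, "frequency": 4.8},
    "Brief senior officers":      {"importance": 4.9, "difficulty": 4.2, "frequency": 2.6},
}

def task_weight(r):
    # Illustrative weighted sum; the weights are an assumption, not a standard.
    return 0.5 * r["importance"] + 0.3 * r["difficulty"] + 0.2 * r["frequency"]

total_items = 40  # planned length of the exam
weights = {name: task_weight(r) for name, r in tasks.items()}
total_weight = sum(weights.values())

# Allocate question counts in proportion to each task's weight.
# (Rounding can make the counts sum slightly off total_items; real
# programs adjust the final allocation by hand.)
blueprint = {name: round(total_items * w / total_weight) for name, w in weights.items()}

for name, n in blueprint.items():
    print(f"{name}: {n} items")
```

The output gives a defensible starting point for the test plan: tasks the survey rates as more important, harder and more frequent receive proportionally more questions.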

If you cannot show that your assessment matches the requirements of a job, then your assessment is not only invalid but it is likely unfair — if you use it to select people for the job or measure competence in the job. And if you use an invalid assessment to select people for promotion or recruitment into the job, you may face legal action from people you reject.

Not only is this common sense, but it was also confirmed by a recent US district court ruling against the Boston Police Department. In this court case, sergeants who had been rejected for promotion to lieutenant following an exam sued on the grounds that the assessment was unfair, and won.

The judge ruled that the exam was not sufficiently valid, because it omitted many job skills crucial for a police lieutenant role, and so it was not fair to be used to select for the role (see news report).

The judge’s 82-page ruling sets out in detail why the exam was unfair. He references the Uniform Guidelines on Employee Selection Procedures, which state:

“There should be a job analysis which includes an analysis of the important work behavior(s) required for successful performance and their relative importance”

But the judge ruled that although a job analysis had been done, it had not been used properly in the test construction process. He said:

“When using a multiple choice exam, the developer must convert the job analysis result into a test plan to ensure a direct and strong relationship between the job analysis and the exam.”

However, in this case, the job analysis was not used sufficiently well to construct the exam. The judge went on to say:

“The Court cannot find, however, that the test plan ensured a strong relationship between the job analysis and the exam. … too many skills and abilities were missing from the … test outline.”

Crucially, he concluded:

“And a high score on the … exam simply was not a good indicator that a candidate would be a good lieutenant”.

Due to the pace of business change and technological advance, job roles are changing fast. Make sure that you conduct regular JTAs of roles in your organization and that your assessments match the most important job tasks. Find out more about Job Task Analysis here.

Making your Assessment Valid: 5 Tips from Miami

Posted by John Kleeman

A key reason people use Questionmark’s assessment management system is that it helps you make more valid assessments. To remind you, a valid assessment is one that genuinely measures what it is supposed to measure. Having an effective process to ensure your assessments are valid, reliable and trustable was an important topic at Questionmark Conference 2016 in Miami last week. Here is some advice I heard:

Reporting back from 3 days of learning and networking at Questionmark Conference 2016 in Miami

Tip 1: Everything starts from the purpose of your assessment. Define this clearly and document it well. A purpose that is not well defined or that does not align with the needs of your organization will result in a poor test. It is useful to have a formal process to kick off a new assessment to ensure the purpose is defined clearly and is aligned with business needs.

Tip 2: A Job Task Analysis survey is a great way of defining the topics/objectives for new-hire training assessments. One presenter at the conference sent out a survey to the top performing 50 percent of employees in a job role and asked questions on a series of potential job tasks. For each job task, he asked how difficult it is (complexity), how important it is (priority) and how often it is done (frequency). He then used the survey results to define the structure of knowledge assessments for new hires to ensure they aligned with needed job skills.

Tip 3: The best way to ensure that a workplace assessment starts and remains valid is continual involvement with Subject Matter Experts (SMEs). They help you ensure that the content of the assessment matches the content needed for the job and ensure this stays the case as the job changes. It’s worth investing in training your SMEs in item writing and item review. Foster a collaborative environment and build their confidence.

Tip 4: Allow your participants (test-takers) to feed back into the process. This will give you useful feedback to improve the questions and the validity of the assessment. It’s also an important part of being transparent and open in your assessment programme, which is useful because people are less likely to cheat if they feel that the process is well-intentioned. They are also less likely to complain about the results being unfair. For example, it’s useful to write an internal blog post explaining why and how you create the assessments and to encourage feedback.

Lunch with a view at Questionmark Conference 2016 in Miami

Tip 5: As the item bank grows and as your assessment programme becomes more successful, make sure to manage the item bank and review items. Retire items that are no longer relevant or when they have been overexposed. This keeps the item bank useful, accurate and valid.
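A simple review pass over the item bank can be sketched as follows. This assumes each item records how often it has been delivered (its exposure) and whether its topic is still on the blueprint; the field names and the exposure threshold are illustrative only.

```python
# Hypothetical item-bank records; fields and values are invented.
items = [
    {"id": "Q101", "exposures": 120,  "topic_current": True},
    {"id": "Q102", "exposures": 2400, "topic_current": True},   # overexposed
    {"id": "Q103", "exposures": 300,  "topic_current": False},  # topic no longer relevant
]

EXPOSURE_LIMIT = 1000  # illustrative threshold; set per programme policy

def needs_retirement(item):
    # Retire items that are overexposed or whose topic has left the blueprint.
    return item["exposures"] > EXPOSURE_LIMIT or not item["topic_current"]

to_retire = [item["id"] for item in items if needs_retirement(item)]
print(to_retire)
```

In practice, flagged items would go to SME review rather than being retired automatically, but automating the flagging keeps the review workload manageable as the bank grows.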

There was lots more at the conference – excitement that Questionmark NextGen authoring is finally here, a live demo of our new, easy-to-use Printing and Scanning solution … and lunch on the hotel terrace in the beautiful Miami spring sunshine, with Questionmark-branded sunglasses to keep us cool.

There was a lot of buzz at the conference about documenting your assessment decisions and making sure your assessments validly measure job competence. There is increasing understanding that assessment is a process not a project, and also that to be used to measure competence or to select for a job role, an assessment must cover all important job tasks.

I hope these tips on making assessments valid are helpful. Click here for more information on Questionmark’s assessment management system.

10 Actionable Steps for Building a Valid Assessment – Infographic

Using item analysis can greatly improve the validity of your assessments by helping you quickly and easily spot any red flags and weed out unreliable questions that are not performing well. Our infographic highlights 9 additional steps you can take to produce valid assessments. 
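Two of the most common item-analysis statistics are item difficulty (the proportion of test-takers answering correctly) and item-total point-biserial discrimination (how well an item separates high and low scorers). A minimal sketch, using invented response data:

```python
# Illustrative item analysis on a tiny, made-up response matrix.
# Rows = test-takers, columns = items (1 = correct, 0 = incorrect).
from statistics import mean, pstdev

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
]

def difficulty(item):
    """Proportion of test-takers who answered this item correctly."""
    return mean(row[item] for row in responses)

def point_biserial(item):
    """Correlation between the item score and the total test score."""
    item_scores = [row[item] for row in responses]
    totals = [sum(row) for row in responses]
    mi, mt = mean(item_scores), mean(totals)
    cov = mean((i - mi) * (t - mt) for i, t in zip(item_scores, totals))
    return cov / (pstdev(item_scores) * pstdev(totals))

for i in range(4):
    print(f"item {i}: difficulty={difficulty(i):.2f}, discrimination={point_biserial(i):.2f}")
```

Items with very extreme difficulty or low (especially negative) discrimination are the red flags worth investigating before they distort your results.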

View and download the infographic here.



Single sign-on: secure and easy access

Posted by Bart Hendrickx

It’s Tuesday morning. You have just started your computer. Now it’s time to open your day-to-day tools: email, chat/phone, tasks and so on. As you go through your tasks, you realize you need to take that data security test you’ve been postponing.

Every day, you interact with various applications. Some are installed on your personal computer, others on servers managed either by your organization or by vendors. Your applications might come from a multitude of service providers, in-house or in the cloud.

For many of those applications, you need to authenticate—to tell the applications who you are—so that the app can present the information that pertains to you. Sometimes this happens automatically. When your email client connects to the mail server, you read your emails, not those of your co-workers. Your email client has authenticated you against your mail server because you entered a username or email address, a password and some other data, way back when.

You often need to use different sign-ins for different apps. When you log in to your Questionmark OnDemand portal, for instance, you enter a different username and password than the ones you used to unlock your computer earlier today. (Your organization’s data security policy does not allow you to store your organizational password in other systems.)

Want to learn more? I’ll be discussing this topic and more at the Questionmark Conference 2016 in Miami, April 12-15. Register before March 3 to take advantage of our final early-bird discounts.

Problem: Unrecognized username or password

You try to log in for your exam, but you get an error message. Maybe you mistyped the password? Second try. Nope; same result. You must have forgotten your password. You open an instant-message window with your internal IT help desk. “Sorry, we don’t manage Questionmark OnDemand. Can you use its password reset function?”

You go back to your Questionmark login page, get a secure one-time login and establish a new, permanent password that complies with the data security policy. “I’d better not forget my password this time,” you say to yourself as you finally start your data security test. “Isn’t there something more convenient?”

Solution: Single sign-on

We all find ourselves in similar situations, but with single sign-on (SSO) we can avoid them.

Since there are several definitions of SSO, here’s how I’ll define it in the context of this blog:

Single Sign-On (SSO) for software is the ability for one application, the identity provider, to tell another application, the service provider, who you are.

By identity provider, I mean a system that contains digital identity information—also known as people data—about users. For example, think of social network sites or Active Directory from Microsoft.

The service provider is the system that users work with to do something—say Questionmark OnDemand, in the case of your data security test.

With SSO, a user does not log on directly to the service provider. Instead, they log on to an identity provider, which then tells the service provider who the user is. The identity provider and service provider have been configured to trust each other. So when the identity provider says: “This is Jane Doe,” the service provider will trust and accept that.

It is important to note that SSO is therefore not about creating accounts with the same usernames and passwords—a prevalent mechanism for different service providers. SSO is about making those service providers accept what an identity provider says about a user.
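This trust relationship can be illustrated with a toy sketch: the identity provider signs an assertion about who the user is, and the service provider accepts the assertion only if the signature verifies. This is only a conceptual illustration; real SSO protocols such as SAML use structured (XML) assertions and public-key signatures rather than a shared secret, and the names below are invented.

```python
# Toy model of the IdP/SP trust relationship. Not a real SSO protocol.
import hashlib
import hmac

SHARED_SECRET = b"configured-between-idp-and-sp"  # illustrative only

def idp_issue_assertion(username: str):
    """Identity provider: assert who the user is, with a signature."""
    sig = hmac.new(SHARED_SECRET, username.encode(), hashlib.sha256).hexdigest()
    return username, sig

def sp_accept_assertion(username: str, sig: str) -> bool:
    """Service provider: trust the assertion only if the signature checks out."""
    expected = hmac.new(SHARED_SECRET, username.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

user, signature = idp_issue_assertion("jane.doe")
print(sp_accept_assertion(user, signature))       # valid assertion: accepted
print(sp_accept_assertion("mallory", signature))  # forged assertion: rejected
```

The key point is that the service provider never sees the user’s password; it only checks that the assertion genuinely came from an identity provider it has been configured to trust.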

Why SSO?

SSO comes with several advantages. Users can access all applications that are linked to their identity providers—using one username and password for multiple systems. Depending on the capabilities of the applications and how things have been set up, the authentication can be seamless. You might log on to your identity provider when you start your computer, and the other applications (service providers) you access during the day will automatically check with your identity provider without you having to enter your username and password again.

SSO also makes password management easier for IT administrators. When an employee leaves an organization, IT might have to decommission access to dozens of service providers. If authentication to those service providers has been set up with SSO, an IT administrator only needs to decommission the employee’s identity provider account. Without that account, the employee can no longer log on to any of the linked applications.

There is one disadvantage to SSO: if the account at the identity provider is hacked, all linked applications can be compromised. It is therefore imperative that the account is properly secured. How can you set up SSO to ensure its security and effectiveness? Watch for more posts on this subject, which will include information about our newly added support for a popular technique called SAML.

If you would like to learn more, attend my session: Secure Authentication: Accessing Questionmark OnDemand with SSO at the Questionmark Conference 2016, April 12-15. Register before March 3 to take advantage of our final early-bird discounts.

Next Gen Authoring & Intro to Questionmark – don’t miss these webinars!

Posted by Julie Delazyn

Helping our customers understand how to use assessments effectively is as important to us as providing good testing and assessment technologies.

Our free, one-hour web seminars give you the opportunity to find out what’s happening in the world of online assessment and consider which tools and technologies would be most useful to you. Here’s the current line-up:

Authoring Questions and Assessments with Questionmark OnDemand

This 45-minute webinar demonstrates the “next generation” authoring tool in Questionmark OnDemand. The session will show the basics of authoring items and organizing them into assessments.

Introduction to Questionmark’s Assessment Management System

Learn the basics of authoring, delivering and reporting on surveys, quizzes, tests and exams. This introductory web seminar explains and demonstrates key Questionmark features and functions.

If you’ve been waiting for a webinar in Portuguese, you don’t want to miss this one:

Como utilizar a plataforma de avaliações da Questionmark em conformidade com a RDC nº 17 de 2010 (How to use Questionmark’s assessment platform in compliance with RDC No. 17 of 2010)

An introduction to management technologies for measuring the knowledge and skills of your team in compliance with ANVISA requirements, using the solutions of Questionmark’s OnDemand assessment platform.

Click here to choose your complimentary webinar and register online. And if you have any questions, don’t hesitate to reach out to us!

Checklists for Test Development

Posted by Austin Fossey

There are many fantastic books about test development, and there are many standards systems for test development, such as The Standards for Educational and Psychological Testing. There are also principled frameworks for test development and design, such as evidence-centered design (ECD). But it seems that the supply of qualified test developers cannot keep up with the increased demand for high-quality assessment data, leaving many organizations to piece together assessment programs, learning as they go.

As one might expect, this scenario leads to new tools targeted at these rookie test developers—simplified guidance documents, trainings, and resources attempting to idiot-proof test development. As a case in point, Questionmark seeks to distill information from a variety of sources into helpful, easy-to-follow white papers and blog posts. At an even simpler level, there appears to be increased demand for checklists that new test developers can use to guide test development or evaluate assessments.

For example, my colleague, Bart Hendrickx, shared a Dutch article from the Research Center for Examination and Certification (RCEC) at the University of Twente describing their Beoordelingssysteem (evaluation system). He explained that this system provides a rubric for evaluating education assessments in areas like representativeness, reliability, and standard setting. The Buros Center for Testing addresses similar needs for users of mental assessments. In the Assessment Literacy section of their website, Buros has documents with titles like “Questions to Ask When Evaluating a Test”—essentially an evaluation checklist (though Buros also provides their own professional ratings of published assessments). There are even assessment software packages that seek to operationalize a test development checklist by creating a rigid workflow that guides the test developer through different steps of the design process.

The benefit of these resources is that they can help guide new test developers through basic steps and considerations as they build their instruments. It is certainly a step up from a company compiling a bunch of multiple choice questions on the fly and setting a cut score of 70% without any backing theory or test purpose. On the other hand, test development is supposed to be an iterative process, and without the flexibility to explore the nuances and complexities of the instrument, the results and the inferences may fall short of their targets. An overly simple, standardized checklist for developing or evaluating assessments may not consider an organization’s specific measurement needs, and the program may be left with considerable blind spots in its validity evidence.

Overall, I am glad to see that more organizations want to improve the quality of their measurements, and it is encouraging to see more training resources to help new test developers tackle the learning curve. Checklists may be a very helpful tool for a lot of applications, and test developers frequently create their own checklists to standardize practices within their organization, like item reviews.

What do our readers think? Are checklists the way to go? Do you use a checklist from another organization in your test development?