Job Task Analysis Summary Report

Posted by Austin Fossey

There are several ways to determine content for an assessment blueprint and ultimately for the assessment instrument itself. A job task analysis (JTA) study, as explained by Jim Farrell in a previous post, is one commonly used method for identifying the potential topics or tasks that must be assessed to determine whether a participant meets minimum qualifications within an area of practice.

In their chapter in Educational Measurement (4th ed.), Clauser, Margolis, and Chase describe the work that must go into culling the initial list of topics down to a manageable, relevant list that will be used as the foundation for the test blueprint.

Consider a JTA survey that asks participants to rate the difficulty, importance, and frequency of a list of tasks related to a specific job. Subject matter experts (SMEs) must then interpret the survey results to decide which topics stay and which ones go.

For example, there may be a JTA that surveys employees about potential assessment topics and tasks for an assessment about the safe operation of machinery at the job site. One task relates to being able to hit the emergency shutoff in case something goes wrong. The JTA results may show that respondents think this is very important to know, but it is not something they do very frequently because there are rarely emergency situations that would warrant this action. Similarly, there may be a task related to turning the machine on. The respondents may indicate that this is important and something that is done on a daily basis, but it is also very easy to do.

There is no all-encompassing rule for how SMEs should determine which tasks and topics to include in the assessment. It often comes down to having the SMEs discuss the merits of each task, with each SME making a recommendation informed by their own experience and expertise. Reporting the results of the JTA survey will give the SMEs context for their decision-making, much like providing impact data in a standard-setting study.

Questionmark Analytics currently provides two JTA reports: the JTA Summary Report and the JTA Demographic Report. Today, we will focus on the JTA Summary Report.

This report uses the same assessment selection and result filtering tools that are used throughout Analytics. Users can report on different revisions of the JTA survey and filter by groups, dates, and special field values.

The current JTA survey item only supports categorical and ordinal response data, so the JTA Summary Report provides a table showing the frequency distribution of responses for each task by each of the dimensions (e.g., difficulty, importance, frequency) defined by the test developer in the JTA item.
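
To make this concrete, here is a minimal pandas sketch of the same kind of tabulation. It is not the report's actual implementation, and the column names and rating labels are invented for illustration:

```python
import pandas as pd

# Hypothetical long-format JTA survey export: one row per respondent's rating
# of one task on one dimension (difficulty, importance, or frequency).
responses = pd.DataFrame({
    "task":      ["Hit emergency shutoff", "Hit emergency shutoff",
                  "Turn machine on", "Turn machine on"],
    "dimension": ["importance", "frequency", "importance", "frequency"],
    "rating":    ["High", "Low", "High", "High"],
})

# Frequency distribution of ratings for each task within each dimension,
# similar in shape to the table the JTA Summary Report presents.
summary = (
    responses.groupby(["task", "dimension", "rating"])
    .size()
    .unstack("rating", fill_value=0)
)
print(summary)
```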

These response patterns can help SMEs decide which tasks will be assessed and which ones are not required for a valid evaluation of participants.

Response distribution table for a JTA for medical staff using the Questionmark JTA Summary Report.


Questionmark receives U.S. Army Certificate of Networthiness

Posted by Julie Delazyn

We’re proud to announce that the Questionmark Perception version 5 assessment management system has been awarded the Certificate of Networthiness (CoN # 201417177) by the U.S. Army Network Enterprise Technology Command.

What exactly does this mean?

The accreditation confirms that Questionmark’s assessment management system meets strict U.S. Army and Department of Defense (DoD) standards for security, compatibility, supportability and sustainability.

It also certifies that Questionmark Perception can be deployed for authoring, delivery and analysis of online quizzes, tests, exams and surveys while remaining compliant with U.S. military IT standards.

How significant is this?

The CoN accreditation is required for all enterprise software products operating within the U.S. Army Enterprise Infrastructure Network. This accreditation applies to the entire U.S. Army, including the Army Reserve, the National Guard and some DoD organizations.


How do military organizations use Questionmark assessment management technologies?

Military organizations use Questionmark technologies for a range of assessment solutions including:

  • advancement exams
  • medical training
  • job-task analyses
  • post-course exams for distance learning
  • testing for pilots and aircraft engineers

Questionmark is listed on the U.S. General Services Administration (GSA) federal supply schedule as a provider of testing and assessment technologies and services (contract number GS-35F-0380Y). This listing covers Questionmark Perception software, training, consultancy and support.

How important is security for Questionmark?

It’s a top priority, as explained in this video:


Where do you deliver assessments from in a post-PRISM world?

Posted by John Kleeman

Like many of you, I have been watching with interest revelations about government Internet surveillance initiatives. Technologically and legally, none of it is surprising. Businesses and governmental organizations around the world have frequently expressed concerns about the data privacy implications of the US Patriot Act.  Indeed, many of our customers cite data protection issues as factors in their decisions to opt for the Questionmark OnDemand service based at our European data centre.

Practically, I am torn between admiring our governments for defending us against terrorism and pondering Benjamin Franklin’s saying that those who give up liberty for security end up losing liberty.

Wherever you stand on this issue, there are still questions to address about the practical implications this data protection challenge poses for those delivering assessments.  I thought it might be helpful to look at a couple of different scenarios and suggest data protection requirements you might look for when running assessments over the Internet.

Scenario 1. A US company looking for a safe place to deliver assessments from the Cloud

Suppose you are a US company seeking to test your employees via a SaaS vendor. Suppose most employees are in North America but a few are spread around the globe. Here are the likely key data protection requirements:

1. Contract with a US service provider with confidentiality clauses.

2. Data centre and assessment results located in the US.

3. Data centre certified and audited to SSAE 16, the expected standard for quality data centres in North America.

4. Service provider and data centre operator certified under the U.S. Department of Commerce’s Safe Harbor Framework. This means they promise to comply with European data protection rules for data coming from Europe. Without this, you will face HR challenges when testing your employees in Europe. If you do a lot of testing in Europe, you may want stronger measures than Safe Harbor – see the White Paper (complimentary with registration): Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities.

5. Vendors must have strong IT security including the latest SSL/TLS encryption and other technical measures.
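
As one small, practical check on that last point, the sketch below uses Python's standard library to confirm that an assessment endpoint negotiates a modern TLS version and presents a valid certificate. The hostname is a placeholder and the minimum version is just one reasonable policy, not a statement of any vendor's configuration:

```python
import socket
import ssl

HOST = "assessments.example.com"  # placeholder; substitute the vendor's hostname

context = ssl.create_default_context()            # verifies the certificate chain
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older protocols (Python 3.7+)

with socket.create_connection((HOST, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("Negotiated protocol:", tls.version())
        print("Cipher suite:", tls.cipher()[0])
```

If the certificate fails validation or only an outdated protocol is offered, the handshake raises an error rather than silently connecting, which is the behaviour you want when vetting a provider.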

Scenario 2. A European organization that wants to run assessments and keep data in Europe

Many European companies or universities have a legal need to follow European data protection law and keep their data in Europe, and some may have constitutional requirements to avoid US oversight. Here are some of the key things they would look for:

1. Contract with a European service provider with confidentiality and data protection clauses.

2. Data centre with assessment results and personal data located inside the European Union.

3. Data centre certified and audited under ISO 27001, the expected standard for quality data centres in Europe.

4. This alone is only part of the story. The service provider and the data centre operator must not just be located in Europe; they must be European-owned and not a subsidiary of a US company. A US company that runs a data centre or service in Europe, even through a European subsidiary, is required to hand over data to the US government on request, even when that data is held in Europe. So if you work with a European subsidiary of a US LMS, VLE or other SaaS company, your data may be obtained by US enforcement agencies. According to a recent report by Reuters, a US judge has ruled that:

“Internet service providers such as Microsoft Corp or Google Inc cannot refuse to turn over customer information and emails stored in other countries when issued a valid search warrant from U.S. law enforcement agencies.”

5. Again, all the legal data protection needs to be accompanied by good IT security. See our security comparison document for some questions to ask.

White Paper (complimentary with registration): Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities.

Questionmark can meet both these needs. You can visit our website to learn how Questionmark OnDemand — US-based or EU-based — offers trustable solutions for either of these scenarios.

Discussing data mining at NCME

Posted by Austin Fossey

We will wrap up our discussion of themes at the National Council on Measurement in Education (NCME) annual meeting with an overview of the inescapable conversations about working with complex, and often messy, data sets.

It was clear from many of the presentations and poster sessions that technology is driving the direction of assessment, for better or for worse (or as Damian Betebenner put it, “technology eats statistics”). Advances in technology have allowed researchers to examine new statistical models for scoring participants, identify aberrant responses, score performance tasks, identify sources of construct-irrelevant variance, diversify item formats, and improve reporting methods.

As the symbiotic knot between technology and assessment grows tighter, many researchers and test developers are in the unexpected position of having too much data. This is especially true in complex assessment environments that yield log files with staggering amounts of information about a participant’s actions within an assessment.

Log files can track many types of data in an assessment, such as responses, click streams, and system states. All of these data are time stamped, and if they capture the right data, they can illuminate some of the cognitive processes that are manifesting themselves through the participant’s interaction with the assessment. Raw assessment data like Questionmark’s Results API OData Feeds can also be coupled with institutional data, thus exponentially growing the types of research questions we can pursue within a single organization.
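
As a hedged illustration of that coupling, the sketch below pulls rows from a generic OData endpoint and joins them to an institutional extract. The URL, entity name and field names are placeholders rather than the documented Results API schema, so treat it as a pattern, not a recipe:

```python
import pandas as pd
import requests

# Placeholder OData endpoint; consult the Results API documentation for the
# real URL, authentication scheme and entity names.
FEED_URL = "https://ondemand.example.com/odata/Results"

resp = requests.get(FEED_URL, params={"$format": "json"},
                    auth=("analytics_user", "secret"), timeout=30)
resp.raise_for_status()
# OData v4 JSON payloads wrap rows in a "value" array; older versions differ.
results = pd.DataFrame(resp.json()["value"])

# Hypothetical institutional data keyed on the same participant identifier.
hr = pd.DataFrame({
    "participant_id": ["p001", "p002", "p003"],
    "department":     ["Nursing", "Radiology", "Nursing"],
})

combined = results.merge(hr, on="participant_id", how="left")
print(combined.groupby("department")["score"].describe())
```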

NCME attendees learned about hardware and software that capture both response variables and behavioral variables from participants as they complete an online learning task.

Several presenters discussed issues and strategies for addressing less-structured data, with many papers tackling log file data gathered as participants interact with an online assessment or other online task. Ryan Baker (International Educational Data Mining Society) gave a talk about combining the data mining of log files with field observations to identify hard-to-capture domains, like student engagement.

Baker focused on the positive aspects of having oceans of data, choosing to remain optimistic about what we can do rather than dwell on the difficulties of iterative model building in these types of research projects. He shared examples of intelligent tutoring systems designed to teach students while also gathering data about the student’s level of engagement with the lesson. These examples were peppered with entertaining videos of the researchers in classrooms playing with their phones so that individual students would not realize that they were being subtly observed by the researcher via sidelong glances.
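
In the spirit of that work, and purely as an illustration rather than Baker's actual models, a minimal sketch of combining log-derived features with human observation labels might look like the following; every feature name here is invented for the example:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Illustrative features derived from time-stamped log data, one row per
# student per observation window, plus a field-observation label
# (1 = observed as engaged, 0 = observed as off-task).
logs = pd.DataFrame({
    "seconds_per_item":  [12.0, 55.0, 4.0, 30.0, 8.0, 42.0],
    "hint_requests":     [0, 2, 5, 1, 4, 0],
    "rapid_guess_rate":  [0.6, 0.0, 0.8, 0.1, 0.7, 0.0],
    "observed_engaged":  [0, 1, 0, 1, 0, 1],
})

X = logs[["seconds_per_item", "hint_requests", "rapid_guess_rate"]]
y = logs["observed_engaged"]

# A simple classifier trained on the human-coded observations can then label
# engagement in log data collected when no observer was present.
model = LogisticRegression().fit(X, y)
print(model.predict(X))
```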

Evidence-centered design (ECD) emerged as a consistent theme: there was a lot of conversation about how researchers are designing assessments so that they yield fruitful data for their intended inferences. Nearly every presentation about assessment development referenced ECD. Valerie Shute (Florida State University) observed that five years ago only a fraction of attendees would have known about ECD, but today it is widely used by practitioners.

How to create reliable tests using JTA

Posted by Jim Farrell

The gold standard of testing is to have valid test results. You must always ask yourself: Does this test really measure what it is supposed to measure? Will the topics covered tell me whether or not the participant has the knowledge or skills to perform the tasks required for the job? The only way to be 100 percent sure is to know what the tasks are, how important they are, and how often they are performed, so that you can ask relevant questions. All of this information is covered in a Job Task Analysis (JTA). (A JTA question type is available in Questionmark Live.)

A JTA is an exercise that helps you define the tasks a person in a particular position needs to perform or supervise and then measure the:

1. difficulty of the task

2. importance of the task

3. frequency of the task

Together, these dimensions are often called the DIF. There may be other dimensions you want to measure, but the DIF can help you build a competency model for the job. A competency model is a visual representation of the skills and knowledge a person needs to be highly successful. It is created by interviewing subject matter experts (SMEs) who define the DIF for each task. This sounds like a piece of cake, right? Well, it can be, but many people skip creating a JTA because of the time and expense. The thought of going out to interview SMEs and then correlating a ton of data sounds daunting. That is where Questionmark can help.

With our JTA question type, you can create a list of tasks and the dimensions on which to measure them. You can then send the survey to all of your SMEs and use the job task analysis reports to vet the results and build your competency model. Now that makes it a piece of cake!

Let’s take a quick look at the process a little more closely. In authoring, you can define your tasks and dimensions by entering them directly or importing them from an outside source.


Once you add your question to a survey, you can deliver it to your SMEs.

The final step of the process is running reports broken down by different demographic properties. This will give you the opportunity to sit down and analyze your results, vet them with your SMEs, and develop your competency model.
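
As a rough illustration of what analyzing those results can look like once the DIF ratings are in, here is a small pandas sketch that averages hypothetical SME ratings and ranks tasks with an arbitrary weighting. The task names, scale and weights are all assumptions your SMEs would need to debate, not a prescribed formula:

```python
import pandas as pd

# Hypothetical mean SME ratings on a 1-5 scale for each task and DIF dimension.
ratings = pd.DataFrame({
    "task":       ["Hit emergency shutoff", "Turn machine on", "Calibrate sensor"],
    "difficulty": [3.2, 1.1, 4.0],
    "importance": [4.8, 4.5, 3.9],
    "frequency":  [1.5, 4.9, 2.7],
})

# One simple (and debatable) way to rank tasks: weight importance most heavily,
# then frequency, then difficulty. SMEs should still review the resulting order.
ratings["priority"] = (0.5 * ratings["importance"]
                       + 0.3 * ratings["frequency"]
                       + 0.2 * ratings["difficulty"])
print(ratings.sort_values("priority", ascending=False))
```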

Let’s get to why we are here: designing a test that will yield valid, meaningful results. Now that you know what needs to be tested, you can create a test blueprint or specification. This documentation will drive your item development process and ensure you have the right questions, because you can map them back to the tasks in your competency model.
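
To show what mapping items back to tasks can look like in practice, here is a small sketch that compares a draft item bank against blueprint targets and flags coverage gaps. The task names and item counts are hypothetical:

```python
from collections import Counter

# Hypothetical blueprint: how many items each competency-model task should get.
blueprint = {"Hit emergency shutoff": 3, "Turn machine on": 2, "Calibrate sensor": 4}

# Draft item bank mapped back to tasks (item id -> task).
items = {
    "Q01": "Hit emergency shutoff", "Q02": "Hit emergency shutoff",
    "Q03": "Turn machine on", "Q04": "Calibrate sensor",
}

# Compare actual coverage against the blueprint to find item-development gaps.
coverage = Counter(items.values())
for task, target in blueprint.items():
    have = coverage.get(task, 0)
    status = "OK" if have >= target else f"write {target - have} more item(s)"
    print(f"{task}: {have}/{target} items - {status}")
```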

What organizational and technical measures are appropriate in assessment delivery?

Posted by John Kleeman

One of the key responsibilities of an assessment sponsor acting as data controller under European law is to implement appropriate technical and organizational measures to protect personal data. But what does appropriate mean?

And when you contract with a data processor to deliver assessments, you must ensure that the processor implements appropriate measures. But again what does appropriate mean?

This is not just an academic question. A UK organization was fined £150,000 in 2013 for failing to protect personal data, with the regulator commenting that a key reason for the fine was that “… the data controller has failed to take appropriate technical measures against the loss of personal data.”

The measures to use will depend on the risk to the data and to the assessment participant, but here are some to consider. All of them are met by Questionmark if you delegate service delivery to Questionmark, though some also need action by you.

For more information, you can download a complimentary version of the white paper: Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities [requires registration]

For each measure below, consider whether your own system provides it:

Premises access control
  • Data center certified against ISO 27001 or SSAE 16
  • Two-factor authentication for staff and visitors
  • 24/7/365 personnel intrusion alarms
  • 24/7/365 monitored digital surveillance cameras
  • 24/7/365 security team on site at all times
  • Strong physical security in a nondescript building to aid anonymity

System controls
  • Well-configured firewalls in each tier
  • Intrusion Detection System or Intrusion Prevention System
  • Secure software development approach following best practices
  • Comprehensive anti-virus measures
  • Regular third-party penetration testing
  • Regularly updated system and application software
  • 24/7/365 network monitoring

Data access control (authentication and authorization)
  • Individual, unique, high-strength passwords for all users (you need to action)
  • Users can easily be deleted when they leave an organization (you need to action)
  • Administrator passwords stored in encrypted form
  • Administrators can be given access to only the functions and data they need (you need to configure)
  • Participant login and identity can be confirmed by monitors/proctors (you need to configure)

Data transmission control
  • All participant access via well-configured SSL/TLS
  • All administrator access to results via well-configured SSL/TLS
  • Any data copied for troubleshooting purposes strongly encrypted
  • No need to send data physically – all data transmitted electronically

Data entry control (keeping track of who does what)
  • Able to present participants with information and record consent (you need to action)
  • Participant answers cannot be changed except with authority
  • Participant submissions recorded with a time stamp
  • Differential privileges for administrators, with control over system functions (you need to configure)
  • Important activities by administrators and other users are logged

Contractual control
  • Data protection compliant contracts with processors
  • Processing performed only on instructions from the data controller
  • Logical or physical separation of data from different customers

Availability controls (protecting against unauthorized destruction or loss)
  • Power supply redundancy, UPSs and onsite generators
  • N+1 or 2N redundancy on all hardware and Internet connections
  • Backup of all assessment data to an offsite location
  • Frequent (e.g. hourly) backups of assessment results to avoid losing data
  • Regular restore tests of those backups
  • Participant answers saved “as you go” on the server during test-taking
  • Tested, current service continuity plan in place in the event of disasters
  • 24/7/365 environment monitoring

Organizational measures (all met by Questionmark; you will also need to follow these yourselves)
  • Designated data protection officer
  • Written commitment to confidentiality from personnel
  • Background checks on new employees
  • Regular training of employees on data security
  • Regular testing of personnel on data security to check understanding
  • Faulty or end-of-life disks degaussed or otherwise safely destroyed

I hope this helps you work out what measures might be appropriate for your needs. If you want to learn more, then please read our free-to-download white paper: Responsibilities of a Data Controller When Assessing Knowledge, Skills and Abilities [requires registration].

If you are interested in seeing if Questionmark OnDemand could meet your needs, see here for more information.