Assessment Report Design: Reporting Multiple Chunks of Information

Posted by Austin Fossey

We have discussed aspects of report design in previous posts, but I was recently asked whether an assessment report should report just one thing or multiple pieces of information. My response is that it depends on the intended use of the assessment results, but in general, I find that a reporting tool is more useful for a stakeholder if it can report multiple things at once.

This is not to say that more data are always better. A report that is cluttered or that has too much information will be difficult to interpret, and users may not be able to fish out the data they need from the display. Many researchers recommend keeping simple, clean layouts for reports while efficiently displaying relevant information to the user (e.g., Goodman & Hambleton, 2004; Wainer, 1984).

But what information is relevant? Again, it will depend on the user and the use case for the assessment, but consider the types of data we have for an assessment. We have information about the participants, information about the administration, information about the content, and information about performance (e.g., scores). These data dimensions can each provide different paths of inquiry for someone making inferences about the assessment results.

There are times when we may only care about one facet of this datascape, but these data provide context for each other, and understanding that context provides a richer interpretation.

Hattie (2009) recommended that a report should have a major theme, and that the theme should be emphasized with five to nine “chunks” of information. He also recommended that the user have control of the report so they can explore the data as desired.

Consider the Questionmark Analytics Score List Report: Assessment Results View. The major theme for the report is to communicate the scores of multiple participants. The report arguably contains five primary chunks of information: aggregate scores for groups of participants, aggregate score bands for groups of participants, scores for individual participants, score bands for individual participants, and information about the administration of the assessment to individual participants.

Through design elements and onscreen tools that give the user the ability to explore the data, this report with five chunks of information can provide context for each participant’s score. The user can sort participants to find the high- and low-performing participants, compare a participant to the entire sample of participants, or compare the participant to their group’s performance. The user can also compare the performance of groups of participants to see if certain groups are performing better than others.


Assessment Results View in the Questionmark Analytics Score List Report

Online reporting also makes it easy to let users navigate between related reports, thus expanding the power of the reporting system. In the Score List Report, the user can quickly jump from Assessment Results to Topic Results or Item Results to make comparisons at different levels of the content. Similar functionality exists in the Questionmark Analytics Item Analysis Report, which allows the user to navigate directly from a Summary View comparing item statistics for different items to an Item Detail view that provides a more granular look at item performance through interpretive text and an option analysis table.

24 midsummer questions to ask your assessment software provider

Posted by John Kleeman

This week marks the longest day of the year in the Northern Hemisphere, and in a few lucky places there are 24 hours of daylight.

We can imagine that in ancient days, watchers on castles could relax a bit with the longer hours of sunlight, as it was harder for marauders to sneak up without the cover of night. But in the modern-day cloud world, time of day doesn’t impact security much. Light or dark, you need to keep watch 24 hours a day, 365 days a year to be sure of your assessment security.

Here are 24 questions you might want to ask your assessment software supplier to check that your assessments and results will be safe all day and night long.

Data Center

Data center batteries (picture from Wikipedia by Jelson25)

1. Do you host assessments in a professional Data Center, certified to SSAE 16 or ISO 27001?

2. Does the Data Center have 24/7/365 physical security?

3. Does the Data Center have 24/7/365 network monitoring so that if an issue arises, someone is continually monitoring to react to it?

4. Are the servers monitored by CCTV cameras?

5. Does your Data Center have multiple connections to the power grid, plus onsite generators with at least 24 hours’ fuel, in case of power outages?

6. Does your Data Center have multiple, fast Internet links so that if one goes down, connectivity remains?

Systems and software

7. Is every server in the system load balanced and does every component have redundancy, so that if any one system fails, another can take over?

8. Is there an Intrusion Detection or Prevention System (IDS or IPS) to help protect against attackers?

9. Is browser access to assessments and administration protected by SSL/TLS to 128 bits or higher, so that assessment data and results cannot be intercepted on the Internet?

10. Is anti-virus software deployed on all relevant servers and kept up to date?

11. Do you have separate staging areas to test on before deploying to production?

12. Does all application communication use strong cryptographic algorithms? Have you retired any use of the weaker MD5 hash algorithm, which was widely used in the past? (The sketch after this list shows what replacing it can look like.)
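For illustration only, here is a minimal Python sketch of the kind of change retiring MD5 involves: swapping a legacy MD5 digest for SHA-256 from the same standard library module. This is a generic example, not Questionmark’s implementation, and the payload string is invented.

```python
import hashlib

# Hypothetical payload; in practice this would be whatever data the
# application hashes, e.g. for an integrity check.
payload = b"assessment result: participant=1234, score=87"

# Legacy: MD5 produces a 128-bit digest and is considered broken
# (practical collisions exist), so it should be retired.
legacy_digest = hashlib.md5(payload).hexdigest()

# Modern replacement: SHA-256 produces a 256-bit digest and is
# currently considered secure for integrity purposes.
modern_digest = hashlib.sha256(payload).hexdigest()

print("MD5:    ", legacy_digest)
print("SHA-256:", modern_digest)
```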

People

13. Do you background-check all employees before you hire them, in case of a criminal history?

14. Do you have a signed confidentiality agreement on file with all your employees and do subcontractors have such agreements on file with all their personnel?

15. Do you train all personnel on data security and test them annually to check they understand?

16. When an employee leaves, do you remove all their access? Do you have a procedure to audit this to confirm it’s really happened?

17. Do you follow industry good practice in software development to reduce the attack surface and protect against security vulnerabilities?

18. Do you have a dedicated security team reporting to an executive officer of the company?

Putting it all together to ensure you don’t lose the “crown jewels” of your assessment data

19. Are regular penetration tests run against the system by a third-party supplier?

20. Do you destroy faulty or end-of-life disks to ensure no-one can later access the data?

21. Do you have a disaster recovery plan? Suppose you lose your email or another key system: can you still communicate internally and with customers, and have you tested this?

22. Are you transparent about your security? For instance, did you disclose what you did about the Heartbleed vulnerability that impacted much of the Internet in April 2014?

23. Can I see real-time information on the current status and uptime, and access statistics from around the world? See http://status.questionmark.com for an example of what you might look for from a provider.

24. Are results data backed up safely and off-site (over the Internet) at least hourly, so that in the event of a catastrophe, you would not normally lose more than an hour’s worth of data?

I hope this list of questions helps you think about your assessment security over midsummer and all the other days of the year. In case you’re wondering, if you use Questionmark OnDemand, the answers to all the questions are “yes”.

Click here to see Questionmark’s security video.


Using Diagnostic Assessments to Improve a Government Agency’s Workforce

Posted by Julie Delazyn

The Aurelius Group (TAG) provides Federal acquisition, human capital, and technology consulting to private industry, federal agencies, and the U.S. Department of Defense.

One of their clients is a large Federal agency that, faced with an expanding workload, inexperienced employees, and increasingly scarce resources, needed to identify and close proficiency gaps in its acquisition workforce.

In response, TAG has incorporated Questionmark assessments into a successful workforce improvement program that reveals the aggregate strengths and weaknesses of the workforce and enables the client to direct resources to high-value development opportunities.

Assessments provide annual or biannual snapshots that show how much employees know about the complex bodies of knowledge their work requires and identify competency gaps that can be addressed through further learning. Trend data gleaned from the assessments demonstrate decline and improvement over time and provide objective support for training resource requests, proficiency-gap analysis and workforce training.

This case study explains how, in addition to providing an enterprise view of the workforce’s strengths and weaknesses, the program has improved participants’ self-awareness, helped shape their individual development plans (IDPs) and resulted in more effective learning choices.


Join us July 27 at 12:00 PM (EDT) for a Questionmark “Customers Online” webinar presentation by The Aurelius Group: Generating and Sending Custom Completion Certificates

Analyzing multiple groups with the JTA Demographic Report

Posted by Austin Fossey

In my previous post, I talked about how the Job Task Analysis (JTA) Summary Report can be used by subject matter experts (SMEs) to inform their decisions about what content to include in an assessment.

In many JTA studies, we might survey multiple populations of stakeholders who may have different opinions about what content should be on the assessment. The populations we select will be guided by theory or previous research. For example, for a certification assessment, we might survey the practitioners who will be candidates for certification, their managers, and their clients—because our subject matter experts theorize that each of these populations will have different yet relevant opinions about what a competent candidate must know and be able to do in order to be certified.

Instead of requiring you to create a separate JTA survey instrument for each population in the study, Questionmark Analytics lets you analyze the responses from different groups of survey participants using the JTA Demographic Report.

This report provides demographic comparisons of aggregated JTA responses for each of the populations in the study. Users can simply add a demographic question to their survey so that this information can be used by the JTA Demographic Report. In our earlier example, we might ask survey participants to identify themselves as a practitioner, manager, or client, and then this data would be used to compare results in the report.

As with the JTA Summary Report, there are no requirements for how SMEs must use these data. The interpretations will either be framed by the test developer using theory or prior research, or left entirely to the SMEs’ expert judgment.

SMEs might wish to investigate topics where populations differed in their ratings, or they may wish to select only those topics where there was universal agreement. They may wish to prioritize or weight certain populations’ opinions, especially if a population is less knowledgeable about the content than others.

The JTA Demographic Report provides a frequency distribution table for each task on the survey, organized by dimension. A chart gives a visual indicator of differences in response distributions between groups.


Response distribution table and chart comparing JTA responses from nurses and doctors using the Questionmark JTA Demographic Report.
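To make the tabulation concrete, here is a minimal, hypothetical sketch in Python with pandas of the kind of per-task frequency distribution shown above; the task, groups and ratings are invented, and this is not the report’s actual implementation.

```python
import pandas as pd

# Invented JTA survey responses: one row per participant, with their
# demographic group and an importance rating (1-4) for a single task.
responses = pd.DataFrame({
    "task":   ["Administer medication"] * 6,
    "group":  ["nurse", "nurse", "nurse", "doctor", "doctor", "doctor"],
    "rating": [4, 3, 4, 2, 3, 2],
})

# Frequency distribution of ratings per group for each task, analogous
# to the per-task tables in the JTA Demographic Report.
table = pd.crosstab([responses["task"], responses["group"]], responses["rating"])
print(table)
```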

UK Briefings Update: Join us for discussions on assessment security

Posted by Chloe Mendonca

Last week in London, we held the first of our three UK breakfast briefings taking place this summer.

In case you haven’t attended a breakfast briefing before, these events involve a morning of networking, best practice tips and live demonstrations of the newest assessment technologies.

Last week at our London briefing we received some great feedback about some of our latest features, including new capabilities within Questionmark Live and customised reporting using the Results API.

Our next two briefings will take place on Tuesday 17th June in London and Wednesday 18th June in Edinburgh. They will focus on some of the latest assessment security technologies that make it possible to administer high-stakes tests anywhere in the world.

ProctorU President Don Kassner will begin by explaining the basics of online invigilation and discuss proven strategies for alleviating the testing centre burden. Then Che Osborne, Questionmark’s VP of Sales, will discuss methods you can use to protect your valuable assessment content and test results.

Each briefing will include a complimentary breakfast at 8:45 a.m., followed by presentations and discussions until about 12:30 p.m.

We hope you will be able to attend one of the sessions.

What are the business benefits of online assessments in regulatory compliance?

Posted by John Kleeman

As well as writing for this blog, I also contribute to SAP’s Community Network, where I’ve recently published a 5-part series on the business benefits of online assessments in regulatory compliance. Much of it will also interest a Questionmark audience, so here are some highlights, with links to the fuller articles in case you’d like to read more.

**To read more about compliance in the financial industry, download our complimentary white paper: The Role of Assessments in Mitigating Risk for Financial Services Organizations [registration required].**

Making a business case for testing out of training

Using online assessments to test out of training saves time and money by allowing expensive people to forgo unneeded training. It also makes compliance training more valid and respected, and so more likely to impact behaviour, because it focuses training on the people who need it.

If employees already know something well, then training them in it again is a waste of resources and motivation. By forcing people to attend training they feel is unnecessary, you reduce the credibility of your whole compliance initiative. People feel that you are just crossing a task off your list, not really caring about what matters to them and the business. And it devalues other compliance training as well.

My article on testing out of training suggests some good practice and also shares a management report format you can use to help build a business case internally for testing out.

How do you know what your workforce knows?

Almost all regulators are concerned with ensuring that organizations have competent employees. Some require assessments to confirm this. Other regulators prefer to set principles without detailed guidance: they focus on ensuring that your workforce is competent, often requiring documented training. Either way, you need to assess your employees to check their competence.

Online assessments are a consistent and cost-effective means of checking that your workforce knows the law, your procedures and your products. If you are required to document training, they are the most reliable way of doing so.

My article on how you know what your workforce knows gives a list of useful assessment types used for compliance purposes and also explains the pros and cons of assessments versus other ways of documenting training.

Help your workforce retain critical information

Assessments are one of the best ways of ensuring your workforce can retrieve information when they need it and of reducing the chances of them forgetting key information needed for regulatory compliance.

My article on how you can help your workforce retain critical information explains the concept of assessments as retrieval practice and gives some advice on how to use assessments effectively for retrieval practice.

Can assessments reduce human error?

There is evidence that around 40-50% of compliance errors stem at least in part from ineffective training and failure to follow process. Assessments can make a big difference in preventing these errors.

Some errors happen because training is misunderstood or misdelivered. Perhaps the training covers the area, but the employee didn’t understand it properly, can’t remember it or cannot apply it on the job. Assessments check that people do indeed understand.

Other errors happen because the training differs from what the real job involves. If the problem is that competencies or tasks needed in the real world aren’t part of the training, then job task analysis, which uses surveys to ask practitioners what really happens in a role, is a great way to correct this (see An Easier Approach to Job Task Analysis Q&A for more information).

My article also explains how observational assessments can be used to help prevent failure to follow process. In an observational (or workplace) assessment, an observer watches the participant perform an activity and assesses their performance against a checklist. This is a good way to identify whether procedural steps are omitted, as the short sketch below illustrates.
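As a minimal sketch of the idea (the procedure and its steps are invented for illustration), an observational checklist can be scored by comparing the steps the observer ticked off against the required steps:

```python
# Invented example of scoring an observational (workplace) assessment:
# the observer records which steps were performed, and any required
# procedural steps that were skipped are flagged.
REQUIRED_STEPS = [
    "verify participant identity",
    "check equipment calibration",
    "follow safety procedure",
    "record the result",
]

def omitted_steps(observed: set) -> list:
    """Return the required steps the participant did not perform."""
    return [step for step in REQUIRED_STEPS if step not in observed]

# The observer saw three of the four required steps performed.
observed = {
    "verify participant identity",
    "check equipment calibration",
    "record the result",
}
print(omitted_steps(observed))  # ['follow safety procedure']
```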

It takes 20 years to build a reputation and five minutes to lose it

Warren Buffett famously said, “It takes 20 years to build a reputation and five minutes to ruin it”. In my final article on the SAP site, I explain how using Questionmark software is a great way to avoid those fatal five minutes and so mitigate risk to reputation.

The key point I make in the series is that online assessments are about both staying compliant with regulatory requirements AND giving business value. Assessments help ensure your workforce are competent and reduce regulatory risk, but they also give business value in improved efficiency, knowledge and customer service.

**To read more about compliance in the financial industry, download our complimentary white paper: The Role of Assessments in Mitigating Risk for Financial Services Organizations [registration required].**