What do you want for the holidays?

Posted by Howard Eisenberg

All I want for the holidays is …

As the acting product owner for Questionmark’s Reporting and Analytics zone, I would love to hear how you would complete that sentence … with respect to assessment reporting and analytics, of course.

To help stimulate ideas, I will highlight some recent developments in our reporting and analytics suite.

The Introduction of the Results Warehouse and Questionmark Analytics

In version 5.3, we introduced the Results Warehouse. This is a database of assessment results that is separate from the database responsible for delivering assessment content and storing participants’ responses. Results are extracted, transformed and loaded (ETL’ed) into the Results Warehouse from the delivery database on a recurring schedule. This database is the data source for the Questionmark Analytics reports.
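For readers who like to see the mechanics, here is a minimal sketch of what an extract-transform-load step between a delivery database and a results warehouse could look like. The table names, column names and use of SQLite are purely illustrative assumptions on my part, not Questionmark’s actual schema or implementation.

```python
# Illustrative ETL sketch only -- table and column names are hypothetical.
import sqlite3

def etl_results(delivery_db: str, warehouse_db: str) -> None:
    """Extract finished results from a delivery database, transform them,
    and load them into a results warehouse table."""
    src = sqlite3.connect(delivery_db)
    dst = sqlite3.connect(warehouse_db)

    # Extract: pull completed results from the (hypothetical) delivery schema.
    rows = src.execute(
        "SELECT result_id, assessment_id, participant_id, score, finished_at "
        "FROM results WHERE status = 'finished'"
    ).fetchall()

    # Transform: a trivial reshaping step for illustration (round the scores).
    transformed = [
        (rid, aid, pid, round(score, 1), finished_at)
        for rid, aid, pid, score, finished_at in rows
    ]

    # Load: upsert into the (hypothetical) warehouse fact table.
    dst.executemany(
        "INSERT OR REPLACE INTO fact_results "
        "(result_id, assessment_id, participant_id, score, finished_at) "
        "VALUES (?, ?, ?, ?, ?)",
        transformed,
    )
    dst.commit()
    src.close()
    dst.close()
```

Run on a schedule, a job like this keeps reporting queries off the delivery database, which is the design point of the Results Warehouse.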

With the advent of Analytics, we have introduced some new reports, and we plan to continue building reports in Analytics. In the case of the Item Analysis report, we’ve ported it to Analytics entirely, and in so doing have delivered improvements to the visualization of item quality and to the report output options.

Addition of New Reports in Questionmark Analytics

Here’s a brief inventory of the reports currently available in Analytics. You can read up on the purpose of each of these reports and see sample outputs by consulting the Analytics Help Center.

In the spirit of holiday gift-giving, allow me to expound on a few of these reports.

Results Over Time and Average Score by Demographic

These are two separate reports, but they are similar in that each displays an average assessment score within the context of a 95% confidence interval, along with a count of the results (the sample size).

The “Results Over Time” report plots the mean assessment score over time, using a reporting interval selected by the user.

The “Average Score by Demographic” report also displays a mean score, but it groups the results by a demographic attribute. In this way, it enables the report consumer to compare mean scores across different demographic groups.
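For the statistically curious, here is a small sketch of the statistic both reports display: a mean score, a 95% confidence interval around it, and the sample size. The sample data, group names and normal-approximation formula below are my own illustrative assumptions, not the reports’ internal code.

```python
# Sketch of a group mean with a ~95% confidence interval (normal approximation).
from statistics import mean, stdev

def mean_with_ci(scores: list[float]) -> tuple[float, float, float, int]:
    """Return (mean, ci_low, ci_high, n) for a list of scores."""
    n = len(scores)
    m = mean(scores)
    if n < 2:
        return m, m, m, n               # not enough data for an interval
    se = stdev(scores) / n ** 0.5       # standard error of the mean
    margin = 1.96 * se                  # ~95% confidence (z = 1.96)
    return m, m - margin, m + margin, n

# Hypothetical example: compare two demographic groups.
by_group = {"Region A": [72.0, 80.5, 65.0, 90.0], "Region B": [55.0, 61.5, 70.0]}
for group, scores in by_group.items():
    m, lo, hi, n = mean_with_ci(scores)
    print(f"{group}: mean={m:.1f}, 95% CI=[{lo:.1f}, {hi:.1f}], n={n}")
```

The confidence interval is what makes the comparison meaningful: a difference between two group means matters less when the intervals overlap heavily or the sample sizes are small.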

Assessment Completion Time

This report can be used to help investigate misconduct. It plots assessment results on a score axis and a completion-time axis. Outliers may represent cases of misconduct. That is, if a participant scores above the mean yet takes an abnormally short time to complete the assessment, this may represent a case of cheating. If a participant takes an abnormally long time to complete the assessment yet scores very poorly, this may represent a case of content theft. The report allows the user to set the ranges for normal score and completion time.
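As a rough illustration of that logic, here is a small sketch that classifies a single result against user-set “normal” ranges for score and completion time. The function name, thresholds and labels are hypothetical, chosen only to show the idea, not the report’s actual rules.

```python
# Hypothetical outlier-flagging logic for score vs. completion time.

def flag_result(score: float, minutes: float,
                score_range: tuple[float, float],
                time_range: tuple[float, float]) -> str:
    """Classify a result against user-set normal ranges for score and time."""
    score_lo, score_hi = score_range
    time_lo, time_hi = time_range
    if score > score_hi and minutes < time_lo:
        return "possible cheating (high score, abnormally fast)"
    if score < score_lo and minutes > time_hi:
        return "possible content theft (low score, abnormally slow)"
    return "within normal range"

# Example: normal score 40-85, normal completion time 20-60 minutes.
print(flag_result(score=95, minutes=8,  score_range=(40, 85), time_range=(20, 60)))
print(flag_result(score=22, minutes=75, score_range=(40, 85), time_range=(20, 60)))
```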

Item Analysis

Finally, the Item Analysis report has been improved to provide users with better visualization of the quality of items on an assessment form, as well as more output options.

Suspect items are immediately visible because users can specify acceptable ranges for p-value and item-total correlation. Items that fall within acceptable ranges for each measure are green, those that fall outside of the acceptable range for one of the two measures are orange, and any that miss the mark for both p-value and item-total correlation are red.
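To make the colour-coding rule concrete, here is a small sketch that applies it to a single item. The acceptable ranges below are made-up defaults for illustration; in the report, the acceptable ranges are whatever you specify.

```python
# Sketch of the green/orange/red rule, with hypothetical acceptable ranges.

def item_colour(p_value: float, item_total_corr: float,
                p_range: tuple[float, float] = (0.3, 0.9),
                corr_min: float = 0.2) -> str:
    """Green if both measures are acceptable, orange if one fails, red if both fail."""
    p_ok = p_range[0] <= p_value <= p_range[1]
    corr_ok = item_total_corr >= corr_min
    if p_ok and corr_ok:
        return "green"
    if p_ok or corr_ok:
        return "orange"
    return "red"

print(item_colour(0.65, 0.35))   # green: both measures within range
print(item_colour(0.95, 0.35))   # orange: p-value outside range
print(item_colour(0.95, 0.05))   # red: both measures outside range
```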

Additionally, different sections and levels of detail included in this report can be output to PDF and/or comma-separated value (CSV) files.

So … what’s on your wish list for the holidays?