Here’s some basic information about the Item Analysis Report, which was recently added to Questionmark Analytics:
What it does: The item analysis report provides an in-depth Classical Test Theory (CTT) psychometric analysis of item performance, letting you drill down into specific item statistics and performance data. The report includes key item statistics such as item difficulty p-value, high-low discrimination, item-total correlation discrimination, and item reliability. It also provides assessment statistics on the time taken and the scores achieved.
Who should use it: Assessment, learning and education professionals can use this report to determine how well questions perform psychometrically.
How it looks: The report includes an assessment-level overview graph and summary table. The overview graph plots a single point for each item in the summary table. Each question is plotted by its item difficulty p-value (X-axis) and its item-total correlation discrimination (Y-axis):
- Questions that have high (acceptable) discrimination will appear near the top of the graph
- Questions with low (unacceptable) discrimination will appear at the bottom of the graph
- Difficult questions will appear to the left of the graph
- Easy questions will appear to the right of the graph
The summary table beneath the scatter plot contains a line item for every question on the assessment. It shows the question order, the question wording and description, and summary information on each question's item difficulty p-value and item-total correlation discrimination. You can sort on any column to get different views of question performance; for example, sorting by difficulty puts the hardest questions at the top of the table.
Clicking any row in the summary table opens a detailed item view of question-level information.
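The two statistics on the graph's axes are standard Classical Test Theory quantities and are straightforward to compute. Here is a minimal Python sketch, using hypothetical 0/1 response data rather than anything from Questionmark, of the item difficulty p-value (proportion correct) and the corrected item-total correlation (correlation of each item with the total score on the remaining items):

```python
# Hypothetical data: rows = participants, columns = items (1 = correct, 0 = incorrect).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]

def item_difficulty(item_index, matrix):
    """p-value: the proportion of participants answering the item correctly."""
    scores = [row[item_index] for row in matrix]
    return sum(scores) / len(scores)

def item_total_correlation(item_index, matrix):
    """Pearson correlation between the item score and the total score on the
    remaining items (corrected item-total correlation)."""
    item = [row[item_index] for row in matrix]
    rest = [sum(row) - row[item_index] for row in matrix]
    n = len(item)
    mean_i, mean_r = sum(item) / n, sum(rest) / n
    cov = sum((i - mean_i) * (r - mean_r) for i, r in zip(item, rest)) / n
    var_i = sum((i - mean_i) ** 2 for i in item) / n
    var_r = sum((r - mean_r) ** 2 for r in rest) / n
    if var_i == 0 or var_r == 0:
        return 0.0  # everyone answered the item the same way
    return cov / (var_i ** 0.5 * var_r ** 0.5)

for j in range(4):
    print(f"item {j}: p = {item_difficulty(j, responses):.2f}, "
          f"r_it = {item_total_correlation(j, responses):.2f}")
```

A high p-value means an easy item (plotted to the right) and a high item-total correlation means acceptable discrimination (plotted near the top), matching the quadrants described above.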
The Demographic Report was recently added to Questionmark Analytics. Here are the basics:
What it does: The demographic report breaks down results by demographic variables such as language, course name, location, department, instructor, and job role or military rank. This report can recognize up to 10 demographic variables recorded by an assessment. Users can review assessment performance broken down by a demographic variable, using data stored in special fields within the database.
Who should use it: Assessment, learning and education professionals can use this report to zero in on test results according to specific demographic information.
How it looks: Graph 1 in this example shows the assessment mean score for the selected demographic. It displays the average percentage score achieved as well as the high and low results. Graph 2 shows the number of results for the selected demographic and includes an overall calculation of the number of results found.
You can assign several filters to limit the information included in the report:
- Assessment (mandatory)
- Special field
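The break-down the report performs amounts to grouping results by a special-field value and summarizing each group. A minimal Python sketch with hypothetical data (the field names and scores are illustrative, not drawn from a real repository):

```python
from collections import defaultdict

# Hypothetical result rows: (special-field value, percentage score)
results = [
    ("Sales", 72), ("Sales", 88), ("Engineering", 95),
    ("Engineering", 61), ("Engineering", 79), ("Support", 54),
]

groups = defaultdict(list)
for department, score in results:
    groups[department].append(score)

# Per-demographic mean, high, and low (Graph 1) and result count (Graph 2)
for department, scores in sorted(groups.items()):
    print(f"{department}: n={len(scores)}, mean={sum(scores)/len(scores):.1f}, "
          f"high={max(scores)}, low={min(scores)}")
```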
Scenario-based assessments are a great way to test learners’ understanding of a specific subject or gauge how someone would react in certain circumstances.
You can create scenario-based assessments in Questionmark Perception version 5 by grouping a series of questions with a single stimulus such as a reading passage, case study, video, image or audio track.
To do this, select the appropriate template within the Perception Assessment Wizard to group related questions into a single block. Use text and images to create a static introduction or stimulus that remains visible on one half of the window while the questions related to it appear one at a time on the other half.
You can create questions as you would any normal set of questions. Group them in a single sub-topic or place the questions in other relevant topics.
Once your assessment is complete, you can schedule it like any other assessment.
There may be times when you want your participants to repeat an assessment – for instance if they fail a quiz, or simply to give them the chance to try an assessment again regardless of their score. Either is easy to do in Questionmark Perception. The ability to branch assessments is particularly helpful if you intend to embed them in a web page such as a SharePoint site, blog, or wiki.
Here are tips for allowing participants to repeat an assessment:
Setting up a re-take for participants who do not achieve the required score:
- When creating or editing the assessment, check the Enable pass/fail checkbox in the Assessment Feedback screen of the Assessment Wizard and set the required pass mark (%).
- You will also need to select Branching from the Settings menu and indicate that you want the participant to “branch to another assessment.”
- Just choose the assessment you are currently editing, so that participants will automatically repeat it if they do not achieve the passing score.
Branching an assessment back to itself regardless of the participant’s score:
- When creating the assessment, ensure that there is only one assessment outcome for any score from 0 to 100%.
- To do this, uncheck the Enable pass/fail checkbox on the Assessment Feedback screen.
- Follow the same instructions as above to indicate that you want participants to branch to the assessment you are currently editing. That way, they can be taken back to the assessment for another try.
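Both setups reduce to one conditional on the participant’s score. Here is a hypothetical sketch of that decision logic (the function and parameter names are illustrative, not Perception’s API):

```python
def choose_next_assessment(score_percent, current_assessment_id, pass_mark=None):
    """Decide which assessment a participant branches to after finishing.

    pass_mark=None models the pass/fail checkbox being unchecked: a single
    outcome for any score from 0 to 100%, always branching back.
    pass_mark=<percentage> models the re-take setup: branch back on a fail.
    """
    if pass_mark is None:
        return current_assessment_id   # one outcome: always repeat
    if score_percent < pass_mark:
        return current_assessment_id   # failed: repeat the assessment
    return None                        # passed: no branching

# Re-take on fail with a 70% pass mark
print(choose_next_assessment(55, "quiz-1", pass_mark=70))  # quiz-1
print(choose_next_assessment(85, "quiz-1", pass_mark=70))  # None
# Branch back regardless of score
print(choose_next_assessment(85, "quiz-1"))                # quiz-1
```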
There may be times when you want to give test participants access to a certain tool or resource: a calculator or a periodic table, for instance. Or maybe you are giving an open book test about policies and procedures and wish to make a PDF available for participants to refer to.
You can provide these types of resources within your assessments using Perception Authoring Manager. Use the question-by-question (QxQ) template and enable Perception’s Assessment Navigator, which allows participants to move easily from one question to another.
Here are a few rules of thumb for providing tools within assessments:
- Any tools that you use must be web-based or accessible via a network from the computer the participant is using. If you are adding more than one tool, the tools will display in the same order as they appear in the template.
- The Questionmark Perception version 5 repository comes with a calculator tool that can be enabled or disabled in the template. Other tools can be stored as a resource and added to the repository.
- You can add third-party tools to assessments, too. These tools are not stored directly in the repository but can be accessed via the Internet or network the participant’s computer is connected to.
Sometimes you may want participants to receive customized messages when they have taken an assessment — perhaps information about their score, or confirmation that they completed your quiz or test at a particular time and date.
You can do this in Questionmark Perception by adding server variables to your assessment outcomes within Perception Authoring Manager. Server variables are “merge fields” that pull data from your Perception repository.
Here are examples of the data that can be included:
- assessment name
- current time
- current date
- time limit
- participant name
- participant details
- participant group
- score information
- demographic data saved in special fields – such as the participant’s department or start date
When editing an assessment, you can designate the fields from which you want to draw information for your personalized messages. Each participant, on completing an assessment, will then automatically receive feedback containing the information you want them to have.
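Mechanically, a merge field is just a placeholder that gets filled in from stored data. A minimal Python sketch of the idea, using `string.Template`; the field names and values here are hypothetical stand-ins, and the real lookup against the Perception repository is the product’s job, not shown:

```python
from string import Template

# Hypothetical feedback template using merge-field style placeholders
outcome_template = Template(
    "Dear $participant_name, you completed $assessment_name on $current_date "
    "at $current_time and scored $score_percent%."
)

# Values that would normally be pulled from the repository
fields = {
    "participant_name": "Alex Doe",
    "assessment_name": "Safety Procedures Quiz",
    "current_date": "2011-05-10",
    "current_time": "14:30",
    "score_percent": 85,
}

print(outcome_template.substitute(fields))
```

Each participant’s record supplies different values, so the same template yields a personalized message per result.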