This week’s “how to” article highlights the “Assessment results over time report,” one of the Questionmark Analytics reports now available in Questionmark OnDemand.
- What it does: The Assessment results over time report summarizes assessment performance over time, including the mean, minimum and maximum scores for a test or exam as well as the 95% confidence interval of the mean. It also shows the number of participants who took the assessment during each period. You can apply these filters to your report:
- Assessment filter
- Group filter
- Date filter
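The summary statistics the report plots for each period can be sketched in a few lines of Python. The function and sample scores below are hypothetical, and a normal approximation (z = 1.96) is assumed for the 95% confidence interval of the mean; the report’s own computation may differ.

```python
import statistics

def summarize_period(scores, z=1.96):
    """Summary stats for one reporting period: participant count, mean,
    min, max, and a 95% confidence interval for the mean."""
    n = len(scores)
    mean = statistics.mean(scores)
    # Standard error of the mean; the CI half-width uses z = 1.96 for 95%
    sem = statistics.stdev(scores) / n ** 0.5
    return {
        "n": n,
        "mean": mean,
        "min": min(scores),
        "max": max(scores),
        "ci_95": (mean - z * sem, mean + z * sem),
    }

# Hypothetical percentage scores for one period
print(summarize_period([72, 85, 90, 64, 78, 81, 88, 70]))
```

A wider `ci_95` interval corresponds to the longer vertical bars in the report’s first graph: more spread in the scores (or fewer participants) means less certainty about the mean.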
- Who should use it: Assessment, learning and education professionals can use this report to view and filter assessment results over time, making it easy to flag abnormal patterns that may indicate a statistically significant difference between period means.
- How it looks: This report presents a great deal of information graphically and compactly, making large result sets quick to interpret. It is broken down into two components (graphs):
1. A graph displaying the average assessment scores participants achieved over a period of time. The blue triangles mark the mean of the assessment results for each period. The vertical lines next to the triangles denote the 95% confidence intervals: long bars indicate more varied results and less certainty about the mean, while short bars indicate high confidence in the mean value. In the first graph it is easy to see that the results of tests administered just before September 6, 2010, differ dramatically from the other results during this period!
2. A graph displaying the number of results from the same time period. This volume information can help with planning administration sessions and anticipating system load.
A PDF and an analysis-friendly CSV can also be generated.
Posted by Greg Pope
In my last blog post I talked about the high-level purpose and process of conducting an item analysis. Now I will describe some of the essential things to look for in a typical Item Analysis Report.
You may sometimes see “Alpha if item deleted” statistics in Item Analysis Reports. These statistics indicate whether the internal consistency reliability (e.g., Cronbach’s Alpha) of the assessment would increase if the question were deleted. An increase in reliability when a question is removed indicates that the question is not performing well psychometrically. Many Item Analysis Reports do not display the “Alpha if item deleted” statistic because the item-total correlation coefficient provides essentially the same information: questions with higher item-total correlations contribute to higher internal consistency reliability, and questions with lower item-total correlations contribute to lower reliability.
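The relationship described above can be sketched with a small, hypothetical example (the function names and data are mine, not Questionmark’s): Cronbach’s Alpha is computed from per-question score lists, then recomputed with each question removed in turn.

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's Alpha from a list of per-question score lists
    (one inner list per question, one entry per participant)."""
    k = len(item_scores)
    totals = [sum(p) for p in zip(*item_scores)]  # each participant's total score
    item_var = sum(pvariance(q) for q in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

def alpha_if_deleted(item_scores):
    """Alpha recomputed with each question removed in turn."""
    return [cronbach_alpha(item_scores[:i] + item_scores[i + 1:])
            for i in range(len(item_scores))]

# Hypothetical binary question scores: q4 runs against the other
# questions, so deleting it should *raise* alpha (a red flag for q4)
data = [
    [1, 1, 0, 1, 0, 1],  # q1
    [1, 1, 0, 1, 1, 1],  # q2
    [1, 0, 0, 1, 0, 1],  # q3
    [0, 1, 1, 0, 1, 0],  # q4
]
print(cronbach_alpha(data))        # overall alpha
print(alpha_if_deleted(data)[3])   # alpha with q4 removed (higher)
```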
Other statistics you might see are variations of the point-biserial item-total correlation coefficient, such as the “corrected point-biserial correlation,” “biserial correlation” or “corrected biserial correlation.” “Corrected” here means that the question’s own score is removed from the total score used in the calculation, so that the question being examined is not “contributing to itself.”
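A minimal sketch of the “corrected” idea, assuming hypothetical binary (0/1) question scores: the question is correlated with the total of the other questions rather than with the full total that includes it.

```python
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation of two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

def corrected_item_total(item, all_items):
    """Correlate one question's scores with the total score *excluding*
    that question, so the question does not contribute to itself."""
    rest = [sum(p) - q for q, p in zip(item, zip(*all_items))]
    return pearson(item, rest)

# Hypothetical scores (rows: questions, columns: participants)
items = [
    [1, 1, 0, 1, 0, 1],  # q1
    [1, 1, 0, 1, 1, 1],  # q2
    [1, 0, 0, 1, 0, 1],  # q3
]
totals = [sum(p) for p in zip(*items)]
print(pearson(items[0], totals))              # uncorrected item-total
print(corrected_item_total(items[0], items))  # corrected (lower)
```

The corrected value is typically a bit lower than the uncorrected one, since the question’s own score is no longer inflating the correlation.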
A great resource for more information on item analysis is Chapter 8 of Dr. Steven J. Osterlind’s book Constructing Test Items: Multiple-Choice, Constructed-Response, Performance and Other Formats (2nd edition).
In my next post I will dive into the nitty-gritty of item analysis. I will look at example questions and how to use the Questionmark Item Analysis Report in an applied context. Stay tuned to the Questionmark Blog…