Posted by Greg Pope
In my last blog post I talked about the high-level purpose and process of conducting an item analysis. Now I will describe some of the essential things to look for in a typical Item Analysis Report.
You may sometimes see “Alpha if item deleted” statistics in Item Analysis Reports. These statistics show whether the internal consistency reliability of the assessment (e.g., Cronbach’s Alpha) would increase if the question were deleted. If reliability goes up when a question is removed, that question is not performing well psychometrically. Many Item Analysis Reports do not display the “Alpha if item deleted” statistic because the item-total correlation coefficient provides essentially the same information: questions with higher item-total correlations contribute to higher internal consistency reliability, and questions with lower item-total correlations contribute to lower internal consistency reliability.
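To make the relationship concrete, here is a minimal sketch (not Questionmark’s implementation) of how “Alpha if item deleted” can be computed from a respondents-by-items score matrix, using the standard Cronbach’s Alpha formula:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_deleted(scores: np.ndarray) -> list[float]:
    """Alpha recomputed with each item removed in turn.

    An entry higher than the overall alpha flags an item that is
    dragging reliability down.
    """
    k = scores.shape[1]
    return [cronbach_alpha(np.delete(scores, i, axis=1)) for i in range(k)]
```

Comparing each entry of `alpha_if_deleted` against the overall alpha reproduces the report statistic: an item whose removal raises alpha is the same item the item-total correlation would flag with a low value.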
Other statistics you might see are variations of the point-biserial item-total correlation coefficient, such as the “corrected point-biserial correlation,” “biserial correlation,” or “corrected biserial correlation.” The “corrected” in these names means that the question’s own score is removed from the total score before the correlation is calculated, so that the question being examined does not “contribute to itself” and inflate the statistic.
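The correction is a one-line adjustment: subtract the item’s points from each respondent’s total before correlating. A small sketch of the corrected item-total correlation (again an illustration, not any vendor’s implementation):

```python
import numpy as np

def corrected_item_total(scores: np.ndarray, item: int) -> float:
    """Point-biserial correlation of one item with the total score,
    with that item's own points removed from the total ("corrected")."""
    rest_total = scores.sum(axis=1) - scores[:, item]
    # np.corrcoef returns the 2x2 correlation matrix; take the off-diagonal
    return float(np.corrcoef(scores[:, item], rest_total)[0, 1])
```

For dichotomously scored (0/1) questions, Pearson’s correlation computed this way is exactly the point-biserial coefficient, which is why reports often use the two terms interchangeably.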
A great resource for more information on item analysis is Chapter 8 of Dr. Steven J. Osterlind’s book Constructing Test Items: Multiple-Choice, Constructed-Response, Performance and Other Formats (2nd edition).
In my next post I will dive into the nitty-gritty of item analysis. I will look at example questions and how to use the Questionmark Item Analysis Report in an applied context. Stay tuned to the Questionmark Blog…