Mastering Your Multiple-Choice Questions

Posted By Doug Peterson

If you’re going to the Questionmark Users Conference in New Orleans this week (March 21 – 23), be sure to grab me and introduce yourself! I want to speak to as many customers as possible so that I can better understand how they want to use Questionmark technologies for all types of assessments.

This year Sharon Shrock and Bill Coscarelli are giving a pre-conference workshop on Tuesday called “Criterion-Referenced Test Development.” I remember three years ago at the Users Conference in Memphis when these two gave the keynote. They handed out a multiple-choice test that everyone in the room passed on the first try. The interesting thing was that the test was written in complete gibberish. There was not one intelligible word!

The point they were making is that multiple-choice questions need to be constructed correctly or you can give the answer away without even realizing it. Last week I ran across an article in Learning Solutions Magazine called “The Thing About Multiple-Choice Tests…” by Mike Dickinson that explores the same topic. If you author tests and quizzes, I encourage you to read it.

A few of the points made by Mr. Dickinson:

  • Answer choices should be roughly the same length and kept as short as possible.
  • Provide a minimum of three answer choices and a maximum of five. Four is considered optimal.
  • Keep your writing clear and concise – you’re testing knowledge, not reading comprehension.
  • Make sure the correct answer appears in the first two positions as often as in the last two.
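That last point is easy to honor by shuffling the choice order each time a question is delivered (most assessment tools can randomize choices for you). As a minimal illustration, here is a hypothetical Python sketch; the function name and option text are made up for the example:

```python
import random

def shuffled_choices(choices, seed=None):
    """Return a shuffled copy of the answer choices, so the keyed
    answer lands in each position roughly equally often over many
    deliveries. The seed parameter exists only for reproducible demos."""
    rng = random.Random(seed)
    shuffled = list(choices)   # copy, so the authored order is untouched
    rng.shuffle(shuffled)
    return shuffled

options = ["a completely correct answer",
           "a mostly correct answer",
           "a plausible but wrong answer",
           "a clearly wrong answer"]
print(shuffled_choices(options, seed=42))
```

Over many deliveries the correct answer ends up in each position about a quarter of the time, which removes positional cues without any manual bookkeeping.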

One of the most interesting points in the article is about the correctness of the individual answer choices. Mr. Dickinson proposes the following:

  • One answer is completely correct.
  • Another answer is mostly correct, but not completely. This will distinguish between those who really know the subject matter and those with a shallower understanding.
  • Another answer is mostly incorrect, but plausible. This will help reveal if test takers are just guessing or perhaps racing through the test without giving it much thought.
  • The fourth answer would be totally incorrect, but still fit within the context of the question.

Once you have an assessment made up of well-crafted multiple-choice questions, you can be confident that your analysis of the results will give you an honest picture of the learners’ knowledge as well as the learning situation itself – and this is where Questionmark’s reports and analytics really shine!

The Coaching Report shows you exactly how a participant answered each individual question. If your questions have been constructed as described above, you can quickly assess each participant’s understanding of the subject matter. For example, two participants might fail the assessment with the same score. But if one is consistently selecting the “almost correct” answer while the other is routinely selecting the plausible or totally incorrect answers, you know that the first participant has a better understanding of the material and may just need a little help, while the second participant is completely lost.
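That comparison can be sketched in a few lines of code. Suppose each selected option has been tagged with its category from the list above (“correct”, “almost”, “plausible”, or “incorrect”); note this tagging is a hypothetical illustration, not a built-in report field:

```python
from collections import Counter

def diagnose(choice_categories):
    """Summarize which kinds of answers a failing participant picked.

    choice_categories -- one tag per question answered, e.g.
                         ["almost", "correct", "almost", ...]
    """
    counts = Counter(choice_categories)
    # More "almost correct" picks than implausible ones suggests a
    # near miss rather than total confusion.
    if counts["almost"] > counts["plausible"] + counts["incorrect"]:
        return "near miss: mostly 'almost correct' picks, needs targeted help"
    return "struggling: mostly implausible picks, needs to revisit the material"

participant_a = ["almost", "correct", "almost", "almost", "plausible"]
participant_b = ["incorrect", "plausible", "correct", "incorrect", "plausible"]
print(diagnose(participant_a))  # near miss
print(diagnose(participant_b))  # struggling
```

Both participants here answered only one question correctly, yet the pattern of their wrong answers tells two very different stories.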

The Item Analysis report shows you how a question itself (or even the class or course materials) is performing.

  • If pretty much everyone is answering correctly, the question may be too easy.
  • If almost no one is getting the question correct, it may be too hard or poorly written.
  • If the majority of participants are selecting the same wrong answer:
    • The question may be poorly written.
    • The wrong answer may be flagged as correct.
    • There may be a problem with what is being taught in the class or course materials.
  • If the answer selection is evenly distributed among the choices:
    • The question may be poorly written and confusing.
    • The topic may not be covered well in the class or course materials.
  • If the participants who answer the question correctly also tend to pass the assessment, it shows that the question is a “good” question – it’s testing what it’s supposed to test.
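The checks above can be approximated with simple counts. The sketch below is hypothetical code, not Questionmark’s actual report logic: it computes an item’s difficulty (proportion answering correctly), its answer distribution, and a crude discrimination figure, namely the share of correct responders who also passed the assessment overall:

```python
from collections import Counter

def item_analysis(responses, correct, total_scores, pass_mark):
    """Rough item statistics for one multiple-choice question.

    responses    -- each participant's chosen option, e.g. ["A", "B", ...]
    correct      -- the keyed correct option, e.g. "A"
    total_scores -- each participant's overall assessment score (same order)
    pass_mark    -- score needed to pass the assessment
    """
    n = len(responses)
    distribution = Counter(responses)        # how often each option was chosen
    difficulty = distribution[correct] / n   # proportion answering correctly

    # Crude discrimination check: do those who got this item right
    # also tend to pass the overall assessment?
    passers = [score >= pass_mark for score in total_scores]
    right_and_passed = sum(1 for r, p in zip(responses, passers)
                           if r == correct and p)
    discrimination = right_and_passed / max(distribution[correct], 1)

    return difficulty, dict(distribution), discrimination

# Example: six participants, keyed answer "B"
diff, dist, disc = item_analysis(
    ["B", "B", "C", "B", "C", "B"], "B",
    [90, 85, 40, 75, 55, 80], pass_mark=70)
print(round(diff, 2))  # 4 of 6 answered correctly
```

A difficulty near 1.0 flags a question that may be too easy, a value near chance level flags one that is too hard or badly written, and a lopsided distribution over a single wrong answer points at a miskeyed item or a teaching gap, exactly the cases listed above.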

What question writing rules do you use when constructing a multiple-choice question? What other tips and tricks have you come up with for writing good multiple-choice questions? Feel free to leave your comments, questions and suggestions in the comments area!

Meaningful Feedback: Some good learning resources

Posted by Jim Farrell

December is the time to take stock of the year that’s winding down, and a highlight for me in 2010 was attending the eLearning Guild’s DevLearn conference. One of the things I enjoy most about DevLearn is attending the general sessions, where industry leaders speak passionately about the state of elearning and important trends like social networking, games and simulations in learning.

One of the speakers at this year’s closing session was Dr. Jane Bozarth, the elearning coordinator for the North Carolina Office of State Personnel. Jane is a great person to follow on Twitter (and not just because she is a fellow resident of the Triangle here in NC). Jane’s tweets are full of valuable resources, and one of the many topics that interests her (and me!) is the use of feedback in learning and assessments. Jane’s recent article, “Nuts and Bolts: Useful Interactions and Meaningful Feedback,” in Learning Solutions Magazine includes some great examples of feedback. In that article, Jane emphasizes that the point of instruction is to “support gain, not expose inadequacy” and that feedback should be provided with that goal in mind.

Jane’s article reminded me that during one of our Questionmark Podcasts, Dr. Will Thalheimer of Work-Learning Research noted the importance of retrieval practice in the learning process and the role of feedback in supporting retrieval. How much feedback to give depends on where the assessment falls in the learning process. For instance, feedback with a formative assessment can pave new paths to information that make future retrieval easier. Feedback on incorrect responses during learning repairs misconceptions, replacing them with correct information and a new mental model that the learner will use to retrieve information in the future. As Dr. Thalheimer mentions in the podcast, good authentic questions that support retrieval also support good feedback. You will find more details in Dr. Thalheimer’s research paper, Providing Feedback to Learners, which you can download from our Web site.

All these resources can help you use feedback to “support gain, not expose inadequacy,” making your assessments in the coming year more effective.