Two great workshops: Questionmark Boot Camp and Criterion-Referenced Test Development

Posted by Joan Phaup

Planning for the Questionmark 2013 Users Conference in Baltimore, Maryland, March 3 – 6 is well underway.

We have begun posting descriptions of breakout sessions and are pleased to announce the return — by popular demand — of two pre-conference workshops.

Both of these all-day sessions will take place on Sunday March 3, 2013:

Questionmark Boot Camp: Basic Training for Beginners, with Questionmark Trainer Rick Ault

Learn the basics of using Questionmark technologies before the conference starts. Beginning Questionmark users are invited to bring their laptops to a basic training course.

Get into gear with hands-on practice in creating questions, putting together an assessment, then scheduling it, taking it and seeing the results. Start off with some first-hand experience that will give you a firm footing for learning more at the conference.

Criterion-Referenced Test Development, with Sharon Shrock and Bill Coscarelli

Sharon and Bill are the authors of Criterion-Referenced Test Development: Technical and Legal Guidelines for Corporate Training. Participants in their pre-conference workshop, which is based on the book, will explore testing best practices and will learn how to meet rigorous competency testing standards and interpret test results correctly.

This workshop is ideal for trainers, instructional designers, course developers and training managers. Understanding the principles of skillful test authoring will help you create ethical and dependable testing programs that yield meaningful, measurable results.

You can save $200 by registering for the conference on or before November 16th. You can sign up for a workshop at the same time or add in a workshop later. It’s up to you!

The Future Looks Bright

Posted by Jim Farrell

Snapshot from a “Future Solutions” focus group

Our Users Conferences are a time for us to celebrate our accomplishments and look ahead to the challenges in front of us. This year’s conference was full of amazing sessions presented by Questionmark staff and customers. Our software is being used to solve complex business problems, and from a product standpoint it is very exciting to bring these real-life scenarios to our development teams to inspire them.

So where do we go from here? The Conference is our chance to stand in front of our customers and get feedback on our roadmap. We also held smaller “Future Solutions” focus groups to get feedback from our customers on what we have done and what we could do in the future to help them. In the best of times, these are an affirmation that we are on the right path. This was definitely one of those years.

One of our Future Solutions sessions focused on authoring. During that session, Doug Peterson and I laid out the future of Questionmark Live. This included an aggressive delivery cycle that will bring future releases at a rapid pace. Stay tuned for videos on new features available soon.

Ok…enough about us. This conference is really about our customers. The panel and peer discussion strand of this year’s conference had some of the most interesting topics. John Kleeman has already mentioned the security panel with our friends from Pearson Vue, ProctorU, Innovative Exams and Shenandoah University.

Another session that stood out was a peer discussion on test defensibility and using the Angoff method to set cut scores. This conversation was very interesting to me as someone who once had to create defensible assessments. I am eager to see organizations use Angoff, because not only do you want legally defensible assessments, you also want to define levels of competency for a role and be able to determine how that competency can predict future performance.

For those of you who do not know, the Angoff method is a way for Subject Matter Experts (SMEs) to estimate the probability that a marginal (borderline) candidate will answer a question correctly. Attendees at this conference session received a handout with a seven-step flowchart guiding them through the design, development and implementation of the Angoff method.
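To make that concrete, here is a minimal sketch (my own illustration, not an official Questionmark tool) of the arithmetic at the heart of the Angoff method: each SME rates each item with the probability that a borderline candidate answers it correctly, and the cut score is the average of those ratings across SMEs and items.

```python
def angoff_cut_score(ratings):
    """Compute an Angoff cut score as a percentage.

    ratings: dict mapping item id -> list of SME probability
    estimates (0.0-1.0) that a borderline candidate answers
    that item correctly.
    """
    # Average the SMEs' estimates for each item...
    item_means = [sum(ests) / len(ests) for ests in ratings.values()]
    # ...then average across items and express as a percentage.
    return 100 * sum(item_means) / len(item_means)

# Hypothetical ratings from three SMEs on a three-item test.
ratings = {
    "Q1": [0.90, 0.80, 0.85],  # easy item: most borderline candidates get it
    "Q2": [0.60, 0.50, 0.55],
    "Q3": [0.30, 0.40, 0.35],  # hard item
}
print(round(angoff_cut_score(ratings), 1))  # prints 58.3
```

In practice the seven-step process in the handout adds important steps around this calculation, such as training the SMEs and reviewing outlier ratings, but the cut score itself boils down to this average.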

If you are interested in the Angoff method and setting cut scores, I highly recommend reading Criterion-Referenced Test Development, written by our good friends Sharon Shrock and Bill Coscarelli.

We really hope to see everyone at the 2013 Users Conference in Baltimore March 3 – 6. (I am hoping we may even get a chance to visit the beautiful Camden Yards!)

Mastering Your Multiple-Choice Questions

Posted By Doug Peterson

If you’re going to the Questionmark Users Conference in New Orleans this week (March 21 – 23), be sure to grab me and introduce yourself! I want to speak to as many customers as possible so that I can better understand how they want to use Questionmark technologies for all types of assessments.

This year Sharon Shrock and Bill Coscarelli are giving a pre-conference workshop on Tuesday called “Criterion-Referenced Test Development.” I remember three years ago at the Users Conference in Memphis when these two gave the keynote. They handed out a multiple-choice test that everyone in the room passed on the first try. The interesting thing was that the test was written in complete gibberish. There was not one intelligible word!

The point they were making is that multiple-choice questions need to be constructed correctly or you can give the answer away without even realizing it. Last week I ran across an article in Learning Solutions magazine called “The Thing About Multiple-Choice Tests…” by Mike Dickinson that explores the same topic. If you author tests and quizzes, I encourage you to read it.

A few of the points made by Mr. Dickinson:

  • Answer choices should be roughly the same length and kept as short as possible.
  • Provide a minimum of three answer choices and a maximum of five. Four is considered optimal.
  • Keep your writing clear and concise – you’re testing knowledge, not reading comprehension.
  • Make sure you place the correct answer in the first two positions as often as in the last two.
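That last point is easy to check automatically. Here is a quick sketch (assuming a hypothetical mapping of question ids to key positions, 0 through 3 for a four-option question) that tallies where the correct answers sit:

```python
from collections import Counter

# Hypothetical item bank: question id -> index of the correct answer (0-3).
questions = {"Q1": 0, "Q2": 0, "Q3": 1, "Q4": 0, "Q5": 3, "Q6": 0}

positions = Counter(questions.values())
first_half = positions[0] + positions[1]   # keys in positions A or B
second_half = positions[2] + positions[3]  # keys in positions C or D
print(first_half, second_half)  # prints 5 1
```

Here the key lands in the first two positions five times out of six, so a test-wise participant who always guesses "A or B" would do far better than chance; shuffling the answer order (or rebalancing the keys by hand) fixes this.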

One of the most interesting points in the article is about the correctness of the individual answer choices. Mr. Dickinson proposes the following:

  • One answer is completely correct.
  • Another answer is mostly correct, but not completely. This will distinguish between those who really know the subject matter and those who have a more shallow understanding.
  • Another answer is mostly incorrect, but plausible. This will help reveal if test takers are just guessing or perhaps racing through the test without giving it much thought.
  • The fourth answer would be totally incorrect, but still fit within the context of the question.

Once you have an assessment made up of well-crafted multiple-choice questions, you can be confident that your analysis of the results will give you an honest picture of the learners’ knowledge as well as the learning situation itself – and this is where Questionmark’s reports and analytics really shine!

The Coaching Report shows you exactly how a participant answered each individual question. If your questions have been constructed as described above, you can quickly assess each participant’s understanding of the subject matter. For example, two participants might fail the assessment with the same score, but if one consistently selects the “almost correct” answer while the other routinely selects the plausible or totally incorrect answers, you know that the first participant has a better understanding of the material and may just need a little help, while the second participant is completely lost.

The Item Analysis report shows you how a question itself (or even the class or course materials) is performing.

  • If pretty much everyone is answering correctly, the question may be too easy.
  • If almost no one is getting the question correct, it may be too hard or poorly written.
  • If the majority of participants are selecting the same wrong answer:
    • The question may be poorly written.
    • The wrong answer may be flagged as correct.
    • There may be a problem with what is being taught in the class or course materials.
  • If the answer selection is evenly distributed among the choices:
    • The question may be poorly written and confusing.
    • The topic may not be covered well in the class or course materials.
  • If the participants who answer the question correctly also tend to pass the assessment, it shows that the question is a “good” question – it’s testing what it’s supposed to test.
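The statistics behind these judgments are straightforward. Below is a minimal sketch (my own illustration, not Questionmark's actual report logic) of two classic item-analysis numbers: item difficulty (the proportion answering correctly) and an upper-lower discrimination index (difficulty among the top half of scorers minus difficulty among the bottom half):

```python
# Hypothetical response matrix: one row per participant,
# one column per item; 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 0],
]

n_items = len(responses[0])
totals = [sum(row) for row in responses]

# Difficulty: fraction of all participants answering each item correctly.
difficulty = [
    sum(row[i] for row in responses) / len(responses) for i in range(n_items)
]

# Discrimination: rank participants by total score, then compare how the
# top half and bottom half did on each item (upper-lower index).
ranked = [row for _, row in
          sorted(zip(totals, responses), key=lambda t: t[0], reverse=True)]
half = len(ranked) // 2
top, bottom = ranked[:half], ranked[half:]
discrimination = [
    sum(r[i] for r in top) / half - sum(r[i] for r in bottom) / half
    for i in range(n_items)
]
```

In this made-up data, item 3 (index 2) has a negative discrimination index: weak scorers get it right more often than strong scorers, which is exactly the fingerprint of a mis-keyed or badly written question from the list above. Item 4 (index 3) discriminates perfectly, so it is doing its job.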

What question writing rules do you use when constructing a multiple-choice question? What other tips and tricks have you come up with for writing good multiple-choice questions? Feel free to leave your comments, questions and suggestions in the comments area!