Reflections on the San Antonio Users Conference

Posted by Doug Peterson

I had the good fortune of attending the Questionmark Users Conference in San Antonio, Texas a couple of weeks ago.

As required by (personal) law, I visited the Hard Rock Café for dinner on my first night in town! And let me tell you, if you missed the fresh sushi at the Grand Hyatt’s Bar Rojo, you missed something pretty doggone special.

But more special than Hard Rock visits and heavenly sushi was the chance to interact with and learn from Questionmark users. Honestly, user conferences are a favorite part of my job. The energy, the camaraderie, the ideas – it all keeps me fired up!

We had a great session on Item Writing Techniques for Surveys, Quizzes and Tests, with some wonderful conversations – I like my sessions to be more of a conversation than a lecture – and I picked up some helpful tips and examples to work into my next presentation. For those of you who couldn’t make this session, it’s based on a couple of blog series: check out the Writing Good Surveys series as well as the Item Writing Guide series. You’ll also want to read Improving Multiple Choice Questions and Mastering Your Multiple Choice Questions for more thoughts on that question type.

The other session I led was on using Captivate and Flash application simulations in training and assessments. As with my previous presentations on this topic, the room was packed and people were excited! During my years as a Questionmark customer, I was always impressed with the Adobe Captivate Simulation and Adobe Flash question types. I feel even more strongly about this since attending a webinar that a fairly popular LMS put on the other day: the process you have to go through to build a software simulation in one of their assessments is far too involved and complicated, which really drove home the simplicity of using the Captivate question type in Questionmark.

It really was great to see old friends and make new ones at the conference. I look forward to working with customers throughout the rest of 2014 and to seeing them again soon.

Problems and Fixes — Item Writing Guide, Part 4

Posted by Doug Peterson

In part 3 of this series on item writing, we began taking a look at some “problem questions” to figure out what was wrong with them and how to make them better. Let’s continue doing that.

[Example item image]

This is the ol’ “grammar give-away” problem. The stimulus ends in “a”, indicating that the answer begins with a consonant (or at least should begin with a consonant, if the assessment author is following standard rules of grammar). There’s only one choice that begins with a consonant, so the participant doesn’t need to know the answer – they just need to know a little grammar.

There are a couple of ways to fix this. One would be to end the stimulus with “a/an”. Another way would be to move the indefinite article (yes, I had to look that up) into the choices: an apple, a banana, an orange, and an eggplant.

Also be sure not to mix a singular in the stimulus with plurals in the choices, or vice versa. And if you’re writing questions in gender-specific languages like Spanish, French, or Italian, be sure to account for masculine and feminine definite and indefinite articles.

This question has a couple of things wrong with it:

[Example item image]

The first problem is pretty obvious. One choice is significantly longer than the other three. Typically this would mean that choice (b) is the correct answer, and in this case, that would be true.

Can you spot the other problem? It’s a little more subtle. The stimulus uses an important word – “strings” – and only the correct answer uses this word (in its singular form) as well. Without knowing anything about bass guitars, most people would answer this question correctly simply by noticing the use of the same important word in both the stimulus and one of the choices.

To fix this question, the second choice should be changed to something like “Set the intonation.” At that point the length of the correct choice is about the same as the length of the other choices, and the important word “string(s)” is not being used.

Please feel free to add your comments to this discussion – the more, the merrier! In our next installment, we’ll diagnose two more problems, and then wrap things up with a little summary.

Putting Theory Into Practice — Item Writing Guide, Part 3

Posted by Doug Peterson

In part 1 of this series we looked at the importance of fairness, validity and reliability in assessment items. In part 2 we looked at the different parts of an item and discussed some basic requirements for writing a good stimulus and good distractors.

Now it’s time to put all of this into practice. I’d like to present some poorly written items, figure out what’s wrong with them, and look at how they could be improved. I’ll be the first to admit that these examples tend to be a little over-the-top, but I’ve never been known for my subtlety (!), and a bit of exaggeration makes the problems I’m pointing out easier to see. Let’s start with a simple Yes/No question.

[Example item image]

Even if the stimulus of this item didn’t contain nonsense words, it would still be impossible to answer. Why? Because the stimulus basically asks two questions: should you beedle the diggle, or should you zix the frondle?

The stimulus is confusing because it is not clear and concise. Can you only take one of the two actions, and is the question asking which one? Or is it asking if you should take either of the two actions? An item like this is not fair to the test-taker because it doesn’t allow them to display their knowledge. We can fix this item by splitting it out into two questions.

  • Yes or No: When loading your snarkleblaster, should you beedle the diggle?
  • Yes or No: When loading your snarkleblaster, should you zix the frondle?

(And as long as we’re looking at a Yes/No question, bear in mind that a True/False or Yes/No question has a 50% chance of being answered correctly simply by guessing. It’s better to have at least 4 choices to reduce the probability of guessing the correct answer. For more thoughts on this, read my post Are True/False Questions Useless?)
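If you’d like the arithmetic behind that advice, here’s a minimal sketch, assuming a participant who guesses blindly among equally likely choices, with exactly one correct answer per item:

    \[ P(\text{correct guess}) = \frac{1}{n}
       \qquad \text{True/False: } \tfrac{1}{2} = 50\%,
       \qquad \text{four options: } \tfrac{1}{4} = 25\% \]

    \[ \text{Expected score from pure guessing on } k \text{ items} = \frac{k}{n}
       \qquad \text{e.g. 10 True/False items } \approx 5 \text{ correct, 10 four-option items } \approx 2.5 \]

In other words, going from two choices to four cuts the value of a lucky guess in half.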

Let’s take a look at another one:

[Example item image]

You don’t have to be an expert on child-rearing to get the answer to this question. Choices a, c and d are ludicrous, and all that’s left is the correct choice.

This item is not fair to the stakeholders in the testing process because it gives away the answer and doesn’t differentiate between test-takers who have the required knowledge and those who don’t. Make sure all of the distractors are reasonable within the context of the question.

So what might be some plausible, yet still incorrect, distractors? How about:

  • Engage your child in vigorous exercise to “wear them out”.
  • Raise your voice and reprimand your child if they get out of bed.
  • Discuss evacuation plans in case there is a fire or tornado during the night.

We’ll continue our review of poorly written items in the next post in this series. Until then, feel free to leave your comments below.