Writing Good Surveys, Part 2: Question Basics

Posted by Doug Peterson

In the first installment in this series, I mentioned the ASTD book, Survey Basics, by Phillips, Phillips and Aaron. The fourth chapter, “Survey Questions,” is especially good, and it’s the basis for this installment.

The first thing to consider when writing questions for your survey is whether the questions will return the data you're looking for. For example, let's say one of the objectives for your survey is to "determine the amount of time per week spent reading email."

Which of these questions would best meet that objective?

  1. How many emails do you receive per week, on average?
  2. On average, how many hours do you spend responding to emails every week?
  3. How long does it take to read the average email?
  4. On average, how many hours do you spend reading emails every week?

All four questions relate to dealing with email, but only one pertains directly to the objective. Numbers 1 and 3 could be combined to satisfy the objective if you're willing to assume that every email received is read – a risky assumption, in my opinion (and experience). Number 2 is close, but there is a difference between reading an email and responding to it, and again, you may not respond to every email you read.
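To see why combining numbers 1 and 3 is risky, here's a quick back-of-the-envelope sketch. All numbers are hypothetical, purely for illustration:

```python
# Hypothetical answers a respondent might give.
emails_per_week = 120    # answer to question 1
minutes_per_email = 2    # answer to question 3

# Combining the two answers implicitly assumes every email received is read:
estimated_hours = emails_per_week * minutes_per_email / 60
print(estimated_hours)  # 4.0

# If the respondent actually reads only, say, 70% of what they receive,
# the combined estimate overstates what question 4 would have measured:
read_fraction = 0.7
actual_hours = emails_per_week * read_fraction * minutes_per_email / 60
print(round(actual_hours, 1))  # 2.8
```

One question written directly against the objective avoids building that hidden assumption into your data.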

The next thing to consider is whether the question can be answered at all and, if so, whether it leads the respondent toward a desired answer.

The authors give two examples in the book. The first describes a situation where the author was asked to respond to the question, “Were you satisfied with our service?” with a yes or no. He was not dissatisfied with the service he received, but he wasn’t satisfied with it, either. However, there was no middle ground, and he was unable to answer the question.

The second example involves one of the authors checking out of a hotel. When she tells the clerk that she enjoyed her stay, the clerk tells her that they rate customer satisfaction on a scale of one to ten, and asks if she would give them a ten. She felt pressured into giving the suggested response instead of feeling free to give a nine or an eight.

Another basic rule for writing survey questions is to make sure the respondent can understand the question. If they can’t understand it at all, they won’t answer or they will answer randomly (which is worse than not answering, as it is garbage data that skews your results). If they misunderstand the question, they’ll be answering a question that you didn’t ask. Remember, the question author is a subject matter expert (SME); he or she understands the big words and fancy jargon. Of course the question makes sense to the SME! But the person taking the survey is probably not an SME, which means the question needs to be written in plain language. You’re writing for the respondent, not the SME.

Even more basic than providing enough options for the respondent to use (see the “yes or no” example above) is making sure the respondent even has the knowledge to answer. This is typically a problem with “standard” surveys. For example, a standard end-of-course survey might ask if the room temperature was comfortable. While this question is appropriate for an instructor-led training class where the training department has some control over the environment, it really doesn’t apply to a self-paced, computer-based e-learning course.
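One common fix for this is simple skip logic: show the question only to respondents who are in a position to answer it. A rough sketch of the idea (the function and question wording are hypothetical, for illustration only):

```python
# Hypothetical skip logic: ask about room temperature only when the
# course was delivered in a physical classroom.
def questions_for(delivery_mode):
    questions = ["Was the content relevant to your job?"]
    if delivery_mode == "instructor-led":
        questions.append("Was the room temperature comfortable?")
    return questions

print(questions_for("e-learning"))
# ['Was the content relevant to your job?']
```

Most survey tools support this kind of branching directly; the point is that the respondent should never see a question they have no way of answering.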

Another example of a question for which the respondent would have no way of knowing the answer would be something like, "Does your manager provide monthly feedback to his/her direct reports?" How would you know? Unless you have access to your manager's schedule and can verify that he or she met with each direct report and discussed their performance, the only question you could answer is, "Does your manager provide you with monthly feedback?" The same is true of questions that start with, "Do your coworkers consider…" – the respondent has no idea what his/her coworkers' thoughts and feelings are, so ask only about observable behaviors.

Finally, make sure to write questions that respondents are willing to answer. Asking a question such as "I routinely refuse to cooperate with my coworkers" is probably not going to get an honest response from someone who is, in fact, uncooperative. Something like "Members of my workgroup routinely cooperate with each other" is not threatening and does not make the respondent look bad, yet they can still answer with "disagree" and give you insight into the work atmosphere within the group.

Here’s an example of a course evaluation survey that gives the respondent plenty of choices.

The stories behind our stories, from Questionmark’s CEO

Posted by Eric Shepherd

I have been watching the Questionmark Blog with interest and thought that, as Questionmark’s CEO, it was about time that I made a contribution!

The Questionmark Blog was started to keep you in touch with our products, our news releases, learning materials and our Product Owners’ points of view.  We’ve been focusing on articles that assist assessment practitioners and instructional designers; recently we previewed how embedding syndicated assessments within wikis, web pages and blogs can support the learning process.

Separate from this initiative, I have been running a personal blog (http://blog.eric.info) to bring you more abstract thoughts, observations from travels, and distillations of conversations that I've enjoyed along the way. Not surprisingly, the tag cloud quickly shows what I blog about: Assessments, Books, Travel and Questionmark. Here are some links that you might find interesting:

•    A recent article on Learning Environments that explains how systems are now being built around Single Sign-on Portals, Wikis, Blogs and Data Warehouses
•    Questionmark Live – Story Behind the Story
•    Assessments Fundamentals, with articles on Fidelity of an Assessment, Bloom's Taxonomy, Item Analysis, Types of Assessments (Formative, Diagnostic, Summative, and Surveys), and many more.
•    A couple of YouTube videos, including one on assessments as they relate to learning professionals
•    My Favorite Books, which relate mostly to best practices in management and assessments. I'll be posting more as I get time.

I look forward to meeting you out in the web 2.0 world!

Being a Good SME Wrangler

Posted by Jim Farrell

I was recently demonstrating Questionmark Live to one of our customers, and he told me about the "Data Wrangler" job in animated movies. Basically, that person's job is to collect all the work from the animators. So all of you Instructional Designers now have a new role to add to your resume: SME Wrangler. It might actually be one of your more taxing and complicated duties.

The eLearning Guild recently held an online forum titled "To SME or Not to SME: Tips for Working with your Customer and your Team." I loved the title, but I think "How to Be a Good SME Wrangler" has a little more bite. As learning professionals we have to wear many different hats and foster different relationships. The relationship you have with your SMEs can often make or break your training program. One of the presentations touched on seven tools of the trade for working successfully with SMEs. The number one item on the list was resources. The presenters mainly discussed having a well-thought-out design document and project plan, but I immediately thought about Questionmark Live. What better way to establish a relationship with the experts in your company than to give them a tool that not only empowers them to transfer their knowledge but also immediately involves them in creating deliverables that will be used by their peers?

Make sure that when you take a look at Questionmark Live you think of your role as an SME Wrangler and how this tool could help you foster a successful relationship with your SMEs.