Getting the results you need from surveys

Posted by Brian McNamara

A survey is only as good as the results you get from it. That’s why it’s important to plan your survey forms carefully so that they yield accurate, valid data that can be analyzed to answer the questions you and your stakeholders are asking.

This article offers a few general tips on identifying the information you want to capture, writing survey questions, structuring surveys and planning ahead for how you or your stakeholders will analyze the data.

1. Provide a brief introduction to the survey that lets the respondents know the:

  • Purpose of the survey – why do you want the respondents’ opinions?
  • Length of the survey (Number of questions? How long will it take to complete?)
  • Closing date for survey responses

Tip: It also makes sense to include this information in the initial invitation to help set expectations and boost response rates.

2. Keep the survey short and sweet by asking only the minimum number of questions required; the longer the survey, the more likely respondents will abandon it or refuse to participate.

3. Avoid ambiguity in how your questions are worded; be as direct as possible.

4. Within the survey form, let respondents know how much of the assessment remains – built-in progress bars (available in most of Questionmark’s standard question-by-question assessment templates) can help here.

5. Consider the flow of the assessment. Ideally, your survey should group similar types of questions together. For example, in a course evaluation survey, you might ask two or three questions about the course content, then questions about the venue, and then questions about the instructor.

6. Avoid confusing respondents by keeping your Likert scale questions consistent where possible. For example, don’t follow a question that uses a positive-to-negative scale (e.g. “Strongly Agree” to “Strongly Disagree”) with a question that uses a negative-to-positive scale (e.g. “Very Dissatisfied” to “Very Satisfied”).
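If existing question banks do force you to mix scale directions, reverse-code the reversed items at analysis time so that a high number always means the same thing. A minimal sketch in Python (the 1–5 scale here is an assumption; adjust the bounds to your own scale):

```python
def reverse_code(score, scale_min=1, scale_max=5):
    """Flip a Likert response so that a 5 recorded on a
    negative-to-positive item lines up with a 1 on a
    positive-to-negative item, and vice versa."""
    return scale_max + scale_min - score

print(reverse_code(5))  # 1
print(reverse_code(3))  # 3 (the midpoint is unchanged)
```

Applying this consistently before averaging prevents reversed items from silently dragging scores in the wrong direction.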

7. Make it easy for respondents to answer surveys via a wide variety of devices and browsers. Check out previous blog articles on this topic: Tips for making your assessments BYOD-friendly.

8. Consider what respondent demographics and other information you may wish to use for filtering and/or comparing your survey results. For example, in a typical course evaluation, you might be looking to capture information such as:

  • Course name
  • Instructor name
  • Location/Venue
  • Date (or range of dates)

Questionmark provides different options for capturing demographic data in “special fields” that can be used in Questionmark’s built-in survey and course evaluation reports for filtering and comparison. Likewise, this demographic data can be exported along with the survey results to ASCII or Microsoft Excel format if you prefer to use third-party tools for additional analysis.
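As a sketch of that third-party route: once results and demographic fields are exported, even the standard library is enough for simple comparisons. The column names below are hypothetical; match them to whatever your own export actually contains.

```python
import csv
import io
from statistics import mean

# Stand-in for an exported results file; in practice you would
# open the CSV produced by your export instead.
exported = io.StringIO(
    "Course,Location,OverallRating\n"
    "Photoshop Basics,London,4\n"
    "Photoshop Basics,Paris,5\n"
    "Photoshop Basics,London,3\n"
)

# Group ratings by the demographic field you want to compare on.
by_location = {}
for row in csv.DictReader(exported):
    by_location.setdefault(row["Location"], []).append(int(row["OverallRating"]))

averages = {loc: mean(scores) for loc, scores in by_location.items()}
print(averages)  # {'London': 3.5, 'Paris': 5}
```

The same grouping idea extends to any special field you capture, such as instructor name or course date.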

9. Consider how you wish to capture demographic information.

  • Easiest way: you can ask a question! In Questionmark assessments, you can designate certain questions as “demographic questions” so their results are saved to “special fields” used in the reporting process. Typically you would use a multiple choice and/or drop-down question type to ask for such information. For example, if you were surveying a group of respondents who attended a “Photoshop Basics” course in three different cities, you might ask which city they attended in order to capture this data.
  • Embedding demographic data within assessment URLs: In some cases, you might already have certain types of demographic information on hand. For example, if you are emailing an invitation only to London respondents of the “Photoshop Basics” course, you can embed this information as a parameter of a Questionmark assessment URL – it’s one less question you’ll need to ask your respondents, and a sure-fire way to capture accurate location demographics with the survey results!
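The URL approach can be sketched with the standard library. The base URL and parameter names below are made up for illustration – check Questionmark’s documentation for the actual parameter syntax your version supports.

```python
from urllib.parse import urlencode

def build_survey_url(base_url, demographics):
    """Append demographic values you already know as URL parameters,
    so respondents don't have to answer those questions themselves."""
    return base_url + "?" + urlencode(demographics)

# Hypothetical base URL and parameter names for a London mailing.
url = build_survey_url(
    "https://surveys.example.com/perception.php",
    {"session": "PhotoshopBasics", "location": "London"},
)
print(url)
# https://surveys.example.com/perception.php?session=PhotoshopBasics&location=London
```

Each mailing segment (London, Paris, and so on) gets its own pre-built link, and the location arrives with every response automatically.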

If you are looking for an easy way to create surveys and course evaluations rapidly, check out Questionmark Live. And for more information about Questionmark’s survey and course evaluation reporting tools, click here.

Improving course evaluations and boosting participation

Posted by Jim Farrell

Course and session evaluations are popular assessments for helping to improve course and instructor quality at conferences and in learning programs all over the world.

One of our major goals over the past few years has been to make it easier to create and deliver course evaluations – and to help organizations glean more meaningful, actionable results from them.

Back in 2010, when we added the ability to author course evaluation surveys in Questionmark Live, we included question libraries to draw from in creating surveys. These libraries cover four topics: Demographics, Instructor, Course Materials and Facilities; you can either write your own questions or choose some from the libraries.

More recently, we’ve been exploring the use of QR codes to increase course evaluation response rates by taking participants directly to online surveys via their mobile devices. Go here and here for more details about the benefits of using QR codes.

What about the results of course evaluations? We now have four course evaluation reports in Questionmark Analytics: Course Summary, Instructor Summary, Class Summary and Class Detail. These reports allow you to drill into progressively more detail about your course evaluation results: you can start at the course level and work your way down to an instructor/question level of detail. Each report also has visual cues that make performance obvious at a quick glance.

The course summary report, for example, compares evaluation results across courses. It is most useful for managers and supervisors comparing different courses within an organization.

If you are a customer looking to improve your course evaluations, you can click here to read our Course Evaluation Best Practices Guide. Anyone who hasn’t used Questionmark Live can sign up for a free trial via our Tryouts and Downloads page.

Control access to surveys but keep the results anonymous

Posted by John Kleeman

When delivering a course evaluation survey or an employee engagement survey, it’s usually best to make the survey anonymous: this encourages people to answer candidly and so gives you the feedback you want. But how do you make the results anonymous yet still control who can take the survey and ensure they can take it only once?

Questionmark Perception lets you make a survey anonymous as one of the options when creating the assessment.

Anonymous setting screenshot

What some Questionmark users don’t know is that even if a survey is anonymous, you can still schedule people to it individually.

All the scheduling capabilities of Questionmark (who can take the survey, when they can do it, limiting to a single attempt) work normally with an anonymous survey. It’s just that the assessment delivery software doesn’t store the person’s name or any other identifying information with the results. So you can limit attempts, you can use Email Broadcast to send out an invitation to participants and you can even remind people who’ve not taken the survey to take it. But in the results database, names are replaced with the text “Anonymous,” and so none of the answers and comments your participants give will be identifiable.
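The underlying design can be illustrated in a few lines: keep a record of who has responded (for attempt limits and reminders) separate from the stored responses, which carry no identity. This is a hypothetical sketch of the general pattern, not Questionmark’s actual implementation:

```python
# Who has responded (drives reminders and single-attempt limits)...
participants = {"alice@example.com": False, "bob@example.com": False}
# ...is stored separately from WHAT was answered (no identity kept).
responses = []

def submit(email, answers):
    if participants.get(email):
        raise ValueError("already submitted")   # enforce one attempt
    participants[email] = True                  # mark attempt only
    responses.append({"respondent": "Anonymous", "answers": answers})

submit("alice@example.com", {"q1": "Agree"})

# Reminders can target non-responders without touching the answers.
pending = [email for email, done in participants.items() if not done]
print(pending)  # ['bob@example.com']
```

Because the two stores are never joined, reporting users can see every answer and comment without being able to tell who gave it.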

You can also use anonymous surveys in a similar way when running a survey via Single Sign On, from SharePoint or from a Learning Management System. The external system will authenticate the participant, but Perception will not store the identities of people with their results, and so instructors and other reporting users will see the answers and comments as anonymous.

If you are relying on this, you should check it for yourself: take a dummy survey and confirm that you cannot identify the respondent when reporting. One thing to be aware of is special fields, which can sometimes contain identifying information. There is a system setting that lets you control whether these are captured for anonymous surveys. (Questionmark Software Support Plan customers can see details here.)

We at Questionmark use this capability ourselves to deliver anonymous surveys to our own employees, and I hope it might be helpful to you, too.

Assessment for virtual training

Posted by John Kleeman

At the suggestion of the Masie Center, I’ve been reading an interesting book, Virtual Training Basics by Cindy Huggett, which has got me thinking about how you can use assessment effectively within virtual training.

Virtual training is an online event in which a trainer meets up with participants and instructs them in an online classroom or similar environment (for instance Microsoft Office Live Meeting, Webex or Adobe Connect). A recent survey by ASTD suggests that 6.4% of formal US training hours are virtual, which is a lot of training hours.

Assessment is a cornerstone of all learning, but when you are remote from your participants and cannot see their facial reactions or body language, assessment is even more important than in face-to-face training. If you are delivering virtual training, here are some ways you can consider using assessments.

  • Pre-test: A pre-test before the virtual training session is valuable for understanding participants’ knowledge in advance and for creating intrigue. In virtual training, it’s harder to engage participants or check in verbally with what they know, so pre-tests are particularly important.
  • Poll slides: Many systems allow poll slides, which present simple questions — usually multiple choice — that allow you to check participants’ views or reactions. These are basic, but easy and useful.
  • Real-time knowledge checks: While poll slides are helpful, they don’t usually store the results or identify people. In longer sessions, short quizzes that check knowledge of topics within the course are sometimes preferable. People can take the assessments in real time, and you can see the results collectively and by individual. This is very easy to set up in Questionmark Perception. In some tools, like Live Meeting, you can simply include a web page (see here for instructions) and each participant will get their own version of the quiz to fill in.
  • Course evaluations: These are important for all training, but in virtual training where you cannot see the reactions face-to-face, they are vital. Every virtual training session should have a course evaluation and should include questions on the virtual experience as well as the usual questions.
  • Post-course tests: Like any other session, virtual or real, people will forget over time, and questions sent after the event can prevent forgetting and reinforce learning.

As Internet speeds get faster, software improves and travel challenges and costs grow, more and more of us are going to be delivering virtual training. I think assessment within virtual training will be essential to making the training successful and also measuring its success.