Getting better response rates and data from course evaluations


Posted by Greg Pope

In my last post I shared some pointers about authoring course evaluation questions and surveys. I’d like to follow that up with some brief tips on how to get better response rates and high quality data when administering course evaluation surveys:

  • Ensure that participants know that their responses are anonymous. This encourages higher response rates and more candid responses (a minimal sketch of anonymizing responses at the data layer appears after this list).
  • Encourage participants to provide both good and bad feedback so that the instruction, course materials, facilities, and course curriculum can be improved.
  • Have participants take the survey on the last day of the course, or very soon after the course, so that their perspectives are fresh and genuine.
    • For example, link from the end-of-course exam to the course evaluation survey.
  • Consider completion of the course evaluation survey to be part of the successful completion of the course.
  • Support multiple delivery platforms for the course evaluation survey (e.g., mobile as well as traditional desktop/laptop) to make taking the survey more convenient for participants.
  • Consider reporting back to participants about summary results from the course evaluations and plans for improving courses in the future.
  • Keep the survey short and sweet so that participants are not fatigued by the process of taking it.
  • Ensure that the survey is free of spelling or grammatical errors.
  • Consider offering an opportunity for course evaluation responders to win a prize (e.g., gift certificate, t-shirt, coffee mug, etc.) for responding within a specified time window. In practice, treating the gathering of survey results as an exchange tends to increase response rates (“In return for taking this survey we will offer you an opportunity to win _____.”)
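The anonymity tip above can also be honored at the data layer. Here is a minimal sketch, assuming a simple Python collection script; the field names and salt are hypothetical and not from any Questionmark product. It strips identifying fields and keeps only a salted one-way hash, so duplicate submissions can be detected without revealing who responded:

```python
# Hypothetical sketch: anonymizing survey responses before storage.
import hashlib

IDENTIFYING_FIELDS = {"name", "email", "student_id"}

def anonymize(response: dict, salt: str) -> dict:
    """Drop identifying fields; keep a salted one-way hash for dedup."""
    clean = {k: v for k, v in response.items()
             if k not in IDENTIFYING_FIELDS}
    # The hash lets you spot repeat submissions without being able
    # to recover the respondent's identity from the stored data.
    digest = hashlib.sha256((salt + response["email"]).encode()).hexdigest()
    clean["respondent_token"] = digest[:12]
    return clean

raw = {"name": "Pat", "email": "pat@example.edu",
       "q1_instructor": 4, "q2_materials": 3,
       "comments": "More hands-on exercises, please."}
print(anonymize(raw, salt="eval-2010-spring"))
```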

Click here for a sample academic course evaluation and here for a corporate training example.

New Dutch and German Interfaces for Questionmark Perception


Posted by Joan Phaup

Our recent release of Questionmark Perception version 5.2 builds on the new features introduced in version 5. Most notably, we have added German and Dutch administrative, scheduling, browser-based authoring and reporting interfaces to Perception’s Enterprise Manager.

Click here if you would like to download version 5.2 and try it out.

Questionmark Moodle Connector: Supported Edition

I’m pleased to say that the supported edition of Questionmark’s Moodle Connector is now available to download from the Questionmark Web site. The connector integrates Questionmark Perception with the Moodle 1.9 open source course management system so that users can link Moodle courses to Perception course evaluations, quizzes, tests and exams.

The supported edition of the connector represents a milestone in the development of the Moodle Connector, which was the first of our integration products to be made available as an open source community edition during development.  I’d like to thank everyone who has tested the community edition of the connector. With your feedback we’ve been able to get the features of the connector right for this first supported edition.

Of course, development does not stop with the first release!  We’ll continue to develop the Community Edition as part of our Open Assessment Platform initiative.

You can find out more from the Moodle Connector pages on our developer site, and you can see the connector in action in this short video…

New Question Types Added to Questionmark Live


Posted by Jim Farrell

We’re Back!!! The Questionmark Live team has released some great new browser-based assessment authoring features. 

This release of Questionmark Live contains pull-down list question types, explanations, non-scored questions and an interface to support right-to-left languages such as Arabic. (We will get to this feature in another post.)

Explanations are a great way to collect scenarios, videos, links and images from subject matter experts without the need to create questions — easily crowdsourcing valuable information that is sitting in your organization right now.

The new pull-down list question presents a number of options for a participant to choose from. It has long been a popular way for Questionmark Perception users to collect information from participants in many types of assessments, including surveys.

Non-scored questions allow you to use Questionmark Live to create questions for surveys. This feature allows you to create a question that does not have a correct answer, score, or feedback.
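To make the two new question types concrete, here is a hypothetical sketch of how they might be modeled. The class and field names are illustrative only, not Questionmark Live’s actual format; the point is that one structure covers both a scored pull-down question and a non-scored survey question (the correct answer, score, and feedback are simply absent):

```python
# Hypothetical model of a pull-down list question; not Questionmark
# Live's internal format.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PullDownQuestion:
    stem: str                      # the question text
    choices: list[str]             # options shown in the pull-down list
    correct: Optional[str] = None  # None = non-scored (no right answer)
    score: int = 0                 # points awarded when scored
    explanation: str = ""          # SME-supplied context, links, media

# A scored knowledge question:
scored = PullDownQuestion(
    stem="Which interface languages were added in Perception 5.2?",
    choices=["German and Dutch", "French", "Spanish"],
    correct="German and Dutch", score=1,
    explanation="Version 5.2 added German and Dutch interfaces.")

# A non-scored survey question: no correct answer, score, or feedback.
survey = PullDownQuestion(
    stem="How would you rate the course materials?",
    choices=["Excellent", "Good", "Fair", "Poor"])
```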

Pull-down questions and non-scored questions are demonstrated in the following video. Take a look!

Ozi, Ozi, Ozi, Oi, Oi, Oi!

Posted by Rafael Lami Dozo

We’ve ventured Down Under to see if the Australian football supporters are ready to cheer for the Socceroos during the World Cup!

Just kidding! In fact, we are hard at work: speaking at public events, visiting customers and receiving the Platinum Award for Best Assessment Tool at LearnX Asia Pacific (our third year in a row winning this award in the Best-in-Class Technologies category)!

We are exhibiting and speaking at LearnX, which takes place this week in Sydney. I’ll be giving a presentation on “How to Measure the Talent you Manage” on Wednesday afternoon, and Questionmark CEO Eric Shepherd will present “Assess Where it Matters – Everywhere” on Thursday. So be sure to come see us if you are at LearnX!

Stay tuned for my next blog post, as we will be going to New Zealand to meet with our Kiwi users along with members of the New Zealand Association for Training and Development.

Best practices in writing course evaluation surveys

Example questions in a corporate course evaluation survey


Posted by Greg Pope

Course evaluations make it easy to obtain valuable feedback that can be gathered and analyzed in order to improve future learning activities. But to get meaningful results from these evaluations, it’s important to observe certain principles. Here are some best practice tips for authoring course evaluation questions and surveys:

  • Use Likert or survey matrix questions.
  • Using the same scale (e.g., Likert 1-5) for all questions on a course evaluation survey will yield the most accurate and easy-to-interpret reports.
  • Use short scales (e.g., Likert 1-5 rather than Likert 1-9) where possible to reduce reading time and help participants complete the survey quickly.
  • Set up all questions so that the most positive rating corresponds to the highest number on the scale (e.g., 4 = Strongly Agree) and the most negative rating to the lowest number (e.g., 1 = Strongly Disagree). Applying this pattern consistently to every question on the survey makes reports much more meaningful.
  • Reports will generally yield the most useful topic-level information when meaningful groups of questions are placed in a small number of topics or sub-topics rather than spread thinly across many (e.g., 3 – 4 questions per topic versus 1 question in each of numerous topics); see the sketch after this list.
  • Keep course evaluation surveys short and make them easy to complete: they generally should have about 10 – 15 questions.
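As a concrete illustration of the scale and topic advice above, here is a minimal sketch (the question IDs, topics, and ratings are made up) showing how a single, consistently oriented 1-5 scale makes topic-level reporting a matter of simple averaging:

```python
# Topic-level means from consistently oriented 1-5 Likert responses.
from statistics import mean

# Each participant's ratings, keyed by question ID (higher = better
# for every question, per the consistent-orientation tip above).
responses = [
    {"instr1": 5, "instr2": 4, "instr3": 5, "mat1": 3, "mat2": 2},
    {"instr1": 4, "instr2": 4, "instr3": 3, "mat1": 2, "mat2": 3},
]

# Group 3-4 related questions into a small number of topics.
topics = {"Instructor": ["instr1", "instr2", "instr3"],
          "Materials":  ["mat1", "mat2"]}

for topic, questions in topics.items():
    ratings = [r[q] for r in responses for q in questions]
    print(f"{topic}: mean {mean(ratings):.2f} on a 1-5 scale")
```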

In my next post, I’ll share some tips about how to get better response rates and high-quality data when administering course evaluations.

In the meantime, here’s a short tutorial on authoring a course evaluation using Questionmark Perception.
