Control access to surveys but keep the results anonymous

Posted by John Kleeman

When delivering a course evaluation survey or an employee engagement survey, it’s usually best to make the survey anonymous, as this will encourage people to answer candidly and so give you the feedback you want.  But how do you make the results anonymous yet still control who can take the survey and ensure they can only take it once?

Questionmark Perception lets you make a survey anonymous as one of the options when creating the assessment.

Anonymous setting screenshot

What some Questionmark users don’t know is that even if a survey is anonymous, you can still schedule people to it individually, as shown in the diagram below.

All the scheduling capabilities of Questionmark (who can take the survey, when they can do it, limiting to a single attempt) work normally with an anonymous survey. It’s just that the assessment delivery software doesn’t store the person’s name or any other identifying information with the results. So you can limit attempts, you can use Email Broadcast to send out an invitation to participants and you can even remind people who’ve not taken the survey to take it. But in the results database, names are replaced with the text “Anonymous,” and so none of the answers and comments your participants give will be identifiable.
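To make the separation concrete, here is a minimal illustrative sketch in TypeScript of the general idea: the schedule knows who was invited and how many attempts they have used, while the stored result carries only a placeholder name. The types and function are hypothetical, for illustration only, and are not Questionmark’s actual data model or code.

  // Illustrative only: hypothetical types, not Questionmark's real schema or API.
  interface ScheduleEntry {
    participantName: string;      // known to the scheduler, used for invitations and reminders
    assessmentId: string;
    attemptsUsed: number;         // lets the system enforce a single attempt
    attemptsAllowed: number;
  }

  interface StoredResult {
    participantName: "Anonymous"; // identity is never written to the results store
    assessmentId: string;
    answers: string[];
    comments?: string;
  }

  // Record the attempt against the schedule, but strip identity before the result is saved.
  function recordSubmission(entry: ScheduleEntry, answers: string[], comments?: string): StoredResult {
    if (entry.attemptsUsed >= entry.attemptsAllowed) {
      throw new Error("No attempts remaining for this participant");
    }
    entry.attemptsUsed += 1;
    return { participantName: "Anonymous", assessmentId: entry.assessmentId, answers, comments };
  }

The point of the sketch is simply that invitation and attempt-limiting logic can operate on identity while the saved results never contain it.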

You can also use anonymous surveys in a similar way when running a survey via Single Sign On, from SharePoint or from a Learning Management System. The external system will authenticate the participant, but Perception will not store the identities of people with their results, and so instructors and other reporting users will see the answers and comments as anonymous.

If you are relying on this, you should check it for yourself: take a dummy survey and confirm that the respondent cannot be identified in the results when reporting. One thing to be aware of is Special fields, which can sometimes contain identifying information. There is a system setting that lets you control whether these are captured for anonymous surveys. (Questionmark Software Support Plan customers can see details here.)

We at Questionmark use this capability to deliver anonymous surveys to our own employees, and I hope it might be helpful to you, too.

Podcast: Bryan Chapman on Assessment’s Strategic Role in Learning

 

Posted by Joan Phaup

I am looking forward to the Questionmark 2011 Users Conference for many reasons. For one thing, it will take me back to my hometown of Los Angeles! But more importantly, the conference will bring Questionmark staff together with our customers for three great days of learning and networking.

Bryan Chapman

A highlight of the conference will be a keynote by Bryan Chapman on Assessment’s Strategic Role in Enterprise Learning: Innovations and Trends.

As Chief Learning Strategist at Chapman Alliance, Bryan helps organizations define, operate and optimize their strategic learning initiatives. His experience with many different types of organizations will inform his talk, which will include examples of innovative assessment strategies that have helped shorten training delivery times, motivate learners and improve performance.

I asked Bryan for a sneak preview of what he’ll be talking about at the conference. You can listen in on our conversation here. But before you do, I’d like to remind Questionmark users that this Friday, December 3rd, is the last day for early-bird registration savings. So after you’ve listened to this podcast, be sure to register soon!

A conversation on the value of asking good questions

Joan Phaup

Posted by Joan Phaup

I enjoyed a blog post by Andy Klee of Klee Associates about a recent conversation he’d had with our own John Kleeman, Questionmark’s chairman. Klee, whose organization provides JD Edwards and SAP training and consulting, showed a great deal of interest in how good questions and tests can improve learning outcomes.

Click here to follow their wide-ranging discussion, which covers such topics as the challenge of creating high-quality test questions, the correlation between performance on certification exams and future job performance, and trends in exam design and administration.

It’s great to see more and more people recognizing that asking questions adds value to learning. If you would like to read more on this subject, check out this paper by Dr Will Thalheimer of Work-Learning Research: The Learning Benefits of Questions.

Conducting Observational Assessments: New Feature in Questionmark

Posted by Brian McNamara

We’re very pleased to announce the availability of a new feature for users of Questionmark OnDemand: Observational Assessments.

We’re all familiar with how traditional tests and quizzes work:  we are presented with questions and submit our answers for scoring.  Observational assessments work a little differently:  an observation of the participant is conducted, and the “observer” then answers questions to rate the participant’s performance, skills or knowledge.  For example, in a workplace setting an observational assessment could be used to measure and record how well an employee performs a certain task, rating areas such as skills, safety practices and adherence to required procedures.

The new feature we’ve introduced enables observers to log in to Questionmark Perception OnDemand, select the desired assessment, select which participant is to be rated, and then complete the assessment. Questionmark’s existing “Coaching Report” provides an ideal mechanism for sharing results and feedback from Observational Assessments with stakeholders.
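As a rough picture of the kind of record an observational assessment produces, here is a hypothetical sketch in TypeScript of the observer, the participant being rated, and a rating against each criterion. The field names and example values are illustrative only, not Questionmark’s actual data model.

  // Hypothetical sketch of an observational assessment result; names and values are illustrative.
  interface CriterionRating {
    criterion: string;        // e.g. "Follows the required procedure"
    score: number;            // observer's rating on the agreed scale
    comment?: string;         // optional evidence or notes from the observer
  }

  interface ObservationResult {
    assessmentId: string;
    observer: string;         // the person who logs in, selects the assessment and participant, and rates
    participant: string;      // the person whose performance is being observed
    observedOn: Date;
    ratings: CriterionRating[];
  }

  // Example: recording how well an employee performs a workplace task.
  const example: ObservationResult = {
    assessmentId: "machine-operation-check",
    observer: "supervisor.jones",
    participant: "employee.smith",
    observedOn: new Date(),
    ratings: [
      { criterion: "Pre-operation safety check", score: 4 },
      { criterion: "Adherence to required procedure", score: 5, comment: "No deviations observed" },
    ],
  };

Results structured along these lines are then the kind of thing a report such as the Coaching Report can present back to stakeholders.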

In many cases, conducting an observational assessment requires mobility – and we’re excited that the many new mobile delivery features and Apps introduced this year really add to the flexibility of how and where you can now conduct these assessments via Questionmark.

I invite you to take a couple of minutes to watch this video, which provides a brief conceptual overview followed by a short demonstration of how Questionmark Perception is used to deliver and report on this type of assessment.

Multiple choice quizzes help learning, especially with feedback

Posted by John Kleeman

I promised in an earlier blog entry to pass on my understanding of research in educational psychology about the unmediated or direct benefits of questioning, i.e., how answering questions helps people learn. I’ve recently read a 2008 paper by Butler and Roediger from the Memory Lab at Washington University in St. Louis, Missouri (see here for the 2008 paper and here for a 2010 review paper including the graph below), which includes some fascinating information on how multiple choice quizzes directly aid learning.

The researchers divided students randomly into four groups as follows:

  • Study a subject, no quiz
  • Study a subject, take a multiple choice quiz, no feedback
  • Study a subject, take a multiple choice quiz, feedback after each question
  • Study a subject, take a multiple choice quiz, feedback at the end of the quiz

They then tested all the groups a week later and got the results below.

Chart showing quizzes give better retention

As you can see, the students who had taken a quiz (or test, as the authors describe it) got better results on average than those who hadn’t taken a quiz. This is expected due to the general principle that answering questions gives retrieval practice, which helps you recall things later and so helps learning. This is similar to results I’ve blogged on elsewhere.

However, what is interesting about this study is that with multiple choice quizzes there is a potential danger: students may choose wrong answers and so think they have retrieved information which is in fact wrong. What this study showed was that giving feedback on the quiz improves learning further, as you can see in the graph above. Interestingly, although you might think that immediate feedback right after each question is best, this wasn’t the case in this example. Quizzes with feedback delayed until the end of the assessment gave better results than those with feedback after each question. The authors postulate that a slight delay allows the incorrect concept to dissipate before the feedback arrives, and also adds spacing in time, which helps learning.

My summary of what this research suggests:

  • Giving a quiz after learning will help retention, as it gives recall practice
  • Giving feedback helps improve retention, particularly in multiple choice quizzes where there is a danger of learners choosing wrong answers and thinking they are right
  • Feedback is better at the end of the quiz, not after each question

For more information on the research, see Professor Roediger’s publications page at http://psych.wustl.edu/memory/publications/.

One interesting issue this raises is that it’s common in certification exams not to give feedback, both to retain the confidentiality of the questions (feedback would repeat them) and because certification aims at measuring rather than learning. What this research shows is that if you want to help your successful and failing candidates learn, you could consider giving feedback in some form.

Here’s a question to allow you to practice retrieval on the subject of this blog:

Should you give feedback on multiple choice quizzes after each question or at the end of the assessment?

Embedding Assessments in DotNetNuke

Embed a Questionmark Perception assessment, survey or quiz within a DotNetNuke portal.

  • To see how this would look, see a snapshot of an assessment embedded within a web page using DotNetNuke.
  • Check out this how-to on our developer web site.
  • DotNetNuke is an open source platform for building web sites based on Microsoft .NET technology. DotNetNuke’s content management system is extensible and customizable through the use of skins, modules, data providers, language packs and templates.
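Conceptually, the embedding works by pointing an HTML frame within a DotNetNuke page or module at the assessment’s launch URL. Here is a minimal illustrative sketch in TypeScript; the function name, launch URL and query parameter are placeholders rather than Questionmark’s documented interface, so follow the developer how-to above for the authoritative steps.

  // Illustrative sketch only: builds an <iframe> pointing at an assessment launch URL.
  // Replace the URL and parameters with the values given in Questionmark's developer how-to.
  function embedAssessment(containerId: string, launchUrl: string): void {
    const container = document.getElementById(containerId);
    if (!container) {
      throw new Error(`No element with id "${containerId}" found on the page`);
    }
    const frame = document.createElement("iframe");
    frame.src = launchUrl;        // e.g. the launch URL for the chosen assessment or survey
    frame.width = "100%";
    frame.height = "800";
    frame.style.border = "none";
    container.appendChild(frame);
  }

  // Usage inside a DotNetNuke HTML/script module (the URL below is a placeholder):
  embedAssessment("assessment-container", "https://your-perception-server.example.com/launch?assessment=12345");

A server-side DotNetNuke module could achieve the same effect by rendering the iframe markup directly; the essential idea either way is that the portal page hosts a frame whose source is the assessment launch URL.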
