Use a survey with feedback to aid student retention

Posted by John Kleeman

I’d like to share an interesting application of Questionmark Perception at the University of Glamorgan, which is using self-assessment surveys to help increase student retention. Students answer questions about aspects of their studying and receive feedback to help them improve. These assessments also provide useful data that the University has used to inform its retention work. When discussing them with colleagues and students, the authors use the term “exercise” rather than “survey” or “assessment” to describe something designed primarily to help students improve their own achievement and progression. The exercises help avoid situations that can lead to drop-out.

There are two exercises: an “Early Days” exercise for new students and a “Study Health Check Exercise” for all students. Both exercises are anonymous, use branching to direct students to relevant questions and give short, actionable feedback.
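Branching of this kind can be modeled as a simple decision graph in which each answer routes to the next relevant question and may attach a short piece of feedback. Here is a minimal, hypothetical sketch in Python; the question text, routing and feedback messages are invented for illustration and are not taken from the Glamorgan exercises:

```python
# Minimal sketch of a branching survey: each answer routes to the next
# question and may attach short, actionable feedback.
# All question text and routing below are invented for illustration.

QUESTIONS = {
    "q1": {
        "text": "Have you found the library resources you need?",
        "routes": {"yes": "q3", "no": "q2"},
        "feedback": {"no": "Try the library's online induction tour."},
    },
    "q2": {
        "text": "Would a one-to-one session with a librarian help?",
        "routes": {"yes": None, "no": None},
        "feedback": {"yes": "Book a session via the library help desk."},
    },
    "q3": {
        "text": "Are you keeping up with weekly reading?",
        "routes": {"yes": None, "no": None},
        "feedback": {"no": "Schedule two short reading slots per week."},
    },
}

def run_survey(answers, start="q1"):
    """Walk the branching survey using a dict of {question_id: answer};
    return the feedback messages the respondent would see."""
    feedback, qid = [], start
    while qid is not None:
        q = QUESTIONS[qid]
        answer = answers[qid]
        if answer in q["feedback"]:
            feedback.append(q["feedback"][answer])
        qid = q["routes"][answer]
    return feedback
```

For example, a student who answers “no” to the first question is routed to the librarian question and sees both library tips, while a student who answers “yes” skips it entirely.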

Early Days Exercise

The Early Days exercise goes to all new students six weeks into the academic year, and is heavily promoted so that most of them will take it. The exercise includes questions on study resources, the student’s attitude to his or her academic work and how the student is getting to grips with university life. See below for a screenshot of feedback for a couple of questions.

Early Days Exercise screenshot

 

Study Health Check Exercise

This survey is sent out in the second term (semester) of the year. It similarly asks about the health of the student’s studying and learning; here is a sample question with its feedback:

Study Health Check Exercise screenshot

The exercise asks about various factors that the University believes are important for successful study, and gives feedback to help with areas of concern or weakness. Student satisfaction with this exercise is above 90% – it’s easy to take and gives actionable feedback.

John Kleeman and David Lewis at University of Glamorgan

As well as helping individual students, the exercise also helps the University identify issues affecting retention. Cross-tabulating questions about whether a student has considered withdrawing or suspending study against factors such as employment, family commitments and motivation can highlight what leads students to remain at the University.
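Cross-tabulation of this kind simply counts how often pairs of answers occur together. Here is a minimal sketch using Python’s standard library; the response data and category labels are invented for illustration, not drawn from the University’s actual results:

```python
from collections import Counter

# Hypothetical anonymised responses: (considered withdrawing?, hours of
# paid work per week). All data invented for illustration.
responses = [
    ("yes", "16+"), ("no", "0"), ("no", "1-15"), ("yes", "16+"),
    ("no", "0"), ("yes", "1-15"), ("no", "1-15"), ("no", "0"),
]

# Cross-tabulate the two questions: count each (withdrawal, work) pair.
crosstab = Counter(responses)

for (withdraw, work), n in sorted(crosstab.items()):
    print(f"considered withdrawing={withdraw:3}  paid work={work:5}  count={n}")
```

A table like this makes it easy to spot, for instance, whether students with heavy paid-work commitments are over-represented among those considering withdrawal.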

Thanks to David Lewis (pictured left with me, in the Faculty of Advanced Technology at Glamorgan), Julie Prior and Karen Fitzgibbon of the University of Glamorgan for sharing information about these exercises – you can see more here.

 

Some key reasons why these self-assessment surveys seem successful:

  • They take only 10 minutes or so to complete
  • They are well promoted
  • They provide short, actionable feedback
  • They are genuinely anonymous
  • They remain similar year on year, making them easy to maintain and useful for identifying trends
  • They focus on specific issues the University has learned are important for retention and student success

You can see a demonstration of Early Days here and a demonstration version of Study Health Check here.

Timing is Everything: Using psychology research to make your assessments more effective

Posted by Julie Delazyn

Over the past year or so, John Kleeman has been sharing new evidence from cognitive psychology indicating that when it comes to making your assessments more effective, timing is crucial.

One of the strongest research findings is that learning is far more effective if it is spaced out over time instead of being delivered all at once. There is evidence that giving regular quizzes helps people by:

  • encouraging them to space out their study in preparation for the quizzes
  • helping them learn simply by taking the quizzes and seeing the feedback

Psychology research also shows that taking a quiz or test directly helps you retain information: The more you practice retrieving something, the more likely you are to retain it for the long term.

This SlideShare presentation will help you learn about psychology research and how you can apply it to improve your use of assessments.

Using QR Codes – Start to Finish

Posted By Doug Peterson

QR codes. They’re everywhere these days!

Questionmark staff have already written some great blogs on QR codes – Using QR Codes to Direct Smartphone Users to Assessments (which includes a link to a fantastic PowerPoint slideshow on using QR codes) and QR Codes for Surveys: A Perfect Fit, so I don’t need to repeat how useful they are in this blog. What I will do is show you just how easy it is to set up a QR code that launches an assessment, and how easy it is for a participant to launch an assessment from a mobile device using a QR code.

In this video I use Authoring Manager to generate a URL that launches a simple quiz I created last December. It’s an open access quiz, and I’ve added “&name=holiday” to the end of the URL string so that the user doesn’t have to provide a name at the Open Access portal: they’ll be taken straight into the quiz.
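Appending “&name=…” is just adding an extra query-string parameter to the launch URL. This can be sketched with Python’s standard library; the base URL below is a placeholder, not the real Perception open-access URL format:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_name(url, name):
    """Append a name parameter to an open-access launch URL so the
    participant skips the name prompt. Base URL is a placeholder."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query.append(("name", name))
    return urlunparse(parts._replace(query=urlencode(query)))

launch_url = add_name("https://example.com/open?session=quiz1", "holiday")
print(launch_url)  # https://example.com/open?session=quiz1&name=holiday
```

The resulting URL can then be fed to any QR code generator to produce the graphic, as the video demonstrates.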

Watch the video to learn how to turn the URL into a QR code graphic, and what the experience is like for a user accessing the quiz via the QR code!

It’s one thing to train, another to prove training

Simone Buchwald

Posted by John Kleeman

I recently published an interview in the SAP Community Network with Simone Buchwald, who was the product manager of SAP’s Learning Solution (LMS) when it first came out and now works in the SAP ecosystem as a consultant at EPI-USE. She shared some information on assessments within SAP and trends in compliance, which I thought readers of this blog might enjoy. Here is an extract from the interview:

 

Why do people use assessment in the SAP Learning Solution (LSO)?

Assessments play a vital role in LSO as they broadly do in every LMS, especially as a big focus for LSO use is compliance management. Assessments are a crucial part of a compliance solution, because organizations have to test and document that people understand what they are supposed to learn.

I remember when I was at SAP and we made contact with Questionmark – we immediately decided that we would not extend the LSO test authoring system much, but would instead suggest Questionmark as the assessment partner tool for LSO because of its strong capabilities.

 

What is the best way to document training for compliance purposes? Is it okay just to record that someone has been through training, or do you need to assess to check understanding?

I see a trend both in regulated and non-regulated environments that it is not enough just to track the completion of training; you also need to validate the understanding of learning.

So I think it is important to check understanding by assessments, both for tracking purposes and in case an auditor wants to see the records. It is one thing to do something, and another to prove it. This nowadays even extends beyond the core regulatory use cases to courses where the company wants to track both completion and understanding, for example in sales training to prove the value of the learning to the company.

 

What trends do you see in compliance; how are things changing?

Many of the compliance processes that in the past were mandatory only in compliance-driven businesses – pharmaceuticals, chemicals and mining, for example – are now expanding into other industries. For instance, we see needs for mandatory training on certain business processes, sometimes imposed by regulation and sometimes by the business itself. When organizations roll out equal opportunities training, for example, the company may decide to apply the same process as for compliance-required training: checking who has completed it and who has done the assessments and been validated. Often this is driven by internal business needs, not external regulators.

People are seeing the advantage of conducting learning and assessments not just when the regulator requires it, but for their own business purposes.

 

You can see the full interview here.

Alignment, impact and measurement with the A-model

Posted by Julie Delazyn

It’s essential for learning and other important change initiatives – including assessment and evaluation – to align with an organization’s strategic goals. So it’s important for organizations and individuals to clarify the goals, objectives and human performance issues of their work and to design systematic assessment programs to evaluate progress.

The A-model, developed by Dr. Bruce C. Aaron, offers a framework for helping organizations focus on their goals, promote continuous learning and adjust performance as needed. This framework, or map, helps hold the many elements of human performance in place – from the original business problem or issue through program design and evaluation.

We have many resources for helping you understand the A-model and apply it to the specific needs of your organization:

  • White Paper: Alignment, Impact and Measurement with the A-model (free with registration)
  • A podcast with Bruce Aaron
  • Other perspectives on the A-model
  • This SlideShare presentation!

Improving course evaluations and boosting participation

Posted by Jim Farrell

Course and session evaluations are popular assessments for improving course and instructor quality at conferences and in learning programs all over the world.

One of our major goals over the past few years has been to make it easier to create and deliver course evaluations – and to help organizations glean more meaningful, actionable results from them.

Back in 2010, when we added the ability to author course evaluation surveys in Questionmark Live, we included question libraries to draw from in creating surveys. These libraries cover four topics: Demographics, Instructor, Course Materials and Facilities; you can either write your own questions or choose some from the libraries.

More recently, we’ve been exploring the use of QR codes to increase course evaluation response rates by taking participants directly to online surveys via their mobile devices. Go here and here for more details about the benefits of using QR codes.

What about the results of course evaluations? We now have four course evaluation reports in Questionmark Analytics: Course Summary, Instructor Summary, Class Summary and Class Detail. These reports let you drill into progressively more detail about your course evaluation results: you can start at the course level and work your way down to an instructor/question level of detail. Each report also has visual cues to make performance obvious at a quick glance.

In the example below, the Course Summary report compares evaluation results across courses. It is most useful for managers and supervisors comparing different courses within an organization.

If you are a customer looking to improve your course evaluations, you can click here to read our Course Evaluation Best Practices Guide. Anyone who hasn’t used Questionmark Live can sign up for a free trial via our Tryouts and Downloads page.
