How tests help online learners stay on task

Posted by Joan Phaup

Online courses offer a flexible and increasingly popular way for people to learn. But what about the many distractions that can cause a student’s mind to wander off the subject at hand?

According to a team of Harvard University researchers, administering short tests to students watching video lectures can decrease mind-wandering, increase note-taking and improve retention.

“Interpolated memory tests reduce mind wandering and improve learning of online lectures,” a paper by Harvard Postdoctoral Fellow Karl K. Szpunar, Research Assistant Novall Y. Khan and Psychology Professor Daniel L. Schacter, was published this month in the Proceedings of the National Academy of Sciences (PNAS).

The team conducted two experiments in which they interspersed online lectures with memory tests and found that such tests can:

- help students pay more sustained attention to lecture content
- encourage task-relevant note-taking
- improve learning
- reduce students’ anxiety about final tests

“Here we provide evidence that points toward a solution for the difficulty that students frequently report in sustaining attention to online lectures over extended periods,” the researchers say.


In one of these experiments, a group of students watched a 21-minute lecture presented in four segments of about five minutes each. After each segment, students were asked to do some math problems. Some students were then tested on the material from the lecture, while others (the “not tested” group) did more math problems.

This research seems to indicate that including tests or quizzes could make online courses more successful. So yes! Use assessments to reinforce what people are learning in your own courses. Whatever type of information you are presenting online – whether it’s a lecture, an illustration or text – you can help students stay focused by embedding assessments right on the same page as your learning materials.
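If you build your own course pages, here is a minimal sketch of that idea: a small script that places a quiz iframe on the same page as a lecture video. The quiz URL and element ID below are invented placeholders, not references to any particular product feature.

```typescript
// Minimal sketch (assumptions noted): place a quiz iframe on the same page
// as a lecture video. The quiz URL and container ID are placeholders.
function embedQuizBelowLecture(quizUrl: string, containerId: string): void {
  const container = document.getElementById(containerId);
  if (!container) {
    throw new Error(`No element with id "${containerId}" found on this page`);
  }

  const quizFrame = document.createElement("iframe");
  quizFrame.src = quizUrl;                   // the embedded assessment
  quizFrame.width = "100%";
  quizFrame.height = "400";
  quizFrame.title = "Quiz for this lecture"; // accessible label for the frame

  // Append the quiz right after the lecture content, so learners can answer
  // questions without leaving the page they are studying on.
  container.appendChild(quizFrame);
}

// Hypothetical usage: the page has <div id="lecture-segment-1"> holding the video.
embedQuizBelowLecture("https://example.com/quizzes/segment-1", "lecture-segment-1");
```

Keeping the quiz in the same view as the lecture means learners do not have to switch pages to be tested, which is the point the research above makes about interpolated tests.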

A previous post on this blog offers an example of how embedded quizzes are being used to engage learners. You can read more about the recently published research, including an interview with Szpunar and Schacter, in the Harvard Gazette. You can read the paper here.

SlideShare Presentation on Assessment Feedback

Posted by Julie Delazyn

The impact of assessments on learning is something Questionmark Chairman John Kleeman has written about extensively in this blog. He has explained psychology research that demonstrates the importance of retrieval practice – including taking formative quizzes with feedback – as an efficient way of retaining learning for the long term.

John has been focusing lately on what the effective use of feedback can bring to assessments, and he shared what he’s been learning in a presentation at the Questionmark Users Conference: “Assessment Feedback – What Can We Learn from Psychology Research?”

In this SlideShare presentation, John Kleeman explains how assessments and feedback can influence learning and offers some good practice recommendations.

For more on this theme, check out John’s conversation with Dr. Douglas Larsen, an expert in medical education at Washington University in St. Louis, about Dr. Larsen’s research on how tests and quizzes taken during learning aid retention in medical education. You can also click here to read John’s post about ten benefits of quizzes and tests in educational practice.

Advice from Cognitive Psychologist Roddy Roediger on using retrieval practice to aid learning

Posted by John Kleeman

I am a keen admirer of the work of Professor Roddy Roediger, a cognitive psychologist who investigates how quizzes and tests directly aid learning by giving retrieval practice. I recently interviewed him, and here is how he explains this effect and how we can apply it in practice.


Could you explain a little about your background and how you moved into the memory field?

I have a Ph.D. in cognitive psychology from Yale University. I’ve always been interested in memory, and I was surprised to find there was an academic discipline devoted to studying remembering, so I naturally gravitated to that. I worked with Robert Crowder and Endel Tulving at Yale, two leading people in the field. Since then I have taught at Purdue University, the University of Toronto and Rice University. I am now James S. McDonnell Distinguished University Professor at Washington University in St. Louis.

Most of my career has been doing laboratory research trying to show factors that help or harm memory. In the 1990s I published a series of studies on illusions of memory – that is, on false memories  — how we can have very strong memories of something that either never happened or that happened quite differently from how we remember it. About 8 years ago I became interested in applying what we were learning about memory to education, and I started looking at factors that are important for learning and remembering but that are not well appreciated in education. One of these is retrieval practice, which is what happens when we test ourselves, or when we are given a test or quiz. When we actually retrieve information from memory, it’s a very potent enhancement to remembering it. We are much more likely to remember something again if we actively retrieve it than if we are passively exposed to it in restudying.

Is this the testing or quizzing effect  —  that if you learn something and answer questions on it, you are much more likely to retain it for the long term than if you don’t answer the questions?

Yes, absolutely. Making people actually think about material, to reconstruct it, to say it in their own words is much more effective than simply restudying it, yet many students don’t seem to appreciate this. If you ask students how they study to remember, their study strategy is typically re-reading and reviewing. That’s good up to a point, but it would be much better if they actively practiced retrieval, which is what a test requires them to do. If you haven’t constructed or answered practice questions, you won’t do as well on a test as students who have practiced.

A lot of our readers are in corporate training; does this apply in this field too? How should this affect people’s design of learning programs?

I think retrieval practice can have direct implications in the corporate world.

Let me give you an anecdote. One of the people I was talking with about this was skeptical. She was going to work on the train, reading the newspaper like she does every morning. She decided she’d put the paper down after each story and summarize it to herself mentally in her own words. When she got home that night, she asked her husband to test her on the stories she’d read.  And she did really well, surprising them both. Because after she’d read the stories, she’d retrieved them and put them in her own words in her mind.

So if you’re a salesperson and you need to remember a lot of qualities of your product to go out and sell it, the best way to do it is to practice retrieving the information and consult your notes only when you fail to retrieve a critical piece of information. Then when you are with a customer you will know all the information. I talk to textbook salespeople a lot. Some can walk in and tell me all about the books, while others just get out their notes in their folders and show them to me. It’s so much more impressive when the salespeople can look you in the eye and tell you about the book without having to refer to their notes.

How does this actually work inside the brain?

We don’t know the neural mechanisms yet, but I can tell you some factors that seem to be important.

There seems to be something about effortful retrieval that matters. If you have to put a bit of effort into the retrieval — if it’s harder to bring the fact out of memory — that helps. So imagine you are trying to remember a face or a name; say you meet someone and you want to remember her name. You might think it would be good to repeatedly retrieve the name immediately after you met her, but it is not. Repeated immediate retrieval is like rote rehearsal – and that doesn’t do very well. But if you space out your retrievals – so you do it right away after you meet the person (to make sure you have the name) and then you wait a while to try again and you keep trying at spaced intervals, you will remember the name much better. The delayed retrieval makes you expend a bit more effort. You want to make retrieval a little difficult for yourself, so something about retrieval effort does seem to matter.

Another way you can see that is if you have people read a passage and take a multiple-choice test. In a multiple-choice test you see all the alternatives and you see which one is familiar and correct. You will get a slight benefit in retention from that. But if you are given a short answer question and can actively retrieve the answer, you will get even more benefit, because you have to reproduce the information instead of just recognizing it.  Although both tests provide a benefit, research shows that more benefit accrues from a short answer test or quiz where you have to retrieve information than from a multiple choice or true/false one where you just have to recognize it.

Would that apply to other kinds of recall questions like putting a number as your answer or filling in a blank in a question?

Yes, it does. Fill-in-the-blank questions do provide the benefit. I assume the same would be true for remembering numbers, but I do not know of any research on that topic yet.

What about with multiple response questions or matching questions?

We haven’t done the research in these areas, but we believe that questions that stimulate recall are superior to those that rely on recognition; still, all retrieval practice is useful.

Does it just apply to learning knowledge and facts or does it apply to learning concepts and higher levels of learning?

It definitely applies to concepts. Let me give one example.

Larry Jacoby and his colleagues at Washington University study how people learn bird concepts like warbler or thrasher and so on. He had some people study examples of birds and which category they were in, while another group was given tests on birds and tried to guess which category they belonged in (and then got feedback). So one group just studied birds with their category names, whereas the other group learned them while being tested on the names. When he tested both groups a couple of days later, the people who had been tested while learning did better than those who had just studied the examples and the categories. In the test, he showed novel examples that people hadn’t seen before, for instance a bird that belonged in the thrasher family but that had not been used in the practice phase, and the people who’d taken the tests did better. Answering the questions about the birds allowed them to grasp the concept better and generalize it to new examples.

By testing yourself, making mistakes and being corrected, you sharpen what you know about a concept.

So this sounds like a significant way that people can learn better that isn’t widely known. Why is that?

I don’t know! In his essay on memory, Aristotle said, “Exercise in repeatedly recalling a thing strengthens the memory.” Sir Francis Bacon and William James also knew the benefits of retrieval practice (or recitation, as it used to be called) and wrote about them. They didn’t have evidence, of course, except from their own experience. But the technique has mostly been lost from education and training.  In fact the idea of retrieval practice is pretty much derided in education because people in the U.S., at least, are so opposed to anything that smacks of testing.

Certainly testing can be misused; in the old school days there was an emphasis on rote memorization – students had to learn poetry, say, by heart. Educators later decried what they called this “kill and drill” approach to education and got away from these techniques. That is good in part, because the philosophy behind rote memorization was misguided. Some educators a hundred years ago considered “Memory” to be a faculty of mind that operates like a muscle. The idea was that if a student practiced memorizing poetry, “the Memory” would become stronger and would be better at learning and remembering other things, like algebra. However, the mind simply does not work that way. Practicing one topic helps that topic but does not usually spill over to learning unrelated topics.

But on the other hand, with the de-emphasis on memorization, the benefits of active retrieval should not be lost, because active retrieval is a potent memory enhancer. When you watch children learn multiplication tables, they use flash cards with 6×4 on one side and 24 on the other, and teachers say, “Practice until you think you really know it. Practice until it’s completely automatic.” So teachers use retrieval practice for multiplication tables, but the idea that you can use it for much more complex ideas is not widely appreciated.

Testing has gotten a bad name in the educational community. Instead of thinking of testing as standardized testing to place people into groups, we need to see the use of low-stakes quizzes in the classroom, and self-testing outside the classroom, as a study and learning strategy.

Next week we’ll publish the second part of the interview, in which Professor Roediger gives practical advice for people seeking to use the retrieval practice effect to help people learn.

Using Twitter to help learners retain knowledge

Posted by John Kleeman

Here’s a question for you: “What is the best way of stopping people forgetting things after learning?”

Think about this for a moment before looking ahead if you can.

I hope your answer is something like this: by asking them questions over time after the learning takes place.

When you learn something, you connect two or more concepts in memory. And when you are asked a question about what you have learned, you have to search your memory to find the answer. This searching makes the connection in memory stronger, so in the future you will be more likely to remember what you have learned rather than forget it. If you’re not familiar with this important idea, see these white papers by learning expert Will Thalheimer for more information:  The Learning Benefits of Questions and Measuring Learning Results.

If your learners go on to another course or go back to work, it’s not always easy to reach them to stimulate their memory with follow-up questions. Here’s where Twitter comes in: it can be a great tool for sending follow up questions.


  1. Have your learners follow you on Twitter, either on your main account, or on a subsidiary account made for each course.
  2. Post short questions as tweets to stimulate people’s memory. Remember, even thinking about the answer can help reinforce the learning. You could post the right answer the next day.
  3. Follow these up with quizzes in Questionmark Perception. You can post links to these assessments in your tweets. With the new support for mobile devices in Perception version 5, your learners can access these quizzes from mobile devices as well as PCs and Macs, and take them from home or while traveling.

Shortening a question to 140 characters is usually possible, and it’s easy to compress a URL to Perception’s open access entry point (open.php) to fit within a tweet. For instance, the URL http://bit.ly/ElectricQuiz links to one of Questionmark’s sample assessments on Electricity Skills.
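To make that step concrete, here is a minimal sketch that assembles a tweet from a question and a shortened assessment link and warns you if it exceeds the 140-character limit. The question text is invented for illustration; only the sample assessment URL comes from above, and posting the tweet itself is left to whatever Twitter tool you normally use.

```typescript
// Minimal sketch: build a follow-up question tweet from a short question and a
// shortened assessment link, and check that it fits within 140 characters.
const TWEET_LIMIT = 140;

function composeQuizTweet(question: string, shortUrl: string): string {
  const tweet = `${question} Test yourself: ${shortUrl}`;
  if (tweet.length > TWEET_LIMIT) {
    throw new Error(
      `Tweet is ${tweet.length} characters; shorten the question to fit in ${TWEET_LIMIT}.`
    );
  }
  return tweet;
}

// Example using the sample assessment link mentioned above; the question text
// is only an illustration.
const tweet = composeQuizTweet(
  "Which unit is used to measure electrical resistance?",
  "http://bit.ly/ElectricQuiz"
);
console.log(tweet); // post this from the course's Twitter account
```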
I hope this idea helps. And in case you’ve forgotten, what is the best way of helping people remember after learning?

Measuring Learning Results: Eight Recommendations for Assessment Designers

Posted by Joan Phaup

Is it possible to build the perfect assessment design? Not likely, given the intricacies of the learning process! But a white paper available on the Questionmark Web site helps test authors respond effectively to the inevitable tradeoffs in order to create better assessments.

Measuring Learning Results, by Dr. Will Thalheimer of Work-Learning Research, considers findings from fundamental learning research and how they relate to assessment. The paper explores how to create assessments that measure how well learning interventions are preparing learners to retrieve information in future situations – which, as Will states, is the ultimate goal of training and education.

The eight bits of wisdom that conclude the paper give plenty of food for thought for test designers. You can download the paper to find out how Will arrived at them.

1. Figure out what learning outcomes you really care about. Measure them. Prioritize the importance of the learning outcomes you are targeting. Use more of your assessment time on high-priority information.

2. Figure out what retrieval situations you are preparing your learners for. Create assessment items that mirror or simulate those retrieval situations.

3. Consider using delayed assessments a week or month (or more) after the original learning ends—in addition to end-of-learning assessments.

4. Consider using delayed assessments instead of end-of-learning assessments, but be aware that there are significant tradeoffs in using this approach.

5. Utilize authentic questions, decisions, or demonstrations of skill that require learners to retrieve information from memory in a way that is similar to how they’ll have to retrieve it in the retrieval situations for which you are preparing them. Simulation-like questions that provide realistic decisions set in real-world contexts are ideal.

6. Cover a significant portion of the most important learning points you want your learners to understand or be able to utilize. This will require you to create a list of the objectives that will be targeted by the instruction.

7. Avoid factors that will bias your assessments. Or, if you can’t avoid them, make sure you understand them, mitigate them as much as possible, and report their influence. Beware of the biasing effects of end-of-learning assessments, pretests, assessments given in the learning context, and assessment items that are focused on low-level information.

8. Follow all the general rules about how to create assessment items. For example, write clearly, use only plausible alternatives (for multiple-choice questions), pilot-test your assessment items to improve them, and utilize psychometric techniques where applicable.

Questionmark Conference: Progress in Criterion Referenced Measurement


Posted by Joan Phaup

A highlight of the Questionmark Users Conference in Memphis  was Sharon Shrock and Bill Coscarelli’s keynote, “Results You Can Rely On: What We’ve Learned from 25 Years of Criterion-Referenced Measurement.”

Sharon and Bill took us from the early history of testing to the work of Robert Glaser, who in laying the foundations of criterion-referenced testing focused on the importance of setting objectives and measuring test takers against a standard instead of against one another. They had us try William Angoff’s method of setting cut scores (click here to access a detailed paper on this subject by Bill and Sharon together with Andrew Barrett and John Kleeman) and reviewed Donald Kirkpatrick’s four levels (“Don’t skip Kirkpatrick Level 2 if you are using 3 and 4!”). They also described the six levels of what they call the “Certification Suite” (noted by Dr. Will Thalheimer in his review of the latest edition of Shrock and Coscarelli’s book, Criterion-Referenced Test Development).

A key point was that people on the job are doing much more than remembering facts. So test questions, rather than just operating at the memory level, should deal with real problems that people who do a particular job have to think through.

Tuesday’s conference schedule also included sessions on how to organize item banks, effective reporting techniques, teaching faculty to use Questionmark Perception, item analysis, and test maintenance best practices.

Wednesday brought some encore tech training sessions and a “road ahead” session seeking participants’ reactions to ideas for future features and services. Then some quick good-byes as participants headed for home. Here’s looking forward to next year’s conference!