Simulating real life: Questions that test application of knowledge

Posted by Doug Peterson

Questionmark offers over 20 different question formats. We have the standard multiple choice, multiple response, true/false and yes/no question types. We also have question types with more interaction to engage the participant more fully: hot spot and drag-and-drop, fill in the blank and select a blank, and matching and ordering. And we have text match and essay questions that allow participants to show their understanding by formulating a response from scratch.

Questionmark has two other question types that I’d like to discuss, because I feel they are very powerful yet often overlooked: the Adobe Captivate and Flash question types.

Prior to joining Questionmark, I worked for a large telecommunications company, and one of my group’s responsibilities was to train customer call center employees. The call center representatives had to know all about hooking up set-top boxes and DVD players as well as configuring routers and setting up wireless connections. They had to know how to use several different troubleshooting applications as well as the main trouble ticket application. We used Captivate and Flash question types in a few different ways to effectively and accurately assess the participants’ knowledge as they went through their 13 weeks of training.

  1.  We used Flash and ActionScript to create a duplicate of the trouble ticket application. And I mean *duplicate*. It looked and behaved exactly like the real thing. Honestly, when I had the two open side by side, there were a couple of times I couldn’t tell which was the real application and which was the simulation; that’s how realistic it was. With this simulation, we were able to go beyond multiple choice questions that just asked, “What value would you select for Trouble Reason?” or “What error code would you use, given the customer’s description?” Instead, we presented the participant with what appeared to be, and behaved exactly like, the trouble ticket application and said, “Fill out the ticket.” We gave a point or two for every value they entered correctly, every checkbox they checked correctly, and every radio button they selected correctly. In this way we could assess their understanding of what fields needed to be populated at every stage of the process, their overall ability to use the software, and their understanding of what values to use in each field.
  2.  Similar to #1, we created simulations for setting up a wireless connection or configuring a router. We presented the participant with a Windows desktop and they had to go through the process of setting up a connection to a local router – entering the SSID, entering the WEP key, etc. We didn’t give points for individual steps in this one, as the instructions were to set up a connection – either you did it all correctly and established the connection, or you didn’t.
  3.  The class was typically taught in a classroom, and at one point the instructor would wheel in an audio/visual cart with a television, a set-top box, a DVD player, and a home theater sound system. The members of the class would then wire the components together correctly, or troubleshoot the instructor’s incorrect wiring job. Then one day we were asked to teach the class remotely, with students taking the training in their own homes. How could we do the wiring exercise if the students weren’t all in the same physical location? As I’m sure you’ve guessed, we used a Flash simulation. The simulation presented the backs of the various components along with the ability to select different types of wiring (HDMI cable, coax cable, and RCA cables). Students could click-and-drag the selected wire from a connector on one component to a connector on another component.
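The partial-credit approach in the first simulation — a point or two per correctly completed field — can be sketched in modern JavaScript (the original used ActionScript, but the logic is similar in spirit; the field names, expected values, and point weights below are purely illustrative, not the actual trouble ticket):

```javascript
// Score a simulated trouble ticket field by field against an answer key.
// Field names and point weights here are made up for illustration.
const answerKey = {
  troubleReason:  { expected: "No Dial Tone", points: 2 },
  errorCode:      { expected: "E-104",        points: 2 },
  dispatchNeeded: { expected: true,           points: 1 },
};

function scoreTicket(response) {
  let score = 0;
  let max = 0;
  for (const [field, { expected, points }] of Object.entries(answerKey)) {
    max += points;
    if (response[field] === expected) score += points; // credit per field
  }
  return { score, max };
}

// Example: two of the three fields filled in correctly.
const result = scoreTicket({
  troubleReason: "No Dial Tone",
  errorCode: "E-205",    // wrong code: no points for this field
  dispatchNeeded: true,
});
console.log(result); // { score: 3, max: 5 }
```

The second simulation’s all-or-nothing scoring would simply check whether `score === max` and award a single pass/fail point instead.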

Because this way of assessing a learner’s understanding is not all that common, we used similar simulations as formative quizzes during the training and provided a practice assessment prior to the first “real” assessment. This helped the participants get comfortable with the format by the time it really counted, which is important: We want to be fair to the learner and make sure we give them every opportunity to prove their knowledge, skill or ability without interference or stress. It’s not fair to suddenly spring a new question format on them that they’ve never seen before.

One great way to learn more about this topic is to attend the Questionmark 2014 Users Conference March 4-7 in San Antonio. I typically present a session on using Flash and Captivate in e-learning and assessments. Looking forward to seeing you there!

Conference Close-up: Using Flash and Captivate Questions with Questionmark

Posted by Joan Phaup

Participants in the annual Questionmark Users Conference bring a lot of enthusiasm about using innovative question types in their assessments. A number of our customers have extensive experience with this and like to share their expertise at the conference. I spoke the other day with Doug Peterson from Verizon Communications and asked him about the case study he will share at the conference about using Flash and Captivate questions within Questionmark Perception.

Here’s a quick wrap-up of our conversation:

Q: What’s your role at Verizon Communications?
A: I have two roles: I develop, maintain and deliver training — mainly now on internet technologies — and I’m responsible for a series of online automated tests for our help center training program. This is a pass/fail curriculum and very high stakes because these tests can affect people’s job status. So we need to be absolutely sure that the tests are well written and well maintained. These used to be written tests that were graded by an instructor. We turned to Questionmark for an objective, unbiased, online, airtight testing system and I oversee that.

Q: How are you using Questionmark Perception?
A: We have a couple of tests for each of the three modules in the training curriculum. We use Questionmark for end-of-lesson reviews as well as the higher-stakes tests that determine whether a person has passed or failed a module. We use scenarios that trainees might encounter in working with a customer. There might be 6 to 8 scenarios in each test and 10 or 12 questions about each scenario. The trainees take these tests right in the classroom, on their classroom computers. We create individual QM accounts for each student and schedule the tests directly for those accounts. We schedule them for a specific day and time window. No one can see the tests except the students, and they can only access them during the testing window. We had subject matter experts tell us what we needed to cover in the scenarios and what questions we needed to ask about them. They explained what would be a reasonable way to present a question or simulation to test a particular skill. Once we’d created all the scenarios and written all the questions, we did an in-depth validation.

Q: What will you be sharing during your case study presentation at the Users Conference?
A: Our call center agents have to use several applications when they get a call from a customer. They’ll have to look up a trouble ticket, get information about the customer and so forth. We need to make sure they know how to use those applications, so we have created Perception questions using Captivate and Flash files with ActionScript that present the application to the student. Then the student needs to work through the application to demonstrate their proficiency with it. We’ve worked out a way to create a highly interactive, very realistic simulation in Flash that captures each student’s actions in using a particular application. It really tracks step by step. Because the simulation reports each action individually, when we run reports after the test we can easily see if a lot of people are missing something, like clicking on a particular button. Then the instructor can go back and make sure the students understand what they are supposed to do. We went through a complex process to figure all this out, but it’s given us the ability to build these simulations with ActionScript coding and all kinds of logic and still pass back individual point values for different tasks. I’m very proud of the tests we have created and the work we have done. We have some fabulous questions in there that allow the students to show that they really understand the applications and know how to do something from start to finish. We learned many tips and tricks along the way, and I will be sharing those with the people at my session.
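The reporting idea Doug describes — spotting steps that many students miss once each action is scored individually — can be sketched in JavaScript. (The step names and data below are invented for illustration; in the real system, Perception’s reports surface this from the per-step point values the Flash simulations pass back.)

```javascript
// Aggregate per-step results across many attempts to find the steps
// that are most often missed. Step names here are hypothetical.
function missedStepReport(attempts) {
  const missCounts = {};
  for (const attempt of attempts) {
    for (const [step, correct] of Object.entries(attempt)) {
      if (!correct) missCounts[step] = (missCounts[step] || 0) + 1;
    }
  }
  // Sort steps by how often they were missed, most-missed first.
  return Object.entries(missCounts).sort((a, b) => b[1] - a[1]);
}

// Three hypothetical attempts at a three-step task.
const attempts = [
  { openTicket: true,  clickEscalate: false, closeTicket: true  },
  { openTicket: true,  clickEscalate: false, closeTicket: false },
  { openTicket: false, clickEscalate: true,  closeTicket: true  },
];
console.log(missedStepReport(attempts));
// → [ [ 'clickEscalate', 2 ], [ 'closeTicket', 1 ], [ 'openTicket', 1 ] ]
```

Here the instructor could see at a glance that most students forgot the escalate step and review it with the class.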

Q: What are you looking forward to at this year’s conference?
A: I really enjoyed the sessions on item analysis and test validity at the 2009 conference, and I am looking forward to learning even more about those subjects this year. And anything about new functionality in Perception Version 5 will be on my list too.

You can attend Doug’s presentations and many others at the conference in Miami March 14 – 17. Early-bird registration ends January 22nd, so sign up soon!

Start the New Year with Learning: Choose a Webinar

Posted by Joan Phaup

What better way to start the New Year than with a fresh line-up of Web seminars? We have just added many new one-hour Webinar sessions to our U.S. schedule and hope you will take advantage of these learning opportunities! The following sessions will suit a wide variety of people, from those who are just beginning to think about using online assessments to experienced users of Questionmark Perception:

Questionmark Perception Orientation — A great starting point for anyone who would like a quick overview of authoring, scheduling, delivering and reporting on Perception assessments. This Webinar is offered twice each week, so choose the session that suits you best.

Analyzing and Sharing Assessment Results with Questionmark Enterprise Reporter — Learn about Perception’s 12 standard reports and the data and statistics they contain. Participants will also see how to use templates to streamline report creation and understand the various filter options available.

Monday, January 25th at 3 p.m. Eastern Time

Beyond Multiple Choice: Nine Ways to Leverage Technology for Better Assessments — Learn techniques for creating effective assessments that will help your organization improve performance, manage workforce competencies, and ensure regulatory compliance.

Wednesday, January 20th, and Thursday, February 18th, at 3 p.m. Eastern Time.

Questionmark Customers Online: Integrating Perception with SAP — Find out how the Nebraska Public Power District has integrated Perception with SAP Learning Solution to create a one-stop learning shop for employees working in a highly regulated industry.

Tuesday, January 26th, at 1 p.m. Eastern Time

From Item Banking to Content Harvesting: Authoring in Questionmark Perception — This Web seminar will demonstrate the use of different authoring tools to author questions for use in surveys, quizzes, tests and exams. Discover which tools are the most practical for you and how to make them work together to produce assessments quickly and easily.

Thursday,  February 4th, at 3 p.m. Eastern Time

Introduction to Questionmark Live: Browser-Based Authoring for Subject Matter Experts — Questionmark Live lets you easily write questions and then export them for use in Questionmark Perception. Participants in this webinar will learn to create questions of seven different types and then email or download them to a Perception administrator for import into Perception.

Wednesday, February 17th, at 3 p.m. Eastern Time

You may register for any of these Webinars by clicking here.

How to Create Matching Questions in Questionmark Perception

Posted by Joan Phaup

Quiz and test authors need an arsenal of different question types to suit various purposes. Matching questions, which present two series of words or ideas, ask participants to match items from one list to items in the other. Learners must correctly identify which items go together: a state or country and its capital, for instance.

Matching questions make it possible to measure a relatively large amount of knowledge in a small amount of space, but it’s important to bear in mind that they emphasize information recognition rather than information recall.
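The scoring logic behind a matching question is straightforward: one point per correctly matched pair. As a rough sketch in JavaScript (Perception handles this internally; the country-and-capital key below simply mirrors the example above):

```javascript
// Illustrative scoring for a matching question: one point per pair
// matched correctly against the answer key.
const key = { France: "Paris", Japan: "Tokyo", Canada: "Ottawa" };

function scoreMatching(response) {
  return Object.keys(key).filter(item => response[item] === key[item]).length;
}

// Example: two of three pairs matched correctly.
console.log(scoreMatching({ France: "Paris", Japan: "Tokyo", Canada: "Toronto" })); // 2
```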

A matching item in Questionmark Perception might look like the question below. In this example, which uses a graphical presentation format, someone has already started figuring things out!

[Screenshot: a matching question in graphical presentation format]

Here’s a quick tutorial on how to create a matching question in Perception, with or without a graphical interface. The tutorial will also show you how to set up scoring and feedback.