Simulating real life: Questions that test application of knowledge

Posted by Doug Peterson

Questionmark offers more than 20 different question formats. We have the standard multiple choice, multiple response, true/false and yes/no question types. We also have more interactive question types that engage the participant more fully: hot spot, drag-and-drop, fill in blanks, select a blank, and matching and ordering. And we have text match and essay questions that allow participants to show their understanding by formulating a response from scratch.

Questionmark has two other question types that I’d like to discuss, because I feel they are very powerful yet often overlooked: the Adobe Captivate and Flash question types.

Prior to joining Questionmark, I worked for a large telecommunications company, and one of my group’s responsibilities was to train customer call center employees. The call center representatives had to know all about hooking up set-top boxes and DVD players as well as configuring routers and setting up wireless connections. They had to know how to use several different trouble-shooting applications as well as the main trouble ticket application. We used Captivate and Flash question types in a few different ways to effectively and accurately assess the participants’ knowledge as they went through their 13 weeks of training.

  1.  We used Flash and ActionScript to create a duplicate of the trouble ticket application. And I mean *duplicate*: it looked and behaved exactly like the real thing. Honestly, there were a couple of times when I had the two open side by side and got confused about which was the real application and which was the simulation; that's how realistic it was. With this simulation, we were able to go beyond multiple choice questions that just asked, “What value would you select for Trouble Reason?” or “What error code would you use, given the customer’s description?” Instead, we presented the participant with what appeared to be, and behaved exactly like, the trouble ticket application and said, “Fill out the ticket.” We gave a point or two for every value entered, checkbox checked, or radio button selected correctly. In this way we could assess their understanding of which fields needed to be populated at every stage of the process, their understanding of what values to use in each field, and their overall ability to use the software.
  2.  Similar to #1, we created simulations for setting up a wireless connection or configuring a router. We presented the participant with a Windows desktop, and they had to go through the process of setting up a connection to a local router: entering the SSID, entering the WEP key, etc. We didn’t give points for individual steps in this one, as the instructions were simply to set up a connection; either you did it all correctly and established the connection, or you didn’t.
  3.  The class was typically taught in a classroom, and at one point the instructor would wheel in an audio/visual cart with a television, a set-top box, a DVD player, and a home theater sound system. The members of the class would then wire the components together correctly, or troubleshoot the instructor’s incorrect wiring job. Then one day we were asked to teach the class remotely, with students taking the training in their own homes. How could we do the wiring exercise if the students weren’t all in the same physical location? As I’m sure you’ve guessed, we used a Flash simulation. The simulation presented the backs of the various components along with the ability to select different types of wiring (HDMI, coax, and RCA cables). Students could click and drag the selected wire from a connector on one component to a connector on another.

Because this way of assessing a learner’s understanding is not all that common, we used similar simulations as formative quizzes during the training and provided a practice assessment prior to the first “real” assessment. This helped the participants get comfortable with the format by the time it really counted, which is important: We want to be fair to the learner and make sure we give them every opportunity to prove their knowledge, skill or ability without interference or stress. It’s not fair to suddenly spring a new question format on them that they’ve never seen before.

One great way to learn more about this topic is to attend the Questionmark 2014 Users Conference, March 4-7 in San Antonio. I typically present a session on using Flash and Captivate in e-learning and assessments. Looking forward to seeing you there!
