Simulating real life: Questions that test application of knowledge

Posted by Doug Peterson

Questionmark offers over 20 different question formats. We have the standard multiple choice, multiple response, true/false and yes/no question types. We also have more interactive question types that engage the participant more fully: hot spot and drag-and-drop, fill in blanks and select a blank, and matching and ordering. And we have text match and essay questions that allow participants to show their understanding by formulating a response from scratch.

Questionmark has two other question types that I’d like to discuss, because I feel they are very powerful yet often overlooked: the Adobe Captivate and Flash question types.

Prior to joining Questionmark, I worked for a large telecommunications company, and one of my group’s responsibilities was to train customer call center employees. The call center representatives had to know all about hooking up set-top boxes and DVD players as well as configuring routers and setting up wireless connections. They had to know how to use several different troubleshooting applications as well as the main trouble ticket application. We used Captivate and Flash question types in a few different ways to effectively and accurately assess the participants’ knowledge as they went through their 13 weeks of training.

  1.  We used Flash and ActionScript to create a duplicate of the trouble ticket application. And I mean *duplicate*. It looked and behaved exactly like the real thing. Honestly, when I had the two open side by side there were a couple of times I got confused about which was the real application and which was the simulation; that’s how realistic it was. With this simulation, we were able to go beyond multiple choice questions that simply asked, “What value would you select for Trouble Reason?” or “What error code would you use, given the customer’s description?” Instead, we presented the participant with what appeared to be, and behaved exactly like, the trouble ticket application and said, “Fill out the ticket.” We gave a point or two for every value entered, checkbox checked or radio button selected correctly. In this way we could assess their understanding of which fields needed to be populated at each stage of the process and their overall ability to use the software, as well as their understanding of what values to use in each field.
  2.  Similar to #1, we created simulations for setting up a wireless connection or configuring a router. We presented the participant with a Windows desktop, and they had to go through the process of setting up a connection to a local router – entering the SSID, entering the WEP key, etc. We didn’t give points for individual steps in this one, as the instructions were to set up a connection – either you did it all correctly and established the connection, or you didn’t.
  3.  The class was typically taught in a classroom, and at one point the instructor would wheel in an audio/visual cart with a television, a set-top box, a DVD player, and a home theater sound system. The members of the class would then wire the components together correctly, or troubleshoot the instructor’s incorrect wiring job. Then one day we were asked to teach the class remotely, with students taking the training in their own homes. How could we do the wiring exercise if the students weren’t all in the same physical location? As I’m sure you’ve guessed, we used a Flash simulation. The simulation presented the backs of the various components along with the ability to select different types of wiring (HDMI cable, coax cable, and RCA cables). Students could click and drag the selected wire from a connector on one component to a connector on another.

Because this way of assessing a learner’s understanding is not all that common, we used similar simulations as formative quizzes during the training and provided a practice assessment prior to the first “real” assessment. This helped the participants get comfortable with the format by the time it really counted, which is important: We want to be fair to the learner and make sure we give them every opportunity to prove their knowledge, skill or ability without interference or stress. It’s not fair to suddenly spring a new question format on them that they’ve never seen before.

One great way to learn more about this topic is to attend the Questionmark 2014 Users Conference, March 4-7 in San Antonio. I typically present a session on using Flash and Captivate in e-learning and assessments. Looking forward to seeing you there!

“It takes 20 years to build a reputation and five minutes to ruin it”

Posted by John Kleeman

A recent corporate survey reported in Insurance Journal suggests that reputation is the hardest risk to manage. The survey indicates that 81% of companies see reputation as their most significant asset but are challenged in knowing how to protect it.

Warren Buffett famously said, “It takes 20 years to build a reputation and five minutes to ruin it”. So how can organizations avoid those fateful five minutes?

Assessments can be a great tool to mitigate risk to reputation. I’d like to share some ideas on this from my Questionmark colleagues, Eric Shepherd and Brian McNamara.

Let’s start by considering the classic business model shown in the diagram below. A company uses its core capabilities in Production with a supplier network and product/services development to make an Offer to its customers, which it communicates via a sales and marketing Channel, with a supporting Finance structure.

Classic business model. Production, Offer, Channel, Finance

The pink shaded areas in the diagram are where there is reputation risk.

If you make mistakes within Production – in regulatory compliance, processes & procedures or health & safety – this can seriously hurt your reputation. Errors in regulatory compliance or failure to follow processes & procedures can similarly damage reputation in Finance. Assessments can help by confirming health and safety knowledge, checking the competence of employees and testing knowledge of processes & procedures.

Many companies have a bigger challenge in the sales and marketing Channel, as this is more spread out and harder to control. You have to comply with laws on how you sell, both industry-specific and general ones like anti-corruption. The people in your Channel must have product/solution knowledge. And reputation is hurt by overselling and unsatisfied customers.

The diagram below breaks down the Channel into typical parts:

breaking down Channel into Market Messaging and Relationship Management

How can assessments help with reputation challenges in the Channel?

Market Messaging

When you message your customers, there is a risk that your messaging is inappropriate or that messages do not resonate. Most organizations assess customers with surveys to determine if they are “getting it”.

Sales

You need your sales people, whether in-house personnel or partners, to comply with laws and avoid corruption. They need to ensure your customers are satisfied, by selling fairly and not using trickery. Online quizzes and tests are great ways to check your sales people know all the rules and are competent to sell. Observational assessments using tablets or smartphones also let supervisors check team members.

Customer Care

In customer care, a challenge is high staff turnover, requiring lots of training. As in sales, the customer care team need product and process knowledge and need to satisfy customers. Quizzes and tests motivate learning, maintain focus and enable recognition of people who “get it”.

Technical Support

Lastly, every company has challenges when products or services don’t work. How you deal with problems impacts your reputation. The challenge for technical support is to delight the customer and fix problems on first call.

Quizzes and tests are useful in technical support, but something that works really well for technical teams is a certification program. Skills and knowledge required are often complex, and using assessments to certify gives technical support teams career progression. It also encourages pride in their jobs, leading to better employee retention and better service.

 

I hope this article helps you realize that online assessments help solve one of the biggest challenges facing business – mitigating risk to reputation. Next time you are making an internal case for online assessments, consider whether your senior management might find reducing reputation risk a compelling reason to deploy assessments.

Best practices for test design and delivery: Join the webinar

Posted by Joan Phaup

So many people signed up for Doug Peterson’s recent web seminar about best practices for test design and delivery that we’re offering it again in August:

Join us at 11 a.m. Eastern Time on Thursday, August 22, for Five Steps to Better Tests: Best Practices for Design and Delivery

This webinar will give you practical tips for planning tests, creating items, and building, delivering and evaluating tests that yield actionable, meaningful results.

Doug speaks from experience, having spent more than 12 years in workforce development. During that time, he created training materials, taught in the classroom and over the Web, and created many online surveys, quizzes and tests.

The webinar is based on Doug’s 10-part series in this blog about test design and delivery, which is also available in the form of a Questionmark White Paper, Five Steps to Better Tests.

Join the webinar for a lively explanation of these five essential steps for effective test design and delivery:

1. Planning the test
2. Creating the test items
3. Creating the test form
4. Delivering the test
5. Evaluating the test

Go to our UK website or our US website for webinar details and free registration.

 

Open Standards: Spotlight on CSS

Posted by Steve Lay

In my role as Integrations Product Owner and champion of Questionmark’s Open Assessment Platform strategy, I often write on the topic of open standards.

When we browse the internet on our mobiles, tablets or even on the humble PC, our experience is based on a vast stack of open standards covering everything from the way the information is wrapped up in ‘packets’ for sending over the network to the way text and graphics appear on our screens.

You’ve probably all heard of HTML, the main markup language used for creating web pages. HTML, or HyperText Markup Language to give it its full name, allows web servers to specify how text is broken up into paragraphs, lists or tables, when it should be emphasised and how it relates to media files like images and videos that are also rendered on the page. But HTML has a lesser-known yet powerful helper: Cascading Style Sheets (CSS).

CSS is a standard which allows a designer to apply ‘style’ to a web page. By style, we are talking about formatting information: things that affect the appearance of the page without affecting the meaning. Essentially, information on the web is split into these two halves: content (in HTML) and style (in CSS). Initial versions of the CSS standard were rudimentary, and support across different browsers was often inconsistent. But the standard is now on version 3, often abbreviated to CSS3, and renderings are much more predictable. Also, adoption of more advanced features is rapidly becoming the norm rather than the exception.

Because Questionmark has adopted HTML and CSS, this content/style division translates into different responsibilities for the question author (responsible for content) and the graphic designer (responsible for style). By being mindful of this division, and of the fact that the same question may have different styles applied on different devices or in different contexts, authors can avoid question wording that depends on the style or type of rendering.

For example, a phrase such as “which category applies to the text in red?” makes specific reference to an element of style appearing elsewhere in the content. If colour is not essential to the meaning, it would be better to use a more neutral phrase such as “the emphasised text”. Being aware of different styles has the knock-on benefit of making assessment content more accessible while ensuring it looks good!
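To make this separation concrete, here is a minimal CSS sketch. The class name key-term is an invented example rather than anything from Questionmark’s templates; the point is simply that the same content can be styled differently in different contexts:

    /* In the HTML (content): <span class="key-term">photosynthesis</span> */

    /* In the style sheet (style): how the key term is rendered can change
       without touching the question wording. */
    .key-term {
      color: #c00000;      /* red in the on-screen theme */
      font-weight: bold;
    }

    @media print {
      .key-term {
        color: #000000;    /* plain black when printed, so wording like
                              "the text in red" would no longer make sense */
      }
    }

A question stem that refers to “the emphasised text” remains correct under both renderings.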

Questionmark has embraced CSS as the best technology for customising the appearance of tests. It is easy to copy the default CSS files and change the colours and fonts, say, to match your company portal.

In this screenshot, I’ve created a yellow background simply by changing one line in the default style sheet:

Screenshot of an assessment rendered with a yellow background
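If it helps to picture the change, it amounts to something along these lines. The actual selector and property in Questionmark’s default style sheets may differ, so treat this as an illustrative sketch rather than the real file contents:

    /* In a copy of the default style sheet */
    body {
      background-color: #ffff99;   /* changed from the default white background */
    }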

With CSS, web designers can help you make your assessments look even more professional!

Evidence that topic feedback correlates with improved learning

Posted by John Kleeman

It seems obvious that topic feedback helps learners, but it’s great to see some evidence!

Here is a summary of a paper, “Student Engagement with Topic-based Facilitative Feedback on e-Assessments” (see here for full paper) by John Dermo and Liz Carpenter of the University of Bradford, presented at the 2013 International Computer Assisted Assessment conference.

Dermo and Carpenter delivered a formative assessment in Questionmark Perception over a period of three years to 300 students on an undergraduate biology module. All learners were required to take the assessment once and were allowed to re-take it as many times as they wanted. Most took the test several times. The assessment didn’t give question-level feedback, but it gave topic feedback on the 11 main topic areas covered by the module.

The intention was for students to use the topic feedback as part of their revision and study to diagnose weaknesses in their learning: the comments provided might direct students in their learning. The students were encouraged to incorporate this feedback into their study planners and to take the test repeatedly, the expectation being that students who engage with their feedback and are “mindful” of their learning will benefit most.

Here is an example end-of-test feedback screen.

Assessment Feedback screen showing topic feedback

As you can see, learners received a “Distinction”, “Merit”, “Pass” or “Fail” for each topic. They were also given a topic score and some guidance on how to improve. The authors then correlated time spent on the tests, questions answered and the distribution of test-taking over time with each student’s score on the end-of-module summative exam. They found a correlation between taking the test and doing well on the exam. For example, the correlation coefficient between the number of attempts on the formative assessment and the score on the summative exam was 0.29 (Spearman rank-order correlation, p < 0.01).
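For readers less familiar with the statistic, the Spearman rank-order correlation for n students (assuming no tied ranks) is

    r_s = 1 - \frac{6 \sum_i d_i^2}{n(n^2 - 1)}

where d_i is the difference between a student’s rank on one variable (here, the number of formative attempts) and their rank on the other (the summative exam score). A value of 0.29 at p < 0.01 indicates a modest but statistically significant positive relationship.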

You can see some of their results below, with learners divided into a top, middle and bottom scoring group on the summative exam. This shows that the top scoring group answered more questions, spent more time on the test, and spread the effort over a longer period of time.

Clustered bar charts showing differences between top middle and bottom scoring groups on the dependent variables time, attempts, and distribution

The researchers also surveyed the learners, 82% of whom agreed or strongly agreed that “I found the comments and feedback useful”. Many students also drew attention to the fact that the assessment and feedback let them focus their revision time on the topics that needed the most attention. For example, one student said:

“It showed clearly areas for further improvement and where more work was needed.”

There could be other reasons why learners who spent time on the formative assessments did well on the summative exam: they might, for instance, have been more diligent in other ways as well. So this research offers evidence of correlation, not proof of cause and effect. However, it does point to topic feedback being useful and valuable in improving learning, by telling learners which areas they are weak in and need to work on. This seems likely to apply to the world of work as well as to higher education.

Delivering a million+ assessments takes a village: A SlideShare Presentation

Posted by Julie Delazyn

What does it take to deliver thousands of different assessments to thousands of students each year?

Rio Salado College, one of the largest online colleges in the United States with 67,000 students, knows the answer: collaboration.

The people who run the college’s Questionmark assessments wear many hats. They are instructional designers, authors and programmers, as well as networking and IT services staff. Teamwork between people in these varying roles is essential. And since the college delivers more than one million assessments each year, external collaboration – with Questionmark staff – is essential, too.

A team from Rio Salado explained their cooperative approach during this year’s Questionmark Users Conference, and we’re happy to share the handouts from their presentation with you: It Takes a Village – Collaborating for Success with High-Volume Assessments.

This presentation includes an overview of how the college uses surveys, quizzes and tests within its extensive online learning programs. It also highlights some of the lessons gleaned from Rio Salado’s many years of involvement with Questionmark.

This is just one example of what people learn about at our Users Conferences. Registration is already open for the 2014 Users Conference March 4 – 7 in San Antonio, Texas. Plan to be there!
