Effective assessment feedback needs motive, opportunity and means

Posted by John Kleeman

Assessment feedback, whether it relates to a question, a topic or an entire assessment, has tremendous value – but only if learners make practical use of it! I’d like to share some solid approaches to feedback and talk about how a couple of Questionmark reports can help you put them into practice.

From Andrew Morgan (quoted in Valerie Shute’s excellent ETS research report on feedback), we get the concept that to be effective and useful, feedback needs the following:

  • Motive – the learner wants to follow the feedback
  • Opportunity – the learner has it in time to use it: it’s not given too late for action
  • Means – the learner is able to use it: the feedback is actionable

Another good way to think about feedback comes from Dr Steve Draper of the University of Glasgow School of Psychology in his presentation at eAssessment Scotland in 2012:

“There is no point in giving feedback to a learner unless the learner acts on it: does something concrete and differently because of it”.

Feedback needs to be processed to be useful

Feedback that the learner doesn’t read isn’t valuable.

Feedback that the learner reads but doesn’t process isn’t valuable.

You must get the learner to evaluate the feedback and adjust their thinking as a result of the feedback, i.e. process the feedback and do something with it.

I’ve been wondering about how you can apply these concepts using the Questionmark coaching report when presenting an assessment score as feedback.

Most learners are motivated to use the score they achieve in a test as feedback; they want to get a high score next time. And if they can take a test again, they have the opportunity to use the feedback. But a score on its own is just a number. How can you help learners use their scores as catalysts for action?

Clearly, a score is more valuable if it can be compared to something, and there are three obvious comparisons:

  • Ipsative, comparing score to previous attempts: have you done better than last time?
  • Criterion referenced, comparing score to a benchmark: have you reached the desired standard?
  • Normative, comparing score to how others do: how do you compare to your peers?
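The three comparisons above can be sketched as simple functions. This is a hypothetical illustration of the underlying logic only; the function names and numbers are mine, not part of any Questionmark API:

```python
# Three ways to give a raw score context (illustrative sketch).

def ipsative(score, previous_scores):
    """Compare against the learner's own best prior attempt."""
    return score - max(previous_scores) if previous_scores else None

def criterion(score, benchmark):
    """Compare against a fixed standard: has the benchmark been reached?"""
    return score >= benchmark

def normative(score, peer_scores):
    """Compare against the average of a peer group."""
    return score - sum(peer_scores) / len(peer_scores)

print(ipsative(72, [60, 65]))        # 7 points better than the best prior attempt
print(criterion(72, 70))             # True: the benchmark has been reached
print(normative(72, [68, 70, 75]))   # 1.0 point above the peer average
```

The point of each function is the same: it turns a bare number into something the learner can act on.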

Questionmark’s Transcript report lets learners view all their results and see how they improve or change over time. And Questionmark’s Coaching report includes the concept of benchmarks – you can set a benchmark for the assessment and for each topic. What you may not know is that the Coaching report allows you to compare a score against the average of a group of other test-takers. You can define the group by a variety of demographics and then display how the participant’s score compares against the group’s scores. This screenshot shows how to set this up:

Setting a comparison in the coaching report

Giving learners information about how they compare to others can be a powerful motivator; I encourage you to explore this capability of the Questionmark Coaching report.

For more on general concepts of processing feedback, see Steve Draper’s interesting page here. Questionmark users can see more information on the Coaching report comparison options on the Questionmark support site. And if you want to hear more from me about assessment feedback, I’ll be speaking about it at the Questionmark user conference in March.

ATP Highlights: Security and Future Innovations

Posted by Jim Farrell

One of the early highlights of every year for a lot of us here at Questionmark is the Association of Test Publishers (ATP) Conference – Innovations in Testing.

Established in 1992, ATP is a non-profit organization representing providers of tests and assessment tools and/or services related to assessment, selection, screening, certification, licensing, educational or clinical uses. The conference offers its members the opportunity for networking, workshops and sessions led by industry leaders. This year’s conference was held in Ft. Lauderdale, Florida (not a bad place to be in early February) and had record attendance.

For me, there were two themes that stood out: Security and Looking to the Future.

Security is always paramount at Questionmark. We often describe the issues in higher-stakes tests with the Fraud Triangle (Rationalization, Opportunity and Motivation) and the actual threats (Impersonation, Content Theft and Cheating). As shown in the picture to the left, our CEO, Eric Shepherd, led a panel on remote monitoring with some of our good friends, including Don Kassner of ProctorU, Doug Winneg of Software Secure and Ruben Garcia of Innovative Exams. Each company provides a different level of security, including recorded video of the test taker (Software Secure), live proctoring (ProctorU) and a secure kiosk with live proctoring (Innovative Exams). At Questionmark we see it as a sliding scale.

As testing moves from brick-and-mortar test centers to community centers and libraries, remote proctoring is becoming a real solution, and Questionmark is excited to be part of it.

The other interesting trend at this conference was “the future”. The keynote speaker was Jack Uldrich, a futurist. What does a futurist do, you ask? Here is a video that shows some of Jack Uldrich’s books and ideas. He says that “what we don’t know yet is just as important as what we know today. In this unknown knowledge is extraordinary opportunity.”

I really love this quote because it is a direct challenge to be innovative and unlearn things that were once true but are not true for the future. We have to realize that we are going to do some of the same things we have always done, but we are going to do them differently. This is either inherently scary or extremely exciting. I believe it is the latter, and that it’s up to us to listen to the trends and always have our eye on what is not yet possible.

I am looking forward to having conversations about the future of testing with our customers at the 2013 Questionmark Users Conference in Baltimore, March 3-6. Click here to register.

Deploying compliance-related assessments: good practice recommendations

Posted by Julie Delazyn

Last week I wrote about planning compliance-related assessments, the first post in a five-part series offering good practice recommendations as described in our white paper, The Role of Assessments in Mitigating Risk for Financial Services Organizations.

This paper offers a great deal of information about these kinds of assessments and advice about best practices for implementing a legally defensible assessment program. It describes five stages of deploying assessments — from planning to analytics — and offers recommendations for good practice for chief compliance officers, authoring experts, subject matter experts, trainers and IT specialists responsible for compliance in the organization:


Compliance five steps

Some of these recommendations are specific to Questionmark technologies, but most can be applied to any testing and assessment system.

Today, let’s look at good practice for the second of the five stages: deployment:

Deployment chart

Click here to read the paper, which you can download free after login or sign-up.

Secure exams outside the testing center

Posted by Joan Phaup

Increasing numbers of students studying online in recent years – many of them raising families and holding down jobs – have embraced the idea of doing all their coursework at the kitchen table, so to speak. But until recently, when it came time for a test, these students had to travel to a testing center. Many of them raised the question: “If I can study at the kitchen table, why can’t I take an exam there, too?”

Today, taking tests from home or the office — using online monitors or proctors — is an option for certification candidates as well as students, and there are various means of providing secure testing at a distance.

Delegates to the Questionmark Users Conference in Baltimore March 3 – 6 will have the opportunity to learn more about online proctoring/invigilation during a presentation on Secure Testing in Remote Environments.

Don Kassner, president of ProctorU, will co-present this session with Maureen Woodruff, who directs the Office of Test Administration at Thomas Edison State College. I spent a few minutes with Don the other day and asked for some details.

Can you explain what makes it possible to offer secure remote proctoring or monitoring?

Don Kassner

There are three key elements to this: the environment, the computer and the test taker. The first thing we need to do is to make sure each test taker has reliable internet access and is in a fairly controlled environment. This is not about testing anywhere. It’s about testing in an environment that’s predictable. The person gets to choose the place, but it has to be a certain kind of place. And the test taker must “show” us their environment using a webcam. Second, we have to secure their computer. Test takers use their own equipment, but we need to make sure they are not switching tasks, accessing the Internet for answers and so forth. Last, we have to secure the test takers themselves, by using a layered authentication approach to make sure they are who they say they are, and by having our online proctors observe them as they complete their tests.

What are the biggest security challenges in delivering tests to people outside of test centers?

In a test center, you already know that the environment and the computers are secure; you can focus on the identity and behavior of the test taker.  When proctoring at a distance, you have to give a lot of importance to all three of the elements I mention; there are a lot more decisions to be made about the testing process.

With online proctoring, we have to be willing to stop a testing session and say that something doesn’t meet our standards – that the test taker is not meeting the requirements and must reschedule.

We also need to be able to replicate our processes across the board and make sure the testing experience is consistent no matter who is taking the test or where they are taking it. We have to focus on making sure the experience is identical for every test taker.

How will you address those challenges during your session?

We will introduce the basics of online proctoring and give examples of how different institutions and organizations have used it. We’ll also drill down into the details of what it takes to secure the environment, the computer and the test taker.


Maureen Woodruff

Maureen will share a case study about what they did at Thomas Edison State College and the important factors they had to take into account when they set up remote testing for their students. And I’ll differentiate between the factors that are important for academic institutions and those that matter the most for certification tests. Students are likely to take a number of tests and end up with a long track record. Certification candidates tend not to be repeat test takers, so that means using slightly different procedures.

What kinds of tests are best suited for online monitoring or proctoring?

If you are going to use this kind of proctoring, you really need to think about the nature and structure of the test. You are trying to minimize the risk inherent in someone taking a test, so you need to ask yourself what issues you are concerned about relative to that. Tests with large question banks are best, because they help mitigate the risks of people stealing questions or colluding. Standard fixed-form tests increase the risk factor and may not be appropriate.

What would you like your audience to take away?

A real understanding of how effective this approach can be in some situations, and an understanding of when it may or may not be appropriate – so they can think about their own programs and consider where this will fit.

Click here for more information about the conference program — and register soon!

Questionmark Live’s LaTeX Editor

Posted By Doug Peterson

You might recall that Questionmark Live has a built-in math formula editor (if not, check it out here). Did you know that Questionmark Live also includes a LaTeX formula editor?

LaTeX (yes, it is properly spelled with a mixture of upper and lower-case letters, and is pronounced lay-tek) is a “document preparation system for high-quality typesetting. It is most often used for medium-to-large technical or scientific documents…” (retrieved 29 January 2013). You can think of it as a kind of markup language, like HTML on steroids. As it is used mainly for technical or scientific documents, it is very powerful when it comes to math and science formulas.

LaTeX is especially popular in the world of higher education, which includes a number of Questionmark customers. To meet the needs of those who prefer to create their math formulas in LaTeX, Questionmark Live offers a LaTeX editor. In any text area such as the question stimulus or a choice, simply place your cursor where you want the formula to appear and click the LaTeX Editor button in the text editing toolbar.

Enter your LaTeX code in the Questionmark LaTeX Editor dialog and click the OK button.


Your LaTeX formula is converted into a graphic and inserted in the text area!

To make changes to your formula, simply double-click the graphic. The Questionmark LaTeX Editor will open and display your original LaTeX formula for editing.

You can read about LaTeX at Wikipedia or at the LaTeX Project site. Please keep in mind that the Questionmark LaTeX editor only recognizes the subset of LaTeX markup pertaining to math formulas.
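As an illustration of the kind of math-mode markup the editor accepts, here is the quadratic formula in LaTeX (the choice of formula is just an example):

```latex
% Quadratic formula, as you would type it into the LaTeX Editor dialog
x = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a}
```

When you click OK, markup like this is rendered and inserted as a graphic, as described above.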

You can learn more about this and other developments at the Questionmark Users Conference in Baltimore March 3 -6. Click here to register.

Planning compliance-related assessments: good practice recommendations

Posted by Julie Delazyn

Last week I wrote about the business benefits of assessments that mitigate risk and help ensure compliance, as described in our white paper, The Role of Assessments in Mitigating Risk for Financial Services Organizations. This paper offers a great deal of information about these kinds of assessments and advice about best practices for implementing a legally defensible assessment program.

The paper describes five stages of deploying assessments — from planning to analytics — and offers recommendations for good practice for people in these six job roles:

  • Chief Compliance Officer, responsible for compliance in the organization
  • Compliance Officer, who runs the assessment part of the compliance program
  • Authoring Expert, the authoring team lead
  • SME, the subject matter expert who authors and reviews questions
  • Trainer, who trains in compliance
  • IT Specialist, responsible for IT setup

Compliance five steps

Some of these recommendations are specific to Questionmark technologies, but most can be applied to any testing and assessment system.

Today, let’s look at good practice for the first of the five stages: planning:

Planning - Compliance

Click here to read the paper, which you can download free after login or sign-up.
