UK Webinar: Using Assessments to Mitigate Risk and Ensure Regulatory Compliance

Posted by Chloe Mendonca

Recent news stories about regulatory compliance in the financial services industry highlight the need for effective training and assessment in this sector.

In response, we’re offering a webinar that highlights the role of assessments in mitigating risk and supporting compliance.

Join us at 11 a.m. British Summer Time (London, UTC+1) on Tuesday, 3rd July for Using Assessments to Mitigate Risk and Ensure Regulatory Compliance.

In addition to discussing the vital role of assessments in mitigating risk, this webinar will share how Questionmark assessment technologies can be used to support and promote compliance.

We will cover these and other topics:

  • Demonstrating your organisation’s commitment to complying with the law, making prosecution less likely if an individual missteps
  • How assessments can warn of gaps in knowledge or understanding before they impact the business
  • Providing effective documentation that training has taken place and that employees understand what they are required to do in line with regulations
  • Saving time by allowing knowledgeable employees to “test out” of training on topics in which they are already expert

Click here to register

Podcast: Jarrod Morgan of ProctorU explains live monitoring of online tests

Posted by Joan Phaup

Andrew Jackson University, a distance education school based in Birmingham, Alabama, faced a big question several years ago: How could students take tests the same way they pursued their studies — off campus, via the internet — with the same level of security they would encounter in a test center?

Jarrod Morgan, who was then the school’s director of technology, recently told me the answer:

Jarrod Morgan

 We decided to come up with our own process based on the face-to-face monitoring experience that’s been used by colleges and universities and test centers for decades and decades.
 
What that means is that you have to see the person, you have to see what they’re doing, and you have to know who they are. So if you can do all three of those, see the person, see what they’re doing, and know who they are, you’ve essentially monitored that exam the same way you would do it in a test center.
 

Great idea! But how did they implement it? By using webcams, screen-sharing technology and layered authentication methods, with a live proctor observing each test taker.

The success and popularity of this solution led to the formation of ProctorU, a live, online proctoring/monitoring service that allows people to complete assessments from wherever they are while ensuring exam integrity. Questionmark partners with ProctorU to bring added security for higher-stakes exams taken outside testing centers.

In this podcast interview with Jarrod, now the company’s vice president of business development, you will hear more about how online monitoring works and why academic institutions, businesses and other organizations are choosing this option. We also talked about various security challenges, including identity fraud, and how to combat them. We touched on how test takers have responded to this idea, too. Feel free to listen in!

 

Making sure assessment video and audio are accessible to participants

Posted by Noel Thethy

One of the benefits of an assessment management system like Questionmark’s is its ability to include rich and interactive media within questions. But how do you handle this content to ensure it is accessible to participants with accessibility needs?

Video and audio content

When including this type of content in questions, you should provide alternative means for consuming the information. It is important to provide equivalents for users who cannot see or hear.

Captions should be added for video and audio, covering all spoken content and, for video, any important non-spoken information. You can include a transcript in several different ways, including:

  • Adding the transcript in the question’s stimulus
  • Adding a link to the transcript from the question
  • Embedding closed captions in the video or audio
  • Using the scenario/case question format to display the transcript in parallel with the question multimedia (as shown below)
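
One way to realise the closed-caption option is the WebVTT format, which most modern video players support. As an illustrative sketch (the cue timings and text below are invented, and this is not a built-in Questionmark feature), a short script can assemble a caption file from timed transcript lines:

```python
# Sketch: build a minimal WebVTT caption file from timed transcript cues.
# The cue data below is illustrative, not taken from a real assessment.

def to_vtt(cues):
    """cues: list of (start, end, text) tuples, times as 'HH:MM:SS.mmm' strings."""
    lines = ["WEBVTT", ""]
    for start, end, text in cues:
        lines.append(f"{start} --> {end}")  # WebVTT cue timing line
        lines.append(text)
        lines.append("")                    # blank line separates cues
    return "\n".join(lines)

captions = [
    ("00:00:01.000", "00:00:04.000", "Welcome to this assessment video."),
    ("00:00:04.500", "00:00:08.000", "Listen carefully to the scenario that follows."),
]

print(to_vtt(captions))
```

The resulting file can then be attached to the video as a caption track, giving participants who cannot hear the audio an equivalent text version.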

Rich media content

If you are using any rich media content like Flash or Captivate, be sure to follow the Adobe Accessibility guidelines. This will ensure that the content (which can be interactive) has been created with the necessary attention to the available accessibility features and designs.

UK Webinar: Learn the basics of mobile assessment delivery

Posted by Chloe Mendonca

The growing use of mobile devices is rapidly changing the way we carry out daily tasks.

At the end of 2011 there were 6 billion mobile phone subscriptions, equivalent to 87% of the world population. Still more remarkable is the rapid growth in smartphone use: 491 million smartphones were sold in 2011, a 61% rise on the previous year. These numbers continue to grow, and it’s predicted that 686 million smartphones will be sold in 2012. Tablets are hugely popular, too.

But how does this affect the learning industry? It means the ability to reach millions of people anywhere and at any time. Mobile delivery enables new possibilities in the way we test — and creating assessments, quizzes and surveys for mobile delivery is the key to reaching people on the go.

We will host a complimentary one-hour web seminar, ‘Creating Assessments for Mobile Delivery’, here in the UK on Wednesday 27th June 2012 at 11:00 a.m. (GMT/London time). Please join us to learn about this delivery option and to get a basic understanding of using Questionmark technologies for mobile assessments.

Register now for this Webinar!

Topic Hierarchies in Questionmark Live!

Posted By Doug Peterson

Questionmark Live, Questionmark’s web-based item and assessment authoring tool, includes hierarchical topics for organizing questions.

Hierarchical topics allow you to author your questions in a tree structure as shown in the screen capture, starting with a broad topic at the highest level and narrowing down to a specific piece of knowledge at the lowest level.

In the example shown, the highest level of organization is school curricula. Math is then further broken down into more specific topics such as Precalculus, which is in turn narrowed further into Algebra and Trigonometry.

The Trigonometry topic is divided into even more detailed sub-topics. I can now create questions related to calculating a cosine value in the Cosines topic, while questions relating to Euler’s Formula would be stored in the Euler’s Formula topic.

At this point it is very easy to assemble an assessment for a specific purpose. If I want to give a quiz on cosines, I can pull questions from the Cosines topic. If I’m creating an end-of-course exam, I would pull a few questions from each sub-topic under Trigonometry.
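
As an illustrative sketch of how such a hierarchy behaves (the data structure and topic names are invented for the example, following the Math scenario above, and are not Questionmark’s internal model), a quiz pulls from one leaf topic while an exam gathers questions from a whole subtree:

```python
# Sketch of a hierarchical topic tree: each topic holds its own questions
# plus named sub-topics. Names follow the Math example and are illustrative.

topics = {
    "Math": {
        "questions": [],
        "subtopics": {
            "Precalculus": {
                "questions": [],
                "subtopics": {
                    "Trigonometry": {
                        "questions": [],
                        "subtopics": {
                            "Cosines": {
                                "questions": ["Q1: What is the cosine of 60 degrees?"],
                                "subtopics": {},
                            },
                            "Euler's Formula": {
                                "questions": ["Q2: State Euler's formula."],
                                "subtopics": {},
                            },
                        },
                    },
                },
            },
        },
    },
}

def collect_questions(topic):
    """Gather questions from a topic and every sub-topic beneath it."""
    found = list(topic["questions"])
    for sub in topic["subtopics"].values():
        found += collect_questions(sub)
    return found

# A cosines quiz pulls from one leaf; a course exam pulls from the whole subtree.
trig = topics["Math"]["subtopics"]["Precalculus"]["subtopics"]["Trigonometry"]
cosine_quiz = trig["subtopics"]["Cosines"]["questions"]
course_exam = collect_questions(trig)
```

The recursive walk is what makes the end-of-course exam easy: pointing it at Trigonometry automatically sweeps up every question filed under any of its sub-topics.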

You can share any topic at any level in the hierarchy with other Questionmark Live users so that your whole team can work collaboratively, and the hierarchical structure is preserved as you move data between Questionmark Live and Authoring Manager (Questionmark’s Windows-based authoring tool).

If you’d like to become more familiar with Questionmark Live, check out this webinar on June 27, 2012 at 12:00 p.m. (EDT).

What is ipsative assessment and why would I use it?

Posted by John Kleeman

As I’m writing this, I’ve just got back from the gym, where I beat my personal best distance on an exercise bike. What’s this got to do with computerized assessment, you might ask? Hear me out.

You’re probably familiar with norm-referenced testing and criterion-referenced testing:

  • A norm-referenced test compares a test-taker against his or her peers. For example, you might compare my results with those of my Questionmark colleagues. (If you did, then seeing how energetic many are in the gym, I suspect my performance would not compare well!)
  • A criterion-referenced test measures a test-taker against external criteria. For example, it might be that people of a certain age should be expected to reach a certain distance in a certain time on an exercise bike.

A third type is sometimes called ipsative assessment.

  • An ipsative assessment in an education/learning context compares a test-taker’s results against his or her previous results. This is how I measure myself at the gym – I am pleased that I am doing better than I have before. I’m not worried if this meets some external criteria or if I’m better or worse than other people.

It’s very common to use criterion-referenced tests as computerized assessments because they help us measure competence. If you want to be sure that your employees know the rules, if you want to validate a pilot to fly a plane, or if you want to check that someone has understood training, a criterion-referenced test is usually the way to go.

But an advantage of ipsative assessment is that it measures progress and development – a test-taker can see whether he or she is improving and taking advantage of feedback from previous assessments. Using ipsative assessment can help all test-takers improve: A weaker performer will be encouraged by seeing performance improvements over earlier attempts, and a stronger performer can be challenged to do better. This can deal with the risks of the weaker performer becoming demotivated from a poor test result and the strong performer complacent from a good one. Ipsative assessment can be used for objective measures (e.g. did I get a better score?) and also for more subjective measures (e.g. am I more confident about something?).
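
As an illustrative sketch (the scores are invented), ipsative feedback needs nothing more than a participant’s own attempt history, compared attempt by attempt against his or her previous best:

```python
# Sketch: ipsative feedback compares each attempt with the same person's
# earlier attempts, not with peers (norm) or a pass mark (criterion).

def ipsative_feedback(scores):
    """scores: one participant's attempts in chronological order."""
    messages = []
    for i, score in enumerate(scores):
        if i == 0:
            messages.append(f"Attempt 1: baseline score {score}.")
        else:
            best_before = max(scores[:i])   # personal best over earlier attempts
            delta = score - best_before
            if delta > 0:
                messages.append(f"Attempt {i + 1}: new personal best (+{delta}).")
            else:
                messages.append(f"Attempt {i + 1}: {-delta} below your best so far.")
    return messages

for line in ipsative_feedback([62, 70, 68, 75]):
    print(line)
```

A dip (attempt 3 in the example) is reported relative to the participant’s own best, which is exactly the framing that keeps a weaker performer focused on progress rather than on rank.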

Questionmark software makes it easy to produce coaching reports on each attempt at an assessment, and these can easily be used to allow test-takers to compare results from previous attempts and see how they’ve improved. This is particularly useful for observational assessments, which measure skill and performance – areas where everyone wants to improve and there can never be a perfect score.

To learn more on ipsative assessment in education and learning, one resource is this study by Dr Gwyneth Hughes of the Institute of Education. (As a heads-up, the term ipsative measure is also used in a different, technical way in psychological testing as a within-person measure.)

Expertise is built up by deliberate practice, and being tested can help identify where that practice is needed. I think it’s helpful for all of us to remember that progress and improvement are useful things to measure, as well as achievement and competency.
