Join a UK briefing on academic assessment

Posted by Chloe Mendonca

There are just a few weeks to go until the UK Academic Briefings at the University of Bradford and University of Southampton.

These complimentary morning briefings bring people and ideas together.

For those working in academic-related roles, here is an opportunity to get up to date with recent and future developments in online testing.

The agenda will cover ways to:

  • enhance student learning through e-assessment
  • overcome key obstacles surrounding e-assessment in higher education
  • mitigate the threat of cheating and fraud within online exams

If you’re new to online assessment or are thinking of implementing it within your institution, attend a briefing to see Questionmark technologies in action and speak with assessment experts about potential applications for online surveys, quizzes, tests and exams.

Register for the date and location that suits you best:

Six trends to watch in online assessment in 2014

Posted by John Kleeman

As we gear up for 2014, here are six trends I suggest could be important in the coming year.

1. Privacy. The revelations in 2013 that government agencies intercept so much electronic data will reverberate in 2014. Expect a lot more questions from stakeholders about where their results are stored and how integrity, data protection and privacy are assured, including the location and ownership of suppliers and data centres. I suspect some organizations will look to build trust with stakeholders by adopting ISO 10667, the ISO standard on assessments in the workplace.

2. Anticipation of problems. Many organizations already use assessments to look forward, not just backwards. In regulatory compliance, smart organizations don’t just use assessments to check competence; they analyze assessment results to identify trends that point to potential issues or weaknesses, and prompt corrective measures before it is too late. Universities and colleges increasingly use assessments to predict problems and help prevent students from dropping out (see for instance Use a survey with feedback to aid student retention). It’s exciting that assessments can be used to surface potential issues in this way and address them before they materialize. Don’t just treat assessments as a rear-view mirror: use them to look forward.

3. Software as a service (SaaS). For all but the very largest organizations, running online assessments via software as a service is much more cost-effective than running an on-premise system. Delegating to a service provider like Questionmark makes the hassle of upgrading, applying security patches and managing deployment go away. Increasingly, delivering assessments via a SaaS model will become the default.

4. Smaller and more connected world. The Internet is bringing us all together. The world is becoming connected, and in some sense smaller. We can no longer think of another continent or country as being a world away, because we can all connect together so easily. This means it is increasingly important to make your assessments translatable, multi-lingual and cross-cultural. Most medium and large organizations work across much of the world, and assessments need to reflect that.

5. Environment. I wonder if 2014 could be the year when the environmental benefits of online assessments start to be seriously recognized. Clearly, using computers rather than paper to deliver assessments saves trees, but a bigger benefit is the reduction in carbon emissions from less travel. For service organizations, business travel is a large proportion of carbon emissions (see for example here), and delivering training and assessments online can make a useful difference. With many countries requiring listed companies to report their carbon emissions, this could be important.

6. Security. Last but definitely not least, assessment security will continue to matter. As awareness of the risks grows, everyone will expect high levels of technical and organizational security in their assessment delivery. If you are a provider, expect a lot more questions on security from informed users; and if you are a customer or user, check that your supplier and your internal team are genuinely up to date on security.

Read this list and look at the starting letters, and you get P – A – S – S – E – S! I wish you a happy new year and hope that each of your test-takers passes their assessments in 2014 when it is appropriate that they do so.

Mark it up: Highlight text and strike through distractors in your online assessments

Posted by Brian McNamara

Remember what it was like to take a test or exam on paper? One of the benefits was that participants typically had the option of making marks on a test paper or test booklet to help them focus on certain key passages or terms – or perhaps to cross out certain choices as they worked on finding a correct answer.

As a test author or administrator, you might want to – or indeed be required to – provide this kind of flexibility during an exam.

Fortunately, you don’t have to sacrifice this flexibility when delivering your assessments online. New functionality being introduced to Questionmark OnDemand users enables them to configure their assessments to allow participants to:

  1. Highlight terms or passages within a question stimulus.
  2. Make strike-through marks to visually eliminate distractors.

Highlight and strike-through is a great way to empower participants to create visual cues within items to help them focus on key content and eliminate distractors as they work to answer a given question.
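
For readers wondering how marks like these can work in a browser, here is a minimal, generic TypeScript sketch of the interaction. The `.choice` class and event wiring are hypothetical, and this is an illustration only, not Questionmark's actual implementation.

```typescript
// Generic browser-side sketch of highlight and strike-through marks.
// Assumes answer choices are rendered as elements with a hypothetical
// "choice" class and that the stimulus text is plain, selectable HTML.

// Wrap the participant's current text selection in a <mark> element.
function highlightSelection(): void {
  const selection = window.getSelection();
  if (!selection || selection.rangeCount === 0 || selection.isCollapsed) {
    return; // nothing selected
  }
  const range = selection.getRangeAt(0);
  try {
    range.surroundContents(document.createElement("mark"));
  } catch {
    // surroundContents throws if the selection crosses element boundaries;
    // a production implementation would split the range per element.
  }
  selection.removeAllRanges();
}

// Toggle a strike-through mark on an answer choice the participant
// wants to visually eliminate.
function toggleStrikeThrough(choice: HTMLElement): void {
  choice.style.textDecoration =
    choice.style.textDecoration === "line-through" ? "" : "line-through";
}

// Example wiring: double-clicking a choice strikes it through.
document.querySelectorAll<HTMLElement>(".choice").forEach((choice) => {
  choice.addEventListener("dblclick", () => toggleStrikeThrough(choice));
});
```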

Want to see it in action? Check out the video below!

[youtube http://www.youtube.com/watch?v=hQKLuxCrggk?rel=0]

We have plenty of resources available to you. “How-to” videos and brief presentations about best practices in our Learning Cafe will give you valuable pointers about authoring, delivery and integration. We also share presentations and videos on our SlideShare page.

Conference Close-up: Perfecting the Test Through Question Analysis

Posted by Joan Phaup

Neelov Kar, Project Management Program Owner for Dell Services (previously Perot Systems), is getting ready to attend the Questionmark Users Conference in Miami this month. He will be delivering a case study about how he and his team have used statistical analysis to improve their test questions. I spent some time talking with Neelov the other day and wanted to share what I learned from him.

Q: Tell me a little about your company.

A:  We are a one-stop shop for IT Services and have people working all over the world, in 183 countries.

Q: What does your work entail?

A: I’m the project management program owner, so I am in charge of all the project management courses we offer. I help identify which courses are appropriate for people to take, based on training needs analysis, and I work with our project management steering committee to determine what courses we need to develop. Then we prioritize the requirements, design and develop the courses, pilot them and finally implement them as regular courses. As a Learning and Development department we also look after leadership courses and go through a similar process for those. I moved into this role about a year ago. Prior to that I was leading the evaluation team, and it was during my time on that team that we began using Questionmark.

Q: How do you use online assessments?

A: We use Questionmark Perception for Level 2 assessment of our project management and leadership courses. We started with a hosted version of Questionmark Perception and it was I who actually internalized the tool. We offer leadership courses and project management courses internally within the organization across all geographies. Some of the project management courses already had tests, so we converted those to Questionmark.  We started designing the end-of-course assessments for our newly introduced leadership and project management courses once we started using Perception.

Q: What will you be talking about during your conference presentation?

A: Last year we introduced a new course named P3MM Fundamentals, and because it was new we had to pilot it with some of our senior members. In the pilot we asked the students to take the end-of-course test, and we found that many people had trouble passing it. So we analyzed the results and refined the questions based on the responses. Analysis of results within Perception — particularly the Assessment Overview Report, Question Statistics Report, Test Analysis Report and Item Analysis Report — helped us identify the weak questions. We also saw that there were things we could do to improve the instruction within the course in order to better prepare people for the test. Using the Questionmark reports, we really perfected the test. This course has been running for over a year now, and it’s pretty stable. Now, every time we launch a course we do a pilot, administer the test and then use the Questionmark tools to analyze the questions to find out if we are doing justice to the people who are taking the test.
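
For readers curious about the statistics behind this kind of item review, here is a minimal TypeScript sketch of two classical measures such reports typically provide: item difficulty and point-biserial discrimination. The data shapes and thresholds are hypothetical; it illustrates the general technique only, not how the Questionmark reports are implemented.

```typescript
// Classical item analysis: difficulty (proportion correct) and
// point-biserial discrimination, computed from per-participant results.

interface ItemResponses {
  itemId: string;
  correct: boolean[]; // one entry per participant: answered correctly?
}

// Item difficulty: the proportion of participants who answered correctly.
function difficulty(item: ItemResponses): number {
  const n = item.correct.length;
  return n === 0 ? 0 : item.correct.filter(Boolean).length / n;
}

// Point-biserial correlation between answering this item correctly and the
// participant's total score; low or negative values flag suspect items.
function pointBiserial(item: ItemResponses, totalScores: number[]): number {
  const n = totalScores.length;
  if (n === 0) return 0;
  const mean = totalScores.reduce((a, b) => a + b, 0) / n;
  const sd = Math.sqrt(totalScores.reduce((a, b) => a + (b - mean) ** 2, 0) / n);
  const correctScores = totalScores.filter((_, i) => item.correct[i]);
  const p = correctScores.length / n;
  if (sd === 0 || p === 0 || p === 1) return 0;
  const meanCorrect =
    correctScores.reduce((a, b) => a + b, 0) / correctScores.length;
  return ((meanCorrect - mean) / sd) * Math.sqrt(p / (1 - p));
}

// Flag items that are too easy, too hard, or that discriminate poorly
// (illustrative cut-offs only).
function isSuspectItem(item: ItemResponses, totalScores: number[]): boolean {
  const p = difficulty(item);
  return p < 0.2 || p > 0.9 || pointBiserial(item, totalScores) < 0.2;
}
```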

Q: What are you looking forward to at the conference?

A: I want to find out what Perception version 5 offers and how we can use it for our benefit.  Also, I saw that there are quite a few good papers to be presented, so I’m looking forward to attending those. And I want to get involved in the discussion about the future of SCORM.

Neelov’s is just one of 11 case studies to be presented at the conference, which will also include technical training, best practice presentations, peer discussions and more. Online registration for the conference ends on Tuesday, March 9th, so if you would like to attend, be sure to sign up soon!

Podcast: Assessments for Learning, Compliance and More

Posted by Joan Phaup

Online assessments are used in many different ways at Sanlam Personal Finance in South Africa. The company uses Questionmark Perception to test competencies and product knowledge. Assessments also play an important role in compliance. And in keeping with the company’s goal of engendering a high-performance culture, they are used on an ongoing basis to support learning.

Because Sanlam uses online assessments so extensively, my recent conversation with Sanlam Training Technology Consultant Mark Julius covered a wide array of topics. We spoke about how his organization uses graphics and animations to simulate on-the-job situations during assessments. We also talked about the special challenges of operating in countries with varying levels of internet connectivity and the ever-expanding importance of proving compliance with government regulations.

You can learn more by reading our case study about Sanlam and their use of Perception, which they use together with the SAP Human Resources and Learning Management System.

Podcast: Encouraging Faculty to Use Online Assessments

Posted by Joan Phaup

Fox Valley Technical College makes a special effort to help faculty — including those with little or no technological expertise — make the transition to online quizzes and tests. The college’s training team provides classroom, online and blended learning options for faculty. In today’s podcast, Fox Valley System Administrator Vicki Sahr gives practical tips for helping faculty embrace online testing. Vicki tells a favorite success story about an anatomy instructor who was reluctant to leave paper behind despite students’ eager requests for online tests. Listen to the podcast to find out what happened!

I learned something else from talking with Vicki: among the many Fox Valley departments that use online testing is the college’s Criminal Justice Division, which provides training and technical assistance for the National Center for Missing and Exploited Children and other law enforcement programs.