Online or test center proctoring: Which is more secure?

Posted by John Kleeman

As explained in my previous post Online or test center proctoring: Which is best?, a new way of proctoring certification exams is rapidly gaining traction. With online proctoring, candidates take exams at their office or home, with a proctor observing via video camera over the Internet.

The huge advantage of online proctoring is that the candidate doesn’t need to travel to a test center. This is fairer and saves a lot of time and cost. But how secure is online proctoring? You might at first sight think that test center proctoring is more secure – as it sounds easier to spot cheating in a controlled environment and face-to-face than online. But it’s not as simple as that.

The stakes for a candidate to pass an exam are often high, and there are many examples of proctors at test centers coaching candidates or otherwise breaching the integrity of the exam process. A proctor in a test center can witness the same test being taken over and over again, and can start to memorize, and potentially sell, the content they see. For example, according to a 2011 article in the Economist, one major test center company was at that time shutting down five test centers a week due to security concerns.

Test center vulnerabilities are not always as obvious as they are in the picture to the right (source here), but they are myriad. This recent photo shows parents in India climbing the walls of a building to help their children pass exams, with proctors bribed to help. According to Standard Digital:

“Supervisors stationed at notorious test centres vie for the postings, enticed by the prospect of bribes from parents eager to have their wards scrape through.”

Proxy test taking – where one person takes a test impersonating another – is also a big concern in the industry. A 2014 Computer World article quotes an expert saying:

“In some cases, proxies have been able to skirt security protocols by visiting corrupt testing facilities overseas that operate both a legitimate "front room" test area and a fraudulent "back room" operation.”

This doesn’t just happen in a few parts of the world: there are examples worldwide. For instance, there was a prominent case in the UK in 2014 where proctors were dishonest in a test used to check English knowledge for candidates seeking visas. According to a BBC report, in some tests the proctor read out the correct answers to candidates. And in another test, a candidate came to the test center and had their picture taken, but then a false sitter went on to take the test. An undercover investigator posing as a candidate was told:

“Someone else will sit the exam for you. But you will have to have your photo taken there to prove you were present.”

This wasn’t a small-scale affair – the UK government announced that at least 29,000 exam results were invalid due to this fraud.

Corrupt test centers have also been found in the US. In May 2015, a New York man was sentenced to jail for his part in a fraud in which five New York test centers allowed applicants for a commercial driving license to pay to pass the test. According to a newspaper report:

“The guards are accused of taking bribes to arrange for customers to leave the testing room with their exams, which they gave to a surrogate test-taker outside who looked up the answers on a laptop computer. The guards would allow the test-takers to enter and leave the testing rooms.”

There are many other examples of this kind of cheating at test centers – a good source of information is Caveon’s blog about cheating in the news. Caveon and Questionmark recently announced a partnership to enhance the security of high-stakes testing programs. The partnership will also give Questionmark’s customers easy access to consulting services to help them strengthen the security of their exams.

Of course, most test center proctors are honest and most test center exams are fair, but there are enough problems to raise concerns. Online proctoring has some security disadvantages, too:

  • Although improvements are being developed, it is harder for the proctor to check whether an ID is genuine when looking at it through a camera.
  • A remote camera in the candidate’s own environment is less capable of revealing some forms of cheating than the controlled environment of a test center.

But there are also genuine security advantages. It is much harder for an online proctor to get to know a candidate well enough to coach him or her, or to accept payment to help in other ways.

  • Because proctors can be assigned randomly and without any geographic connection to the candidate, it is much less likely that proctor and candidate can pre-arrange any misconduct.
  • All communication between proctor and candidate is electronic and can be logged, so the candidate cannot easily make an inappropriate approach during the exam.
  • While test center proctors have easy access to exam content, which can lead to various types of security breaches, online proctors can be prevented from viewing exam content through technologies such as secure browsers.
  • Because online proctoring involves less difficulty and cost than having the candidate travel to a physical test center, it’s practical to test more frequently – and this is a security benefit. With frequent testing, it may be simpler for a candidate to learn the material and pass the test honestly than to put a lot of effort into cheating. If you have several exams, you can also compare the pictures taken of the candidate at each exam to reduce the chance of impersonation.

In summary, the main reason for online proctoring is that it saves time and money compared with going to a bricks-and-mortar test center. The security advantages and disadvantages of test center versus online proctoring are open to debate, and dealing with security vulnerabilities requires constant vigilance. With new online proctoring technologies enhancing exam security, many certification programs are now transitioning away from test centers. Traditionally a test center was regarded as a secure place to administer exams, but in practice there have been so many incidents of proctor dishonesty over the years that online proctoring may well be justifiable on security grounds too.

Caveon Q&A: Enhanced security of high-stakes tests

Posted by Julie Delazyn

Questionmark and Caveon Test Security, an industry leader in protecting high-stakes test programs, have recently joined forces to provide clients of both organizations with additional resources for their test administration toolboxes.

Questionmark’s comprehensive platform offers many features that help ensure security and validity throughout the assessment process. This emphasis on security, along with Caveon’s services, which include analyzing data to identify validity risks as well as monitoring the internet for any leak that could affect intellectual property, adds a strong layer of protection for customers using Questionmark for high-stakes assessment management and delivery.

I sat down with Steve Addicott, Vice President of Caveon, to ask him a few questions about the new partnership, what Caveon does and what security means to him. Here is an excerpt from our conversation.

Who is Caveon? Tell me about your company.

At Caveon Test Security, we fundamentally believe in quality testing and trustworthy test results. That’s why Caveon offers test security and test item development services dedicated to helping prevent test fraud and better protect our clients’ items, tests, and reputations.

What does security mean to you, and why is it important?

High-stakes test programs make important education and career decisions about test takers based on test results. We also spend a tremendous amount of time creating, administering, scoring, and reporting results. With increased security pressures from pirates and cheats, we are here to make sure that those results are trustworthy, reflecting the true knowledge and skills of test takers.

Why a partnership with Questionmark and why now?

With a growing number of Questionmark clients engaging in high-stakes testing, Caveon’s experience in protecting the validity of test results is a natural extension of Questionmark’s security features. For Caveon, we welcome the chance to engage with a vendor like Questionmark to help protect exam results.

And how does this synergy help Questionmark customers who deliver high-stakes tests and exams?

As the stakes in testing continue to rise, so do the challenges involved in protecting your program. Both organizations are dedicated to providing clients with the most secure methods for protecting exam administrations, test development investments, exam result validity and, ultimately, their programs’ reputations.

For more information on Questionmark’s dedication to security, check out this video and download the white paper: Delivering Assessments Safely and Securely.

How tests help online learners stay on task

Posted by Joan Phaup

Online courses offer a flexible and increasingly popular way for people to learn. But what about the many distractions that can cause a student’s mind to wander off the subject at hand?

According to a team of Harvard University researchers, administering short tests to students watching video lectures can decrease mind-wandering, increase note-taking and improve retention.

“Interpolated memory tests reduce mind wandering and improve learning of online lectures,” a paper by Harvard Postdoctoral Fellow Karl K. Szpunar, Research Assistant Novall Y. Khan and Psychology Professor Daniel L. Schacter, was published this month in the Proceedings of the National Academy of Sciences (PNAS) in the U.S.

The team conducted two experiments in which they interspersed online lectures with memory tests and found that such tests can:

  • help students pay more sustained attention to lecture content
  • encourage task-relevant note-taking
  • improve learning
  • reduce students’ anxiety about final tests

“Here we provide evidence that points toward a solution for the difficulty that students frequently report in sustaining attention to online lectures over extended periods,” the researchers say.


In one of the experiments, a group of students watched a 21-minute lecture presented in four segments of about five minutes each. After each segment, students were asked to do some math problems. Some students were then tested on the material from the lecture, while others (the “not tested” group) did more math problems.

This research seems to indicate that including tests or quizzes could make online courses more successful. So yes! Use assessments to reinforce what people are learning in your own courses. Whatever type of information you are presenting online – whether it’s a lecture, an illustration or text – you can help students stay focused by embedding assessments right on the same page as your learning materials.

A previous post on this blog offers an example of how embedded quizzes are being used to engage learners. You can read more about the recently published research, including an interview with Szpunar and Schacter, in the Harvard Gazette. You can read the paper here.

Translatability in Questionmark Perception Version 5

Posted by John Kleeman

Making assessments and questions translatable was a key goal for Questionmark Perception version 5.

There has always been a need to translate assessments, particularly in countries like Switzerland and Canada, which have different languages within their borders. But the Internet has made the world much more connected, and so many organizations have employees and stakeholders in different countries or speaking different languages. Some typical translation needs that fed into version 5 were:

  • A bank that operates in many parts of the world and wants to deliver course evaluation surveys (level 1s) to training participants in many languages. It’s crucial for them to be able to schedule the assessment and allow participants to choose the language in which they take it.
  • A telephone company that trains employees in Europe and wants to give similar questions to all employees. They want to author the questions once and then translate them for use by training teams in each country.
  • A university that creates and delivers questions in English but is expanding internationally, partnering with universities overseas, and wants to translate some of its material into other languages.
  • A software company that delivers certifications to consultants and partners worldwide in around 20 languages and needs assessments and questions translated so that the certifications can be given fairly to anyone who speaks the languages they support.
  • A manufacturing support company that authors questions in English and then translates them into 8 European and Asian languages for delivery to their partners and employees worldwide.

Some customers use external translators and want Questionmark to export XML that translators can use in specialist tools, whilst other customers have in-house expertise and want screens within Questionmark Perception so they can make the translations themselves. For all customers, it’s not just the initial translation they need help with but also the management of the process. When questions change, and you have had them translated into a dozen languages, it’s a nightmare to keep track manually of what needs updating, so Questionmark software needs to flag when a question has changed and remind you to update the translation.

The basic concept of Questionmark’s translation management system is that you create a question in a base language and then translate it into as many target languages as you want. And then you do the same with an assessment.
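To make the idea concrete, here is a minimal sketch in Python – purely illustrative, not Questionmark’s actual data model – of how a base-language question and its translations might be represented, with a revision number that can later be used to spot out-of-date translations:

```python
from dataclasses import dataclass, field

@dataclass
class Translation:
    language: str   # target language code, e.g. "fr"
    text: str       # translated wording
    revision: int   # revision of the base question this translation was made from

@dataclass
class Question:
    question_id: str
    base_language: str       # e.g. "en"
    text: str                # wording in the base language
    revision: int = 1        # bumped whenever the base text changes
    translations: dict[str, Translation] = field(default_factory=dict)

    def add_translation(self, language: str, text: str) -> None:
        """Record a translation made against the current base revision."""
        self.translations[language] = Translation(language, text, self.revision)
```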


Questions and assessments are linked so that you can report across languages.

To do the translation within Perception, you type the translated text into a simple user interface, shown below.

[Screenshot: the translation screen in Questionmark Perception]

Translating within Questionmark Perception works well if you have an in-house translator, but if you are working with external translators, it’s usually best to export the text to XML, send it to them to translate, and import the result back into Perception. Text is exported in an industry-standard XML format called XLIFF, which the software packages commonly used by translators can process.
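XLIFF is plain XML, so the export side of that round trip is easy to picture. The sketch below hand-builds a minimal XLIFF 1.2 file for one question; the element names follow the XLIFF 1.2 standard, but the structure and field values are illustrative assumptions rather than the exact file Perception produces:

```python
import xml.etree.ElementTree as ET

def question_to_xliff(question_id: str, source_text: str,
                      source_lang: str = "en", target_lang: str = "fr") -> str:
    """Build a minimal XLIFF 1.2 document containing a single translation unit."""
    xliff = ET.Element("xliff", version="1.2")
    file_el = ET.SubElement(xliff, "file",
                            attrib={"original": "question-export",      # illustrative value
                                    "source-language": source_lang,
                                    "target-language": target_lang,
                                    "datatype": "plaintext"})
    body = ET.SubElement(file_el, "body")
    unit = ET.SubElement(body, "trans-unit", id=question_id)
    ET.SubElement(unit, "source").text = source_text
    ET.SubElement(unit, "target").text = ""   # the external translator fills this in
    return ET.tostring(xliff, encoding="unicode")

print(question_to_xliff("Q42", "Which of the following statements is correct?"))
```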

Whether you translate interactively or by exporting to XML, Questionmark keeps track of when questions change in the base language and prompts you to update the affected translations. So we help you manage not just the initial translation, but also the ongoing work of keeping translations current as questions and assessments change and evolve.
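The change-tracking idea can be sketched in a few lines using the illustrative Question class above: compare the base-question revision each translation was made from against the current revision, and flag any that have fallen behind. Again, this is an assumption about one possible mechanism, not a description of Questionmark’s internals:

```python
def translations_needing_update(question: Question) -> list[str]:
    """Return languages whose translations predate the current base-language revision."""
    return [lang for lang, tr in question.translations.items()
            if tr.revision < question.revision]

# Example: a French translation is made, then the English base text changes.
q = Question(question_id="Q42", base_language="en",
             text="Which of the following statements is correct?")
q.add_translation("fr", "Laquelle des affirmations suivantes est correcte ?")
q.text = "Which one of the following statements is correct?"
q.revision += 1

print(translations_needing_update(q))  # ['fr']
```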

We’re very excited about how easy it is to translate questions in Perception version 5, and we look forward to your feedback as you use it to create quizzes, tests, exams and surveys that can be used in many languages.

The Secret of Writing Multiple-Choice Test Items

Posted by Julie Chazyn

I read a very informative blog entry on the CareerTech Testing Center Blog that I thought was worth sharing. It’s about multiple-choice questions: how they are constructed and some tips and tricks for creating them.

I asked its author, Kerry Eades, an Assessment Specialist at the Oklahoma Department of Career and Technology Education (ODCTE), about his reasons for blogging on The Secret of Writing Multiple-Choice Test Items. According to Kerry, CareerTech Testing Center took this lesson from a booklet they put together as a resource for subject matter experts who write multiple-choice questions for their item banks, as well as for instructors who need better instruments for creating strong in-class assessments in their own classrooms. Kerry points out that the popularity of multiple-choice questions “stems from the fact that they can be designed to measure a variety of learning outcomes.” He says it takes a great deal of time, skill, and adherence to a set of well-recognized rules for item construction to develop a good multiple-choice item.

The CareerTech Testing Center works closely with instructors, program administrators, industry representatives, and credentialing entities to ensure that skills standards and assessments meet Carl Perkins requirements and reflect national standards and local industry needs. Using Questionmark Perception, CareerTech conducts tests for more than 100 career majors, with an online competency assessment system that delivers approximately 75,000 assessments per year.

Check out The Secret of Writing Multiple-Choice Test Items.

For more authoring tips visit Questionmark’s Learning Café.