Announcing secure delivery of higher-stakes tests on the Apple iPad

Posted by Julie Delazyn

Questionmark Secure has been safeguarding the delivery of medium- and high-stakes tests for over a decade and last year became available to Mac users. Today, I’m pleased to announce that Questionmark Secure is now available for iPad users, too – and that you can download Questionmark Secure for iPad from the iTunes App Store.

Like its predecessors, this free app locks down the browser, disabling functions that participants could use to print or copy exam material, “accidentally” exit a test, or gain access to materials on their devices or the Internet that could give an unfair advantage. The app provides a secure environment for delivering higher-stakes assessments such as tests and exams. Used along with other measures for combating impersonation and content theft, it can help reduce the risk of cheating.

Organizations can use the app to deliver medium- and high-stakes tests via low-cost, highly portable tablets — perfect for a BYOD situation or for setting up mobile test centers.

We are very pleased to offer this option to our customers, who are embracing the use of mobile devices for many different purposes.

Shenandoah University School of Pharmacy helped beta test the new app as part of an integrated mobile learning program. So far, students involved in the program have been using the Apple MacBook Pro for accessing course material and online assessments. They’ve also had their choice of using an iPod Touch, iPad 3G or iPhone for quick mobile delivery of content. But this fall, incoming students will work with the iPad as well as the MacBook Pro. If you’d like more details about Shenandoah’s iMLearning initiative, see Shenandoah’s case study slides from this year’s Questionmark Users Conference. (Here’s a link to the 2014 conference!)

To learn more about the app or to download it from iTunes, click here.

Why mobile assessment matters

Posted by John Kleeman

Assessment on mobile phones and tablets matters because so many people have these devices, and there is a huge opportunity to use them. Sometimes it’s easy to forget how rapid a change this has been!

I’m indebted to my colleague Ivan Forward for this visualization showing the increase in mobile phone ownership in 10 years. It’s based on data from the South African census reported by the BBC.

[Graph: growth in mobile phone ownership in South Africa, 2001–2011]

As you can see, in the decade from 2001 to 2011, more South Africans gained access to electricity, flush toilets and higher education, but the change in use of mobile phones has been far more dramatic.

Figures in other countries will vary, but in every country mobile phone use has increased hugely.

Not only does this explain why mobile assessment matters, it also explains why so many organizations (including Questionmark customers) are moving to Software as a Service / on-demand systems. Because of the rise of mobile phones, the parallel rise in tablets and the fast changing nature of mobile technology, you need your software to be up to date. And for most organizations, this is easier to do if you delegate it to a system like Questionmark OnDemand than if you have to update and re-install your own software frequently.

Observational Assessments—why and how

Posted by Julie Delazyn

An Observational Assessment, in which an observer watches a participant perform a task and rates his or her performance, makes it possible to evaluate skills or abilities that are difficult to measure using “traditional” assessments.

As Jim Farrell noted in a previous post, “By allowing a mentor to observe someone perform while applying a rubric to their performance, you allow for not only analytics of performance but the ability to compare to other individuals or to agreed benchmarks for performing a task. Also, feedback collected during the assessment can be displayed in a coaching report for later debriefing and learning.”

Click here for examples of how different types of organizations capture performance data and measure competencies using observational assessments.

If you would like to learn more about Observational Assessments, check out this SlideShare presentation. Also, this video – one of many instructional resources available in our Learning Café – offers a brief overview and shows how to schedule an observational assessment, deliver it to a mobile device and report on the results.

Delivering and Reporting on Observational Assessments Using Questionmark Perception

Observational assessments: measuring performance in a 70+20+10 world

Posted by Jim Farrell

Informal Learning. Those two words are everywhere. You might see them trending on Twitter during a #lrnchat, dominating the agenda at a learning conference or gracing the pages of a training digest. We all know that informal learning is important, but measuring it can often be difficult. However, difficult does not mean impossible.

Remember that in the 70+20+10 model, 70 percent of learning results from on the job experiences and 20 percent of learning comes from feedback and the examples set by people around us. The final 10 percent is formal training. No matter how much money an organization spends on its corporate university, 90 percent of the learning is happening outside a classroom or formal training program.

So how do we measure the 90 percent of learning that is occurring to make sure we positively affect the bottom line?

First is performance support. Eons ago, when I was an instructional designer, it was the courseware and formal learning that received most of the attention. Looking back, we missed the mark: Although the projects were deemed successful, we likely did not have the impact we could have. Performance support is the informal learning tool that saves workers time and leads to better productivity. Simple web analytics can tell you the performance support that is most searched on and used on a daily basis.
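To make the web-analytics point concrete, here is a minimal sketch of the kind of tally a team might run over access logs to see which performance support resources are consulted most often. The log entries and page names are entirely hypothetical, and a real analytics platform would supply this data for you.

```python
from collections import Counter

# Hypothetical access-log entries: (user, page) pairs for
# performance-support resources. Real data would come from
# your web analytics platform or server logs.
access_log = [
    ("alice", "expense-report-howto"),
    ("bob", "expense-report-howto"),
    ("carol", "vpn-setup-guide"),
    ("alice", "expense-report-howto"),
    ("bob", "crm-field-reference"),
]

# Tally views per page to find the most-consulted resources.
page_views = Counter(page for _, page in access_log)
for page, views in page_views.most_common(3):
    print(f"{page}: {views} views")
```

Even a simple count like this shows where workers actually turn for help day to day, which is the signal the paragraph above is describing.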

But on to what I think Questionmark does best – that 20 percent that is occurring through feedback and examples around us. Many organizations have turned to coaching and mentoring to give employees good examples and to define the competencies necessary to be a great employee.

I think most organizations are missing the boat when it comes to collecting data on this 20 percent. While I think coaching and mentoring are a step in the right direction, they probably aren’t yielding good analytics. Yes, organizations may use surveys and/or interviews to measure how mentoring closes performance gaps, but how do we get employees to the next level? I propose the use of observational assessments. By definition, observational assessments enable measurement of participants’ behavior, skills and abilities in ways not possible via traditional assessment.

By allowing a mentor to observe someone perform while applying a rubric to their performance, you allow for not only analytics of performance but the ability to compare to other individuals or to agreed benchmarks for performing a task. Also, feedback collected during the assessment can be displayed in a coaching report for later debriefing and learning. And to me, that is just the beginning.
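The rubric-and-benchmark idea above can be sketched in a few lines. Everything here is illustrative: the criteria, weights, 0-4 scale and 75% benchmark are assumptions for the example, not part of any Questionmark product.

```python
# Hypothetical rubric: each criterion is scored 0-4 by the observer,
# and each carries a weight reflecting its importance.
rubric = {"greets_customer": 1.0, "identifies_need": 2.0, "closes_task": 2.0}

BENCHMARK = 0.75  # agreed benchmark: 75% of the maximum weighted score

def weighted_score(observations):
    """Return the participant's weighted score as a fraction of the maximum."""
    earned = sum(rubric[criterion] * score
                 for criterion, score in observations.items())
    maximum = sum(weight * 4 for weight in rubric.values())
    return earned / maximum

# One observed performance, scored criterion by criterion.
alice = {"greets_customer": 4, "identifies_need": 3, "closes_task": 2}
score = weighted_score(alice)
print(f"score={score:.2f}, meets benchmark: {score >= BENCHMARK}")
```

Because each performance reduces to a comparable number, results can be charted against other individuals or against the agreed benchmark, which is exactly the analytics opportunity described above.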

Developing an observational assessment should go beyond the tasks someone has to do to perform their day-to-day work. It should embody the competencies necessary to solve business problems. Observational assessments allow organizations to capture performance data, and measure the competencies necessary to push the organization to be successful.

If you would like more information about observational assessments, click here.

Bersin notes Questionmark in mobile learning report

Posted by John Kleeman

Bersin & Associates is a well-established analyst company in the talent management space. They conduct research and advise companies on which vendors to buy from and on trends in the space.

We were pleased to see this description of Questionmark in their March 23, 2011, report on mobile learning by David Mallon: m-Learning: Mobile Learning Is Finally Going Mainstream – And It Is Bigger Than You Might Think. They described Questionmark as:

The most well-known specialist provider of enterprise assessment technology, Questionmark realized early that companies would want to be able to test and evaluate learners remotely, not just deliver content to them. Questionmark’s mobile assessment applications are innovative and intuitive examples of what is possible in m-learning.

Click here for more on Questionmark’s mobile learning solutions.

Conference Close-up: Silke Fleischer on Design Strategies for Mobile Devices

Posted by Joan Phaup

Mobile assessment is high on the list of topics we’ll be discussing at the Questionmark Users Conference March 15-18. Silke Fleischer, a co-founder of ATIV Software, will show attendees how to create materials that work well on mobile devices and manage their limitations during her talk on Getting the Most from a Small Screen: Design strategies for mobile devices.

Silke Fleischer

Silke has been a featured speaker at several national and international conferences of organizations such as the eLearning Guild and ASTD, and we’re very pleased she’ll be joining us in Los Angeles. I spoke with her recently about her work and her plans for presenting at the conference:

Q:   Could you tell me about your work at ATIV software?
A: We develop mobile applications for events and eLearning. Our core EventPilot application is a support system for conferences, events and meetings, and we are working on a mobile course app that helps training departments reduce printing costs for instructor-led training. We also do custom mobile development. In case you are wondering about how we got our company name, it stands for innovATIVe, creATIVe, alternATIVe!

Q: What trends are you seeing in mobile learning and assessment?
A: There are more than 700,000 smartphones sold per day, and learning is very convenient on them. More and more standard eLearning content is now being made available and optimized for mobile devices, especially with the technology getting better and screen resolutions larger. That’s not to say that classroom training is going away! You have to have the face-to-face interaction. I think we will see a marriage between mobile and the classroom.

One interesting trend is the use of mobile delivery within Instructor-Led Training (ILT) to cut printing and reduce budgets for handouts and course materials. Our upcoming course app, for instance, allows learners to log in and download all course-related PowerPoint presentations on an iPad and take notes directly on the device to a particular slide. When the course is over, they can take their assessment using Questionmark’s iPad App and also send themselves an email with notes and marked items for easy reference about what they have learned. Also, you can update content on the fly and integrate social aspects such as your training blog or company messaging system. This approach can save thousands of dollars in printing costs for large organizations over the year.

Q: What are the key elements that need to be considered in designing and implementing mobile assessment?
A: Mobile devices are unlike desktop computers or laptops. The smaller screen, for example, limits some types of training and assessments. You also have to consider software and hardware. On the software side, you might have to find new ways of creating interactive content if there is no Flash player available. When it comes to hardware, even though devices are very fast these days, their performance is still dependent on the processors and on memory limitations, so you will want to invest in good programmers. Wifi and cell reception requirements should be analyzed, and you can also look at native versus hybrid versus web apps. The amazing new capabilities outweigh the limitations. Today’s mobile devices make new training paradigms possible and enable instructional designers to use motion sensing and location-based services, cameras and advanced touch screens.

Q: What will you be sharing during your presentation at the Questionmark Users Conference?
A: Since mobile devices have such small screens, we will be talking about design and development approaches. I’ll be offering some basic guidelines to help attendees get started with small screen design and content presentation. An important aspect of this will be the fact that the screens aren’t only used for displaying content, but also for navigation with your fingers. I’ll be sharing some workarounds that will help people overcome the challenges presented by small screens.

Q: What are you looking forward to at the conference?
A: I’m eager to learn about other trends in learning and assessments, which I think will help me get inspired with new ideas for 2011. And of course it will be great to network with others and share ideas. I’m really excited about going to the conference!

We’re looking forward to the conference, too, and hope you will check out the program and register soon!