Posted by Joan Phaup
We recently celebrated a major expansion of our cloud-based Questionmark OnDemand service with the addition of Questionmark’s European Data Centre to the service we have been providing for some time in the United States.
Wanting to learn more about the reasons for adding the new data centre, I spoke with Questionmark Chairman John Kleeman about the transition many customers are making to cloud-based assessment management.
Here are some of the points we covered:
- why software-as-a-service offers a reliable, secure and cost-effective alternative to deploying assessments from in-house servers
- the importance of giving customers the option of storing data either in the United States or the European Union
- the extensive security procedures, system redundancy, multiple power sources, round-the-clock technical supervision and other measures that make for reliable cloud-based assessments
- the relative merits of Questionmark OnDemand (cloud-based) and Questionmark Perception (on-premise) assessment management — and how to choose between them
You can learn more by listening to our conversation here and/or reading the transcript.
Posted by Joan Phaup
I’m eagerly looking forward to the keynote presentation Charles Jennings will deliver at the Questionmark 2013 Users Conference on The Challenge of Measuring Informal and Workplace Learning.
Charles is one of the world’s leading thinkers and practitioners in learning and development — currently head of Duntroon Associates and previously Chief Learning Officer for Reuters and Thomson Reuters. He will be talking at the conference about how the 70:20:10 learning framework — based on studies showing that high performers learn approximately 70% from experience, 20% from others and 10% from formal study — is being adopted by many organizations around the world.
The keynote will address how this framework serves as a strategy for extending development beyond formal, structured learning to include informal and experiential learning opportunities.
I spoke with Charles recently and asked him for some details about his presentation. For example:
- How would you describe the 70:20:10 framework?
- What are the key challenges of measurement and evaluation within that framework?
- How will your conference presentation address those kinds of challenges?
- What advice would you give to organizations that want to use online assessments to measure the effectiveness of informal and experiential learning?
If you’d like to find out how he answered, listen to this podcast or read the transcript. There will be much, much more, of course, in his keynote address and in the conference program, which we are busy planning right now.
Early-bird registration savings are available through November 16 — so keep an eye on the conference website and be sure to register soon! We’ll look forward to seeing you in Baltimore, Maryland, March 3 – 6 at this terrific learning and networking event.
Posted by Joan Phaup
Andrew Jackson University, a distance education school based in Birmingham, Alabama, faced a big question several years ago: How could students take tests the same way they pursued their studies — off campus, via the internet — with the same level of security they would encounter in a test center?
Jarrod Morgan, who was then the school’s director of technology, recently told me the answer:
We decided to come up with our own process based on the face-to-face monitoring experience that’s been used by colleges and universities and test centers for decades and decades.
What that means is that you have to see the person, you have to see what they’re doing, and you have to know who they are. So if you can do all three of those, see the person, see what they’re doing, and know who they are, you’ve essentially monitored that exam the same way you would do it in a test center.
Great idea! But how did they implement it? By using webcams, screen-sharing technology and layered authentication methods, with a live proctor observing each test taker.
The success and popularity of this solution led to the formation of ProctorU, a live, online proctoring/monitoring service that allows people to complete assessments from wherever they are while ensuring exam integrity. Questionmark partners with ProctorU to bring added security for higher-stakes exams taken outside testing centers.
In this podcast interview with Jarrod, now the company’s vice president of business development, you will hear more about how online monitoring works and why academic institutions, businesses and other organizations are choosing this option. We also talked about various security challenges, including identity fraud, and how to combat them. We touched on how test takers have responded to this idea, too. Feel free to listen in!
Posted by Joan Phaup
With the Questionmark Users Conference now less than a month away, it’s a good time to check out the conference agenda and — if you haven’t already done so — to sign up for three great days of learning and networking in New Orleans March 20 – 23.
Two high points on the program will be presentations by Dr. Jane Bozarth:
- a keynote address on the importance of starting with good objectives and clear outcomes for assessments and using them strategically to support organizational goals
- a breakout session called Instructional Design for the Real World — about tools and tricks that support rapid instructional design, help with needs analysis and make for effective communication with subject matter experts, managers and others
As a training practitioner for more than 20 years, and as Elearning Coordinator for the North Carolina Office of State Personnel, Jane will bring a lot of firsthand experience to these presentations. During a conversation I had with her shortly after she agreed to present at the conference, Jane pointed out some common pitfalls she will address during her keynote, helping listeners focus on the right things at the right time for the right outcome:
- getting so caught up in writing objectives and developing instruction as to lose sight of the desired end result
- measuring the wrong things or things that have insignificant impact
- paying too little attention to formative assessment
- waiting until after a product is designed to go back and write the assessment for it, instead of addressing assessment first
You can listen to the podcast of our conversation right here and/or read the transcript.
Posted by Joan Phaup
Key themes of the Questionmark Users Conference March 20 – 23 include the growing importance of informal and social learning — as reflected by the 70:20:10 model — and the role of assessment in performance improvement and talent management. It’s clear that new strategies for assessment and evaluation are needed within today’s complex workplaces.
We’re delighted that measurement and evaluation specialist Dr. Bruce C. Aaron will be joining us at the conference to talk about the A-model framework he has developed for aligning assessment and evaluation with organizational goals, objectives and human performance issues.
A conversation Bruce and I had about the A-model explores the changes that have taken place in recent years and today’s strong focus on performance improvement.
“We don’t speak so much about training or even training and development anymore,” Bruce explained. “We speak a lot more about performance improvement, or human performance, or learning and performance in the workplace. And those sorts of changes have had a great impact in how we do our business, how we design our solutions and how we go about assessing and evaluating them…We’re talking about formal learning, informal learning, social learning, classroom, blended delivery, everything from online learning to how people collect information from their networks and the knowledge management functions that we’re putting in place.”
In a complex world that requires complex performance solutions, Bruce observed that “the thing that doesn’t change is our focus on outcomes.”
The A-model evolved out of the need to stay focused on goals while logically organizing the components of learning, evaluation and performance improvement. It’s a framework or map for holding the many elements of human performance in place — right from the original business problem or issue up through program design and evaluation.
You can learn more about this from Bruce’s white paper, Alignment, Impact and Measurement with the A-model, from this recording of our conversation — and, of course, by attending the Users Conference! Register soon!