Podcast: Alignment, Impact and Measurement With the A-model

Posted by Julie Delazyn

The growing emphasis on performance improvement — of which training is just a part — calls for new strategies for assessment and evaluation.

Bruce C. Aaron

Measurement and evaluation specialist Dr. Bruce C. Aaron has devoted a lot of thought to this. His white paper, Alignment, Impact and Measurement with the A-model, describes a framework for aligning assessment and evaluation with an organization’s goals, objectives and human performance issues.

For more information on the A-model, check out the video and free white paper: Alignment, Impact and Measurement with the A-Model.

Our podcast interview with Bruce about the A-model has been of great interest to learning and HR professionals. The interview explores how this framework addresses the changes that have taken place in recent years and the resulting complexities of today’s workplace.

Here are a few excerpts from the conversation. If you’d like to learn more, listen to the 10-minute podcast below.

“The things that I’ve observed have to do with our moving away from a training focus into a performance focus. So we don’t speak so much about training or even training and development anymore. We speak a lot more about performance improvement, or human performance, or learning and performance in the workplace. And those sorts of changes have had a great impact in how we do our business, how we design our solutions and how we go about assessing and evaluating them.

…the A-model evolved out of dealing with the need to evaluate all of this and still focus on what are we trying to accomplish: how do we go about parsing up the components of our evaluation and keeping those things logically organized in their relationship to each other?

…If we have a complex, blended solution, if we haven’t done a good job of really tying that to our objectives and to the original business issue that we’re trying to address…it becomes apparent through a focus on evaluation and assessment.”

Podcast: John Kleeman on Questionmark’s expanding assessment cloud

Posted by Joan Phaup

John Kleeman

We recently celebrated a major expansion of our cloud-based Questionmark OnDemand service with the addition of Questionmark’s European Data Centre to the service we have been providing for some time in the United States.

Wanting to learn more about the reasons for adding the new data centre, I spoke with Questionmark Chairman John Kleeman about the transition many customers are making to cloud-based assessment management.

Here are some of the points we covered:

  • why software-as-a-service offers a reliable, secure and cost-effective alternative to deploying assessments from in-house servers
  • the importance of giving customers the option of storing data either in the United States or the European Union
  • the extensive security procedures, system redundancy, multiple power sources, round-the-clock technical supervision and other measures that make for reliable cloud-based assessments
  • the relative merits of Questionmark OnDemand (cloud-based) and Questionmark Perception (on-premise) assessment management — and how to choose between them

You can learn more by listening to our conversation here and/or reading the transcript.

Podcast: Charles Jennings on measuring informal and workplace learning

Posted by Joan Phaup

I’m eagerly looking forward to the keynote presentation Charles Jennings will deliver at the Questionmark 2013 Users Conference on The Challenge of Measuring Informal and Workplace Learning.

Charles Jennings

Charles is one of the world’s leading thinkers and practitioners in learning and development — currently head of Duntroon Associates and previously Chief Learning Officer for Reuters and Thomson Reuters. He will be talking at the conference about how the 70:20:10 learning framework — based on studies showing that high performers learn approximately 70% from experience, 20% from others and 10% from formal study — is being adopted by many organizations around the world.

The keynote will address how this framework serves as a strategy for extending development beyond formal, structured learning to include informal and experiential learning opportunities.

I spoke with Charles recently and asked him for some details about his presentation. For example:

  • How would you describe the 70:20:10 framework?
  • What are the key challenges of measurement and evaluation within that framework?
  • How will your conference presentation address those kinds of challenges?
  • What advice would you give to organizations that want to use online assessments to measure the effectiveness of informal and experiential learning?

If you’d like to find out how he answered, listen to this podcast or read the transcript. There will be much, much more, of course, in his keynote address and in the conference program, which we are busy planning right now.

Early-bird registration savings are available through November 16 — so keep an eye on the conference website and be sure to register soon! We’ll look forward to seeing you in Baltimore, Maryland, March 3 – 6 at this terrific learning and networking event.

Podcast: Jarrod Morgan of ProctorU explains live monitoring of online tests

Posted by Joan Phaup

Andrew Jackson University, a distance education school based in Birmingham, Alabama, faced a big question several years ago: How could students take tests the same way they pursued their studies — off campus, via the internet — with the same level of security they would encounter in a test center?

Jarrod Morgan, who was then the school’s director of technology, recently told me the answer:

Jarrod Morgan

We decided to come up with our own process based on the face-to-face monitoring experience that’s been used by colleges and universities and test centers for decades and decades.

What that means is that you have to see the person, you have to see what they’re doing, and you have to know who they are. So if you can do all three of those, see the person, see what they’re doing, and know who they are, you’ve essentially monitored that exam the same way you would do it in a test center.

Great idea! But how did they implement it? By using webcams, screen-sharing technology and layered authentication methods, with a live proctor observing each test taker.

The success and popularity of this solution led to the formation of ProctorU, a live, online proctoring/monitoring service that allows people to complete assessments from wherever they are while ensuring exam integrity. Questionmark partners with ProctorU to bring added security for higher-stakes exams taken outside testing centers.

In this podcast interview with Jarrod, now the company’s vice president of business development, you will hear more about how online monitoring works and why academic institutions, businesses and other organizations are choosing this option. We also talked about various security challenges, including identity fraud, and how to combat them. We touched on how test takers have responded to this idea, too. Feel free to listen in!

Conference close-up: Assessment as an integral part of instructional design

Jane Bozarth

Posted by Joan Phaup

With the Questionmark Users Conference now less than a month away, it’s a good time to check out the conference agenda and — if you haven’t already done so — to sign up for three great days of learning and networking in New Orleans March 20 – 23.

Two high points on the program will be presentations by Dr. Jane Bozarth:

  • a keynote address on the importance of starting with good objectives and clear outcomes for assessments and using them strategically to support organizational goals
  • a breakout session called Instructional Design for the Real World — about tools and tricks that support rapid instructional design, help with needs analysis and make for effective communication with subject matter experts, managers and others

As a training practitioner for more than 20 years, and as Elearning Coordinator for the North Carolina Office of State Personnel, Jane will bring a lot of firsthand experience to these presentations. During a conversation I had with her shortly after she agreed to present at the conference, Jane pointed out some common pitfalls that she will address during her keynote to help listeners address the right things at the right time for the right outcome:

  • getting so caught up in writing objectives and developing instruction as to lose sight of the desired end result
  • measuring the wrong things or things that have insignificant impact
  • paying too little attention to formative assessment
  • waiting until after a product is designed to go back and write the assessment for it, instead of addressing assessment first

You can listen to the podcast of our conversation right here and/or read the transcript.

Conference Close-up: Alignment, Impact & Measurement with the A-model

Posted by Joan Phaup

Key themes of the Questionmark Users Conference March 20 – 23 include the growing importance of informal and social learning — as reflected by the 70:20:10 model — and the role of assessment in performance improvement and talent management. It’s clear that new strategies for assessment and evaluation are needed within today’s complex workplaces.

Dr. Bruce C. Aaron

We’re delighted that measurement and evaluation specialist Dr. Bruce C. Aaron will be joining us at the conference to talk about the A-model framework he has developed for aligning assessment and evaluation with organizational goals, objectives and human performance issues.

A conversation Bruce and I had about the A-model explores the changes that have taken place in recent years and today’s strong focus on performance improvement.

“We don’t speak so much about training or even training and development anymore,” Bruce explained. “We speak a lot more about performance improvement, or human performance, or learning and performance in the workplace. And those sorts of changes have had a great impact in how we do our business, how we design our solutions and how we go about assessing and evaluating them…We’re talking about formal learning, informal learning, social learning, classroom, blended delivery, everything from online learning to how people collect information from their networks and the knowledge management functions that we’re putting in place.”

In a complex world that requires complex performance solutions, Bruce observed that “the thing that doesn’t change is our focus on outcomes.”

The A-model evolved out of the need to stay focused on goals and to logically organize the components of learning, evaluation and performance improvement. It’s a framework or map for holding the many elements of human performance in place — right from the original business problem or business issue up through program design and evaluation.

You can learn more about this from Bruce’s white paper, Alignment, Impact and Measurement with the A-model, from this recording of our conversation — and, of course, by attending the Users Conference! Register soon!
