What I Learned Back in the Big Easy

Posted by John Kleeman

Questionmark customers and staff pose for a picture at the dessert reception

I’m just back from the 2012 Questionmark Users Conference in New Orleans, “the Big Easy”. This is our second user conference in New Orleans – we were there in 2005 – and it was great to revisit a lovely city. Here is some of what I learned from Questionmark customers and speakers.

Most striking session: Fantastic to hear from a major US government agency about how they deployed Questionmark for electronic testing after 92 years of paper-based testing. The agency has a major programme, with 800,000 items and 300,000 test takers. I hope to cover what they are doing in a future blog post.

Most memorable saying: Bill Coscarelli (co-author of Criterion-Referenced Test Development) noting that, in his extensive experience, the two things in testing that really make a difference are to “test above knowledge” and to “use the Angoff method to set a cut/pass score for your tests”.

Meeting a hero: You’ll have seen me writing on this blog about Bruce C. Aaron’s A-model, and it was great to meet Bruce in person for the first time and hear him talk so intelligently about the A-model as a way of measuring the effectiveness of training, starting from a business problem.

Most fun time: I have to mention the amazing food in New Orleans – especially the catfish and the oysters. But the highlight was being part of a parade down Bourbon Street with a marching band, all the Questionmark conference attendees and a full New Orleans ensemble.

Best conversation: Discussion with a user who’d come to the conference uncertain that Questionmark was the right system for them, but realized on talking to other customers and the Questionmark team that they were only using 10% of what Questionmark could offer them, and left the conference enthused about all they would do going forward.

Session I learned most from: a security panel chaired by Questionmark CEO Eric Shepherd with executives from Pearson VUE, ProctorU, Innovative Exams and Shenandoah University. There was a great discussion about the advantages of test center delivery versus remote proctoring. Test center delivery is highly secure, but testing at home with a remote proctor observing by video link is very convenient, and it was great to hear real experts debating the pros and cons.

Most actionable advice: Keynote speaker Jane Bozarth’s “what you measure is what you get”, a reminder that what we choose to measure will shape behavior in our organizations.

It was great to spend time with and learn from customers, partners and our own team. I look forward to seeing many of you again at the 2013 Questionmark Users Conference in Baltimore’s Inner Harbor, March 3-6, 2013.

PricewaterhouseCoopers wins assessment excellence award

Posted by Joan Phaup

Greg Line and Sean Farrell accept the Questionmark Getting Results Award on behalf of PwC

I’m blogging this morning from New Orleans, where we have just completed the 10th annual Questionmark Users Conference.

It’s been a terrific time for all of us, and we are already looking forward to next year’s gathering in 2013.

One highlight this week was yesterday’s presentation of a Questionmark Getting Results Award to PricewaterhouseCoopers.

Greg Line, a PwC Director in Global Human Capital Transformation, and Sean Farrell, Senior Manager of Evaluation & Assessment at PwC, accepted the award.

Getting Results Award

The award acknowledges PwC’s successful global deployment of diagnostic and post-training assessments to more than 100,000 employees worldwide, as well as 35,000 employees in the United States.

In delivering more than 230,000 tests each year — in seven different languages — PwC defines and adheres to sound practices in the use of diagnostic and post-training assessments as part of its highly respected learning and compliance initiatives. These practices include developing test blueprints, aligning test content with organizational goals, utilizing sound item writing techniques, carefully reviewing question quality and using Angoff ratings to set passing scores.
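For readers unfamiliar with the Angoff method mentioned above, here is a minimal sketch of how a cut score is typically derived: each subject matter expert rates the probability that a minimally competent candidate would answer each item correctly, the ratings are averaged per item, and the averages are summed. The numbers and item names below are made up for illustration and are not PwC’s data or process.

    # Illustrative (hypothetical) Angoff calculation: each judge estimates the
    # probability that a minimally competent candidate answers each item
    # correctly; the cut score is the sum of the per-item averages.
    ratings = {
        "item_1": [0.70, 0.65, 0.75],   # one probability per judge
        "item_2": [0.55, 0.60, 0.50],
        "item_3": [0.90, 0.85, 0.95],
    }

    item_means = {item: sum(r) / len(r) for item, r in ratings.items()}
    cut_score = sum(item_means.values())          # expected raw score: 2.15 out of 3
    cut_percent = 100 * cut_score / len(ratings)  # about 71.7%

    print(f"Cut score: {cut_score:.2f} / {len(ratings)} ({cut_percent:.1f}%)")

In practice the rating exercise involves many more items and judges, plus discussion and review rounds, but the arithmetic at the heart of it is this simple.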

Adhering to these practices has helped PwC deploy valid, reliable tests for a vast audience – an impressive accomplishment that we were very pleased to celebrate at the conference.

So that’s it for 2012! But mark your calendar for March 3 – 6, 2013, when we will meet at the Hyatt Regency in Baltimore!

Open Platform, Security and Eating Out: looking forward to New Orleans

Posted by Steve Lay

Just a few days to go until the Questionmark Users Conference.  As usual I’ve been perusing the conference programme on the lookout for sessions of particular interest to people integrating with Questionmark technologies.

One that stood out to me was Using Innovative Technologies to Aid High-Volume Testing in Multiple Environments. The team from Oklahoma Department of Career and Technology Education will be talking about their application which was developed in partnership with Questionmark’s Solution Services team.  If you want to get an idea of what can be achieved using our application programming interfaces (APIs), you might want to check out this session.

For people who want to get a bit more technical, I’ll be giving a session on Using Web Services to Integrate with Questionmark Perception. In that session I’ll dive a bit deeper into our web service APIs – called QMWISe – and update you on our progress towards the next generation of APIs using REST and OData – but that’s enough acronyms for one blog post!
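If you have not come across OData before, the sketch below shows the general shape of an OData-style REST query made from Python. The service root, entity set and field names are hypothetical placeholders rather than actual Questionmark endpoints; the real resource names will come with the documentation for the new APIs.

    # Hypothetical example of an OData-style REST query; the URL, entity set
    # and field names are placeholders, not a real Questionmark endpoint.
    import requests

    BASE_URL = "https://example.com/odata"   # placeholder service root

    response = requests.get(
        f"{BASE_URL}/Results",               # hypothetical entity set
        params={
            "$filter": "AssessmentName eq 'Safety Quiz'",  # standard OData query options
            "$orderby": "SubmittedDate desc",
            "$top": "10",
        },
        auth=("api_user", "api_password"),   # placeholder credentials
        headers={"Accept": "application/json"},
        timeout=30,
    )
    response.raise_for_status()
    for result in response.json().get("value", []):  # OData wraps result rows in "value"
        print(result)

The appeal of OData is exactly this uniformity: once you know the query options, the same pattern works against any compliant service.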

A significant theme of the conference next week is security.  To an integration specialist, security often means protocols for authentication, authorisation and cryptographic algorithms.  But this traditional concept of security is just one part of a more complex picture.  I’m currently reading “Liars and Outliers” by Bruce Schneier, which provides an accessible survey of the wider social context from the perspective of an author traditionally associated with computer security.  I touched on some of these issues in a blog post a few months ago about a session at our European conference last Autumn.

On this theme, Richard Pierce, from Shenandoah University, will be talking about proctored versus non-proctored testing, grappling directly with the effects of the assessment setting on cheating.  You might also like to attend the panel discussion on Managing Assessment Security as the Stakes are Getting Higher.  It seems unfair to pick out just one or two sessions: as you’d expect, there are plenty more for people interested in a theme that affects every part of an assessment programme.

As usual, the product owners will be available in their own track so why not come and see us to talk about our product road maps in the Future Solutions strand?  Oh, and before I go, did I mention that I’ll also be talking about how to deploy the brand new Perception 5.4 OnPremise?

I look forward to meeting everyone, and even if you can’t make it in person don’t forget to keep up with the conference hash tag (#qm12).  Finally, I’ll leave you with one of my favourite quotes from the conference programme, in Richard Pierce’s words:  “Where are we eating tonight? It is New Orleans, for goodness’ sake!”

See you next week!

Conference Close-up: Timing is Everything

Posted by Joan Phaup

As many followers of this blog will know, Questionmark Chairman John Kleeman has been exploring the findings of cognitive learning research and considering how it can apply to assessments.

At this year’s Questionmark Users Conference in New Orleans, John will explain research on the value of spacing out learning and assessment as a means of helping people remember information for the long term.

I asked him the other day about his presentation, “Timing is Everything: Using psychology research to make your assessments more effective.”

What do you value most about what you’ve been learning from the work of cognitive learning researchers?

John Kleeman

There’s a lot of evidence from cognitive psychology that could make a difference in how we do assessments, training and learning, and I want people to be aware of it. I’m excited about research that’s being done and am keen to share research findings so that people involved in learning and assessment can use that evidence in practical ways.

What key findings about the timing of assessments will you share during your breakout session?

There’s a fascinating and well-documented finding that if you space out learning – separate it out – it’s much more effective than if you do it all in one chunk. Learning should be regarded as a process, not an event. Research shows that if you spend half an hour a day for four days learning something it will be more effective than if you cover all the material in a single two-hour session. Because of the way the mind works, having breaks between learning sessions will help you remember information for the long term. I’ll be sharing solid evidence about this and will talk about its implications.

Assessment plays into all this! An assessment with feedback is learning, and so if you take a series of separated-out assessments, this gives you spacing. Also if you give learners a series of assessments during a course, and encourage them to learn and revise for each assessment, you are encouraging this good behavior of spacing out their learning. So if you have people take quizzes and tests throughout learning instead of just at the end, you are promoting some good learning habits.

Can you give me an example of this?

I wrote not too long ago about how the University of Lund in Sweden uses embedded assessments for knowledge checks within a SharePoint-based learning platform. They’ve found that requiring students to take quizzes as they work through distance learning courses forces them to engage with the material and practice retrieving information from the very start of their courses — and to keep that engagement going throughout the course. That’s a great way to prevent people from cramming a lot of learning into a short period of time – as they might do before a final exam. So assessments can be used to help space out learning. If you just rely on final tests or exams then there is a risk of encouraging people to cram at the end of the course, rather than helping them learn and remember information for the long term.

How can we use research findings about the spacing of learning in designing tests and quizzes?

One of the things I covered last year and will review in this session is the benefit of retrieval practice – the fact that if you want to remember something for the long term then you want to practice retrieving it from memory. Things you have practiced retrieving are easier to remember. So if you have to answer a question or take a test on material, it makes it more likely you will remember it in future. For example, for retrieval practice it’s best if possible to use open-ended questions that require the person to recall information rather than recognize it from a multiple choice list. And you definitely should include feedback.

Another point to keep in mind is that the material that gets tested is the material people will remember, so it’s important to cover the material you really regard as valuable and worth remembering. We’ll discuss these and other ideas about how to apply the research to assessments during the session.

What do you expect participants in your session to take away from it?

I’ll be sharing some of the evidence from cognitive psychology, both to communicate what I understand the results to be and to let people come to their own views. I will also share links to resources from experts like Dr. Will Thalheimer of Work-Learning Research so that attendees can learn more on their own.

I don’t just want to say, “Here’s the data and believe me!” I want to give attendees a way to look into this themselves and understand it. Once they grasp the principles coming out of the research, it will help them formulate assessment practices and plans in ways that apply to their individual organizations’ needs.

———-
We hope to see you in New Orleans March 20 – 23 and encourage you to sign up by January 27 for early-bird savings.

Conference agenda nearly full as early-bird registration nears end (Friday!)

Posted by Joan Phaup

As we zero in on the first early-bird registration deadline for the Questionmark 2012 Users Conference this Friday, I’m pleased to report that the agenda is nearly full.

The line-up may change somewhat between now and March 20 – 23, when we convene at the Ritz-Carlton New Orleans, but here’s an outline of what’s in store.

Before the conference gets rolling, there are two optional full-day workshops to choose from on Tuesday, March 20.

During her presentation, Look Before You Leap: What You Measure is What You Get, our keynote speaker, Dr. Jane Bozarth, will include tips on planning assessments with clear objectives and outcomes.

Here’s the current list of breakouts, which you can explore in detail within the conference agenda:

Questionmark Features & Functions

  • Introduction to Questionmark Perception for Beginners
  • Answers to Windows Authoring FAQs
  • Using Questionmark Live for Surveys and Course Evaluations
  • Reporting and Analytics: Understanding and sharing assessment results
  • Deploying Questionmark Perception 5.4
  • Questionmark Live Bring-your-own-laptop Tutorial
  • Customizing the Participant Interface

Case Studies

  • An Architect’s Approach to Questionmark Assessment Development: How to Architect, Design and Implement an Efficient Assessment-Building Process
  • How e-Testing is Improving Assessment for the U.S. Coast Guard
  • The Net Promoter Score (NPS) Question Type
  • Analysis of Examination Time Data at the Question- and Person-Specific Level with Perception
  • The Big Switch: Moving Training and Assessment to Mobile Devices
  • Using Questionmark to Answer Business Questions Related to Your Work Processes
  • Installation and Deployment of Perception over the U.S. Navy’s Internet

Best Practices

  • Applying the Principles of Item and Test Analysis to Your Assessment Program
  • Timing is Everything: Using psychology research to make your assessments more effective
  • Alignment, Impact and Measurement with the A-model
  • Instructional Design for the Real World
  • Using Captivate and Flash Simulations in eLearning and Assessments
  • Using Web Services to Integrate with Questionmark Perception

Discussions

  • Proctored versus non-proctored: How does assessment setting affect student achievement on web-based assessments?
  • Managing Assessment Security as the Stakes are Getting Higher
  • Test Defensibility
  • Using the Angoff Method to Set Cut Scores

Drop-in Demos

  • Delivering Assessments to Mobile Devices
  • What’s New in Perception 5.3 and 5.4?

Register for the conference by December 9th to save $200!

Questions? Email conference@questionmark.com. We’re happy to help!


Podcast: Jane Bozarth on looking before you leap into learning measurement

Posted by Joan Phaup

I had a great time the other day chatting with Dr. Jane Bozarth, our keynote speaker for the Questionmark 2012 Users Conference in New Orleans March 20 – 23.

Dr. Jane Bozarth

Jane, whose degrees include a doctorate in Training and Development, will be speaking from her extensive experience as a training practitioner for more than 20 years. She is eLearning Coordinator for the State of North Carolina as well as a book author, Learning Solutions Magazine columnist and blogger.

During her keynote, Look Before You Leap: What You Measure is What You Get, Jane will share methods for building assessment directly into learning design.

We are delighted that she will also present a best practice session on Instructional Design for the Real World, for those looking for tools and tricks that support rapid instructional design, get to the heart of needs analysis, and improve communication with subject matter experts, managers and others. You can get details about this and other breakout sessions by visiting the conference agenda.

Early-bird conference registration is open until December 9th, so this is a good time to sign up!

Listen in on my conversation with Jane in this podcast, or click here for a transcript.