Observational Assessments—why and how

Posted by Julie Delazyn

An Observational Assessment, in which an observer watches a participant perform a task and rates his or her performance, makes it possible to evaluate skills or abilities that are difficult to measure using “traditional” assessments.

As Jim Farrell noted in a previous post, “By allowing a mentor to observe someone perform while applying a rubric to their performance, you allow for not only analytics of performance but the ability to compare to other individuals or to agreed benchmarks for performing a task. Also, feedback collected during the assessment can be displayed in a coaching report for later debriefing and learning.”
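The idea of applying a rubric during observation can be illustrated with a minimal sketch. This is hypothetical code, not Questionmark's implementation: it assumes a simple rubric of criteria with maximum point values, an observer's per-criterion ratings, and an agreed benchmark total to compare against.

```python
# Hypothetical rubric: each criterion has a maximum rating.
RUBRIC = {
    "preparation": 5,
    "technique": 5,
    "safety": 5,
}

def score_observation(ratings, benchmark):
    """Sum the observer's per-criterion ratings and compare to a benchmark.

    ratings: {criterion: observer's rating}
    benchmark: agreed minimum total for a satisfactory performance
    Returns (total score, whether the benchmark was met).
    """
    for criterion, rating in ratings.items():
        if criterion not in RUBRIC:
            raise ValueError(f"Unknown criterion: {criterion}")
        if not 0 <= rating <= RUBRIC[criterion]:
            raise ValueError(f"Rating out of range for {criterion}")
    total = sum(ratings.values())
    return total, total >= benchmark

total, met = score_observation(
    {"preparation": 4, "technique": 5, "safety": 3}, benchmark=10
)
print(total, met)  # 12 True
```

Because each rating is stored per criterion, the same structure supports the comparisons Jim describes: against other individuals or against an agreed benchmark.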

Click here for examples of how different types of organizations capture performance data and measure competencies using observational assessments.

If you would like to learn more about Observational Assessments, check out this SlideShare presentation. Also, this video – one of many instructional resources available in our Learning Café – offers a brief overview and shows how to schedule an observational assessment, deliver it to a mobile device and report on the results.

Delivering and Reporting on Observational Assessments Using Questionmark Perception

Fusion dining: Brinkerhoff, Maslow and software development!

Posted by Steve Lay

A friend of mine recently attended a session at The Agile Testing & BDD eXchange 2012 in London and, as a result, brought an interesting fusion of ideas to my attention.

I went online to watch Gojko Adzic and Dan North talking about Impact Mapping, a strategic planning technique, and I was intrigued to hear them discuss Brinkerhoff’s work on improving the effectiveness of training programmes.

Long-time readers of this blog may be familiar with Brinkerhoff’s work already. John Kleeman wrote a two-part post about Brinkerhoff’s Success Case Method back in 2010. For Gojko though, it seems that Brinkerhoff’s earlier work, “The Learning Alliance: Systems Thinking in Human Resource Development” has been the key influence. It is this work that first defines the Impact Mapping approach. There is a nice summary of the concept in this synopsis of Brinkerhoff and Apking’s work High Impact Learning.

Gojko Adzic is someone who applies these ideas to the world of software development, a field that is very close to my heart! At Questionmark we use agile software development techniques through our adoption of the scrum process. This means we use tools like user stories, sprints and the product backlog to help us develop our products and services. In the talk about Impact Mapping, Gojko Adzic makes a charming analogy between software development and Abraham Maslow’s hierarchy of needs. You can read about it in detail on Wikipedia but for me, Douglas Adams summarised the idea perfectly:

“The history of every major galactic civilization tends to pass through three distinct and recognizable phases, those of Survival, Inquiry, and Sophistication, otherwise known as the How, Why, and Where phases. For instance, the first phase is characterized by the question “How can we eat?” the second by “Why do we eat?” and the third by “Where shall we have lunch?”

To continue this analogy, Gojko’s argument runs something like this: of course we need to become proficient at finding food but there comes a time when continually asking people if they are still hungry is missing the point, and we should be asking whether or not they enjoyed their meal. Put another way, businesses also have a hierarchy of needs and once we move beyond basic proficiency in our business processes we need to move up to measuring the impact at higher levels.

While Gojko Adzic and Dan North have some suggestions for how to do this in the world of software development it did bring me back to Brinkerhoff’s work and how this advice can be applied more generally, for example, by adopting techniques like the Success Case Method for identifying high impact training programmes.

This year’s top five blog posts

Posted by Julie Delazyn

From videos to SlideShare presentations, we use this blog to share research findings, best practices and success stories related to assessment and measurement.

But what do you think of the subjects we cover?

We took a look at the five most-requested posts of the past year. Here they are, in no particular order:

12 Tips for Writing Good Test Questions

With so much to remember about writing effective test questions, this post makes it easy to focus on what’s important.

Use a survey with feedback to aid student retention

Questionmark Chairman John Kleeman shares a success story about how students at the University of Glamorgan in Wales answer questions about aspects of their studying and receive feedback to help them improve. He describes how an “Early Days” exercise for new students and a “Study Health Check” exercise for all students help avoid situations that can lead to drop-out.

Being a Good SME Wrangler

Questionmark Product Manager Jim Farrell talks about how important it is for instructional designers to empower subject matter experts (SMEs) to transfer their knowledge and to involve them in creating deliverables to be used by their peers. He considers how Questionmark Live can help learning professionals foster successful relationships with SMEs by making it easy to harvest content from them.

Golden Topics: Making success on key topics essential for passing a test

This post takes a look at why some topics can be more important than others and how testing should reflect that fact. It also explains how to set up tests that require participants to achieve a particular score on critical topics as well as a passing score on the entire test.
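The pass rule described above, requiring both an overall passing score and a minimum score on each critical topic, can be sketched in a few lines. This is a hypothetical illustration of the logic, not Questionmark's implementation; the data structures and names are assumptions.

```python
def passes(topic_scores, overall_cut, golden_cuts):
    """Apply a 'golden topics' pass rule.

    topic_scores: {topic: score achieved}
    overall_cut: minimum total score to pass the whole test
    golden_cuts: {critical topic: minimum score required on that topic}
    A participant passes only if BOTH conditions are met.
    """
    overall = sum(topic_scores.values())
    if overall < overall_cut:
        return False
    return all(topic_scores.get(t, 0) >= cut for t, cut in golden_cuts.items())

scores = {"safety": 8, "procedures": 6, "general": 7}
print(passes(scores, overall_cut=18, golden_cuts={"safety": 7}))  # True
print(passes(scores, overall_cut=18, golden_cuts={"safety": 9}))  # False
```

The second call fails even though the total score (21) clears the overall cut, because the critical “safety” topic falls short of its required score; that is exactly why some topics matter more than others.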

Timing is Everything: Using psychology research to make your assessments more effective

John Kleeman offers a SlideShare presentation that will help you learn about psychology research and how you can apply it to improve your use of assessments.

New Online Training Opportunities for Questionmark Users

Posted by Chloe Mendonca

For those who want to get started using Questionmark, we are pleased to have launched a new range of online training courses to acquaint participants with our software. We have completed the first round of these courses and are pleased with the response so far.

Our European Training Manager Kate Soper, who has extensive experience in e-assessment and has taught many Questionmark training courses over the years, leads all three of these online courses. She shared with me some of the ways course participants can benefit from the syllabus, as well as from her advice and support:

How long have you been with Questionmark and what does your role entail?

I’ve been working with the company since 2008 as the European trainer. I travel to our customers’ offices in many different countries to train them in using Questionmark Perception, in addition to holding courses at our London Office. I have just completed my Post Graduate Certificate in ‘Online and Distance Education: Practices and Debates’ with the Open University in the UK, which has added to my knowledge of online assessment methods.

What led Questionmark to start offering online courses?

We started these courses in response to requests from customers who wanted to learn more about Questionmark in a limited time frame. Many people find it difficult to attend our 3-day face-to-face training course because of the pre-selected dates as well as the travel and accommodation expenses involved, whereas online training now permits learning from the comfort of your own desk!

Who do you think would benefit from this kind of training?

At the moment we have 3 courses available. Our Introduction to Authoring Manager and Introduction to Enterprise Manager courses, which last 2 to 3 hours, are for people with basic computer skills who want to get started creating, delivering and analyzing surveys, quizzes, tests and exams.

Our third course, Customising the Participant Interface, lasts about 5 to 6 hours. It goes into more detail than the other two and is aimed at people who already have some experience using Questionmark technologies.

Whatever the level of each course, we provide supporting materials to all participants and I can offer additional help after the course is completed if necessary.

What kind of reactions do you generally get from people who attend these courses?

People say they appreciate the flexibility of this approach and like learning about the full capabilities of our software. They also like the fact that I can help them a bit even after the course is over. This helps them to move forward with their projects and put into practice what they’ve learned.

Click here for full course details, dates and online registration

Podcast: John Kleeman on Questionmark’s expanding assessment cloud

Posted by Joan Phaup

We recently celebrated a major expansion of our cloud-based Questionmark OnDemand service with the addition of Questionmark’s European Data Centre to the service we have been providing for some time in the United States.

Wanting to learn more about the reasons for adding the new data centre, I spoke with Questionmark Chairman John Kleeman about the transition many customers are making to cloud-based assessment management.

Here are some of the points we covered:

  • why software-as-a-service offers a reliable, secure and cost-effective alternative to deploying assessments from in-house servers
  • the importance of giving customers the option of storing data either in the United States or the European Union
  • the extensive security procedures, system redundancy, multiple power sources, round-the-clock technical supervision and other measures that make for reliable cloud-based assessments
  • the relative merits of Questionmark OnDemand (cloud-based) and Questionmark Perception (on-premise) assessment management — and how to choose between them

You can learn more by listening to our conversation here and/or reading the transcript.

Top ten pillars for effective compliance

Posted by John Kleeman

What are the ten pillars for an effective compliance and ethics programme? And how can assessment help?

I mentioned in a previous blog post that I’d enjoyed hearing Carlo di Florio from the US SEC talk at the Ethics & Compliance Officer Association’s (ECOA) Conference. He suggested ten pillars for an effective compliance and ethics programme which I’d like to paraphrase here:

1. Good governance: setting the right tone at the top

2. Culture and values: for instance not saying one thing and doing another

3. Incentives & rewards: these can be key enablers for compliance, but they can also be key indicators for risk

4. Risk management: allocating resources based on risk

5. Policies and procedures: setting correct policies and making people aware of them

6. Role-based training and education – not just generic communication

7. Monitoring and reporting –  using technology where appropriate

8. Investigation and enforcement – effective response

9. Issues management – having a good process to deal with escalating problems

10. On-going improvement process – continually making things better

(See here for a speech by Carlo di Florio setting out the pillars in detail.)

I thought it might be interesting to see where assessments (surveys, quizzes, tests and exams) fit in here. For which pillars is assessment one of the foundations?

Clearly for #2, surveys measure culture and values and how they are changing. And obviously for #5 and #6, quizzes, tests and exams are a key way to check understanding of policies and of the effectiveness of training. And for #7, assessments are one of the few ways of monitoring your people in all your different offices and getting early warning of problems.

And how about #10? If you want to continually make things better, you will need to measure your improvements. Assessments provide trusted and valuable data to help you see where you are and whether you are improving. I also wondered how Dr. Bruce C. Aaron’s A-model framework might help organizations measure improvements in compliance.

If you’re not familiar with the A-model, see my colleague Doug Peterson’s excellent videos (here and here) on using the A-model to measure business improvement. And consider how you might be able to apply this framework within your own organization.