Celebrating two e-learning awards

Posted by Julie Delazyn

At Questionmark we are all about our customers. Through case studies and the US and European Users Conferences, we listen to your stories and experiences and learn how to shape our products to suit your needs.

We’re pleased that our efforts have been earning the attention of the wider learning community and are honored to have recently received e-learning awards from both sides of the globe:

Jeff Place receives our eLearning! award

  • The 2011 Best of Elearning! Awards, based on nominations by the readers of Elearning! and Government Elearning! publications, cover 26 categories of enterprise learning solutions and workplace technology products and services. We are very pleased to be winners in the “Best Assessment Tool” category. We’ve been recognized in this award program since 2007.

Our customers’ suggestions and support have made all this possible – so we thank you!

Brussels-bound for a full conference programme 9th – 11th October

Posted by Jane Townsend

In less than two weeks, Questionmark users will gather in Brussels for the Questionmark European Users Conference.

Delegates will enjoy a full programme, with lots of choices  — from case studies to presentations about best practices and training sessions on the use of Questionmark technologies.

Networking opportunities will include our hot n’ happening “Speak Easy” evening event. If you are attending the conference, get your lucky charms ready for a big night out!

Here’s the complete conference line-up. Click here for details about all these sessions: 

General Sessions:

  • Conference Kick-off and Opening General Session
  • Assessment Security: The Stakes are Getting Higher

Tech Training

  • Introduction to Questionmark Perception for Beginners
  • Using Questionmark Reporting and Analytics Tools
  • Methods for Integrating Perception with Other Systems
  • Planning your Migration from Perception v4 to v5
  • Customising the Participant Experience
  • Delivering Assessments to Mobile Devices

Case Studies

  • Practical Tips for Sustainable Large-scale e-Assessment – University of Bradford
  • Deploying Questionmark Perception v5 Globally at HSBC
  • Automated testing of assessment delivery performance using JMeter – Rotterdam University of Applied Science
  • Extended Essay Editor – Complete process for managing both Questions and Answers – Nocada IT-Konsult and IT-Arkitekterna Västerås
  • Online Assessment at the University of Graz
  • e-Assessment in Dental Radiology: observing the future – University of Dundee
  • Deploying and Using a Blackboard-integrated Questionmark Module Evaluation System – University of Glamorgan
  • Questionmark can be more than Questionmark: Connections, integrations and optimising the deployment – ECABO
  • Using Questionmark to create and mark paper-based exams – Harper Adams University College
  • User attitudes towards online testing at Aberystwyth University

Best Practices

  • Using Save as you Go? Technical and other security aspects of test taking – Leen Vegter, Stoas
  • Using “Regular Expressions” when Importing Questions from a Word or Excel Document – Gregory Furter, EMLYON Business School
  • Using the Latest Learning Research to Improve Your Questionmark Assessments – John Kleeman, Questionmark
  • Using Web Services to Integrate with Questionmark Perception – Steve Lay, Questionmark
  • Using Captivate and Flash Simulations in eLearning and Assessments – Doug Peterson, Questionmark
  • Using Questionmark Perception to Make SharePoint an Effective Learning Platform – John Kleeman, Questionmark
  • Discussing successful implementation strategies – Esther van der Linde, HAN University
  • Play the Game Digital Assessments: is your organisation ready? – Michel Kao, HAN University
  • Using QMWise to Customise Questionmark Perception – Mauro Chieppa, Stoas

We’ll also be hosting group discussions with our product owners about potential new offerings. You can register for the conference online or email euconference@questionmark.co.uk for more information.

Alignment, Impact and Measurement With the A-model

Posted by Joan Phaup

Many posts in this blog have emphasized the importance of aligning learning and assessment with an organization’s strategic goals.

The A-model

Now we’re very pleased to announce a new white paper that sets forth a powerful yet flexible framework offering a goal-oriented approach to assessment and evaluation: the A-model. Developed by measurement and evaluation specialist and Ametrico founder Dr. Bruce C. Aaron, this new model provides a sequence of activities that link goals, solutions and assessment, so that organizations can trace their progress from analysis and design through measurement and evaluation.

Dr. Bruce C. Aaron

Questionmark commissioned Bruce to write this white paper, which you are invited to download free of charge from our website. The paper describes a framework for helping individuals and organizations clarify the goals, objectives and human performance issues of their work. It explains how to design systematic assessment systems to evaluate progress in a way that you can tailor to the specific needs of your organization.

Bruce developed the A-model as a practical structure for accountability and a comprehensive system for planning and measurement — an important development considering the high demand for return on investment in HRD programs. He’ll present a free one-hour web seminar about the A-model at 1 p.m. Eastern Daylight Time on Thursday, October 20.

Sign up for the web seminar and learn about:

  • Putting together a customized assessment and evaluation system to meet the needs of your organization.
  • Developing metrics and accountability plans during the analysis and design of solutions.
  • Continually assessing the value of initiatives to improve human performance and quality.
  • Using the A-model to align your organization’s strategy, work, and accountability.

Candidate Agreements: Establishing honor codes for test takers

Posted by Julie Delazyn

With schools, colleges and universities now fully launched into a new academic year, it’s certainly testing season!

The security of test results is crucial to the validity of test scores – something we explored in a previous post. Today, I’d like to look at another helpful tool for promoting secure and fair tests: the candidate agreement or examination honor code.

These agreements outline what is expected of test takers. They present a code of conduct that test takers must agree to before they start an assessment. This can be done manually, on paper, or electronically before an online exam begins. When participants sign the code, they consciously acknowledge the rules and the repercussions of cheating. Such codes apply to all types of high-stakes testing, such as certification tests.

What expectations should you include in a candidate agreement? Here are some to consider:

  • The candidate must abide by the rules of the test center, organization, or program
  • The candidate will not provide false ID or false papers
  • The candidate cannot take the test on behalf of someone else
  • The candidate will not engage in cheating in any form
  • The candidate will not help others cheat
  • The candidate will not use aids that are not allowed
  • The candidate will not solicit someone else to take the test
  • The candidate will not cause a disturbance in the testing center
  • The candidate will not tamper with the test center in any way
  • The candidate will not share information about the assessment content they saw (non-disclosure agreement)
  • The test vendor will have the option to terminate the assessment if suspicious behavior is detected
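In an online exam, the expectations above boil down to a simple gate: the assessment should not launch until the candidate has explicitly accepted the code of conduct, and the acknowledgment should be recorded for later audit. The sketch below illustrates that idea only; the class and function names are invented for this example and are not part of any Questionmark API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class CandidateAgreement:
    """Records a candidate's acknowledgment of the honor code."""
    candidate_id: str
    accepted: bool = False
    accepted_at: Optional[datetime] = None

    def accept(self) -> None:
        # Timestamp the acknowledgment so it can be audited later.
        self.accepted = True
        self.accepted_at = datetime.now(timezone.utc)


def start_assessment(agreement: CandidateAgreement) -> str:
    """Launch the exam only once the code of conduct is acknowledged."""
    if not agreement.accepted:
        return "blocked: candidate must accept the honor code first"
    return f"started for {agreement.candidate_id}"


agreement = CandidateAgreement(candidate_id="cand-001")
print(start_assessment(agreement))  # blocked until acceptance
agreement.accept()
print(start_assessment(agreement))  # now the exam can begin
```

The stored timestamp is what makes the agreement useful after the fact: if a violation is suspected, the organization can show exactly when the candidate accepted the rules.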

If you’d like more details about these and other tips on ensuring the security and defensibility of your assessments, you can download our white paper, “Delivering Assessments Safely and Securely.”

eLearning Design: Start with the Assessment!

Posted By Doug Peterson

I would venture to guess that many elearning designers/developers start designing their new elearning course “at the beginning” – they start writing content and gathering illustrations, then maybe work in some delivery considerations, go through the review and sign-off procedures … and only at the very end do they remember, “Oh, yeah, I should probably have some sort of quiz or test.”

Dr. Jane Bozarth is an accomplished elearning designer and developer, and her latest article in Learning Solutions Magazine is called Nuts and Bolts: The 10-Minute Instructional Design Degree. In her article she recognizes that a lot of elearning designers and developers come from other disciplines and may not have much formal training when it comes to elearning, so she provides eight recommendations for designing and developing the best elearning possible.

Her #1 recommendation? Design assessments first. Jane writes:

Too often we create assessments and tests as an afterthought, in a scramble after the training program is essentially otherwise complete. The result? Usually, it’s a pile of badly written multiple-choice questions. When approaching a project, ask: “What is it you want people to do back on the job?” Then, “What does successful performance look like?” “How will you measure that?” Design that assessment first. Then design the instruction that leads to that goal.

For example, I used to support the call center agent training for a large telecommunications company. It was important that the agents come out of the training with an understanding of the software applications they would be using at their station – not just the correct values for certain fields, but an understanding of the application itself, including how to log in, how to navigate, which fields were mandatory and which were not, etc. Therefore we knew we had to include software simulation questions in our assessments (something that can be done amazingly well with Flash in Questionmark Perception), which in turn meant that we knew we had to include simulations in our training.

Does this mean that your elearning will be “teaching to the test”? Some people might see it that way, but I would suggest that since the test reflects the desired behaviors back on the job, teaching to that test is not a bad thing. And by working backwards from the specific desired behaviors and the assessment of those behaviors, your training will be very focused on just what is needed.

Making Shared Question Sets Easier to Manage

Posted by Jim Farrell

One feature that people often overlook in Questionmark Live browser-based authoring is the ability to copy the sharing permissions of a question set to a new question set. This is extremely useful if you are getting ready to have a large item-sourcing event. Here is how it works.

  1. Create a question set.
  2. Set up sharing for the question set.
  3. Create a new question set.
  4. Check the box to “Copy sharing from another Question Set”.
  5. Select the question set that has the sharing set up from the pull-down list.
  6. Click OK.

You can do this for as many folders as you need. The nicest thing about doing things this way is that when your subject matter experts log in, the folders are all set up and the SMEs can begin creating questions immediately.
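Conceptually, the steps above amount to copying one question set’s collaborator list onto another. The sketch below models that idea in plain data-structure terms; it is purely illustrative (the `QuestionSet` class and its fields are invented for this example), since the real feature is driven through the Questionmark Live web interface.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class QuestionSet:
    """Hypothetical model of a question set and who it is shared with."""
    name: str
    shared_with: List[str] = field(default_factory=list)

    def copy_sharing_from(self, other: "QuestionSet") -> None:
        # The equivalent of ticking "Copy sharing from another Question Set":
        # the new set inherits the existing set's collaborator list.
        self.shared_with = list(other.shared_with)


template = QuestionSet("Biology 101", shared_with=["sme1@example.com", "sme2@example.com"])
new_set = QuestionSet("Biology 102")
new_set.copy_sharing_from(template)
print(new_set.shared_with)  # ['sme1@example.com', 'sme2@example.com']
```

Note that the list is copied rather than shared by reference, so later changes to one set’s permissions don’t silently alter the other’s.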

Watch the following video to see how easy it is to do.