At Questionmark we are all about our customers. Through case studies and the US and European users conferences we listen to your stories and experiences and learn how to shape our product to suit your needs.
We’re pleased that our efforts have been earning the attention of the wider learning community, and we are honored to have recently received e-learning awards from around the globe:
Jeff Place receives our eLearning! award
The 2011 Best of Elearning! Awards, based on nominations by the readers of Elearning! and Government Elearning! publications, cover 26 categories of enterprise learning solutions and workplace technology products and services. We are very pleased to be winners in the “Best Assessment Tool” category. We’ve been recognized in this award program since 2007.
Many posts in this blog have emphasized the importance of aligning learning and assessment with an organization’s strategic goals.
Now we’re very pleased to announce a new white paper that sets forth a powerful yet flexible framework that offers a goal-oriented approach to assessment and evaluation: the A-model. Developed by measurement and evaluation specialist and Ametrico founder Dr. Bruce C. Aaron, this new model provides a sequence of activities that ensure a linkage between goals, solutions and assessment, so that organizations can trace the progress from analysis and design to measurement and evaluation.
Dr. Bruce C. Aaron
Questionmark commissioned Bruce to write this white paper, which you are invited to download free of charge from our website. The paper describes a framework for helping individuals and organizations clarify the goals, objectives and human performance issues of their work. It explains how to design systematic assessment systems to evaluate progress in a way that you can tailor to the specific needs of your organization.
Bruce developed the A-model as a practical structure for accountability and a comprehensive system for planning and measurement — an important development considering the high demand for return on investment in HRD programs. He’ll present a free one-hour web seminar about the A-model at 1 p.m. Eastern Daylight Time on Thursday, October 20.
Sign up for the web seminar and learn about:
Putting together a customized assessment and evaluation system to meet the needs of your organization.
Developing metrics and accountability plans during the analysis and design of solutions.
Continually assessing the value of initiatives to improve human performance and quality.
Using the A-model to align your organization’s strategy, work, and accountability.
With schools, colleges and universities now fully launched into a new academic year, it’s certainly testing season!
Test security is crucial to the validity of test scores – something we explored in a previous post. Today, I’d like to look at another helpful tool for promoting secure and fair tests: the candidate agreement or examination honor code.
These agreements outline what is expected of test takers. They present a code of conduct that test takers must agree to before they start an assessment. This can be presented on paper or displayed electronically before an online exam begins. When participants sign the code, they’re consciously acknowledging the rules and the repercussions of cheating. Such codes apply to all types of high-stakes testing, including certification tests.
What expectations should you include in a candidate agreement? Here are some to consider:
The candidate must abide by the rules of the test center, organization, or program
The candidate will not provide false ID or false papers
The candidate cannot take the test on behalf of someone else
The candidate will not engage in cheating in any form
The candidate will not help others cheat
The candidate will not use aids that are not allowed
The candidate will not solicit someone else to take the test
The candidate will not cause a disturbance in the testing center
The candidate will not tamper with the test center in any way
The candidate will not share information about the assessment content they saw (non-disclosure agreement)
The test vendor will have the option to terminate the assessment if suspicious behavior is detected
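As a rough illustration of how an electronic agreement gate might work, here is a minimal sketch (hypothetical names throughout, not Questionmark’s implementation): the delivery system records when the candidate accepts the code and refuses to start the assessment until that acceptance exists.

```python
from datetime import datetime, timezone


class CandidateAgreement:
    """Hypothetical record of a candidate's response to the honor code."""

    def __init__(self, candidate_id: str):
        self.candidate_id = candidate_id
        self.accepted_at = None  # timestamp of acceptance; None until signed

    def accept(self) -> None:
        # Store when the candidate acknowledged the rules, for audit purposes.
        self.accepted_at = datetime.now(timezone.utc)

    @property
    def accepted(self) -> bool:
        return self.accepted_at is not None


def start_assessment(agreement: CandidateAgreement) -> str:
    # The exam is only delivered once the agreement has been signed.
    if not agreement.accepted:
        raise PermissionError("Candidate must accept the honor code first")
    return f"assessment started for {agreement.candidate_id}"
```

The timestamped acceptance doubles as an audit trail: if suspicious behavior is later detected, the program can show exactly when the candidate agreed to the terms.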
I would venture to guess that many elearning designers/developers start designing their new elearning course “at the beginning” – they start writing content and gathering illustrations, then maybe work in some delivery considerations, go through the review and sign-off procedures … and only at the very end do they remember, “Oh, yeah, I should probably have some sort of quiz or test.”
Dr. Jane Bozarth is an accomplished elearning designer and developer, and her latest article in Learning Solutions Magazine is called Nuts and Bolts: The 10-Minute Instructional Design Degree. In her article she recognizes that a lot of elearning designers and developers come from other disciplines and may not have much formal training when it comes to elearning, so she provides eight recommendations for designing and developing the best elearning possible.
Her #1 recommendation? Design assessments first. Jane writes:
Too often we create assessments and tests as an afterthought, in a scramble after the training program is essentially otherwise complete. The result? Usually, it’s a pile of badly written multiple-choice questions. When approaching a project, ask: “What is it you want people to do back on the job?” Then, “What does successful performance look like?” “How will you measure that?” Design that assessment first. Then design the instruction that leads to that goal.
For example, I used to support the call center agent training for a large telecommunications company. It was important that the agents come out of the training with an understanding of the software applications they would be using at their station – not just the correct values for certain fields, but an understanding of the application itself, including how to log in, how to navigate, which fields were mandatory and which were not, etc. Therefore we knew we had to include software simulation questions in our assessments (something that can be done amazingly well with Flash in Questionmark Perception), which in turn meant that we knew we had to include simulations in our training.
Does this mean that your elearning will be “teaching to the test?” Some people might see it that way, but I would suggest that since the test reflects the desired behaviors back on the job, teaching to that test is not a bad thing. And by working backwards from the specific desired behaviors and the assessment of those behaviors, your training will be very focused on just what is needed.
One feature that people often overlook in Questionmark Live browser-based authoring is the ability to copy the sharing permissions of a question set to a new question set. This is extremely useful if you are getting ready to have a large item-sourcing event. Here is how it works.
1. Create a question set.
2. Set up sharing for the question set.
3. Create a new question set.
4. Check the box to “Copy sharing from another Question Set.”
5. Select the question set that has sharing set up from the pull-down list.
You can do this for as many folders as you need. The nicest thing about this approach is that when your subject matter experts log in, the folders are already set up and the SMEs can begin creating questions immediately.
Watch the following video to see how easy it is to do.