The suspense is over for those of you waiting for Part 2 of our tutorial on creating and editing assessments in Questionmark Perception! This video demonstrates how to customize and control how your assessment works. Find out how to use the assessment editor to arrange time limits, security options and other settings. Learn how to establish the look and feel of your assessment, create jump blocks, set up feedback options, organize post-assessment email messages and perform many other tasks.
Click here to review Part 1 of the assessment creation tutorial, and here to watch our video about creating and managing questions.
Online assessments are used in many different ways at Sanlam Personal Finance in South Africa. The company uses Questionmark Perception to test competencies and product knowledge. Assessments also play an important role in compliance. And in keeping with the company’s goal of engendering a high-performance culture, they are used on an ongoing basis to support learning.
Because Sanlam uses online assessments so extensively, my recent conversation with Sanlam Training Technology Consultant Mark Julius covered a wide array of topics. We spoke about how his organization uses graphics and animations to simulate on-the-job situations during assessments. We also talked about the special challenges of operating in countries with varying levels of internet connectivity and the ever-expanding importance of proving compliance with government regulations.
You can learn more by reading our case study about Sanlam and their use of Perception, which they use together with the SAP Human Resources and Learning Management System.
As I started thinking about what I wanted to blog about, I couldn’t get past the podcast done by our very own Joan Phaup and Dr. Will Thalheimer of Work-Learning Research on the use of feedback. One of the most powerful features in Questionmark Live is the ability to leave choice-based feedback. I will likely have many blog posts on this topic and Dr. Thalheimer’s white paper, but let’s start at the beginning:
Retrieval is more important than feedback. The role that feedback plays is to support retrieval.
This statement by Will seems simple, but it helps to understand how to write good feedback. There are so many things to think about when creating feedback in a question.
When is the retrieval opportunity presented?
What is the feedback for a correct answer?
What is the feedback for an incorrect answer?
How does Questionmark Live fit into this? Well, it is pretty easy to write feedback for late-in-learning retrieval, since you are only trying to get the learner back on track. It is the early-in-learning feedback that needs to be more extensive, so it can help the learner develop pathways to information that support later retrieval. Allowing a subject matter expert (SME) to create extensive feedback in Questionmark Live will ensure that your feedback is detailed and accurate. No one expects the SME to be an expert in question writing. You may need to tweak the question once you bring it into Perception, but your feedback will be far more powerful if you glean it from someone who knows the subject in depth.
I really encourage you to read Dr. Thalheimer’s white paper to help you use feedback to improve the learning process.
Earlier this month I shared a tutorial on creating and managing questions using Questionmark Perception. Now it’s time to move on to organizing your questions into assessments. This video from the Questionmark Learning Cafe will show you how to create assessment folders and assign administrators to them. It will demonstrate how Perception’s assessment wizard guides you step-by-step through the process of creating surveys, quizzes, tests and exams.
What type of assessment do you want? Do you want to set a time limit for taking it? Do you want to provide feedback to participants? How do you want to order your questions, and how many questions would you like to include? Do you want to set a pass/fail threshold? The wizard will walk you through these and many other decisions as you create your assessment.
In my next post I will share another video showing how to customize your assessments and control how they work. Watch this space for Part 2!
I’m reporting from the E-Assessment Live event at Loughborough University, where our events team organized a practical experiment in crowdsourcing assessment content. We ran a session with around 20 workstations in a room and gave everyone browser access to Questionmark Live, our new software-as-a-service authoring system that allows anyone with a browser to create questions easily and email them out for use in Questionmark Perception.
Most of the people in the room were not familiar with Questionmark. We asked each of them to create a question and email it to me from the system. They all logged into Questionmark Live and wrote a question about their home towns, which I brought into Questionmark Perception very easily; within 20 minutes of the first question being authored, we had an assessment. See below for a screenshot.
I think the availability of applications like Questionmark Live, which allow easy creation of questions by lots of people at the same time and their amalgamation into an assessment, is going to make a big difference in the assessment world. Obvious ideas include getting students to create questions for each other and having SMEs brainstorm and then review questions as a group in an item writing workshop. Essentially, this harnesses the power of the crowd by letting each person contribute simultaneously rather than writing items sequentially or hierarchically.
I am sure there will be ways of using crowdsourcing for questions that no one has thought of yet and this will hugely improve our productivity. Questionmark Live is free to Questionmark software support plan customers and open for anyone to evaluate. Seeing is believing, so I encourage you to try it out on our website.
In my last blog post I talked about the high-level purpose and process of conducting an item analysis. Now I will describe some of the essential things to look for in a typical Item Analysis Report.
You may sometimes see “Alpha if item deleted” statistics in Item Analysis Reports. These statistics provide information about whether the internal consistency reliability (e.g., Cronbach’s Alpha) will increase if the question is deleted from the assessment. An increase in the reliability value indicates that the question is not performing well psychometrically. Many Item Analysis Reports do not display the “Alpha if item deleted” statistic because the item-total correlation coefficient provides basically the same information. Questions with higher item-total correlation coefficient values will contribute to higher internal consistency reliability values, and lower item-total correlation coefficient values will contribute to lower internal consistency reliability values.
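To make the “Alpha if item deleted” idea concrete, here is a minimal sketch in Python using entirely hypothetical 0/1 item scores (the data, question labels and threshold logic are my own illustration, not output from any Questionmark report). It computes Cronbach’s Alpha from item and total-score variances, then recomputes it with each question removed:

```python
# Hedged sketch of the "alpha if item deleted" check: recompute Cronbach's
# alpha with each question removed and flag questions whose removal raises it.
# All scores below are hypothetical 0/1 (incorrect/correct) data.
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one list of participant scores per question."""
    k = len(item_scores)
    totals = [sum(participant) for participant in zip(*item_scores)]
    item_variance = sum(pvariance(question) for question in item_scores)
    return (k / (k - 1)) * (1 - item_variance / pvariance(totals))

# 6 hypothetical participants x 5 questions; Q5 runs against the other items
scores = [
    [1, 1, 1, 1, 0, 0],  # Q1
    [1, 1, 1, 0, 0, 0],  # Q2
    [1, 1, 0, 1, 0, 0],  # Q3
    [1, 1, 1, 1, 1, 0],  # Q4
    [0, 0, 1, 0, 1, 1],  # Q5 -- inconsistent with the rest
]

overall = cronbach_alpha(scores)
print(f"Overall alpha = {overall:.3f}")
for i in range(len(scores)):
    without = cronbach_alpha(scores[:i] + scores[i + 1:])
    flag = "  <- deleting raises alpha: review this question" if without > overall else ""
    print(f"Q{i + 1}: alpha if deleted = {without:.3f}{flag}")
```

On this made-up data, removing Q5 raises alpha sharply while removing any other question lowers it, which is exactly the signal the “Alpha if item deleted” column conveys.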
Other statistics you might see are variations of the point-biserial item-total correlation coefficient such as “Corrected Point-biserial correlation,” “biserial correlation” or “corrected biserial correlation.” The “corrected” in these refers to taking out the question scores from the calculations so that the question being examined is not “contributing to itself” in terms of the statistics.
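The effect of that correction can be sketched with a few lines of Python on hypothetical data (the scores below are illustrative only): the uncorrected point-biserial correlates each item’s 0/1 scores with the total score, while the corrected version first subtracts the item’s own score from each participant’s total.

```python
# Hypothetical illustration of corrected vs. uncorrected item-total
# (point-biserial) correlation. "Corrected" removes the item's own score
# from the total so the item does not correlate with itself.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# 6 hypothetical participants x 3 questions (1 = correct, 0 = incorrect)
scores = [
    [1, 1, 1, 0, 0, 0],  # Q1
    [1, 1, 0, 1, 0, 0],  # Q2
    [1, 0, 1, 0, 1, 0],  # Q3
]
totals = [sum(participant) for participant in zip(*scores)]

for i, item in enumerate(scores, start=1):
    uncorrected = pearson(item, totals)
    corrected = pearson(item, [t - s for s, t in zip(item, totals)])
    print(f"Q{i}: point-biserial = {uncorrected:.3f}, corrected = {corrected:.3f}")
```

Note how much the correction matters on a short test like this one: with only three items, each question’s own score is a large share of the total, so the uncorrected statistic can make a weak question look deceptively strong.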
In my next post I will dive into the nitty-gritty of item analysis. I will look at example questions and how to use the Questionmark Item Analysis Report in an applied context. Stay tuned to the Questionmark Blog…