New tools for building questions and assessments

Posted by Jim Farrell

If you are a Questionmark customer and aren’t using Questionmark Live, what are you waiting for?

More than 2000 of our customers have started using Questionmark Live this year, so I think now is a good time to call out some of the features that are making it a vital part of their assessment development processes.

Let’s start with building questions. One new tool our customers are using is the ability to add notes to a question. This allows reviewers to open questions and leave comments for content developers without changing the version of the question.


Now over to the assessment-building side of things. Our new assessment interface allows users to add questions in many different ways, including single questions, entire topics, and random pulls from a topic. You can even prevent participants from seeing repeat questions during retakes when pulling questions at random. Jump blocks allow you to shorten test time for participants who obtain a certain score, or redirect them to extra questions. You can also easily tag questions as demographic questions so they can be used as filters in our reporting and analytics tools.
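To make the random-pull idea concrete, here is a minimal sketch, assuming a simple list of question IDs and a record of what a participant has already seen on earlier attempts (the names and data are hypothetical, not Questionmark’s actual implementation):

    import random

    def pull_from_topic(topic_questions, already_seen, count):
        """Randomly select `count` questions from a topic, skipping any
        the participant has already seen on a previous attempt."""
        available = [q for q in topic_questions if q not in already_seen]
        if len(available) < count:
            raise ValueError("Topic does not have enough unseen questions")
        return random.sample(available, count)

    # Hypothetical example: 10 questions in the topic, 4 seen on the first attempt
    topic = [f"Q{i}" for i in range(1, 11)]
    seen_on_first_attempt = {"Q2", "Q5", "Q7", "Q9"}
    retake_questions = pull_from_topic(topic, seen_on_first_attempt, 5)
    print(retake_questions)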


We have also added more robust outcome capabilities to give your test administrators new tools for controlling how assessments are completed and reported. You can have multiple outcomes for different score bands, and you can also require participants to achieve certain scores on particular topics before they can pass a test. For example, suppose you are giving a test on Microsoft Office and you set the pass score at 80%. You probably want to make sure that your participants understand all the products and don’t bomb one of them. You can set a prerequisite of 80% for each topic to make sure participants have knowledge of all areas before passing. If someone gets 100% on Word questions and 60% on Excel questions, they would not pass. Powerful outcome controls help ensure you are truly measuring the goals of your learning organization.
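As a rough sketch of that pass logic (hypothetical topic names and scores, not Questionmark’s internal scoring), passing requires both the overall score and every topic score to clear their cut scores:

    def passes(overall_score, topic_scores, overall_cut=80, topic_cut=80):
        """Return True only if the overall score and every topic score
        meet their cut scores (all values are percentages)."""
        return overall_score >= overall_cut and all(
            score >= topic_cut for score in topic_scores.values()
        )

    # The Microsoft Office example from above: strong on Word, weak on Excel.
    # Overall is taken as a simple mean of topic scores, purely for illustration.
    scores = {"Word": 100, "Excel": 60, "PowerPoint": 85}
    overall = sum(scores.values()) / len(scores)  # roughly 82% overall
    print(passes(overall, scores))  # False - the Excel prerequisite fails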


If you aren’t using Questionmark Live, you are missing out: we release new functionality every month. Get access and start inviting your subject matter experts to contribute to your item banks.

Item Development – Training Item Writers

Posted by Austin Fossey

Once we have defined the purpose of the assessment, completed our domain analysis, and finalized a test blueprint, we might be eager to jump right into item writing, but there is one important step to take before we begin: training!

Unless you are writing the entire assessment yourself, you will need a group of item writers to develop the content. These item writers are likely experts in their fields, but they may have very little understanding of how to create assessment content. Even if these experts have experience writing items, it may be beneficial to provide refresher training, especially if anything has changed in your assessment design.

In their chapter in Educational Measurement (4th ed.), Cynthia Schmeiser and Catherine Welch note that it is important to consider the qualifications and representativeness of your item writers. It is common to ask item writers to fill out a brief survey to collect demographic information. You should keep these responses on file and possibly add a brief document explaining why you consider these item writers to be a qualified and representative sample.

Schmeiser and Welch also underscore the need for security. Item writers should be trained on your content security guidelines, and your organization may even ask them to sign an agreement stating that they will abide by those guidelines. Make sure everyone understands the security guidelines, and have a plan in place in case there are any violations.

Next, begin training your item writers on how to author items, which should include basic concepts about cognitive levels, drafting stems, picking distractors, and using specific item types appropriately. Schmeiser and Welch suggest that the test blueprint be used as the foundation of the training. Item writers should understand the content included in the specifications and the types of items they are expected to create for that content. Be sure to share examples of good and bad items.

If possible, ask your writers to create some practice items, then review their work and provide feedback. If they are using the item authoring software for the first time, be sure to acquaint them with the tools before they are given their item writing assignments.

Your item writers may also need training on your item data, delivery method, or scoring rules. For example, you may ask item writers to cite a reference for each item, or you might ask them to weight certain items differently. Your instructions need to be clear and precise, and you should spot check your item writers’ work. If possible, write a style guide that includes clear guidelines about item construction, such as fonts to use, acceptable abbreviations, scoring rules, acceptable item types, et cetera.

I know from my own experience (and Schmeiser and Welch agree) that investing more time in training will have a big payoff down the line. Better training leads to substantially better item retention rates when items are reviewed. If your item writers are not trained well, you may end up throwing out many of their items, which may not leave you enough for your assessment design. Considering the cost of item development and the time spent writing and reviewing items, putting in a few more hours of training can equal big savings for your program in the long run.

Integrating with other systems: video tutorials

Posted by Julie Delazyn

Although you can use Questionmark as a stand-alone Assessment Management System (AMS), it also integrates seamlessly with other key systems – everything from learning management systems and content management systems to portals and scanning technologies.

Questionmark Connectors make these integrations possible.

Some of these, such as the Blackboard Connector, the SAP Connector and the SharePoint Connector, are designed for use with specific systems.

We also support integrations with LTI-, AICC- and SCORM-compliant systems.

You can find video tutorials about many of these connectors on the Questionmark website. There you’ll find videos on integrating with Moodle, SuccessFactors and Cornerstone OnDemand, as well as other systems such as SharePoint, Canvas, and Ning.

Here is a sneak peek – click to view videos for the Blackboard, Cornerstone and SharePoint connectors.

2014 South African Users Conference – Addressing Compliance

Posted by Austin Fossey

We are back from the first South African Users Conference, which was hosted by Bytes People Solutions. As at all of our users conferences, the most valuable aspect of this gathering was hearing from our customers and potential customers, both through presentations and in informal conversations.

Many attendees manage assessment programs for large academic or commercial institutions, and I was struck by their teams’ organizational skills. From my conversations, it sounds as if many of these program managers have to strike a balance between traditional practices at their organizations and the need to adopt innovative strategies to improve measurement practices. For example, one program manager spoke about helping item writers transition from writing items in MS Word to writing them in Questionmark Live. The people I spoke to appeared to be pushing the envelope of their assessment capabilities, helping their stakeholders through technological transitions while simultaneously delivering thousands of assessments. It was impressive.

Compliance was a recurring theme. In the U.S., test developers are always collecting evidence to demonstrate the legal defensibility of their assessments, and we often turn to The Standards for Educational and Psychological Testing for guidance (the latest edition was released just last week). Though the legal and cultural expectations for test development may differ slightly in other regions, no modern test developer is exempt from accountability. Demonstrating compliance with organizational or legal requirements seemed to be a big consideration for many attendees.

Regardless of what compliance means to different organizations, one thing was the same for everyone: demonstrating compliance means having accurate, easily accessed data. I noticed that many clients were able to cite data-backed evidence for the decisions they made in their testing programs to meet their stakeholders’ compliance requirements. Some of these data came from Questionmark through our APIs and assessment results, but these presenters had also clearly done research on other important factors that impact the validity of the results.

For example, presenters talked about the evidence they gathered to support the use of computer-based testing over paper and pencil tests. Another presenter shared qualitative data from interviewing subject matter experts about their impressions of Questionmark’s authoring tools. These decisions affect the delivery mode and task models of the assessment, which directly relate to the validity of the results, so it is encouraging to see test developers documenting their rationales for these kinds of decisions.

All in all, it was an impressive group of professionals who gathered in Midrand, and I am sure that I learned just as much (if not more) from the participants as they did from me. Special thanks to everyone who attended and presented!

Get trustable results: Require a topic score as a prerequisite to pass a test

Posted by John Kleeman

If you are taking an assessment to prove your competence as a machine operator, and you get all the questions right except the health and safety ones, should you pass the assessment? Probably not. Some topics can be more important than others, and assessment results should reflect that fact.

In most assessments, it’s acceptable to define a pass or cut score, and all that is required to pass the assessment is for the participant to achieve the passing score or higher. The logic for this is that success on one item can make up for failure on another item, so skills in one area are substitutable for skills in another. However, there are other assessments where some skills or knowledge are critical, and here you might want to require a passing score or even a 100% score in the key or “golden” topics as well as a pass score for the test as a whole.

This is easy to set up in Questionmark when you author your assessments. When you create the assessment outcome that defines passing the test, you define some topic prerequisites.

Here is an illustrative example, showing 4 topics. As well as achieving the pass score on the test, the participant must achieve 60% in three topics: “Closing at end of day”, “Operations” and “Starting up”, and 100% in one topic: “Safety”.

Prerequisites
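As a quick sketch of how such a prerequisite check behaves (hypothetical scores and a plain data structure, not the actual Questionmark outcome configuration; the overall pass-score check would sit alongside this), the example above could be expressed as:

    # Per-topic prerequisite thresholds from the example above (percentages)
    prerequisites = {
        "Closing at end of day": 60,
        "Operations": 60,
        "Starting up": 60,
        "Safety": 100,  # the "golden" topic must be answered perfectly
    }

    def meets_prerequisites(topic_scores):
        """True only if every topic score reaches its prerequisite threshold."""
        return all(topic_scores.get(topic, 0) >= cut
                   for topic, cut in prerequisites.items())

    # A participant who does well everywhere but misses one Safety question fails
    scores = {"Closing at end of day": 90, "Operations": 75,
              "Starting up": 80, "Safety": 90}
    print(meets_prerequisites(scores))  # False - Safety is below 100%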

If you need to ensure that participants don’t pass a test unless they have achieved scores in certain topics, topic prerequisites are the way to achieve this.

Case Study: Live monitoring offers security for online tests

Posted by Julie Delazyn

Thomas Edison State College (TESC) is one of the oldest schools in the country designed specifically for adults. The college’s 20,000+ students, many of them involved with careers and families, live all over the world and favor courses that enable online study.

In setting up online midterm and final exams, the college wanted to give distance learners the same kind of security that on-campus students experience at more traditional institutions. At the same time, it was essential to give students some control over where and when they take tests.

Online proctoring offered a way to achieve both of these goals.

Working with Questionmark and ProctorU has enabled TESC to administer proctored exams to students at their home or work computers.

Proctors connect with test takers via webcam and audio hook-ups, verify each test-taker’s identity, initiate the authentication process, ensure that students are not using any unauthorized materials or aids, and troubleshoot technical problems. The college can now run secure tests while meeting the needs of busy students for flexible access to exams.

You can read the full case study here.
