The key to reliability and validity is authoring

Posted by John Kleeman

In my earlier post I explained how reliability and validity are the keys to trustable assessment results. A reliable assessment is one that measures consistently, and a valid assessment is one that measures what you need it to measure.

Validity and reliability start with the authoring process. If you do not have a repeatable, defensible process for authoring questions and assessments, then however good the other parts of your process are, you will not have valid and reliable assessments.

The critical value that Questionmark brings is its structured authoring processes, which enable effective planning, authoring and reviewing of questions and assessments and make them more likely to be valid.

Questionmark’s white paper “Assessment Results You Can Trust” suggests 18 key authoring measures for making trustable assessments – here are three of the most important.

Organize items in an item bank with topic structure

There are huge benefits to using an assessment management system with an item bank that structures items by hierarchical topics, as this facilitates:

  • An easy management view of all items and assessments under development
  • Mapping of topics to relevant organizational areas of importance
  • Clear references from items to topics
  • Use of the same item in multiple assessments
  • Simple addition of new items within a topic
  • Easy retiring of items when they are no longer needed
  • Version history maintained for legal defensibility
  • Search capabilities to identify questions that need updating when laws change or a product is retired
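To make these benefits concrete, here is a minimal sketch of how an item bank organized by hierarchical topics might be modelled. The type and field names are illustrative assumptions only, not Questionmark’s actual schema.

    // Hypothetical data model for an item bank with hierarchical topics.
    // All names are illustrative; they do not reflect Questionmark's internal schema.
    interface Topic {
      id: string;
      name: string;
      subtopics: Topic[];   // hierarchy, e.g. "Compliance" -> "Data Protection"
    }

    interface Item {
      id: string;
      topicId: string;      // each item references a topic
      stem: string;         // the question text
      status: "draft" | "live" | "retired";
      versions: { revision: number; editedBy: string; editedAt: string }[];
    }

    interface Assessment {
      id: string;
      title: string;
      itemIds: string[];    // the same item can appear in many assessments
    }

    // Retiring an item takes it out of future use without deleting its history.
    function retireItem(item: Item): Item {
      return { ...item, status: "retired" };
    }

    // When a law changes or a product is withdrawn, finding affected items is a
    // single search across the bank rather than across every assessment.
    function itemsMentioning(bank: Item[], term: string): Item[] {
      return bank.filter(i => i.status === "live" && i.stem.includes(term));
    }

Even this toy model shows why an item bank scales better than embedding questions in individual assessments: an update to one item flows through to every assessment that references it.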

Some stand-alone e-learning creation tools and some LMSs do not provide an item bank and instead require you to insert questions individually within each assessment. If you only have a handful of assessments, or you rarely need to update them, such systems can work; but anyone with more than a few assessments needs an item bank to make effective assessments.

An authoring tool subject matter experts can use directly

One of the critical factors in making successful items is to get effective input from subject matter experts (SMEs), as they are usually more knowledgeable and better able to construct and review questions than learning technology specialists or general trainers.

If you can use a system like Questionmark Live to harvest or “crowdsource” items from SMEs and have learning or assessment specialists review them, your items will be of better quality.

Easy collaboration for item reviewers to help make items more valid

Items will be more valid if they have been properly reviewed, and they will be more defensible if past changes are auditable. A track-changes capability, like the one shown in the example screenshot below, is invaluable in aiding the review process: it allows authors to see what changes are being proposed and to check that they make sense.

Screenshot of track changes functionality in Questionmark Live

These three capabilities – an item bank, an authoring tool SMEs can access directly, and easy collaboration with “track changes” – are critical for obtaining reliable and valid, and therefore trustable, assessments.

For more information on how to make trustable assessments, see our white paper “Assessment Results You Can Trust”.

Assessment Standards 101: SCORM

Posted by John Kleeman

This is the fourth in a series of posts on standards that impact assessment.

The ADL SCORM standard arose out of an initiative by the US Department of Defense (DoD), which was a large user of e-learning. The DoD wanted to ensure that e-learning content would be interoperable and reusable – for instance, that content developed by one vendor could run in another vendor’s environment.

The DoD has a track record of setting technical standards: for instance, in the 1980s it helped popularize TCP/IP and make it an effective standard. The DoD is also a very large customer for most companies in the learning technology software industry, so when it announced that it would only purchase e-learning software that worked with SCORM, the industry jumped quickly to support it!

One of the ways SCORM was made successful was through a series of Plugfests, where vendors could get together in practical labs and check that interoperability was possible in practice, not just in theory. These were well-run events – a kind of technological speed dating – where each vendor could try out its compatibility with the others. It was great to have technical experts from each vendor in the room and to see many different LMSs all able to call our assessments.

In Questionmark Perception, to make an assessment run via SCORM, you use the Publish to LMS capability to create a content package, which is a small XML document that references the assessment. And as you can see in the screenshot below, you can choose from AICC and two flavours of SCORM. Once you’ve made the package, you simply upload it to a management system and participants can then be directed to it.

Publish to LMS screenshot with options including AICC, SCORM 1.2 and SCORM 2004
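To give a feel for the interoperability SCORM provides at runtime, here is a small sketch of the generic SCORM 1.2 contract between launched content and an LMS: the content locates the API object the LMS exposes and reports results through standard calls. This is an illustrative sketch of the SCORM 1.2 runtime model, not a description of Questionmark’s implementation.

    // Generic SCORM 1.2 runtime sketch (illustrative, not Questionmark's code).
    // A conformant LMS exposes an object named "API" on the launching window or
    // one of its parent frames; the content finds it and reports results through it.
    interface Scorm12Api {
      LMSInitialize(arg: ""): string;
      LMSSetValue(element: string, value: string): string;
      LMSCommit(arg: ""): string;
      LMSFinish(arg: ""): string;
    }

    function findApi(win: Window): Scorm12Api | null {
      let w: Window | null = win;
      while (w) {
        const api = (w as any).API as Scorm12Api | undefined;  // LMS-provided object
        if (api) return api;
        if (w === w.parent) break;  // reached the top of the frame hierarchy
        w = w.parent;
      }
      return null;
    }

    // Hypothetical example: report a score of 85% and a "passed" status.
    const api = findApi(window);
    if (api) {
      api.LMSInitialize("");
      api.LMSSetValue("cmi.core.score.raw", "85");
      api.LMSSetValue("cmi.core.lesson_status", "passed");
      api.LMSCommit("");
      api.LMSFinish("");
    }

Because every conformant LMS implements the same calls and data model elements, the same package can be uploaded to different systems and report results in the same way – which is exactly the interoperability the Plugfests were designed to test.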

SCORM is used widely, both within the military and outside it. If you have a choice between AICC and SCORM, it’s often better to choose AICC (see my earlier post in this series), partly because SCORM has a potential security issue (see our past blog article). However, providing you are aware of this issue, SCORM can be a very effective means of calling assessments.

The ADL are currently reviewing SCORM and working out how to improve it, including potentially making it more useful for assessments. As part of gathering input for this review, ADL’s technical advisor, Dan Rehak, who was one of the architects of SCORM, is running a session at Questionmark’s user conference in Miami in March to gain feedback on how SCORM could be improved. If you’re interested in helping shape the future of this standard, this would be a great session to attend. Stay tuned here on the blog for a Questionmark Conference Close-up interview with Dan.