Posted by John Kleeman
This is the fourth of a series of posts on standards that impact assessment.
The ADL SCORM standard arose out of an initiative by the US Department of Defense (DoD), a large user of e-learning. The DoD wanted e-learning content to be interoperable and reusable: for instance, if e-learning content was developed by one vendor, it should be able to run in another vendor's environment.
The DoD has a track record of setting technical standards: in the 1980s, for instance, it helped popularize TCP/IP and establish it as an effective standard. The DoD is also a very large customer for most companies in the learning technology industry, so when it announced that it would only purchase e-learning software that worked with SCORM, the industry jumped quickly to support it!
One of the ways in which SCORM was made successful was a series of Plugfests, where vendors could get together in practical labs and check that interoperability was possible in practice, not just in theory. These were well-run events, a kind of technological speed dating, where each vendor could try out its compatibility with the others. It was great to have technical experts from each vendor in the room and to see many different LMSs successfully calling our assessments.
In Questionmark Perception, to make an assessment run via SCORM, you use the Publish to LMS capability to create a content package containing a small XML manifest that references the assessment. As you can see in the screenshot below, you can choose from AICC and two flavours of SCORM. Once you've made the package, you simply upload it to a learning management system, and participants can then be directed to it.
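To give a feel for what such a package contains, here is a minimal sketch of a SCORM 1.2 `imsmanifest.xml`. The identifiers, titles and launch URL are invented for illustration; Publish to LMS generates the real manifest for you, and the actual file will contain more metadata than this.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<manifest identifier="com.example.assessment" version="1.0"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <organizations default="ORG-1">
    <organization identifier="ORG-1">
      <title>Example Assessment</title>
      <item identifier="ITEM-1" identifierref="RES-1">
        <title>Example Assessment</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <!-- The resource's href is the URL the LMS launches to start the assessment -->
    <resource identifier="RES-1" type="webcontent"
              adlcp:scormtype="sco" href="launch.html"/>
  </resources>
</manifest>
```

The LMS reads this manifest to discover the title to display and the URL to launch; the assessment itself stays on the assessment server, with the package acting as a pointer to it.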
SCORM is used widely, both within the military and outside it. If you have a choice between AICC and SCORM, it’s often better to choose AICC (see my earlier post in this series), partly because SCORM has a potential security issue (see our past blog article). However, providing you are aware of this issue, SCORM can be a very effective means of calling assessments.
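The runtime mechanics help explain the security point above. Under SCORM 1.2, the LMS exposes a JavaScript object named `API` in the launching window, and the content calls its standard `LMS*` functions to start a session and report results. The sketch below uses an invented mock `API` object purely for illustration; a real LMS supplies its own implementation.

```javascript
// Mock of the SCORM 1.2 runtime API object, for illustration only.
// A real LMS provides this object on the launching window (or a parent frame).
const API = {
  data: {},
  LMSInitialize: function () { return "true"; },
  LMSSetValue: function (element, value) { this.data[element] = value; return "true"; },
  LMSGetValue: function (element) { return this.data[element] || ""; },
  LMSCommit: function () { return "true"; },
  LMSFinish: function () { return "true"; },
};

// Typical sequence a launched piece of content (a SCO) performs:
API.LMSInitialize("");                               // start the session
API.LMSSetValue("cmi.core.score.raw", "85");         // report the score
API.LMSSetValue("cmi.core.lesson_status", "passed"); // report the outcome
API.LMSCommit("");                                   // ask the LMS to persist
API.LMSFinish("");                                   // end the session

console.log(API.LMSGetValue("cmi.core.score.raw")); // "85"
```

Because these calls run in the participant's browser, anything client-side can in principle invoke them, which is why results reported this way need to be treated with appropriate caution.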
The ADL are currently reviewing SCORM and working out how to improve it, including potentially making it more useful for assessments. As part of gathering input for this review, ADL's technical advisor Dan Rehak, one of the architects of SCORM, is running a session at Questionmark's user conference in Miami in March to collect feedback on how SCORM could be improved. If you're interested in influencing the future of this standard, this would be a great session to attend. Stay tuned here on the blog for a Questionmark Conference Close-up interview with Dan.