Short quiz on the history of assessment

Posted by John Kleeman

How well do you know the history of assessment? Did you know, for instance, that the Likert scale, used for surveys in which respondents specify their level of agreement (e.g., Agree / Neither Agree nor Disagree / Disagree) with various statements, was invented only 80 years ago?

Here is a short 8-question summer quiz on the history of assessment. I hope you enjoy it.

As with all assessments in Questionmark Perception version 5, this assessment auto-senses and auto-sizes to fit the device, page-size or frame it’s on. The assessment you see here is embedded in the blog page. If you want to see the assessment in a different size, try calling the same URL (www.questionmark.com/go/ahistory) directly from your browser and you’ll see that the Next and Submit buttons and other screen furniture will size themselves to fit the screen you are on.

Some day, all assessments will be like this.

Embedding Assessments in SocialGO

Embed a Questionmark Perception assessment, survey or quiz within SocialGO by creating an “Assessment” tab.

  • To see how this would look, see a snapshot of an assessment embedded within a SocialGO page.
  • Check out this How-to on our developer Web site.
  • SocialGO is a British service that allows users to build their own online social networks around a personal topic or interest. It is not possible to embed an assessment directly into a SocialGO Web site unless you are paying for their full subscription membership. However, you can easily assign one of the available navigation tabs within your site to link directly to an assessment.

Adding Audio to Assessment Questions

Posted by Jim Farrell

One of the most powerful pieces of media that can be added to a question is audio. I have used audio in a variety of ways for assessments ranging from interpreting sounds from a computer to setting up scenarios where a customer service agent has to understand the emotions of a customer and respond accordingly.

There are a number of ways to add audio to questions and assessments using Questionmark products. Today let’s look at the newest way to add audio: Questionmark Live browser-based authoring. The following video shows you how easy it is to add audio to a question in Questionmark Live.

I invite our software support plan customers to head over to the Authoring Zone in Questionmark Community Spaces to share how they are using audio in their assessment programs after they've watched the video. If you're not familiar with Questionmark Live, you can click here to try it out.

Guidelines and standards for defensible assessments

Posted by Greg Pope

I have been asked on occasion what guidelines and standards are available to ensure that an assessment program aligns with best practices and is defensible.

Organizations conducting assessments undertake internal reviews of where their assessment program stands in relation to internationally recognized guidelines on assessment. High-stakes organizations will at times hire companies that specialize in psychometric audits to conduct a thorough review of the assessment processes and practices within the organization. This will usually yield an audit scorecard outlining where an organization is doing well and where it should improve according to the guidelines.

The most common guidelines document used for these sorts of audits is the “Standards for Educational and Psychological Testing”. This document is organized into numbered sections, each of which provides details on what is expected of an assessment program in various areas. For example, section 8.7 states:
“Test takers should be made aware that having someone else take the test for them, disclosing confidential test material, or any other form of cheating is inappropriate and that such behavior may result in sanctions.”

During an audit, an organization may be scored on how it performs in relation to each section of the document, with notes on where it performed well and where it needs to improve. So in the example above, if an organization has a clearly written candidate agreement in place that participants must agree to before continuing to the assessment, it would obtain full marks for this standard. If the organization does not have a candidate agreement in place, this would be an area for improvement.

One can go through the entire standards document and create a checklist to record how the organization fares against each of the standards, as in the example below:
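Such a checklist can be as simple as a small structured record per standard. Here is a minimal sketch in Python; the section numbers, statuses and notes are purely illustrative, not from any real audit:

```python
# Hypothetical checklist: Standards section number -> (status, note).
# All entries below are made-up examples, not real audit findings.
checklist = {
    "8.7": ("Met", "Candidate agreement in place before every assessment"),
    "5.1": ("Partially met", "Scoring rules documented; rationale needs review"),
    "13.1": ("Not met", "No documented response to test-security incidents"),
}

def areas_for_improvement(checklist):
    """List the sections where the organization did not get full marks."""
    return sorted(s for s, (status, _) in checklist.items() if status != "Met")

for section in areas_for_improvement(checklist):
    status, note = checklist[section]
    print(f"Standard {section}: {status} - {note}")
```

Keeping the record in a structured form like this makes it easy to re-run the same report after each review cycle and track progress against the standards over time.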

Other guidelines and standards that you can use to help benchmark and improve your assessment program are:

I hope this article was helpful!

Embedding Assessments in TikiWiki

Embed a Questionmark Perception assessment, survey or quiz within a TikiWiki site using an iframe.

  • To see how this would look, see a snapshot of an assessment embedded within a TikiWiki page.
  • Check out this How-to on our developer Web site.
  • TikiWiki, also known as Tiki Wiki CMS Groupware, is a free and open-source, wiki-based content management system written primarily in PHP. In addition to enabling websites and portals on the internet, TikiWiki contains a number of unique collaboration features that allow it to operate as a Groupware web application.
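The embedding itself is ordinary iframe markup. A rough sketch is shown below; the URL and dimensions are placeholders, so substitute the link to your own Perception assessment:

```html
<!-- Placeholder URL: replace with the link to your own Perception assessment -->
<iframe src="https://your-perception-server.example.com/your-assessment-link"
        width="100%" height="600" frameborder="0">
</iframe>
```

Setting the width to 100% lets the assessment fill whatever page region the wiki gives it, which works well with the auto-sizing behavior described above.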

The CAA Conference, 10 years on

Posted by Steve Lay

I had great fun at the recent CAA (Computer Assisted Assessment) Conference hosted by the University of Southampton, UK. I'd like to thank the team there for taking the lead in organizing the event and opening a new chapter in its history. This conference builds on the success of the 12 previous CAA conferences hosted at Loughborough University. Although I didn't go to the first event in 1997, I've been a regular attendee on and off for the past 10 years.

I was given the job of summing up the conference and providing closing remarks.  With just two days to read over 30 academic papers I found myself searching for a tool to help me summarize the information quickly.  After a little bit of text processing with a python script and the excellent TagCrowd tool I came up with a tag cloud based on the top 50 terms from the abstracts of the papers presented at the conference:

[Tag cloud of the top 50 terms from the 2010 conference abstracts]
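The text-processing step can be done with a few lines of Python. This is a minimal sketch, not the script used for the cloud above; the stop-word list and the minimum word length are assumptions, and a tool such as TagCrowd does the actual rendering:

```python
import re
from collections import Counter

# A small illustrative stop-word list; a real script would use a fuller one.
STOPWORDS = {"the", "of", "and", "a", "to", "in", "is", "for",
             "on", "with", "that", "as", "are", "this", "by", "an"}

def top_terms(abstracts, n=50):
    """Return the n most frequent words across the abstracts,
    lower-cased, with stop words and very short words removed."""
    words = re.findall(r"[a-z]+", " ".join(abstracts).lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return counts.most_common(n)

# The resulting (term, count) pairs can then be fed into a tag-cloud tool.
abstracts = [
    "Computer assisted assessment tools for formative assessment.",
    "A system for summative assessment design.",
]
print(top_terms(abstracts, 3))
```

Running this over 30-odd abstracts takes well under a second, which is handy when you have two days to digest a whole conference.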

Assessment obviously remains a key focus of this community, but I was also struck by the very technical language used: system, tools, design, computer and so on. However, it was also interesting to see which words were missing. Traditionally I would have expected words like reliability and validity to feature strongly. Although summative assessment makes an appearance, formative assessment does not feature strongly enough to appear in the cloud. Clearly students, learners and the individual are important, but where is adaptivity or personalization?

It is interesting to compare this picture with a similar one taken from the abstracts of the papers in 2000, ten years ago.

[Tag cloud of the top 50 terms from the 2000 conference abstracts]

An important part of our mission at Questionmark is learning from communities like this one and using the knowledge and best practices to develop our software and services.  During the conference I witnessed a range of presentations covering ideas that we can apply right now through to some fascinating areas of research that point the way to future possibilities.

The conference was a great success, and planning for next year (5th-6th July 2011) has already started.  Check out the CAA Web site for the latest information.
