Making sure assessment video and audio are accessible to participants

Posted by Noel Thethy

One of the benefits of an assessment management system like Questionmark’s is its ability to include rich and interactive media within questions. But how do you handle this content to ensure it is accessible to participants with disabilities or other accessibility needs?

Video and audio content

When including this type of content in questions, you should provide alternative means for consuming the information. It is important to provide equivalents for users who cannot see or hear.

Captions should be provided for video and audio, covering all of the spoken content and, for video, any important non-spoken information. You can include a transcript in several different ways, including:

  • Adding the transcript in the question’s stimulus
  • Adding a link to the transcript from the question
  • Embedding closed captions in the video or audio (see the sketch after this list)
  • Using the scenario/case question format to display the transcript in parallel with the question multimedia
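
For example, if you host the media yourself as HTML5 video rather than through a service such as YouTube, closed captions can be attached with a track element pointing at a WebVTT file, and the transcript can simply be linked beneath the player. This is a generic sketch with placeholder file names, not the exact markup Questionmark produces:

    <video controls width="480">
      <source src="question-media.mp4" type="video/mp4">
      <!-- captions covering spoken content and important non-spoken sounds -->
      <track kind="captions" src="question-media.vtt" srclang="en" label="English" default>
      Your browser does not support HTML5 video.
    </video>
    <p><a href="question-media-transcript.html">Read the transcript</a></p>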

Rich media content

If you are using any rich media content such as Flash or Captivate, be sure to follow the Adobe Accessibility guidelines. This will ensure that the content (which can be interactive) has been created with the necessary attention to the accessibility features and design practices available.

Auto-sensing and auto-sizing wizardry at work

Posted by Noel Thethy

The best way to describe the wizardry at work in Questionmark is to see it happen. Click here and watch what happens to the assessment screen as you resize it in your browser window. Did you miss it? Try it again.

What you would have seen (if you are using a supported web browser) is the assessment screen resizing and adjusting to fit the available space. When you shrink your browser window, the buttons get smaller, the text wraps and resizes so it’s still readable, and the date and time disappear when there is not enough room for them. If you are not using a supported browser, or you are using a mobile device, you would instead see Questionmark auto-detect your browser/device and display a compatible version of the assessment.
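
Questionmark’s templates handle this for you, but conceptually the behaviour is similar to what CSS media queries provide. The fragment below is only a rough illustration with made-up class names, not Questionmark’s actual stylesheet:

    <style>
      /* keep text readable by sizing it relative to the browser's default */
      body { font-size: 100%; }

      /* on narrow screens, shrink the buttons and hide non-essential items
         such as the date and time */
      @media (max-width: 480px) {
        .nav-button { padding: 2px 6px; font-size: 90%; }  /* hypothetical class name */
        .date-time  { display: none; }                     /* hypothetical class name */
      }
    </style>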

Auto-sensing and auto-sizing make it possible to reuse the same assessment in a variety of different circumstances without needing to create new templates or assessments for each occasion. Without modifications you can embed assessments in learning materials, display surveys in your portal, and deliver assessments via mobile devices. (Each time we update our software, we test for compatibility with the latest browsers, so that participants can easily view questions on whatever device they are using.)

To find out more about auto-sensing/auto-sizing and the blended delivery options supported by Questionmark, click here.

Ensuring question text is accessible

Posted by Noel Thethy

This post is part of the accessibility series I am running. Here we will look at ensuring text and table elements are accessible.

We have done our best to ensure that Questionmark’s participant interface is readable via screen readers. However, to ensure these work as expected you need to make sure that:

  • The text you use does not contain any inline styles that may confuse a screen reader.
  • Any tables in your content use captions and header information to ensure the screen reader can distinguish content.

If you have copied and pasted text from another application, particularly Microsoft Word, you may find when looking at the HTML code that the question contains extraneous HTML. For example, content copied and pasted from Microsoft Word typically appears in the HTML tab looking something like the snippet below.
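
This is an illustrative reconstruction rather than exact output (the markup varies with the Word version and the content pasted), but the MsoNormal class, mso-* properties and hard-coded font styling are typical of what Word produces:

    <p class="MsoNormal" style="margin-bottom:0cm;line-height:normal">
      <span style="font-size:11.0pt;font-family:&quot;Calibri&quot;,sans-serif;
        mso-fareast-font-family:Calibri;color:#1F497D">Which of the following is correct?</span>
    </p>

After cleaning, the same question text can usually be reduced to:

    <p>Which of the following is correct?</p>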


The text copied includes HTML mark-up tags which override the style determined by the templates and could affect how a screen reader interprets what is on the screen. The HTML used to provide the formatting can be viewed in the HTML tab of the Advanced HTML Editor in Authoring Manager and should be cleaned up as much as possible.

Alternatively, Questionmark Live automatically removes any style HTML that may be included from applications such as Word or other web pages. To find out more, see Questionmark Live.

If you are using tables, we recommend that you build them following the W3C guidelines rather than the default tables available. They should ideally look something like this:
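
The screenshots that originally appeared here showed the table markup; the sketch below is an illustrative reconstruction with placeholder content, following the same pattern:

    <table>
      <caption>Quiz scores by topic</caption>
      <thead>
        <tr>
          <th scope="col">Topic</th>
          <th scope="col">Score</th>
        </tr>
      </thead>
      <tfoot>
        <tr>
          <td>Average</td>
          <td>78%</td>
        </tr>
      </tfoot>
      <tbody>
        <tr>
          <td>Accessibility</td>
          <td>85%</td>
        </tr>
        <tr>
          <td>Templates</td>
          <td>70%</td>
        </tr>
      </tbody>
    </table>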


By using the <caption>, <thead> and <tfoot> tags in your table you can clearly identify parts of the table to be read by the screen reader.

For more information see the W3C recommendations for non-visual user agents. These tables can be added by using the Advanced HTML Editor in Authoring Manager.

Making media and images within assessments more accessible

Posted by Noel Thethy

This post is part of the accessibility series I am running. We will look at using media and images in questions and how to ensure these are accessible.

A number of media files can be included in a question/assessment. However, if you are interested in producing accessible assessments you should consider the following:

  • Ensure color is not the only way to distinguish among different pieces of stimulus or answers. Common forms of color blindness (affecting up to 5% of the population) could make answering questions difficult. You should, where possible, change how the information is portrayed or include other indicators.

(Images in the original post showed an example question in its “Original” color-only form, followed by accessible “Alternatives.”)

When including graphics, always include ALT (alternative) text. Some browsers display it when a participant hovers the mouse pointer over an image, and it is read aloud when a screen reader is being used.
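
For example (with a placeholder file name and description):

    <img src="circuit-diagram.png" alt="Circuit diagram showing a resistor and a capacitor connected in series" width="300" height="200">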

  • If you are using Adobe Flash media, make sure that it has been created in an accessible fashion. Refer to the Adobe guidelines for Flash and Captivate.
  • If you are using video or audio in a question, you should include a transcript. You can include a transcript in several ways:
      • Link to a document that includes a transcript. (If you are providing the transcript as a PDF, make sure the PDF is accessible as well; see the Adobe guidelines.)
      • Embed a YouTube video, using the Closed Caption feature to include subtitles of any dialog in the video.
      • Provide a video transcript within an assessment, using Questionmark’s Side-By-Side template.

(My advice and recommendations may or may not be suitable for your particular situation, so be sure to take into account your organization’s accepted practices and pedagogy.)

How you can improve assessment accessibility

Posted by Noel Thethy

Over the next few weeks I’ll be releasing several “How To” blog posts that I hope will provide some insight into the features and functionality available to Questionmark users.

In particular I’ll be looking at:

•    How to ensure your questions are as accessible as possible
•    How to tweak Questionmark’s accessibility features to suit your specific needs

This series will consist of several video demonstrations and some explanations of basic concepts.

I’ll be covering topics like:

•    Using media and images in a question in an accessible way
•    Ensuring question text is appropriately accessible
•    Customizing the font size changer
•    Customizing the contrast changer

Before we go on this journey I’d like to remind you of some other posts about accessibility that have already appeared on this blog. They include:

•    Assessment Accessibility: A View from the Inside
•    Assessment Accessibility in Questionmark Perception Version 5

I hope you will join me as I take a look at making assessments accessible.

Knowledge-Check Assessments within Software User Assistance and Documentation

Posted by John Kleeman

We’ve been advocating for our customers to embed knowledge checks within learning, and I’m glad to say that we have been doing this ourselves. As we say in the software industry when a company uses its own products, we’re eating our own dog food!

The evidence shows that you learn more if you study and take a quiz than if you just study and study, so we wanted to give this benefit to our users.

Questionmark has an extensive knowledge base of 600+ articles, which supplement our user manuals. The knowledge base requires registration to view, but here is an example that is free for all to view. Our knowledge checks typically ask 3 to 5 questions and are randomized, so you get different questions when you come back to the page. We’ve put knowledge checks within the most popular articles, and since these have now been live for several months we can share some of the results:

  • On average, 13% of visitors to our knowledge base pages with embedded knowledge checks answer the questions and press Submit to see the results and feedback.
  • The response rate varies considerably depending on the subject matter, from 2% in a few technical articles to over 50% in a few where the knowledge check is especially relevant.
  • About 60% of participants get a score of 75% or higher.

Here is some advice from our documentation team lead (Noel Thethy) and me on what we’ve learned about knowledge checks in user assistance:

  1. Focus knowledge checks on articles that teach things people want to learn for the long term. We found that few people clicked on the knowledge checks in areas like installation advice, where you just want to do something once; there was more interest in articles that explained concepts.
  2. Embed knowledge checks in prominent locations within content so that people can see them easily.
  3. Align questions with key learning points.
  4. Ensure the vocabulary within a knowledge check is consistent with the information it pertains to.
  5. Provide meaningful feedback to correct misconceptions.
  6. Review the questions carefully before publishing (Questionmark Live is great for this).
  7. Plan for regular reviews to make sure the content remains valid as your software changes.
  8. Use references or a naming convention to ensure it is easy to associate knowledge checks with articles in reporting.
  9. Unless you want to capture individual results, use a generic participant name to make filtering and reporting on results easier.
  10. Use the Assessment Overview report to get an overview of results and the Question Statistics or Item Analysis reports to identify which questions people are weak at; this may show that you need to improve your learning material.

I’d love to hear any questions or comments from anyone interested in knowledge checks in user assistance; feel free to email me at john@questionmark.com. To give you a flavour of how a knowledge check helps you practice retrieval on something you’ve learned, answer the three questions on this page to check your knowledge of some of the above concepts.