Badging and Assessment: If they know it, let them show it!

Posted by Brian McNamara

We are delighted to announce the availability of Questionmark Badging!

With Questionmark Badging and Questionmark OnDemand, you can grant “badges” to participants based on the outcomes achieved on assessments such as certification exams, post-course tests or advancement exams. Badges associated with Questionmark assessments provide participants with portable, verifiable digital credentials.

Badges aligned with Questionmark assessments can be tied in with competencies and achievements, helping organizations provide recognition and motivation for increasing knowledge and skills. For credentialing and awarding bodies, they can increase the visibility and value of certification programs.

The new app couples Questionmark’s capabilities in delivering valid, reliable and trustworthy assessments with the industry-leading digital credentialing platform from Credly. More than just a visual representation of accomplishment, digital badges provide participants with verifiable, portable credentials that can be shared and displayed across the web, including social networking sites such as LinkedIn, Facebook and Twitter.

Find more info about Questionmark Badging right here!

Learning, Training and Assessments in Regulatory Compliance – Implementation Best Practices

Posted by John Kleeman

I’m pleased to let you know of a new joint SAP and Questionmark white paper on implementation best practices for learning, training and assessments in regulatory compliance. You can download the white paper here.

There has been a huge change in the regulatory environment for companies in the last few years. This is illustrated nicely by the graph below showing the number of formal warning letters the U.S. Food and Drug Administration (FDA) issued in the period from 2010 to 2016 for various compliance infractions.

[Graph: Rise in FDA warning letters from 2010 to 2016, from a few hundred a year to over 10,000 a year]

Of course it’s not just letters that regulators issue; there have also been huge increases in the fines imposed on companies for rule breaches in areas including banking, data protection, price-fixing and manufacturing.

Failure to effectively train or assess employees is a significant cause of compliance errors, and this white paper authored by SAP experts Thomas Jenewein, Simone Buchwald and Mark Tarallo and me (Questionmark Founder and Executive Director, John Kleeman) explains how technology can help address the issue.

The white paper starts by looking at key factors that increase the need for training, learning and assessments to keep businesses compliant, and then considers three drivers for compliance learning – Organization Imposed, Operations Critical and Regulatory. It then looks at how:

  • A Learning Management System (LMS) can manage compliance learning
  • Learning Content and Documentation Authoring Tools can be used to author compliance learning content
  • An Assessment Management System can be used to diagnose training needs, help direct learning, and check competence, knowledge and skills.

A typical LMS includes basic quiz and survey capabilities, but when making decisions about people, such as whether to promote, hire, fire, or confirm competence for compliance or certification purposes, companies need more. The robust functionality of an effective assessment management system allows organizations to create reliable, valid and more trustworthy assessments. Assessment management systems and LMSs often work together, with test-takers directed to assessments via single sign-on from the LMS.

The white paper describes how SAP SuccessFactors Learning, SAP Enable Now and Questionmark software can work together productively to help companies manage and deliver effective compliance learning, training and assessments – and so mitigate regulatory risk. It goes on to describe some key trends the authors see in compliance training and assessments, including the growing impact of cybersecurity and data protection on compliance.

The white paper is a quick, easy and useful read – you can download it here.

New white paper: Questionmark and Microsoft Office 365

Posted by John Kleeman

I’m pleased to inform you of a new white paper fresh off the press on Questionmark and Microsoft Office 365.

This white paper explains how Microsoft Office 365 complements the Questionmark OnDemand assessment management system: how you can use Office 365 to launch Questionmark surveys, quizzes, tests and exams; how to use Office 365 resources within Questionmark; and how Office 365 can help you analyze assessment results. You can download the white paper here.

The white paper also describes some of the reasons that organizations use assessments and why it is important for assessments to be valid, reliable and trustworthy.

Launching assessments from Office 365

Being able to call assessments from within Office 365 allows you to closely connect an assessment to content, for example to check understanding after learning. The white paper describes how you can:

  • Call Questionmark assessments from the Office 365 app launcher
  • Launch an assessment from within a Word, Excel or other Office document
  • Embed an assessment inside a PowerPoint presentation (a link-based sketch follows this list)
  • Launch or embed assessments from SharePoint
  • Use SAML to have common identities and seamless authentication between Office 365 and Questionmark OnDemand. The benefit is that test-takers can log in once to Office 365 and then take tests in Questionmark OnDemand without needing to log in again.
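As a small illustration of how an Office document can launch an assessment, the sketch below uses the python-pptx library to add a clickable link to a slide that opens a Questionmark assessment. The launch URL is a hypothetical placeholder, not a documented Questionmark address, and this is not the white paper's own method, only a minimal sketch of the idea that any document that can hold a hyperlink can launch a test.

```python
# Minimal sketch: add a "take the quiz" hyperlink to a PowerPoint slide.
# Assumption: the launch URL below is a placeholder; replace it with the
# real link that Questionmark OnDemand provides for your assessment.
from pptx import Presentation
from pptx.util import Inches, Pt

ASSESSMENT_URL = "https://example.ondemand.questionmark.com/launch/quiz123"  # hypothetical

prs = Presentation()
slide = prs.slides.add_slide(prs.slide_layouts[6])  # blank layout

box = slide.shapes.add_textbox(Inches(1), Inches(2), Inches(8), Inches(1))
run = box.text_frame.paragraphs[0].add_run()
run.text = "Check your understanding: take the quiz"
run.font.size = Pt(28)
run.hyperlink.address = ASSESSMENT_URL  # clicking the text opens the assessment

prs.save("course_recap.pptx")
```

In practice you might simply paste the link into the slide by hand; the point is that assessments are launched by URL, so Word, Excel, PowerPoint or SharePoint content can all act as a launch point.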

Using Office 365 resources within assessments

Assessments of competence are generally more accurate when the questions simulate the performance environment being measured. By putting video, sound, graphics and other media within the question stimulus, you place participants in an environment closer to the one they will face when doing the real-world job task, which makes the question a more accurate measure of how they will perform that task.

To help you take advantage of this, a common use of Office 365 with Questionmark OnDemand is to create media and other resources for use within assessments. The white paper describes how you can use Office 365 Video, PowerPoint, SmartArt and other Office 365 tools to make videos and other useful question content.

Using Office 365 to help analyze results of assessments

People have been using Microsoft Excel to help analyze assessment results since the 1980s, and the white paper offers suggestions on how to do this most effectively with Questionmark OnDemand.

Newer Microsoft tools can also provide powerful insight into assessment results. Questionmark OnDemand makes assessment data available as an OData feed, which can be consumed by business intelligence systems such as Power BI. OData is an open protocol that allows the creation and consumption of queryable and interoperable data in a simple and standard way. The white paper also describes how to use OData and Power BI to get further analysis and visualizations from Questionmark OnDemand.
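If you prefer to script this rather than use Power BI, the sketch below shows the general shape of pulling an OData feed into Python for analysis. The feed URL, the "Results" entity set, the field names and the basic authentication are all assumptions for illustration; consult the Questionmark OnDemand documentation for the actual endpoint, entities and authentication method.

```python
# Minimal sketch of reading an OData feed into pandas.
# Assumptions: URL, entity set "Results", column names and basic auth are
# placeholders, not the documented Questionmark API.
import requests
import pandas as pd

FEED_URL = "https://example.ondemand.questionmark.com/odata/Results"  # hypothetical

resp = requests.get(FEED_URL, params={"$top": "500"},
                    auth=("analytics_user", "secret"))
resp.raise_for_status()

# OData JSON responses return their rows in a "value" array.
results = pd.DataFrame(resp.json()["value"])
print(results.groupby("AssessmentName")["Score"].describe())
```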

 

The white paper is easy to read and gives practical advice. I recommend it if your organization uses Office 365 and Questionmark, or if you are considering doing so. You can download the white paper (free with registration) from the Questionmark website. You can also find other white papers, eBooks and other helpful resources at www.questionmark.com/learningresources.

 

Twelve tips to make questions translation ready

Posted by John Kleeman

We all know the perils of mis-translation. My favourite mis-translation is the perhaps apocryphal tale of a laundry in Rome, Italy putting up a sign in English saying “Ladies, leave your clothes here and spend the afternoon having a good time.” Since Questionmark has a translation management system to help you translate questions and assessments, here are some good practice tips on writing questions so they will be easy to translate.

1. Avoid questions that assume syntax is the same in all languages, such as fill-in-blank questions that rely on word order. For example, in English the verb usually goes in the middle of a sentence, but in Turkish and Korean it usually comes at the end.

2. Also avoid “broken stem questions”, where the stem is an incomplete sentence and the participant must select the most appropriate answer to finish the sentence. That’s likely to be challenging to translate in some languages where the ordering may not make sense.

3. Keep questions simple. Avoid unnecessarily complex text or question stems with redundancy or unnecessary repetition. Such questions are best simplified before you translate them.

4. Avoid metaphors and idiomatic language in general; things like “in small steps” or “disappear into thin air” could well introduce translation mistakes.

5. Avoid passive voice where you can. Not all languages make it easy to translate this, and it’s usually best to just use active voice.

6. Thoroughly review questions prior to translation to ensure there is no ambiguity. If the question wording is ambiguous, the translator’s interpretation of the question may not be the same as that of the question author.

7. If you are using a rating scale across many questions, investigate its cultural appropriateness and, if possible, whether it is widely used in the target language.

8. Test items based on nuances of vocabulary, descriptions of emotions or abstract concepts can be hard to translate, as words may carry different connotations in different languages.

9. You also need to be aware of the risk that translating a question could make the answer obvious because of the words used in the target language, as in the following Swedish example.

[Example: a question whose Swedish translation gives away the answer because the words are the same]

10. Avoid using cultural context within the question stimulus. If you are presenting a scenario, make it one that is relevant to different cultures and languages. If it is difficult to avoid a culturally marked context, consider preparing good guidelines for translators in which you define what adaptations are encouraged, desirable and ruled out.

11. If your question contains a graphic or video, consider whether you can remove any text from it and still keep the question meaningful. Otherwise you will need to translate the text in the graphic or video into each language.

12. If you are translating items into several languages, it can be cost-effective to conduct a translatability assessment on the items before you do the detailed translation. This will alert you to possible issues within various language families prior to the more substantial work of full translation. A translatability assessment lets you identify and fix issues early and relatively cheaply. See here for a blog post from Steve Dept of CApStAn that explains more.

Thanks to Steve Dept for inspiring this blog post with an excellent conference presentation at EATP last year and for helping me write this article. For some more advice on translating and adapting tests, see the International Test Commission Guidelines for Translating and Adapting Tests or the Cross-cultural Survey Guidelines (CCSG), both of which have been recently updated.

I hope this advice helps you be efficient in your translation efforts. For information on Questionmark OnDemand which includes translation management system capabilities, see www.questionmark.com.

Six tips to increase content validity in competence tests and exams

Posted by John Kleeman

Content validity is one of the most important criteria on which to judge a test, exam or quiz. This blog post explains what content validity is, why it matters and how to increase it when using competence tests and exams within regulatory compliance and other work settings.

What is content validity?

An assessment has content validity if the content of the assessment matches what is being measured, i.e. it reflects the knowledge and skills required to do the job, or demonstrates that the participant has grasped the course content sufficiently.

Content validity is often measured by having a group of subject matter experts (SMEs) verify that the test measures what it is supposed to measure.

Why does content validity matter?

If an assessment doesn’t have content validity, then the test isn’t actually testing what it sets out to measure, or it misses important aspects of the job skills.

Would you want to fly in a plane where the pilot knows how to take off but not how to land? Obviously not! Assessments for airline pilots take into account all job functions, including landing in emergency scenarios.

Similarly, if you are testing your employees to ensure competence for regulatory compliance purposes, or before you let them sell your products, you need to ensure the tests have content validity – that is to say they cover the job skills required.

In addition to these common-sense reasons, if you use an assessment without content validity to make decisions about people, you could face a lawsuit. See this blog post, which describes a US lawsuit where a court ruled that because a policing test didn’t match the job skills, it couldn’t be used fairly for promotion purposes.

How can you increase content validity?

Here are some tips to get you started. For a deeper dive, Questionmark has several white papers that will help, and I also recommend Shrock & Coscarelli’s excellent book “Criterion-Referenced Test Development”.

  1. Conduct a job task analysis (JTA). A JTA is a survey which asks experts in the job role what tasks are important and how often they are done. A JTA gives you the information to define assessment topics in terms of what the job needs. Questionmark has a JTA question type which makes it easy to deliver and report on JTAs.
  2. Define the topics in the test before authoring. Use an item bank to store questions, and define the topics carefully before you start writing the questions. See Know what your questions are about before you deliver the test for some more reasoning on this.
  3. Poll subject matter experts to check the content validity of an existing test. Get a panel of SMEs to rate each question as “essential,” “useful, but not essential,” or “not necessary” to the performance of what is being measured. The more SMEs who agree that items are essential, the higher the content validity (a minimal calculation sketch follows this list). See Understanding Assessment Validity- Content Validity for a way to do this within Questionmark software.
  4. Use item analysis reporting. Item analysis reports flag questions which don’t correlate well with the rest of the assessment. Questionmark has an easy-to-understand item analysis report which will flag potential questions for review. One reason a question might get flagged is that participants who do well on other questions don’t do well on this one – this could indicate the question lacks content validity (see the discrimination sketch after this list).
  5. Involve Subject Matter Experts (SMEs). It might sound obvious, but the more you involve SMEs in your assessment development, the more content validity you are likely to get. Use an assessment management system which is easy for busy SMEs to use, and involve SMEs in writing and reviewing questions.
  6. Review and update tests frequently. Skills required for jobs change quickly with changing technology and regulations. Many workplace tests that were valid two years ago are not valid today. Use an item bank with a search facility to manage your questions, and review and update or retire questions that are no longer relevant.
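To make tip 3 concrete, one common way to turn SME “essential” ratings into a number is Lawshe’s content validity ratio, CVR = (n_e - N/2) / (N/2), where n_e is the number of SMEs rating an item essential and N is the panel size. Here is a minimal sketch in plain Python; this is a standard formula from the literature, not a Questionmark feature.

```python
def content_validity_ratio(essential_votes: int, panel_size: int) -> float:
    """Lawshe's CVR: ranges from -1 (no SME says essential) to +1 (all do)."""
    return (essential_votes - panel_size / 2) / (panel_size / 2)

# Example: 8 of 10 SMEs rate an item "essential" -> CVR = 0.6
print(content_validity_ratio(8, 10))
```

For tip 4, the discrimination statistic an item analysis report relies on is essentially the correlation between scores on one item and scores on the rest of the test. A rough sketch, assuming dichotomously scored items and using a made-up set of responses:

```python
import numpy as np

def item_discrimination(item_scores, total_scores):
    """Correlate an item with the total score on the remaining items."""
    item = np.asarray(item_scores, dtype=float)
    rest = np.asarray(total_scores, dtype=float) - item  # remove the item itself
    return float(np.corrcoef(item, rest)[0, 1])

# Five test-takers: one item scored 0/1, plus their total test scores.
print(item_discrimination([1, 0, 1, 1, 0], [18, 9, 16, 20, 12]))  # ~0.9, discriminates well
```

Questionmark’s item analysis report computes this kind of statistic for you; the sketch is only meant to show what “correlates well with the rest of the assessment” means in practice.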

I hope this blog post reminds you why content validity matters and gives helpful tips to improve the content validity of your tests. If you are using a Learning Management System to create and deliver assessments, you may struggle to obtain and demonstrate content validity. If you want to see how Questionmark software can help manage your assessments, request a personalized demo today.

 

GDPR is coming. Are you ready?

Posted by Julie Delazyn

Don’t get left behind as the most important change in data privacy takes effect in May 2018. The new General Data Protection Regulation (GDPR) is intended to strengthen and unify privacy and data protection, and any organization that stores or manages data about Europeans will need to comply.

With eye-watering regulatory fines of up to €20 million or 4% of global annual turnover (whichever is greater), a credible compliance strategy is essential.

Join us for a FREE 45-minute webinar on July 26, 2017, to understand how online assessments can help you meet your GDPR challenges.

The webinar will cover:

  • What the GDPR is and who it impacts
  • Why you should care about GDPR compliance
  • How to overcome the challenges presented by GDPR — including the learning curve for your employees
  • How assessment can help mitigate GDPR risks and aid your compliance strategy
  • Considerations for implementing assessment management software to aid in compliance

We look forward to speaking to you at the webinar!
