New white paper: Questionmark and Microsoft Office 365

Posted by John Kleeman

I’m pleased to announce a new white paper, fresh off the press, on Questionmark and Microsoft Office 365.

This white paper explains how Microsoft Office 365 complements the Questionmark OnDemand assessment management system: how you can use Office 365 to launch Questionmark surveys, quizzes, tests and exams; how to use Office 365 resources within Questionmark; and how Office 365 can help you analyze assessment results. You can download the white paper here.

The white paper also describes some of the reasons that organizations use assessments and why it is important for assessments to be valid, reliable and trustable.

Launching assessments from Office 365

Being able to call assessments from within Office 365 allows you to closely connect an assessment to content, for example to check understanding after learning. The white paper describes how you can:

  • Call Questionmark assessments from the Office 365 app launcher
  • Launch an assessment from within a Word, Excel or other Office document
  • Embed an assessment inside a PowerPoint presentation
  • Launch or embed assessments from SharePoint
  • Use SAML to share identities and provide seamless authentication between Office 365 and Questionmark OnDemand. The benefit is that test-takers can log in once to Office 365 and then take tests in Questionmark OnDemand without needing to log in again.

Using Office 365 resources within assessments

Assessments of competence are generally more accurate when the questions simulate the performance environment being measured. By putting video, sound, graphics and other media in the question stimulus, you put the participant’s mind into an environment closer to the one in which he or she will perform the real-world job task, which makes the question a more accurate measure of performance on such tasks.

To help take advantage of this, a common use of Office 365 with Questionmark OnDemand is to create media and other resources for use within assessments. The white paper describes how you can use Office 365 Video, PowerPoint, SmartArt and other Office 365 tools to make videos and other useful question content.

Using Office 365 to help analyze results of assessments

People have been using Microsoft Excel to analyze assessment results since the 1980s, and the white paper offers suggestions on how to do this most effectively with Questionmark OnDemand.

Newer Microsoft tools can also provide powerful insight into assessment results. Questionmark OnDemand makes assessment data available as an OData feed, which can be consumed by business intelligence systems such as Power BI. OData is an open protocol for creating and consuming queryable, interoperable data in a simple, standard way. The white paper also describes how to use OData and Power BI to get further analysis and visualizations from Questionmark OnDemand.
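As a rough illustration (not taken from the white paper), here is a minimal Python sketch of pulling an OData feed into a pandas DataFrame for analysis. The feed URL, entity set name and credentials are placeholders, and it assumes an OData v4-style JSON response with "value" and "@odata.nextLink" keys; the details of Questionmark OnDemand's actual feed may differ.

```python
# Minimal sketch: pull assessment results from an OData feed into pandas.
# The URL, entity set ("Results") and credentials are placeholders, not
# Questionmark's actual endpoint; check your own OnDemand documentation.
import requests
import pandas as pd

FEED_URL = "https://example.invalid/odata/Results"  # hypothetical feed URL

def fetch_odata(url, auth):
    """Follow OData paging via @odata.nextLink and return all rows."""
    rows = []
    while url:
        resp = requests.get(url, auth=auth, headers={"Accept": "application/json"})
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload.get("value", []))
        url = payload.get("@odata.nextLink")  # absent on the last page
    return rows

results = pd.DataFrame(fetch_odata(FEED_URL, auth=("analyst", "password")))
print(results.describe())  # quick summary statistics of numeric columns
```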

 

The white paper is easy to read and gives practical advice. I recommend it if your organization uses Office 365 and Questionmark, or if you are considering doing so. You can download the white paper (free with registration) from the Questionmark website. You can also find other white papers, eBooks and helpful resources at www.questionmark.com/learningresources.

 

Authoring questions in a CSV file and importing them into Questionmark Live

Posted By Doug Peterson

Questionmark Live is an easy means of authoring questions, but sometimes it’s helpful to import questions authored elsewhere.

Being able to import questions that have been saved in a CSV file makes a lot of sense when an author is traveling and doesn’t have access to Questionmark Live, when authoring has been contracted out to someone who doesn’t use Questionmark Live, or when an author simply prefers to work in Excel.

This video focuses on authoring questions in a simple spreadsheet, saving it as a CSV file, and importing that file into Questionmark Live.
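If you prefer scripting to a spreadsheet, the same kind of file can be produced programmatically. The sketch below uses Python's csv module with a hypothetical column layout; the exact columns Questionmark Live's CSV import expects may differ, so check the import documentation (or the video) before relying on it.

```python
# Sketch: write multiple-choice questions to a CSV file with Python's csv module.
# The column layout here is hypothetical; Questionmark Live's importer may
# expect different column names and ordering.
import csv

questions = [
    ["MC", "Which stage of the sales process follows qualifying a lead?",
     "Product demo", "Invoicing", "Onboarding", "Product demo"],
    ["MC", "What is the recommended length of a product demo?",
     "15 minutes", "2 hours", "A full day", "15 minutes"],
]

with open("questions.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["type", "question", "choice_1", "choice_2", "choice_3", "correct"])
    writer.writerows(questions)
```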

Understanding Assessment Validity: Content Validity


Posted by Greg Pope

In my last post I discussed criterion validity and showed how an organization can go about doing a simple criterion-related validity study with little more than Excel and a smile. In this post I will talk about content validity, what it is and how one can undertake a content-related validity study.

Content validity deals with whether the assessment content and composition are appropriate given what is being measured. For example, does the test content reflect the knowledge and skills required to do a job, or to demonstrate a sufficient grasp of the course content? In the sales course exam example I discussed in the last post, one would want to ensure that the questions on the exam cover the content areas of the course in appropriate proportions. For example, if 40% of the four-day sales course deals with product demo techniques, then about 40% of the questions on the exam should measure knowledge and skills in that area.

I like to think of content validity in two slices. The first slice of the content validity pie is addressed when an assessment is first being developed: content validity should be one of the primary considerations in assembling the assessment. Developing a “test blueprint” that outlines the relative weightings of content covered in a course and maps them onto the number of questions in the assessment is a great way to help ensure content validity from the start. Questions are, of course, classified into specific topics and subtopics as they are authored. Before an assessment is put into production and administered to actual participants, an independent group of subject matter experts should review the assessment and compare the questions it includes against the blueprint. An example of a test blueprint for the sales course exam, which has 20 questions in total, is provided below.

[Image: example test blueprint for the 20-question sales course exam]
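As a rough illustration of the arithmetic (the topics and weightings below are made up, not taken from the blueprint above), here is a short Python sketch that turns blueprint weightings into question counts for a 20-question exam.

```python
# Sketch: turn blueprint weightings into question counts for a 20-question exam.
# Topics and weightings are invented for illustration.
blueprint = {
    "Product demo techniques": 0.40,
    "Prospecting and qualifying": 0.25,
    "Handling objections": 0.20,
    "Closing the sale": 0.15,
}
total_questions = 20

for topic, weight in blueprint.items():
    n_questions = round(weight * total_questions)
    print(f"{topic}: {n_questions} questions ({weight:.0%} of the exam)")
```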

The second slice of content validity is addressed after an assessment has been created. There are a number of methods in the academic literature for conducting a content validity study. One way, developed by Lawshe in the mid-1970s, is to get a panel of subject matter experts (SMEs) to rate each question on an assessment in terms of whether the knowledge or skills it measures are “essential,” “useful, but not essential,” or “not necessary” to the performance of what is being measured (i.e., the construct). The more SMEs who agree that items are essential, the higher the content validity. Lawshe also developed a funky formula called the “content validity ratio” (CVR) that can be calculated for each question. The average CVR across all questions on the assessment can be taken as a measure of the overall content validity of the assessment.

[Image: Lawshe’s content validity ratio (CVR) formula]
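For reference, Lawshe’s CVR for a question is (ne - N/2) / (N/2), where ne is the number of SMEs who rate the question “essential” and N is the size of the panel. Here is a minimal Python sketch of the calculation; the ratings are invented for illustration.

```python
# Sketch: Lawshe's content validity ratio (CVR) per question, plus the average
# across questions as an overall index. The SME ratings are invented.
def cvr(n_essential, panel_size):
    """CVR = (n_e - N/2) / (N/2); ranges from -1 (none essential) to +1 (all)."""
    return (n_essential - panel_size / 2) / (panel_size / 2)

panel_size = 10  # number of SMEs on the review panel
essential_counts = [9, 7, 10, 4, 8]  # SMEs rating each question "essential"

cvrs = [cvr(n, panel_size) for n in essential_counts]
for q, value in enumerate(cvrs, start=1):
    print(f"Question {q}: CVR = {value:+.2f}")
print(f"Average CVR across the assessment: {sum(cvrs) / len(cvrs):.2f}")
```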

You can use Questionmark Perception to easily conduct a CVR study by taking an image of each question on an assessment (e.g., sales course exam) and creating a survey question for each assessment question to be reviewed by the SME panel, similar to the example below.

[Image: example survey question for SME review of an exam question]

You can then use the Questionmark Survey Report or other Questionmark reports to review and present the content validity results.

So how does “face validity” relate to content validity? Face validity is more about the subjective perception of what the assessment is trying to measure than about conducting validity studies. For example, if our sales people sat down after the four-day sales course to take the sales course exam and all the questions asked about things that didn’t seem related to what they had just learned (e.g., what kind of car they would like to drive or how far they can hit a golf ball), the sales people would not feel that the exam was very “face valid,” because it doesn’t appear to measure what it is supposed to measure. Face validity, therefore, has to do with whether an assessment looks or feels valid to the participant. It still matters: if participants or instructors don’t buy in to the assessment being administered, they may not take it seriously, they may complain about and appeal their results more often, and so on.

In my next post I will turn the dial up to 11 and discuss the ins and outs of construct validity.