Embedding Assessments in Microsoft Word

Embed a Questionmark Perception assessment, survey or quiz within Microsoft Word.

  • To see how this would look, see a snapshot of an assessment embedded within Microsoft Word.
  • Check out this How-to on our developer Web site.
  • You can embed an assessment in Microsoft Word using the Visual Basic for Applications (VBA) WebBrowser control. Because this approach relies on VBA controls, you will need to ensure that macros are enabled when viewing the document with the embedded quiz.

Podcast: Dental Education at the University of Maryland


Posted by Joan Phaup

I spoke recently with Professor James Craig and Instructional Technology Specialist Sarita Sanjoy from the University of Maryland Dental School about the school’s move to a new building and the innovative uses of technology the move has brought to all aspects of the dental program. On the academic side, this has included implementing Sonic Foundry Mediasite, which the school uses to capture lectures in order to make content available to students 24/7.

We talked about how the school uses Questionmark Perception together with Blackboard and the many lessons Sarita and James have learned about effective security measures, the use of assessment analytics to improve instruction and testing, and supportive ways to train faculty for preparing electronic assessments, among other topics. I hope you will enjoy and share this podcast!


How many questions do I need on my assessment?


Posted by Greg Pope

I was recently asked a common question about creating assessments: How many questions are needed on an assessment in order to obtain valid and reliable participant scores? The answer depends on the context and purpose of the assessment and on how the scores are used. For example, if an organization is administering a low-stakes quiz designed to facilitate learning during study, with on-the-spot question-level feedback and no summary scores, then one question would be enough (although more would probably better achieve the intended purpose). If no summary scores are calculated (e.g., an overall assessment score), or if those overall scores are not used for anything, then very small numbers of questions are fine. However, if an organization is administering an end-of-course exam that a participant must pass in order to complete a course, the number of questions on that exam is important. (A few questions aren’t going to cut it!) The issue in terms of psychometrics is whether very few questions would provide enough measurement information to allow someone to draw conclusions from the score obtained (e.g., does this participant know enough to be considered proficient?).

Ever wonder why you have to answer so many questions on a certification or licensing exam? One rarely gets to take only 2-3 questions on a driving test, and certainly not on a chartered accountant licensing exam. Oftentimes one might take close to 100 questions on such exams. One of the reasons is that more individual measurements of what a participant knows and can do are needed to ensure that the reliability of the scores obtained is high (and therefore that the error is low). Individual measurements are questions, and if we asked a participant only one question on an accounting licensing exam, we would likely not get a reliable estimate of the participant’s accounting knowledge and skills. Reliability is required for an assessment score to be considered valid, and generally the more questions on an assessment (up to a practical limit), the higher the reliability.

Generally, an organization would have a target reliability value in mind to help determine the minimum number of questions needed for the measurement accuracy required in a given context. For example, in a high-stakes testing program where people are being certified or licensed based on their assessment scores, a reliability of 0.9 or higher (the closer to 1 the better) would likely be required. Once a minimum reliability target is established, one can estimate how many items might be required to achieve it. An organization could administer a pilot test of an assessment and run the Test Analysis Report to obtain the Cronbach’s Alpha test reliability coefficient. One could then use the Spearman-Brown prophecy formula (described further in “Psychometric Theory” by Nunnally & Bernstein, 1994) to estimate how much the internal consistency reliability will increase if the number of questions on the assessment increases:

rkk = (k × r11) / (1 + (k − 1) × r11), where rkk is the estimated reliability of the lengthened assessment and:
  • k=the increase in length of the assessment (e.g., k=3 would mean the assessment is 3x longer)
  • r11=the existing internal consistency reliability of the assessment

For example, if the Cronbach’s Alpha reliability coefficient of a 20-item exam is 0.70 and 40 items are added to the assessment (tripling the length of the test), the estimated reliability of the new 60-item exam will be (3 × 0.70) / (1 + 2 × 0.70) = 2.1 / 2.4 ≈ 0.88.
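As a quick check of this arithmetic, the Spearman-Brown calculation can be sketched in a few lines of Python. The function names here are illustrative (not part of any Questionmark product), and the inverse form simply solves the same formula for k given a target reliability:

```python
import math


def spearman_brown(r11, k):
    """Estimated reliability when an assessment is lengthened by a factor of k."""
    return (k * r11) / (1 + (k - 1) * r11)


def length_factor_for_target(r11, r_target):
    """Inverse Spearman-Brown: the factor k needed to reach a target reliability."""
    return (r_target * (1 - r11)) / (r11 * (1 - r_target))


# Worked example from the text: a 20-item exam with reliability 0.70,
# tripled to 60 items.
print(round(spearman_brown(0.70, 3), 2))  # 0.88 (0.875 unrounded)

# How many items would the same exam need for a reliability of 0.90?
k = length_factor_for_target(0.70, 0.90)
print(math.ceil(20 * k))  # roughly 78 items
```

This also illustrates why high-stakes exams are long: pushing reliability from 0.70 toward 0.90 roughly quadruples the length of this hypothetical exam.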


If you would like to learn more about validity and reliability, see our white paper: Defensible Assessments: What you need to know.

I hope this helps to shed light on this burning psychometric issue!

Embedding Assessments in Microsoft SharePoint

There are several ways to use Microsoft SharePoint within your organization. One of the easiest things you can do is embed a Questionmark Perception survey or quiz within SharePoint.

  • To see how this would look, see a snapshot of an assessment embedded within SharePoint.
  • Check out this How-to on our developer Web site.
  • Microsoft’s SharePoint is increasingly used to manage and provide access to learning content. In addition to embedding assessments into SharePoint blogs, portals and wikis, you can provide single sign-on access to assessments by using a standard SharePoint page-viewer web part (if your organization uses Windows authentication) or by using Questionmark’s SharePoint Connector, an add-on that integrates Perception with Microsoft SharePoint Portal Server 2007 (SPS).

Hope you can join us next week in Manchester, Edinburgh or London!

Posted by Mel Lynch

We’re looking forward to meeting our friends and customers next week at our annual Breakfast Briefings in Manchester, Edinburgh and London!

If you haven’t attended a Questionmark Breakfast Briefing before, these events provide a great opportunity to meet fellow learning, testing and assessment professionals and to learn about the latest developments in online assessment technology.  The briefings kick off with breakfast and networking, followed by overviews and demonstrations of new products and features from Questionmark. Plus, Questionmark’s customer care, training, product management and technical support staff will be on hand to answer your questions.

There’s still time to sign up for a briefing: click here to learn more or to register for Manchester, Edinburgh or London. But hurry – registration will close this Friday!

Packing for India and Questionmark Events in Bangalore and Mumbai

Posted by Rafael Lami Dozo

It is time to pack our suits and hit the road again… This time we are going to India!

We will be holding events in two different cities:

The first event will be on May 5th at the Taj Residency of Bangalore in the middle of the Indian Silicon Valley. You can register here.

The second event will be on May 7th at the Grand Hyatt in bustling Mumbai. You can register here.

Participants in these meetings will hear a power-packed briefing session on “Effectively measuring knowledge, skills and ability with well-crafted assessments.”

This briefing session will show how to create assessments that shed light on a person’s skill and ability as well as their knowledge. We will examine a number of different question types – as well as question writing techniques – that can be used to measure cognitive skills and abilities.

This session will help you:

 – Understand the relationship between knowledge, skills and abilities
 – Distinguish between cognitive processes and types of knowledge
 – Connect appropriate question types to specific skills
 – Write questions that probe more effectively into cognitive skills and abilities
 – Determine when to use “multiple choice” item formats versus constructed response

Also, Indian users of Questionmark Perception will explain how they are using our assessment solutions.