Integrations and Connectors – Blackboard

The Questionmark Blackboard Connector is a proprietary connector that provides unprecedented integration between the Blackboard LMS and Questionmark. Through the Blackboard Connector:

  • The first time an instructor interfaces with Questionmark, a Questionmark admin ID is created for them automatically.
  • When an instructor adds a Questionmark assessment to a Blackboard course, the course short name becomes a Questionmark group, and the instructor and any
    students launching the assessment are automatically added to the group.
  • The first time a student launches any Questionmark assessment, a participant ID is created in Questionmark for the student.

And all of this automatic synchronization is optional! You can just as easily configure the connector to require that instructors, students and/or groups be created by a Questionmark admin in Questionmark, so that you control exactly which users and which courses can interface with Questionmark.

A video on the Questionmark website gives a complete explanation of what the Blackboard Connector can do for you.

Using OData queries to calculate simple statistics

Posted by Steve Lay

In previous blog posts we’ve looked at how to use OData clients, like the PowerPivot plugin for Excel, to create sophisticated reports based on the data exported from your Questionmark Analytics results warehouse. In this post, I’ll show you that your web developers don’t need a complex tool like Excel to harness the power of OData.

There are several third-party libraries available that make it easy for your developers to incorporate support for OData in their applications, but OData itself contains a powerful query language, and developers need nothing more than the ability to fetch a URL to take advantage of it. For example, suppose I want to know what percentage of my participants have passed one of my exams.

Step 1: find the <id> of your assessment

To start with, I’ll build a URL that returns the complete list of assessments in my repository. For example, if your customer number is 123456, then a URL like the following will do the trick (this and the following URLs are examples, not live links):

The resulting output is an XML file containing one record for each assessment. Open the result in a text editor or use ‘view source’ in your browser, scan down for the assessment you are interested in, and make a note of the entry’s <id>; that is the URL of the assessment. If you’ve got lots of assessments, you might like to filter the results using an OData query. In my case, I know the assessment name starts with the word “Chemistry”, so appending the following query makes it easier to find the right one: $filter=startswith(Name,'Chem')

All I’ve done is add a $filter parameter to the URL! The resulting document contains a single assessment, and I can see that the <id> of my assessment is the following URL:
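The URL-building in step 1 can be sketched in a few lines of Python. This is a minimal sketch: the base URL and customer number here are placeholders following the pattern described in this post, not the real endpoint for your repository.

```python
from urllib.parse import quote

# Hypothetical base URL for the OData feed -- the exact host and path for
# your repository will differ; customer number 123456 is just an example.
BASE = "https://ondemand.example.com/123456/odata"

def assessments_url(name_prefix=None):
    """Build the URL for the Assessments feed, optionally filtered by name prefix."""
    url = BASE + "/Assessments"
    if name_prefix:
        # OData string literals are single-quoted inside the filter expression;
        # the expression itself must be percent-encoded for use in a URL.
        url += "?$filter=" + quote("startswith(Name,'%s')" % name_prefix)
    return url

print(assessments_url("Chem"))
```

Fetching the resulting URL (with any tool that can issue an HTTP GET) returns the filtered assessment list described above.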

Step 2: count the results

I’m not interested in the information about the assessment but I am interested in the results. OData makes it easy to navigate from one data item to another. I just add “/Results” to the URL:

This URL gives me a similar list to the assessment list I had earlier, but this time I have one entry for each result of this assessment. Of course, there could be thousands, but for my application I only want to know how many. Again, OData has a way of finding this information out just by manipulating the URL: I append /$count.

By adding /$count to the URL I’m asking OData not to send me all the data, but just to send me a count of the number of items that it would have sent back. The result is a tiny plain text document containing just the number. If you view this URL in your web browser you’ll see the number appear as the only thing on the page.

I’ve now calculated the total number of results for my assessment without having to do anything more sophisticated than fetch a URL. But what I really want is the percentage of these results that represent a pass. It turns out I can use the same technique as before to filter the results and include only those that have passed. My assessment has Pass/Fail information represented using the ScorebandName.

Appending the following to the Results URL counts only the passing results: /$count?$filter=ScorebandName eq 'Pass'

Notice that by combining $count and $filter I can count how many passing results there are without having to view the results themselves. It is now trivial to combine the two values returned to your application by these URLs to display a percentage passed, or some other graphical representation such as a pie chart or part-filled bar.
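The whole two-URL technique can be sketched as a small function. This is a sketch under assumptions: the entity names (/Results, ScorebandName) follow the pattern used in this post, but you should check the feed for your own repository; the `fetch` parameter is injectable so the logic can be exercised without network access.

```python
from urllib.request import urlopen

def pass_percentage(assessment_url, fetch=None):
    """Return the percentage of an assessment's results in the 'Pass' score band."""
    if fetch is None:
        # Default: issue a plain HTTP GET and read the tiny plain-text body.
        # A real request would need the spaces in the $filter query
        # percent-encoded (e.g. with urllib.parse.quote).
        fetch = lambda url: urlopen(url).read().decode("ascii")
    total = int(fetch(assessment_url + "/Results/$count"))
    if total == 0:
        return 0.0
    passed = int(fetch(assessment_url + "/Results/$count"
                       "?$filter=ScorebandName eq 'Pass'"))
    return 100.0 * passed / total
```

Two GET requests, two integers, one division: that is the entire client-side cost of the report.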

As you can see, your developers don’t need a sophisticated library to write powerful client applications with OData. And these HTTP responses are just a few bytes in size, so they won’t use much bandwidth (or memory) in your application either. For additional resources and definitions of all the OData filters, you can visit the OData URI conventions page.

Questionmark Users Conferences offer many opportunities to learn more about Questionmark Analytics and OnDemand. Registration is already open for the 2014 Users Conference March 4 – 7 in San Antonio, Texas. Plan to be there!


LTI certification and news from the IMS quarterly meeting

Steve Lay HeadshotPosted by Steve Lay

Earlier this month I travelled to Michigan for the IMS Global Learning Consortium’s quarterly meeting. The meeting was hosted at the University of Michigan, Ann Arbor, the home of “Dr Chuck”, the father of the IMS Learning Tools Interoperability (LTI) protocol.

I’m pleased to say that, while there, I put our own LTI Connector through the new conformance test suite and we have now been certified against the LTI 1.0 and 1.1 protocol versions.

The new conformance tests reinforce a subtle change in direction at IMS. For many years the specifications have focused on packaged content that can be moved from system to system. The certification process involved testing this content in its transportable form, matching the data against the format defined by the IMS data specifications. This model works well for checking that content *publishers* are playing by the rules, but it isn’t possible to check if a content player is working properly.

In contrast, the LTI protocol is not about moving content around but about integrating and aggregating tools and content that run over the web. This shifts conformance from checking the format of transport packages to checking that online tools, content and the containers used to aggregate them (typically an LMS) are all adhering to the protocol. With a protocol it is much easier to check that both sides are playing by the rules, so overall interoperability should improve.

In Michigan, the LTI team discussed the next steps with the protocol. Version 2 promises to be backwards-compatible but will also make it much easier to set up the trusted link between the tool consumer (e.g., your LMS) and the tool provider (e.g., Questionmark OnDemand).  IMS are also looking to expand the protocol to enable a deeper integration between the consumer and the provider. For example, the next revision of the protocol will make it easier for an LMS to make a copy of a course while retaining the details of any LTI-based integrations. They are also looking at improving the reporting of outcomes using a little-known part of the Question and Test Interoperability (QTI) specification called QTI Results Reporting.

After many years of being ‘on the shelf’, there is renewed interest in the QTI specification in general. QTI has been incorporated into the Accessible Portable Item Protocol (APIP) specification, which has been used by content publishers involved in the recent US Race to the Top Assessment Program. What does the future of QTI look like? It is hard to tell at this early stage, but the buzzword in Michigan was definitely EPUB3.

Celebrating 25 years of change — from DOS to SaaS

Posted by John Kleeman

Considering all the security, availability and flexibility we can achieve today with cloud-based assessment management systems, it’s remarkable to look back at the many changes and milestones we’ve seen over the past 25 years.

I wrote the first version of Question Mark for DOS in 1987-88. When I launched the company in London in August 1988, I wanted to bring the benefits of computerized assessment to the world, but it was hard to foresee the dramatic technological changes that would transform our industry and make online assessment as widespread as it is today.

Coinciding with the rise of the PC, Question Mark for DOS empowered trainers and teachers to create, deliver and report on computerized assessments without having to rely on IT specialists.

Things have been changing quickly ever since. The early 1990s brought the move from DOS, functional but boring, to Windows, visual and graphical. This was radical at the time. To quote our marketing for Question Mark Designer for Windows, launched in 1993:

“Using Question Mark Designer, you can create tests using the full graphical power of Windows. You can use fonts of any size and type, and you can include graphics up to 256 colours. One of the most exciting features is a new question type, called the “Hot spot” question. This lets the student answer by “pointing” at a place on the screen.”

The switch to a visual user interface was huge, but the biggest paradigm shift of all was the move to delivering assessments over the Internet.

Pre-Internet, communicating results from assessments at a distance meant sending floppy disks by post. The World Wide Web made it possible to put an assessment on a web server, have participants answer it online and get instantly viewable results. This changed the world of online assessments forever.

Questionmark Technical Director Paul Roberts, who still plays an important role in Questionmark product development, wrote the code for the world’s first-ever Internet assessment product, “QM Web”, in 1995. We followed QM Web with the first version of Questionmark Perception, our database-driven assessment system, in 1998.

Eric Shepherd founded the U.S. division of Questionmark in the 1990s and in 2000 became CEO of what is now a global company. He is the heart and soul of Questionmark, an inspiring chief executive who has turned Questionmark from a small company into an industry leader.

One key paradigm shift in the 2000s was the desire to use surveys, quizzes, tests and exams in more than one department — across the entire enterprise. To make this practical, we began building scalability, reliability, translatability, accessibility, maintainability and controllability into our technologies. These attributes, along with multiple delivery options and in-depth reporting tools, are key reasons people use Questionmark today.

Cutting the ribbon at the Questionmark European Data Centre: its opening last year marked a major expansion of our cloud-based Questionmark OnDemand service.

In recent years, we’ve seen another dramatic change – towards software-as-a-service applications in the “cloud”.  Just as Question Mark for DOS 25 years ago empowered ordinary users to create assessments without needing much IT support, so Questionmark OnDemand today allows easy creation, delivery and reporting on assessments without in-house servers.

So what’s in store for the future? Technology is making rapid advances in responsive design, security, “big data”, mobile devices and more. Questionmark continues to invest around 25% of revenues in product development. The huge demand for online assessments is making this our busiest time ever, and we expect continued, rapid improvement.

I’d like to thank our customers, suppliers, partners, users and employees – whose support, collaboration and enthusiasm have been critical to Questionmark’s growth during our first 25 years. I look forward to continuing the journey and am eager to work with all of you to shape what happens next!

Case study: streamlining training with cloud-based assessments

Posted by Joan Phaup

We love to feature customer case studies about using Questionmark technologies in innovative ways. Founded in 1968, Spaulding for Children is one of the first child welfare agencies in the country to specialize in finding and training adoptive families for special needs children. Believing that every child is born with the right to a loving home and family, the agency works closely with other organizations to improve the lives of children who have suffered abuse, neglect, or abandonment.

More than 800 trainers affiliated with the agency collect evaluations from learners including social workers, care providers and others involved in adoption, foster care and related services.

Having moved from paper-based surveys to Questionmark Perception several years ago, Spaulding recently switched to the Questionmark OnDemand software-as-a-service platform. This change to cloud-based assessments has freed staff from dealing with technology. This in turn has given them more time to look after their core responsibilities.

For more details, check out our case study.

Integrating your LMS with Questionmark OnDemand just got easier!

Posted by Steve Lay

Last year I wrote about the impact that the IMS LTI standard could have on the way people integrate their LMS with external tools.

I’m pleased to say that we have just released our own LTI Connector for Questionmark OnDemand. The connector makes it easy to integrate your LMS with your Questionmark repository. Just enter some security credentials to set up the trusted relationships and your instructors are ready to start embedding assessments directly into the learning experience.

By using a standard, the LTI Connector enables a wide range of LMSs to be integrated in the same way. Many of them have LTI support built in directly too, so you won’t have to install additional software or request optional plugins from your LMS hosting provider.

You can read more about how to use the LTI connector with Questionmark OnDemand on our website: Questionmark Connectors.

You can also find out which tools are currently supporting the LTI standard from the IMS Conformance Certification page (which we hope to be joining shortly).

From Content to Tool Provider

The LTI standard, in many ways, does a similar job to the older SCORM and AICC standards. It provides a mechanism for an LMS to launch a student into an activity and for that activity to pass performance information (outcomes) back to the LMS to be recorded in their learning record.

Both the SCORM and AICC standards were designed with content portability in mind, before the Web became established. As a result, they defined the concept of a package of content that has to be published and ‘physically’ moved to the LMS to be run. The LMS became a player of the content.

Contrast this approach with that of IMS LTI. In LTI, the activity is provided by an external Tool Provider. The Tool Provider is hosted on the web and is identified by a simple URL; there is no publishing required! When the Tool’s URL is placed into the LMS, along with appropriate security credentials, the link is made. Now the student just follows an embedded link to the Tool Provider’s website where they interact with the activity directly. The two websites communicate via web services (much like AICC) to pass back information about outcomes.
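The "appropriate security credentials" in an LTI 1.0/1.1 launch are an OAuth 1.0 consumer key and shared secret: the LMS signs the launch form with HMAC-SHA1 so the Tool Provider can verify it came from a trusted consumer. The sketch below shows the general shape of that signing step; the URL, key, secret and parameter values are illustrative placeholders, not real credentials, and a production integration should use a maintained OAuth/LTI library rather than hand-rolled signing.

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote

def pct(s):
    # RFC 3986 percent-encoding, as required by OAuth 1.0 signature rules
    return quote(str(s), safe="~")

def sign_lti_launch(launch_url, consumer_key, shared_secret, params):
    """Add OAuth 1.0 HMAC-SHA1 signature fields to an LTI launch form."""
    oauth = {
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_timestamp": str(int(time.time())),
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_version": "1.0",
    }
    all_params = dict(params, **oauth)
    # Signature base string: HTTP method, the launch URL, and the sorted,
    # percent-encoded parameter pairs, each part encoded again when joined.
    pairs = sorted((pct(k), pct(v)) for k, v in all_params.items())
    param_str = "&".join("%s=%s" % kv for kv in pairs)
    base = "&".join(["POST", pct(launch_url), pct(param_str)])
    key = pct(shared_secret) + "&"  # no token secret in an LTI launch
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params

# A minimal launch form -- every value below is illustrative only.
form = sign_lti_launch(
    "https://tool.example.com/launch",
    "my_key", "my_secret",
    {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": "course-42-assessment-7",
        "user_id": "student-123",
        "roles": "Learner",
    },
)
```

The LMS renders these fields as a hidden auto-submitting form in the student's browser; the Tool Provider recomputes the signature with the same shared secret and, if it matches, trusts the launch.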

The result is simpler and more secure! It is no wonder that the LTI specification has been adopted so quickly by the community.