Summer webinars — including tips on better test planning and delivery

Posted by Joan Phaup

Students (and teachers) may be clicking their heels about summer vacation, but the joy of learning continues year-round for us!

Helping our customers understand how to use assessments effectively is as important to us as providing good testing and assessment technologies, so we're keeping our web seminars going strong during the summer months.

Here’s the current line-up:

Questionmark Customers Online: Using Questionmark and SAP for Effective Learning and Compliance — June 20 at 1 p.m. Eastern Time:

Learn about the use of Questionmark and SAP for a wide array of learning and compliance needs, including safety training, certifications and regulatory compliance testing. This presentation by Kati Sulzberger of BNSF Railway also describes how Questionmark helped the company meet some unique test delivery requirements.

Five Steps to Better Tests: Best Practices for Design and Delivery — July 18 at noon Eastern Time:

Get practical tips for planning tests, creating items, and building, delivering and evaluating tests that yield actionable, meaningful results. Questionmark Product Owner Doug Peterson, who will present this webinar, previously spent more than 12 years in workforce development. During that time, Doug created training materials, taught in the classroom and over the Web, and created many online surveys, quizzes and tests.

Questionmark Customers Online: Achieving a Better Assessment-Development Process — August 22 at 1 p.m. Eastern Time:

Need a better assessment building process? Find out how enterprise architecture principles can help you and your team work more efficiently. Tom Metzler, Knowledge Assessment Administrator at TIBCO Software, Inc., will explain how the company's certification team uses well-established software architecture principles to continually improve the efficiency of its assessment development process. Find out how using systematic processes and thorough documentation results in better information for subject matter experts, time savings and higher-quality assessments.

Introduction to Questionmark’s Assessment Management System — Choose from a variety of dates and times

This primer explains and demonstrates key features and functions available in Questionmark OnDemand and Questionmark Perception. Spend an hour with a Questionmark expert learning the basics of authoring, delivering and reporting on surveys, quizzes, tests and exams.

Click here for more details and free online registration.

Why mobile assessment matters

Posted by John Kleeman

Assessment on mobile phones and tablets matters because so many people have these devices, and there is a huge opportunity to use them. Sometimes it's easy to forget how rapid this change has been!

I’m indebted to my colleague Ivan Forward for this visualization showing the increase in mobile phone ownership in 10 years. It’s based on data from the South African census reported by the BBC.

See http://www.bbc.co.uk/news/world-africa-20138322 for the figures behind this graph.

As you can see, in the decade from 2001 to 2011, more South Africans gained access to electricity, flush toilets and higher education, but the change in use of mobile phones has been far more dramatic.

Figures in other countries will vary, but in every country mobile phone use has increased hugely.

Not only does this explain why mobile assessment matters, it also explains why so many organizations (including Questionmark customers) are moving to Software as a Service / on-demand systems. Because of the rise of mobile phones, the parallel rise in tablets and the fast-changing nature of mobile technology, you need your software to be up to date. And for most organizations, this is easier to do if you delegate it to a system like Questionmark OnDemand than if you have to update and re-install your own software frequently.

New Questionmark OnDemand release enhances analytics and mobile delivery

Posted by Jim Farrell

Questionmark has just released a major upgrade of our OnDemand platform, and I want to highlight some of the great new features and functionality now available to our customers.

Let’s start with my favorite. Questionmark released a new API known as OData, which allows Questionmark customers to access data in their results warehouse database and create reports using third-party tools like PowerPivot for Excel and Tableau. Through a client, a user makes a request to the data service, and the data service processes that request and returns an appropriate response.

You can use just about any client to access the OData API as long as it can make HTTP requests and parse XML responses. Wow…that's technical! But the power of the new OData API is that it liberates your data from the results warehouse and lets you build custom reports, create dashboards, or feed results data into other business intelligence tools.
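To make that a little more concrete, here is a minimal sketch of what such a client might look like in Python: it uses the requests library to fetch an OData feed and the standard library to parse the Atom XML response. The service URL, the credentials and the "Results" entity set name are placeholders chosen for illustration, not Questionmark's documented endpoints.

```python
# Minimal sketch of an OData client: fetch a feed over HTTP and parse the
# Atom XML response. URL, credentials and entity set name are placeholders.
import requests
import xml.etree.ElementTree as ET

BASE_URL = "https://ondemand.example.com/odata"  # placeholder service URL
AUTH = ("analytics_user", "secret")              # placeholder credentials

# Namespaces commonly used in OData Atom feeds.
NS = {
    "atom": "http://www.w3.org/2005/Atom",
    "m": "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata",
    "d": "http://schemas.microsoft.com/ado/2007/08/dataservices",
}


def fetch_results(top=10):
    """Fetch the first few records from a hypothetical 'Results' entity set
    and return them as plain dictionaries."""
    resp = requests.get(f"{BASE_URL}/Results", params={"$top": top}, auth=AUTH)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    records = []
    for entry in root.findall("atom:entry", NS):
        props = entry.find("atom:content/m:properties", NS)
        if props is None:
            continue
        # Strip the namespace from each property tag to get a readable key.
        records.append({el.tag.split("}")[-1]: el.text for el in props})
    return records


if __name__ == "__main__":
    for record in fetch_results():
        print(record)
```

From there, the parsed records could be handed to whatever reporting or business intelligence tool you prefer.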


The OData API is not the only update we have made to Analytics. The addition of the Assessment Content Report allows users to review participant comments for all questions within an assessment, topic, or specific question. Enhancements to the Item Analysis report include the ability to ignore question and assessment revisions. This report now also supports our dichotomously scored Multiple Response, Matching, and Ranking question types.
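In case the term is unfamiliar, "dichotomously scored" simply means all-or-nothing: the participant earns full marks only for an exactly correct response, with no partial credit. The snippet below is a purely illustrative sketch of that idea for a multiple response question; it is not Questionmark's scoring code.

```python
# Illustrative only: dichotomous scoring of a multiple response question,
# i.e. full marks only when the selected choices exactly match the key.
def score_multiple_response(selected, correct, max_score=1):
    """Return max_score if the selections exactly match the answer key,
    otherwise 0 (no partial credit)."""
    return max_score if set(selected) == set(correct) else 0


# Example: the answer key is choices A and C.
print(score_multiple_response({"A", "C"}, {"A", "C"}))  # 1 (exact match)
print(score_multiple_response({"A"}, {"A", "C"}))       # 0 (partial -> no credit)
```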

Another improvement I want to highlight is the way Questionmark now works with mobile assessments. An updated template design for assessments when taken from a mobile device embraces responsive design, enhancing our ability to author once and deploy anywhere. The new mobile offering supports Drag and Drop and Hotspot question types — and Flash questions can now run on all Flash-enabled mobile devices.

Click here for more details about this new release of Questionmark OnDemand.

Recommended Reading: Learning on Demand by Reuben Tozman

Posted by Jim Farrell

I don’t know about you, but I often feel spoiled by Twitter.

Being busy forces me to mostly consume short articles and blog posts with an attention span similar to my 6-year-old son's. Over the course of the year, the pile of books on my nightstand grows, and I fall behind on the books I want to read. My favorite thing about this time of the year (besides football and eggnog) is catching up on my reading.

One book that I’ve been really looking forward to reading, since hearing rumors of its creation by the author, is Learning on Demand by Reuben Tozman.

For those of you who are regulars at e-learning conferences, the name Reuben Tozman will not be new to you. Reuben is not one for the status quo. Like many of us, he is constantly looking for the disruptive force that will move the "learner" from the cookie-cutter, one-size-fits-all model that many of us have grown up with to a world where everything revolves around the context of performance. I put the word learner in quotes because Reuben hates the word. We are all learners all of the time in the 70:20:10 world. You are not only a learner when you are logged into your LMS.

Learning on Demand takes the reader through the topics of understanding and designing learning material with the evolving semantic web, the new technologies available today to make learning more effective and efficient, structuring content for an on-demand system, and key skills for instructional designers.

Each chapter includes real-world examples that anyone involved in education will connect with. This isn't a book that tells you to throw the baby out with the bathwater: there are a lot of skills that instructional designers use today that will help them be successful in a learning-on-demand world.

Even the appendix of case studies has nuggets to take forward and expand into your everyday work. My favorite was a short piece on work Reuben did with the Forum for International Trade Training (FITT). They called it a "J3 vision," which goes beyond training to performance support. The "Js" are: J1 – just enough, J2 – just in time (regardless of time and/or location), and J3 – just for me (delivered in the medium I like to learn in). (Notice I did not say learning style: that is a discussion for another time.) To me, this is the perfect way to define good performance support.

I think it would be good for Instructional Designers to put their Dick and Carey books into the closet and keep Reuben’s book close at hand.

Scalability testing for online assessments

Posted by Steve Lay

Last year I wrote a series of blog posts with accompanying videos on the basics of setting up virtual machines in the cloud and getting them ready to install Questionmark Perception.

This type of virtual machine environment is very useful for development and testing; we use a similar capability ourselves when testing the Perception software as well as new releases of our US and EU OnDemand services. One thing these environments are particularly useful for is scalability testing.

Scalability can be summarised as the ability to handle increased load when resources are added. We actually publish details of the scalability testing we do for our OnDemand service in our white paper on the “Security of Questionmark’s US OnDemand Service”.

The connection between scalability and security is not always obvious, but application availability is an important part of any organisation’s security strategy. For example, a denial-of-service or DoS attack is one in which an attacker deliberately exploits a weakness of a system in order to make it unavailable. Most DoS attacks do not involve any breach of confidentiality or data integrity, but they are still managed under the umbrella of security. Scalability testing focuses on the ‘friendly’ threat from increased demand but, like a DoS attack, the impact of a failure on the end user is the same: loss of availability.

As the popularity of our OnDemand service continues to increase, we've been ramping up our scalability testing, too. Using an external virtual machine service we are able to temporarily, and cost-effectively, simulate loads that exceed the highest peaks of expected demand. As more and more customers join our OnDemand service, the peaks of demand tend to smooth out when compared to a single customer's usage — allowing us to scale our hardware requirements more efficiently. Our test results are also used to help users of Questionmark Perception, our system for on-premise installation, provision suitable resources for their peak loads.

I thought I’d share a graph from a recent test run to help illustrate how we test the software behind our services. These results were obtained with a set of virtual resources designed to support a peak rate equivalent of 1 million assessments per day. The graph shows results from 13 different types of test, such as logging in, starting a test, submitting results, etc. The vertical axis represents the response times (in ms) for the minimum, median and 90th percentile cases at peak load. As you can see, all results are well within the target time of 5000ms.
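For readers curious how figures like these are typically derived, the sketch below shows one way to summarise recorded response times into minimum, median and 90th-percentile values per transaction type and compare them against a 5000 ms target. The transaction names and timings are invented for illustration and are not taken from our actual test runs.

```python
# Sketch: summarise load-test response times per transaction type,
# reporting min, median and 90th percentile and flagging anything over
# a 5000 ms target. All names and numbers below are invented examples.
import statistics

TARGET_MS = 5000


def summarise(samples_ms):
    """Return min, median and (nearest-rank) 90th percentile of a sample."""
    ordered = sorted(samples_ms)
    p90_index = int(round(0.9 * (len(ordered) - 1)))
    return {
        "min": ordered[0],
        "median": statistics.median(ordered),
        "p90": ordered[p90_index],
    }


# Hypothetical per-transaction response times (ms) gathered at peak load.
results = {
    "login": [310, 290, 450, 380, 520, 610, 300, 330, 410, 490],
    "start_assessment": [720, 680, 910, 850, 1020, 760, 690, 880, 940, 800],
    "submit_results": [540, 500, 620, 710, 680, 590, 560, 640, 600, 700],
}

for name, samples in results.items():
    stats = summarise(samples)
    status = "OK" if stats["p90"] <= TARGET_MS else "OVER TARGET"
    print(f"{name:>18}: min={stats['min']}ms median={stats['median']}ms "
          f"p90={stats['p90']}ms [{status}]")
```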

I hope I’ve given you a flavour of the type of testing we do to ensure that Questionmark OnDemand lives up to being a scalable platform for your high-volume delivery needs.

 

Podcast: John Kleeman on Questionmark’s expanding assessment cloud

Posted by Joan Phaup


We recently celebrated a major expansion of our cloud-based Questionmark OnDemand service with the addition of Questionmark’s European Data Centre to the service we have been providing for some time in the United States.

Wanting to learn more about the reasons for adding the new data centre, I spoke with Questionmark Chairman John Kleeman about the transition many customers are making to cloud-based assessment management.

Here are some of the points we covered:

  • why software-as-a-service offers a reliable, secure and cost-effective alternative to deploying assessments from in-house servers
  • the importance of giving customers the option of storing data either in the United States or the European Union
  • the extensive security procedures, system redundancy, multiple power sources, round-the-clock technical supervision and other measures that make for reliable cloud-based assessments
  • the relative merits of Questionmark OnDemand (cloud-based) and Questionmark Perception (on-premise) assessment management — and how to choose between them

You can learn more by listening to our conversation here and/or reading the transcript.