The Future Looks Bright

Posted by Jim Farrell

Snapshot from a “Future Solutions” focus group

Our Users Conferences are a time for us to celebrate our accomplishments and look forward to the challenges that lie in front of us. This year’s conference was full of amazing sessions presented by Questionmark staff and customers. Our software is being used to solve complex business problems. From a product standpoint, it is very exciting to bring these real-life scenarios to our development teams to inspire them.

So where do we go from here? The Conference is our chance to stand in front of our customers and get feedback on our roadmap. We also held smaller “Future Solutions” focus groups to get feedback from our customers on what we have done and what we could do in the future to help them. In the best of times, these are an affirmation that we are on the right path. This was definitely one of those years.

One of our Future Solutions sessions focused on authoring. During that session, Doug Peterson and I laid out the future of Questionmark Live. This included an aggressive delivery cycle that will bring future releases at a rapid pace. Stay tuned for videos on new features available soon.

Ok…enough about us. This conference is really about our customers. The panel and peer discussion strand of this year’s conference had some of the most interesting topics. John Kleeman has already mentioned the security panel with our friends from Pearson Vue, ProctorU, Innovative Exams and Shenandoah University.

Another session that stood out was a peer discussion on test defensibility, using the Angoff method to set cut scores. This conversation was very interesting to me as someone who once had to create defensible assessments. I am eager to see organizations utilize Angoff because not only do you want legally defensible assessments, you also want to define levels of competency for a role and be able to determine how that can predict future performance.

For those of you who do not know, the Angoff method is a way for Subject Matter Experts (SMEs) to estimate the probability of a minimally competent (“marginal”) student getting a question right. Attendees at this conference session were provided a handout that included a seven-step flowchart guiding them in the design, development and implementation of the Angoff method.
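
To make the mechanics concrete, here is a minimal sketch of the arithmetic behind a common variant of the Angoff method: SMEs rate each item, the ratings are averaged per item, and the per-item averages are summed to give a recommended cut score. The ratings below are invented for illustration.

```python
# A minimal sketch of the Angoff calculation, with invented ratings.
# ratings[sme][item] = an SME's estimate of the probability that a
# minimally competent ("marginal") candidate answers that item correctly.
ratings = [
    [0.70, 0.55, 0.90, 0.60],  # SME 1
    [0.65, 0.60, 0.85, 0.70],  # SME 2
    [0.75, 0.50, 0.95, 0.65],  # SME 3
]

num_items = len(ratings[0])

# Average the SMEs' ratings item by item
item_means = [
    sum(sme[i] for sme in ratings) / len(ratings) for i in range(num_items)
]

# The recommended cut score is the sum of the per-item averages:
# the expected raw score of a marginal candidate
cut_score = sum(item_means)
cut_percent = 100 * cut_score / num_items

print(f"Recommended cut score: {cut_score:.2f} / {num_items} ({cut_percent:.0f}%)")
```

With these invented numbers the sketch recommends a cut score of 2.80 out of 4, i.e. 70%. In practice the SME panel would also review discrepant ratings and iterate before settling on a final standard.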

If you are interested in Angoff and setting test scores I highly recommend reading Criterion-Referenced Test Development written by our good friends Sharon Shrock and Bill Coscarelli.

We really hope to see everyone at the 2013 Users Conference in Baltimore March 3 – 6. (I am hoping we may even get a chance to visit the beautiful Camden Yards!)

Paper-Free Smile Sheets, Real-time Insight

Posted by Brian McNamara

Last week was Questionmark’s 10th annual Users Conference – and the eighth one that I’ve personally been able to attend. At each of the previous conferences I’ve attended, I’ve been asked the same question: Why are you doing paper evaluations when you’re an online assessment company?

We actually had a very good reason: our main interest in doing session evaluations was to maximize the quality and quantity of the feedback we received. To maximize response rates, you have to make the survey as EASY as possible to answer. For smile sheets, paper trumped online… until now.

Enter the second decade of the 21st century:

  • Smartphones equipped with cameras are commonplace – many of us couldn’t imagine leaving home without ours.
  • QR Codes embedding website URLs are easy to come by, too: at grocery/department stores and malls, on billboards, in magazines, brochures, you name it.
  • QR Code-reading software, which essentially turns your Smartphone into a hand-held scanner, is freely available.
  • Questionmark’s auto-sensing, auto-sizing assessment delivery engine enables a single assessment to be delivered reliably and effectively on many different browsers and platforms – PC or mobile – without any extra steps for the author or administrator. Plus, you can pass key demographic information into the assessment results via the assessment URL.

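As a sketch of how the pieces above fit together, the snippet below builds one survey URL per conference session, with the session identity carried as query parameters so it arrives with the results. The base URL and parameter names are hypothetical placeholders, not Questionmark’s actual scheme; any QR library (such as the Python `qrcode` package) can then turn each URL into a printable code.

```python
from urllib.parse import urlencode

# Hypothetical delivery URL -- a real deployment would use the
# assessment platform's actual entry point
BASE_URL = "https://example.com/deliver/open.session"

def survey_url(session_id):
    # Context travels as query parameters, so it is captured with the
    # results without any extra steps for the respondent
    params = {"assessment": "session-eval", "session": session_id}
    return BASE_URL + "?" + urlencode(params)

# One URL (and hence one QR code) per conference session
for sid in ("S101", "S102"):
    print(survey_url(sid))
```

Printing each URL as a QR code on the session handout is then all it takes to put the survey one camera-scan away.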
Delegates at the 2012 Users Conference had two choices for providing session evaluation feedback: scan a QR code with their mobile and answer the survey online, or fill out a paper form (which would be scanned and uploaded via Questionmark Printing and Scanning following the event).

For those using mobiles, scanning the QR code would launch the assessment. The intro screen would display the name of the session they had just attended (just to be sure they scanned the right code).

While I was happy that the mobile smile sheet delivery would ease the burden of post-conference scanning and uploading of results, it was particularly useful to get a real-time view of how delegates were reacting to sessions – during the conference we ran our Course Summary report (see below) a few times a day to get a sense of which sessions were bringing the most benefit to our customers.
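
The report itself is Questionmark’s, but the kind of roll-up it provides can be sketched in a few lines: average the submitted ratings per session so you can see, mid-conference, which sessions are landing best. The data below is invented for illustration.

```python
from collections import defaultdict

# (session title, rating on a 1-5 scale) per submitted evaluation
responses = [
    ("Mobile Delivery", 5), ("Mobile Delivery", 4),
    ("Item Analysis", 3), ("Item Analysis", 4), ("Item Analysis", 5),
]

totals = defaultdict(lambda: [0, 0])  # session -> [rating sum, count]
for session, rating in responses:
    totals[session][0] += rating
    totals[session][1] += 1

summary = {s: t / n for s, (t, n) in totals.items()}

# Highest-rated sessions first
for session, avg in sorted(summary.items(), key=lambda kv: -kv[1]):
    print(f"{session}: {avg:.2f}")
```

Because mobile responses arrive as soon as delegates leave the room, a roll-up like this can be refreshed several times a day rather than weeks after the event.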

This was the first year we offered this option, and I was pleased to see that more than 20% of all the session evals submitted were done via mobile device! Next year we’ll shoot for 80%, and for 100% in 2014! Paper-free and loving it!

What I Learned Back in the Big Easy

Posted by John Kleeman

Questionmark customers and staff pose for a picture at the dessert reception

I’m just back from the 2012 Questionmark user conference in New Orleans “the Big Easy”. This is our second user conference in New Orleans – we were there in 2005, and it was great to re-visit a lovely city. Here is some of what I learned from Questionmark customers and speakers.

Most striking session: Fantastic to hear from a major US government agency about how they deployed Questionmark to do electronic testing after a 92-year history of paper testing. The agency have a major programme – with 800,000 items and 300,000 test takers. I hope to cover what they are doing in a future blog post.

Most memorable saying: Bill Coscarelli (co-author of Criterion-Referenced Test Development) noting that in his huge experience, the two things in testing that really make a difference are “test above knowledge” and “use the Angoff method to set a cut/pass score for your tests”.

Meeting a hero: You’ll have seen me writing on this blog about Bruce C. Aaron’s A-model, and it was great to meet Bruce in person for the first time and hear him talk so intelligently about the A-model as a way of measuring the effectiveness of training – starting from a business problem.

Most fun time: Got to mention the amazing food in New Orleans – especially the catfish and the oysters. But the highlight was being part of a parade down Bourbon Street with a marching band, all the Questionmark conference attendees and a full New Orleans ensemble.

Best conversation: Discussion with a user who’d come to the conference uncertain that Questionmark was the right system for them, but realized on talking to other customers and the Questionmark team that they were only using 10% of what Questionmark could offer them, and left the conference enthused about all they would do going forward.

Session I learned most from: a security panel chaired by Questionmark CEO Eric Shepherd with executives from Pearson Vue, ProctorU, Innovative Exams and Shenandoah University. There was a great discussion about the advantages of test center delivery vs remote proctoring. Test center delivery is highly secure, but testing at home with a remote proctor observing by video link is very convenient, and it was great to hear real experts debating pros and cons.

Most actionable advice: Keynote speaker Jane Bozarth: “what you measure is what you get”, a reminder that what we do with measurement will define behavior in our organization.

It was great to spend time with and learn from customers, partners and our own team. I look forward to seeing many of you again at the 2013 Questionmark user conference at Baltimore’s Inner Harbor, March 3-6, 2013.

PricewaterhouseCoopers wins assessment excellence award

Posted by Joan Phaup

Greg Line and Sean Farrell accept the Questionmark Getting Results Award on behalf of PwC

I’m blogging this morning from New Orleans, where we have just completed the 10th annual Questionmark Users Conference.

It’s been a terrific time for all of us, and we are already looking forward to next year’s gathering in 2013.

One highlight this week was yesterday’s presentation of a Questionmark Getting Results Award to PricewaterhouseCoopers.

Greg Line, a PwC Director in Global Human Capital Transformation, and Sean Farrell, Senior Manager of Evaluation & Assessment at PwC, accepted the award.

The award acknowledges PwC’s successful global deployment of diagnostic and post-training assessments to more than 100,000 employees worldwide, including 35,000 employees in the United States.

In delivering more than 230,000 tests each year — in seven different languages — PwC defines and adheres to sound practices in the use of diagnostic and post-training assessments as part of its highly respected learning and compliance initiatives. These practices include developing test blueprints, aligning test content with organizational goals, utilizing sound item writing techniques, carefully reviewing question quality and using Angoff ratings to set passing scores.

Adhering to these practices has helped PwC deploy valid, reliable tests for a vast audience – an impressive accomplishment that we were very pleased to celebrate at the conference.

So that’s it for 2012! But mark your calendar for March 3 – 6, 2013, when we will meet at the Hyatt Regency in Baltimore!

Conference kicks off in New Orleans

Dessert reception

Posted by Joan Phaup

Last night’s opening reception at the 10th annual Questionmark Users Conference in the U.S. brought many old friends back together and introduced newcomers to staff members and fellow customers.

I’ve just returned to the blog from today’s opening general session, which prompted enthusiastic responses about a number of announcements and demos. Among them:

Lynn Cram receiving her perfect attendance award

  • Details about Questionmark Perception version 5.4, which includes Questionmark Analytics and the ability to deploy observational assessments
  • The security of Questionmark’s OnDemand platform, starting with the many security measures taken at our SSAE 16-audited data center and covering data, network and application security – all reinforced through redundancy, monitoring and testing
  • Questionmark Live demonstrations of the new Math Formula Editor and the ability to embed YouTube videos
  • Using QR codes to launch assessments on mobile devices
  • New reports in Questionmark Analytics: Assessment Completion Time and Item Analysis
  • An update on Questionmark’s Open Platform
  • Our research about the role of assessments in mitigating risk

We had some fun honoring Lynn Cram from Progressive Insurance for having attended every single U.S. Users Conference since the first one in 2003. Congratulations to Lynn, who says she keeps returning to the conference because there’s always something new to learn!

You can follow the conference and learn some new things on Twitter at #QMCON, so check in whenever you like.

Open Platform, Security and Eating Out: looking forward to New Orleans

Posted by Steve Lay

Just a few days to go until the Questionmark Users Conference. As usual I’ve been perusing the conference programme, on the lookout for sessions of particular interest to people integrating with Questionmark technologies.

One that stood out to me was Using Innovative Technologies to Aid High-Volume Testing in Multiple Environments. The team from the Oklahoma Department of Career and Technology Education will be talking about their application, which was developed in partnership with Questionmark’s Solution Services team. If you want to get an idea of what can be achieved using our application programming interfaces (APIs), you might want to check out this session.

For people who want to get a bit more technical, I’ll be giving a session on Using Web Services to Integrate with Questionmark Perception. In that session I’ll dive a bit deeper into our web service APIs – called QMWISe – and update you on our progress towards the next generation of APIs using REST and OData – but that’s enough acronyms for one blog post!
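
As a small taste of the REST/OData direction, here is a sketch of building a query URL from standard OData system query options ($filter, $orderby, $top). The service root and entity set names are hypothetical placeholders for illustration, not actual Questionmark endpoints.

```python
from urllib.parse import quote

# Hypothetical OData service root -- a real integration would use the
# endpoint published in the API documentation
SERVICE_ROOT = "https://example.com/odata"

def odata_query(entity_set, filter_expr=None, top=None, orderby=None):
    """Build an OData query URL from standard system query options."""
    options = []
    if filter_expr:
        # $filter takes expressions like "Score gt 80"
        options.append("$filter=" + quote(filter_expr, safe="()'"))
    if orderby:
        options.append("$orderby=" + quote(orderby))
    if top is not None:
        options.append("$top=" + str(top))
    url = f"{SERVICE_ROOT}/{entity_set}"
    if options:
        url += "?" + "&".join(options)
    return url

# e.g. the ten results scoring above 80
print(odata_query("Results", filter_expr="Score gt 80", top=10))
```

Because these query options are part of the OData standard rather than a bespoke API, generic OData client libraries can consume such a service with no custom plumbing.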

A significant theme of the conference next week is security.  To an integration specialist, security often means protocols for authentication, authorisation and cryptographic algorithms.  But this traditional concept of security is just one part of a more complex picture.  I’m currently reading “Liars and Outliers” by Bruce Schneier, which provides an accessible survey of the wider social context from the perspective of an author traditionally associated with computer security.  I touched on some of these issues in a blog post a few months ago about a session at our European conference last Autumn.

On this theme, Richard Pierce, from Shenandoah University, will be talking about proctored versus non-proctored testing, grappling directly with the effects of the assessment setting on cheating. You might also like to attend the panel discussion on Managing Assessment Security as the Stakes are Getting Higher. It seems unfair to pick out just one or two sessions: as you’d expect, there are plenty more for people interested in a theme that affects every part of an assessment programme.

As usual, the product owners will be available in their own track so why not come and see us to talk about our product road maps in the Future Solutions strand?  Oh, and before I go, did I mention that I’ll also be talking about how to deploy the brand new Perception 5.4 OnPremise?

I look forward to meeting everyone, and even if you can’t make it in person don’t forget to keep up with the conference hash tag (#qm12).  Finally, I’ll leave you with one of my favourite quotes from the conference programme, in Richard Pierce’s words:  “Where are we eating tonight? It is New Orleans, for goodness’ sake!”

See you next week!