With customers now using the Questionmark OData API to harvest meaning from their assessment results with greater freedom and flexibility, we are excited to hear about the broader implications of open data and the opportunities for learning organizations to make the most of it.
The Open Data Protocol, an industry standard for accessing data sources via the web, provides new opportunities for reporting and analyzing assessment results. OData feeds can be consumed and analyzed by many leading business intelligence applications, providing new options for custom reports and dashboards.
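As a rough illustration of how an OData feed is queried, the sketch below builds a request URL using the protocol's standard query options ($filter, $select, $top). The service root and entity names here are invented placeholders, not Questionmark's actual OData schema:

```python
from urllib.parse import urlencode, quote

def build_odata_query(base_url, entity, filter_expr=None, select=None, top=None):
    """Build an OData query URL using the standard $filter/$select/$top options."""
    options = {}
    if filter_expr:
        options["$filter"] = filter_expr
    if select:
        options["$select"] = ",".join(select)
    if top is not None:
        options["$top"] = str(top)
    # Keep the literal "$" of OData system query options unescaped.
    query = urlencode(options, safe="$", quote_via=quote)
    return f"{base_url}/{entity}" + (f"?{query}" if query else "")

# Hypothetical feed of assessment results, filtered to passing scores.
url = build_odata_query(
    "https://example.com/odata",   # placeholder service root
    "Results",
    filter_expr="Score ge 80",
    select=["ParticipantName", "Score"],
    top=10,
)
print(url)
```

A URL like this can then be fetched by any HTTP client or pointed at directly from a BI tool that understands OData.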
I asked Bryan recently about what the advent of OData will mean, particularly with regard to learning and measurement:
The concept of open data is being talked about a lot these days. Why is it important?
I’ve got to confess that I’m a total data junkie, so I get very excited about the endless possibilities of open data. Think of how much data is being collected on a daily basis all over the world from scientific discoveries, government data, opinion polls, and even learning. Gartner recently said that companies are collecting 300% more data than they did 4 years ago. It’s crazy. But consider what we can discover by selectively combining data in meaningful ways – something OData enables us to do.
Here’s a non-learning example of how powerful data can be: The CDC (Centers for Disease Control) collects information showing incidence of heart disease on a county-by-county basis. A group called Third World Congress on Positive Psychology developed a way to parse through Twitter feeds across two counties, analyzing the use of 40,000 words in over 80 million tweets.
By combining these separate data feeds, they discovered a correlation between a positive attitude and a lower risk of heart attack. This is a rather simple example, but just think of what kinds of patterns we will find as open data takes hold.
What is the significance of OData for testing and assessment professionals?
If you want to discover patterns of behavior that make companies successful, what better variable to plug in than testing and assessment data?
Case in point: A while ago I worked with a large software company. Independent, outside data suggested that customers felt the company’s help desk support team often lacked knowledge in specific technical areas. We went in, created an assessment (using Questionmark, by the way!) and produced a gap analysis across 60 technical skill areas for all help desk support staff.
We turned the data into a gap analysis heat map with red, yellow and green indicators showing a range of levels from expertise down to lack of knowledge. When we presented their senior management with the map, several things became immediately clear: where training was needed, where the wrong person was on the wrong team, and a whole lot more.
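The color coding behind that kind of heat map can be as simple as bucketing each skill area's average score. This is a minimal sketch with invented skill names and thresholds, not the actual analysis from that project:

```python
def gap_color(score, red_below=50, green_at=80):
    """Map an average skill score (0-100) to a heat-map indicator."""
    if score < red_below:
        return "red"      # significant knowledge gap
    if score < green_at:
        return "yellow"   # partial knowledge
    return "green"        # demonstrated expertise

# Hypothetical average assessment scores per technical skill area.
skills = {"Networking": 42, "SQL": 73, "Security": 91}
heat_map = {skill: gap_color(score) for skill, score in skills.items()}
print(heat_map)  # {'Networking': 'red', 'SQL': 'yellow', 'Security': 'green'}
```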
This was great as far as it went, but it was just a single snapshot in time…it wasn’t ongoing. I think back on that project and wonder how much more impressive it would have been if the data had been continually measured, linked to a dashboard and regularly compared to the independent audit of customer responses.
OData makes this possible.
How can the use of open data impact learning and performance?
First, with open data, it’s relatively easy to flow the results of learning, testing and assessment right into the performance review process. I’ve been watching the major talent management vendors who have tools to conduct annual performance reviews, do staff planning, succession planning and pay for performance; many of them are gradually adding OData feeds (both in and out of their systems). That level of interoperability is already starting to take shape. The bigger win is the ability to link learning with organizational performance: Kirkpatrick Level 4!
Most of us feel very comfortable collecting Level 1 and Level 2 data: Do they like the training? And are they learning, as measured by comparing pre- and post-test scores? Some know how to do Level 3 by sending out delayed questionnaires asking what skills are being used on the job, or through performance observation. But open data is really the enabler for linking learning and performance with key company metrics such as income, productivity, retention and lots of other bottom-line results — especially as other parts of the business make their data available through open channels.
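That pre/post comparison for Level 2 can be done in a few lines. The sketch below uses normalized gain (the fraction of possible improvement actually achieved), one common way to compare learners who start at different levels; the scores are made up for illustration:

```python
def learning_gain(pre, post):
    """Normalized gain: fraction of the possible improvement actually achieved."""
    if pre >= 100:
        return 0.0  # no room left to improve
    return (post - pre) / (100 - pre)

# Hypothetical pre/post assessment scores for three learners (0-100 scale).
scores = {"A": (40, 70), "B": (60, 90), "C": (80, 85)}
gains = {who: round(learning_gain(pre, post), 2)
         for who, (pre, post) in scores.items()}
print(gains)  # {'A': 0.5, 'B': 0.75, 'C': 0.25}
```

Note that learner B shows the largest normalized gain even though A and B improved by the same raw number of points.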
How will your keynote address relate to the specific interests of Questionmark customers?
I’m not a technical guru when it comes to open data…not by a long shot. There will be others there who can tell you all about Questionmark’s OData capabilities. But I really think the hardest part of this is re-imagining how data can be creatively combined to paint the whole picture, or at least understanding what others might do with data that we make available through testing and assessment.
I’ll be sharing several examples of innovative approaches, but that’s just the tip of the iceberg. If our organizations really are collecting 300% more data than 4 years ago, there are simply too many data streams to combine them all. So we need to start by keeping things simple and figure out which data streams can get us where we need to go.
If I do my job well, we’ll all start dreaming of ways we can marry data together and apply meaning. And before long, we can expect to see some very creative dashboards linking learning data with actual business performance.
Learn more about the conference program, which includes two new pre-conference workshops: Test Development Fundamentals and The Art and Craft of Item Writing. Register for the conference by January 30 to save $100. Another current learning opportunity: 3-day Questionmark training in Las Vegas February 4 – 6.