Video: How to create hotspot questions for quizzes and tests

Posted by Brian McNamara

In the past couple of months we’ve been excited to share new assessment authoring features that you can use in Questionmark Live.

We have most recently discussed the built-in math formula editor and a LaTeX formula editor.

Today, I have put together a video on how you can easily and seamlessly author a hotspot question in Questionmark Live. In this demonstration, I create a basic anatomy hotspot question and upload an image where a participant will mark their answer. This is a simple-to-use but very exciting tool that allows you to accurately define your hotspot region.

Check out the video below or in our Learning Cafe:
[youtube http://www.youtube.com/watch?v=v_RpEb80_wg]

This capability and many others will be demonstrated at the Questionmark Users Conference in Baltimore next week. Register online by Friday!

Meet Questionmark’s New Product Owner for Analytics

Posted by Jim Farrell

On Monday, Austin Fossey takes over as Questionmark’s Product Owner of Reporting and Analytics – the person in charge of our reporting tools. Austin will be working with our customers and product development teams to make sure our reporting and analytics tools solve real business problems.

Austin Fossey

The other day I asked Austin some questions to help you see what a tremendous addition he will be to the Questionmark team.

What has been your career track so far?

Most recently, I have been developing assessment and value-added reporting systems at the American Institutes for Research. Before that, I spent three years doing test development and psychometric work for a certification company in the construction industry. I also spent a year at the Independent Evaluation Group at the World Bank, which does program evaluation.

What are some of your career highlights?

I was very excited to be a member of a team that developed data training for K-12 educators. This training built upon previous programs I had implemented, and it really took data analysis to a higher level. The training focused on using data to drive instructional decisions at the class, school, and district level. What I liked about this project was that we did not just cover what their assessment data meant. Instead, we thought critically about the valid uses and limitations of the data, and we worked through real-life problems where data could be used to inform instruction.

Another highlight was the work I did developing a credential based on a portfolio assessment. This was particularly challenging because the portfolio products took many different forms, even though they all reflected the same domain. I got to work with some very creative psychometricians and subject matter experts to research and implement an assessment process that was standardized enough to be defensible, yet flexible enough to accommodate a wide range of candidate profiles. The project was especially fun for me because of the unique measurement challenges inherent in a portfolio assessment.

What attracted you to the position at Questionmark?

Questionmark stood out for me because they are a company that takes reporting and analytics seriously. Some companies seem to treat assessment reporting as an afterthought–a byproduct of the assessment. But in many ways, the reports are the manifestation of the goals of assessment. We collect data so that we can make valid, informed decisions, and this can only be done with efficient, comprehensive reporting strategies. When I saw that Questionmark had a position dedicated solely to reporting and analytics, I knew that this was a company I wanted to work for.

Psychometrics: that’s quite a word! Can you describe what it means in terms that all of us will understand?

It is quite a word–one that even spell checkers can’t handle. The most concise and comprehensive description I have read was from Mislevy et al. in the Journal of Educational Data Mining, 4(1). Psychometrics measures educational and psychological constructs with a probabilistic model. In short, psychometricians must take observable evidence (e.g., answers on a test) and make a probabilistic inference about something unobservable (e.g., the test taker’s true ability).
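To make that idea concrete, here is a toy sketch (my own illustration, not a Questionmark algorithm) using the Rasch model, one of the simplest psychometric models: the probability of a correct answer depends only on the gap between the test taker’s unobservable ability and the item’s difficulty, and we infer ability from the observed right/wrong answers.

```python
import math

def p_correct(theta, b):
    """Rasch model: probability that a test taker with ability theta
    answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, lo=-4.0, hi=4.0, steps=2000):
    """Grid-search maximum-likelihood estimate of ability, given
    observed answers (1 = correct, 0 = wrong) on items of known difficulty."""
    best_theta, best_ll = lo, float("-inf")
    for i in range(steps + 1):
        theta = lo + (hi - lo) * i / steps
        ll = 0.0
        for x, b in zip(responses, difficulties):
            p = p_correct(theta, b)
            ll += math.log(p if x == 1 else 1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

# Four items of increasing difficulty; the taker misses only the hardest,
# so the inferred ability lands between the third and fourth difficulties.
theta_hat = estimate_ability([1, 1, 1, 0], [-1.0, 0.0, 1.0, 2.0])
```

Real psychometric software uses far more sophisticated estimation, but the shape of the inference – observable answers in, probabilistic ability estimate out – is the same.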

What do you hope to achieve as Product Owner of Analytics?

I am going to continue to build on Questionmark’s reports so that they best suit the needs of our clients and the changing practices in the assessment industry. I plan on using my background in psychometrics and reporting to make sure that our reports are designed and organized in a way that lets users quickly and intuitively leverage their data to make good decisions about their tests and testing programs. I am very fortunate because Questionmark already has a great reporting and analytics structure, and I am excited to continue that work.

How has the field of measurement and evaluation changed in the last 10 years?

In my opinion, measurement and evaluation has changed substantially over the past ten years as a result of improved technology. Fantastic work is being done to improve the accuracy, validity, and reliability of assessment. It is great to see organizations researching new item types, methods for shorter tests, and models that provide diagnostic feedback to test takers. Unfortunately, with such easy access to testing software, there are the occasional cases of “testing for testing’s sake,” where tests are administered without consideration for design or how the results will be used.

Overall, I am optimistic about the course of measurement. There is increased accountability, more support for methods like Evidence Centered Design, and better training and tools available for professional test developers.

How do you see psychometrics contributing to the future of learning?

Psychometrics is contributing to the future of learning by helping educators and students make sense of patterns in the data. While statistical models will never be a replacement for the expertise and instincts of a trained educator, assessments can be a handy tool for understanding students’ strengths, weaknesses, and work strategies. Tests no longer have to be about classifying who passed and who failed. They can now help provide diagnostic feedback so that educators and students can interpret their performance and adjust their learning accordingly. In this way, I think psychometrics has a large role to play in bridging the gap between grades and learning.

What are some of your current research interests?

Right now, I am interested in evidence models for task assessments like games and simulations. Technology lets us create some stunning virtual environments, but the research around how to score these assessments is still in an inchoate stage. While expert ratings and rubrics remain the standard for scoring these assessments, I am researching how we can model difficulty and discrimination in a complex task environment, especially as they relate to the student’s observed process for solving the task.

Tell me some things that interest you outside of work.

I love being around good friends, good food, and good music. I love to travel and go camping with my family, but most weekends you will find us puttering around our plot in our community garden or riding our bikes around the city. I also like working with my hands a lot–I help plant trees with a local nonprofit, and I try to fix stuff around the house (with mixed results).

If you’re attending the Questionmark Users Conference in Baltimore next week, please introduce yourself to Austin!

Tips for delivering effective compliance assessments

Posted by Julie Delazyn

Our white paper, The Role of Assessments in Mitigating Risk for Financial Services Organizations, describes five stages of deploying legally defensible assessments:

Compliance five steps

I’ve shared pointers about Planning, Deployment and Authoring in previous posts. For today, here are some tips about Delivery.

Some of these recommendations are specific to Questionmark technologies, but most can be applied to any testing and assessment system:

Delivery chart

Click here to read the paper, which you can download free after login or sign-up.

If you are interested in this topic and are eager to learn more about the effective use of online assessments, join us at the Questionmark Users Conference for a full program of case studies, discussions, presentations on best practices, and instruction in the use of the latest Questionmark technologies. Sign up while there’s still time. See you in Baltimore!


Hard and soft defences in our castle in the cloud

Posted by John Kleeman

How do you communicate the security of a service like Questionmark OnDemand? I find the concept of a castle useful in explaining security. Back in medieval days, people stored safe things – their “crown jewels” – inside a castle to protect them. And today, websites that store confidential information need to set up a “castle in the cloud” to protect data.

Let’s look at a castle’s defences: hard defences such as walls and moats, and soft defences such as the guards who man the watch towers and entry points:




How do a castle’s hard and soft defences translate into defences for software-as-a-service in the Cloud?

Hard defences

A castle has a moat and layers of walls. Questionmark OnDemand has firewalls, and it is tiered so that data moves from the presentation tier to the business tier to the data tier to protect the data. (See What’s the key to reliable assessment management? for more on tiers.)

And a castle has watch towers. Questionmark OnDemand has intrusion detection: automatic systems that keep watch for inappropriate traffic.

A castle also has limited entry points. Questionmark OnDemand has limited entry points, too: it only lets certain types of Internet traffic in. For example, all browser traffic must use a sufficiently recent level of HTTPS.
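As a purely illustrative sketch (not Questionmark’s actual configuration), here is how a service built on Python’s standard library could refuse any connection that falls below a minimum TLS level:

```python
import ssl

# Illustrative only: build a server-side TLS context that rejects
# any client attempting to connect with a protocol older than TLS 1.2.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2
# In a real server, the context would then load a certificate with
# context.load_cert_chain(...) and wrap the listening socket with
# context.wrap_socket(sock, server_side=True).
```

Whatever the stack, the principle is the same as the castle gate: traffic that does not meet the entry requirements never gets inside.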

Soft defences

A castle is only strong if it has alert guards to protect it. In the medieval world, you needed trained guards on duty 24/7 in case of intruders. You also needed to carefully check identity and authorization in case someone came in to steal the crown jewels by deceit.

Similarly, in a service like Questionmark OnDemand, we have authentication and authorization systems. We also have people behind the scenes who are trained to detect security risks. For instance, everyone at Questionmark is trained on data security and has to take and pass a data security test each year.

I hope the castle analogy amuses and informs. But of course, the real point is that your assessment data is well protected in our castle.

For more information on the security of Questionmark OnDemand, see our security white paper. Watch this blog for future articles on security – or in the meantime, feel free to check out my earlier post 13 Scary Questions to Ask your Assessment Cloud Provider.


The Portalization of eLearning and Assessment

Posted by Joan Phaup

Social networking, wikis, blogs, portals and collaboration tools offer powerful ways to increase participation and sustain momentum in learning communities – but how can these tools be blended to create strong learning ecosystems?  Enterprise applications such as Microsoft SharePoint can play a powerful role here, through content management and information sharing.

I talked about this the other day with Bill Finegan, who is Vice President – Enterprise Technology Solutions at GP Strategies. He is an expert on information systems and operations management, with a heavy focus on learning technology.

Bill will be explaining Best Practices for Leveraging SharePoint in your Learning Infrastructure at the Questionmark Users Conference March 3 – 6 in Baltimore. His presentation will help people who are trying to map out their strategy for the evolution of their learning technology as well as technical professionals who are interested in discussing how to link systems together.

How are enterprise portal applications changing the learning landscape?

Bill Finegan

What we’re seeing in larger organizations is the need to move to what I tend to call a composite application approach (a mash-up), where the different enterprise applications are being linked to from one landing site, one location, one major hub.

This is allowing learners to get to their learning from the company portal site, one consolidated spot. It’s allowing organizations to tie learning into other enterprise applications such as SharePoint and enterprise learning applications like SuccessFactors, SumTotal, and Questionmark — something they haven’t been able to do in the past.

I’ve heard you refer to the “portalization” of eLearning. Could you talk about that development?

We’ve seen it from two angles. One is the Learning Management Systems attempting to make themselves into that overall portal. You’ll see such interfaces in the latest versions of SuccessFactors, SumTotal, Moodle, etc. They may have a portal look and feel, with links out to different applications.

That being said, a lot of our customers are layering products like SharePoint on top of the LMS for easier operability. Suppose they were using SuccessFactors as the LMS and Questionmark for assessments. They would leverage SharePoint as their intranet. If they had an existing intranet site, they would add a subpage for learning, to give a more design-centered approach. With most LMSs moving to a software-as-a-service (SaaS) environment, the portal allows a more personalized look and feel while not interfering with what you are doing from a SaaS perspective (no customizations, etc.), allowing for cleaner upgrades and so on.

How can organizations make sense of all these different possibilities?

By deciding on their approach to collaboration and their overall approach to social learning. Are they looking for a Facebook-type approach? An Amazon-type approach? What systems do they want to use, and how do they want to connect the dots? If they have five different systems but don’t want to go to five different pages, how do they want users to get where they need to go? Do they want to integrate through their LMS? Or do they want to put a portal on as the interface to their systems and use it to provide discussion threads and other collaboration tools?

How do you envision SharePoint, Questionmark and other systems working effectively together?

I view it as allowing for Questionmark functionality to get linked from and applied at a “presentation-level” perspective from SharePoint for notifications of available assessments — and to allow Questionmark to be the assessment engine underneath the portal. The portal would work the same way with other applications. The main learning technology applications become the proverbial “engine underneath the hood,” powering the systems in place but allowing for a more flexible and intuitive interface.

What would you like your audience to take away?

That Questionmark is ready for portalization, that portalization fits in with integrating Questionmark cleanly with other learning technical applications and that Questionmark and SharePoint can fit together in an overall mash-up/composite application approach.

You can learn more about this at the conference and choose from more than 30 other presentations. Register online today!

Authoring compliance-related assessments: good practice recommendations

Posted by Julie Delazyn

Last week I wrote about deploying compliance-related assessments, as part of a series of posts offering good practice recommendations from our white paper, The Role of Assessments in Mitigating Risk for Financial Services Organizations.

This paper describes five stages of deploying legally defensible assessments, along with specific recommendations for people in different job roles. Some of these recommendations are specific to Questionmark technologies, but most can be applied to any testing and assessment system.

The five stages:

Compliance five steps
Today, let’s look at good practice for the third stage: authoring. You will find more recommendations in the white paper:

authoring chart

Sharon Shrock & William Coscarelli’s Criterion-Referenced Test Development: Technical and Legal Guidelines for Corporate Training provides actionable, practical advice on test development. Sharon and Bill will conduct a workshop on writing valid, reliable tests in Baltimore on Sunday, March 3. Participants will explore testing best practices and will learn how to meet rigorous competency testing standards.

You can register for this workshop when you register for the Questionmark Users Conference or add the workshop later. It’s up to you!
