How much do you know about assessment? Quiz 1: Cut Scores

Posted by John Kleeman

Taking quizzes is a fun way to learn. One of the Questionmark blog’s most popular-ever entries was Howard Eisenberg’s Take our Quiz on Writing Good Test Questions, back in 2009.

Readers told us that it was instructive and engaging to take quizzes on using assessments, and we like to listen to you! So here is the first of a new series of quizzes on assessment topics. This week’s quiz is on setting a cut score (pass score).  The questions, written for us by Neil Bachelor of Pure Questions, are about what to do when designing a diagnostic test for safety procedures.

We regard resources like this quiz as a way of contributing to the ongoing process of learning about assessment. In that spirit, please enjoy the quiz below and feel free to comment if you have any suggestions to improve the questions.

Now on to the quiz! Be sure to look for your feedback after you have completed it!

My “#FollowMonday” Picks

Posted by Jim Farrell

I don’t know about the rest of you, but Twitter and Google+ have become my main ways of keeping up with industry happenings (as well as interesting celebrities).

The highlight of my week in terms of professional and personal growth is #FollowFriday. For those of you who are new to Twitter, every Friday people share lists of others they find interesting and think their followers should pay attention to. So, since today is Monday, consider this #FollowMonday, Jim Farrell style, with 10 people and companies I think are worth following.

@Questionmark – Did you really think I was going to start with something different? Following Questionmark gives you blog articles, videos and case studies for people interested in assessments.

@JaneBozarth – Jane is a fellow North Carolinian and the E-Learning Coordinator for the North Carolina Office of State Personnel. That title is impressive, but what is more impressive are her books, articles and blogging about learning and e-learning. A must-follow!

@ambermac – Amber Mac was one of the keynote speakers at mLearnCon 2011 and is the host of a television show in Canada on emerging technologies. Her book, Power Friending, is a must-have for people in the world of communication and social media.

@Quinnovator – If you have been to an e-learning conference, you know Clark Quinn. His sessions, blog articles and books on learning are always extremely thought-provoking and lately seem to be focusing on mLearning.

@MegSecatore – Megan Secatore is an instructional technologist I follow most closely during e-learning conferences (although since she is in Boston, my hometown, I enjoy her non-work tweets too). She is an extremely active tweeter, and she’s so good at it that I can learn all about the various conference sessions she attends by reading her tweets. She could outdo the play-by-play announcers at most sporting events!

@WillWorkLearn – Will Thalheimer is an old friend of Questionmark (and by that I mean a longtime friend, not that he’s old). I would consider Will a learning visionary. He is not a heavy tweeter, but he is worth following to see which conferences he is attending.

@JohnKleeman – John is Questionmark’s founder and chairman. John is a very active tweeter and is currently passionate about Questionmark’s integrations with SAP and SharePoint.

@eLearningGuild – Besides having the second-best conferences (behind Questionmark’s, of course), the eLearning Guild is a tremendous source of information from e-learning, assessment and learning professionals. A definite must-follow.

@ASTD – Like eLearning Guild, the national and local ASTD chapters are a great way to learn about conferences and find interesting people to follow.

@marciamarcia – Marcia Conner was one of the first people I followed on Twitter. She has a great book titled The New Social Learning that I have now read twice. She speaks at many learning conferences and is definitely someone to check out.

Conference Close-up: Sustaining large-scale e-assessment

Posted by Joan Phaup

The University of Bradford in the U.K. is delivering four times as many e-assessments as it did four years ago – about 60,000 annually these days.

John Dermo from the University’s Centre for Educational Development will tell how this came about – and how the university sustains this high level of assessment – during a case study presentation at the Questionmark European Users Conference in Brussels this October.

John Dermo

John’s session will build on some tips he shared in a previous blog article, but I asked him for a few details about what’s happening at the university and what he’ll be sharing at the conference.

Tell me about your work.

I’m responsible for technology-enhanced formative and summative assessment at the university. I work with a range of people involved in assessment in different ways: academic staff, administrators, IT support, the exams office and the invigilators. As well as initiating changes, I’m the go-between for the different groups.

A couple of projects that took place between 2007 and 2009 have paved the way for expansion and innovation in the area of assessment. Before that, we had limited, ad hoc use of e-assessment, but demand was building up so we built support systems to meet it.  We created a workflow model and figured out exactly who did what, and we aimed to make the whole thing scalable. We also built a new e-assessment room to help build up the summative, high-stakes side of our assessment programme.

How has e-assessment at the university changed in the past few years?

With summative assessments, we’ve seen an increase in the speed with which we can get grades to students. Also, it’s now possible to use more multimedia, particularly high-res photographs. The sort of thing that’s too costly on paper is more practical on the screen. We can also run more authentic types of assessments: we might combine a standard multiple choice assessment with some other online or computer-based tool.

For low-stakes and formative assessments the impact has been slightly different. There has been an increase in the amount of feedback that can be given, and certainly where that’s used it has been very popular with students. There has also been an increase in regular low-stakes assessments, so it has certainly  affected the way in which people use blended learning. There’s more interaction now than there was before.

What are the key issues and challenges in achieving sustainable development of e-assessment?

The key things are communication and knowing who does what at what point in the process. It’s easy to think that someone else is going to do a particular task, but that may not be so. Forward planning, fallback plans and communication between the roles are absolutely vital. It’s also important to give staff a certain level of autonomy. Yes, we need processes, but we need to allow for flexibility.

Another thing is keeping training and support as flexible as possible. Some people want to use e-assessment on a regular basis, others just once or twice a year, so we often need to deliver support and training on a just-in-time basis as well as through more structured programmes. But it’s important to be realistic about what you can do for people. You can’t do everything yourself, so you have to set realistic goals and negotiate the most practical way of delivering assessments, managing the workload between different groups as needed.

A big challenge is how to deal with the pioneers who drive innovation. Whilst of course you encourage the pioneers and the innovators, it’s more effective if you weave their enthusiasm into their teams. Relying on just one person won’t make innovation sustainable across the institution. A pioneer might move on, retire, something like that, and where does that leave you?  Having innovators share with those around them helps build a sustainable future.

Also, make sure you have some sort of institutionally recognized policy about assessment and keep revisiting it and documenting any changes that you make.

How will your session help people from other institutions expand their use of e-assessment?

Mainly by sharing my experience over the years in an institution where we have seen this growth. I’ll try to draw out practical tips that people can take away. I also want to give people the opportunity to share their own experiences.

I find the Questionmark users community really very, very supportive. It’s one of the reasons I’m attending the conference, in addition to being in constant contact with some members of the community.  I think the more we can share our experiences the better.

What are you looking forward to at the conference?

A lot! I’m looking forward to meeting up with some people I haven’t seen in a couple of years. I’m also very interested in the new functionality in Questionmark Perception version 5 because we are in the process of upgrading. And I want to learn about integrating v5 with other tools, in particular virtual learning environments.

There’s still time to register for the conference. Click here to learn more.

Key Innovation Drivers for Learning Environments

Posted by Julie Delazyn

It seems we’re in the midst of a revolution that will dramatically change the way we learn. In a recent post  about Key Innovation Drivers for Learning Environments on his own blog, our CEO, Eric Shepherd, suggests that key goals of these new environments will be to:

  • Use competency maps to understand where  learners are and help them navigate to where they want to be
  • Magically expose content at the moment of need and in the right context

Eric regards technology as crucial to turning us away from the formal, academic learning model (reliant on hard-to-find content  such as  books stored in libraries) to learning that is more personalized, accessible and shareable. This change will rely on learning content that is available and discoverable, inter-system data sharing that allows personalization and data flows that keep stakeholders engaged.

Eric describes innovation drivers that would help make all this happen and in the process reduce learner distraction, make learning easier anywhere, anytime, and create a more enjoyable, personalized learning experience. These drivers include:

  • Funding of open educational resources based on open standards
  • Inter-system data sharing to allow personalization
  • Standard integrations that allow one environment to launch another system with learner context
  • The creation of registries that make learning resources easier to find and share
  • The use of library science techniques for content classification

There’s a lot more detail in Eric’s post, so click here to read the whole article. If this and other topics about assessment and learning interest you, check out Eric’s blog.

Podcast: Demonstrating the business value of training initiatives

Posted by Joan Phaup

How do learning organizations demonstrate the business value of their initiatives?

I was delighted to speak recently with Art Dobrucki, director of learning strategy and performance at Farmers Insurance Group, about how he and his colleagues approach this question.

Art Dobrucki

We talked about how important it is to start out by identifying the business problem to be solved, then basing learning and evaluation on desired business results and the behaviors people need to exhibit in order to achieve them.

Going on to identify the knowledge, skills and attitudes that people need in order to exhibit those behaviors provides a framework for building and developing a strategic learning program with measurable results.

We also discussed the importance of coupling learning initiatives with a measurement strategy, the role of assessments in measurement and the effective use of training scorecards. It all adds up to demonstrating that learning is an essential driver of organizational success.

I hope you enjoy this podcast of our conversation:

Conference Close-up: Options for Integrating with Questionmark Perception

Posted by Jane Townsend

I’m pleased that Steve Lay, Questionmark’s Integration Team Lead, will be with us at the Questionmark European Users Conference in Brussels to talk about various ways to integrate Questionmark Perception with other systems.

Steve’s presentation will focus on using QMWISe (Questionmark Web Integrated Services Environment) to integrate with Questionmark Perception.

Steve Lay

I asked him for a few details about his subject:

What are the key methods for integrating Questionmark Perception with other systems?

The first and simplest is something we call PIP – Perception Integration Protocol. This enables people to create special web links that take participants straight in to launch and take a test. We can even include things like single sign-on in this type of integration.
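
As a rough illustration of the idea, the sketch below assembles such a launch link in Python. The server address and query parameter names are hypothetical placeholders chosen for this example, not the actual PIP syntax:

    # Illustrative sketch only: building a PIP-style launch link.
    # The server URL and parameter names are hypothetical placeholders,
    # not the documented PIP parameters.
    from urllib.parse import urlencode

    PERCEPTION_URL = "https://perception.example.com/perception.php"  # hypothetical server

    params = {
        "session": "1234567890",  # placeholder identifier for the assessment session
        "name": "jsmith",         # placeholder identifier for the participant
    }

    launch_link = PERCEPTION_URL + "?" + urlencode(params)
    print(launch_link)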

The second is to use package standards, such as SCORM or AICC, which internally build on the PIP protocol and provide a standards-based approach to integrating with learning content.

Our other method of integration, called QMWISe (Questionmark Web Integrated Services Environment), is a web services-based integration. You have to be a programmer to work with QMWISe, but the reward is a much more powerful integration, for example the synchronization of people information with Perception. You get many more options with this method, which more than compensates for the extra work needed at the programming stage.

Could you tell me more about the more powerful integrations that QMWISe makes possible?

With QMWISe, you can automatically schedule assessments to people and manage other aspects of the assessment process in a variety of ways. It can be used in a portal to integrate user data and for single sign-on: you can then connect straight through from the portal to Perception. It enables the synchronization of people information from, say, HR systems or student information systems:  in particular you might track changes to the people information held in these external systems. Custom programs can then push these changes through QMWISe to Perception.
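
To make that concrete, here is a minimal sketch of the kind of custom program described above, pushing an update of people information to a QMWISe endpoint over SOAP. The endpoint URL, SOAP action and XML element names are assumptions made purely for illustration; the actual QMWISe methods and schema should be taken from Questionmark’s documentation:

    # Rough illustration only: posting a SOAP request to a QMWISe-style endpoint.
    # The URL, SOAPAction and XML element names below are hypothetical assumptions,
    # not the documented QMWISe interface.
    import requests

    QMWISE_URL = "https://perception.example.com/qmwise/service.asmx"  # hypothetical

    soap_body = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <UpdateParticipant xmlns="http://example.com/qmwise">
          <Participant>
            <Name>jsmith</Name>
            <Email>jsmith@example.com</Email>
          </Participant>
        </UpdateParticipant>
      </soap:Body>
    </soap:Envelope>"""

    response = requests.post(
        QMWISE_URL,
        data=soap_body.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": "http://example.com/qmwise/UpdateParticipant",  # hypothetical
        },
    )
    print(response.status_code)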

The QMWISe method also enables users to develop deeper connections to LMSs. This is the technology we use in our Connectors such as the integrations with the SAP Learning System and Blackboard.

How are customers using QMWISe at the moment?

Perhaps the most popular way we are seeing customers use QMWISe is to integrate something like an external portal with Perception. This could be a Learning Management System (LMS) or a company portal. These integrations typically provide single sign-on and show information about the assessments that are available. Customers often find that this can be done more easily than they expect by using QMWISe. They also often use QMWISe to synchronize people information, as I mentioned before.

Who will benefit from attending your presentation?

In my opinion, anybody who is thinking of integrating with Perception will benefit. They’ll have the opportunity to find out what the options are, although we’ll be looking at web services integration in particular. People interested in deeper integration will benefit the most from the presentation. We’ll show a couple of typical applications to get things started, so if you’re using one of these common applications you’ll find something you can identify with and may even take home some recipes you can work with.

We hope you will be able to join us at the conference from 9 to 11 October. Click here to register.
