4 Tips to Help Ensure the Security of Intellectual Property

Posted by Julie Chazyn

Protecting the intellectual property contained in a test or exam is essential, not only because of the time, effort and cost of creating assessments but also because IP theft undermines the accurate measurement of knowledge and skills.

Protecting intellectual property also safeguards the credibility of tests. Here are four tips for helping to ensure the security of intellectual property:

Create and administer multiple test forms

Rather than administering only one form of an assessment, delivering multiple forms of the same exam can help limit item exposure. This method also allows beta test questions to be interspersed within the forms, making it possible to collect psychometric information on newly developed questions.
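
For illustration, here is a minimal Python sketch of how multiple forms might be assembled from an item bank, with a few unscored beta items interspersed in each. The function and bank names (build_forms, scored_bank, beta_bank) are hypothetical, not features of any particular assessment platform.

```python
import random

def build_forms(scored_bank, beta_bank, num_forms,
                scored_per_form, beta_per_form, seed=0):
    """Assemble several exam forms, each mixing scored and beta items."""
    rng = random.Random(seed)  # fixed seed so the forms are reproducible
    forms = []
    for _ in range(num_forms):
        # Sample a fresh set of scored items per form to limit exposure.
        scored = rng.sample(scored_bank, scored_per_form)
        # Embed a few unscored beta items to gather psychometric data.
        beta = rng.sample(beta_bank, beta_per_form)
        form = scored + beta
        rng.shuffle(form)  # hide which items are beta
        forms.append(form)
    return forms

scored_bank = [f"S{i}" for i in range(1, 101)]  # 100 operational items
beta_bank = [f"B{i}" for i in range(1, 21)]     # 20 newly written items
for n, form in enumerate(build_forms(scored_bank, beta_bank, 3, 40, 5), 1):
    print(f"Form {n}: {len(form)} items,",
          sum(q.startswith("B") for q in form), "beta")
```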

Restrict and control administration of beta test items

Beta testing questions is an important part of high-stakes assessment, ensuring the psychometric quality of questions before they appear on actual assessments. However, it is vital that a well-conceptualized beta test model is in place to limit the exposure of newly developed questions to participants.

Update exam forms periodically

Letting exam forms become stale can over-expose questions to participants, increasing the likelihood of IP theft. An organization could consider retiring old exam forms and turning them into exam prep materials that can be sold to participants. That way, participants can expect fresh practice questions on a regular basis.

Produce exam prep materials

Organizations should consider making exam prep materials available to participants before an assessment. This reduces the incentive for participants to obtain exam questions through illegal means, since they will already have access to the type of questions that will appear on the actual assessment.

For more details on this subject, plus information about various means for deploying a wide range of assessment types with assurance, download our White Paper: Delivering Assessments Safely and Securely.

Questionmark Live: 25,000 Questions Later

Posted by Jim Farrell

Just over six months ago we announced Questionmark Live at the Questionmark Users Conference in Memphis. Since then, 25,000 questions have been created using this new browser-based tool. That sounds great, doesn’t it? But I was surprised to meet a number of customers at our Breakfast Briefings and User Group Meetings who had not yet seen this easy way to author questions. So I’d like to turn back the clock and re-announce the availability of Questionmark Live, which all of our Software Support Plan customers can use free of charge.

Anyone you want questions from can have unlimited access to this tool. There’s nothing to download. They can just start creating question sets on a desired theme or topic, using seven different question formats. The questions can utilize multimedia, links and choice-based feedback.

Do you utilize workflows that involve SMEs, instructional designers and editors? Questionmark Live offers a great way to help people work together. Users can share their question sets with each other for true collaboration. Every question created and edited has a full revision history that can be used to compare revisions and roll back to previous versions.

If you haven’t taken a look at this exciting and dynamic tool, just click here to get started. If you would like a demonstration, please email customercare@questionmark.com to set one up!

5 Steps for Designing Appropriate Learning Experiences

Posted by Julie Chazyn

Assessments provide a valuable tool for helping organizations properly design effective and useful learning experiences. Doing so involves a five-step process.

Step 1: Define the objectives. An objective might be to increase customer satisfaction, reduce error rates or improve safety.

Step 2: Ask what knowledge and skills are required to meet the objectives. In a college or university course on organic chemistry, for example, it’s important to ask what knowledge and skills are required to comprehend and apply the key concepts. The answers to that question will help the professor establish the topic structure that defines the knowledge, skills, abilities and attitudes required to meet the objectives.

Step 3: Run a needs analysis survey or skills gap survey. Here, people take a needs assessment to reveal the knowledge and skills they already have as well as what they still need. A gap analysis can be derived from the difference between what is required and people’s current knowledge and skills.
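
To make the arithmetic concrete, the gap can be computed as a simple per-skill difference between the required level and the current level. Here is a minimal Python sketch; the skill names and the 1-to-5 proficiency scale are assumptions for the example.

```python
# Required proficiency per skill (1-5 scale) versus one person's current level.
required = {"product knowledge": 4, "safety procedures": 5, "customer handling": 3}
current = {"product knowledge": 4, "safety procedures": 2, "customer handling": 3}

# Gap = required minus current; only positive gaps indicate a training need.
gaps = {skill: required[skill] - current.get(skill, 0) for skill in required}
needs_training = {skill: gap for skill, gap in gaps.items() if gap > 0}
print(needs_training)  # {'safety procedures': 3}
```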

Step 4: Develop a learning plan. That plan will describe the learning objectives and explain how the plan will be administered. The learning objectives will guide the production of learning materials and assessments. Facilitating the learning process might involve instructor-led training, coaching by managers, or e-learning courses.

Step 5: Conduct a pre-learning assessment. The pre-learning assessment will have two purposes: to create intrigue about the course and to guide each participant to the right learning experience. Advanced and novice learners will require different approaches.
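
As a deliberately simple illustration, a pre-learning assessment score can be used to route each participant to a suitable path. The thresholds and track names below are illustrative assumptions, not prescribed values.

```python
def recommend_path(pretest_percent):
    """Map a pre-learning assessment score to a learning track."""
    if pretest_percent >= 80:
        return "advanced track: case studies and practice scenarios"
    if pretest_percent >= 50:
        return "standard track: full course with optional refreshers"
    return "foundation track: prerequisite material before the course"

for score in (92, 65, 30):
    print(score, "->", recommend_path(score))
```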

To learn more about how instructors and organizations can use assessments to improve learning, download the white paper Assessments through the Learning Process.

Item Analysis Analytics: The White Paper

Posted by Greg Pope

I had a great time putting together an eight-part series on Item Analysis Analytics for this blog and was pleased with the interest it received.

When a reader asked if it would be possible to present all the posts in a single document, I thought: hey, let’s present the content of these articles in the form of a Questionmark White Paper! So here it is for you to download with our compliments.

I hope the paper helps you in your efforts to create test questions that make the grade!

Topic-based feedback goes to the ball

Posted by John Kleeman

In talking with some of our customers last week, I was reminded how valuable it can be to offer participants topic-based feedback.

Obviously, everyone wants to know whether they’ve passed or failed a test. And most people look at their feedback on questions they got wrong, to understand how to improve. But you can get a single question wrong for many different reasons: misunderstanding the question, making a careless mistake, or having a tiny gap in knowledge. If you score poorly across a whole topic, however, it very likely means you have a weakness in that area that needs addressing.

In many ways, topic feedback is the Cinderella of the feedback world. Everyone expects assessment-level feedback and item-level feedback (perhaps it’s unfair to call them the ugly sisters, because they are useful and valuable), but there is a huge and often untapped learning value in topic feedback. It is particularly vital for pre-tests, post-course tests, quizzes during learning and practice tests.

Suppose someone takes an assessment in health and safety and scores 66%, which is a passing score. Sounds good! But what happens if, as in the screenshot below, they’ve scored very well in some topics and poorly in others?

[Screenshot: assessment feedback showing scores of 88% and 100% in two topics and 63% and 13% in the other two. The weaker topics have links to learning resources.]

In this example, the fact that someone is very weak in electrical safety could well be concerning. (Don’t let them set up the lighting for the Holidays Ball!)
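
To show the arithmetic behind this kind of feedback, here is a minimal Python sketch that rolls item results up by topic and flags topics below a threshold. The topic names and the 60% cutoff are assumptions echoing the health and safety example above.

```python
# Each entry is (topic, item score), with 1 = correct and 0 = incorrect.
item_results = [
    ("fire safety", 1), ("fire safety", 1),
    ("first aid", 1), ("first aid", 1),
    ("manual handling", 1), ("manual handling", 0), ("manual handling", 1),
    ("electrical safety", 0), ("electrical safety", 0), ("electrical safety", 0),
]

# Roll scores up by topic as (points earned, points possible).
topic_totals = {}
for topic, score in item_results:
    earned, possible = topic_totals.get(topic, (0, 0))
    topic_totals[topic] = (earned + score, possible + 1)

# Report each topic and flag anything under the 60% threshold.
for topic, (earned, possible) in topic_totals.items():
    pct = 100 * earned / possible
    flag = "  <- needs review" if pct < 60 else ""
    print(f"{topic}: {pct:.0f}%{flag}")
```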

It’s obvious that you want to give people feedback at the topic level, but in many tools this isn’t as easy to set up as it should be. Questionmark Perception can be a fairy godmother for topic-based feedback, with lots of easy-to-use capabilities. Here are some links to support resources to help you create topic feedback in Perception.

  • You can easily create topic outcomes with feedback for different topic scores in Authoring Manager
  • You set these as standard at the topic level, but can override them on a per-assessment basis
  • You can also make a topic a prerequisite for passing an assessment (for instance, to prevent someone from passing unless they reach 60% or another specified score in key topics); a minimal sketch of this logic appears after the list.
  • If you want to display only some topics in the list and not all, for instance if some of your topics aren’t meaningful to the participant, you can define which topics are reported on.
  • And then the feedback is easily displayed to participants at the end of the assessment, as in the screenshot above.
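
Here is a minimal sketch of the topic-prerequisite logic mentioned in the list above. It is a generic illustration, not Perception’s implementation; the cutoffs and topic scores echo the screenshot example.

```python
def passes(overall_pct, topic_pcts, overall_cutoff=66,
           key_topics=("electrical safety",), key_topic_cutoff=60):
    """Pass only if the overall score and every key topic meet their cutoffs."""
    if overall_pct < overall_cutoff:
        return False
    return all(topic_pcts.get(t, 0) >= key_topic_cutoff for t in key_topics)

topic_pcts = {"fire safety": 88, "first aid": 100,
              "manual handling": 63, "electrical safety": 13}
print(passes(66, topic_pcts))  # False: electrical safety is below 60%
```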

I hope you find this useful in getting your topic feedback working and helping your learners achieve their full potential.

Measuring Learning Results: Eight Recommendations for Assessment Designers

Posted by Joan Phaup

Is it possible to build the perfect assessment design? Not likely, given the intricacies of the learning process! But a white paper available on the Questionmark Web site helps test authors respond effectively to the inevitable tradeoffs in order to create better assessments.

Measuring Learning Results, by Dr. Will Thalheimer of Work-Learning Research, considers findings from fundamental learning research and how they relate to assessment. The paper explores how to create assessments that measure how well learning interventions are preparing learners to retrieve information in future situations, which, as Will states, is the ultimate goal of training and education.

The eight bits of wisdom that conclude the paper give plenty of food for thought for test designers. You can download the paper to find out how Will arrived at them.

1. Figure out what learning outcomes you really care about. Measure them. Prioritize the importance of the learning outcomes you are targeting. Use more of your assessment time on high-priority information.

2. Figure out what retrieval situations you are preparing your learners for. Create assessment items that mirror or simulate those retrieval situations.

3. Consider using delayed assessments a week or month (or more) after the original learning ends—in addition to end-of-learning assessments.

4. Consider using delayed assessments instead of end-of-learning assessments, but be aware that there are significant tradeoffs in using this approach.

5. Utilize authentic questions, decisions, or demonstrations of skill that require learners to retrieve information from memory in a way that is similar to how they’ll have to retrieve it in the retrieval situations for which you are preparing them. Simulation-like questions that provide realistic decisions set in real-world contexts are ideal.

6. Cover a significant portion of the most important learning points you want your learners to understand or be able to utilize. This will require you to create a list of the objectives that will be targeted by the instruction.

7. Avoid factors that will bias your assessments. Or, if you can’t avoid them, make sure you understand them, mitigate them as much as possible, and report their influence. Beware of the biasing effects of end-of-learning assessments, pretests, assessments given in the learning context, and assessment items that are focused on low-level information.

8. Follow all the general rules about how to create assessment items. For example, write clearly, use only plausible alternatives (for multiple-choice questions), pilot-test your assessment items to improve them, and utilize psychometric techniques where applicable.
