UK Breakfast Briefings Update: On to Edinburgh!


Here I am (on the right) with fellow Questionmark staff members Anthony Harvey (Solutions Architect), Kate Soper (European Trainer), and Ivan Forward (Sales Manager) at the Manchester briefing. Our London briefing was the next day, and our final stop on May 19th will be in Edinburgh.


Posted by Sarah Elkins

This week I’ve been to Manchester and London for the 2009 UK Breakfast Briefings and met with many Questionmark Perception users and assessment professionals.

We’ve had some great feedback about the latest products and features, including the enhanced participant experience and the translation management capabilities. It’s been fantastic to see so many people get together and share their knowledge of online assessments.

We’ve also seen some great applications of assessments, including a secondary school that found a significant increase in learning retention when students were not only quizzed on what they had learned but also were asked to create the questions themselves. The students actually became the subject matter experts (SMEs) for the assessments!

Next week we’re off to Edinburgh for the final UK Briefing on May 19th. If you’d like to join us, please just register on the website.

Podcast: Tim Ellis on Lancaster University’s Module Evaluation System

Posted by Sarah Elkins

I spoke recently with Tim Ellis about Lancaster University, which has a Virtual Learning Environment (LUVLE) that incorporates Questionmark Perception assessments as well as a student-controlled social web space called MyPlace.

In addition to discussing the general use of assessments at the university, we focused on the Lancaster University Module Evaluation System, which has increased the quality of student responses and reduced the workload of administrators.

Twitter: A Job Analysis Tool?


Posted by Greg Pope

I was talking recently with Sean Farrell, a manager of Evaluation and Assessment at a global professional services firm. Sean mentioned an interesting idea that I’d like to share with you, and I’d like to know what you think of it.

Sean recently signed up for a Twitter account. Observing how easy it is for people to post updates and comments there, he began to wonder how an industrial psychologist could use Twitter. He found a Twitter application to use on his BlackBerry, began to search through the options, and came across a function that would remind him to update his tweets on a timed schedule, say every 30 or 60 minutes. Then it hit him! Sean thought perhaps Twitter could be a very useful tool for collecting job task information. This idea made sense to me! I wanted to hear more about what Sean was thinking.

Job analysis is an important part of building valid assessments, but in practice it is very difficult to capture good job analysis information. One technique cited in textbooks is to have job incumbents complete a work journal that captures what they are doing at various times of the work day. Often this technique is viewed as too time-consuming and cumbersome for employees to complete. Sean thought: what if we were to ask employees to tweet every 15 or 30 minutes and explain what they are doing at that moment? The person conducting the study could ‘follow’ all the employees and have an instant combined view of tasks completed throughout the day.
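
To make the idea concrete, here is a minimal Python sketch of what that combined view might look like. It assumes the tweets have already been exported as (employee, timestamp, task) records; the names and tasks below are invented for illustration, and a real study would of course pull the data from Twitter itself rather than hard-code it.

```python
from collections import Counter, defaultdict
from datetime import datetime

# Hypothetical export of timed work-journal tweets as
# (employee, timestamp, task description) records.
entries = [
    ("alice", "2009-05-12 09:00", "reviewing audit files"),
    ("alice", "2009-05-12 09:30", "client phone call"),
    ("bob",   "2009-05-12 09:00", "client phone call"),
    ("bob",   "2009-05-12 09:30", "preparing tax worksheets"),
]

# The combined view: how often each task is reported across everyone,
# mimicking the analyst "following" all participating employees.
task_counts = Counter(task for _, _, task in entries)

# A per-person timeline, useful for seeing when in the day tasks occur.
timelines = defaultdict(list)
for person, stamp, task in entries:
    timelines[person].append((datetime.strptime(stamp, "%Y-%m-%d %H:%M"), task))

for task, count in task_counts.most_common():
    print(f"{task}: reported {count} time(s)")
```

Even a tally this simple begins to resemble the work-journal data that traditional job analysis tries so laboriously to collect.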

If today’s emerging workforce is already familiar with Twitter and finds it a fun activity, then perhaps employees would not mind participating in a Twitter-based job analysis. I think this potential application of Twitter that Sean came up with is really interesting and could be a great way to augment the traditional collection of job task analysis information via surveys and other means. I want to throw it out there for discussion. Does anyone else think this approach could have merit and want to try it?

The Secret of Writing Multiple-Choice Test Items

Posted by Julie Chazyn

I read a very informative entry on the CareerTech Testing Center Blog that I thought was worth sharing. It’s about multiple-choice questions: how they are constructed, and some tips and tricks for creating them.

I asked its author, Kerry Eades, an Assessment Specialist at the Oklahoma Department of Career and Technology Education (ODCTE), about his reasons for blogging on The Secret of Writing Multiple-Choice Test Items. According to Kerry, CareerTech Testing Center took this lesson out of a booklet they put together as a resource for subject matter experts who write multiple-choice questions for their item banks, as well as for instructors who need better instruments for creating strong in-class assessments. Kerry points out that the popularity of multiple-choice questions “stems from the fact that they can be designed to measure a variety of learning outcomes.” He says it takes a great deal of time, skill, and adherence to a set of well-recognized rules for item construction to develop a good multiple-choice item.

The CareerTech Testing Center works closely with instructors, program administrators, industry representatives, and credentialing entities to ensure that skills standards and assessments meet Carl Perkins requirements and reflect national standards and local industry needs. Using Questionmark Perception, CareerTech conducts tests for more than 100 career majors, with an online competency assessment system that delivers approximately 75,000 assessments per year.

Check out The Secret of Writing Multiple-Choice Test Items.

For more authoring tips visit Questionmark’s Learning Café.

12 Tips for Writing Good Test Questions

Posted by Joan Phaup

Writing effective questions takes time and practice. Whether your goal is to measure knowledge and skills, survey opinions and attitudes, or enhance a learning experience, poorly worded questions can adversely affect the quality of the results.

I’ve gleaned the following tips for writing and reviewing questions from Questionmark’s learning resources:

1. Keep stems and statements as short as possible and use clear, concise language.
2. Use questions whenever possible (What, Who, When, Where, Why and How).
3. Maintain grammatical consistency to avoid cueing.
4. List choices in a logical order.
5. Avoid negatives, especially double negatives.
6. Avoid unnecessary modifiers, especially absolutes (e.g. always, never).
7. Avoid “All of the above” and use “None of the above” with caution.
8. Avoid vague pronouns (e.g. it, they).
9. Avoid conflicting alternatives.
10. Avoid syllogistic reasoning choices (e.g. “both a and b are correct”) unless absolutely necessary.
11. Avoid providing cues to the correct answer in the stem.
12. Avoid providing clues to the answer of one question in another question.
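
A few of these “avoid” rules can even be turned into a rough automated first pass over draft items. The sketch below is purely illustrative (the patterns and the sample question stem are made up, and no real review tool is implied); careful human review against all twelve tips is still the point.

```python
import re

# A handful of the tips above expressed as simple pattern checks.
# Illustrative only: real item review needs human judgment.
CHECKS = [
    (r"\ball of the above\b", "Tip 7: avoid 'All of the above'"),
    (r"\b(always|never)\b", "Tip 6: avoid absolute modifiers"),
    (r"\bnot\b.*\bnot\b", "Tip 5: possible double negative"),
    (r"\b(it|they)\b", "Tip 8: watch for vague pronouns"),
]

def review_item(stem):
    """Return the tips a draft question stem may violate."""
    lowered = stem.lower()
    return [message for pattern, message in CHECKS if re.search(pattern, lowered)]

# Hypothetical stem that trips several of the checks at once.
print(review_item("They always include all of the above, do they not?"))
```
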

If you would like more information about writing questions and assessments, a good place to start is the Questionmark Learning Café.

Test Maintenance: Can’t Live Without It!

Posted by Joan Phaup

I enjoyed talking recently with Shannon Bonner of Southern California Edison about the importance of good test maintenance and how to establish solid test maintenance practices. Listen to this podcast for tips about how to maintain the underlying validity of assessments, ensure the quality of questions, and protect test security.
