Understanding Assessment Validity: New Perspectives

Posted by Greg Pope

In my last post I discussed specific aspects of construct validity. I’m capping off this series with a discussion of modern views and thinking on validity.

Dr. Bruno D. Zumbo

Recently my former graduate supervisor, Dr. Bruno D. Zumbo at the University of British Columbia, wrote a fascinating chapter in the new book, The Concept of Validity: Revisions, New Directions and Applications, edited by Dr. Robert W. Lissitz. Bruno’s chapter, “Validity as Contextualized and Pragmatic Explanation, and its Implications for Validation Practice,” provides a great modern perspective on validity.

The chapter has two aims: to provide an overview of what Bruno considers to be the concept of validity, and to discuss the implications for the process of validation.

Something I really liked about the chapter was its focus on why we conduct psychometric analyses digging into how our assessments perform. As Bruno discusses, the real purpose of all the psychometric analysis we do is to support or provide evidence for the claims we make about the validity of the assessment measures we gather. For example, the reason we would run a Differential Item Functioning (DIF) analysis, which checks that test questions are not biased against or towards a particular group, is not only to protect test developers against lawsuits but also to weed out invalidity and help us establish the inferential limits of assessment results.
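For readers curious what a DIF analysis actually computes, here is a minimal sketch of the common Mantel-Haenszel approach (one of several DIF methods; this is my own illustration, not how any particular product implements it). Examinees are matched on total score, and at each score level a 2x2 table compares how the reference and focal groups performed on the item:

```python
# A sketch of the Mantel-Haenszel approach to DIF (illustrative only).
# Examinees are first matched on total score; at each score level a
# 2x2 table compares the two groups on the item.
import math

def mh_odds_ratio(tables):
    """tables: one (A, B, C, D) tuple per matched score level, where
    A/B = reference group correct/incorrect and
    C/D = focal group correct/incorrect."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

def ets_delta(alpha):
    """ETS delta scale: |delta| < 1 is usually treated as negligible DIF."""
    return -2.35 * math.log(alpha)

# Two score levels where both groups perform identically -> odds ratio 1,
# delta 0: no evidence of DIF on this item.
no_dif = [(20, 10, 20, 10), (30, 5, 30, 5)]
print(mh_odds_ratio(no_dif))  # 1.0
```

An odds ratio near 1 (ETS delta near 0) suggests the item behaves similarly for both groups once ability is taken into account; ratios far from 1 flag the item for review.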

Bruno drives home the point that examining validity is an ongoing process of validation. One doesn’t just do a validity study or two and call it done: validation is an ongoing process in which multilevel construct validation occurs and procedures are tied into program evaluation and assessment quality processes.

I would highly recommend that people interested in diving more into the theoretical and practical details of validity check out this book, which includes chapters from many highly respected psychometrics and testing industry experts.

I hope that this series on validity has been useful and interesting! Stay tuned for more psychometric tidbits in upcoming posts.


Editor’s Note: Greg will be doing a presentation at the Questionmark Users Conference on Conducting Validity Studies within Your Organization. The conference will take place in Miami March 14 – 17. Learn more at www.questionmark.com/go/conference

Conference Close-up: Assessments That Measure Knowledge, Skill & Ability

Posted by Joan Phaup

I’ve been having a great time talking to presenters at the Questionmark 2010 Users Conference – customers, our keynote speaker and Questionmark staff. I wanted to find out from Howard Eisenberg about the Best Practices presentation he will deliver at the conference on Effectively Measuring Knowledge, Skill and Ability with Well-crafted Assessments.

Q: Could you explain your role at Questionmark?

A: I manage Training and Consulting, so I work with our customers to help them get the most out of their assessments and their use of Questionmark Perception. For some that might mean training on how to use the software effectively. For others it might mean providing solutions that allow them to use the software within the context of their current business processes, such as synchronizing data between the organization’s central user directory and Perception. In some cases we might need to create reports to supplement those that come with the product or do some other custom development. Sometimes we go on site, install the Perception software, set it up within the customer’s LMS and do any troubleshooting right on the spot. Whatever we do, our goal is to ensure customers’ speed to success, getting them operational faster.

Q: What will you be talking about during your Best Practice session in Miami?

A: Over the years I’ve given presentations on Creating Assessments That Get Results, where I cover the dos and don’ts of writing test items. A question that always comes up during those talks is how to write test content that goes beyond testing information recall…content that tests a person’s ability to perform a task. There are limitations to using software like Perception to do that: certain things simply require that a person perform a task while someone observes them, so that all the scoring and evaluation is done by an observer or rater. But there are a lot of possibilities for creating computer-scored items that measure skill and ability rather than just recall of information. This session is designed to give people tools to take their tests to that level. First we need a framework for categorizing knowledge, skills and abilities: what makes a skill a skill and an ability an ability. We’ll help people classify their learning objectives along those lines and look at specific types of questions that can be used to measure skill and ability. The questions that provide this kind of measurement expand upon the question types supported in Questionmark Perception: selected response types as well as constructed responses. We’ll use several real-world examples to illustrate how questions of this nature move beyond recall of knowledge to measuring skill and ability.

Q: What are you looking forward to at the conference?

A: I am really looking forward to meeting customers and in some cases reconnecting with customers I’ve gotten to know over the years. That’s really a highlight for me…reconnecting with our great customers. I am consistently amazed and impressed by how passionate our customers are about what they do with our software and how smart they are in using it. Every year, after talking with a customer or sitting in on a case study, I come away thinking, “Wow! That was really clever!” So I’m looking forward to hearing those kinds of stories again this year.

The conference program is nearly finalized and includes case studies, tech training and best practice sessions for every experience level.  Check it out and plan to join us March 14 – 17 in Miami!

An Updated Questionmark Connector for Blackboard Learn 8 and 9

Posted by Joan Phaup

If you use Blackboard, you may be interested to know that the Questionmark Blackboard Connector can now be used with Blackboard Learn™ 8 and 9. Our recent update of the connector harmonizes with Blackboard’s new user interface and offers improved synchronization and administrative controls.

The connector enables you to maintain student and instructor profiles within the Blackboard platform while taking advantage of Questionmark Perception’s powerful assessment features and numerous question types. This new version of the connector makes it easier to access Perception tests from within Blackboard. And administrators can now limit the number of times participants may take an assessment.

You can learn more about the Questionmark-Blackboard integration by visiting our Blackboard Connector page, where you will find an animated overview explaining how the connector works.

Assessment Accessibility in Questionmark Perception Version 5

Posted by John Kleeman

One of the capabilities in Perception version 5 that I am most proud of is that we have produced an assessment system that is genuinely accessible for participants. Most assessments created by someone using the software out-of-the-box will meet the accessibility criteria set by the W3C and by the US Government.

Questionmark has always been a leader in accessibility of assessments. The ability to add extra time as a disability accommodation is one of our most widely used capabilities, but for version 5 our customers have been asking for much more. You might think that only a very small number of people taking tests will have accessibility needs, but in fact the number is surprisingly large.


Blindness

In the United States, there are between 250,000 and 500,000 legally blind people under the age of 65. In England, 76,000 blind or partially sighted people under the age of 65 are formally registered with the government.

Accommodations for blind people include making software compatible with screen readers and allowing people to increase the size of text and change the contrast and colors.
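As an aside for the technically minded, contrast requirements like these are not vague: the W3C’s WCAG guidelines define an exact contrast-ratio formula between any two colors. Here is a small sketch of that calculation (the function names are mine; the math is WCAG 2.0’s, which requires at least 4.5:1 for normal body text at level AA):

```python
# Sketch of the WCAG 2.0 contrast-ratio calculation between two colors.
# WCAG AA requires at least 4.5:1 for normal-size body text.

def _linearize(channel):
    """Convert an 8-bit sRGB channel to linear light (WCAG 2.0 formula)."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio of the lighter to the darker luminance, offset by 0.05."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background gives the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Changing contrast in an assessment player is, at bottom, about letting the participant pick color pairs that clear this kind of threshold.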

Color blindness

About 7% of men and 0.5% of women are color blind.

Color alone should not be used to distinguish information, navigation or questions.


Dyslexia

About 10% of people have dyslexia, 4% of them severely.

Accommodations for dyslexic people include providing more time, allowing people to change text size and contrast colors, and sometimes screen readers as well.

Motor disabilities

Some disabilities (multiple sclerosis, paralysis, muscle and joint problems) prevent use of a mouse or keyboard and require special input devices. There is no central register of the numbers of such people, but many are affected – for instance, there are around 100,000 people with quadriplegia (often caused by vehicle or sports injuries) in the US.

The key accommodation here is to ensure that the assessment system can be used with the keyboard alone (no mouse), since such special input devices emulate a keyboard. For example, in the picture below, someone is entering keystrokes to control the computer because they cannot use a mouse.

Providing accessibility improvements within the software does not just aid disabled people; it also helps people with temporary impairments – for example, someone who has broken an arm, or a factory worker who is not used to screens with a lot of text on them.

Questionmark has been receiving increasing requests from customers for a strong accessibility solution. In developing version 5, we worked with two very inspiring experts. The main accessibility standard used in the United States is Section 508, and we were helped by Jim Thatcher, one of the people who helped draft it. In Europe, the main accessibility standard used is the W3C’s WCAG, and we were helped by Dr. David Sloan from the University of Dundee. You can see our formal statements of compliance for Section 508 here and for W3C WCAG here.

Unless you choose to disable them, most Questionmark assessments now have buttons at the top right of the screen which allow you to change font size and change contrast – you can try out some example assessments at http://www.questionmark.com/us/tryitout_features.aspx.  Or you can see these buttons in the screenshot below.


We’ve thoroughly checked Questionmark with screen readers including the market leader JAWS, and we’ve made sure that assessments are keyboard accessible. We’ve also provided a best practice guide for accessibility, which Questionmark customers can use to ensure that their assessments are accessible.

Accessibility is a journey not just a destination, and we’ll be improving our accessibility over time, but I hope and believe that version 5 will help our many customers produce much more accessible assessments than they could before, and also that the whole assessment community might start to expect more from every piece of software used to deliver assessments.  Because if assessments are to be fair, then they have to be available to all.

A Warm Miami Welcome!

Posted by Eric Shepherd

As you probably know, I live in Miami, but as I spend more than 250 nights a year in hotels around the world, I love it when I can stay at home. This year, I am happy that the Questionmark Users Conference is in Miami March 14 – 17! So much so that I’ve posted some pictures and videos to my blog that I feel capture the spirit of Miami, South Beach and what you can expect when you attend the conference.

This year’s conference will have a strong focus on Questionmark Perception version 5 and new trends in assessment delivery. In addition to tech training, case studies, best practice sessions and peer discussions, you’ll be able to meet one-on-one with our technicians and product managers and network with other Perception users who share your interests.

Miami Green and Blue

We are proud to be making this conference greener than ever. The Hilton Miami Downtown participates enthusiastically in the Florida Green Lodging Program, and we at Questionmark are doing our part for the environment by reducing the amount of paper we use, decreasing the size of our conference notebooks and making handouts available online.

I love living in Miami and think it’s a vibrant, colorful and friendly backdrop for the conference. I hope you will take the opportunity to stroll down Lincoln Road in South Beach during our Monday evening dine-around and that you’ll join us on Tuesday evening for a dinner cruise around the harbor. These events are a lot of fun, but they also have a serious purpose: helping you get to know colleagues who share your interest in assessment and create connections that will help you succeed in your work.

I can’t wait to welcome you to the conference—and I am looking forward to learning together with you. So come on down to Miami and warm up!

Early-bird registration for the conference ends today, so I hope you’ll sign up right away.

Conference Close-up: Best Practices for High-Stakes Testing and More

Posted by Joan Phaup

Questionmark Analytics and Psychometrics Manager Greg Pope, like many other staff members, is busy preparing for the Questionmark Users Conference in Miami March 14-17.  Here’s a little background about Greg and a quick round-up of his activities at the conference.


Q: What’s your role at Questionmark?
A: I have several roles at Questionmark. I am the product owner for reporting, so I’m always talking to customers to find out their requirements for reporting and to plan the best ways to add new reports and reporting features to our software. I am also the in-house psychometrics expert, making sure the software we develop conforms to best practices and high standards. Finally, I do a lot of writing and presenting about psychometrics and other issues in the area of assessment.

Q: What will you be presenting at the Users Conference?
A: I have two Best Practice sessions: one on item and test analysis and one on conducting validity studies. I’m also helping out with a Peer Discussion about high-stakes testing.

Q: What would you say are the things most people want to learn about item and test analysis and analytics?
A: People want to know how to use the tools available to them in our software to make the best assessments possible and to make sure they conform to best practices. For example, they want to know what specific things to look for in the Item Analysis Report to find out how their questions are performing. They want to find out how to make their questions better and to make sure their assessments are measuring what they need to measure.
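To make this concrete, the two statistics people most often start with in an item analysis are item difficulty (the proportion answering correctly) and item discrimination (how well the item separates high and low scorers). Here is a minimal sketch using made-up response data and a simple point-biserial correlation (a production report’s computations may differ in detail):

```python
# Two classic item-analysis statistics, sketched on made-up data
# (illustrative only). Each row is one examinee's scored responses:
# 1 = correct, 0 = incorrect.
import math

responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 0, 0, 1],
]

def difficulty(item):
    """p-value: proportion of examinees answering the item correctly."""
    col = [row[item] for row in responses]
    return sum(col) / len(col)

def discrimination(item):
    """Point-biserial: Pearson correlation of item score with total score."""
    col = [row[item] for row in responses]
    tot = [sum(row) for row in responses]
    n = len(col)
    mx, my = sum(col) / n, sum(tot) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(col, tot))
    sx = math.sqrt(sum((x - mx) ** 2 for x in col))
    sy = math.sqrt(sum((y - my) ** 2 for y in tot))
    return cov / (sx * sy) if sx and sy else 0.0

print(difficulty(0))      # 0.75: three of four examinees got item 0 right
print(discrimination(3))  # negative: strong examinees miss the last item
```

A negative discrimination, like the last item shows here, is a classic red flag: examinees who do well overall tend to get that item wrong, which often points to a miskeyed or confusing question.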

Q: Could you give some details about your session on conducting validity studies?
A: People are interested in what validity means and what they can do to evaluate the validity of their assessment programs. You don’t have to hire a team of Ph.D.s to do validity studies. With a solid knowledge of the concepts, and using tools like Excel, organizations can investigate validity in their own contexts. I’ll share the theoretical background on validity and provide applied examples of validity studies to help organizations conduct their own.
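As a taste of what such a study can look like, here is a small sketch of a criterion validity check: correlating test scores with a criterion measure, then applying the standard correction for attenuation to estimate what the correlation would be if both measures were perfectly reliable. All the scores, supervisor ratings and reliability values below are made up for illustration:

```python
# Sketch of a criterion validity check (all numbers here are made up).
# We correlate test scores with a criterion measure, then apply the
# standard correction for attenuation: r_corrected = r / sqrt(rxx * ryy).
import math

test_scores = [52, 61, 70, 74, 80, 85, 91]
job_ratings = [3.1, 2.4, 3.6, 2.9, 4.0, 3.3, 4.2]  # hypothetical criterion

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def corrected_for_attenuation(r, rxx, ryy):
    """Validity estimate with measurement error in both measures removed."""
    return r / math.sqrt(rxx * ryy)

r_observed = pearson(test_scores, job_ratings)
r_true = corrected_for_attenuation(r_observed, 0.85, 0.70)  # assumed reliabilities
```

The same arithmetic can of course be done in Excel with CORREL and a little algebra, which is exactly the point: the barrier to entry is conceptual, not computational.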

Q: Tell me more about the Peer Discussion, too!
A: We’ll cover some pertinent topics in high stakes testing and what the current thinking is about them. I’ll spend a little time sharing some best practices, but most of the session will be Perception users talking about what challenges they’ve encountered and how they’ve addressed them. Perhaps we will be able, as a group, to come up with some good approaches for particular situations.

Q: What are you looking forward to most at the conference?
A: I like having the opportunity to talk with customers at the Best Practice sessions and in Product Central to find out what issues people are encountering – not just the features they’d like in the software but what issues they’re encountering in the assessment industry as a whole…what kinds of tools they need and what kinds of knowledge they need in order to achieve their business goals.

Early-bird conference registration ends tomorrow, January 22nd. Learn more about the conference and register at www.questionmark.com/go/conference.

And just for fun, check out this brief video we put together after last year’s conference!