Conference Close-up: Best Practices for High-Stakes Testing and More
Posted by Joan Phaup
Questionmark Analytics and Psychometrics Manager Greg Pope, like many other staff members, is busy preparing for the Questionmark Users Conference in Miami March 14-17. Here’s a little background about Greg and a quick round-up of his activities at the conference.
Q: What’s your role at Questionmark?
A: I have several roles at Questionmark. I am the product owner for reporting, so I’m always talking to customers to find out their requirements for reporting and plan the best ways to add new reports and reporting features into our software. I am also the in-house psychometrics expert, making sure the software we develop conforms to best practices and high standards. Finally, I do a lot of writing and presenting about psychometrics and other issues in the area of assessment.
Q: What will you be presenting at the Users Conference?
A: I have two Best Practice sessions: one on item and test analysis and one on conducting validity studies. I’m also helping out with a Peer Discussion about high-stakes testing.
Q: What would you say are the things most people want to learn about item and test analysis and analytics?
A: People want to know how to use the tools available to them in our software to make the best assessments possible and to make sure they conform to best practices. For example, they want to know what specific things to look for in the Item Analysis Report to find out how their questions are performing. They want to find out how to make their questions better and to make sure their assessments are measuring what they need to measure.
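Two of the statistics typically examined in an item analysis are item difficulty (the proportion of candidates answering correctly) and item discrimination (how well an item separates stronger from weaker candidates). As an illustration only, not Questionmark's implementation, here is a minimal sketch of both calculations using hypothetical 0/1 item scores:

```python
# Sketch of classical item analysis: difficulty (proportion correct) and
# discrimination (corrected item-total correlation). Data are hypothetical.
import statistics

def pearson(x, y):
    """Pearson correlation between two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def item_stats(responses):
    """responses: one row per candidate, each a list of 0/1 item scores.
    Returns a (difficulty, discrimination) pair for each item."""
    n_items = len(responses[0])
    results = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        rest = [sum(row) - row[i] for row in responses]  # total excluding item i
        difficulty = statistics.mean(item)               # proportion correct
        discrimination = pearson(item, rest)             # corrected item-total r
        results.append((difficulty, discrimination))
    return results

# Hypothetical scores: 6 candidates x 4 items
data = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
for i, (p, r) in enumerate(item_stats(data), 1):
    print(f"Item {i}: difficulty={p:.2f}, discrimination={r:.2f}")
```

A low or negative discrimination value is the kind of flag reviewers look for: it suggests the item may be miskeyed, ambiguous, or measuring something different from the rest of the test.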
Q: Could you give some details about your session on conducting validity studies?
A: People are interested in what validity means and what they can do to evaluate the validity of their assessment program. You don’t have to hire a team of Ph.D.s to do validity studies. With a solid knowledge of the concepts, and using tools like Excel, organizations can investigate validity in their own contexts. I’ll share the theoretical background on validity and provide applied examples of conducting validity studies to help organizations conduct their own.
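One common approach of the kind described above is a criterion-related validity check: correlating assessment scores with an external criterion such as supervisor performance ratings. As an illustrative sketch with hypothetical data (the same calculation is available in Excel via the CORREL function):

```python
# Sketch of a criterion-related validity check: the Pearson correlation
# between test scores and an external criterion. Data are hypothetical.
import statistics

def validity_coefficient(scores, criterion):
    """Pearson correlation between test scores and a criterion measure."""
    ms, mc = statistics.mean(scores), statistics.mean(criterion)
    num = sum((s - ms) * (c - mc) for s, c in zip(scores, criterion))
    den = (sum((s - ms) ** 2 for s in scores)
           * sum((c - mc) ** 2 for c in criterion)) ** 0.5
    return num / den if den else 0.0

# Hypothetical data: exam scores and supervisor performance ratings
exam = [55, 62, 70, 74, 81, 88, 90]
ratings = [2.1, 2.8, 3.0, 3.4, 3.9, 4.2, 4.5]
r = validity_coefficient(exam, ratings)
print(f"validity coefficient r = {r:.2f}")
```

A strong positive coefficient supports the claim that the assessment predicts the outcome it is meant to predict; a weak one is a signal to revisit the test content or the criterion measure.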
Q: Tell me more about the Peer Discussion, too!
A: We’ll cover some pertinent topics in high-stakes testing and what the current thinking is about them. I’ll spend a little time sharing some best practices, but most of the session will be Perception users talking about what challenges they’ve encountered and how they’ve addressed them. Perhaps we will be able, as a group, to come up with some good approaches for particular situations.
Q: What are you looking forward to most at the conference?
A: I like having the opportunity to talk with customers at the Best Practice sessions and in Product Central to find out what issues people are encountering – not just the features they’d like in the software but what issues they’re encountering in the assessment industry as a whole…what kinds of tools they need and what kinds of knowledge they need in order to achieve their business goals.
And just for fun, check out this brief video we put together after last year’s conference!