See You in LA!

Posted by Eric Shepherd

I am looking forward to meeting old friends and new at this year’s Questionmark Users Conference in Los Angeles March 15 – 18!

LA is a place that revels in finding new ways to do things, and the conference will reflect that spirit by exploring a sea change that’s transforming the world of learning and assessment: the increasing adoption of social and informal learning initiatives by organizations of all stripes.

One of the things we’ll be talking about at the conference is the 70+20+10 model for learning and development, which I recently wrote about in my own blog. This model suggests that about 70% of what we learn comes from real-life and on-the-job experiences, with about 20% coming from feedback and from observing and working with other people. That leaves about 10% of learning taking place through study or formal instruction. So how do we measure the other 90%? Where does assessment fit into 70+20+10? These questions will make for some lively conversation!

We’ll be providing some answers to them by showing how Questionmark’s Open Assessment Platform works together with many commonly used informal/social learning technologies such as wikis, blogs and portals – and we’ll explain how we will build on that going forward. We’ll demonstrate features and applications ranging from embedded, observational and mobile assessments to content evaluation tools, open user interfaces, new authoring capabilities in Questionmark Live, and next-generation reporting and analytics tools.

Of course we’ll share plenty of information and inspiration about assessments in the here and now as well as in the future! In addition to tech training, case studies, best practice sessions and peer discussions, you’ll be able to meet one-on-one with our technicians and product managers and network with other Perception users who share your interests.

I can’t wait to welcome you to the conference and I am looking forward to learning together with you. The conference program offers something for every experience level, so I hope you will take a look at it, sign up soon and join us in Los Angeles.

Coming soon to Los Angeles: A full conference program by and for Questionmark users

Posted by Joan Phaup

Questionmark users who join us in Los Angeles for the Questionmark Users Conference March 15 – 18 will be spoiled for choice when it comes to breakout sessions. Many delegates will be presenting case studies and leading peer discussions. And we’ve put together product training, best practice and drop-in demo sessions to suit everyone from beginners to power users.

Here’s the complete conference line-up. You can click here for details about all these sessions:

General Sessions

  • Opening General Session: Would You “Like” To Know? The Roles of Assessment in Social Learning
  • Keynote: Bryan Chapman on Assessment’s Strategic Role in Enterprise Learning: Innovation and Trends
  • Closing General Session: The Road Ahead

Case Studies

  • Tiered Assessment Model for Internal Certification (Intermediate/Advanced) – Accenture
  • Applying Diagnostic Assessment in a Virtual Corporate University (All Experience Levels) – PricewaterhouseCoopers
  • Dynamically Generating Certificates from an Assessment (Intermediate/Advanced) – Beckman Coulter, Inc.
  • Questionmark in Medical Education: Planning and Implementing Blended Course Exams (Beginning/Intermediate) – Loma Linda University School of Medicine
  • Enabling Self-Service Reporting (Intermediate) – Accenture
  • Implementing E-Testing in the US Coast Guard: Challenges and Solutions (All Experience Levels)
  • Ensuring the Security and Integrity of Certification Tests (All Experience Levels) – Philadelphia Parking Authority
  • Using Captivate and Flash Simulations in eLearning and Assessments (Beginning/Intermediate) – Verizon Communications

Tech Training

  • Introduction to Questionmark Perception for Beginners
  • Advanced Authoring Techniques Using Authoring Manager (Intermediate/Advanced)
  • Planning Your Migration from Perception v4 to v5 (Intermediate/Advanced)
  • Configuring the User Experience and Understanding Templates in Perception v5 (Intermediate/Advanced)
  • Authoring with Questionmark Live – A Hands-on Introduction (Bring your own laptop!) (Beginning/Intermediate)
  • Analyzing and Sharing Assessment Results (Beginning/Intermediate)
  • Integrating Perception with Other Systems (Advanced)

Best Practices

  • Best Practices for Surveys and Course Evaluations (All Experience Levels) – Greg Pope, Analytics and Psychometrics Manager, Questionmark
  • Using Questionmark Perception to Make SharePoint an Effective Learning Platform (All Experience Levels) – John Kleeman, Chairman, Questionmark
  • Getting the Most from a Small Screen: Design Strategies for Mobile Devices (All Experience Levels) – Silke Fleischer, Co-founder and CEO, ATIV Software
  • Principles of Item and Test Analysis (All Experience Levels) – Greg Pope, Analytics and Psychometrics Manager, Questionmark
  • Using the Latest Learning Research to Improve Your Questionmark Assessments (All Experience Levels) – John Kleeman, Chairman, Questionmark
  • Writing High-quality Assessments that Yield Meaningful Results (All Experience Levels) – Howard Eisenberg, Training and Consultancy Manager, Questionmark

Peer Discussions

  • Evaluating the Effectiveness of Your Training Programs: Metrics and ROI – Farmers Insurance
  • Using Printing and Scanning to Capture Survey and Test Results – PG&E Academy
  • Delivering Assessments at Academic Institutions: An Open Conversation – Rio Salado College
  • Improving Your Workflow: Ensuring Smooth Sailing from Question Creation to Release – U.S. Coast Guard

Drop-in Demos

  • Using Assessments with Social/Informal Learning Tools
  • Perception v5 and Questionmark Live
  • Tools and Features for Enhancing Your Assessments and Reports

We’ll also be hosting drop-in sessions with Questionmark technicians and group discussions with our product owners about potential new offerings. You can register for the conference online or email conference@questionmark.com for more information.

Conference Close-up: Assessments That Measure Knowledge, Skill & Ability

Posted by Joan Phaup

I’ve been having a great time talking to presenters at the Questionmark 2010 Users Conference – customers, our keynote speaker and Questionmark staff. I wanted to find out from Howard Eisenberg about the Best Practices presentation he will deliver at the conference on Effectively Measuring Knowledge, Skill and Ability with Well-crafted Assessments.

Q: Could you explain your role at Questionmark?

A: I manage Training and Consulting, so I work with our customers to help them get the most out of their assessments and their use of Questionmark Perception. For some that might mean training on how to use the software effectively. For others it might mean providing solutions that allow them to use the software within the context of their current business processes, such as synchronizing data between the organization’s central user directory and Perception. In some cases we might need to create reports to supplement those that come with the product or do some other custom development. Sometimes we go on site, install the Perception software, set it up within the customer’s LMS and do any troubleshooting right on the spot. Whatever we do, our goal is to ensure customers’ speed to success, getting them operational faster.

Q: What will you be talking about during your Best Practice session in Miami?

A: Over the years I’ve given presentations on Creating Assessments That Get Results, where I cover the dos and don’ts of writing test items. A question that always comes up during those talks is how to write test content that goes beyond testing information recall – content that tests a person’s ability to perform a task. There are limitations to using software like Perception to do that: certain things simply require that a person perform a task while someone observes them, so that all the scoring and evaluation is done by an observer or rater. But there are a lot of possibilities for creating computer-scored items that measure skill and ability rather than just recall of information. This session is designed to give people tools to take their tests to that level. First we need a framework for categorizing knowledge, skills and abilities: what makes a skill a skill and an ability an ability. We’ll help people classify their learning objectives along those lines and look at specific types of questions that can be used to measure skill and ability. The questions that provide this kind of measurement expand upon the question types supported in Questionmark Perception: selected response types as well as constructed responses. We’ll use several real-world examples to illustrate how questions of this nature go beyond recall of knowledge to measure skill and ability.

Q: What are you looking forward to at the conference?

A: I am really looking forward to meeting customers and in some cases reconnecting with customers I’ve gotten to know over the years. That’s really a highlight for me – reconnecting with our great customers. I am consistently amazed and impressed by how passionate our customers are about what they do with our software and how smart they are in using it. Every year, after talking with a customer or sitting in on a case study, I come away thinking, “Wow! That was really clever!” So I’m looking forward to hearing those kinds of stories again this year.

The conference program is nearly finalized and includes case studies, tech training and best practice sessions for every experience level. Check it out and plan to join us March 14 – 17 in Miami!