Reminiscing about Santa Fe: Presentations, pictures & the weird and wonderful art house

Posted by Chloe Mendonca

After months of eagerly anticipating Questionmark’s most important annual learning event, we found it was over before we even knew it! The Questionmark Conference gave all of us three special days to meet so many of our globally dispersed customers and employees face to face, learn best practices, have fun with one another and discuss new ways to leverage Questionmark’s technologies.

This year I was fortunate enough to be there, and a big highlight was getting a deeper understanding of how others are using Questionmark’s technologies. From our evening networking events to our stimulating panel discussion — which brought together experts from the US State Department, Caterpillar Inc., Scantron and Compass Consultants to discuss best practices for making data work within learning and assessment programs — to more specific breakout sessions, our guest speakers did a wonderful job of sharing lessons learned and best-practice tips.

Todd Horner from Accenture, for example, hosted a great discussion, “Taking the Migraine out of Migration: Accenture’s journey to next-gen authoring.” He spoke about the shared “fear of the unknown” and how to get around change-management challenges. Lauri Buckley and Lindsey Clayton from Caterpillar Inc., delivered an impressive presentation, “A Process to Mastery: Assessments as career development tools,” during which they shared valuable tips about how to effectively design and develop various types of competence assessments, from proficiency tests to validation and observational assessments. You can get the handouts from these presentations and more right here.

For those who couldn’t be there in person, we webcast selected conference sessions — hitting record numbers online. If you joined us for the webcast, got a sense of the Questionmark Conference atmosphere and want to join us in person next year, keep your eyes peeled for our dates and location announcement coming to the blog in the next few months. See the recordings of our selected webcast sessions at: www.questionmark.com/go/2017uconwebcast (Please note you must be logged into the website with your Questionmark username and password).

I’d like to take this opportunity to say a big thank you to all of our wonderful speakers for taking the time to share their knowledge. Without them there would be no conference!

Now for the bit you’ve all been waiting for… conference pictures! To all those who went back to the office struggling to describe the weird and wonderful art house that is Meow Wolf’s House of Eternal Return, hopefully these snaps will make things a little easier 😊 View conference and evening event pictures here on our Flickr page.

What did you enjoy most about Questionmark Conference 2017? Leave me a comment below and stay in touch!


Just in case you missed it…

John Kleeman, Questionmark’s Founder & Executive Director, reported back 6 good practice tips heard in Santa Fe.

7 ways assessments can save you money and protect your reputation [Compliance webinar]

Posted by Julie Delazyn

Last week, illegal banking practices cost Wells Fargo, one of America’s largest banks, $185 million in fines. Regulators have called the scandal “outrageous” and stated that the widespread nature of the illegal behavior shows the bank lacked the necessary controls and oversight of its employees.

Educating employees and monitoring their understanding of proper practices is vital for regulatory compliance. How do you ensure your workers are compliant with the rules and regulations in your industry? How do you prove that employee training is understood?

Register today for the FREE webinar: 7 Ways Assessments Fortify Compliance

The webinar will examine real-world examples of how assessments are used to strengthen compliance programs. It will also provide tips for developing valid, reliable assessments.

Predicting Success at Entel Begins with Trustworthy Assessment Results [Case Study]

Posted by Julie Delazyn

Entel is one of Chile’s largest telecommunications firms, serving both the consumer and commercial sectors. With more than 12,000 employees across its extended enterprise, Entel provides a broad range of mobile and fixed communications services, IT and call center outsourcing, and network infrastructure services.

The Challenge

Success in the highly competitive mobile and telecommunications market takes more than world-class infrastructure, great connectivity, an established dealer network and an extensive range of retail locations. Achieving optimal on-the-job performance yields a competitive edge in the form of satisfied customers, increased revenues and lower costs. Yet actually accomplishing this objective is no small feat – especially for job roles notorious for high turnover rates.

With these challenges in mind, Entel embarked on an innovative new strategy to enhance the predictability of its hiring, onboarding, training and development practices for its nationwide team of 6,000+ retail store and call center representatives.

Certification as a Predictive Metric

Entel conducted an exhaustive analysis – a “big data” initiative that mapped correlations between dozens of disparate data points mined from business systems, HR systems and assessment results – to develop a comprehensive model of the factors contributing to employee performance.
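The case study does not spell out Entel’s actual model, but the underlying technique, mapping pairwise correlations across joined data sets, is easy to sketch. The column names and values below are hypothetical, purely for illustration:

```python
import pandas as pd

# Hypothetical extract joining assessment results with business and HR data;
# the column names and numbers are illustrative, not from Entel's real systems.
df = pd.DataFrame({
    "certification_score": [78, 92, 85, 61, 88, 70],
    "monthly_sales":       [40, 55, 52, 28, 50, 35],
    "customer_sat":        [3.9, 4.6, 4.4, 3.2, 4.5, 3.6],
    "tenure_months":       [6, 24, 18, 3, 30, 9],
})

# Pairwise correlations show which factors move together with
# certification results across the workforce.
print(df.corr()["certification_score"].sort_values(ascending=False))
```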

Working with Questionmark OnDemand enabled Entel to create the valid and reliable tests and exams necessary to measure and document representatives’ knowledge, skills and abilities.

Find out more about Entel’s program planning and development, which helped identify and set benchmarks for required knowledge and skills, optimal behaviors and performance metrics; its use of SAP SuccessFactors to manage and monitor performance against many of the program’s key behavioral aspects; and the growing role its trustworthy assessment results are playing in future product launches and the business as a whole.

Click here to read the full case study.

6 Steps to Authoring Trustworthy Assessments

Posted by April Barnum

I recently met with customers, and the topic of authoring trustworthy assessments and getting back trustable results was a top concern. No matter what they were assessing, everyone wants results that are trustable, meaning that they are both valid and reliable. The reasons were similar, with the top three being safety concerns, being able to assert job competency, and regulatory compliance. I often share the white paper 5 steps to better tests as a strong resource to help you plan your assessment, and I encourage you to check it out. But here are six authoring steps that can help you achieve trustworthy assessment results:

  1. Planning the assessment, or blueprinting it: working out what it is that the test covers.
  2. Authoring, or creating the items.
  3. Assembling the assessment: harvesting the items and putting them together for use in a test.
  4. Piloting and reviewing the assessment prior to production use.
  5. Delivering the assessment, or making it available to participants, following the security, proctoring and other requirements set out in the planning stage.
  6. Analyzing the results of the assessment: reviewing the results and sharing them with stakeholders. This step also involves using the data to weed out any problem items or other issues that might be uncovered (a small sketch of this kind of item analysis follows below).

Each step contributes to the next, and useful analysis of the results is only possible if every previous stage has been done effectively. In future posts, I will go into each step in detail and highlight aspects you should be considering at each stage of the process.
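To make step 6 concrete, here is a minimal sketch of the kind of item analysis that helps weed out problem items. This is not Questionmark’s own analytics; the response data is made up and the 0.2 threshold is an illustrative rule of thumb. It computes item difficulty (the proportion of participants answering correctly) and item discrimination (how strongly an item correlates with the score on the rest of the test):

```python
import numpy as np

# Rows = participants, columns = items; 1 = correct, 0 = incorrect.
# All data here is made up for illustration.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
])

for item in range(responses.shape[1]):
    difficulty = responses[:, item].mean()  # proportion answering correctly
    rest = responses.sum(axis=1) - responses[:, item]  # score on the other items
    discrimination = np.corrcoef(responses[:, item], rest)[0, 1]
    flag = "  <- review this item" if discrimination < 0.2 else ""
    print(f"Item {item + 1}: difficulty {difficulty:.2f}, "
          f"discrimination {discrimination:.2f}{flag}")
```

An item with near-zero or negative discrimination (participants who do well on the rest of the test tend to get it wrong) is a prime candidate for review or removal.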


7 Strategies to Shrink Satisficing & Improve Survey Results

Posted by John Kleeman

My previous post, “Satisficing: Why it might as well be a four-letter word,” explained that satisficing on a survey is when someone answers survey questions adequately but not as well as they can. Typically they just fill in answers without thinking too hard. As a commenter on the blog said: “Interesting! I have been guilty of this, didn’t even know it had a name!”

Examples of satisficing behavior are skipping questions or picking the first answer that makes some kind of sense. Satisficing is very common. As explained in the previous blog, some reasons for it are participants not being motivated to answer well, not having the ability to answer well, finding the survey too hard, or simply becoming fatigued by a survey that is too long.

Satisficing is a significant cause of survey error, so here are 7 strategies for a survey author to reduce satisficing:

1. Keep surveys short. Even the keenest survey respondent will get tired in a long survey and most of your respondents will probably not be keen. To get better results, make the survey as short as you possibly can.

2. Keep questions short and simple. A long and complex question is much more likely to get a poor-quality answer. You should break complex questions down into shorter ones. Also don’t ask about events that are difficult to remember. People’s memory of the past and of the time things happened is surprisingly fragile, and if you ask someone about events weeks or months ago, many will not recall well.

3. Avoid agree/disagree questions. Satisficing participants will most likely just agree with whatever statement you present. For more on the weaknesses of these kinds of questions, see my blog on the SAP community network: Strongly Disagree? Should you use Agree/Disagree in survey questions?

4. Similarly, remove “don’t know” options. If someone is trying to answer as quickly as possible, answering that they don’t know is easy to do and avoids thinking about the question.

5. Communicate the benefit of the survey to make participants want to answer well. You are doing the survey for a good reason.  Make participants believe the survey will have positive benefits for them or their organization. Also make sure each question’s results are actionable. If the participant doesn’t feel that spending the time to give you a good answer is going to help you take some useful action, why should they bother?

6. Find ways to encourage participants to think as they answer. For example, include a request for participants to deliberate carefully – it can remind them to pay attention. It can also be helpful to occasionally ask participants to justify their answers – perhaps adding a text comment box after the question where they can explain why they answered that way. Adding comment boxes is very easy to do in Questionmark software.

7. Put the most important questions early on. Some people will satisfice and they are more likely to do it later on in the survey. If you put the questions that matter most early on, you are more likely to get good results from them.

There is a lot you can do to reduce satisficing and encourage people to give their best answers. I hope these strategies help you shrink the amount of satisficing your survey participants do, and in turn give you more accurate results.
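These seven strategies work at design time, but you can also look for the fingerprints of satisficing in the responses you collect. Here is a minimal sketch that flags two of the behaviors mentioned above: skipped questions and straight-lining (giving the identical answer to every question). The data and the 25% threshold are made up for illustration; this is not a feature of any particular survey tool:

```python
import pandas as pd

# Hypothetical Likert-scale (1-5) responses; None marks a skipped question.
responses = pd.DataFrame({
    "q1": [4, 3, 5, None, 3],
    "q2": [4, 2, 5, None, 4],
    "q3": [4, 4, 5, 3, None],
    "q4": [4, 1, 5, None, 2],
}, index=["p1", "p2", "p3", "p4", "p5"])

skipped = responses.isna().mean(axis=1)         # share of questions skipped
straightlined = responses.nunique(axis=1) == 1  # identical answer every time
suspects = responses.index[(skipped > 0.25) | straightlined]
print("Possible satisficers worth a closer look:", list(suspects))
```

A flag is not proof of satisficing, but it shows you where your results may be least trustworthy.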

Getting Assessment Results You Can Trust: White Paper

Posted by Julie Delazyn

Modern organizations need their people to be competent.

Would you be comfortable in a high-rise building designed by an unqualified architect? Would you fly in a plane whose pilot hadn’t passed a flying test? Would you send a salesperson out on a call if they didn’t know what your products do? Can you demonstrate to a regulatory authority that your staff are competent and fit for their jobs if you do not have trustworthy assessments?

In all these cases and many more, it’s essential to have a reliable and valid test of competence. If you do not ensure that your workforce is qualified and competent, then you should not be surprised if your employees have accidents, cause your organization to be fined for regulatory infractions, give poor customer service or can’t repair systems effectively.

The white paper, Assessment Results You Can Trust, explains that trustworthy assessment results must be both valid (measuring what you want them to measure) and reliable (measuring it consistently).

For assessments to be valid and reliable, it’s necessary to follow structured processes at each step from planning through authoring to delivery and reporting.

The white paper covers these six stages of the assessment process:

  • Planning assessment
  • Authoring items
  • Assembling assessment
  • Pilot and review
  • Delivery
  • Analyze results

Following the advice in the white paper and using the capabilities it describes will help you produce assessments that are more valid and reliable — and hence more trustable.
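As a small, concrete illustration of the “reliable” half of that definition, here is a sketch of Cronbach’s alpha, a widely used internal-consistency statistic. The scores are made up, and values above roughly 0.7 are often treated as acceptable, though the right bar depends on the stakes of the assessment:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability: rows = participants, columns = items."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Made-up scores for five participants on a four-item assessment.
scores = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 3],
    [5, 5, 4, 4],
    [1, 2, 2, 1],
    [3, 3, 4, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```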

To download the complimentary white paper, click here.

Interested in finding out more about authoring assessments you can trust?  Make sure to join April Barnum’s session: Authoring Assessments You Can Trust: What’s the Process? We look forward to seeing you in Miami next week at Questionmark Conference 2016!
