7 ways assessments can save you money and protect your reputation [Compliance webinar]

Posted by Julie Delazyn

Last week, illegal banking practices cost Wells Fargo, one of America’s largest banks, $185 million in fines. Regulators have called the scandal “outrageous” and stated that the widespread nature of the illegal behavior shows the bank lacked the necessary controls and oversight of its employees.

Educating employees on proper practices and monitoring their understanding is vital for regulatory compliance. How do you ensure your workers comply with the rules and regulations in your industry? How do you prove that employee training has been understood?

Register today for the FREE webinar: 7 Ways Assessments Fortify Compliance

The webinar will examine real-world examples of how assessments are used to strengthen compliance programs. It will also provide tips for developing valid, reliable assessments.

Predicting Success at Entel Begins with Trustworthy Assessment Results [Case Study]

Posted by Julie Delazyn

Entel is one of Chile’s largest telecommunications firms, serving both the consumer and commercial sectors. With more than 12,000 employees across its extended enterprise, Entel provides a broad range of mobile and fixed communications services, IT and call center outsourcing, and network infrastructure services.

The Challenge

Success in the highly competitive mobile and telecommunications market takes more than world-class infrastructure, great connectivity, an established dealer network and an extensive range of retail locations. Achieving optimal on-the-job performance yields a competitive edge in the form of satisfied customers, increased revenues and lower costs. Yet actually accomplishing this objective is no small feat – especially for an industry job role notorious for high turnover rates.

With these challenges in mind, Entel embarked on an innovative new strategy to enhance the predictability of its hiring, onboarding, training and development practices for its nationwide team of 6,000+ retail store and call center representatives.

Certification as a Predictive Metric

Entel conducted an exhaustive analysis – a “big data” initiative that mapped correlations between dozens of disparate data points mined from various business systems, HR systems as well as assessment results – to develop a comprehensive model of the factors contributing to employee performance.

Working with Questionmark OnDemand enabled Entel to create the valid and reliable tests and exams necessary to measure and document representatives’ knowledge, skills and abilities.

Find out more about Entel’s program planning and development, which helped identify and set benchmarks for required knowledge and skills, optimal behaviors and performance metrics. The case study also covers Entel’s use of SAP SuccessFactors to manage and monitor performance against many of the program’s key behavioral aspects, as well as the growing role its trustworthy assessment results are playing in future product launches and the business as a whole.

Click here to read the full case study.

6 Steps to Authoring Trustworthy Assessments

Posted by April Barnum

I recently met with customers, and the topic of authoring trustworthy assessments that return trustable results was a top concern. No matter what they were assessing, everyone wants results that are trustable, meaning both valid and reliable. The reasons were similar, the top three being safety concerns, the ability to assert job competency, and regulatory compliance. I often share the white paper 5 steps to better tests as a strong resource to help you plan a strong assessment, and I encourage you to check it out. But here are six authoring steps that can help you achieve trustworthy assessment results:

  1. Planning the assessment, or blueprinting it: working out exactly what the test covers.
  2. Authoring, or creating, the items.
  3. Assembling the assessment: harvesting the items and assembling them for use in a test.
  4. Piloting and reviewing the assessment before putting it into production use.
  5. Delivering the assessment, or making it available to participants, following the security, proctoring and other requirements set out in the planning stage.
  6. Analyzing the results of the assessment and sharing them with stakeholders. This step also involves using the data to weed out problem items or other issues that might be uncovered.

Each step contributes to the next, and useful analysis of the results is only possible if every previous stage has been done effectively. In future posts, I will go into each step in detail and highlight aspects you should be considering at each stage of the process.
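As a sketch of what step 6 can look like in practice, here is a minimal, generic item analysis (an illustration only, not a description of Questionmark's reporting; the response data and function names are hypothetical). It computes each item's difficulty (proportion answering correctly) and discrimination (correlation between the item score and the rest-of-test score); items with very low or negative discrimination are candidates for weeding out:

```python
# Illustrative sketch, not Questionmark functionality.
# rows = participants, columns = items (1 = correct, 0 = incorrect)

def item_analysis(responses):
    """Return (difficulty, discrimination) for each item.

    Difficulty is the proportion of correct answers; discrimination is
    the correlation between the item score and the rest-of-test score.
    """
    n_items = len(responses[0])
    stats = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        rest = [sum(row) - row[i] for row in responses]  # score excluding this item
        difficulty = sum(item) / len(item)
        stats.append((difficulty, _corr(item, rest)))
    return stats

def _corr(xs, ys):
    """Pearson correlation; 0.0 if either series has no variance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Hypothetical results: 4 participants answering 3 items
responses = [
    [1, 1, 0],
    [1, 0, 0],
    [0, 1, 1],
    [1, 1, 1],
]
for idx, (p, disc) in enumerate(item_analysis(responses), start=1):
    print(f"Item {idx}: difficulty={p:.2f} discrimination={disc:+.2f}")
```

A real program would run this over pilot or production data and flag items whose discrimination falls below an agreed threshold for review.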


7 Strategies to Shrink Satisficing & Improve Survey Results

Posted by John Kleeman

My previous post Satisficing: Why it might as well be a four-letter word explained that satisficing on a survey is when someone answers survey questions adequately but not as well as they can. Typically they just fill in questions without thinking too hard. As a commenter on the blog said: “Interesting! I have been guilty of this, didn’t even know it had a name!”

Examples of satisficing behavior include skipping questions or picking the first answer that makes some kind of sense. Satisficing is very common. As explained in the previous post, some reasons for it are participants not being motivated to answer well, not having the ability to answer well, finding the survey too hard, or simply becoming fatigued by a survey that is too long.

Satisficing is a significant cause of survey error, so here are 7 strategies for a survey author to reduce satisficing:

1. Keep surveys short. Even the keenest survey respondent will get tired in a long survey, and most of your respondents will probably not be keen. To get better results, make the survey as short as you possibly can.

2. Keep questions short and simple. A long and complex question is much more likely to get a poor-quality answer. Break complex questions into shorter ones. Also, don’t ask about events that are difficult to remember. People’s memory of the past, and of when things happened, is surprisingly fragile; if you ask someone about events weeks or months ago, many will not recall them well.

3. Avoid agree/disagree questions. Satisficing participants will most likely just agree with whatever statement you present. For more on the weaknesses of these kinds of questions, see my blog on the SAP community network: Strongly Disagree? Should you use Agree/Disagree in survey questions?

4. Similarly, remove “don’t know” options. If someone is trying to answer as quickly as possible, saying they don’t know is easy and lets them avoid thinking about the question.

5. Communicate the benefit of the survey to make participants want to answer well. You are doing the survey for a good reason. Make participants believe the survey will have positive benefits for them or their organization. Also make sure each question’s results are actionable. If the participant doesn’t feel that spending the time to give you a good answer will help you take useful action, why should they bother?

6. Find ways to encourage participants to think as they answer. For example, ask participants up front to deliberate carefully; such a request can remind them to pay attention. It can also help to occasionally ask participants to justify their answers, perhaps by adding a text comment box after a question asking why they answered that way. Adding comment boxes is very easy to do in Questionmark software.

7. Put the most important questions early on. Some people will satisfice, and they are more likely to do so later in the survey. If you put the questions that matter most early on, you are more likely to get good results from them.

There is a lot you can do to reduce satisficing and encourage people to give their best answers. I hope these strategies help you shrink the amount of satisficing your survey participants do, and in turn give you more accurate results.
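One way to see whether these strategies are working is to look for satisficing signatures in the response data itself. The sketch below (hypothetical data and function names; not a Questionmark feature) flags respondents who "straight-line", that is, give the identical rating to every question in a rating grid:

```python
# Illustrative sketch with hypothetical data: detect straight-lining,
# a common satisficing signature in matrix-style rating questions.

def straight_liners(responses, min_questions=5):
    """Return IDs of respondents whose ratings never vary across the grid.

    Only respondents who answered at least min_questions items are
    considered, so short answer sets aren't flagged unfairly.
    """
    flagged = []
    for respondent_id, answers in responses.items():
        if len(answers) >= min_questions and len(set(answers)) == 1:
            flagged.append(respondent_id)
    return flagged

survey = {
    "r1": [4, 4, 4, 4, 4],  # same rating everywhere -> possible satisficing
    "r2": [5, 3, 4, 2, 4],  # varied answers
    "r3": [2, 2, 2, 2, 2],  # same rating everywhere
}
print(straight_liners(survey))  # -> ['r1', 'r3']
```

Flagged respondents aren't necessarily satisficing (someone may genuinely feel neutral about everything), but a high straight-lining rate is a useful warning sign that a survey needs shortening or redesign.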

Getting Assessment Results You Can Trust: White Paper

Posted by Julie Delazyn

Modern organizations need their people to be competent.

Would you be comfortable in a high-rise building designed by an unqualified architect? Would you fly in a plane whose pilot hadn’t passed a flying test? Would you send a salesperson out on a call if they didn’t know what your products do? Can you demonstrate to a regulatory authority that your staff are competent and fit for their jobs if you do not have trustworthy assessments?

In all these cases and many more, it’s essential to have a reliable and valid test of competence. If you do not ensure that your workforce is qualified and competent, then you should not be surprised if your employees have accidents, cause your organization to be fined for regulatory infractions, give poor customer service or can’t repair systems effectively.

The white paper, Assessment Results You Can Trust, explains that trustworthy assessment results must be both valid (measuring what you intend them to measure) and reliable (measuring it consistently).

For assessments to be valid and reliable, it’s necessary to follow structured processes at each step from planning through authoring to delivery and reporting.

The white paper covers these six stages of the assessment process:

  • Planning assessment
  • Authoring items
  • Assembling assessment
  • Pilot and review
  • Delivery
  • Analyze results

Following the advice in the white paper and using the capabilities it describes will help you produce assessments that are more valid and reliable — and hence more trustable.

To download the complimentary white paper, click here.

Interested in finding out more about authoring assessments you can trust?  Make sure to join April Barnum’s session: Authoring Assessments You Can Trust: What’s the Process? We look forward to seeing you in Miami next week at Questionmark Conference 2016!

Establishing a data-driven assessment strategy – A Q&A with Amazon

Posted by Julie Delazyn

Jason Sunseri is a Senior Program Manager, Learning Technology, at Amazon. He will be leading a discussion at Questionmark Conference 2016 in Miami about Creating a Global Knowledge and Skills Assessment Program for Amazon Sellers.

Jason’s session will look at how Amazon Seller Support and Questionmark OnDemand have partnered to deliver a world-class solution. Jason will illustrate how Amazon has used the OnDemand platform to deliver a robust, data-driven assessment strategy.

I recently asked him about his session:

Tell me about Amazon and its use of assessments:

Amazon Seller Support engages with the 2.5 million+ global sellers represented on the Amazon platform. Due to rapid global expansion across the platform, Amazon Seller Support needed to find a technology and assessment partner that could support both its knowledge and skill acquisition assessment strategies.

How does Amazon use data to drive strategy?

Assessments play a huge role at Amazon. We have really evolved into a data-driven culture and we use assessments in surveys and inside curriculum to assess training and performance, and to identify early issues and trends in order to tweak training content and fix errors.

What role does Questionmark play in that strategy?

We rely heavily on reports — Survey Matrix, Job Task Analysis and other report functions — to assess performance. We’re able to leverage the tool by having individual training centers analyze learning and training gaps and pass on those results. It allows us to see how and why a site is succeeding; where that behavior stems from — it’s really cool to see.

What are you looking forward to at the conference?

It’s Miami, so…the weather, for sure! In all seriousness, I look forward to learning about how other Questionmark users utilize the same tools and how their approach varies from ours.

Thank you, Jason, for taking time out of your busy schedule to discuss your session with us!
