Seven tips to recruit and manage SMEs for technology certification exams

Posted by John Kleeman

How do you keep a certification exam up to date when the technology it is assessing is changing rapidly?

Certifications in new technologies like software-as-a-service and cloud solutions have some specific challenges. The nature of the technology means that questions often require very specialist knowledge to author. And because knowledge of the new technology is in short supply, the subject matter experts (SMEs) who are able to author and review new items will also be in high demand within the organization for other purposes.

Cloud offerings also change rapidly. It used to be that new technology releases came out every year or two, and if you were writing certification exams or other assessments to test knowledge and skill in them, you had plenty of notice and could plan an update cycle. But nowadays most technology organizations take an agile approach to development, with the motto “release early, release often”. Cloud technology makes frequent, evolutionary releases – often monthly or quarterly – the norm.

So how can you keep an exam valid and reliable if the content you are assessing is changing rapidly? Here are seven tips that could help – a few inspired by an excellent presentation by Cisco and Microsoft at the recent European Association of Test Publishers conference.

  1. Try to obtain item writing SMEs from product development. They will know what is coming and what is changing and will be in a good position to write accurate questions. 
  2. Also network for SMEs outside the organization – at technology conferences, via partners and resellers, on social media and/or via an online form on your certification website. A good source of SMEs will be existing certified people.
  3. Incentivize SMEs – what works best will depend on your organization, but you can consider free re-certifications, vouchers, discounts on conferences and books, and other incentives. Remember also that for many people working in technology, recognition and appreciation are as important as financial incentives. Appreciate and recognize your SMEs; for internal SMEs, send thank-you letters to their managers acknowledging their effort.
  4. Focus your exam on underlying key knowledge and skills that will not become obsolete quickly. Work with your experts to avoid items that are likely to date, and seek to test fundamental concepts, not version-specific features.
  5. When working with item writers, don’t be frightened to develop questions based on beta or planned functionality, but always check before questions go live in case the planned functionality hasn’t shipped yet.
  6. Since your item writers will likely be geographically spread, busy and tech-literate, use a good collaborative tool for item writing and item banking that allows easy online review and tracking of changes. (See https://www.questionmark.com/content/distributed-authoring-and-item-management for information on Questionmark’s authoring solution.)
  7. In technology as in other areas, confidentiality and exam security are crucial to ensure the integrity of the exam. You should have a formal agreement with internal and external SMEs who author or review questions, reminding them not to pass the questions to others. Ensure that your HR or legal department is involved in drafting these agreements so that they are enforceable.

Certification of new technologies helps adoption and deployment and contributes to all stakeholders’ success. I hope these tips help you improve your assessment programme.

5 methods to use when planning your assessments

Posted by April Barnum

In my previous article, I gave an overview of the six authoring steps that can help you achieve trustable assessment results. Each step contributes to the next, and useful analysis of results is only possible if all six steps are done effectively.

Now, let’s dig into step 1 of the authoring process: planning the assessment. There are five methods you can use to plan your assessments for trustable results, and Questionmark offers the technology to support each of the five methods covered below.

  1. To determine what the test should cover, use job task analysis (JTA) surveys to make sure you assess the right competencies. A JTA helps you analyze which tasks within a job role are most important and is key to discovering which topics an assessment needs to cover. Questionmark offers a JTA question type and JTA reports to help you run JTAs easily and effectively and get useful data for your assessment design (see the first sketch after this list).
  2. Once the JTA has been completed, you can determine the topics that an assessment needs to cover. Using an assessment management system with an item bank that structures items by hierarchical topics is hugely beneficial and makes it easy to manage and view all items and assessments under development.
  3. Indexing or metatagging items by specific job tasks, knowledge, skills and abilities allows for more flexible management of items and their selection into the appropriate assessments (see the second sketch after this list).
  4. Protecting against content theft is an important part of planning items and assessments, because if item or assessment content leaks out during the construction process, it will reduce the assessment’s validity. Secure access to items and assessments is essential: individual logons protected by strong passwords, plus good policies and culture within your team, can help prevent leaks.
  5. Plan to assess each participant’s competence in the language they are most comfortable in. If you need multilingual assessments, translation management and multilingual delivery capabilities should be part of your assessment planning from the start.
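
To make method 1 concrete, here is a minimal sketch of how importance and frequency ratings from a JTA survey can be turned into a criticality ranking that drives topic coverage. The task names, rating scales and multiplicative weighting are illustrative assumptions, not Questionmark’s actual JTA report logic:

```python
from statistics import mean

# Hypothetical JTA survey results: several SMEs rated each job task
# for importance and frequency on a 1-5 scale.
ratings = {
    "Configure user accounts": {"importance": [5, 4, 5], "frequency": [4, 5, 4]},
    "Restore from backup":     {"importance": [5, 5, 4], "frequency": [2, 1, 2]},
    "Customize dashboards":    {"importance": [2, 3, 2], "frequency": [3, 3, 4]},
}

def criticality(task):
    # One simple, common model: mean importance x mean frequency.
    # Real JTA studies may prefer additive or weighted alternatives.
    return mean(task["importance"]) * mean(task["frequency"])

# Rank the tasks; high-criticality tasks become the topics to which
# the assessment blueprint allocates the most items.
for name in sorted(ratings, key=lambda n: criticality(ratings[n]), reverse=True):
    print(f"{name}: criticality {criticality(ratings[name]):.1f}")
```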
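
Methods 2 and 3 amount to giving every item a home in a topic hierarchy plus searchable metadata. Here is a minimal sketch of that structure and of selecting items for an assessment by topic or tag; the field names and tags are illustrative, not Questionmark’s actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    topic: str                               # hierarchical topic path
    tags: set = field(default_factory=set)   # job tasks, knowledge, skills, abilities

bank = [
    Item("Q001", "Administration/Users",  {"task:configure-accounts", "skill:security"}),
    Item("Q002", "Administration/Backup", {"task:restore-backup"}),
    Item("Q003", "Reporting/Dashboards",  {"task:customize-dashboards"}),
]

def select(bank, topic_prefix=None, tag=None):
    """Pick items by topic subtree and/or metadata tag."""
    return [
        item for item in bank
        if (topic_prefix is None or item.topic.startswith(topic_prefix))
        and (tag is None or tag in item.tags)
    ]

# Everything under the Administration topic subtree:
print([i.item_id for i in select(bank, topic_prefix="Administration")])
# Everything tagged with a specific job task:
print([i.item_id for i in select(bank, tag="task:restore-backup")])
```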

I often share the white paper 5 Steps to Better Tests as a strong resource to help you plan a solid assessment, and I encourage you to check it out.

Next time, we’ll discuss authoring items. I hope you enjoyed these tips. If there are other methods you return to when you begin your assessment planning, please add them in the comments section below!

6 Steps to Authoring Trustworthy Assessments

Posted by April Barnum

I recently met with customers, and the topic of authoring trustworthy assessments and getting back trustable results was a top concern. No matter what they were assessing, everyone wants results that are trustable, meaning that they are both valid and reliable. The reasons were similar, with the top three being safety concerns, being able to assert job competency, and regulatory compliance. I often share this white paper: 5 Steps to Better Tests, as a strong resource for planning, and I encourage you to check it out. But here are six authoring steps that can help you achieve trustworthy assessment results:

  1. Planning the assessment, or blueprinting it: working out what it is that the test should cover.
  2. Authoring or creating the items.
  3. Assembling the assessment, or harvesting the items and assembling them for use in a test.
  4. Piloting and reviewing the assessment prior to production use.
  5. Delivering the assessment, or making the assessment available to participants while following the security, proctoring and other requirements set out in the planning stage.
  6. Analyzing the results of the assessment and sharing them with stakeholders. This step also involves using the data to weed out any problem items or other issues that might be uncovered.

Each step contributes to the next, and useful analysis of the results is only possible if every previous stage has been done effectively. In future posts, I will go into each step in detail and highlight aspects you should be considering at each stage of the process.


Develop Better Tests with Item Analysis [New eBook]

Posted by Chloe Mendonca

Item Analysis is probably the most important tool for increasing test effectiveness.  In order to write items that accurately and reliably measure what they’re intended to, you need to examine participant responses to each item. You can use this information to improve test items and identify unfair or biased items.

So what’s the process for conducting an item analysis? What should you be looking for? How do you determine if a question is “good enough”?
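
As a taste of what that process involves, here is a minimal sketch of the two core statistics of classical item analysis: item difficulty (the proportion of participants answering correctly) and item discrimination (the correlation between the item and the total score). The response data is invented, and production tools compute more refined, corrected variants of these:

```python
import statistics

def pearson(x, y):
    """Pearson correlation; for a 0/1 item against total score this is
    the point-biserial discrimination index."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    sx, sy = statistics.pstdev(x), statistics.pstdev(y)
    return cov / (sx * sy) if sx and sy else 0.0

def item_analysis(responses):
    """responses: one row per participant, one 0/1 column per item."""
    totals = [sum(row) for row in responses]
    for i in range(len(responses[0])):
        item = [row[i] for row in responses]
        difficulty = sum(item) / len(item)        # proportion correct
        discrimination = pearson(item, totals)    # item-total correlation
        print(f"Item {i + 1}: difficulty {difficulty:.2f}, "
              f"discrimination {discrimination:.2f}")

# Invented results for five participants and three items (1 = correct).
item_analysis([
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 1],
])
```

A common rule of thumb is to take a closer look at items whose discrimination falls below about 0.2, though the right thresholds depend on the test’s purpose.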

Questionmark has just published a new eBook, “Item Analysis Analytics,” which answers these questions. The eBook shares many examples of the varying statistics that you may come across in your own analyses.

Download this eBook to learn about these aspects of analytics:

  • the basics of classical test theory and item analysis
  • the process of conducting an item analysis
  • essential things to look for in a typical item analysis report
  • whether a question “makes the grade” in terms of psychometric quality

This eBook is available as a PDF and ePUB suitable for viewing on a variety of mobile devices and eReaders.

I hope you enjoy reading it!

7 Strategies to Shrink Satisficing & Improve Survey Results


Posted by John Kleeman

My previous post Satisficing: Why it might as well be a four-letter word explained that satisficing on a survey is when someone answers survey questions adequately but not as well as they can. Typically they just fill in questions without thinking too hard. As a commenter on the blog said: “Interesting! I have been guilty of this, didn’t even know it had a name!”

Examples of satisficing behavior are skipping questions or picking the first answer that makes some kind of sense. Satisficing is very common. As explained in the previous blog, some reasons for it are participants not being motivated to answer well, lacking the ability to answer well, finding the survey too hard, or simply becoming fatigued by too long a survey.

Satisficing is a significant cause of survey error, so here are 7 strategies for a survey author to reduce satisficing:

1. Keep surveys short. Even the keenest survey respondent will get tired in a long survey and most of your respondents will probably not be keen. To get better results, make the survey as short as you possibly can.

2. Keep questions short and simple. A long and complex question is much more likely to get a poor quality answer.  You should deconstruct complex questions into shorter ones. Also don’t ask about events that are difficult to remember. People’s memory of the past and of the time things happened is surprisingly fragile, and if you ask someone about events weeks or months ago, many will not recall well.

3. Avoid agree/disagree questions. Satisficing participants will most likely just agree with whatever statement you present. For more on the weaknesses of these kinds of questions, see my blog on the SAP community network: Strongly Disagree? Should you use Agree/Disagree in survey questions?

4. Similarly, remove “don’t know” options. If someone is trying to answer as quickly as possible, saying they don’t know is easy and avoids thinking about the question.

5. Communicate the benefit of the survey to make participants want to answer well. You are doing the survey for a good reason.  Make participants believe the survey will have positive benefits for them or their organization. Also make sure each question’s results are actionable. If the participant doesn’t feel that spending the time to give you a good answer is going to help you take some useful action, why should they bother?

6. Find ways to encourage participants to think as they answer. For example, a request to deliberate carefully can remind them to pay attention. It can also be helpful to occasionally ask participants to justify their answers – perhaps by adding a text comment box after the question where they explain why they answered that way. Adding comment boxes is very easy to do in Questionmark software.

7. Put the most important questions early on. Some people will satisfice and they are more likely to do it later on in the survey. If you put the questions that matter most early on, you are more likely to get good results from them.

There is a lot you can do to reduce satisficing and encourage people to give their best answers. I hope these strategies help you shrink the amount of satisficing your survey participants do, and in turn give you more accurate results.

Item Development Tips For Defensible Assessments

Posted by Julie Delazyn

Whether you work with low-stakes assessments, small-scale classroom assessments or large-scale, high-stakes assessments, understanding and applying some basic principles of item development will greatly enhance the quality of your results.

What began as a popular 11-part blog series has morphed into a white paper: Managing Item Development for Large-Scale Assessment, which offers sound advice on how to organize and execute the item development steps that will help you create defensible assessments. You can download your copy of the complimentary white paper here: Managing Item Development for Large-Scale Assessment
