New best practice webinars: Taking your assessments from good to great

Posted by Chloe Mendonca

“Good, better, best. Never let it rest. ‘Til your good is better and your better is best.” This little old rhyme teaches us a valuable lesson: There is always room for improvement! No matter what role or business you’re in, if you’re interested in long-term success, you should strive to continuously improve your knowledge, systems and processes.

But how does this relate to assessments? Well, there are always things we can do to develop better assessments and more secure, more trustworthy assessment programs. Maybe your current assessment program is “good”, but is “good” good enough?

We’re offering two new webinars that will help you assess how you’re currently performing in two key areas — and take your assessments from good to great:

  1. Item Writing

How to write high-quality test items [35-Minute Session]

  • 3rd August, 2016, 3:00 p.m. UK BST / 10:00 a.m. US EDT

Are your items poorly written? Perhaps they’re good but you want them to be “better”. Skilfully crafted items promote learning and memory recall, and they help participants retain knowledge, skills and abilities over time. But writing high-quality items isn’t as easy as it looks. This session will give you tips for taking your items to the next level.

  2. Exam Integrity

Enhancing exam integrity with online proctoring [45-Minute Session]

  • 9th August, 2016, 3:00 p.m. UK BST / 10:00 a.m. US EDT

With online proctoring rapidly gaining the attention of organisations and test sponsors around the world, many are wondering how it compares with traditional test centre proctoring. This 45-minute webinar will discuss what online proctoring is, how it works and whether it can in fact boost test security. Don’t miss this session if you’re keen to extend geographic reach and lower test administration costs.


If you’re looking to learn more about what you can achieve with Questionmark’s Assessment Management System, join our 60-minute introductory session. We’ll demo the platform live and cover a number of key features and functions. Save your seat at one of these sessions:

Intro to Questionmark’s Assessment Management System [60-Minute Session]

  • 4th August, 2016, 10:30 a.m. (BST) UK
  • 10th August, 2016, 12:00 p.m. (EDT) US

We also deliver this webinar in Spanish and Portuguese. Check out the upcoming dates and times here.

Predicting Success at Entel Begins with Trustworthy Assessment Results [Case Study]

Posted by Julie Delazyn

Entel is one of Chile’s largest telecommunications firms, serving both the consumer and commercial sectors. With more than 12,000 employees across its extended Entel enterprise, Entel provides a broad range of mobile and fixed communications services, IT and call center outsourcing, and network infrastructure services.

The Challenge

Success in the highly competitive mobile and telecommunications market takes more than world-class infrastructure, great connectivity, an established dealer network and an extensive range of retail locations. Achieving optimal on-the-job performance yields a competitive edge in the form of satisfied customers, increased revenues and lower costs. Yet actually accomplishing this objective is no small feat – especially in job roles notorious for high turnover rates.

With these challenges in mind, Entel embarked on an innovative strategy to enhance the predictability of the hiring, onboarding, training and development practices for its nationwide team of 6,000+ retail store and call center representatives.

Certification as a Predictive Metric

Entel conducted an exhaustive analysis – a “big data” initiative that mapped correlations between dozens of disparate data points mined from various business systems, HR systems and assessment results – to develop a comprehensive model of the factors contributing to employee performance.

Working with Questionmark OnDemand enabled Entel to create the valid and reliable tests and exams necessary to measure and document representatives’ knowledge, skills and abilities.

Find out more about Entel’s program planning and development, which helped identify and set benchmarks for required knowledge and skills, optimal behaviors and performance metrics; its use of SAP SuccessFactors to manage and monitor performance against many of the key behavioral aspects of the program; and the growing role its trustworthy assessment results are playing in future product launches and the business as a whole.

Click here to read the full case study.

Security, Reporting and Online Testing in Academia: A Q&A with Don Kassner

I recently spoke with Don Kassner, who has joined Questionmark as vice president of academic markets. I wanted to take a moment to welcome Don and to ask him a few questions about his extensive background, his new job role and his hopes for the future. Here’s a snippet of our conversation:

Don Kassner, VP of academic markets, Questionmark

You have had an extensive career. What is your background, and how has it influenced the insight you have on the learning and assessment world?

At heart, I am an entrepreneur.  I started my first business when I was 8 years old and have been starting and building things ever since. In the last 15 years or so I have focused on technology, training and education. During that time, I have served as the CFO for an auto dealership training company and as president of a small university in addition to founding and building the largest online proctoring company. I have also served on the faculty of San Jose State University (economics) and have been an active member of multiple onsite accreditation evaluation teams.

Now that you are part of the Questionmark team, can you share a little about your new role and the goals you have for this position?

I am excited to reach out to the academic markets and address ongoing concerns around testing.

Colleges and universities want to deliver tests and exams that are consistent, fair, reliable and defensible. They want to deliver and monitor course evaluations, identify knowledge gaps and place students in appropriate courses, and be able to analyze questions to determine which ones are valid and fair. They want to do this while increasing student satisfaction and reducing cheating.

With Questionmark’s extensive security, reporting and analytics tools, that’s all possible.  My focus is to leverage my insight and experience and put this secure and easy-to-use tool in the right hands.

Can you share some of the important topics surrounding the academic markets that are on your radar?

In academia, there are real issues related to online testing. These are the top two:

  • Is the student doing the work? – With technology comes flexibility and as the stakes become higher, students will naturally look for ways to enhance their scores.
  • Are the exam results fair?  Did all the students face the same conditions with the same opportunities and were the exam rules followed?

Questionmark OnDemand, especially its analytics and reporting tools, can provide reports that uncover the meaning hidden within assessment results, such as an item’s or assessment’s reliability and defensibility. Regarding security, Questionmark Secure’s lock-down browser plays a huge role in helping organizations provide a secure environment in which to deliver high-stakes assessments. It can significantly reduce the risk of cheating when deployed along with other defenses against impersonation and content theft.

We’re very excited to have you on board! What are your plans for keeping in touch with us on The Questionmark Blog?

Once a teacher, always a teacher! I look forward to turning back the clock to my university days and speaking on the hot-button topics that surround academic testing. My experience in education and entrepreneurship will help me as I share my thoughts on current market trends and the future of assessment and testing.

Thank you Don for taking the time to speak to us today!

You can follow Don on Twitter and connect with him on LinkedIn.

SAML 101

Posted by Bart Hendrickx

As I mentioned in my previous post on SSO, Single Sign-On: Who’s Involved?, we’ll take a look at SAML to understand what it is and how it’s used with SSO. In this post I’ll explain what SAML is; in my next post I’ll offer an example use case.


So, What Is SAML?

SAML, or Security Assertion Markup Language, is a protocol that allows systems to exchange authentication data on users. (It facilitates other use cases as well, but I will focus on authentication.) What does that mean? It means that one system can ask: “Who is this user?” and another system can answer: “This is Jane Doe.” As I mentioned in my previous post, these are the service provider (SP) and identity provider (IdP) respectively.

A service provider (SP) in this context can be any software system with which you can do something, such as sending and receiving email, tracking projects or delivering assessments. Similarly, an identity provider (IdP) can be any software system that contains data on users that you can use to determine who those users are.

If you manage an SP, you probably don’t want just any IdP telling you who someone is. You will typically trust only one or a few IdPs. And if you are in charge of an IdP, you will likewise prefer to send user data only to those SPs you know and trust. To accomplish that, the SP and IdP exchange data that allow them to establish a trust relationship. Those data are often called federation metadata, “federation” referring to the fact that there is an alliance between the different systems.

SAML is a popular protocol to set up such federations between service providers and identity providers. Look up SAML in your favorite search engine and you will get many results. One of its advantages is that it is extensible, meaning that you can exchange information that is relevant to your situation. For example, do you have an IdP that stores the hire date for an employee (or enrollment date of a student)? Do you want to share those data with an SP so that it can decide whether the user is allowed to access a certain resource? Then you can set up the federation in such a way that the IdP will send an attribute for hire (or enrollment) date to the SP.
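To picture what such an attribute exchange looks like, here is a minimal Python sketch that reads a hire-date attribute out of a heavily simplified, hypothetical SAML attribute statement. A real assertion would also carry a subject, conditions and a digital signature, and would be produced and validated by a SAML library rather than by hand:

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical attribute statement an IdP might include
# in a SAML assertion. Rows like hireDate are exactly the kind of
# extensible attribute described above.
ASSERTION = """\
<saml:AttributeStatement xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Attribute Name="hireDate">
    <saml:AttributeValue>2014-09-01</saml:AttributeValue>
  </saml:Attribute>
  <saml:Attribute Name="mail">
    <saml:AttributeValue>jane.doe@example.com</saml:AttributeValue>
  </saml:Attribute>
</saml:AttributeStatement>
"""

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}

def read_attributes(xml_text):
    """Return a dict mapping attribute names to their values."""
    root = ET.fromstring(xml_text)
    return {
        attr.get("Name"): attr.findtext("saml:AttributeValue", namespaces=NS)
        for attr in root.findall("saml:Attribute", NS)
    }

attrs = read_attributes(ASSERTION)
print(attrs["hireDate"])  # 2014-09-01
```

An SP receiving this assertion could then use the hire date to decide whether the user may access a given resource.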

Another advantage, and it is a huge one, is that SAML can be used in situations where the IdP and SP cannot talk to each other, for example because they are on different networks. You may have an IdP running on your internal network, behind a firewall. Your SP may be available in the cloud, as is the case with Questionmark OnDemand. The SP cannot talk to the IdP because it cannot “see” it. However, that’s not a problem for SAML. In my next post, we’ll take a look at a typical use case so we can see the practicality of using SAML with SSO.
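To make that browser-mediated exchange concrete, here is a minimal Python sketch of how an SP could build the redirect URL that sends the user’s browser to the IdP, following the SAML HTTP-Redirect binding (raw DEFLATE, then base64, then URL-encoding). The endpoint addresses and request contents are hypothetical, and a production SP would use a SAML library and typically sign the request:

```python
import base64
import urllib.parse
import zlib

# Hypothetical endpoint and request values, for illustration only.
IDP_SSO_URL = "https://idp.example.org/sso"
AUTHN_REQUEST = (
    '<samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" '
    'ID="_abc123" Version="2.0" '
    'AssertionConsumerServiceURL="https://sp.example.com/acs"/>'
)

def redirect_url(idp_url, authn_request):
    """Encode an AuthnRequest per the SAML HTTP-Redirect binding:
    raw DEFLATE, then base64, then URL-encode as the SAMLRequest
    query parameter carried by the user's browser."""
    compressor = zlib.compressobj(9, zlib.DEFLATED, -15)  # -15 = raw DEFLATE
    deflated = compressor.compress(authn_request.encode("utf-8")) + compressor.flush()
    encoded = base64.b64encode(deflated).decode("ascii")
    return idp_url + "?" + urllib.parse.urlencode({"SAMLRequest": encoded})

# The browser is sent to this URL and carries the request to the IdP,
# even though the SP and IdP cannot reach each other directly.
print(redirect_url(IDP_SSO_URL, AUTHN_REQUEST))
```

The IdP simply reverses the steps (URL-decode, base64-decode, inflate) to recover the request, which is why no direct network path between SP and IdP is needed.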

 

6 Steps to Authoring Trustworthy Assessments

AprilPosted by April Barnum

I recently met with customers, and the topic of authoring trustworthy assessments and getting back trustable results was a top concern. No matter what they were assessing, everyone wants results that are trustable, meaning that they are both valid and reliable. The reasons were similar, with the top three being safety concerns, being able to assert job competency, and regulatory compliance. I often share the white paper 5 steps to better tests as a strong resource to help you plan an assessment, and I encourage you to check it out. But here are six authoring steps that can help you achieve trustworthy assessment results:

  1. Planning, or blueprinting, the assessment: working out exactly what the test covers.
  2. Authoring or creating the items.
  3. Assembling the assessment: harvesting the items and assembling them for use in a test.
  4. Piloting and reviewing the assessment before putting it into production use.
  5. Delivering the assessment: making it available to participants while following the security, proctoring and other requirements set out in the planning stage.
  6. Analyzing the results of the assessment: reviewing the results and sharing them with stakeholders. This step also involves using the data to weed out problem items or other issues that might be uncovered.
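The item screening in step 6 is often based on classical item statistics. As an illustrative sketch (the response data and flagging thresholds below are invented), item difficulty (the proportion answering correctly) and point-biserial discrimination (how well an item separates high and low scorers) can be computed like this:

```python
# Illustrative item analysis for step 6: flag weak items using
# classical statistics. The response matrix and thresholds are
# made-up examples. Rows = participants, columns = items;
# 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
]

def item_stats(matrix):
    """Return (difficulty, point-biserial discrimination) per item."""
    n = len(matrix)
    totals = [sum(row) for row in matrix]
    mean_t = sum(totals) / n
    sd_t = (sum((t - mean_t) ** 2 for t in totals) / n) ** 0.5
    stats = []
    for j in range(len(matrix[0])):
        col = [row[j] for row in matrix]
        p = sum(col) / n  # difficulty: proportion correct
        # point-biserial: correlation of item score with total score
        cov = sum((c - p) * (t - mean_t) for c, t in zip(col, totals)) / n
        sd_i = (p * (1 - p)) ** 0.5
        r_pb = cov / (sd_i * sd_t) if sd_i and sd_t else 0.0
        stats.append((p, r_pb))
    return stats

for i, (p, r) in enumerate(item_stats(responses), start=1):
    flag = " <- review" if p < 0.3 or p > 0.9 or r < 0.2 else ""
    print(f"item {i}: difficulty={p:.2f}, discrimination={r:.2f}{flag}")
```

Items that nearly everyone gets right (or wrong), or whose scores barely correlate with total scores, are the usual candidates for review or removal before the next delivery cycle.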

Each step contributes to the next, and useful analysis of the results is only possible if every previous stage has been done effectively. In future posts, I will go into each step in detail and highlight aspects you should be considering at each stage of the process.
