Using the Assessment results over time report in Questionmark Analytics

This week’s “how to” article highlights the “Assessment results over time report,” one of the Questionmark Analytics reports now available in Questionmark OnDemand.

  • What it does: The Assessment results over time report provides summary assessment performance information over time, including the mean, minimum and maximum values for a test or exam as well as its 95% confidence interval. It also shows the number of participants who took the assessment during each period. You can apply these filters to your report:
      • Assessment filter
      • Group filter
      • Date filter
  • Who should use it: Assessment, learning and education professionals can use this report to view and filter assessment results over time, making it easy for them to flag abnormal patterns that may indicate a statistically significant difference between the means.
  • How it looks: This report presents a great deal of information graphically and compactly, making it quick to interpret. It is broken down into two graphs.

1. A graph displaying average assessment scores achieved by participants over a period of time. The blue triangles represent the means of the assessment results. The vertical lines next to the triangles denote 95% confidence intervals: long bars indicate widely varied results, while short bars indicate high confidence in the mean value. It’s easy to see in the first graph that the results of tests administered just before September 6, 2010, differ dramatically from the other results during this period!
2. A graph displaying the number of results from the same time period. This volume information can help you plan administration sessions and anticipate load.

A PDF and an analysis-friendly CSV can also be generated.
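If you want to reproduce the first graph’s statistics from that CSV, the confidence interval plotted for each period is the standard normal-approximation interval, mean ± 1.96 × s/√n. Here is a minimal Python sketch of the calculation; the column names (“Date” and “Score”) are illustrative assumptions rather than the actual export format:

    import csv
    import math
    from collections import defaultdict

    def results_over_time(path):
        """Compute the per-period mean and a 95% confidence interval from a
        results CSV. Column names ("Date", "Score") are assumed for
        illustration, not taken from the actual export."""
        scores = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                scores[row["Date"]].append(float(row["Score"]))
        for date, values in sorted(scores.items()):
            n = len(values)
            mean = sum(values) / n
            if n > 1:
                # Sample standard deviation, then a normal-approximation CI.
                sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
                half = 1.96 * sd / math.sqrt(n)
            else:
                half = 0.0  # a single result has no measurable spread
            print(f"{date}: n={n} mean={mean:.1f} "
                  f"95% CI=({mean - half:.1f}, {mean + half:.1f})")

Periods whose intervals do not overlap, such as the results just before September 6, 2010 in the example above, are the ones worth investigating.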


Happy Birthday SharePoint 2010

Posted by John Kleeman

A year ago (May 12, 2010), Microsoft launched SharePoint 2010, and Questionmark wishes it a happy first birthday.

A key improvement in SharePoint 2010 over the earlier 2007 version was the inclusion of much-improved tools for social networking. Blogs and wikis are stronger. You can tag and rate pages and resources. There are activity streams, better personal sites and knowledge mining. The updated user interface, with an Office-like ribbon, makes it easier to use and helps people realize SharePoint is an end-user tool, not just a techie tool.

Questionmark is seeing a lot more interest in integrating assessments in SharePoint now that the 2010 version is out. Companies are looking at expanding their existing SharePoint installations to add a social dimension to learning and training, and universities are also looking at deploying SharePoint for learning, sometimes in place of traditional VLEs. Office 365, Microsoft’s cloud offering, includes a twin sister of SharePoint 2010 (“SharePoint Online”), and I know this will help bring more people to the SharePoint party.

We’ve started our own blog on SharePoint and assessments; for more on this see http://blog.sharepointlearn.com.

According to Microsoft, every day for the last 5 years, 20,000 workers have joined the ranks of SharePoint users. If you’re one of these or are thinking of becoming one, then to celebrate SharePoint’s birthday, here is a fun quiz on SharePoint – how much do you know?

How the University of Bradford sustains a high volume of e-assessments

Posted by John Kleeman

The University of Bradford currently delivers around 60,000 assessments per academic year with Questionmark Perception, four times as many as four years ago, as you can see in the graph below.

I spoke to John Dermo from the Centre for Educational Development at the University about advice he would give others who are seeking to expand their use of e-assessment and sustain it over time. John explained that a common pattern of use at Bradford is that an instructor comes to the Centre interested in delivering a summative exam online, mainly to save time by having the computer do the scoring. In many cases they then start seeing the benefits of formative assessment, first by using practice tests to prepare for the exam, and then by adding topic and question feedback to turn practice tests into formative tests that help students learn. So at Bradford, the initial drive for e-assessment is often to reduce grading effort, but once people have experienced the benefits, they often want to do formative assessments too.

Graph showing growth of use of Questionmark for e-assessment at University of Bradford

Here are ten tips from John on making your e-assessment policy sustainable:

1. Building confidence is vital. Confidence comes from training and support, which needs to be sustainable and flexible. In the early stages, the Centre made sure that technical support was visible in the room during every exam in case of problems. Now that e-assessment is more established, they have someone provide support at the beginning and end only. Invest in lots of support to begin with.

2. Give students practice tests ahead of summative exams. Students will feel more comfortable and make fewer mistakes in exams if they have a chance to practice in the format beforehand. Bradford also provides training videos to help students become familiar with the software.

3. Engage with course teams rather than just individuals. Often it’s an individual innovator who does the first set of e-assessments, but aim to work with their team, not just them, as that is more sustainable in the long term. The dip you can see in the graph above in 2009-10 was due to a couple of heavy pioneering users leaving the University or going on sabbatical. Individuals often lead the way, but engage with course teams to make the use of e-assessment sustainable.

4. Be flexible on environment. Bradford has a 100-seat specialist e-assessment suite, but online assessment also takes place elsewhere (e.g. overflow rooms for large modules, other locations for students with accessibility needs, and remote delivery for distance and overseas students), so make your procedures flexible.

5. Distinguish summative and formative assessments. For summative assessments, schedule and manage them centrally to ensure the right level of support and security. But for formative assessments, empower course teams to make them happen as independently and easily as possible from within the University VLE (at Bradford, they’ve found the Blackboard Connector to Questionmark Perception makes this very easy).

6. Define a formal policy for e-assessment and have it approved at institutional level. This will ensure that all stakeholders’ concerns are addressed (e.g. accreditation, IT, student records, disability office). Bradford’s current policy is here. Remember that your policy needs to be reviewed at regular intervals.

7. Include in the policy quality assurance procedures for summative assessments. Unlike in a paper exam, e-assessment lets you change a question at the last minute, but to retain quality, you need to ensure that formal policies on review are followed.

8. Define roles and responsibilities clearly. Moving from paper to electronic assessment can introduce ambiguities as to who is responsible for what, so have the conversations needed to resolve these. You want to make sure that no important steps are allowed to “fall through the gaps,” with everybody assuming that someone else is responsible.

9. Involve peer institutions as a sounding board. For instance, Bradford has learned from Dundee University’s experiences with CAA policies and from talking to many other institutions about the use of Questionmark Perception. There is a healthy Questionmark Perception user community within the HE sector, which communicates via the Questionmark Users Conference, other e-learning and e-assessment conferences, and various online communities.

10. Success means empowering course teams to create reliable, valid and practical e-assessments. Support, help and train them to do so, working with the departments to meet their teaching and learning needs, and to deliver the institution’s teaching and learning strategy.

For more information on how the University of Bradford uses Questionmark Perception, see the University’s internal best practice document.

Using the Question Status Report in Questionmark Analytics

This week’s “how to” article highlights the “Question Status Report,” one of the Questionmark Analytics reports now available in Questionmark OnDemand.

  • What it does: This item bank report tells you how many questions you have in your repository by question status:
      • Normal – The question can be included in assessments
      • Retired – The question is retired and cannot be included in assessments
      • Incomplete – The question is still being developed and cannot be included in assessments
      • Experimental – The question can be included in assessments but is available in experimental form only
      • Beta – The question is treated in the same way as a normal question and can be included in assessments
  • Who should use it: This report gives testing, assessment, learning and education professionals a quick view of the current status of questions in their item banks.
  • How it looks: This report lists question status possibilities along the left-hand side. Horizontal bars indicate the number of questions with each status. These bars can be color-coded by topic as well as question status. The report can be viewed in a web browser or downloaded and distributed as a PDF. The CSV version of this report lists question status in the first column and the number of questions per topic in the remaining columns. The question detail CSV provides information such as each question’s Perception question ID, wording, description, status, topic and question type. You can view the information for your entire repository or just for specific topics.
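As a rough illustration of what the analysis-friendly output enables, here is a minimal Python sketch that tallies questions by status from the question detail CSV. The “Status” column name is an assumption for illustration, not a confirmed export format:

    import csv
    from collections import Counter

    STATUSES = ["Normal", "Retired", "Incomplete", "Experimental", "Beta"]

    def count_by_status(path):
        """Tally questions per status from a question detail CSV export.
        The "Status" column name is assumed, not confirmed."""
        counts = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                counts[row["Status"]] += 1
        for status in STATUSES:
            print(f"{status:>12}: {counts.get(status, 0)}")

Since only Normal, Experimental and Beta questions can be included in assessments, a tally like this shows at a glance how much of the item bank is actually deliverable.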

How SAP creates educational and assessment content

Posted by John Kleeman

As I’ve mentioned in previous posts, Questionmark are partners of SAP, so I blog on the SAP Community Site as well as here on Questionmark’s blog.

Dr. Thilo Buchholz

In my SAP blog I’m featuring Q&A interviews with thought leaders in the field of assessment and learning within the SAP ecosystem. I spoke recently with Dr. Thilo Buchholz, who leads a team that handles production platform management and operations for SAP Education. Our interview, which covered many aspects of the systems used to create all of SAP’s educational content, was featured in the SAP Business Process Expert Community Newsletter.

Among the various subjects we covered was SAP’s move toward mobile. Here’s just a bit of that interchange.

John: As more SAP software runs on mobiles, do you see delivering learning content on mobiles becoming more important?

Thilo: Definitely yes. SAP has a strategy of delivering business applications On Premise, On Demand and On Device. And as we deliver our software in these categories, our knowledge transfer, the actual performance support, needs to be delivered via the same distribution channels. So we will be publishing educational content On Device if the software is delivered On Device. Actually our single source approach provides us with the flexibility to deliver educational content wherever it is needed.

Questionmark Perception could serve in this environment by flexibly supporting assessment scenarios. One example is to deliver a pre-assessment prior to a learning object; if the learner can answer the questions, then he or she can skip the learning object. Second, during the learning process the learner can use self-assessments to check progress. And after the learner has completed the learning object, it’s good to have some check of whether he or she achieved the learning objective. So overall, an assessment scenario enriches e-learning content and helps learners steer themselves well without a face-to-face trainer. Remember that in a classroom environment the instructor frequently checks on the learning progress of participants and adjusts the approach based on the result of this informal “assessment”.
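To make that flow concrete, here is a minimal Python sketch of the pre-/self-/post-assessment logic described above. The names and structure are hypothetical illustrations, not a Questionmark Perception API:

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class LearningObject:
        # Each check returns True if the learner meets the objective.
        # These fields are hypothetical, for illustration only.
        pre_test: Callable[[], bool]
        self_test: Callable[[], bool]
        post_test: Callable[[], bool]
        study: Callable[[], None]

    def run(obj: LearningObject, max_attempts: int = 3) -> str:
        # Pre-assessment: a learner who already knows the material skips it.
        if obj.pre_test():
            return "skipped"
        for _ in range(max_attempts):
            obj.study()
            # Self-assessment lets the learner check progress mid-course.
            if obj.self_test():
                break
        # Post-assessment verifies the learning objective was achieved.
        return "passed" if obj.post_test() else "needs review"

The pre-test gate is what lets a knowledgeable learner skip a learning object, and the self-test loop plays the role of the classroom instructor’s informal progress checks.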

To see the rest of the interview, including questions on how SAP sees technology influencing and changing the future production of training materials, click here.

Ever attended the Performance Testing Council Summit?

Posted by Jim Farrell

One of the things I enjoy most about working for Questionmark is attending conferences run by elearning and testing associations. I just returned from the Performance Testing Council Summit in Chevy Chase, Maryland, and I must say it was one of the most interesting meetings I have attended.

As an instructional designer at previous companies, I was not aware of the Performance Testing Council. We often struggled to create rigorous yet fair performance tests and felt we were on an island. Little did I know there was a group whose sole purpose is to remove barriers in developing performance tests.

The summit is exactly what you would want from a two-day event: lots of presentations by people and organizations using performance testing, and time to socialize with thought leaders in the area. The discussions were fascinating and inspiring, bringing together test developers, program managers, psychometricians and testing providers to talk about the finer points of creating effective performance tests.

While I was scribbling notes, one member said, “We must keep in mind that we are trying to stay faithful to what is being done on the job.” That immediately made me think about our Observational Assessment solution in the latest version of Questionmark Perception OnDemand. Observational Assessments (sometimes called “Workplace Assessments”) offer a way to assess participants on their everyday tasks and rate knowledge or abilities that would not normally be reflected in answers to a standard assessment. Testing someone actually doing an everyday task in the workplace is certainly one way to stay faithful to what is being done on the job!

Something else that struck me during the summit was that there really is no single industry that has mastered performance testing. There were people representing K-12, the military, certification bodies, and large corporations, all with lots of experience to share.

If you’re interested in performance testing I’d highly recommend you visit the council’s Web site and check out the resources there.
