How the University of Bradford sustains a high volume of e-assessments

Posted by John Kleeman

The University of Bradford currently delivers around 60,000 assessments per academic year with Questionmark Perception, four times as many as four years ago, as you can see in the graph below.

I spoke to John Dermo from the Centre for Educational Development at the University about the advice he would give to others seeking to expand their use of e-assessment and sustain it over time. John explained that a common pattern of use at Bradford is that an instructor comes to the Centre interested in delivering a summative exam online, mainly to save time by having the computer do the scoring. In many cases they then start to see the benefits of formative assessment: first by using practice tests to prepare students for the exam, and then by adding topic-level and question-specific feedback that turns practice tests into formative tests that help students learn. So at Bradford, the initial driver for e-assessment is often to reduce grading effort, but once people have experienced the benefits, they often want to run formative assessments too.

[Graph showing the growth of Questionmark e-assessment use at the University of Bradford]

Here are ten tips from John on making your e-assessment policy sustainable:

1. Building confidence is vital. Confidence comes from training and support, which need to be sustainable and flexible. In the early stages, the Centre made sure technical support was visible in the room throughout every exam in case of problems. Now that e-assessment is more established, someone provides support at the beginning and end only. Invest in plenty of support to begin with.

2. Give students practice tests ahead of summative exams. Students will feel more comfortable and make fewer mistakes in exams if they have a chance to practice in the format beforehand. Bradford also provides training videos to help students become familiar with the software.

3. Engage with course teams rather than just individuals. Often it’s an individual innovator who runs the first set of e-assessments, but aim to work with their whole team, not just them, as that is more sustainable in the long term. The dip you can see in the graph above in 2009-10 was due to a couple of heavy pioneering users leaving the University or going on sabbatical. Individuals often lead the way, but engaging with course teams makes the use of e-assessment sustainable.

4. Be flexible on environment. Bradford has a 100-seat specialist e-assessment suite, but online assessment also takes place elsewhere (e.g. overflow rooms for large modules, other locations for students with accessibility needs, also for distance and overseas students), so make your procedures flexible.

5. Distinguish summative and formative assessments. For summative assessments, schedule and manage them centrally to ensure the right level of support and security. But for formative assessments, empower course teams to make them happen as independently and easily as possible from within the University VLE (at Bradford, they’ve found the Blackboard Connector to Questionmark Perception makes this very easy).

6. Define a formal policy for e-assessment and have it approved at institutional level. This will ensure that all stakeholders’ concerns are addressed (e.g. accreditation, IT, student records, disability office). Bradford’s current policy is here. Remember that your policy needs to be reviewed at regular intervals.

7. Include in the policy quality assurance procedures for summative assessments. Unlike in a paper exam, e-assessment lets you change a question at the last minute, but to retain quality, you need to ensure that formal policies on review are followed.

8. Define roles and responsibilities clearly. Moving from paper to electronic assessment can introduce ambiguities as to who is responsible for what, so have the conversations needed to resolve these. You want to make sure that no important step “falls through the gaps” because everybody assumes that someone else is responsible.

9. Involve peer institutions as a sounding board. For instance, Bradford has learned from Dundee University’s experiences with CAA policies and from talking to many other institutions about the use of Questionmark Perception. There is a healthy Questionmark Perception user community within the HE sector, which communicates via the Questionmark Users Conference, other e-learning and e-assessment conferences and via various online communities.

10. Success means empowering course teams to create reliable, valid and practical e-assessments. Support, help and train them to do so, working with the departments to meet their teaching and learning needs, and to deliver the institution’s teaching and learning strategy.

For more information on how the University of Bradford uses Questionmark Perception, see the University’s internal best practice document.

The CAA Conference, 10 years on

Posted by Steve Lay

I had great fun at the recent CAA (Computer Assisted Assessment) Conference, hosted by the University of Southampton, UK. I’d like to thank the team there for taking the lead in organizing the event and opening a new chapter in its history. This conference builds on the success of the 12 previous CAA conferences hosted at Loughborough University. Although I didn’t go to the first event in 1997, I’ve attended on and off for the past 10 years.

I was given the job of summing up the conference and providing closing remarks. With just two days to read over 30 academic papers, I found myself searching for a tool to help me summarize the information quickly. After a little text processing with a Python script and the excellent TagCrowd tool, I came up with a tag cloud based on the top 50 terms from the abstracts of the papers presented at the conference:

[Tag cloud of the top 50 terms from this year’s conference abstracts]
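For anyone who wants to try this themselves, here is a minimal sketch of the kind of preprocessing involved. It is not the exact script I used: the filename abstracts.txt, the stop-word list and the tokenization rule are all illustrative assumptions.

import re
from collections import Counter

# Words too common to be informative in a tag cloud
# (an illustrative list, not the one actually used)
STOP_WORDS = {"the", "and", "of", "to", "a", "in", "is", "for", "that",
              "this", "with", "are", "on", "be", "as", "an", "by", "it"}

# Assumes the conference abstracts have been concatenated into abstracts.txt
with open("abstracts.txt", encoding="utf-8") as f:
    text = f.read().lower()

# Split on anything that isn't a letter, then drop stop words and short tokens
words = [w for w in re.split(r"[^a-z]+", text)
         if len(w) > 2 and w not in STOP_WORDS]

# Print the 50 most frequent terms with their counts
for word, count in Counter(words).most_common(50):
    print("%5d  %s" % (count, word))

TagCrowd does its own frequency weighting when you paste text in; a quick script like this is simply a way to check in advance which terms will dominate the cloud.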

Assessment obviously remains a key focus of this community, but I was struck by the very technical language used: system, tools, design, computer and so on. It was also interesting to see which words were missing. Traditionally I would have expected words like reliability and validity to feature strongly. Although summative assessment makes an appearance, formative assessment does not feature strongly enough to appear in the cloud. Clearly students, learners and the individual are important, but where is adaptivity or personalization?

It is interesting to compare this picture with a similar one taken from the abstracts of the papers in 2000, ten years ago.

[Tag cloud of the top 50 terms from the 2000 conference abstracts]

An important part of our mission at Questionmark is learning from communities like this one and using that knowledge and best practice to develop our software and services. During the conference I saw presentations ranging from ideas we can apply right now to fascinating areas of research that point the way to future possibilities.

The conference was a great success, and planning for next year (5th-6th July 2011) has already started. Check out the CAA Web site for the latest information.

Podcast: An Innovative Approach to Delivering Questionmark Assessments


Posted By Sarah Elkins

The University of Bradford has recently developed an innovative e-assessment facility, using cutting-edge thin client technology to provide a 100-seat room dedicated primarily to summative assessment. The room provides enhanced security features for online assessment and was used for the first time in 2009, with considerable success. The room’s flexible design maximises its usage by allowing for formative testing, diagnostic testing and general teaching.

John Dermo is the e-Assessment Advisor at the University of Bradford. In this podcast he explains the technology behind this unique setup and talks about the benefits and challenges of using this room. He will also be presenting a session at the 2009 European Users Conference, where he will go into more detail about the project.

Podcast: David Lewis on Large Scale Online Assessments at Glamorgan University


Posted By Sarah Elkins

David Lewis of Glamorgan University has extensive experience with Questionmark Perception. I spoke with him recently about the large-scale implementation he has been working on at Glamorgan, where Questionmark is used for formative assessment, summative assessment and module evaluation. David also spoke about the training programs developed within the University and the collaboration with other higher education institutions in Wales, and offered some great advice for anyone working with online assessments.

Defining Assessment Terms: Tools for Getting the Right Results

Posted by Julie Chazyn

In creating good, solid surveys, quizzes, tests and exams, it’s essential to understand what type of assessment will give you appropriate and actionable results. We believe the ultimate objective of the assessment directly influences how it should be structured. This requires understanding the subtle distinctions that can mean big differences in the quality and outcomes of your assessments. The language we use in talking about assessments needs to reflect those distinctions.

With that in mind, Questionmark CEO Eric Shepherd recently took some time to update Questionmark’s UK and US glossaries to help people understand different types of assessments.

Some of the terms that have been altered include:

Diagnostic assessment
Personality assessment
Pretest
Psychological assessment
Summative assessment

We hope you will bookmark the glossary and refer back to it often!