Spotlight on Alternative Assessment Methods: ‘A little bit digitalised?’ Developing practice with online exams


In this ‘Spotlight on Alternative Assessment Methods’ post, Stuart Allan, Director of Online Learning at Edinburgh Business School (Heriot-Watt University), reflects on insights from a study he conducted on the experiences of staff who design and deliver online exams…

Like them or not, examinations are a fact of life for many of us. Even in an era of online assessment, the apparent security and scalability of a traditional exam still seems to be an appealing prospect in many contexts, particularly when cohorts are larger and academic resources are stretched.

‘Online exam’ is a broad term, and practice usually exists somewhere on a spectrum from the relatively traditional (e.g. screen-rendered or ‘take-home’ versions of pen-and-paper exams) to a complete re-imagining of exam tasks (e.g. by providing access to digital resources or simulations, facilitating video responses, or setting collaborative activities) (see Cramp et al. 2019, Myyry & Joutsenvirta 2015, Williams & Wong 2009). Some universities have been moving steadily towards online exams for several years, using systems such as Pearson VUE or BTL Surpass via exam centres or remote proctoring. However, the shift appears to be gathering pace: when I presented a webinar on online exams last week, many attendees (mainly from universities in Australia and New Zealand) said they were considering online exams for the first time as a direct result of the COVID-19 lockdown.

As universities come to terms with these immediate demands, it is understandable that longer-term implications aren’t always at the forefront of our minds. However, once introduced, online exams could reconfigure some practices for the long run: students who were perhaps already unhappy with writing exam answers by hand may resist any return to ‘business as usual’; and operationalising online exams may reveal new opportunities, as well as new risks to assessment security such as contract cheating. Therefore, it’s worth taking a breath and considering the assumptions and motivations that circulate around online exams.

A couple of years ago I did some research that analysed the experiences of university staff who designed and delivered online exams (Allan 2020). I was particularly interested in the influence and dimensions of two main discourses. The first was migration, whereby exams are expected to transition from pen-and-paper to computer-based delivery largely unaltered, resulting in improvements in efficiency and the student experience. I saw this discourse as being underpinned by instrumentalism, whereby technologies are positioned as ‘neutral means employed for ends determined independently by their users’ (Hamilton & Friesen 2013, p. 3). The second was transformation, whereby rapid technological change demands radical change in examination practices. I saw this discourse as drawing on the essentialist assumptions critiqued by Bayne (2015), whereby ‘“learning” can be transformed by the immanent pedagogical value of certain technologies simply by allowing itself to be open to them’ (p. 9).

I interviewed eight staff across higher-education institutions of varying sizes in the UK, Norway, Ireland and the Netherlands. I asked them to tell me about their experiences with online exams, any unexpected challenges they had faced, and their aspirations for future practice in this area.

The participants illustrated the ways in which online-exam technologies were deeply entangled with their assessment practice. They described their students’ anxieties around using unfamiliar technologies and the expectation that exams would be migrated in traditional formats rather than being re-imagined online (‘a little bit digitalised’, as one participant described it). They also outlined important challenges that only emerged once they began operationalising online exams. One Dutch participant said that the incompatibility of the various technologies in use had thwarted academics’ hopes of collaborating on large question banks across multiple institutions. Meanwhile, a Norwegian interviewee said that he wished he could redirect the unexpectedly large amounts of time and money being poured into moving traditional exams online towards truly innovative assessment practice.

Many people associate online exams with multiple-choice questions, but there are opportunities to make them more dynamic and authentic. For example, pre-releasing materials (e.g. a case study) that students critique under exam conditions tests critical thinking while increasing accessibility for candidates who need more time for reading or whose first language is not English. Or asking students to design a resource or prepare a strategic plan in advance, then adjust to a last-minute ‘curve ball’ (a piece of additional information only revealed in the exam paper, e.g. a major change in legislation or even a pandemic), could test candidates’ resilience and ability to think on their feet. In both cases, holistic assessment design that takes into account the opportunities and constraints of digital technologies and strong alignment with learning outcomes are key success factors.

As we all grapple with the challenges of lockdown, universities are assessing in ways that seemed unthinkable just a few months ago – for example, relaxing regulations around closed-book exams, and extending time frames to 24 or 48 hours. There are golden opportunities here to reflect, gather evidence and share knowledge that will move practice with online exams forward considerably, if we can find the time and energy to grasp them. But it’s worth reminding ourselves that the day-to-day realities are more complex than the migration or transformation discourses suggest. In practice, online exams are more than ‘a little bit digitalised’.

References

Allan, S. (2020). Migration and transformation: a sociomaterial analysis of practitioners’ experiences with online exams. Research in Learning Technology, 28. 

Bayne, S. (2015). What’s the matter with ‘technology-enhanced learning’? Learning, Media and Technology, 40, 5–20.

Cramp, J., Medlin, J. F., Lake, P. & Sharp, C. (2019). Lessons learned from implementing remotely invigilated online exams. Journal of University Teaching and Learning Practice, 16.

Hamilton, E. & Friesen, N. (2013). Online education: a science and technology studies perspective. Canadian Journal of Learning and Technology / La Revue Canadienne de l’Apprentissage et de la Technologie, 39.

Myyry, L. & Joutsenvirta, T. (2015). Open-book, open-web online examinations: developing examination practices to support university students’ learning and self-efficacy. Active Learning in Higher Education, 16, 119–132.

Williams, J. B. & Wong, A. (2009). The efficacy of final exams: a comparative study of closed-book, invigilated exams and open-book, open-web exams. British Journal of Educational Technology, 40, 227–236.

Stuart Allan
Stuart Allan is Director of Online Learning at Edinburgh Business School (Heriot-Watt University) and a PhD student at the University of Edinburgh’s Centre for Research in Digital Education. The outcome of his undergraduate degree at Edinburgh was decided by nine pen-and-paper exams in the space of two weeks – and they still haunt his dreams over twenty years later.
