In this Spotlight on Alternative Assessment Methods post, Professor Tim Drysdale, Chair of Technology Enhanced Science Education in the School of Engineering, offers an open source solution to managing student marking in the remote era. Tim reflects on the historical record-keeping power of paper, and shows how it can be mimicked in PDF form…
There’s a trusted certainty to the permanence of paper. Sure, sometimes we worry about adding up the score wrong because it makes a permanent record for all to see that we are, in fact, humans, and not machines. But what if we could accidentally erase students’ work just by trying to mark it? What if we could cause the marks or comments from a colleague to disappear?
That’s the situation many of us are potentially facing this May. We’ve taken a bunch of paper exams and shoved the marking process online, assuming the lovely PDF output from Adobe Scan will see us right at marking time. Except there are some serious gotchas. If a student sends in a PDF scan that they edited with the annotation tools, then the marker can erase the student input by accident. I know, because I’ve done it to my own work, over and over, trying to revise my exam solutions (the originals were in the office and we were in lockdown). This mutability of PDF has already caused high-profile headaches and headlines.
We have lost the certainty and permanence of a paper trail.
That’s ok so long as we don’t make any ~~mitsakes~~ mistakes. Some of our larger Schools are going to handle nearly half a million individual page transactions. That’s a lot of opportunity: opportunity to automate and innovate with process to reduce that risk, as well as to add some convenience, and perhaps even pleasure, to working in this new medium – whilst keeping the same traditional exam paper format. I know from previous experience that having an automatic system add up your marks for you improves your outlook on exam time. There is also a freedom in running a strict, computerised sanity check on your results and coming up clean: no uneasy feeling when you hand in your marking.
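The sanity check is conceptually simple: the subtotals written on each page must add up to the total recorded for each question. A minimal sketch of that check, assuming a hypothetical record layout (question name mapped to its page subtotals, plus the total the marker wrote down), might look like this:

```python
# Hypothetical sketch of a "computerised sanity check" on marking records.
# The data shapes here are illustrative, not the gradex project's actual format.

def check_script(subtotals: dict[str, list[float]],
                 stated_totals: dict[str, float]) -> list[str]:
    """Return human-readable problems; an empty list means the script is clean."""
    problems = []
    for question, parts in subtotals.items():
        computed = sum(parts)
        stated = stated_totals.get(question)
        if stated is None:
            problems.append(f"{question}: no total recorded")
        elif abs(computed - stated) > 1e-9:
            problems.append(
                f"{question}: subtotals sum to {computed}, "
                f"but {stated} was recorded")
    return problems
```

Running `check_script({"Q1": [2, 3, 1]}, {"Q1": 7})` flags the addition error, while a correct script returns an empty list – the clean result you want before handing in your marking.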
The problem is, when the world changed in March, it trimmed out a lot of our options for off-the-shelf software solutions – including the one I mentioned above. Many of them need special paper or other arrangements that we just can’t make in the time frame, or under lockdown conditions. There is one particular option with a really attractive feature set, but it is off the table due to sensitivities surrounding the vendor’s other product. Those sensitivities are not trivial, and are well explained by Ross, Bayne, and Lamb (2019). It’s a position that initially seemed odd to me, given that we still use that other product and have urgent operational needs. But, on reflection, situations with tension between short-term need and long-term ethics are precisely the reason we have a saying about the choice of paving substance on a certain transport route.
What’s left to consider? Open-source projects tend to focus on use cases that don’t apply in the pandemic era or don’t suit our needs (e.g., surveys on special paper). There was a PhD project to make a web-based tool at UC San Diego that shared features in common with Gradescope. It lasted only a few months “because [their] university purchased the enterprise version of Gradescope” (Sahoo et al., 2019). The technology choices were ok for a quick class-scale demo, but not for building something that would support a School, let alone a College or Campus. So that’s something we can consider doing properly, ourselves, starting soon, so that it is ready in time to support our longer-term adaptations (please do message me so I can gauge interest levels in this initiative: Timothy.Drysdale@ed.ac.uk).
Meanwhile, what do we do about the PDF/paper paradox? How about:
- Make PDF more like paper: All previous edits from earlier stages are permanently frozen.
- Make using a keyboard fast and easy, so every marker has a convenient method.
- Avoid scrolling. Put the front cover question totals on the same page as the question.
- Automatically extract the marks into a spreadsheet.
- Prepare the way for grouping questions, for ease of marking.
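To give a feel for the third point, once a PDF library has read the marked script’s form-field values, turning them into a spreadsheet is mostly a matter of parsing field names and writing CSV rows. The sketch below assumes a made-up field naming scheme (`q<N>-mark`) – not the project’s actual scheme – and uses only the Python standard library:

```python
# Illustrative sketch: turn per-script form-field values into a CSV of marks.
# The "q<N>-mark" field naming convention is an assumption for this example.

import csv
import io
import re

FIELD_PATTERN = re.compile(r"^q(\d+)-mark$")

def fields_to_csv(scripts: dict[str, dict[str, str]]) -> str:
    """scripts maps a student identifier to that script's raw form-field values."""
    # Collect every question number that appears in any script.
    questions = sorted(
        {int(m.group(1)) for fields in scripts.values()
         for name in fields if (m := FIELD_PATTERN.match(name))}
    )
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["student"] + [f"Q{q}" for q in questions] + ["total"])
    for student, fields in sorted(scripts.items()):
        # Missing fields count as zero; the row total is computed automatically.
        marks = [float(fields.get(f"q{q}-mark", 0)) for q in questions]
        writer.writerow([student] + marks + [sum(marks)])
    return out.getvalue()
```

Each row carries an automatically computed total, which is exactly the “add up your marks for you” convenience described earlier.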
These features are all being addressed in my new open-source project, which can be found at https://pdf.gradex.io. Integrating it into your Teaching Office will be straightforward: it’s just an executable that goes on Teaching Office staff computers, preparing and processing the PDFs at each of the points in the process you were going to be running anyway. It’ll make the process easier, more robust and, dare I say it, even enjoyable at times. Academics can mark using the keyboard or a pen, using any standards-compliant editor (e.g. the free Adobe Reader). If needed for accessibility, this process can work with printed and scanned pages too – although the full benefit comes if the final checking stage can be done electronically.
The following illustration of the process comes from a test; the forms side of the project is working already. Obviously, real usage is to process whole sets of scripts at once, but we show just a single page here for clarity. First, the student sends in their script, handwritten and scanned:
The Teaching Office then runs a process that bakes any annotations into the page image; either a human or a computer may optionally note which question each page belongs to. A keyboard-fillable form is automatically added to the right-hand side, with places to enter subtotals and question totals, to alert support staff to a bad page, or to confirm that you have checked the page but not awarded any marks. You can write over the student work without causing strange things to happen, just as you would have before, on paper. The erase tool only erases your work, not the students’.
After marking, the Teaching Office processes the form to extract the data and protect the marker’s annotations. Then a moderation sidebar is added to some papers.
In this made-up case, the moderator has spotted an addition error, so they’ve corrected it. Their edits are now baked in, and the Teaching Office checker bar is added. The checker keys in the scores, which is especially useful where a marker or moderator wrote freehand annotations. Automatic classification of handwritten digits and characters is an obvious feature to plug in later on.
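The checking stage amounts to reconciling two independent records of the same marks: what the marker’s form says, and what the checker keyed in. A minimal sketch of that reconciliation, with hypothetical names and data shapes rather than the tool’s real interface, could be:

```python
# Sketch of the checking stage: compare the marker's form totals against the
# checker's keyed-in totals and surface any disagreement for a human to resolve.
# Names and data shapes are illustrative only.

def reconcile(marker_totals: dict[str, float],
              checker_totals: dict[str, float]) -> dict[str, tuple]:
    """Return the questions where the two stages disagree, with both values."""
    disagreements = {}
    for question in marker_totals.keys() | checker_totals.keys():
        m = marker_totals.get(question)  # None if the marker missed it
        c = checker_totals.get(question)  # None if the checker missed it
        if m != c:
            disagreements[question] = (m, c)
    return disagreements
```

An empty result means the script passes checking; anything else goes back for a second look before the marks leave the Teaching Office.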
There are further steps from here that are not shown – for processes that may be unique to particular Schools, or for additional stages such as identifying the question or resolving issues. Since we are online, we can use expanded page dimensions; if we need to print, everything still fits on A3.
Call to action: If this process looks like something you might like to adopt, then please get in touch (firstname.lastname@example.org, or via Teams chat). You might just “save your time, but also your values.”
Ross, J., Bayne, S., and Lamb, J. (2019). Critical approaches to valuing digital education: learning with and from the Manifesto for Teaching Online. Digital Culture & Education, 11(1), 22-35.
Sahoo, D., Bardolph, M., Ghanbari, S., and Hargis, J. (2019). Open source web-based grading assistant program: Sahoo Easygrade. GLOKALde, 5(1), Article 2.