This extra post, by Professor Susan Rhind at the R(D)SVS, is a timely reflection on the value of mid-course feedback, which, for most UG course organisers, would be appropriate to gather around now. This is a light-touch and informal tool to facilitate prompt dialogue with students about any issues they may be experiencing with the course. The following post summarises some of the findings from an evaluation of mid-course feedback carried out last academic year…
Following initial roll-out in 2016-17 for courses at honours level, all undergraduate courses for 2017-18 were required to operate a system to collect and respond to mid-course feedback (MCF) from students. Colleagues were guided in this activity by examples provided through the IAD website and face-to-face events.
From the outset, the rationale for the activity was clearly stated: ‘to promote a sense of dialogue between staff and students from the earliest stages, by providing opportunities for staff to gather (and respond to) mid-course feedback from all students’. Whilst such mechanisms already existed in some courses, this was an opportunity to ensure all students could contribute, and to allow staff to explain why courses are structured in certain ways, or, indeed, why changes have evolved in response to previous cohorts’ input.
In order to evaluate colleagues’ experiences with MCF, a survey was developed for course organisers and went live at the end of March 2018. Due to the industrial action, no follow-up reminders or ‘chasing’ were carried out. There were 349 responses (an 18% response rate).
Across the University, 85% of respondents had used mid-course feedback in their courses in 2017-18 with the methods used shown in Figure 1:
Colleagues were also asked how they rated mid-course feedback as an intervention to enhance student communication and engagement. The results are shown in the chart below:
Overall, approximately 70% of colleagues felt the intervention was extremely or quite useful. However, this figure was lower in CMVM, at 45%.
The survey explored respondents’ experiences of the types of issues students raised that could be actioned or discussed. Issues raised by more than one respondent that could be actioned fell into the following categories:
- The teaching – delivery
- Facilities/ logistics
- Course structure or content
- Signposting and/or clarification.
The main reasons colleagues gave for choosing not to implement MCF were captured in the following themes (in order of frequency):
- Lack of awareness
- Industrial action
- Don’t agree with it
- Course too short to implement
- Use an alternative approach.
Reflection and Next Steps
The evaluation showed that implementation of MCF was patchy, but a majority of those who implemented it found it useful. There are nevertheless some key issues which were highlighted, some of which also surfaced in informal networking conversations throughout the year. These include:
- The more ‘light touch’, informal and conversational the better.
- Good evidence that MCF can highlight issues that are solvable and would otherwise not be picked up.
- Potential for feedback ‘fatigue’ among students and the associated need to manage expectations.
- Potential confusion with the role of the SSLC.
- The message about MCF being a requirement for all courses has not reached all course organisers.
- CMVM seems less engaged and convinced about the utility of this exercise than CSE and CAHSS.
Moving forward, we plan to run the evaluation again in semester 2 to add to the evidence base around the intervention. In the meantime, please feel free to contact me with any additional feedback or observations.