Focused Feedback: Discovering the “Lake Wobegon” effect

Photo credit: Pixabay, Tumisu, CC0

In this post, Dr Jenny Scoles, an Academic Developer at the Institute for Academic Development, shares some lessons learned from failing to implement a really simple exercise aimed at improving student and staff feedback dialogue…

Given the evidence of widespread student dissatisfaction with feedback, and the education research suggesting that tutors’ feedback is often not understood or seen as useful, a few years ago two colleagues (Professor Mark Huxham and Dr Jan McArthur) and I set out to develop an intervention that would allow students to identify their own feedback needs and instil a sense of control in the feedback process.

Our initiative involved a simple, easy to implement, and (we thought) fool-proof exercise that we hoped would encourage dialogue between students and lecturers: Students were given the opportunity to request specific topics on which they would like to receive focused feedback when submitting their coursework assignments. Other educational researchers have recommended such approaches. For example, Hyland (2001, p. 245) suggests ‘providing students with a cover sheet with their first assignment in which they could write some of their feedback requirements’. Bloxham and Campbell (2010) studied the use of interactive cover sheets as an initiative to generate dialogue in assessment feedback. Their small-scale study found limited student uptake, and suggested improvements for further research: ‘further activity and evaluation is required to test the findings with larger year groups, in other discipline areas and in higher levels of study than Year 1 undergraduates’ (p. 299).

Focused feedback initiative

We decided to build on Bloxham and Campbell’s suggestion and implement a larger-scale study across more year groups, exploring the effects of a slight modification to an assignment cover sheet. Thirteen module leaders in a university science department participated in the project, and their modules spanned a range of undergraduate and masters-level courses. In total, 710 students were enrolled on these modules. At the beginning of the trimester, the module leaders were simply asked to add the following sentence to the bottom of the coursework assessment brief before distributing it to students:

Would you like feedback on any specific aspects of this piece of work? If so, please indicate what you want feedback on at the end of your script.

At the respective deadlines, lecturers collected the students’ completed coursework and sent us any scripts in which focused feedback had been requested. We used both quantitative and qualitative methods (a mixed-methods approach) to answer the following research questions:

  • What proportion of students chose to request focused feedback on their essays?
  • If students chose to request feedback, what kinds of topics did they choose to highlight?
  • Did the focused feedback option require more work for tutors marking a student essay (as measured by words written in response)?
  • Did students who requested focused feedback achieve significantly higher grades than those who did not?

Failure!

The quantitative analyses quickly showed that most students did not participate in the initiative: only 6% chose to request focused feedback (that is, only 42 students out of 710). And of those who did ask, most requested help on relatively superficial topics, such as referencing and citations, rather than on ‘deeper’ issues like logical argument or comprehension. There was no evidence that providing focused feedback saved tutor time: staff responded to focused feedback requests by providing more feedback, not less. And there was no difference in the mean mark awarded to students who did and did not request focused feedback.

We were genuinely surprised that more students had not taken up this simple opportunity to get more personal and specific feedback.

The qualitative responses helped us find out why this initiative had failed…

Lack of trust

Our findings suggested that the students were so concerned with the final summative mark that they avoided engaging with focused feedback in case it influenced their mark negatively. There was a fundamental lack of trust in the initiative’s intentions; as one student noted, “what if it’s wrong and I would draw their [the tutor’s] attention?” Students were strategically keeping quiet about their perceived weaknesses in case they lost marks by highlighting areas of work that the tutor might otherwise have glossed over.

Lake Wobegon effect

Some students appeared complacent about their need for feedback since they ‘did alright’. There was a perception that feedback was instead for students who were struggling:

I’ve always believed that the really personal more specific feedback is for the people that really went wrong.

Students with this view often achieved only average marks: there was a sense of inflated self-confidence even though there was considerable room to improve their mark. We termed this the ‘Lake Wobegon’ effect. The term was coined by Justin Kruger (1999) and taken from Keillor’s (1985) book Lake Wobegon Days, in which all the children believe that they are above average. Kruger found that it is quite common for people to see themselves as above average in their skills and abilities when comparing themselves with their peers. One possibility was that tutors were inadvertently reinforcing the Lake Wobegon effect by (understandably) focusing their efforts on students who were struggling. Indeed, our quantitative findings showed that tutors tended to give less feedback to students who performed better in the coursework.

This research project was a fascinating example of how different the world can look to two groups of people involved in the same process. Whilst we and the staff considered assessment and feedback an opportunity to learn, the students approached assessment as a stressful game that they needed to play in order to ‘win’ the highest mark possible. In our failure, we had discovered something important: to make focused feedback work, we needed to look closely at these hidden rules of the ‘assessment game’, and recognise the different interests of the players involved.

This blog post is adapted from a book chapter: Scoles, J., Huxham, M., & McArthur, J. (2014). Mixed-Methods Research in Education: Exploring Students’ Response to a Focused Feedback Initiative. SAGE Research Methods Cases, Sage. 

Bloxham, S., & Campbell, L. (2010). Generating dialogue in assessment feedback: Exploring the use of interactive cover sheets. Assessment & Evaluation in Higher Education, vol. 35, pp. 291–300.

Hyland, F. (2001). Providing effective support: Investigating feedback to distance language learners. Open Learning, vol. 16, pp. 233–247.

Keillor, G. (1985). Lake Wobegon Days. New York, NY: The Viking Press.

Kruger, J. (1999). Lake Wobegon be gone! The ‘below-average effect’ and the egocentric nature of comparative ability judgments. Journal of Personality and Social Psychology, vol. 77, pp. 221–232.

Jenny Scoles

Dr Jenny Scoles is the editor of Teaching Matters. She is an Academic Developer (Learning and Teaching Enhancement) in the Institute for Academic Development, and provides pedagogical support for University course and programme design. Her interests include student engagement, professional learning and sociomaterial methodologies.

6 comments

  1. Fascinating.
    If fearful of negative impact on marks, I wondered whether you might expect one or two to ask for feedback on, and draw attention to, a perceived strongpoint.
    But perhaps students may look on this also as a bit like asking Qs in lectures: mostly for other people. Habitual question-askers may be seen as attention-seeking.
    I’m not convinced by the Wobegon argument. Is it really considering yourself above average to want to slip by in the crowd without too much striving and fuss? “If I’m passing comfortably, I’m OK with my performance”. Not necessary to insert misperception of superiority in the interpretation of that.
    Perhaps Academics suffer from a need to feel above average, and therefore expect their students to be striving all the time as they did/do? And some do.

    • Hi Neil – I think your point about ‘slipping into the crowd’ is a really good one. If a student wants to get by, is that such a bad thing? Maybe a pre “What do you think you would get for this piece of work” grading would be useful in disentangling that? I’m never sure if people would be honest in that kind of work, but I’ve often thought I’d like to try it!

      Excellent post, Jenny

      • Thanks Jill and Neil for your comments and thoughts! Great to get some written engagement on a blog post. I agree, the Lake Wobegon effect may be projecting too far onto students’ motivations, but it was quite striking in the results to see this sense that extra ‘help’ was for those who were considered to ‘really need it’, rather than for wanting to improve past the 60% line – it was less about superiority and more about a lack of motivation to push themselves further that struck us. But as someone’s tweet pointed out, these reflections are coming from a researcher (me) who always wanted to push themselves that bit further in their own geek-self-competition 😉 Jenny

  2. Very interesting. It’s quite consistent with Dweck’s work on fixed and growth mindset where help seeking and wanting to appear capable are typical of fixed mindset behaviour. It would be interesting to do some work on fixed/growth mindset with students and then repeat the exercise to see if they are more willing to engage with the offer.
    There’s some information here about the differences.
    http://www.ahsdistance.org/OlympCoachMag_Win%2009_Vol%2021_Mindset_Carol%20Dweck.pdf
