The Learning Scientists


GUEST POST: Is Your Feedback Carefully Used, or Barely Perused?

By: Robert A. Nash & Naomi E. Winstone

Dr. Robert Nash is Senior Lecturer in Psychology at Aston University, UK. Aside from his recent work on feedback, his research mainly focuses on how people remember and misremember past experiences. His Twitter handle is @DrRobNash and his website is www.robert-nash.com. Dr. Naomi Winstone is Lecturer in Higher Education at the University of Surrey, UK. Her research explores psychological aspects of learning, assessment, and student engagement in Higher Education; her webpage is www.surrey.ac.uk/dhe/people/dr_naomi_winstone/

Do your students engage with your feedback comprehensively and with an open mind? Plenty of anecdotal and empirical evidence tells us that the answer might be “no.” Indeed, our own students have openly told us that their strategies include hiding their written feedback in a drawer, reading it only when they like their grade, or even burning it! It hopefully goes without saying that strategies such as these represent a significant problem. Most teachers spend a considerable chunk of their time giving feedback to students, aiming to help them develop their skills. But merely receiving feedback does not lead people to develop their skills: using feedback does.


If a sick patient fails to get better because they are furtively stockpiling their medication in a cupboard, then the doctor's best remedy is not to increase the patient's dosage. In the same vein, the solution to the feedback problem is not always to give more feedback. One alternative is to invest some of our valuable, but ultimately finite, resources in helping students to become better users of feedback. Plenty of good practice already exists in this area, and because student apathy is by no means the only diagnosis for the problem, we believe it is an invaluable approach for everyone involved.

Like many other researchers, we've spent quite some time pondering how we actually get students to reflect on and use their feedback more actively. Nearly three years ago we began a process of systematically reviewing the academic literature on this topic, trying to find out what is collectively known. What determines whether or not learners engage with feedback? What can we do to get them engaging better, and is there any good empirical evidence that these interventions work? The outcome of this process is a newly published paper in the journal Educational Psychologist, which is fully open access to anybody who wants to read it (1).

Our review drew together evidence from 195 academic papers and conference proceedings published between 1985 and 2014. Some were of high quality, some less so; some reported original data, some were purely theoretical. But all of them in some way explored the topic of how learners behave in response to feedback. We should emphasize that the papers captured by our review focused heavily on students in Higher Education. This is not to say that teachers at other levels of education do not consider these issues (far from it), but perhaps their work is less likely to find its way into the academic literature.

Among the papers in our review, we found myriad ways in which practitioner-researchers have tried to improve their students’ use of feedback. Some tried delivering feedback in audio or video format rather than in written form. Some developed electronic portfolios for their students to track their feedback over time. Some used one-to-one drop-in sessions; some gave their students access to written guidance and resources; some used innovative peer-assessment activities.


We'll be honest, though: across many of these interventions, the empirical evidence of effectiveness was underwhelming. That's not to say the interventions were useless, necessarily. Rather, it was simply rare for any of these promising ideas to have been examined more than once, so there was little converging evidence for any of them. And even when they had been, a vast portion of the evidence came from questionnaires or focus groups of students, rather than from directly observing or measuring their behavior. As a result, the literature tells us a great deal about what learners think helps them to engage with feedback, but relatively little about what objectively does help.

Taking a step back from specific interventions and the evidence of their effectiveness, we began to think about a more fundamental question: Why should any intervention work? In other words, which learning skills should our interventions enhance, and how should this enhancement in turn affect students’ engagement with feedback? We examined each intervention to find an explicit or implicit rationale for its use. We then gathered these rationales together, and looked for common themes. Four such themes emerged, which we have called the SAGE processes:

  • Self-appraisal
  • Assessment literacy
  • Goal-setting and self-regulation
  • Engagement and motivation

Each of these SAGE processes represents a broad set of (meta)cognitive skills underlying engagement with feedback that researchers have tried to target in their interventions. Our review helps to map how practitioner-researchers have tried to tackle each of them. By extracting these conceptual themes, we can also begin to think of our interventions in more strategic ways. We can see that no single intervention is likely to cover all of these bases: potentially a ‘package’ of interventions is what’s needed. A teacher might, for example, use peer-assessment exercises as a means to promote students’ assessment literacy skills, but might also need something else if part of the problem relates to their goal-setting and self-regulation skills. An effective solution to the problem of weak engagement with feedback may be one that, by combining several interventions, maps strongly onto all of the prerequisite skills.

In September, the UK’s Higher Education Academy published our Developing Engagement with Feedback Toolkit (DEFT) (2). The DEFT is one such package of interventions (though by no means a perfect, comprehensive, or empirically validated one), which we developed in collaboration with students, taking inspiration from here and there as we conducted our systematic review. The teaching resources in the DEFT are designed to tackle some of the key barriers that our students tell us prevent them from using feedback. It includes a feedback guide that our students played a major role in producing; discussion topics and activities for inclusion within a “Using Feedback Effectively” workshop; and resources for compiling a feedback portfolio. All of the resources can be downloaded in an editable format so that individual teachers can selectively adapt any bits they see as potentially beneficial to their own students.

As we continue our search for effective interventions, an unexpected side-effect has become apparent: we are finding ourselves increasingly aware of our own avoidant and defensive reactions when receiving critical feedback. Somehow this awareness feels valuable: by reflecting on the barriers that we put up in these situations, we might be better equipped to intervene when our students put up the same barriers (and indeed, better equipped to change our own behavior). Certainly there's more we could do to model for students the fact that defensive reactions to feedback are human rather than a sign of inadequacy, and to provide springboards for discussing helpful and unhelpful responses. So maybe next time you receive infuriating feedback, you could try thinking of it as a "teachable moment."


References:

(1) Winstone, N. E., Nash, R. A., Parker, M., & Rowntree, J. (in press). Supporting learners’ agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist.

(2) Winstone, N. E., & Nash, R. A. (2016). The Developing Engagement with Feedback Toolkit (DEFT). York, UK: Higher Education Academy.