Why digging a little deeper into educational research matters

Date: 24th May 2021 | Author: Guest Author | Categories: Policy and News

This post was initially published last year on the Education Endowment Foundation's blog here. The EEF is an independent charity dedicated to breaking the link between family income and educational achievement via research into teaching and learning.

EEF Toolkit

The Teaching and Learning Toolkit, and its counterpart the Early Years Toolkit, are heavily visited by teachers and school leaders. They are both incredibly useful, but there are pitfalls to avoid, and you will get much more from them if you scratch beneath the surface and read the full summaries. In our blog here, we set out five questions that you can ask:

  • How effective is this intervention? 
  • Which specific approaches are most effective?
  • Who benefits the most from this kind of approach?
  • What are the problems/gaps/areas of concern?
  • Any other factors worth considering?

[Image: the 'Setting or Streaming' strand from the EEF Teaching and Learning Toolkit]

When dealing with a meta-analysis such as this, the overall rating can hide good and bad studies. Take 'Setting or Streaming', for example. The EEF Toolkit states that its impact is negative (-1 month's progress). But the summary includes the following passage:

The evidence suggests that setting and streaming has a very small negative impact for low and mid-range attaining learners, and a very small positive impact for higher attaining pupils. There are exceptions to this pattern, with some research studies demonstrating benefits for all learners across the attainment range.

So the average is negative, but there are examples where the impact is positive. And you see this idea of variability in other summaries, such as that for Digital Technology:

Studies consistently find that digital technology is associated with moderate learning gains: on average, an additional four months’ progress. However, there is considerable variation in impact.

And if you want to explore this even further, the summary links to the studies that fed into the meta-analysis.

Similar nuances can be seen in evaluations of EEF-funded projects. It’s easy to skim the surface of an evaluation and conclude that it worked or it didn’t. But the reality is that everything is more complicated than that. It’s always worth reading further. 

For example, following a successful efficacy trial of Catch Up Literacy, where the EEF noted an effect size of +3 months, the scaled-up version showed less promise. One of the conclusions of the evaluation report was that implementation was an important factor, a nuance that is lost without reading the report:

The intervention was not always delivered as intended. Some schools struggled to resource two one-to-one sessions per week, while in other schools TAs adapted how they delivered individual sessions from what they were taught in the training.

Beyond the headlines

Interesting new research can often be published with an attention-grabbing headline in the press. 

When the Improving Behaviour in Schools guidance report launched, one such headline was 'Greeting pupils at the door improves behaviour'. When our research lead Luke Swift looked a little further into the original study (Cook et al., 2018), he saw that 'positive greeting at the door was only one aspect of the intervention evaluated'.

Four of the researchers were kind enough to discuss the paper with him, and while the headline is not exactly wrong, it certainly hides the nuance. Andrew Thayer said:

A simple "hello" is not necessarily the greeting we are looking for. If you must, pair it with an open-ended question that CANNOT be answered with one word like "good" or "fine." Instead, I often recommend teachers do one of two things with their greeting: 1) implement behaviour specific praise ("You have been so quiet in line, thanks so much. I can tell today is going to be a good day."); or 2) a specific relationship question, like, "Hey did you manage to win some games in Fortnite last night?"

While you may not go so far as the intrepid Luke Swift, the nuance is always there to be found. And you'll find it any time complex evidence is reduced to a soundbite or a quick summary.

Read Part 1 and Part 2 of Luke's blogs.

Following the evidence

Sometimes a lot of the how and why is lost by the time a message about evidence is communicated. One example of this that we encountered recently was in the EEF's Metacognition and Self-Regulated Learning guidance report.

Recommendation 7 is that 'Schools should support teachers to develop their knowledge of these approaches and expect them to be applied appropriately', and within that a bullet point suggests that 'Teachers can use tools such as ‘traces’ and observation to assess pupils’ use of self-regulated learning skills'.

This reference to 'traces' is merely the tip of an evidence iceberg, and it is worth exploring to understand some of the nuance. It took me to the Metacognition and Self-Regulation: Evidence Review, where the following appeared:

...researchers have advocated the use of real-time rather than retrospective measures, collecting indicators of self-regulation as students are completing a particular task. Two main types are identified in Dent and Koenka’s (2015) review: traces and think aloud protocols. Traces are observable signs of cognitive strategies students use while completing a task, such as underlining a passage or making notes alongside a piece of text. These are not reliant on self-report, but have their own inherent biases and issues, such as the fact that it is not easy or even possible to establish metacognitive processes underlying these cognitive strategies, and that such strategies may themselves be used rather unthinkingly where students are taught or expected to do so by teachers.

Not only did this reveal some of the nuances and limitations of 'traces', but it pointed me to additional research (Dent and Koenka, 2015), which in turn pointed me to further research (Winne and Perry, 2000). The latter revealed much finer detail about the concept of traces, and broadened my understanding beyond that simple bullet point in the guidance.

We have used so many metaphors in this post: digging deep; scratching the surface; icebergs. Whatever the metaphor, look for the nuances.

References:

Cook, C. R. et al. (2018) 'Positive Greetings at the Door: Evaluation of a Low-Cost, High-Yield Proactive Classroom Management Strategy', Journal of Positive Behavior Interventions, 20(3), pp. 149-159.

Dent, A. L. and Koenka, A. C. (2016) 'The Relation Between Self-Regulated Learning and Academic Achievement Across Childhood and Adolescence: A Meta-Analysis', Educational Psychology Review, 28, pp. 425-474.

Winne, P. H. and Perry, N. E. (2000) 'Measuring Self-Regulated Learning', in Boekaerts, M., Pintrich, P. R. and Zeidner, M. (eds) Handbook of Self-Regulation. Academic Press, pp. 531-566. doi: 10.1016/B978-012109890-2/50045-7.

Mark Miller, the author of this post, is Head of Bradford Research School.
