Worlds Apart

Lee Elliot Major on the challenge of getting research to impact on education practice

As I revealed the next slide, there was an audible gasp from the 200-strong audience. I knew at that moment I had lost the crowd. Whatever I said next, everyone’s mind was fixed on the big fat zero that sat at the bottom of the impact table, staring out from the screen. As I turned to face the angry faces, I saw that the throngs of hard-working, decent teaching assistants (TAs) had turned into a lynch mob.

On reflection, presenting the findings of the Sutton Trust-EEF teaching and learning toolkit to TAs at the end of a long, hard term was not the best idea. This guide to the best (and worst) bets for improving results in the classroom shows that TAs have, on average, zero impact on the attainment of children. Now, as I told my audience, that doesn’t mean we should sack all classroom assistants. But it does mean better training, preparation and management are needed to enable the 220,000 TAs in our schools (costing the public purse over £2 billion a year) to help our children learn.

Sadly this nuance was lost as the discussion descended into an increasingly fractious argument. No amount of caveats and constructive comments could calm the enraged ranks of TAs. All they could see was an attack on their livelihoods. I returned to London that Friday afternoon feeling like I had been mauled in a playground fight.

This, admittedly, was one of the more contentious toolkit talks I have given to schools during the last two years. The experience highlighted the potential evidence has to improve practice and policy, and the power a succinct, accessible summary of research can have. But it also demonstrated the huge challenge of enabling evidence to actually impact on classroom practice in a constructive and useful way.

I’ve been reflecting on all this as I prepare a talk for the Institute of Education this week on how research can impact on policy and practice.

What I will say will seem blindingly obvious, but it is almost universally ignored. My ‘take home’ message is that we must acknowledge the fundamental cultural differences between the worlds of media, academe, policy and practice if we are to reach the promised land of evidence-based practice. We must recognise that communication is, as an academic might say, a ‘highly non-trivial task’.

Each of these worlds has its own jargon, beliefs, rules and aims. As when working across different countries, we need genuine translation, and effort from all sides, to make it work.

As a former news editor, my one piece of advice to reporters was to spend as much time on the writing and presentation of articles as on gathering the news itself. What’s the point if no-one will read what you have found? I now hold the same view of the work of an education foundation: our toolkit has been successful because we spent many hours thinking carefully about how to present the often abstract and complex findings of education research.

But after years of working with schools, I’m afraid I’ve had to reassess this rule. To effect genuine change, this is just the start: much more has to be done, and in the schools themselves. Powerfully presented evidence isn’t enough. There are countless examples of things we know work but fail to embrace. We don’t do exercise, even though we know it’s good for us. Doctors still fail to wash their hands regularly, the simplest of medical safeguards.

For evidence-based education to work, we will need to free up time for teachers to consider research. We may need to create research leaders in every school. Inspectors may need to encourage the use of evidence more when they visit schools.

This, I’m glad to say, is the increasing preoccupation of the Education Endowment Foundation as it strives to find out what works in schools. It won’t be an easy task: as with the assembled TAs during my talk, we all tend not to want to listen to evidence that confronts our own prejudices, even when the messenger has the best of intentions.

Evidence is just the start of finding what works

Lee Elliot Major and Steve Higgins, Professor of Education at Durham University, argue that effective implementation matters once the evidence for what works has been identified.

In England we spend around £2 billion a year on 170,000 teaching assistants in schools. The total impact of this money on the attainment of our children is zero. The best evidence we have indicates that for every TA who improves a pupil’s progress there is another whose deployment has a negative impact.

This is a powerful example of why we need evidence based policy and practice, but it also highlights the difficulties of promoting changes to improve practice – because finding that TAs are deployed ineffectively does not tell you what to do about it.

Such issues are soon to be faced across government, following the launch last week of a network of What Works centres to champion evidence-based social policy.

At the launch, Oliver Letwin, Minister of State at the Cabinet Office, said that the biggest question was why government hadn’t done something like this before. But if government hasn’t, others have, and the centres will be building on existing practices of evidence use in health and education.

In health, the Cochrane Collaboration this year celebrates two decades of producing systematic research reviews. Its approach has shaped advice offered by the National Institute for Health and Clinical Excellence on NHS treatments.

The cultural shifts that introduced evidence-based medicine to surgeries and hospitals 20 years ago are now playing out in classrooms and schools. The What Works education centre will use a toolkit we developed three years ago summarising thousands of studies to show the best and worst bets for improving pupils’ results. It is the model now being advocated by the Cabinet Office for other policy areas.

Since 2011, the Education Endowment Foundation, a charity that aims to improve the educational achievement of disadvantaged children, has developed this toolkit into a resource for disseminating evidence across the sector. The EEF has overseen a quiet revolution in England’s schools, commissioning some 40 randomised controlled trials so far.

The toolkit shows that working on teacher-learner interaction – improving feedback to pupils, for example – gives the biggest bang for the education buck. Yet our surveys reveal that most head teachers prioritise actions that evidence suggests have, on average, little impact: reducing class sizes or recruiting TAs.

In education, the route to evidence-based policy is particularly challenging, because the navigation instruments are less powerful and predictable than in medicine. Imagine a world where the laws of nature vary through time and space. That is the reality across the thousands of different classrooms, teachers and children that make up our education system. Over the last 30 years, curriculum and assessment have changed many times, and variation in schools and teachers has a profound impact on how an intervention works or doesn’t work.

We have little idea how to help schools implement the best bets for improvement. Some may need a highly prescriptive programme; others, general principles to shape and evaluate a tailored programme.

To return to TAs, for example, the evidence does not mean that they should be scrapped. There are many ways in which TAs might be better recruited, trained, deployed and evaluated. Some approaches will be more effective in lower-performing schools, schools serving a high proportion of children with special needs, or schools with particular teachers. Knowing what works where and for whom could improve a school’s choices about TAs and everything else.

A commitment to what works (strictly, what’s worked) in education must also consider the constantly changing pedagogical landscape. Take phonics teaching: if the current emphasis on phonics becomes routine, then remedial support based on phonics is likely to become less effective than research currently suggests. Children who have failed in a phonics-rich pedagogy may benefit more from a different remedial style.

These are important lessons for the other four planned What Works centres. Evidence can be boring or inconvenient for politicians more interested in an immediate and popular policy fix. But, as Letwin stressed, “this is only the start of a journey, not the destination”, and the outlook at this early stage is promising. This programme has the potential to revolutionise public policy in areas as diverse as ageing, education and policing, replacing dogma and tradition with research and randomised trials. Billions of pounds of public money could be saved.

This post first appeared on Research Fortnight