No legislation, no bad thing

Conor Ryan suggests that the absence of education legislation in the Queen’s Speech may be a blessing for education reform

When I was a special adviser at the Department for Education in the late 1990s, the extent to which a Department achieved legislation in the Queen’s Speech was seen as a measure of its success. This essentially macho test often led to more legislation than was strictly necessary to achieve policy goals.

Many people forget that although the Labour government needed legislation to restrict most infant class sizes to 30, it needed no legislation to introduce the literacy and numeracy hours in primary schools. The latter were the result of a mix of persuasion and accountability, and were arguably more effective as a result.

And legislation was too often used as a way to trumpet changes that could have been introduced less dramatically. Trust schools – the centrepiece of Tony Blair’s 2005 education reforms – were a good example. As with Michael Gove’s first academies legislation, the essential architecture was already in place, and what changes were needed could have been introduced with less fanfare through regulations.

So, it is no bad thing that there was no education legislation in this year’s Queen’s Speech. Of course, that didn’t stop the Government using the occasion to get Her Majesty to remind Parliament of changes already in train, such as the curriculum overhaul or performance pay for teachers.

But nobody would argue that Michael Gove is any less powerful because he hasn’t got a fifty- or hundred-clause bill to take through Parliament over the next twelve months. And I doubt any of his junior ministers – who would be tasked with the legwork – is overly concerned either.

However, what it does mean is that it is all the more important that changes the Government is introducing get the scrutiny they deserve, and that they are subjected to the sort of rigorous evaluation – usually through randomised controlled trials – that the Sutton Trust and the Education Endowment Foundation are using.

That is important not just for ministers who want to ensure that their reforms are making a difference to results, particularly for the poorest pupils, but also if they are to gain buy-in from teachers and headteachers.

With more than half of all secondary schools now having academy status, as well as a growing number of free schools and university technical colleges, schools are getting used to having more freedoms than before. And while complex legislation can be important on some issues – such as ensuring a fair admissions code – it is a blunt instrument over issues such as the curriculum or performance related pay.

Over-complexity militates against successful reform. When Estelle Morris first introduced performance pay in 2000, the intervention of the teaching unions ensured the whole process was wrapped in endless bureaucracy.

Leave aside for a moment the perfectly valid issue of the impact of PRP on attainment – though Gerard Kelly’s recent TES piece shows why in this case there are other issues to consider – the real problem is that legal issues came to outweigh the flexibility that allows heads to reward good teachers in a straightforward way. A less complex system may prove more effective in overcoming the culture against PRP in some schools. And we might then have some serious research on the issue too.

But those increasingly independent state schools will equally need to be persuaded on the curriculum – including on the detail now planned in subjects like history – and on other issues where ministers feel strongly. As they do so, it is important that they use evidence rather than past practice or even DfE guidance to make their decisions.

That’s why the increasing popularity of the Sutton Trust/EEF Toolkit is so important. Next week, we plan to publish new evidence of just how popular it is becoming. But in the meantime, we should reflect that giving Michael Gove and schools a break from the 2014 Education Act is not only no bad thing, it may allow the breathing space needed for genuine reform to take place.

Worlds Apart

Lee Elliot Major on the challenge of getting research to impact on education practice

As I revealed the next slide, there was an audible gasp among the 200-strong audience. I knew at that moment I had lost the crowd. Whatever I said next, everyone’s mind was focused on the big fat zero that sat at the bottom of the impact table staring out from the screen. As I turned to face the angry faces, I saw that the throngs of hard working, decent teaching assistants (TAs) had turned into a lynch mob.

On reflection, presenting the findings of the Sutton Trust-EEF teaching and learning toolkit to TAs at the end of a long hard term was not the best idea. This guide to the best (and worst) bets for improving results in the classroom shows that TAs have on average zero impact on the attainment of children. Now, as I told my audience, that doesn’t mean we should sack all classroom assistants. But it does mean better training, preparation and management are needed to enable the 220,000 TAs in our schools (costing the public purse over £2 billion a year) to help our children learn.

Sadly this nuance was lost as the discussion descended into an increasingly fractious argument. No amount of caveats and constructive comments could calm the enraged ranks of TAs. All they could see was an attack on their livelihoods. I returned to London that Friday afternoon feeling like I had been mauled in a playground fight.

This admittedly was one of the more contentious toolkit talks I have given to schools during the last two years. The experience highlighted the potential evidence has to improve practice and policy, and the power a succinct, accessible summary of research can have. But it also demonstrated the huge challenge of enabling evidence to actually impact on classroom practice in a constructive and useful way.

I’ve been reflecting on all this as I prepare a talk at the Institute of Education this week on how research can impact on policy and practice.

What I will say will seem blindingly obvious, but is almost universally ignored. My ‘take home’ message is that we must acknowledge the fundamental cultural differences between the worlds of media, academe, policy and practice – if we are to reach the promised land of evidence-based practice. We must recognise that communication is, as an academic might say, a ‘highly non-trivial task’.

Each of these worlds has its own jargon, beliefs, rules and aims. As with working across different countries, we need genuine translation and effort from all sides to make it work.

As a former news editor, my one piece of advice to reporters was to spend as much time on the writing and presentation of articles as on gathering the news itself. What’s the point if no-one will read what you have found? I now hold the same view for the work of an education foundation: our toolkit has been successful because we spent many hours thinking carefully about how to present the often abstract and complex findings of education research.

But after years of working with schools, I’m afraid I’ve had to reassess this rule. To effect genuine change, this is just the start: much more has to be done, and in the schools themselves. Powerfully presented evidence isn’t enough. There are countless examples of things we know work but fail to embrace. We don’t do exercise – even though we know it’s good for us. Doctors still fail to wash their hands regularly – the simplest of medical safeguards.

For evidence-based education to work, we will need to free up time for teachers to consider research. We may need to create research leaders in every school. Inspectors may need to encourage the use of evidence more when they visit schools.

This I’m glad to say is the increasing preoccupation of the Education Endowment Foundation as it strives to find out what works in schools. It won’t be an easy task: as with the assembled TAs during my talk, we all tend not to want to listen to evidence that confronts our own prejudices – even when the messenger has the best of intentions.

Evaluating the impact of widening participation initiatives

Lee Elliot Major argues for a more evidence-based approach to university access work.

It is nothing short of a scandal that the vast majority of work in our universities and colleges aimed at opening doors to students from low- and middle-income homes is not evaluated properly. We spend over £1 billion a year on programmes to widen participation and broaden access into our academic elites; yet we know very little about what impact most of these efforts are having. Well-intentioned efforts to aid social mobility – from school outreach programmes to financial support for students – are effectively operating in the dark, uninformed by any hard evidence of what has worked before.

The problem has come to light again with the release of a report for the Higher Education Funding Council for England (Hefce) which “found little evidence that impact is being systematically evaluated by institutions”. Previous reports have revealed a lack of even the most basic monitoring of data and outcomes across the sector, prompting the English funding council to issue guidance on evaluation.

The national strategy unveiled by Hefce and the Office for Fair Access (Offa) meanwhile has recommended a light-touch network of regional coordinators to facilitate collaboration between universities and schools. This sounds suspiciously like ‘AimHigher light’ – a slim-line version of the previous national outreach programme in England. AimHigher was cut in the last Whitehall spending review for lack of evidence of its impact. A lot of good work was undermined by the absence of hard data.

The gathering of robust evidence remains the Achilles heel of the sector. It seems tragic that this should be so in our respected seats of learning. Once when the Sutton Trust offered to evaluate an outreach scheme at a highly prestigious UK university, the head of access declined, arguing that they would rather use the extra money to help more students.

The problem with this response is twofold. First, we didn’t (and still don’t) know whether the programme was actually having any impact on the students taking part. Second, if we did evaluate it, the lessons could enable many thousands more students to be helped properly in the future. The current default – simply surveying participants to see if they enjoyed the experience – is no longer good enough. The question must be asked: did the programme affect students in a way that would not have happened had the programme not existed? Did it enable students from poorer backgrounds to enter university who otherwise wouldn’t have done so?

But there are signs that the tide is at last turning. To its credit Offa is urging institutions to adopt a more evidence-based approach. What is now needed is the full mix of evaluation and monitoring – local pilot studies as well as national randomised trials – to measure the outcomes of access work.

Universities can look to the work we have been doing with schools on interventions in the classroom to learn some of the basic principles. The DIY evaluation guide published by the Education Endowment Foundation (EEF) offers simple advice on how to evaluate the impact of a programme at a local level. This is about combining professional judgment with knowledge of previous evidence to devise a programme, and then monitor outcomes of participating students in comparison to similar students not on the programme. The Trust is currently developing a common evaluation framework for all of its programmes. This will enable evaluations for small projects without the resources to commission an independent evaluation themselves.

The Government recently designated The Sutton Trust and EEF as the ‘What Works centre’ for education following the publication of our highly successful toolkit for schools. The Trust is currently developing an ‘HE access toolkit’, which we hope will summarise current evidence on the impact of access work in an accessible format – although it is not clear how much it will be able to say, given the paucity of research in the field.

Undertaking ‘gold standard’ evaluations which involve selecting participants at random to ascertain genuine impact remains a tricky task. But the Sutton Trust has already funded a feasibility study on how a proper randomised controlled trial (RCT) might be undertaken for an access programme. We are now considering commissioning a fully fledged RCT.

Even if RCTs are currently a step too far for others, then evaluations need at least to involve the use of comparison groups. Two examples of such usage can be seen in recent evaluations commissioned by the Trust. Our review of summer schools used UCAS university admissions data to compare the outcomes of summer school students against similar students not on the programme. The Reach for Excellence programme meanwhile constructed a comparison group from students who qualified but didn’t enrol on the programme.
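The comparison-group logic described above can be sketched in a few lines of Python. This is a minimal illustration only: the progression counts and group sizes below are hypothetical, and a real evaluation would also need to match the groups carefully on prior attainment and background.

```python
# Illustrative sketch: comparing progression rates between programme
# participants and a matched comparison group (hypothetical data).
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: is the difference in rates likely real?"""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a - p_b, z, p_value

# Hypothetical numbers: 190 of 250 participants progressed to a leading
# university, against 137 of 250 in the matched comparison group.
diff, z, p = two_proportion_z(190, 250, 137, 250)
print(f"difference in rates: {diff:.0%}, z = {z:.2f}, p = {p:.4f}")
```

The point of the exercise is not the statistics but the design: without a comparison group, a high progression rate among participants tells you nothing about what those students would have done anyway.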

If I had my way every access programme would require an evaluation that met these basic standards. Robust evaluation is not easy to do, costs time and money, and often produces awkward and humbling results. But not to do so is, in the end, to fail the students we are trying to help.

This blog post first appeared on Westminster Briefing.

Evidence is just the start of finding what works

Lee Elliot Major and Steve Higgins, Professor of Education at Durham University, argue that effective implementation matters once the evidence for what works has been identified.

In England we spend around £2 billion a year on 170,000 teaching assistants in schools. The total impact of this money on the attainment of our children is zero.  The best evidence we have indicates that for every TA who improves a pupil’s progress there is another whose deployment has a negative impact.

This is a powerful example of why we need evidence based policy and practice, but it also highlights the difficulties of promoting changes to improve practice – because finding that TAs are deployed ineffectively does not tell you what to do about it.

Such issues are soon to be faced across government, following the launch last week of a network of What Works centres to champion evidence-based social policy.

At the launch, Oliver Letwin, Minister of State at the Cabinet Office, said that the biggest question was why government hadn’t done something like this before. But if government hasn’t, others have, and the centres will be building on existing practices of evidence use in health and education.

In health, the Cochrane Collaboration this year celebrates two decades of producing systematic research reviews. Its approach has shaped advice offered by the National Institute for Health and Clinical Excellence on NHS treatments.

The cultural shifts that introduced evidence-based medicine to surgeries and hospitals 20 years ago are now playing out in classrooms and schools. The What Works education centre will use a toolkit we developed three years ago summarising thousands of studies to show the best and worst bets for improving pupils’ results. It is the model now being advocated by the Cabinet Office for other policy areas.

Since 2011, the Education Endowment Foundation, a charity that aims to improve the educational achievement of disadvantaged children, has developed this toolkit into a resource for disseminating evidence across the sector. The EEF has overseen a quiet revolution in England’s schools, commissioning some 40 randomised controlled trials so far.

The toolkit shows that working on teacher-learner interaction – improving feedback to pupils, for example –  gives the biggest bang for the education buck.  Yet our surveys reveal that most head teachers prioritise actions that evidence suggests, on average, have little impact: reducing class sizes or recruiting TAs.

In education, the route to evidence-based policy is particularly challenging, because the navigation instruments are less powerful and predictable than in medicine.  Imagine a world where the laws of nature vary through time and space. That is the reality across the thousands of different classrooms, teachers and children that make up our education system. Over the last 30 years curriculum and assessment have changed many times, and variation in schools and teachers has a profound impact on how an intervention works or doesn’t work.

We have little idea how to help schools implement the best bets for improvement. Some may need a highly prescriptive programme; others, general principles to shape and evaluate a tailored programme.

To return to TAs, for example, the evidence does not mean that they should be scrapped. There are many ways in which TAs might be better recruited, trained, deployed and evaluated. Some approaches will be more effective in lower-performing schools, schools serving a high proportion of children with special needs, or schools with particular teachers. Knowing what works where and for whom could improve a school’s choices about TAs and everything else.

A commitment to what works (strictly, what’s worked) in education must also consider the constantly changing pedagogical landscape. Take phonics teaching: if the current emphasis on phonics becomes routine, then remedial support based on phonics is likely to become less effective than research currently suggests. Children who have failed in a phonics-rich pedagogy may benefit more from a different remedial style.

These are important lessons for the other four planned What Works centres. Evidence can be boring or inconvenient for politicians more interested in an immediate and popular policy fix. But, as Letwin stressed, “this is only the start of a journey, not the destination”, and the outlook at this early stage is promising. This programme has the potential to revolutionise public policy in areas as diverse as ageing, education and policing, replacing dogma and tradition with research and randomised trials. Billions of pounds of public money could be saved.

This post first appeared on Research Fortnight.

What Works – A winning formula

Peter Lampl welcomes the designation of the Sutton Trust and EEF as the What Works Centre for Education

This week the Sutton Trust was, together with the Education Endowment Foundation, designated the What Works evidence centre for education by the Government. There will be six leading evidence centres and we and the National Institute for Health and Clinical Excellence (NICE) have been selected to lead on education and health respectively. The centres will be the first port of call for advice on the latest research on the impact of Government programmes.

This is recognition of the Sutton Trust’s focus on evaluation and research in all the work it does. We have always aspired to subject our programmes to robust review. And as an independent foundation we have used evidence to challenge or support the Government’s education policies.

The Trust has funded over 120 research studies in the areas of social mobility and education. But it is primarily a ‘do tank’. Our flagship summer school programme, for example, is now the largest national university access scheme – but it is also the most scrutinised programme in this field.

We know they have impact: over three quarters (76%) of summer school attendees go on to a leading university, compared with only 55% of students with similar backgrounds who aren’t on the programme. We also know they are highly cost-effective: when Boston Consulting Group did a cost-benefit analysis of the Trust’s programmes – comparing the lifetime earnings benefits for the individuals on the schemes with the money spent – summer schools were among the programmes resulting in returns of over 20:1.

It was these disciplines – assessing the evidence on what works, assessing cost-benefit, but also ensuring that the research results are presented in a clear accessible way – that underpinned the Teaching and Learning Toolkit the Trust developed for schools on what works best at improving the results of children from poorer backgrounds. The Toolkit has now been used by thousands of schools across the country, and underpins the work of the Education Endowment Foundation.

When we established the EEF in 2011 as lead foundation with Impetus our vision was that it was going to embrace the Sutton Trust’s principles and become a gigantic do tank. The aim was to improve the results of the poorest children in our most challenging schools. But it would also have the freedom to experiment, innovate and rigorously evaluate projects and scale up those that were cost effective.

Two years on I am pleased to say that this has become the reality. To date the EEF has awarded £24.4 million to 55 projects working with over 275,000 pupils in over 1,400 schools across England. It has commissioned over 40 randomised controlled trials in our schools – the gold standard for evaluations on what works. Over the coming years these studies will add greatly to our knowledge of what interventions are successful in the classroom.

But with research, you have to take the rough with the smooth. Not all the Sutton Trust’s research findings have been welcome. In 2005 the Trust jointly funded a five-year study with the Department for Business, Innovation and Skills and the College Board into the US SAT aptitude test as a potential additional tool in the selection of candidates for universities.

In particular the National Foundation for Educational Research study aimed to find out whether the SAT test could identify highly able non-privileged students whose potential was not being reflected in A-levels because of their circumstances. After five years tracking the results of thousands of sixth formers who went on to attend university, the study concluded that the SAT added little extra information to that provided by A-levels.

If the Government is true to its word on ‘evidence-based policy’ then it will have to face up to this reality. The research may not always confirm prior convictions or favoured policies, and almost always throws up some unexpected results. That’s why I think it is important the EEF and the Sutton Trust remain fiercely independent and make public all the evidence we produce. As the Government’s What Works evidence centre for education, these will be our guiding principles.

Why teachers can’t call themselves a profession

Lee Elliot Major on the need for an evidence-based approach in the classroom

I have often thought that commentators who want to criticise teachers should first pass the ‘teacher test’ to earn the right to do so. Having spent time in front of an inner-city classroom (with a teacher beside me) I can tell you it is one of the most challenging (and rewarding) experiences I have ever had. And that was just for one hour!

We don’t value our teachers anywhere near enough. Few of us really understand (or could cope with) the demands of their job. Just ask Sir Peter Lampl, a hard-nosed business leader, who once thought teachers had it easy. After 15 years at the helm of the Sutton Trust he now talks only of admiration for the inspirational educators of our children.

But teachers remain vulnerable to one well-founded attack. Can they call themselves a true modern-day profession? I’m afraid not. And one of the main reasons is this. Teachers have yet to embrace an evidence-based approach to their work: there is no accepted body of knowledge, based on robust research, to inform what they do (or don’t do); nor is there a culture of investigation to evaluate what works best in their particular school or classroom. The contrast with the modus operandi of medics could not be starker.

Below I have adapted a famous graph in education policy circles, first produced during New Labour’s early education reforms of the late 1990s.

[Graph: the knowledge underpinning teachers’ work, from ‘knowledge poor’ upwards, plotted against teacher autonomy]

The graph describes the different phases of teaching during the last half century. It contrasts them in terms of the knowledge used to underpin the work of teachers, and the levels of autonomy they have enjoyed.

Before 1988 teachers were essentially practitioners free to pursue their own ways of working, with little reference to the body of research on what worked best. Then came the Big Bang of Baker’s Government reforms – the league tables, inspections and the national curriculum – that prescribed exactly what was expected in the classroom. A decade later under New Labour, another wave of top-down programmes emerged – the national numeracy and literacy strategies. These initiatives were based at least in part on the evidence of their impact.

The last phase of the graph highlights what has yet to be realised: the promised land of teachers as informed autonomous professionals, and no longer in need of Government direction.

The good news is that an accessible summary of education research on what works to raise attainment is now available. Last week saw the re-launch of the Sutton Trust-Education Endowment Foundation Toolkit. This is the latest generation of the guide, updating the evidence first released two years ago. It challenges many of the assumptions among teachers – revealing the limited impact of reducing class sizes and of the current deployment of teaching assistants. This is one important step in the journey towards a teaching profession that embraces evidence. But as Ben Levin stresses, it is really only the start: everyone knows that exercise is good for them, but that doesn’t mean we all do it, does it?

I see at least three major challenges for the deep cultural reforms needed for teaching to evolve into the respected profession it should be.

The first is the glacial timescale for change. The education systems that have improved across the world required concerted leadership and effort over several years. This is the “grind not the glamour” that the EEF’s chief executive, Kevan Collins, talks about.

The second is the danger of Whitehall’s heavy hand, which can stifle rather than stimulate change. The long hard road to reform extends well beyond Whitehall fads (evidence-based policy is currently back in vogue) and Parliamentary cycles.

The third is that the change will have to come from teachers themselves. It is striking that in all the reviews of the nations at the top of the global education rankings, the common watchwords are professionalism, professional development, evidence and research.

The Trouble With Boys

Conor Ryan reflects on David Willetts’ latest initiative to persuade more white working class boys to study at university.

Universities minister David Willetts was quick off the blocks for 2013 with his ideas on how to encourage white working class boys to go to university.

Young women are now a third more likely than young men to go to university, and there is a three-fold gap in applications between the poorest and richest neighbourhoods. In an article and interview for The Independent, Mr Willetts said that the education system “seems to make it harder for boys and men to face down the obstacles in the way of learning.”

He told the paper that the Office for Fair Access “look[s] at a range of disadvantaged groups – social class and ethnicity, for instance – when it comes to access agreements, so I don’t see why they couldn’t look at white, working-class boys.”

The Minister has a point. A growing body of research suggests that white working class boys perform less well than many minority ethnic communities in their test and exam results.

Stephen Machin and Sandra McNally, in a 2006 LSE study, identified a stronger gender gap in secondary than primary schools. They argued that “the importance of coursework in the GCSE examination is likely to be a key explanation for the emergence of the gender gap at age 16.” They also identified differences in teaching and learning styles, and modes of assessment.

In 2007, Joseph Rowntree Foundation research, conducted by Robert Cassen and Geeta Kingdon, with some Sutton Trust input, found that nearly half of all students defined as low achievers were White British males. White British students on average – boys and girls – were more likely than other ethnic groups to persist in low achievement.

National College research by Denis Mongon and Christopher Chapman from Manchester University with the National Union of Teachers in 2008 suggested that some school leaders were better than others at narrowing this gap. They suggested that clear strategies – including the relentless application of the highest standards in teaching and close attention to data – were key where the gap was lower.

They rightly pointed out that the social class gap is much wider than any gender gap, yet the data suggest that white working class boys are at an even greater disadvantage than white working class girls. This lower attainment can translate into lower ambitions, as reflected in applications to Sutton Trust summer schools – an important route for many low and middle income young people into leading universities.

Sutton Trust summer schools target students who would be the first in their family to go to university. In 2012, there were 5,295 applicants from girls and 2,712 from boys, a ratio of 2:1. Even with a slightly higher acceptance rate among boys than girls, 62% of attendees were girls and just 38% boys.

Recent exam data bears out both the gender and socio-economic gaps. The 2012 Key Stage 2 test data suggests that 60% of White British boys eligible for free school meals reached level 4 in English and maths, compared with 67% of White British FSM girls. This is larger than the three-point gender gap among all other pupils. However, there is a six-point gender gap across all FSM pupils. The big difference is in English, where the gender gap among FSM pupils is 12 points.

On the main 5 GCSE indicator (including English and maths), 2011 data (2012 data is due later this month) shows that 26% of White British FSM boys reached this standard compared with 32% of girls – a slightly smaller gender gap than exists for all other pupils, but one consistent with the gap at age eleven. On overall performance, only traveller boys now perform worse at GCSE: 33% of Black Caribbean boys, for example, now reach the 5 GCSE standard (though overall White British students perform ten points better than Black Caribbean students).

So what can we do? David Willetts is surely right to want universities to provide concerted help through summer schools over several years to lift aspirations. The Trust is planning to use this approach through programmes with Kent academies and work with University College London to support highly able pupils from Year 8 onwards over five years.

Of course, Michael Gove’s changes to the exam system and curriculum – more facts, more end-of-course testing – may reduce the overall gender gap, as girls are believed to perform better in coursework.

But, in addition to OFFA looking more closely at the data, a more concerted focus on white working class boys could also be productive. When the Labour government targeted Afro-Caribbean achievement in the late 1990s, after the Stephen Lawrence murder, it set clear goals and a strong focus that has particularly benefited FSM pupils.  The London Challenge will have been of particular benefit, with its strong focus on leadership, teaching and data.

Today, our sister charity, the Education Endowment Foundation, is testing the most effective ways of lifting achievement for pupils in receipt of free school meals, and it has the potential to make a real difference in narrowing attainment gaps. 70% of its target group is White British.

The experience of minority ethnic communities suggests cultural change is also important. Bangladeshi students used to perform relatively poorly in schools. Now they out-perform White British students overall, and 56% of Bangladeshi students eligible for free school meals – including 53% of FSM boys – reach the five GCSE benchmark. That change owes a lot to a community’s desire to learn, backed by parents and teachers working to meet that desire.

Harnessing a similar will to learn in white working class communities must be a part of the solution to the low attainment of too many of their boys – and girls.