Evaluating the impact of widening participation initiatives

Lee Elliot Major argues for a more evidence-based approach to university access work.

It is nothing short of a scandal that the vast majority of work in our universities and colleges aimed at opening doors to students from low- and middle-income homes is not evaluated properly. We spend over £1 billion a year on programmes to widen participation and broaden access to our academic elites; yet we know very little about what impact most of these efforts are having. Well-intentioned efforts to aid social mobility – from school outreach programmes to financial support for students – are effectively operating in the dark, uninformed by any hard evidence of what has worked before.

The problem has come to light again with the release of a report for the Higher Education Funding Council for England (Hefce) which “found little evidence that impact is being systematically evaluated by institutions”. Previous reports have revealed a lack of even the most basic monitoring of data and outcomes across the sector, prompting the English funding council to issue guidance on evaluation.

The national strategy unveiled by Hefce and the Office for Fair Access (Offa) meanwhile has recommended a light-touch network of regional coordinators to facilitate collaboration between universities and schools. This sounds suspiciously like ‘AimHigher light’ – a slim-line version of the previous national outreach programme in England. AimHigher was cut in the last Whitehall spending review due to a lack of evidence of its impact. A lot of good work was undermined by the absence of hard data.

The gathering of robust evidence remains the Achilles heel of the sector. It seems tragic that this should be so in our respected seats of learning. Once, when the Sutton Trust offered to evaluate an outreach scheme at a highly prestigious UK university, the head of access declined, arguing that they would rather use the extra money to help more students.

The problem with this response is twofold. First, we didn’t (and still don’t) know whether the programme was actually having any impact on the students taking part. Second, if we did evaluate it, the lessons could enable many thousands more students to be helped properly in the future. The current default – simply surveying participants to see if they enjoyed the experience – is no longer good enough. The question that must be asked is whether the programme affected students in the desired way, in a way that would not have happened had the programme not existed. Did it enable students from poorer backgrounds to enter university who otherwise would not have done so?

But there are signs that the tide is at last turning. To its credit, Offa is urging institutions to adopt a more ‘evidence-based’ approach. What is now needed is the full mix of evaluation and monitoring – local pilot studies as well as national randomised trials – to measure the outcomes of access work.

Universities can look to the work we have been doing with schools on classroom interventions to learn some of the basic principles. The DIY evaluation guide published by the Education Endowment Foundation (EEF) offers simple advice on how to evaluate the impact of a programme at a local level. This means combining professional judgment with knowledge of previous evidence to devise a programme, and then monitoring the outcomes of participating students against those of similar students not on the programme. The Trust is currently developing a common evaluation framework for all of its programmes, which will enable small projects that lack the resources to commission an independent evaluation to be assessed consistently.
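To make that comparison concrete, here is a minimal sketch of the kind of local analysis the guide points towards. The figures, the group sizes and the choice of a two-proportion z-test are illustrative assumptions rather than anything prescribed by the EEF guide; the point is simply that outcomes for participants are set against outcomes for similar non-participants, and the difference is checked against what chance alone might produce.

```python
from math import sqrt, erf

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Compare an outcome rate (e.g. progression to university) between
    programme participants and a matched comparison group."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a - p_b, z, p_value

# Hypothetical figures: 120 of 200 participants progressed to university,
# against 95 of 200 similar students who were not on the programme.
diff, z, p = two_proportion_z_test(120, 200, 95, 200)
print(f"difference in progression rates: {diff:.1%}, z = {z:.2f}, p = {p:.3f}")
```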

The Government recently designated the Sutton Trust and the EEF as the ‘What Works centre’ for education following the publication of our highly successful toolkit for schools. The Trust is currently developing an ‘HE access toolkit’, which we hope will summarise the current evidence on the impact of access work in an accessible format, although it is not yet clear how much it will be able to say, given the paucity of research in the field.

Undertaking ‘gold standard’ evaluations, which involve selecting participants at random to ascertain genuine impact, remains a tricky task. But the Sutton Trust has already funded a feasibility study on how a proper randomised controlled trial (RCT) might be undertaken for an access programme. We are now considering commissioning a fully fledged RCT.
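For illustration, random selection at its simplest means deciding by lot, before any outcomes are known, which eligible applicants receive the programme and which form the control group, so that the two groups differ only by chance. The sketch below shows just that idea; the applicant identifiers and the 50:50 split are assumptions, and a real trial would typically stratify by school and prior attainment before randomising.

```python
import random

def assign_at_random(applicant_ids, seed=2024):
    """Split eligible applicants into a programme group and a control group
    purely by chance, before any outcomes are observed (illustrative only)."""
    ids = list(applicant_ids)
    random.Random(seed).shuffle(ids)
    midpoint = len(ids) // 2
    return {"programme": ids[:midpoint], "control": ids[midpoint:]}

applicants = [f"applicant_{i:03d}" for i in range(1, 201)]
groups = assign_at_random(applicants)
print(len(groups["programme"]), len(groups["control"]))  # 100 100
```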

Even if RCTs are currently a step too far for others, evaluations need at least to involve the use of comparison groups. Two examples can be seen in recent evaluations commissioned by the Trust. Our review of summer schools used UCAS university admissions data to compare the outcomes of summer school students against similar students not on the programme. The evaluation of the Reach for Excellence programme, meanwhile, constructed a comparison group from students who qualified for but did not enrol on the programme.

If I had my way, every access programme would require an evaluation that met these basic standards. Robust evaluation is not easy to do, costs time and money, and often produces awkward and humbling results. But not to do so is, in the end, to fail the students we are trying to help.

This blog post first appeared on Westminster Briefing.

Evidence is just the start of finding what works

Lee Elliot Major and Steve Higgins, Professor of Education at Durham University, argue that effective implementation matters once the evidence for what works has been identified.

In England we spend around £2 billion a year on 170,000 teaching assistants in schools. The total impact of this money on the attainment of our children is zero.  The best evidence we have indicates that for every TA who improves a pupil’s progress there is another whose deployment has a negative impact.

This is a powerful example of why we need evidence based policy and practice, but it also highlights the difficulties of promoting changes to improve practice – because finding that TAs are deployed ineffectively does not tell you what to do about it.

Such issues are soon to be faced across government, following the launch last week of a network of What Works centres to champion evidence-based social policy.

At the launch, Oliver Letwin, Minister of State at the Cabinet Office, said that the biggest question was why government hadn’t done something like this before. But if government hasn’t, others have, and the centres will be building on existing practices of evidence use in health and education.

In health, the Cochrane Collaboration this year celebrates two decades of producing systematic research reviews. Its approach has shaped advice offered by the National Institute for Health and Clinical Excellence on NHS treatments.

The cultural shifts that introduced evidence-based medicine to surgeries and hospitals 20 years ago are now playing out in classrooms and schools. The What Works education centre will use a toolkit we developed three years ago, summarising thousands of studies, to show the best and worst bets for improving pupils’ results. It is the model now being advocated by the Cabinet Office for other policy areas.

Since 2011, the Education Endowment Foundation, a charity that aims to improve the educational achievement of disadvantaged children, has developed this toolkit into a resource for disseminating evidence across the sector. The EEF has overseen a quiet revolution in England’s schools, commissioning some 40 randomised controlled trials so far.

The toolkit shows that working on teacher-learner interaction – improving feedback to pupils, for example –  gives the biggest bang for the education buck.  Yet our surveys reveal that most head teachers prioritise actions that evidence suggests, on average, have little impact: reducing class sizes or recruiting TAs.

In education, the route to evidence-based policy is particularly challenging, because the navigation instruments are less powerful and predictable than in medicine.  Imagine a world where the laws of nature vary through time and space. That is the reality across the thousands of different classrooms, teachers and children that make up our education system. Over the last 30 years curriculum and assessment have changed many times, and variation in schools and teachers has a profound impact on how an intervention works or doesn’t work.

 We have little idea how to help schools implement the best bets for improvement. Some may need a highly prescriptive programme; others general principles to shape and evaluate a tailored programme.

To return to TAs, for example, the evidence does not mean that they should be scrapped. There are many ways in which TAs might be better recruited, trained, deployed and evaluated. Some approaches will be more effective in lower-performing schools, schools serving a high proportion of children with special needs, or schools with particular teachers. Knowing what works where and for whom could improve a school’s choices about TAs and everything else.

A commitment to what works (strictly, what’s worked) in education must also consider the constantly changing pedagogical landscape. Take phonics teaching: if the current emphasis on phonics becomes routine, then remedial support based on phonics is likely to become less effective than research currently suggests. Children who have failed in a phonics-rich pedagogy may benefit more from a different remedial style.

These are important lessons for the other four planned What Works centres. Evidence can be boring or inconvenient for politicians more interested in an immediate and popular policy fix. But, as Letwin stressed, “this is only the start of a journey, not the destination”, and the outlook at this early stage is promising. This programme has the potential to revolutionise public policy in areas as diverse as ageing, education and policing, replacing dogma and tradition with research and randomised trials. Billions of pounds of public money could be saved.

This post first appeared on Research Fortnight.

What Works – A winning formula

Peter Lampl welcomes the designation of the Sutton Trust and EEF as the What Works Centre for Education

This week the Sutton Trust was, together with the Education Endowment Foundation, designated the What Works evidence centre for education by the Government. There will be six leading evidence centres and we and the National Institute for Health and Clinical Excellence (NICE) have been selected to lead on education and health respectively. The centres will be the first port of call for advice on the latest research on the impact of Government programmes.

This is recognition of the Sutton Trust’s focus on evaluation and research in all the work it does. We have always aspired to subject our programmes to robust review. And as an independent foundation we have used evidence to challenge or support the Government’s education policies.

The Trust has funded over 120 research studies in the areas of social mobility and education. But it is primarily a ‘do tank’. Our flagship summer school programme for example is now the largest national university access scheme – but it is also the most scrutinised programme in this field.

We know these summer schools have an impact: over three quarters (76%) of attendees go on to a leading university, compared with only 55% of students from similar backgrounds who are not on the programme. We also know they are highly cost-effective: when Boston Consulting Group did a cost-benefit analysis of the Trust’s programmes – comparing the lifetime earnings benefits for the individuals on the schemes with the money spent – summer schools were among the programmes delivering returns of over 20:1.
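To show what a ratio like 20:1 means in practice, the sketch below works through the arithmetic with assumed figures. The cohort size, programme cost and lifetime earnings gain per additional university entrant are all hypothetical, with only the 76% versus 55% progression gap taken from the evaluation above; it is a rough illustration of the method, not a reconstruction of the Boston Consulting Group analysis.

```python
# Illustrative benefit-cost arithmetic; every figure here is an assumption,
# not a number from the Boston Consulting Group analysis.
programme_cost = 1_500_000            # assumed total cost of one cohort (£)
participants = 1_000                  # assumed cohort size
progression_uplift = 0.76 - 0.55      # uplift in progression reported above
extra_entrants = progression_uplift * participants
earnings_gain_each = 150_000          # assumed lifetime earnings gain per extra entrant (£)

benefit = extra_entrants * earnings_gain_each
print(f"benefit-cost ratio ≈ {benefit / programme_cost:.0f}:1")  # ≈ 21:1 under these assumptions
```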

It was these disciplines – assessing the evidence on what works, assessing cost-benefit, but also ensuring that the research results are presented in a clear accessible way – that underpinned the Teaching and Learning Toolkit the Trust developed for schools on what works best at improving the results of children from poorer backgrounds. The Toolkit has now been used by thousands of schools across the country, and underpins the work of the Education Endowment Foundation.

When we established the EEF in 2011 as lead foundation with Impetus our vision was that it was going to embrace the Sutton Trust’s principles and become a gigantic do tank. The aim was to improve the results of the poorest children in our most challenging schools. But it would also have the freedom to experiment, innovate and rigorously evaluate projects and scale up those that were cost effective.

Two years on, I am pleased to say that this has become the reality. To date the EEF has awarded £24.4 million to 55 projects working with over 275,000 pupils in over 1,400 schools across England. It has commissioned over 40 randomised controlled trials in our schools – the gold standard for evaluating what works. Over the coming years these studies will add greatly to our knowledge of what interventions are successful in the classroom.

But with research, you have to take the rough with the smooth. Not all the Sutton Trust’s research findings have been welcome. In 2005 the Trust jointly funded a five-year study with the Department for Business, Innovation and Skills and the College Board into the US SAT aptitude test as a potential additional tool in the selection of candidates for universities.

In particular, the National Foundation for Educational Research study aimed to find out whether the SAT could identify highly able non-privileged students whose potential was not being reflected in their A-levels because of their circumstances. After five years tracking the results of thousands of sixth formers who went on to attend university, the study concluded that the SAT added little extra information to that provided by A-levels.

If the Government is true to its word on ‘evidence-based policy’ then it will have to face up to this reality. The research may not always confirm prior convictions or favoured policies, and almost always throws up some unexpected results. That’s why I think it is important the EEF and the Sutton Trust remain fiercely independent and make public all the evidence we produce. As the Government’s What Works evidence centre for education, these will be our guiding principles.