
New Insights on WP: Evaluation

  • 25 August 2017
  • By David Woolley

On 14 August 2017, the Higher Education Policy Institute (HEPI) and the social mobility charity Brightside jointly published a collection of essays by senior higher education figures entitled ‘Where next for widening participation and fair access? New insights from leading thinkers’.

Since last week, we have been showcasing the contents of this collection of essays in a dedicated blog series entitled ‘New Insights on Widening Participation’.

This blog, the ninth in the series, features the chapter written by David Woolley, Head of Schools, Colleges and Community Outreach at Nottingham Trent University, evaluating widening participation efforts to date.


Evaluation

David Woolley

The widening participation agenda came to the fore under New Labour and, 20 years on, it appears to be gaining prominence. We have a Prime Minister who is committed to social mobility and believes that higher education is a mechanism through which to achieve it. Funding for widening participation is remarkably robust: the higher education sector plans to spend £833 million in steady state under its 2017/18 Access Agreements. The Office for Fair Access (OFFA) is encouraging universities to transfer this spend from student bursaries to activities. OFFA is also celebrating the sector’s increasing professionalism, its embrace of targeted interventions and its growing research and evaluation activity.

But is this a full and accurate picture? The recent call by OFFA for more attainment-raising activity highlights the continuing prevalence of ‘hopeful interventions with unknown effectiveness’. A recent report concluded that most English universities are using widening participation research and evaluation ‘to defend their spending, not to improve their outreach activities’. Furthermore, the sector’s overwhelming desire for institutional autonomy is resulting in a weak and fragmented evidence base. A quick look at history shows this is a dangerous position to be in.

The Gorard review of widening participation research in 2006 noted that there was a lack of robust research about what works to widen participation. This publication came in the middle of the Aimhigher programme, funded by the Higher Education Funding Council for England (HEFCE). In 2008, almost certainly as a result, HEFCE requested that the 45 Aimhigher area partnerships collect and record data in a standardised format. Further research was then commissioned into the impact of Aimhigher-funded outreach programmes. The subsequent report, published in 2010, found that:

due to the relatively small scale of local Aimhigher evaluations and the difficulty of establishing causal links between activities and learner outcomes, quantitative reports provided by partnerships showed an association between learner participation and improved outcomes rather than conclusive evidence of impact.

But before HEFCE could take further action, Aimhigher was scrapped. The report’s conclusion did not help the programme’s cause – and possibly even sealed its fate.

Some evidence did exist. The Higher Education Academy still hosts a repository of research, admittedly of varying quality and robustness, produced by Aimhigher area partnerships. So the problem was not solely a lack of evidence; it was that the lack of standardisation and the devolution of responsibility had rendered the evidence almost useless, at least in terms of driving the national policy agenda. It was perhaps inevitable that a structure involving 45 Aimhigher area partnerships, producing 45 different evaluation plans and 45 different annual evaluation reports, each with their own methodologies and levels of knowledge and expertise, was never going to provide that magic ‘what works’ bullet.

So how does this lesson compare with today? The 45 area partnerships have become 198 individual higher education providers, each now responsible for evaluating the impact of its own widening participation interventions. Is this not merely an extension of the failed Aimhigher evaluation model, and therefore likely to end in the same result? Almost certainly so. Yet such a failure is unlikely to make the Government give up its quest for evidence and for social mobility. Recent Ministers have not been afraid to tell the sector how to achieve the Government’s policy aims, and our failure to take this particular aim seriously may result in further directive and burdensome instruction.

However, there is cause for optimism. Collaborative partnerships, including the Higher Education Access Tracker (HEAT) and the East Midlands Widening Participation Research and Evaluation Partnership (EMWPREP), have been collecting and recording standardised data on outreach activity and participants dating back to 2005. Some universities are now reaping the benefits and are able to track participants’ Key Stage 2 and Key Stage 4 attainment (in terms of both absolute performance and value-added progress), as well as their progression to higher education.

At Nottingham Trent University (NTU), we have been tracking the participants of our outreach programmes since 2008 and hold records for over 18,500 unique participants. We can demonstrate not only that participants are more likely to achieve good GCSE grades than non-participants, but also that there is an association between participation in outreach programmes and better value-added scores. In effect, NTU’s participants achieve, on average, four grades higher across their best eight GCSEs than would be expected from their schools’ value-added scores.

It is difficult to isolate the effect of specific outreach programmes against the counterfactual (i.e. establishing what would have happened to participants had they not taken part in the outreach programme), and some commentators argue that randomised controlled trials are required to establish causation. This may be true, but it would prove controversial: many would balk at denying some young people from disadvantaged backgrounds potentially life-changing interventions just to measure their efficacy.

Therein lies a problem. We are attempting to build a national evidence base by implementing local strategies. Expecting to draw national conclusions from 198 evaluation plans and reports, each with its own methodologies and levels of expertise, is futile. Tentative steps have been taken to improve the evaluation of outreach: OFFA has recently published guidance and proposed standards for the evaluation of outreach by universities and colleges.

Furthermore, the recently launched National Collaborative Outreach Programme (NCOP) is being evaluated at a national level. Evaluating the impact of 29 consortia in a uniform manner is obviously better than evaluating 198 institutions using separate methodologies, but NCOP also requires consortia to undertake their own significant evaluation with no requirement for commonality. This seems duplicative and perhaps a concession to the sector’s desire for autonomy. These are steps in the right direction, but only small steps – we need more drastic measures.

Rather than monitoring the evaluations of every university, partnership or NCOP consortium, even using a common evaluation methodology, a national research and evaluation unit should be established. This unit could take direct responsibility for evaluating each individual provider’s outreach work using a common methodology. It would be the body responsible for proving the impact so craved by policymakers. It could also inform the local evaluations that universities should be obliged to complete. Individual universities would then spend less of their resource on merely ticking the evaluation box, and more on actually improving the quality of their outreach.

So, although widening participation appears to be in rude health, is it actually like Aimhigher before the fall? To help ensure it is not, the Office for Students should make the establishment of a national evidence unit one of its first priorities. It need not take much of the £833 million to do so.


Interested in reading more new insights on WP? Sign up to the HEPI mailing list for the next chapter delivered straight to your inbox tomorrow! Or access the full publication ‘Where next for widening participation and fair access? New insights from leading thinkers’ here.
