
Graduate employability takes top spot in the battle of the metrics – so it’s time we understood it better

  • 13 March 2023
  • By Dr Harriet Dunbar-Morris and Tom Lowe
  • This guest blog has been kindly written for HEPI by Dr Harriet Dunbar-Morris, Dean of Learning and Teaching at the University of Portsmouth, and Tom Lowe, Senior Lecturer in Higher Education at the University of Portsmouth.
  • In April, HEPI and Advance HE will be hosting a webinar on ‘Shifting priorities: has the teaching and learning agenda slipped off the sector’s radar?’ Further details, including how to book a free place, are here.

When research and satisfaction ruled reputation

Summer 2022, like each summer for the two decades before it, saw countless university staff across the UK on the edge of their seats for the annual release of metrics that would shape their universities’ reputations for the year ahead. Poised with press releases, images at the ready and clearing campaigns set to launch, the annual National Student Survey (NSS) announcements were at the starting line, as they had been for the previous 15 years of UK higher education.

But now in 2023, the landscape has changed and the National Student Survey is no longer alone as the headline university performance indicator. ‘NSS day’ last year was dominated by other news items relating to a certain Prime Minister, and despite several Google searches, we find that the National Student Survey of 2022 received almost no coverage in the press. Compare this to the mid-2010s, when the NSS was treated as make-or-break publicity for institutions, with major news outlets speaking of the big winners and losers, the movers and shakers in student satisfaction. Satisfaction, combined with historic reputation and Research Excellence Framework (REF) (previously the Research Assessment Exercise) measures, once ruled supreme in university reputation.

What has changed? A saturation of measures muddying ‘value’

With the impact of the Higher Education and Research Act 2017, the National Student Survey is being pushed from centre stage by continued government discussion of value for money and of perceived ‘low’ versus ‘high’ value indicators. Metrics already collected by the Higher Education Statistics Agency (HESA) – on retention, completion and student-staff ratios – were combined with the Destinations of Leavers from Higher Education (DLHE) survey and the National Student Survey to create the Teaching Excellence and Student Outcomes Framework (TEF) in 2017.

This complicated mixture of benchmarked data was difficult for the majority of university staff to understand fully and was largely ignored by potential students. Since the advent of the TEF, graduate occupation measures have become more complex, with the shift to the Graduate Outcomes Survey (GOS), which has run since 2019. The GOS’s first data release did receive some national media coverage in 2020, but in 2021 and 2022 this major and costly exercise gained almost no attention from the mass media.

It’s not just measures that inform choice

Something else has also continued to change in the last decade, since the dominance of the NSS began to wane – an increase in resource for, and innovation in, recruitment methods. The digital consumer market has only continued to grow, and universities are investing more than ever in digital footprints, advertising and marketing staff to boost their online profiles. Social media takeovers and perfectly shot images of campus have perhaps become as important to get right as the metrics.

The combination of this content creation with continued investment in campuses means that reputation built on the National Student Survey and the Research Excellence Framework may not even form part of the picture for student choice. Finally, in recent times, the finger-pointing ‘low value’ measures of the B3 dataset have begun to make their entrance and, although no university has been publicly named, perceptions are that certain degree areas are being unofficially labelled – often with little or no evidential base – as poor-value choices for prospective students. In recent studies, graduate careers have come out top among the reasons students apply to study in higher education (UUK, 2017; Unite, 2017), and in 2021 employability increased as a factor in course choice at application (UCAS, 2021).

Graduate employability on the up, although it is not alone

So where should a staff member on the ground focus their attention to win in this metric-packed reputation and recruitment agenda? It is clear that Graduate Outcomes Survey measures are the new heavyweight in English higher education, because of the contribution Graduate Outcomes makes to the TEF and B3 measures (at least 25% of the contributing metrics). The weighting given to Graduate Outcomes in The Times league table has risen to parity with the REF 2021 metric, showcasing the importance the compilers place on it. Similarly, the importance of a positive student experience in all aspects of time at university is clearly reflected in the fact that both National Student Survey and Graduate Outcomes Survey results are included in all three UK league table publications and are often the measures institutions cite most in attracting new students.

The fact that these two metrics are ones universities can and should directly influence is reflected across all major publications as well as in Discover Uni data. Pair this with efforts to label programmes as having ‘poor’ outcomes for not leading to graduate jobs, and it is in everyone’s interest to get familiar with the Graduate Outcomes Survey measures. Familiarity with the evolving National Student Survey is easy to maintain: its 27-plus Likert-scale questions cover topics one could almost memorise, quickly leading to sector-wide NSS literacy. Yet the Graduate Outcomes Survey is far more complicated, with different types of survey question and indicators recorded in coded form rather than simple 1-5 Likert scales.

To thrive in the current sector, we must all become as literate in the Graduate Outcomes Survey as we are in the National Student Survey. A basic knowledge of when the survey takes place for your graduates (there are four survey rounds a year, by the way!); what counts as a graduate occupation or further study and what does not (this is open to annual debate); which professions your graduates have gone on to (with new roles emerging every month); and how the data feed into wider measures – these are just some of the areas we are prioritising at the University of Portsmouth through a staff development series for academics.

The work for students is also ongoing: the University of Portsmouth Graduate Hallmarks are mapped to our entire provision, giving students the ability to speak with employers about the skills and competencies gained on their degrees. We also engage continuously with employers; for example, through our Office for Students Challenge-funded Student Engagement in Knowledge Exchange project, the University has shared practice on live briefs, placements and student-employer consultancy projects to enhance knowledge exchange and students’ employability.

Although the agenda is disputed by many, students are arriving at institutions with the future on their minds. So it is important to be prepared for a more regulated landscape where – despite a relative lack of familiarity compared to satisfaction and retention – graduate measures stand at the top of the metrics battle.

  • HEPI’s paper ‘Getting on: graduate employment and its influence on UK higher education’ by Rachel Hewitt is available here.

4 comments

  1. Doug Cole says:

    Nice article, but it doesn’t match the headline. The article is entirely about the Graduate Outcomes Survey – that is, graduate employment, not employability. The two are not the same, and in order to better address graduate employment we do need to increase understanding of employability and the factors involved, both from an individual perspective and from an external environment perspective.

  2. Johnny Rich says:

    @Doug, I was about to post the same comment. The distinction between employment and employability is an important one (and one I wrote about in a HEPI paper in 2015, Degrees of Value).
    If we conflate the two, we end up using poor proxies as signifiers of supposed quality. Employment metrics may have more to do with the geographical location of an institution, the subjects studied and – in particular – the socioeconomic background of the students, than they do with the employability attributes conferred on them by their student experience.
    This is one of the many reasons league tables are so misleading. These kinds of heuristics encourage student choices that result in bad matches for the student, their university and the future role they will be able to play in meeting skills needs in the labour market.
    That said, Harriet and Tom’s article does reflect usefully on both employment and employability, but I hope it is helpful to point out the need to unpick the difference.

  3. Stuart Marriott says:

    It would be great to see higher response rates in certain subject disciplines within the Graduate Outcomes survey, and I’d encourage HESA to consider what could be done through the use of linked data sources.

    Subject areas such as Medicine & Dentistry have struggled to achieve the target response rate of 60% for UK full-time undergraduates, and whilst institutions do as much as they can to retain and supply high-quality contact details for the purposes of the Graduate Outcomes survey, is there an opportunity for HESA to consider how to share data with professional bodies to ascertain where graduates have gone?

    I posed a question to HESA recently about using data from the regular student returns to fill in gaps in the Graduate Outcomes survey – for example, where a graduate has gone on to further study (particularly at another institution), HESA will know this from the timings of the student return and could merge it into the Graduate Outcomes data. Sadly, it’s not a process they are keen to follow.

  4. Arti Kumar says:

    As Doug Cole and Johnny Rich have rightly pointed out, developing sustainable employability attributes involves the ability to get a job but is about so much more. Enabling holistic, personalised student development – motivating and engaging students ‘from the inside out’ – is a powerful enabling pedagogy and process that has been evaluated extensively and written up. The main publication (2022) is accessible at the link given in the website box below.
