This blog was kindly contributed by Paul Hayes, Deputy Vice-Chancellor, Dean Machin, Strategic Policy Adviser, and Paul Spendlove, Graduate Outcomes Manager, at the University of Portsmouth.
Graduates’ destinations – the jobs graduates do – are clearly a central part of the value for money debate.
Until recently, data about graduates’ destinations came from the Destinations of Leavers from Higher Education (DLHE) survey. But, as yesterday’s HEPI report covered, DLHE had problems. It assessed graduate employment six months after graduation, which is far too soon. And universities conducted the survey themselves, which affected its credibility and introduced risks of bias and gaming.
This year DLHE is replaced by the Graduate Outcomes survey. Graduate Outcomes, conducted by the well-respected Higher Education Statistics Agency (HESA) fifteen months after graduation, could be a significant improvement.
One vital part of any graduate destination survey is the percentage of graduates in ‘professional-level’ roles or further study. This data goes into university league tables as well as the Teaching Excellence Framework (TEF). It is also the go-to data for anyone interested in whether there are ‘too many’ graduates or whether young people are studying the ‘wrong’ subjects.
In the last DLHE, the sector-wide percentage of graduates in professional-level occupations was 74 per cent. Adding in those in further study boosts this figure. Everyone is expecting the professional-level / further study figure to fall under the new survey (more on this below). Of course, statistically, it is incorrect to compare directly a survey conducted six months after graduation with one carried out fifteen months after graduation. But it is also naïve to expect politicians and journalists to care.
If any decline in the professional-level / further study figure is non-trivial, the headlines write themselves: ‘Revealed: massive decline in value of university’ or ‘Shock: independent review shows extent of university gaming’.
Such headlines might suit the new government. It has a big majority, with many of its MPs not well-disposed to universities. It also has a manifesto commitment to ‘tackle the problem of … low quality courses’ and, in Iain Mansfield, the Secretary of State for Education has an adviser who believes too many people go to university.
2020 is a bad year for a decline in the professional-level / further study figures. Of course, if the data is accurate, universities just have to lump it. But HESA has found the survey more difficult than originally expected, and the results might not be accurate. Let us explain. Forgive the technical detail.
A graduate survey conducted fifteen months after graduation takes place just as anyone who started a twelve-month postgraduate course the September after graduating is completing their studies.
The taught course will be finished and students may be thinking of little else but finding work. In short, while they will be registered as students, they may no longer identify as such and so give inaccurate survey responses. The new graduate survey risks under-reporting the number of graduates in further study.
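A quick sketch makes the timing clash concrete. The dates below are illustrative, not taken from HESA's fieldwork schedule: a graduate finishing in June who starts a twelve-month Masters that September is surveyed roughly fifteen months after graduation, within weeks of the taught course ending.

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date forward by a whole number of months (day-of-month kept)."""
    total = d.month - 1 + months
    return date(d.year + total // 12, total % 12 + 1, d.day)

graduation = date(2018, 6, 30)     # illustrative first-degree completion
masters_start = date(2018, 9, 1)   # twelve-month taught postgraduate course
masters_end = add_months(masters_start, 12)
survey_point = add_months(graduation, 15)

# The survey point lands within a month of the Masters finishing --
# exactly when a respondent may no longer see themselves as a student.
print(survey_point, masters_end)
```

On these assumed dates the survey arrives in the same month the course ends, which is the overlap the blog describes.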
HESA understands the problem. It planned to link datasets so that individuals registered as students would be recorded accurately regardless of what they said in the survey. For whatever reason, HESA can no longer do this. HESA will rely on survey responses.
But the problem the sector anticipated has occurred. Our data shows that the new survey will under-report the number of our graduates doing postgraduate study at the University by at least 30 per cent. The error will probably also apply to our graduates doing postgraduate study at other universities, and we are unlikely to be the only university affected.
A second problem belongs to the racy world of Standard Occupational Classification (SOC) coding. All occupations have a SOC code. Some are coded ‘professional’, others not. Accurate SOC coding is vital.
As most people who create job titles know nothing about SOC codes, there is no one-to-one correspondence between the two. Judgement must be exercised in deciding which SOC code fits which graduate role – and so which roles count as professional-level.
Where there is judgement, there is discretion and so the possibility of gaming and bias. It was expected that a centrally-conducted survey would lead to more accurate SOC coding.
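To illustrate the judgement involved, here is a toy sketch. The codes follow the real SOC 2010 numbering style (major groups 1–3 treated as professional-level), but the title-to-code mapping is a hypothetical example, not HESA's actual coding methodology.

```python
def is_professional(soc_code: int) -> bool:
    # Under SOC 2010, major groups 1-3 (managers, professionals,
    # associate professionals) are treated as 'professional-level'.
    return soc_code // 1000 in (1, 2, 3)

# One free-text job title, several plausible SOC codes depending on
# what the role actually involves (hypothetical mapping for illustration).
plausible_codes = {"analyst": [2425, 3539, 4131]}

verdicts = {code: is_professional(code) for code in plausible_codes["analyst"]}
# The same title can fall either side of the professional-level line,
# so the coder's judgement call moves the headline statistic.
print(verdicts)
```

The point is not the particular codes but that a single job title can resolve to codes on both sides of the professional-level boundary, which is where discretion – and the possibility of gaming – enters.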
Over time this should happen, but HESA has found accurate coding difficult in this, the first year of the survey. HESA is open that there will be revisions for the 2021 survey. In making them, it will draw on the knowledge and experience that university sector bodies built up over the DLHE years.
It is impressive that HESA is open about its need to learn from others, but by 2021 the government may have launched fundamental reforms to higher education based, in part, on potentially inaccurate 2020 survey data.
Many people will say universities would say this and will not share our concerns. But HESA should. It is a statistics agency, not a political body, nor a quasi-political body like the OfS. HESA needs everyone in the higher education sector to have confidence that its data is accurate. Given the political climate, it probably needs this now more than ever.
It has three options:
- Not publish data for 2020 until it has comparable (2021) data to assess the 2020 data’s accuracy;
- Delay the 2020 survey until later this year when there is greater confidence in the data; or
- Go ahead as planned with sufficient ‘health warnings’ around the data.
Given the importance of the survey, and the problems HESA has encountered, the first two are the better options. But we expect HESA to opt for health warnings. It might be right, but can HESA attach sufficient health warnings to the 2020 Graduate Outcomes survey? And if not, where might that leave its reputation – and the sector’s health – in the longer term?
HESA provided the following response to this blog:
We welcome the opportunity to comment on this blog and provide some additional detail.
Our approach to data quality is at the centre of what we do, and in particular the Graduate Outcomes survey. This includes how we process contact details, manage bias, record accurate and reliable responses, and deliver good quality SIC/SOC coding.
As HESA does across a range of its services, we are actively exploring areas where we can engage with the sector to understand if any further improvements can be made.
HESA is due to release details of our assessment of SIC and SOC coding during the week commencing 9 March, including provider feedback on SOC codes of concern. We hope this will provide some additional insight around the assurance processes we’ve followed for the 2017/18 collection as well as the action we have taken on the draft survey results, before their use in publication.
We realise that some providers have indicated concern about the reporting of further study. To address this, HESA will provide clear information on the treatment of interim study within our statistical outputs. HESA plans to utilise linked data from HESA’s student data collections in future to provide a richer set of information about further study, but this is not currently available in the right timeframe for HESA’s main Graduate Outcomes publications.
I see the problem of assuming that any job a graduate has is a graduate job. Yet otherwise we are left trying to guess not just what a person in a job is expected to do (and whether that is deemed graduate-worthy), but whether the graduate extends the role beyond its normal boundaries – or whether the graduate is just doing a job they could have done several years before, except that now they know a bit more about something.
Hence I think employment is the marker that matters – of the whole population of graduates.
Further study is problematic. It may mean building on the previous degree in a useful way, or it could mean: no job found, keep studying. In Australia at least, further study is heavily weighted towards the older universities.