
From knee-jerk reactions to saving the sector to opening the knowledge estate – discussions on generative AI

  • 25 June 2024
  • By Rose Stephenson
  • This HEPI blog was kindly authored by Rose Stephenson, Director of Policy and Advocacy at the Higher Education Policy Institute.
  • In May, HEPI and Kortext hosted a roundtable dinner on student perceptions of artificial intelligence. Attendees included vice-chancellors, senior leaders from across the sector, colleagues from mission groups and technology specialists. The dinner was held under the Chatham House Rule – so the following remarks are not attributed to the individuals present. This blog considers some of the themes that emerged from the discussion.
  • Kortext is a partner of HEPI. You can learn more about HEPI’s partnership programme here.

What are students’ perceptions of artificial intelligence? The HEPI report Provide or punish? Students’ views on generative AI in higher education surveyed more than 1,000 UK undergraduate students. The survey was completed in November 2023, and the paper was published in February 2024. The report found that:

  • More than half of students (53%) have used generative AI to help them with assessments, most commonly as a tutor.
  • More than one in eight students (13%) have used generative AI to generate text for assessment.
  • Only 5% of students put AI-generated text into assessments without editing it.
  • Over a third of students who have used AI do not know how frequently it produces ‘hallucinations’ (made-up facts or statistics).
  • A digital divide may be emerging, with more students from the most privileged backgrounds using AI than those from the least privileged backgrounds.

Knee-jerk reactions

The discussion began with a look back at initial reactions to generative AI, and the furore this caused in early 2023. (It was surprising to reflect that ChatGPT was launched in November 2022, meaning that generative AI tools have only seen such mainstream use for 18 months!)

One colleague stated that the ‘tidal wave had arrived’ and that there would be aftershocks and other waves. However, another guest argued that a similar response was triggered by the launch of the Open University and by the arrival of MOOCs (Massive Open Online Courses). There was a belief that both events would cause mass redundancies in the sector, yet neither prediction came to pass. It was suggested that this was because we rely on and value human interaction in the learning process.

One guest noted that AI had ‘caught them out a bit’ and that their first instinct was to ban it from use at their institution. However, it quickly became obvious that institutions are not banning GenAI. They are not even just tolerating it; they are taking steps to train their staff and students to use it.

Colleagues cautioned that students needed to be taught how to use GenAI sensibly and ethically and to understand the biases built into GenAI tools. This included development work with staff so that they have the skills to teach students how and how not to use AI tools.

It was reflected that many students are ‘digital natives’ who are taking to GenAI like ducks to water, while staff may not use GenAI to the same extent. Guests noted that the survey underpinning the report under discussion was completed in November 2023 and already felt ‘out of date’ less than six months later – such is the fast pace of change in this area.

A whole new world?

One colleague noted that their 14-year-old son doesn’t start his homework until he has ChatGPT open. Students will arrive at university having used GenAI in their studies at school and college. Higher education providers need to get into the mindset that GenAI is an opportunity. Students, employers and wider society expect universities to lead the way in teaching the workers of the future to be proficient in GenAI use and, on some courses, to develop it.

A colleague asked how we will engage regulators, professional bodies, ministries, and wider society on this fast-paced journey when using GenAI is viewed as ‘cheating’.

AI and assessment

One colleague noted that students want clarity on how to use GenAI in their work. Their institution has implemented a RAG (red–amber–green) rating system for assessments: Red – do not use AI; Amber – use it if you wish; Green – you are encouraged to use AI.
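As a purely illustrative aside, here is one way such a red–amber–green policy could be encoded so that every assessment brief surfaces the same wording to students. This is a minimal sketch: the category descriptions, module codes and register below are hypothetical assumptions, not the institution’s actual system.

```python
# Hypothetical encoding of a red/amber/green AI-use policy for assessments.
from enum import Enum

class AIUse(Enum):
    RED = "AI must not be used in this assessment."
    AMBER = "AI may be used if you wish; declare any use."
    GREEN = "You are encouraged to use AI; declare and reference it."

# Illustrative assessment register (module codes are invented).
ASSESSMENTS = {
    "LAW101 closed-book exam": AIUse.RED,
    "BUS202 reflective essay": AIUse.AMBER,
    "CSC303 AI-tools project": AIUse.GREEN,
}

# Print the policy line that would appear on each assessment brief.
for title, policy in ASSESSMENTS.items():
    print(f"{title}: {policy.name} – {policy.value}")
```

The value of something this simple is consistency: the wording students see is defined once, rather than improvised per module.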

Another guest explained that a culturally, neuro- and linguistically diverse student body lends itself to a diverse range of assessment options. The idea of a ‘menu’ of assessment options was mooted. While the resource implications of this were raised, it was felt that GenAI might provide some opportunities for innovation in this area.

A question was raised on whether we want to assess students’ ability to use AI (given that employers will expect graduates to have these skills) or to test knowledge recall in a time-limited exam. It was reflected that AI again pushes the sector to develop authentic assessments. This isn’t new, but it requires bandwidth for staff to do it. This colleague suggested identifying keen champions in faculties to lead the institution and its staff forward on AI and assessment.

Investment for the future

Despite some initial knee-jerk reactions, followed by a (short-lived) period of trepidation, GenAI is not just here to stay; it will be an integral part of developing proficient graduates ready for the world of work. Leaders will need to consider carefully how to train both their staff and students, and how to convince regulators and policymakers to embrace the future of GenAI.

There will be a requirement to invest in AI and broader digital frameworks. Using AI could bring efficiencies that are desperately needed at institutions facing current financial challenges. School teachers are already using GenAI tools to develop schemes of work and PowerPoint slides, with some stating that this saves them up to four hours a week. Schools are also trialling marking through GenAI. It was suggested that leaders identify and set out the revenue impact of digital investment, as they have historically done with investments in physical campus infrastructure. This will be a challenge when universities are already experiencing financial difficulties. Can AI’s efficiencies and opportunities outweigh the investment needed to get there?

Opening the knowledge estate

One concern with the use of AI is that students are using commercial platforms with all of the downsides of hallucinations, bias, inequitable access and data security. With the genie out of the bottle, the question to be answered is how universities can deploy this capability at scale across the whole institution.

Solutions are being developed to provide a ‘ring-fenced’ bank of data from which a Large Language Model (LLM) can pull. For example, UniversityGPT is a version of GPT-4o (the latest GPT version at the time of writing) that will only draw on data and knowledge from published books or knowledge ‘owned’ by the university. Think of it as ChatGPT mining the university library for content instead of the open web. This doesn’t overcome the issue of verifying that assessed work is a student’s own. However, it provides students and staff with a powerful search function that only returns information from trusted sources, with fewer ‘hallucinations’ (made-up facts or quotes). Prompt requests made from within the institution will only pull data from the university’s knowledge estate, and responses will be referenced.
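To make the pattern concrete, here is a minimal sketch of the ‘ring-fenced’ retrieval idea: the model is only shown passages drawn from a trusted corpus, and the prompt requires citations. Everything below is a hypothetical stand-in – the toy corpus, the naive keyword scoring and the prompt wording are assumptions for illustration, not the actual UniversityGPT implementation (which would use a real search index and the institution’s licensed content).

```python
# Illustrative sketch: retrieval restricted to an institution-owned corpus,
# with answers required to cite their sources.
from dataclasses import dataclass

@dataclass
class Passage:
    source: str  # e.g. a library eBook the institution licenses (invented here)
    text: str

# Hypothetical ring-fenced 'knowledge estate': only trusted, owned content.
CORPUS = [
    Passage("Smith (2021), 'Assessment in HE', ch. 3",
            "Authentic assessment asks students to apply knowledge to realistic tasks."),
    Passage("Jones (2019), 'Digital Pedagogy', ch. 1",
            "Large language models can hallucinate facts when asked beyond their data."),
]

def retrieve(query: str, k: int = 2) -> list[Passage]:
    """Naive keyword-overlap retrieval; real systems use embeddings or search indexes."""
    words = set(query.lower().split())
    scored = [(len(words & set(p.text.lower().split())), p) for p in CORPUS]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [p for score, p in scored[:k] if score > 0]

def build_grounded_prompt(query: str) -> str:
    """Constrain the LLM to the retrieved passages and require [n]-style citations."""
    passages = retrieve(query)
    context = "\n".join(f"[{i + 1}] ({p.source}) {p.text}"
                        for i, p in enumerate(passages))
    return ("Answer using ONLY the passages below, citing them as [n]. "
            "If the passages do not contain the answer, say so.\n\n"
            f"{context}\n\nQuestion: {query}")

print(build_grounded_prompt("What is authentic assessment?"))
```

Because the model never sees anything outside the retrieved passages, answers stay traceable to library sources – which is what keeps hallucinations down and makes referencing possible.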

To find out more about UniversityGPT, register your interest in attending a regional Kortext / Microsoft University GPT seminar this autumn: https://go.kortext.com/UniversityGPT-Register-Interest

Founded in 2013, Kortext is a global leader in digital learning and student engagement solutions within higher education. Kortext offers:

  • A smart study platform providing a personalised, VLE-embedded, digital learning space for students to access key course content while offering institutions a gateway to five million eBooks, open educational resources and digital materials from over 4,700 publishers.
  • A suite of innovative trusted AI-powered study tools that leverage the capabilities of generative AI on institution-approved content only.
  • The StREAM solution, which aggregates data from multiple purposeful educational activities from across the university to provide a definitive picture of student engagement.


1 comment

  1. ‘Tidal wave’ and ‘like ducks to water’ – the water metaphor is telling. For the water is (1) muddy, (2) polluted and (3) a very contained placid pool, with no challenge in it.

    Any use of LLMs is to outsource thinking. Grossed up across the world and in short order, the cognitive resources available to humanity to understand its world and to identify the (hidden) problems of the world will shrink, to those resources that only the rich and powerful wish to be available.

    The academic world is complicit in all of this. I receive a stream of requests from journals, publishers and universities to assess theses, book proposals and academic papers. I NEVER know what I am expected to read – i.e., the extent to which what is in front of me is the work of the author.

    The raft of problems here – secondedness, inauthenticity, erosion of trust, loss of criticality, suppression of thought – is simply not being addressed by the academic community. (No mention of any of these and other contiguous issues in this latest HEPI blog on the topic.)

    If the academic world does not wake up and smell the coffee immediately, it will find the coffee undrinkable – yet it will be obliged to drink that very coffee.
