
Generative AI in universities: what are educators thinking?

  • 18 March 2024
  • By Lucy Haire, Director of Partnerships at HEPI

As the panic about artificial intelligence’s (AI) capacity to bring down democratic governments, usurp all information-related jobs and ultimately cause human extinction calms down, most large organisations have devised at least an initial response to the deployment of generative AI. Some, including major law firms, have banned its use as a research tool; others are building walled-garden versions which draw on ‘approved’ data and do not feed into publicly available AI platforms, protecting their commercial interests. There are also ongoing high-profile court cases linked to AI platforms, such as the one launched by the New York Times against OpenAI and Microsoft for allegedly breaching copyright by using New York Times content to train AI systems. Yet many more organisations are happily embracing the tools to see how they can improve productivity and creativity and help achieve business goals.

In higher education, the birthplace of artificial intelligence in all its forms stretching back many decades, universities and colleges have quickly devised policies and approaches for students’ use of generative AI tools in their studies, and in particular for producing assessed work. In a recent Higher Education Policy Institute (HEPI) report based on a survey of more than 1,200 undergraduates, 63% felt that their university had a clear policy on student use of AI.

This was the topic of a recent HEPI roundtable gathering of UK pro vice-chancellors and learning technology experts, co-hosted with learning technology company Chegg. The first speaker quipped that when a personal computer arrived in the school staffroom where he taught in the early 1980s, he and his colleagues thought it would be no use at all. Over the years, we have all become used to new hardware and software arriving and quickly becoming an integral part of our personal and professional lives. However, there was a consensus at the discussion that new tools like generative AI should not simply be ‘plonked’ in the proverbial staffroom; instead, the companies that produce them should engage with institutions and students for the best effect.

Chegg representatives, who also joined the meeting, shared their research and explained how closely they analyse the ways students study. One purpose of the roundtable was to hear feedback on a set of expert-authored draft principles designed to ensure academic integrity as institutions further develop their AI-enabled learning support services.

A recurring theme of the roundtable was the need for new models of assessment and learning in higher education. Is it time, for example, to move away from older traditions of education predicated on what students can remember while sitting in an exam hall? Students need to be prepared for the world of work, and assessments need to be challenge-driven. Some universities are, for example, completely redesigning their curricula in light of the new possibilities. One participant at the roundtable said that the most valuable assessment of the year, according to her students, was feedback on their podcast productions from an industry panel.

There was general consensus that we should be ‘techno-optimists’ in higher education, but concern in equal measure about the perceived high usage of generative AI by students. Many felt it was important that students understand AI platforms so they can contextualise and interpret the kind of content that emerges from them. The HEPI report found that more than a third of students who have used generative AI (35%) do not know how often it produces made-up facts, statistics or citations (‘hallucinations’). Boosting AI literacy is essential and, with some students having far more access and competency than others, attention should also be paid to the ever-present digital divide.

The potential of AI in university research was also touched on, such as supporting literature reviews and analysing large datasets, for example from clinical trials. The debate also veered into a discussion about universities being financially squeezed. AI tools have the potential to offer an elite education at massive scale, but many institutions have not invested sufficiently in their back-office functions and IT infrastructure, especially in connection with data, to be able to take full and easy advantage of the new AI tools.

Chegg rounded off the discussion with a run-down of the major communication tools that previous generations encountered for the first time: books (Chegg started life as a textbook-rental service), radio, film, TV and then mass-produced video enabled by digitisation and smartphones. Students will inevitably want to speed things up if the tools allow, but there was a recognition that learning must take place at the right pace and necessarily involves friction. Our opening speaker’s teenage daughter had the last word: “AI – oh yes, get used to it, Dad.”
