The UK's only independent think tank devoted to higher education.

The impact of generative AI: exploring challenges and opportunities at Middlesex University

  • 17 October 2023
  • By Melissa Bowden
  • This blog was kindly written for HEPI by Melissa Bowden, Content Writer at Kortext, in conversation with staff at Middlesex University.

It’s less than a year since ChatGPT was launched, yet in that time it has reshaped the landscape of higher education. We spoke to a diverse group of staff from Middlesex University about the impact of generative AI on their institution and the sector as a whole.

The impact of generative AI

Since the launch of ChatGPT, universities have been grappling with a multitude of issues. There was an initial wave of anxiety, with concerns about its impact on assessment and academic integrity. This has developed into a more nuanced approach, centred around supporting students.

Yet is ChatGPT really a new technology? David Clover, Head of Library and Learning Enhancement, observes that ‘ChatGPT is just one tool, and it follows a lot of other AI work’. Nicholas Sharples, Senior Lecturer in Mathematics, agrees, commenting that AI (such as grammar checkers) and generative tools (such as essay mills) have both been part of the educational landscape for some time, but it’s the ubiquity of ChatGPT that has caused apprehension. For Nicholas, it has been the catalyst for sectoral change: ‘we’re reforming things that perhaps should have been reformed a long time ago’.

One of the main challenges has been sifting through vast amounts of information to determine an institutional approach, as well as understanding what’s happening across the sector. For Alex Chapman, Head of Technology Enhanced Learning, ‘it feels like universities are playing catch-up’. He explains the conundrum: ‘policy-wise, we need to move very quickly’, but there are many questions to address, as generative AI has implications ‘for assessment, for our own practice of how we develop resources, and our systems and processes’.

The solution requires ‘a whole university approach’, according to Matthew Lawson, Director of Library and Student Support, with academics, librarians and learning technologists working collaboratively. To this end, Middlesex University has created a working group to establish the implications of AI across the institution. For Matthew, it’s not only about ‘adding value to the student learning experience’, but also preparing students for a future ‘where AI is going to change everything’.

AI and assessment

Nicholas Sharples and his academic colleagues are taking a ‘dual-speed approach’ to assessment in the light of ChatGPT. The first task is to address potential vulnerabilities in assessment for the current academic year, while staying within the advertised programme descriptions. Nicholas says, ‘this involves thinking about how we can reinterpret what we mean by a piece of coursework’. For example, ‘could coursework be interpreted as a series of conversations between a lecturer and a student?’. The second task is the long-term development of sustainable post-AI assessment practices.

He stresses that there would need to be institutional guidance on appropriate assessment methods and student expectations would have to be managed carefully. However, such considerations are in line with QAA advice on reconsidering assessment in the ChatGPT era, which encourages institutions ‘to review and, where necessary, reimagine assessment strategies’.

If generative AI is permitted in assessments, how can universities ensure that access to these tools is equitable and inclusive to avoid disadvantaging students? For David Clover, this is a fundamental question, especially as the market becomes increasingly commercialised. He asks, ‘do [institutions] need to consider purchasing some of these tools … so we’re not creating inequities between students internally, or between our students and students elsewhere?’.

Supporting students

The Russell Group principles on the use of generative AI set out the importance of AI literacy, recommending that universities ‘equip students with the skills needed to use these tools appropriately throughout their studies and future careers’. These principles are closely aligned with skills librarians have been teaching for many years, according to Ella Mitchell, Head of Library Services, such as ‘critical appraisal, information literacy and digital literacy’. David Clover agrees, adding that this is not a conversation solely about AI, but about the process of academic research and writing. He says ‘it’s a conversation about what is your work and that’s part of academic writing and presentation more generally’. 

It means encouraging students to think about what is produced by a generative AI tool and how it’s produced. Is the information accurate? What are the biases? For Kate Vasili, Copyright Officer, it includes getting them to think about their use of third-party content and the data protection implications. Kate explains that it’s not about stopping students from using generative AI, ‘it’s more about giving them support and guidance on the best way to use it and [how] not to fall into a legal trap’.

It seems clear that AI literacy is going to be a vital life skill, as well as a necessary academic skill. Nicholas Sharples returns to the point made by Matthew Lawson, commenting that ‘these tools are going to be at students’ fingertips for their entire careers’ and that universities have a responsibility ‘to help, to equip students to address that’. This involves enabling students to develop critical thinking skills, allowing them to distinguish between AI-generated content and human-generated content, so they can successfully navigate an AI-driven world.

Open AI vs trusted AI

A Jisc report found that students ‘strongly advocate integrating AI into their education’. It has become an invaluable tool, assisting them with tasks such as planning, writing, revising, translating, and more. However, students articulated concerns about the accuracy of AI-generated content and the potential for false accusations of plagiarism, expressing ‘a desire for institution-recommended tools that they can trust’. Delivering on this need, Kortext has developed Kortext Premium – a suite of AI-powered study tools integrated into the Arcturus platform as an upgrade to Kortext Essential.

Unlike other generative AI tools, Kortext Premium’s GPT-based tools use only trusted content that has been prescribed to the student by their institution and made available through the Kortext platform. For Alex Chapman, this is a positive development: ‘it’s really nice that it’s contained within the student’s bookshelf. For first-year students and people who are developing their skills around the use of AI … it looks like a much safer space than the Wild West of open AI that’s out there’.

Kortext Premium’s trusted AI-powered tools can instantly summarise content, generate insightful study notes and create interactive Q&As to reinforce learning. In addition, users have access to a vast library of learning resources (including 1.9 million videos), organisation and collaboration features, 5GB of cloud storage space, plus time-saving citation and translation tools. David Clover commented: ‘I can see how they [Summarise and AI study notes tools] would benefit students who find the amount of reading difficult … particularly students who might have English as a second language or neurodiverse students who might otherwise be put off by some of the textbook language which can be challenging.’

Kortext Premium has been adopted by Middlesex University. If you would like to harness the benefits of trusted generative AI-powered study tools at your institution, visit the Kortext website to find out more.
