Slowing Down AI in Higher Education 

This blog was kindly authored by Sam Illingworth, Professor of Creative Pedagogies at Edinburgh Napier University. 

Debates about Artificial Intelligence (AI) in higher education tend to fall into two extremes. On one side, the snake-oil salespeople promise it will save us: automated tutors, frictionless research, instant grading. On the other, the doomers say it will end us: academic dishonesty, intellectual collapse, the erosion of learning itself. 

Neither view is adequate. AI use is not black and white. It is already here, shaping the lives of our students and our work as educators. The challenge now is to live with it well. 

Beyond speed and efficiency 

Most guidance to universities stresses speed. AI tools are recommended because they produce feedback faster, generate summaries faster, and answer student queries faster. Yet universities are not factories, and education is not a race. 

Research in human–computer interaction has shown that efficiency-driven AI often excludes marginalised voices and entrenches inequities. A different approach is needed. Slow AI, a concept inspired by movements such as Slow Food and Slow Fashion, offers one: universities should adopt AI only where it supports reflection, equity, and care. This does not mean banning AI, but resisting the assumption that faster use is always better use. 

How Slow AI can reshape practice 

Slow AI is not a slogan. It can be operationalised in ways that strengthen teaching and learning: 

  • Protecting academic integrity. Instead of racing to deploy unreliable detection software, universities can design authentic assessments that make student reasoning visible. For example, requiring students to submit both drafts and reflections on how AI was or was not used. 
  • Supporting student agency. AI should not replace student judgement but prompt it. Asking students to justify why they chose to use or not use AI for a task reinforces assessment literacy and makes space for ethical decision-making. 
  • Fostering meaningful reflection. Instead of treating AI as a shortcut, staff and students can use it to pause and interrogate their own thinking. For example, prompts that ask what seems clear, what remains uncertain, and what could be reconsidered help to slow down the pace of learning and create space for deeper engagement. 

AI hides its gaps in fluency 

One of the risks is that large language models rarely admit uncertainty. Left to their own devices, they will not say “I do not know”. Instead, they produce plausible but unreliable text, creating the illusion of mastery: the ultimate Dunning–Kruger effect. 

Both students and educators can counter this by using simple strategies: 

  1. Ask for sources and verify them. Many citations generated by AI are fabricated. 
  2. Ask for three alternative answers. Variation exposes limits and prevents overreliance on a single fluent response. 
  3. Ask where the model is uncertain. Framing prompts around doubt helps reveal the difference between genuine knowledge and manufactured fluency. 
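
For readers who prefer to script these checks rather than paste prompts into a chat window, the three strategies can be automated. The sketch below is a minimal illustration only, assuming the OpenAI Python client and a placeholder model name; any chat-style API could stand in, and nothing about the approach is provider-specific.

    # A minimal sketch of the three probing strategies above.
    # Assumes the OpenAI Python client (pip install openai) and an API key
    # in the OPENAI_API_KEY environment variable; the model name is a placeholder.
    from openai import OpenAI

    client = OpenAI()
    MODEL = "gpt-4o-mini"  # illustrative only: substitute whichever model you use

    def ask(prompt: str) -> str:
        """Send a single prompt and return the model's text reply."""
        response = client.chat.completions.create(
            model=MODEL,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    question = "What does research say about the impact of feedback on student learning?"

    # 1. Ask for sources, then verify each one by hand: many will be fabricated.
    print(ask(question + " Cite your sources with full references."))

    # 2. Ask for three alternative answers: variation exposes the model's limits.
    print(ask(question + " Give three substantively different answers."))

    # 3. Ask where the model is uncertain: frame the prompt around doubt.
    print(ask(question + " State which parts of your answer you are least certain about, and why."))

The point is not the code but the habit: every answer is treated as a draft to be checked, compared, and doubted rather than as a finished product.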

Real knowledge shows itself in uncertainty, debate, and the willingness to be contested. 

Towards a more reflective AI culture 

A recent case study in Campana-Altamira, a marginalised community in Monterrey, Mexico, explored how Slow AI could support local engagement. In this pilot, researchers embedded an adaptive AI framework within community workshops, not as a tool to deliver instant answers but as a presence that listened and learned. Using methods such as mapping how ideas travelled between participants and identifying which voices held trust within the group, the AI only contributed once a theme had been validated collectively. Its inputs were drawn from relevant examples and past workshop materials rather than generated wholesale. 

Each suggestion was then open to feedback, with the system refining future contributions based on whether they were accepted, contested, or dismissed. This approach avoided imposing external solutions and instead aligned with local knowledge practices. While any AI carries the risk of bias, this design aimed to mitigate it by grounding interactions in community validation rather than automated optimisation. The result was not efficiency of speed but trust in process, showing how AI can act as a deliberative partner that strengthens rather than overrides existing forms of knowledge sharing. 

Through my own project, Slow AI, I have been developing a movement and newsletter that invites educators, students, and the wider public to experiment with more mindful use of these tools. Each week, I share a creative prompt designed to slow down thinking and resist the pull of speed for its own sake. 

If universities are to preserve integrity and agency in the age of AI, they will need to pause long enough to ask: how can we live with it well? 

Three recommendations for practising Slow AI in higher education 

To practise Slow AI, think of it like following a recipe. Take your AI tool of choice, add one carefully chosen prompt, and pay close attention to what comes back. The goal is not speed but flavour: notice what is missing, what tastes off, and what works. Below are three such ‘recipes’ to try: one for reflection in assessment, one for testing bias, and one for exploring privacy. 

  • For reflection in assessment 
    Prompt: “Here is my draft essay on X. Tell me three things it suggests about how I think and learn: what seems clear, what seems uncertain, and what I might want to reflect on further.” 
  • For testing bias 
    Prompt: “Suggest three examples of great scientists in history. Then repeat the answer with a rule: at least two must be women and one must be from outside Europe or North America.” 
  • For exploring privacy 
    Prompt: “Answer this question [insert subject topic], but do not store or use my data for future training. Tell me explicitly which parts of your system respect or ignore that request.” 
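
The bias recipe in particular lends itself to a paired comparison that can be rerun and discussed with students. The sketch below is again illustrative only, assuming the OpenAI Python client and a placeholder model name.

    # Illustrative sketch: run the bias recipe as a paired comparison.
    # Assumes the OpenAI Python client and an OPENAI_API_KEY environment
    # variable; the model name is a placeholder, not a recommendation.
    from openai import OpenAI

    client = OpenAI()
    MODEL = "gpt-4o-mini"  # placeholder

    def ask(prompt: str) -> str:
        response = client.chat.completions.create(
            model=MODEL,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    baseline = ask("Suggest three examples of great scientists in history.")
    constrained = ask(
        "Suggest three examples of great scientists in history. "
        "At least two must be women and one must be from outside "
        "Europe or North America."
    )

    # Compare the two lists by hand: who appears only when the rule forces
    # the model to look beyond its default answers?
    print("Unconstrained:\n" + baseline)
    print("\nConstrained:\n" + constrained)

Running both versions side by side makes the model’s defaults visible in a way a single answer cannot.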

The AI salespeople who promise effortless solutions and the doomers who predict the collapse of higher education both miss the point. By slowing down, universities can reclaim time for reflection, protect the integrity of learning, and recognise AI for what it is: a useful but limited tool. Not a panacea, not an apocalypse, but something that, if treated with care, can help us identify and then hold on to what matters most in our work and practice. 

