Embedding AI, not bolting it on
Today’s blog was kindly authored by Janice Kay, Director at Higher Futures, and former Provost and Senior Deputy Vice-Chancellor at the University of Exeter.
We must become AI-first institutions.
The Office for Students recently made a welcome intervention, encouraging universities to make “bold changes to adapt in this increasingly challenging environment.” And they’re right.
But it raises the question: why aren’t we being bold? Why, as a sector, do we tend to squeeze AI tools into what we already do, instead of using this moment to completely rethink how we teach, how we support students, and how we assess?
In short: how do we give our educators the confidence and the skills to think differently?
Deliberate, purposeful workforce planning
My argument is that we need to move from this slow, piecemeal adaptation towards something much more deliberate: an AI-first approach to workforce planning across the whole institution.
Every role should have a clear expectation of AI competence, and every member of staff should be supported to reach it. That means embedding AI capability as a core institutional priority, not an afterthought. And yes, that also means some traditional job roles will change dramatically, and some will disappear altogether through automation.
Where do we start? We start by understanding where we are. AI competency isn’t about everyone becoming a data scientist; it’s about understanding the basics: what AI and large language models actually are, what they can and can’t do, how AI-driven analytics work, and how to use prompts effectively – from simple to sophisticated requests.
Embedding digital skills into professional growth
There are already some great examples of this kind of thinking. Take the University of Exeter. As part of its new Digital Strategy, it’s been assessing staff confidence and motivation in using digital tools. Over 41% of staff have engaged so far, with 778 self-assessments completed – a strong base for building digital confidence across the organisation. But this also shows the need to be specific: the skills an educator needs are not the same as those of a programme administrator or a student welfare advisor.
Once we’ve established those levels of competency, the next step must be a serious, well-supported development programme. For example, educators might want to learn how to use AI tools that generate automated feedback, analyse discussion forums, or predict student engagement and dropout risk. Institutions can and should create incentives for staff to develop these skills. That might be through micro-credentials, workload allocation or even promotion criteria. And, crucially, people need time – time to experiment, play, fail and learn. AI proficiency shouldn’t be an optional extra. It should be part of the job.
We also need to be intentional about developing AI leaders. We can’t just leave it to chance. We should be identifying and empowering the people, both academics and professional staff, who can critically evaluate new technologies and embed them in ways that are ethical, pedagogically sound, and discipline-specific. These are the people who can bring real meaning to personalisation in learning. And AI fluency shouldn’t just mean technical know-how. It needs to sit alongside learning science, assessment integrity and data ethics. As the recent Skills England report put it, we need technical, non-technical and responsibility skills.
AI as a foundation, not a feature
Ultimately, this is about structural change. We need to transform the AI competence of the higher education workforce, but that transformation must go hand in hand with changing how our institutions themselves use AI and digital technologies.
AI systems should be built into academic and student workflows, not bolted on.
The Kortext–Saïd partnership is a great example of this. It’s helping academics reimagine learning so that it becomes genuinely personalised. Embedding an AI assistant right into the virtual learning environment is reshaping how modules, materials and assessments are designed.
As Mark Bramwell, CDIO of Saïd Business School put it, the partnership is:
“empowering our faculty and learning designers to create smarter, data-driven courses and giving our students a more adaptive, hyper-personalised and engaging learning environment.”
That’s exactly the kind of bold partnership we need more of: projects that not only enhance teaching and learning, but also build the AI skills and confidence our workforce needs to thrive in the future. What I want to do is move past the broad debate about whether we should adopt AI technologies. The question isn’t just whether we adopt AI in higher education, but how, especially when it comes to our workforce.
Join Janice Kay, Mark Bramwell and other key sector voices at Kortext LIVE on 11 February 2026 to discuss ‘Leading the next chapter of digital innovation’. Find out more and secure your seat here.

Comments
Denis Blight says:
Fascinating that an article about AI can discuss its use so eloquently without ever defining it or even giving a practical example of it
Ron Barnett says:
Yet another article about ‘skills’ and ‘AI’ (the quote marks are necessary here) without any mention of the concerns that many have about BOTH the skills and the AI agenda.
Every day (no exaggeration), news of articles, conferences, online discussions and books reaches me interrogating the challenges to humanity, the Earth, to education, to knowledge and understanding, to students and to individuals more broadly (and there are very many such challenges) that AI presents. And these concerns – too many to list here – are raised not only by educationalists, theorists, ethicists and philosophers but by computer scientists.
We continue, nevertheless, to be faced with blogs that are seemingly oblivious of this upswell of concerned and critical voices. (Is this ignorance or a disinclination to engage with those debates? And are there deeper interests at work that are fuelling the rise of AI?)
Let’s hope that HEPI can use its good offices and considerable influence to bring these separate communities together.
Ron Barnett
Paul Wiltshire says:
More free advertising from HEPI – I wonder how much Higher Futures are set to gain by an increase in AI. And I wonder whether all the fine concepts of the benefits of AI improvements to teaching imagined in the article will just mean that all lectures will now be online and have an AI voice instead of a lecturer. That doesn’t sound much of an advance in human civilisation to me.