Today’s weekend reading was written by Dr Andrew Woon, Senior Lecturer in Strategic Management at Queen Mary, University of London.
Generative AI is revolutionising industries, with education at the forefront of this transformation. Traditional models of knowledge acquisition are being challenged as AI redefines how we access and process information.
As AI becomes more accessible and accepted by the public, its potential to reshape the majority of jobs in the market has become increasingly evident. Consequently, AI literacy has emerged as a foundational skill for careers and entrepreneurship. Given that universities are not only institutions of learning and development but also the cornerstone of a nation’s competitive advantage, the impact of AI on education attracts significant attention.
As an educator, I believe that AI has lowered the barriers to accessing knowledge and education, enabling more students, especially those who previously lacked opportunities, to benefit. On the other hand, AI has also raised the bar for teaching, as the accessibility of information and knowledge is transforming traditional teaching and learning paradigms. To excel as a teacher today, one must possess not only subject expertise but also advanced pedagogical skills and the ability to stay current with emerging trends.
I echo the sentiments of computer scientist Professor Argamon, who views AI as a technology that can make education more human-centred rather than replacing teachers. AI enables educators to focus on the most critical aspect of their work—teaching and mentoring students rather than merely delivering courses. By leveraging AI, teachers can spend more time engaging with students and actively supporting their holistic development.
AI should not simply be seen as a new complementary skill but as a driving force for educational transformation. Our education system must evolve from a focus on traditional knowledge-based learning outcomes to prioritising skill development, reflective thinking, and innovation-driven learning. This shift will better prepare students to adapt to future challenges and enhance their competitiveness.
The Latin root of the word ‘curriculum’ is ‘currere’, meaning ‘to run’. In academic contexts, a curriculum is defined as a learning plan consisting of a series of activities and courses. Our education system has overly prioritised credit accumulation, often neglecting the ethos of lifelong learning and the importance of continuous self-improvement. Therefore, I advocate that education should not merely be a three- or four-year programme but rather the starting point of a lifelong journey encompassing both depth and breadth of learning in knowledge and skills.
The rapid development of AI should serve as a catalyst for everyone to pursue personal growth. As Professors David Lefevre and David Shrier of Imperial College Business School have suggested, we need to refocus curricula on skills and capabilities that are challenging for AI to replicate. This shift aligns with a move toward more personalised, socially focused, and mentorship-driven education models. Such a transformation would fundamentally change traditional teaching and learning methods, equipping students to better face future challenges.
The greatest value of universities lies in their role as intellectual hubs that foster curiosity, critical questioning and the creation of new ideas. Universities should teach students to think independently rather than simply follow instructions. Our education system must stop producing “cookie-cutter” graduates who cannot compete with AI.
With the rise of online education and the prevalent use of AI, traditional higher education models are facing unprecedented challenges. Higher education institutions are caught in a paradox: on one hand, they require significant resources to retrain staff in new pedagogies and upgrade facilities; on the other hand, they are grappling with the pressures of cost-cutting. Therefore, balancing cost-effective solutions with quality education remains one of the greatest dilemmas for higher education institutions.
I believe fostering deeper collaboration with industry is a viable way forward to mitigate the financial pressures associated with AI investment. By engaging with industry-specific AI tools, students gain valuable exposure and hands-on learning experiences that better prepare them for employment. At the same time, employers benefit from graduates who not only meet their expectations but also possess the skills to excel in their roles.
In conclusion, the mission of education must focus on cultivating well-rounded individuals equipped with critical thinking, adaptability, curiosity, and a strong sense of social responsibility. By embracing AI as a transformative force and equipping both staff and students with the right mindset and values, universities can empower their graduates to thrive in an ever-evolving world. This approach will ensure that education remains relevant, impactful, and aligned with the demands of the future.
Yet again an article by an AI proselytizer declining to engage with the serious EDUCATIONAL challenges that AI presents. (I have also read the two links in the article.)
We are told repeatedly that education can now focus on critical thinking and creativity with no acknowledgement that AI is going seriously to diminish both.
A user of AI cannot seriously be critical of AI, being in ignorance of the algorithmic basis of AI and its outpourings – which the US companies refuse to divulge. (Witness the woeful performance of AI in producing bibliographies.)
Creativity will decline because creativity emerges out of cognitive struggle, which any use of AI reduces. (Creativity cannot be compartmentalised – ‘this hour, I will be creative’.)
But there are yet additional and massive educational downsides to AI that hardly anyone dares acknowledge, either in education or more widely:
– transparency: no publisher, or university, requires its authors to declare the percentage of AI used in their ‘writings’. When I receive a doctoral thesis, journal paper or book proposal to review, what am I reading? A computer-generated text? No one will tell me.
– authenticity. AI runs against the DUAL principle in education of ‘I say what I mean and I mean what I say’.
– trust: in a world of fake images and data, we can no longer trust the evidence of our senses. Higher education is exacerbating this matter. (See previous point.)
– decline of cognitive resources on this Earth: controlled by a few corporations in California and by multi-billionaires, the AI machines are not only gradually reducing the array of concepts and frames of understanding but are doing so in ways that favour controlling and corporatist interests (and the rise of populism), diminishing academic, intellectual and educational freedom.
(The continued use of the term ‘skills’ in educational-speak is indicative.)
– intellectual freedom: as is well-known, AI is built on a skewed range of inputs.
– due acknowledgement: as is also well known, AI is built by hoovering up masses of data from scholars (especially from books), so infringing their copyright and ultimately diminishing the stock of ideas in circulation.
It follows that, insofar as higher education processes are built around AI, they BOTH directly and indirectly fall foul of these many and massive concerns.
[To be clear, these remarks of mine are focused mainly on the educational use of AI, especially at the level of higher education. They are not concerned with potentially valuable research uses of AI, eg in dealing with massive data-sets. I have also had to skate over complex matters for the sake of a quick note in this ‘reply’ facility.]
I hope that HEPI might be able to sponsor a debate on higher education and AI that seriously grapples with these issues.
Ron Barnett
Emeritus Professor of Higher Education, UCL
Hallmark greeting card level of analysis here. It’s like something from a ‘don’t be scared of computers’ pop-in session from many decades ago. What is the point of this puffery? I find it not worthwhile to dilate on the actual problems for the HE sector, this is so bland. Why is this stuff being posted?
Ron Barnett states “I hope that HEPI might be able to sponsor a debate on higher education and AI that seriously grapples with these issues” and I agree.
In fact, they have already started, by publishing the thoughts of Dr Andrew Woon on the subject of AI, which I believe is also helpful in creating a broader understanding of what AI is, and what it can and cannot do, for a wider audience.
AI is already being used and providing benefits in many sectors including manufacturing, commerce, law, health services and education.
Like many discoveries and inventions from printing to nuclear power, it opens doors to a more complex future and is another tool that can be used for good or evil. Controls on its use will be required and limits may be necessary.
The way this is handled in the HE sector will be critical. Universities have the potential to become the leading force in the development of AI.
Albert, Many thanks for your response to my response!
It doesn’t, I fear, deal directly with any of the points I was making and concerns I was voicing in response to Andrew Woon’s HEPI blog. My remarks had a very particular focus, that of the development of students and the pedagogical relationship in higher education.
I explicitly acknowledged that AI has beneficial uses.
Many thanks
Kindly
Ron B
Thank you all for reading and for your responses. This blogpost was written during the Christmas break as I reflected on how AI has transformed my practice as an education- and scholarship-focused academic. I’m not an expert in AI or AI pedagogies. I reflect and learn to become a better teacher.
Important! And very relevant to the arrival of the LLE. And one requirement of that is the view that relationships between universities need to change from ‘competition’ to ‘collaboration’. HEPI is providing an online event about it on Monday.
HE can and needs to benefit from AI.