There is a long history of people getting their predictions about the future of technology, including the future of technology in education, wrong.
Famously, after the Second World War, IBM’s President said, ‘I think there is a world market for about five computers.’
Just ten years ago, in the words of Wired magazine, Sebastian Thrun declared that ‘In 50 years … there will be only ten institutions in the world delivering higher education’. This seems unlikely, given that we now have more, rather than fewer, universities than when Thrun spoke and given the work many institutions have been doing to deepen their local roots.
I sometimes think some of the predictions you hear are reminiscent of the record company executive who rejected The Beatles because ‘Groups of guitars were on the way out.’
Change has generally been gradual but consistent. As an undergraduate in the early 1990s, I did not find the computer room in my Department until my very final week, when I stumbled upon it by accident and found a few international students communicating with their families back home. Today, my young son and all his fellow pupils are compelled by their state school to have a laptop, which is used in around half of all lessons and for all homework. On a strike day, it is invaluable.
COVID of course provided a sharp break point, speeding everything up. HEPI’s polling with Kortext, published at the tail end of 2022, shows universities now sometimes surpass the demands of their students when it comes to digital provision – though the research also flagged concerns among students about accessibility and usability, and it is vital that we listen to what students are saying on such matters.
On artificial intelligence, or AI, it is important that we engage rather than run away. I have enjoyed playing around with ChatGPT as much as the next person, most memorably asking it to write a Eurovision Song Contest entry for Morrissey, which came out rather well:
We can’t tell the truth, no one can hear us
Our plea is unheard in this silent night
We can’t find the truth, no one can see us
We keep on searching, searching for the light
I also recently asked ChatGPT to write a speech tackling today’s higher education policy challenges from the perspective of a think tank, and the results were far less impressive – or rather, completely unusable: apparently, we should ‘look at the way that universities are funded’, ‘improve our understanding of what skills are needed in the labour market’ and ‘address the issue of access and participation.’
At a surface level that is all fine, and one can perhaps imagine a politician parroting such lines, but the job of a think tank is to come up with solutions, not anodyne lines for use by a particularly timid MP on Question Time.
Twenty-five years ago, I was a History teacher and the most common use of tech was a pupil lifting their homework from an Encarta CD-ROM – which was very easy to spot, not least because of its factual inaccuracies. Now, in 2023, Italy has felt the need (briefly) to ban ChatGPT over data protection fears, and some universities, including here in the UK, have tried to bar their students from using it. Yet AI is not going to disappear: half of all Cambridge students have reportedly used ChatGPT, and the student newspaper Varsity quoted one student as saying it was ‘the equivalent to dropping one of your cleverer mates a message and asking them for help’.
Intriguingly, the blogs in our recent series on AI all had one argument in common: that we need to respond to AI in a nuanced, rather than blanket, way and to learn as we go. Tim O’Shea, the former Vice-Chancellor of Edinburgh, for example, wrote on the HEPI blog that:
It does not make sense to ban it [ChatGPT] in an age when you can carry a computer on your wrist. … The key is not to try and steer around it, but to take advantage of it.
There will be some stark challenges, especially for assessment. One recent report from the think tank EDSK warns about the threat ChatGPT poses to formal assessments:
establishing for certain whether a student produced the work that they submitted has now become a virtually impossible task for teachers, leaders and exam boards.
EDSK argues that AI means the Extended Project Qualification, which universities have tended to like, should become a compulsory but ungraded ‘low-stakes skills development programme’ and that pupils should take an extra subject in Year 12 ‘that will be examined entirely through an oral assessment.’
There are well-documented challenges with a full return to high-stakes exams, but we need to weigh up the different options carefully.
The final point to make on tech, however, is that we must not forget the humans. If ChatGPT were to write HEPI reports, they might be beautifully written, generally accurate and based on multiple sources, but we could not reasonably expect them to be read except by other intelligent machines.
Education is, primarily, a human endeavour and we want validation by other humans. There was an amusing cartoon in Private Eye recently in which two Oxbridge dons are lounging around, with one saying to the other: ‘The students all use ChatGPT to write their essays so I figured what the hell, I’ll use AI to mark it… another splash of claret?’ (Private Eye, No.1596, 21 April to 4 May 2023, page 20)
The role of the human is also vital to the use of new technology in improving the student experience. There are sound reasons why the Government’s higher education student support champion, Professor Edward Peck, titled the final section of his core specification on student analytics ‘humans are key’. He explains:
when data and analytics are put at the heart of how student support is run, it can focus specialist human support time where it is needed most and allow evidenced demonstration that important-but-finite human support services are appropriately resourced and optimally addressing their goals.
HEPI hosted a dinner in London recently with the Australian software firm TechnologyOne and the point came up time and time again that tech can do routine tasks and make it easier to identify students who are at risk, but that the time freed up then needs to be used for providing intensive human support to those who need it.
We can expect this issue to come to the fore next week, when Parliament debates the petition calling for universities to have a new ‘duty of care’ towards their students, which – if it were to happen – would be a change of profound importance, as well as a challenge for universities to deliver.