After a short break for Christmas and the new year, the HEPI daily blog starts up again today with a piece by Dr Bill Mitchell OBE, Director of Policy at BCS, The Chartered Institute for IT. Bill is on Twitter @batteredbluebox.
Later this week, we will run pieces on the humanities and Brexit / Horizon Europe, among other topics, and also publish our first report of 2021.
The pandemic has driven home just how much public services depend on digital technologies to function properly, whether it is the NHS, the education system, social services, HMRC or the security services. More and more, those digital technologies harness data science and artificial intelligence (AI) to deliver public benefit, as part of an ongoing digital revolution across the wider economy. All of which means academic and professional practice across the whole of computing – including the higher education sector – will need to change in a post-COVID-19 world to boost the UK’s recovery.
The seeds of such a change were sown back when HEFCE still existed and the Government commissioned the Shadbolt review of Computer Sciences Degree Accreditation and Graduate Employability. That review led to reforms in the way universities teach Computer Science and in how professional bodies, such as BCS, The Chartered Institute for IT, accredit those courses. Looking at the world we are now in, it is not clear whether those reforms have gone far enough, which is one of the key reasons BCS launched a fundamental review of degree course accreditation last July.
We need professionals who continuously improve their technical competency and ethical practice, measured against widely accepted standards. But what we keep seeing are high-profile examples of people getting things very badly wrong, with life-changing consequences for vast sections of society. The UK Government’s National Data Strategy says the digital sector must be underpinned by public trust to drive sustainable economic growth. Recent events, such as the Ofqual algorithm crisis, the delayed launch of the NHS contact-tracing app and the loss of 16,000 COVID-19 test results through misuse of an Excel spreadsheet, have significantly undermined that public trust.
The public has woken up to the fact that the people who develop and manage the digital technologies delivering public services need to be competent, ethical and, most importantly, accountable. In other words, they must be trustworthy professionals who are answerable for their conduct. Recent YouGov surveys of the UK public, commissioned by BCS, show that:
- Over half (53%) of UK adults have no faith in any organisation to use algorithms when making judgements about them, on issues ranging from education to welfare decisions.
- 63% of UK adults disagree with the statement ‘Students graduating with a computer science university degree are qualified to write software that makes life decisions about people’.
- 62% of UK adults believe that anyone whose job involves developing computer software that can significantly affect people’s lives should be qualified as a government-approved Chartered professional.
This lack of trust raises fundamental questions about the value of professional body accreditation of university Computing degrees. It is not realistic to expect a fresh graduate to have the experience and expertise needed to be trustworthy in the way the public might want. But surely the role of a professional body is to enable universities to develop and share the very best educational practice possible, producing ever more competent and ethical graduates in whom the public can place far more trust than at present.
So our review will go beyond our initial fact-finding stage and engage with employers, universities and students on how we can ensure accreditation delivers the best possible outcomes. I urge everyone with a stake in developing credible, highly competent and ethical Computer Science graduates to engage with the review and help us get this right. Otherwise, the chances are we will keep seeing digital public services cause national harm because someone messed up when they shouldn’t have.