WEEKEND READING: Future‑proofing academic integrity: the case for a new charter

Authors:
Dr Richard Marsden and Professor Klaus-Dieter Rossade
This blog was kindly authored by Dr Richard Marsden, Associate Dean for Curriculum, Qualifications and Partnerships, Faculty of Arts and Social Sciences, The Open University, and Professor Klaus-Dieter Rossade, Executive Dean, Faculty of Wellbeing, Education and Language Studies, The Open University.

When the Academic Integrity Charter for UK Higher Education was published by the QAA in October 2020, it represented a moment of sector unity. Developed with input from academics, students, professional services, and national bodies, it set out seven shared principles designed to uphold a culture of academic integrity across UK higher education. These principles sought to promote a whole‑community approach, to empower students, and to ensure consistent institutional policies. Their creation gave universities and colleges a common foundation from which to tackle academic misconduct. The charter was a landmark. Over 200 institutions have signed up, signalling a genuine sector‑wide commitment to protecting the quality and credibility of UK degrees.

The charter is now six years old, and a great deal has changed in that time. A milestone in 2020, it risks becoming a museum piece in 2026. The world in which it was written is fading from view. Scholarly understandings of integrity have moved forward, practices for preventing misconduct have developed, assessment practices have evolved, and above all, the advent of AI is challenging some of our most fundamental assumptions about what students need to learn.

One of the most striking limitations of the 2020 Charter is that it assumes a deficit model. In other words, it tacitly equates misconduct with ‘bad behaviour’ by placing the emphasis on detection, deterrence, and penalties. Certainly, the charter acknowledges that misconduct can be unintentional. Moreover, the principles themselves, particularly principle 4 (engage with and empower students), highlight the responsibilities that institutions have in helping students to avoid misconduct. But overall, the language and framing lean heavily towards preventing ‘breaches’ rather than cultivating values. The emphasis is on misconduct rather than integrity, despite the charter’s title.

This positioning is out of step with contemporary scholarship. Pedagogic research over the last decade has shifted sector thinking from a punitive, rule‑based model to a more values‑led approach that acknowledges the pedagogic, institutional, social and cultural factors that shape behaviour. That development places the responsibilities of teachers and institutions front and centre. The question is thus no longer ‘how do we deter misconduct?’, but rather ‘how do we enable students to act with integrity?’. Academic integrity has, in this way, been reframed within a wider conceptual landscape of citizenship, professionalism, and ethical conduct.

The AI revolution, which of course the charter predates, has put a rocket under that shift. Nothing in the 2020 Charter anticipates the pedagogical, ethical, or disciplinary challenges posed by tools such as ChatGPT, Gemini, Claude, Copilot and their successors; tools which can generate essays, code, analysis, and multimodal outputs instantly and privately. The result is an exponential multiplication of the potential mechanisms of misconduct. The charter’s focus on essay mills and contract cheating was right for its time, but the world has moved on. The pressing challenge now is the existence and availability of AI systems that can produce analytical and synthetic work with minimal human intervention. As a result, educators are grappling with the need to move away from assessments which evaluate final outputs and instead pivot towards tasks that scrutinise the cognitive and creative processes behind them. This process‑centred approach will require institutions to reorient their misconduct frameworks towards those previously hidden stages.

What’s more, colleagues who lead on academic integrity in higher education institutions will need to make those changes whilst negotiating contested questions around defining what actually constitutes the legitimate use of AI tools. The answer to that, of course, depends greatly on the subject area and on what is being assessed. The issue becomes even thornier when we consider the responsibility that educators now have to equip students with the AI skills they need to succeed when they join the workforce.

Gen AI has consequently created conditions in which it is frighteningly easy for students to move from acceptable support to the inappropriate ‘delegation’ of assessment-related activity without even realising they’ve crossed a line. Because AI tools can transform, re‑write, or synthesise material in ways that feel like ‘assistance’ rather than ‘authorship’, many students now struggle to judge where appropriate use ends and misconduct begins. Indeed, these issues are at the heart of wider questions about what authorship, and academic authorship in particular, will actually mean in future. The boundary between learning with AI and outsourcing the work to AI is dangerously indistinct at the moment – not least because institutions are having to work out their positions and determine their guidance to students on the hoof as the technology continues to evolve. Hence, many AI‑related cases of misconduct may now arise from misunderstandings, from unclear rules, and from boundaries that can easily appear opaque. Yet the Charter assumes intention: it presumes misconduct is deliberate.

It would be both inaccurate and unfair to suggest the charter contains nothing of enduring value. Its principles on sector collaboration, empowering students, engaging and supporting staff, and developing clear institutional policies remain important. But in the changed and challenging landscape of 2026, it no longer provides the unifying guidance that the sector needs. Yes, it provides institutions with a valuable structure for thinking about policy and practice. But it does not steer them towards the conceptual foundations necessary to navigate the revolutionising impact of AI on learning, authorship, and the technological outsourcing of intellectual activity. Nor does it adequately recognise the responsibilities of teachers and institutions to create the conditions in which students are enabled and incentivised to act with integrity.

The UK’s Academic Integrity Charter is a sector-wide touchstone. It is referenced in policies, training, and practice across dozens of institutions. It also remains one of the most collaborative and aspirational sector initiatives of recent years. If it is outdated, then the sector’s shared approach is outdated. A 2026 Charter could unify the sector again, and in so doing help to shape the values and behaviours of the UK’s future workforce. To achieve that, it must abandon narrow definitions of misconduct in favour of more expansive – and more positive – framings that more fully reflect the use of the word ‘integrity’ in its title.

The charter was a landmark, but even landmarks can shift – and perhaps they must. If we want it to guide the sector rather than anchor it in the past, it must be refitted for a world fundamentally altered by AI, and a pedagogic context that puts values and shared responsibility front and centre. A refreshed 2026 Charter would reaffirm integrity not as abstention from wrongdoing, but as the shared ethical foundation on which higher education now depends.
