AI, assessment and belonging

Author:
Professor Harriet Dunbar-Morris
Published:

This blog was kindly authored by Professor Harriet Dunbar-Morris, Provost and Pro Vice-Chancellor Academic at the University of Buckingham, on sabbatical while undertaking a Visiting Fellowship at the University of Oxford. From 1 September, she will take up the role of Deputy President and Chief Academic Officer at South East Technological University (SETU), Ireland.

The recent HEPI report on student use of generative AI highlights something many of us working in teaching, learning, and student experience already recognise: AI is no longer a future issue. It is part of everyday academic life. The question is not whether students are using AI, but how institutions respond.

Two of the report’s recommendations stand out to me. The first is the need for clear, accessible guidance on the use of AI in assessment. The second is the call for research into the ways students are using AI for companionship, advice, or support.

Both point to the same underlying issue. AI is not only a technological challenge. It is a pedagogical and student experience challenge.

Clarity in assessment matters more than detection

Students repeatedly report uncertainty about what is allowed and what is not. They worry about being accused of cheating and encounter different expectations across modules and courses. This lack of clarity is not simply about academic integrity. It affects students’ confidence, engagement, and belonging.

In my own work on the Being, Belonging, Becoming (BBB) survey, students respond strongly to questions about whether staff make expectations clear, whether they feel part of a learning community, and whether they understand what is required of them. When expectations are unclear, students disengage or become anxious. In an AI context, that uncertainty is amplified – and it is something we could now measure through the BBB survey.

The answer, in my view, is not better detection software. It is better assessment design.

We need assessments that assume AI exists, that ask students to demonstrate judgement and understanding, and that make explicit when AI may be used and when it may not. Providing both AI-free and AI-supported assessment, as the report suggests, is likely to become normal practice.

Students still need to think without AI

The report is right to warn against assuming AI will replace foundational academic skills. Writing, reasoning, synthesis, and critical thinking still matter. Students need to learn how to use AI well. But they also need to learn how to think without it. That is not a contradiction. It is what designing education for the real world looks like.

AI, loneliness, and belonging

For me, perhaps the most striking recommendation in the report is the call for more research into students using AI for friendship, reassurance, or emotional support.

This connects directly to belonging.

If students turn to AI instead of staff or peers, that tells us something about their experience. It may reflect confidence, convenience, or curiosity. But it may also reflect isolation, uncertainty, or a lack of connection to their course or institution.

Understanding this means we must look beyond technology and towards the broader student experience. Questions about belonging, connection to staff, and feeling part of a learning community matter just as much as questions about digital tools.

AI for staff, not just students

Discussions about AI often focus on students, but staff use these tools too.

Colleagues have piloted using generative AI to produce the first draft of assessment feedback, which staff then edit and personalise. The aim is not to replace academic judgement, but to reduce repetitive workload and to free up more time for meaningful interaction with students.

If AI can create that time for feedback, conversation, and support, that is a gain for student experience, not a loss.

Designing for the world we are in

AI is forcing the sector to confront issues we already knew existed:

  • unclear expectations;
  • over-reliance on traditional assessment formats;
  • pressure on staff time; and
  • the fragility of student belonging.

The challenge now is not to resist AI, but to design teaching, assessment, and student support in ways that recognise its presence, while maintaining academic standards and human connection.

That is a pedagogical task, not a technical one.
