Below, Dave McCall and Zoë Allman discuss what AI means for those students seeking to undertake placements while they study.
- Dave McCall is a Placement Tutor at De Montfort University (DMU), and Zoë Allman (@zoe_a) is Associate Dean (Academic) at DMU.
As higher education explores the impact of generative Artificial Intelligence (AI), colleagues from De Montfort University examine the use of AI in student placement applications.
Generative AI is transforming student placements. Year-long industry placements offer professional growth and employability, bridging academic learning and practical experience. Supported by universities, students are encouraged to maximise learning opportunities in the workplace and reflect on their experiences.
We increasingly find students using AI in placement applications, mirroring its role in their academic journey and in preparation for graduate employment. We consider how AI is used (and embedded) to improve the chances of securing a placement through searches, applications, and interview preparation, while also recognising the challenges this presents.
Placement Searching
AI algorithms shape how students search for placements. Platforms like LinkedIn and Glassdoor recommend opportunities tailored to users’ profiles and preferences, streamlining the process. However, this personalisation may also limit exploration, narrowing exposure to diverse job types and industries. The USA-based National Association of Colleges and Employers highlights how students might miss diverse opportunities by relying exclusively on AI-generated job recommendations.
Not Forgetting ChatGPT
Generative AI tools, such as ChatGPT, have become popular with students when developing search strategies, alongside drafting emails, generating lists of companies in niche fields, or refining search terms for specific industries. While useful, such tools demand a certain level of digital literacy to use effectively. Research indicates that AI’s effectiveness is limited by the quality of user prompts, underscoring the need for universities to provide AI literacy training that helps students optimise their interactions with these tools while addressing the potential digital literacy skills gap. Targeting this developmental training at placement searching and applications is critical for ensuring positive experiences on placement and strong graduate outcomes.
AI Applications
Beyond searching, AI is increasingly used as students develop their placement applications. Students employ generative AI to draft and tailor CVs and cover letters, quickly generating professional documents. Tools like Resume Worded enable students to format and optimise applications for Applicant Tracking Systems. While efficient, over-reliance on AI risks producing generic applications that lack originality, raising concerns about authenticity and self-reflection and potentially reducing a student’s ability to articulate their individual experiences, values, and what they bring to the placement role.
Universities can address this by supporting students to understand how to balance AI-assisted optimisation with authentic self-expression. Workshops encouraging reflective practices help students integrate personality in applications, with feedback reinforcing human input.
Preparing for Interview
AI’s role in interview preparation is multifaceted, simulating interviews through generating questions and offering feedback. A student preparing for an engineering placement might use ChatGPT to generate technical and behavioural questions, refining responses through iterative feedback. AI-powered simulations offer ‘real-time’ feedback, enhancing confidence.
Beyond verbal preparation, AI tools like HireVue analyse tone, facial expressions, and word choice. While these technologies offer employers valuable insights into applicants, they also raise ethical concerns, including the possibility of bias in AI-driven evaluation.
Levelling the Playing Field?
AI tools can help students practise and enhance their skills and experiences, but they also raise concerns regarding accessibility and equity. Access to advanced AI tools, and the digital literacy required to use them effectively, is not evenly distributed among students. This digital divide could exacerbate existing inequalities, particularly for students from underrepresented backgrounds. Universities play a vital role in educating students to understand the capabilities and limitations of AI tools, enabling them to use these technologies effectively and ethically.
Working with Employer Partners
Collaboration with industry partners remains essential. Understanding AI’s influence on recruitment strategies allows universities to align student support with industry expectations, preparing students for contemporary hiring processes.
AI is undeniably reshaping the employability landscape. However, its integration challenges traditional approaches to career development, raising concerns about equity, ethics, and authenticity. Universities must adapt by equipping students with skills such as effective prompt engineering to navigate AI-driven processes. Recent reports highlight the need for universities to prepare students for AI-driven assessments, combining technical proficiency with critical thinking and ethical awareness. Aligning employability programmes with these insights enables students to harness AI’s potential while maintaining human-centred career development.
As AI transforms placement applications, universities play a pivotal role in preparing students for this reality. By promoting AI literacy and reflective practices and addressing equity and ethics, universities can empower students to approach placement applications with confidence and integrity. AI should serve as an enhancement tool rather than a barrier. Supporting students in understanding and appropriately using AI tools best prepares them for achieving professional aspirations.
– This is a depressing but inevitable commentary on the present situation concerning AI.
Essentially, the message is:
‘The powers behind AI cannot be overcome. AI is educationally and developmentally suspect for all manner of reasons (which are mentioned in the article) but, nevertheless, AI cannot be circumvented, so we have to get with the programme (literally). And so all those in positions of authority in higher education in relation to students should work with the students to enable them to exploit AI to their advantage, while mitigating its worst anti-educational and anti-developmental features.’
This is an appalling state of affairs but it is how matters lie right across the world of AI. And no-one (?) is prepared to say it, since such enormous power now lies in the mid-C21 with the owners (corporations and states) of the companies that are driving AI forward. (We see this playing out daily in the TV shots of one chain-saw wielding corporate owner.)
Higher education is being dragged remorselessly into a nightmare of non-criticality. (Universities hardly ever mention critical thinking as an educational goal these days, except under very limited definitions). That one might be critical of these powers, these technologies and the systems behind them, let alone their anti-educational and developmental crassness, is simply not on the cards.
It is interesting how, in the more reflective literature, there is something of a return to Adorno, one of the two founders of Critical Theory in the 1930s, who pressed a ‘negative utopia’ idea of critical theory well into the 1960s and who critiqued the student activists of the late 1960s. The magnum opus that he wrote with Max Horkheimer, ‘Dialectic of Enlightenment’, in a way has even more resonance today in this electronic capitalism to which we are now all subject, whether we like it or not. (Even those who opt out of it are still subject to it.)
Some (of us) still hope that higher education may yet find a space and a means of advancing some new conception of criticality (I am working on two books on the matter just now!), but one must beware of being unduly hopeful.
As it may be said, to quote the present incumbent of the White House, ‘you haven’t got the cards!’
Ron Barnett
Dave and Zoe’s thoughts seem to me to have been generated by the AI they write about.
Is this the case?
If not, how can we tell?