- This HEPI blog was kindly authored by Matete Madiba, Deputy Vice Chancellor: Student Development and Support and Associate Professor in the Faculty of Education at the University of the Western Cape (UWC).
There is growing concern about the mental health of university students globally. A national study of about 70,000 students across 17 universities in South Africa found that 21% of students reported signs of clinical trauma, while 37.1% reported anxiety symptoms. Some 30.6% of students had thoughts of suicide, 16.6% had made a suicide plan and 2.4% had attempted suicide. These numbers are a grim testament to the extent of the problem. Mental health challenges significantly affect students' lives and impede their functioning, and the COVID-19 pandemic has only amplified these issues, exacerbating feelings of isolation and uncertainty. Universities across South Africa are aware of the importance of addressing student mental health.
The University of the Western Cape (UWC) has proactively developed an Integrated Student Mental Health and Wellbeing Policy that seeks to address the issue of mental health in an integrated manner. And digital health is part of this.
Digital technology has made huge strides in connecting people, generating information at the touch of a button and changing the pace at which things can be done. One promising innovation in the digital space is Artificial Intelligence (AI) in the form of chatbots, which hold great promise for providing support and assistance in the arena of mental health. This pioneering use of AI has the potential to transform the way mental health care is accessed and approached.
UWC wanted to strengthen its mental health offering, and so introduced a mental health chatbot named Wysa. Wysa is a global leader in AI-driven mental health support. Its AI-first approach enables users to improve their mental health before symptoms become severe, using free-text understanding to tailor support to an individual's needs and guiding them through clinically proven interactive cognitive behavioural therapy (CBT) exercises.
The benefits are clear.
- The intervention has increased access to services regardless of a student's geographical location and has the potential to provide life-saving support in critical situations. Because the chatbot is available 24/7, students have professional support at any time. If needed, a student can connect to a real-time counselling service for additional input, and an SOS function links to the university's helplines.
- Stigma is challenged, as students can use the technology free from the judgement they may fear when discussing their challenges with another human being, and can receive guidance and tools without worrying about societal judgement.
- It is designed to offer tailored support based on the student's specific needs and challenges. The chatbot uses natural language processing and machine learning to understand a student's concerns and formulate a response, which creates a feeling of care and support.
- The chatbot also provides daily check-ins, suggests self-help strategies and coping mechanisms and maintains a constant presence in the student's life, supporting continuity of care. This ongoing wraparound assistance is particularly important for students with mental health challenges and reduces risk levels.
- AI chatbots can serve as early warning systems, identifying potential mental health issues before they escalate; early intervention is crucial in preventing problems from becoming more severe.
- AI chatbots present an economical solution to the growing demand for mental health support. They can serve an entire university student and staff population, making mental health resources more affordable and available to a wider audience.
- Understanding trends, symptoms and user behaviour yields valuable data for mental health research, opening the way to more effective interventions. By measuring what works, we at UWC are able to provide the right solution at the right time on a personalised basis, without losing scalability.
While an AI application such as Wysa can provide much-needed support and guidance to a wider audience, it is important to acknowledge that it is not a substitute for mental health care professionals. It augments counselling services and gives students skills they can learn for themselves. Not all students need or want counselling, but many can benefit from self-learning new coping techniques, which can help prevent more serious deterioration. This frees up time and resources for those who need more traditional, focused forms of support, helping our teams ensure that no one is left without help and at risk of deteriorating mental health.
At the same time, we need to manage risk. We work with Wysa because it is clinically validated, with numerous peer-reviewed studies showing its efficacy. Privacy is a major concern, and we stress that staff cannot see conversations with the AI. An SOS function signposts crisis plans and helplines when further intervention is required. Chatbot responses are checked by clinicians in ongoing, rigorous testing to minimise the chance of failure. At UWC we use the technology as part of a full suite of tools, so no one is left reliant on it alone. Technology isn't foolproof, but neither is human intervention; this approach takes the best of both.
Universities, government and mental health organisations must work collaboratively to design a comprehensive strategy that integrates AI chatbots within a broader framework of support that focuses both on prevention and treatment. We must work to change the system and culture that perpetuates and exacerbates mental health challenges and build a wraparound programme of support and activity that enables individuals to thrive.
In this fusion of technology and humanity, we find the potential to bridge the gap and bring hope to those who need it most.
This article misses the point on a number of counts.
First, UWC seems to believe that the operative word in AI is “intelligence” when, in fact, it is “artificial”.
A few years ago, when we saw the rise of websites offering medical diagnoses for more straightforward matters (e.g., fever, pain), doctors were advising us to steer clear of this type of advice. They were right because, as we all know, diagnosing health problems requires a lot of experience in the field.
Now, all of a sudden, because a chatbot can use natural language and react to prompts, we seem to have forgotten that.
Second, mental health problems are clearly far more difficult to diagnose and treat than physical ones. I wonder what made UWC believe that a chatbot can do this job?
Third, the author claims that “stigma is challenged” because students with potential mental health problems may be more willing to open up to a chatbot (???) than to another human. The author, though, neglects to mention that AI algorithms are notorious and heavily criticised for the inherent bias they carry.
Fourth, it is hard to see how an “[…] AI chatbot [that] uses natural language processing and machine learning algorithms to understand and then formulate a response to the students’ concerns” can create “a feeling of care and support”. AI bots are known for their generic, boilerplate responses.
There is one thing the author gets right: AI can have huge economic benefits in addressing demand.