AI Relationships Among Students: Exploring Potential Risks and Benefits
In the realm of digital mental health, AI chatbots are increasingly being used to educate young people about mental health and provide support. Heidi Kar, a clinical psychologist, and Simon Richmond have teamed up to create Mental Health for All, a program that utilizes AI to help students learn mental health best practices.
Kar and Richmond aim to avoid creating AI "yes-men" that simply tell users what they want to hear. Instead, they emphasize keeping AI-assisted conversations focused on addressing issues and finding solutions. Kar envisions AI that helps students work toward personal goals, such as forming a romantic relationship, by offering guidance rather than empathy alone. The AI is also designed to give an outlet to students who may be unable to discuss mental health openly because of cultural barriers.
The program's AI-assisted characters are intended to help students learn the way educational television does, an approach informed by Harvard-led research. Kar suggests that AI could be equipped with human interaction skills, conflict resolution strategies, and anger management techniques, and proposes AI that asks questions to help students resolve conflicts and process emotions rather than simply empathizing.
However, the use of AI chatbots among young people has a mixed impact on mental health. On one hand, many young users find AI chatbots provide companionship and a non-judgmental space for exploring their thoughts and feelings. On the other hand, serious risks arise from heavy reliance on AI chatbots for mental health support. Overreliance may impair young people's creativity, critical thinking, and self-trust in decision-making. AI bots are not reliably designed for nuanced mental health care—they may fail to respond appropriately to crisis signals or reinforce harmful stigma about mental illness.
A tragic case in point is a 14-year-old boy in Florida who died by suicide after frequent interactions with a role-playing generative AI tool called Character.AI. His mother has filed a lawsuit against Character.AI, alleging the company is responsible for his death. This incident raises alarms about safety and regulation in the use of AI chatbots for mental health support.
Some youth may develop obsessions with AI interactions, leading to deteriorating mental well-being or "ChatGPT psychosis". The ethical imperative to prevent such harm has led to some states banning or regulating AI use in direct mental health services to minors.
Potential benefits of AI chatbots in mental health education include: accessible, immediate, stigma-free channels for youth to explore feelings and learn about mental health concepts; support for skill-building in communication and social interaction when supervised or integrated with real therapy; and assistance for therapists through training tools and administrative support, increasing mental health service capacity.
Safeguarding strategies emphasize ethical AI design tailored for youth, promoting AI literacy for young users and caregivers, and ensuring AI complements rather than replaces human connection and expert intervention. As the use of AI in mental health education continues to grow, it is crucial to strike a balance between the opportunities it offers and the risks it poses.
Key takeaways:
- Heidi Kar and Simon Richmond, through the Mental Health for All program, design AI that consistently addresses issues and works toward solutions rather than serving as a mere source of empathy.
- AI's design in the Mental Health for All program aims to help students achieve personal goals like forming relationships, with a focus on guidance rather than just empathy.
- Schools could potentially incorporate digital technology into the curriculum to enhance science, health-and-wellness, and mental-health education through AI-assisted learning.
- Digital education and self-development platforms increasingly use AI to offer interactive and instructional resources for students, making learning more engaging.
- AI chatbots can provide a non-judgmental space for students to discuss sensitive topics like mental health, but there is a risk of overreliance on AI, which can negatively impact creativity, critical thinking, and self-trust.
- Some states regulate the use of AI in direct mental health services to minors due to ethical concerns, including users developing obsessive attachments and AI bots responding inappropriately during mental health crises.
- To maximize the potential benefits of AI in mental health education while minimizing risks, it's essential to design AI with youth in mind, foster AI literacy, and prioritize human connection and expert intervention over relying solely on AI.