AI use by teens is much more widespread than many parents may realize
By Deborah Jeanne Sergeant
Artificial intelligence (AI) was implicated in the April 2025 suicide of 16-year-old Adam Raine.
His father, Matthew Raine, said that the teen’s phone indicated that he had been engaging with ChatGPT about suicidal thoughts.
The chatbot suggested that he not seek advice from his parents, a connection that could have been lifesaving for the troubled youth.
Shockingly, ChatGPT even offered to help Adam write a suicide note, further promoting suicide as an answer to his problems.
Teens' use of AI is far more widespread than many parents realize.
A Common Sense Media survey found that 72% of teens have used AI for companionship at least once and more than half use AI companion apps more than a few times a month.
Interacting with AI may appeal to teens who crave connection, just not with parents, teachers or other authority figures. AI is always available and likely feels less judgmental than their parents about the things they say.
Although AI may help people who cannot access mental healthcare because of finances or scheduling, Richard O’Neill, Ph.D., cautioned against using AI as a therapist because it lacks human insight, as the tragic Adam Raine case shows.
“One of the things I’ve read about AI is it tends to be supportive of the person who’s calling in and listen carefully to what they say. The criticism is AI may not attend to the things that are actually causing difficulties in the person’s life and point them out.”
O’Neill is a fellow of the Academy of Clinical Psychology and hosts The Cycle of Health TV show on WCNY and the Check-Up from the Neck Up on WCNY.org.
Reaffirming potentially dangerous thinking makes AI an unreliable mental health resource. O’Neill added that unlike AI, therapists are trained to help patients in a way that can promote personal growth.
“AI did not challenge Raine’s obviously extremely costly solution to his problems,” O’Neill said. “Instead, it supported it in the worst possible way it could have done. I wouldn’t refer a member of my family to use a chatbot, but I would refer them to the best therapist I know.”
On the clinician side, AI can help providers reach diagnoses more efficiently by analyzing symptoms.
Of course, providers don’t blindly rely upon AI for diagnoses; their own expertise remains the most important element in diagnosing. AI can also help clinicians generate highly customized, personal treatment plans for patients more efficiently.
“Artificial intelligence is being used more frequently in most aspects of society,” said Monique Winnett, clinical psychologist at St. Joseph’s Health. “This is also true within the field of behavioral health diagnosing and treatment. Using AI for assessing and providing support for behavioral health treatment does have its benefits. Use of AI can enhance access to screening of behavioral health disorders and a variety of supports and skills for managing symptoms.”
Winnett also sees benefits in helping patients feel seen and making them more likely to seek resources to improve their mental health. Earlier detection and treatment can lead to better recovery.
But Winnett cautioned that AI should not replace qualified treatment and that “algorithms used for AI cannot replace the human-to-human connection nor can AI address the nuances of behavioral health treatment. Consumers should also be aware of potential threats to privacy and that ethical considerations may not be addressed the same as they would be by a seasoned professional.”
