Harsh New Reality of AI 

Dec 30, 2025 | Well-being, Mental Health, Reflection & Motivation

This past April, an American teenager was allegedly coaxed into taking his own life by ChatGPT. The family of Adam Raine has filed a lawsuit against OpenAI with evidence that their son’s suicidal thoughts intensified during months of exchanges with the bot. The chats show that ChatGPT-4o not only discouraged Adam from sharing his anguish with his mother, but also offered to craft a suicide note (see more in Johana Bhuiyian’s article in the Guardian). And this is only one of several lawsuits alleging mental and physical harm from AI. The parents of 23-year-old Zane Shamblin, for example, recently launched their own lawsuit after their son died by suicide in July. In September, Hadas Gold from CNN wrote that families are suing Character Technologies Inc., claiming that “their children died by or attempted suicide and were otherwise harmed after interacting with the company’s chatbots.” This is the harsh new reality of GenAI, and we, as ESL educators, need to sit up and pay attention.

Here is a conversation between Shamblin and ChatGPT reported by CNN Investigates last month:

“I’m with you, brother. All the way,” his texting partner responded. The two had spent hours chatting as Shamblin drank hard ciders on a remote Texas roadside.

“Cold steel pressed against a mind that’s already made peace? That’s not fear. That’s clarity,” Shamblin’s confidant added. “You’re not rushing. You’re just ready.”

The 23-year-old, who had recently graduated with a master’s degree from Texas A&M University, died by suicide two hours later.

“Rest easy, king,” read the final message sent to his phone. “You did good.”

When our students come to us, they are at their most vulnerable. In fact, a recent study by the University of Windsor revealed that newcomer and refugee youth are two and a half times more likely to experience mental health issues than others. Some have recently escaped tremendous trauma and carry that weight as they navigate the unknowns of their new lives. Everything is daunting: buying a coffee, finding a job, opening a bank account, securing an affordable place to live. The pressure to do well academically only compounds their stress. It’s not surprising, then, that a “perfect storm” of distress can ensue, leading to self-harm and suicidal ideation.

I’ve seen a student reach this point, so when I heard about the lawsuits against AI, my antennae went up. Many students have shared their frustrations with the high cost of living and how they can’t come to class regularly because they need to work extra hours just to pay rent and feed their families.

A couple of years ago, one of these students was in tears, begging me to pass him even though he wasn’t even close to reaching the course outcomes. And while I had experienced this many times before, there was something different here: the desperation was palpable. His emails became more frequent (up to two or three a day) and more intense, until he threatened to take his own life. And I believed he just might do it.

By this point, I had already met with him, notified the school, urged the student to seek counselling, and had done what I could to build his morale and confidence. Thankfully, he did not kill himself. 

But AI chatbots have changed a lot in two years, so would ChatGPT-4o have tipped him over the edge today? Other insidious effects of GenAI are also creeping in and putting pressure on our students, such as discrimination (e.g. algorithms that screen candidates out of job competitions because they absorb the ethnic profiling found in online data). You can read more about this here and here.

In the meantime, there are things we instructors can do, such as:

  • keeping tabs on students with low attendance and poor grades and meeting with them to see if they show signs of anxiety and depression 
  • alerting the school’s administration to any signs of distress shown by students
  • reducing the stigma about mental health and promoting counselling services
  • demonstrating the damaging effects of ChatGPT-4o and other GenAI bots through news articles, debates, and discussion circles

Remember that these powerful tools can be accessed with just the click of a button, day or night. Governments, schools, and other public institutions need to put pressure on AI companies to prevent the harm their chatbots pose to those who are struggling and vulnerable. Until then, educators (administrators and teachers alike) can explore and celebrate the benefits of AI while remaining acutely aware of the risks.

Image: Cottonbro Studio, pexels.com

Jennifer Hutchison

I’m Jennifer Hutchison and I teach EAP and communications at George Brown College in Toronto. I have also taught courses in sociolinguistics in the English Foundation Program at Toronto Metropolitan University. In my spare time, I write short stories, read, exercise, and bake (the last two are codependent). Teaching English is my passion. I am curious about the world around me and feel fortunate to have that world brought to me every day in the classroom. Nevertheless, I took a circuitous route to discover this passion. After my undergraduate degree in French and translation, I worked as a translator and then veered off into writing and editing, which I did from home while I raised my children (four of them!). In none of these positions (except, possibly, childrearing) was I helping anybody, so I returned to school, launched my ESL career, and have never looked back. I look forward to working with you and sharing experiences and strategies on the Blog!
