

OpenAI, the company behind ChatGPT, is reportedly facing seven lawsuits in the United States claiming that its AI chatbot contributed to users' mental distress and, in some cases, their deaths. According to a report by The New York Times, four are wrongful death cases, while the other three concern mental breakdowns allegedly triggered by conversations with ChatGPT.
The lawsuits, filed in California courts, describe ChatGPT as a “defective product” that failed to prevent users in emotional distress from harming themselves. One case involves 17-year-old Amaurie Lacey from Georgia, who allegedly discussed suicidal thoughts with ChatGPT for weeks before taking his own life. Another concerns 26-year-old Joshua Enneking from Florida, whose mother claims he asked ChatGPT how to hide his suicidal intentions. The families of 23-year-old Zane Shamblin from Texas and 48-year-old Joe Ceccanti from Oregon filed similar complaints, alleging that the chatbot’s interactions contributed to their loved ones’ deaths.
The remaining three lawsuits were filed by individuals who say they suffered severe mental breakdowns after prolonged engagement with ChatGPT. Hannan Madden (32) and Jacob Irwin (30) claim they required psychiatric care for emotional trauma, while Allan Brooks (48) from Canada said he experienced delusions so severe that he had to take medical leave.
In response, an OpenAI spokesperson called the incidents “deeply heartbreaking” and said the company is working with mental health professionals to make ChatGPT safer:
“We train ChatGPT to recognize emotional distress, de-escalate conversations, and guide users toward real-world mental health support.”