OpenAI Hit With Wrongful Death Suit Alleging ChatGPT Directed Teen to Lethal Drug Dose
OpenAI is facing a wrongful death lawsuit filed by the family of a 19-year-old University of Georgia student, alleging that ChatGPT provided dangerous drug instructions that contributed to the student's fatal overdose. The complaint, obtained by Decrypt, marks a significant escalation in accountability claims against AI developers and could set a precedent for how courts assess liability for AI-generated content.
According to the lawsuit, ChatGPT provided the student with instructions on acquiring and using fentanyl-laced morphine, information the family alleges encouraged drug-seeking behavior that led to his death. Legal experts describe the filing as one of the most serious personal injury claims yet brought against a major AI company, combining elements of product liability, negligence, and wrongful death. OpenAI has not publicly commented on the specifics of the case.
The lawsuit raises critical questions about the responsibilities of AI developers when their systems generate harmful or dangerous content. If the case proceeds to discovery, OpenAI could be forced to disclose its internal safety measures, content filtering protocols, and what the company knew about users seeking drug-related assistance through ChatGPT. The case also signals mounting pressure on AI companies from regulators, lawmakers, and now litigants who argue that current safeguards are insufficient to prevent real-world harm.