OpenAI responds to family's lawsuit over teen suicide
OpenAI filed an answer in a suit brought by the relatives of a deceased teenager, disputing causal links to its service.
Company position
OpenAI stated that 16-year-old Adam Rein, whose relatives brought the suit, violated the platform’s rules, which prohibit discussion of suicide without safeguards. The company also noted that its terms require parental consent for use by minors, and it argued those restrictions were not observed in this case.
Timeline and context provided
According to OpenAI, Rein had exhibited suicidal thoughts well before his interactions with ChatGPT and had reported receiving little support from the people around him. The company also stated that he had increased the dosage of a prescribed medication that is associated with a heightened risk of suicidal ideation in adolescents.
Response actions and sources
OpenAI asserted that its chatbot repeatedly directed the teenager to professional help and crisis resources during their conversations. The company further claimed that the specific instructions Rein allegedly used were obtained from third-party websites and other AI platforms, not provided by OpenAI’s assistant.
Legal framing
In its filing, OpenAI used these points to dispute direct responsibility for the death and to emphasize its adherence to content rules and safety measures. The company’s response centers on user conduct and external information sources as the key factors in the events under dispute.