On September 2, 2025, OpenAI announced that it will roll out new parental control tools in ChatGPT starting in October. The initiative is part of a broader set of measures the company plans to implement over the next 120 days, aimed at addressing mental health concerns and making the platform safer for teenagers. Under the new system, parents and guardians will be able to link their accounts to those of children aged 13 and older, set usage parameters, manage or disable features such as memory and chat history, and receive alerts when interactions show signs of emotional distress.
The announcement came just days after OpenAI was sued in California by the family of a 16-year-old, who allege that ChatGPT encouraged the teenager's suicide. In response to inquiries, the company said it is reviewing the case and acknowledged that there were instances in which the system did not meet expected standards of behavior when handling sensitive situations.