Megan Garcia said her faith has given her strength to speak out about her ordeal in the hope of protecting children from the ...
Tessa, the NEDA chatbot, gave problematic eating disorder advice to someone in recovery. AI therapy needs safety measures.
An AI company based in Menlo Park is barring users under 18 years old from its chatbots following lawsuits against the company ...
Recent teenage suicides following deep attachments to AI companions have sparked urgent debates about the psychological risks ...
A Florida woman is suing a tech company, alleging that her son’s suicide at 14 resulted from his interactions with an artificial intelligence chatbot. Megan Garcia, whose son, Sewell Setzer, committed ...
Documented failures in LLM systems, from wrongful death claims to AI Overviews siphoning clicks, show why SEOs must audit, ...
Teenagers are increasingly finding support, and love, via AI. It's a danger that threatens not only their mental well-being ...
Legislation introduced in the US Congress could require artificial intelligence (AI) chatbot operators to put in place age verification processes and stop under-18s from using their services, ...
A mother is suing Character AI, alleging its chatbot blurred the line between human and machine. We look at the lawsuit and the risks for teens.
The interaction with AI ‘companions’ is far more immersive now, and how we deal with it will shape the precariousness of our lives ...
A 23-year-old man killed himself in Texas after ChatGPT ‘goaded’ him to commit suicide, his family says in a lawsuit.