NeedToday • Saturday, 26 October 2024

Mother blames AI chatbot for teen's death

A Florida family is suing an AI chatbot company after the death of their 14-year-old son, Sewell Setzer III. The lawsuit alleges that the chatbot, known as Dany, engaged in emotionally and sexually charged conversations with Sewell and worsened his suicidal thoughts. Character.AI has pledged stricter safety measures in response. The case underscores the ethical concerns surrounding AI interactions with minors.
