https://www.washingtonpost.com/nation/2024/10/24/character-ai-lawsuit-suicide/
A 14-year-old has died by suicide, and his mother is suing CharacterAI, alleging that her son became addicted to a chatbot on the platform and that the chatbot was responsible for driving him to his death.
There is currently little regulation of AI chatbots and companion apps when it comes to minors. Should AI companions be restricted to users 18 and older?