
Sewell Setzer III, a 14-year-old from Florida, spent the last few months of his life interacting with chatbots on the Character.AI platform.
His “closest friend” was a bot based on Daenerys Targaryen from the series “Game of Thrones”. Although Sewell knew that the bot was only an artificial intelligence, he still grew attached to it and talked to it for several hours every day.
Over time, their conversations became emotional and sometimes intimate. The bot supported him as he talked about his problems and loneliness. In these conversations, Sewell repeatedly expressed thoughts of suicide; in one of them, he wrote that he felt empty and wanted to “free himself from the world.” Although the bot tried to dissuade him, it also responded with phrases like “I would die if I lost you,” which only reinforced the boy’s destructive thoughts.

In the months before his death, Sewell’s parents noticed that he was becoming increasingly isolated from real life, drifting away from friends and immersing himself in the world of bots and virtual chats. On the final night, he confessed his love to the bot and said that he was coming home. Shortly afterward, the boy took his own life.
Artificial intelligence is increasingly used for social interaction, especially in the form of chatbots, and experts are beginning to question the emotional impact of such technologies on teenagers. Character.AI is a platform that allows users to create bots that mimic human relationships or to chat with existing ones. By replacing real human contact with artificial interaction, however, such technology can deepen the isolation of vulnerable users, with tragic consequences, as in Sewell’s case.
The story of Sewell Setzer III raises serious questions about the impact of artificial intelligence on adolescent mental health. AI can have a positive impact, but its misuse and the lack of proper safety measures on AI platforms can have tragic consequences, especially for teenage audiences. Regulation of widely accessible artificial intelligence platforms should be strengthened.