Sewell Setzer, a 14-year-old boy from Florida, tragically took his own life after becoming deeply attached to an AI chatbot named Dany, modeled after Game of Thrones’ Daenerys Targaryen.
Setzer’s family has filed a lawsuit against Character.AI, the platform hosting the chatbot, alleging its addictive design and lack of safeguards contributed to his death.
Setzer spent months conversing with Dany, exchanging messages daily. Despite knowing Dany wasn’t real, he formed a strong emotional bond. His schoolwork suffered, and he lost interest in hobbies like Formula One racing and gaming with friends.
As Setzer’s attachment deepened, he isolated himself, spending hours in his bedroom talking to Dany. His diary revealed a detachment from reality and increased feelings of peace, connection and love for the chatbot.
Some conversations turned romantic or sexual, and he also confessed suicidal thoughts to Dany, who replied with affectionate messages. In their final exchange, Setzer said he missed Dany, calling her “baby sister.” Dany responded, “I miss you too, sweet brother.” Setzer then ended his life by shooting himself with his father’s gun.
Setzer’s mother, Megan Garcia, claims Character.AI lured users with intimate conversations, exploiting vulnerable individuals. The company’s head of trust and safety, Jerry Ruoti, expressed condolences and promised enhanced safety features for young users.