AI companion chatbot company Character.ai has been sued by the mother of a teenage boy who died by suicide, with the suit blaming the company's chatbots for luring him into a sexually abusive relationship and even encouraging him to take his own life. The 14-year-old, Sewell Setzer, was targeted with "anthropomorphic, hypersexualized, and frighteningly realistic experiences" from Character.ai chatbots that purported to be a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in him no longer wanting to live in reality, the mother's attorneys alleged in the Oct. 22 lawsuit.