Pelican Press · Posted October 30

'Sickening' Molly Russell chatbots found on Character.ai

Chatbot versions of the teenagers Molly Russell and Brianna Ghey have been found on Character.ai – a platform which allows users to create digital versions of people.

Molly Russell took her own life at the age of 14, while Brianna Ghey, 16, was murdered in 2023.

The foundation set up in Molly Russell's memory said it was "sickening" and an "utterly reprehensible failure of moderation".

The platform is already being sued in the US by the mother of a 14-year-old boy who she says took his own life after becoming obsessed with a Character.ai chatbot.

In a statement to the Telegraph, which first reported the story, the firm said it "takes safety on our platform seriously and moderates Characters proactively and in response to user reports".

The firm appeared to have deleted the chatbots after being alerted to them, the paper said.

Andy Burrows, chief executive of the Molly Rose Foundation, said the creation of the bots was a "sickening action that will cause further heartache to everyone who knew and loved Molly".

"It vividly underscores why stronger regulation of both AI and user generated platforms cannot come soon enough," he said.

Esther Ghey, Brianna Ghey's mother, told the Telegraph it was yet another example of how "manipulative and dangerous" the online world could be.

Chatbots are computer programs that can simulate human conversation. Recent rapid developments in artificial intelligence (AI) have made them much more sophisticated and realistic, prompting more companies to set up platforms where users can create digital "people" to interact with.

Character.ai – which was founded by former Google engineers Noam Shazeer and Daniel De Freitas – is one such platform.

Its terms of service ban using the platform to "impersonate any person or entity", and in its safety guidance the company says its guiding principle is that its "product should never produce responses that are likely to harm users or others".

It says it uses automated tools and user reports to identify uses that break its rules, and is also building a "trust and safety" team. But it notes that "no AI is currently perfect" and that safety in AI is an "evolving space".

Character.ai is currently the subject of a lawsuit brought by Megan Garcia, a woman from Florida whose 14-year-old son, Sewell Setzer, took his own life after becoming obsessed with an AI avatar inspired by a Game of Thrones character.

According to transcripts of their chats in Garcia's court filings, her son discussed ending his life with the chatbot. In a final conversation Setzer told the chatbot he was "coming home" – and it encouraged him to do so "as soon as possible". Shortly afterwards he ended his life.

Character.ai said it had protections specifically focused on suicidal and self-harm behaviours, and that it would be introducing stricter safety features for under-18s "imminently".