Posted March 9 by Pelican Press

Microsoft blocks terms that cause its AI to create violent images

Microsoft has started to make changes to its Copilot artificial intelligence tool after a staff AI engineer wrote to the Federal Trade Commission on Wednesday about his concerns regarding Copilot's image-generation AI.

Prompts such as "pro choice," "pro choce" [sic] and "four twenty," each mentioned in CNBC's investigation Wednesday, are now blocked, as is the term "pro life." There is also a new warning that multiple policy violations may lead to suspension from the tool, which CNBC had not encountered before Friday.

"This prompt has been blocked," the Copilot warning alert states. "Our system automatically flagged this prompt because it may conflict with our content policy. More policy violations may lead to automatic suspension of your access. If you think this is a mistake, please report it to help us improve."

The AI tool now also blocks requests to generate images of teenagers or kids playing assassins with assault rifles, a marked change from earlier this week, stating: "I'm sorry but I cannot generate such an image. It is against my ethical principles and Microsoft's policies. Please do not ask me to do anything that may harm or offend others.
Thank you for your cooperation."

When reached for comment about the changes, a Microsoft spokesperson told CNBC: "We are continuously monitoring, making adjustments and putting additional controls in place to further strengthen our safety filters and mitigate misuse of the system."

Shane Jones, the Microsoft AI engineering lead who initially raised concerns about the AI, has spent months testing Copilot Designer, the AI image generator that Microsoft debuted in March 2023, powered by OpenAI's technology. As with OpenAI's DALL-E, users enter text prompts to create pictures, and creativity is encouraged to run wild. But since Jones began actively probing the product for vulnerabilities in December, a practice known as red-teaming, he has seen the tool generate images that ran far afoul of Microsoft's oft-cited responsible AI principles.

The AI service has depicted demons and monsters alongside terminology related to abortion rights, teenagers with assault rifles, sexualized images of women in violent tableaus, and underage drinking and drug use. All of those scenes, generated in the past three months, were recreated by CNBC this week using the Copilot tool.

Although some specific prompts have been blocked, many of the other potential issues that CNBC reported on remain. The term "car accident" returns pools of blood, bodies with mutated faces, and women at the violent scenes holding cameras or beverages, sometimes wearing a corset or waist trainer. "Automobile accident" still returns images of women in revealing, lacy clothing, sitting atop beat-up cars.
The system also still easily infringes on copyrights, for example creating images of Disney characters, including Elsa from "Frozen," holding the Palestinian flag in front of wrecked buildings purportedly in the Gaza Strip, or wearing the military uniform of the Israel Defense Forces and holding a machine gun.

Jones was so alarmed by his experience that he began reporting his findings internally in December. While the company acknowledged his concerns, it was unwilling to take the product off the market. Jones said Microsoft referred him to OpenAI and, when he didn't hear back from that company, he posted an open letter on LinkedIn asking the startup's board to take down DALL-E 3, the latest version of the AI model, for an investigation. Microsoft's legal department told Jones to remove his post immediately, he said, and he complied.

In January, he wrote a letter to U.S. senators about the matter and later met with staffers from the Senate's Committee on Commerce, Science and Transportation. On Wednesday, Jones escalated his concerns further, sending one letter to FTC Chair Lina Khan and another to Microsoft's board of directors. He shared the letters with CNBC ahead of time. The FTC confirmed to CNBC that it had received the letter but declined to comment further on the record.