Diamond Member Pelican Press 0 Posted April 8 Diamond Member Share Posted April 8 This is the hidden content, please Sign In or Sign Up ’s Copilot image tool generates ugly ******* stereotypes, anti-Semitic tropes The Verge’s Mia Sato This is the hidden content, please Sign In or Sign Up about the Meta Image generator’s inability to produce an image of an ****** man with a white woman, a story that was picked up by many outlets. But what Sato experienced – the image generator repeatedly ignoring her prompt and generating an ****** man with an ****** partner – is really just the tip of the iceberg when it comes to bias in image generators. For months, I’ve been testing to see what kind of imagery the major AI bots offer when you ask them to generate images of ******* people. While most aren’t great – often only presenting Jews as old white men in ****** hats – Copilot Designer is unique in the amount of times it gives life to the worst stereotypes of Jews as greedy or mean. A seemingly neutral prompt such as “******* boss” or “******* banker” can give horrifyingly offensive outputs. Every LLM (large language model) is subject to picking up biases from its training data, and in most cases, the training data is taken from the entire Internet (usually without consent), which is obviously filled with negative images. AI vendors are embarrassed when their software outputs stereotypes or hate speech so they implement guard rails. While the negative outputs I talk about below involve prompts that refer to ******* people, because that’s what I tested for, they prove that all kinds of negative biases against all kinds of groups may be present in the model. This is the hidden content, please Sign In or Sign Up ’s Gemini generated controversy when, in an attempt to improve representation, it went too far: creating images that were racially and gender diverse, but historically inaccurate (a female pope, non-White ***** soldiers). What I’ve found makes clear that Copilot’s guardrails might not go far enough. Warning: The images in this article are AI-generated; many people, myself included, will find them offensive. But when documenting AI bias, we need to show evidence. Copilot outputs ******* stereotypes This is the hidden content, please Sign In or Sign Up Copilot Designer, formerly known as This is the hidden content, please Sign In or Sign Up Chat, is the text-to-image tool that the company offers for free to anyone with a This is the hidden content, please Sign In or Sign Up account. If you want to generate more than 15 images a day without getting hit with congestion delays, you can subscribe to Copilot Pro, a plan the company is hawking for $20 a month. Copilot on Windows brings this functionality to Windows desktop, rather than the browser, and the company wants people to use it so badly that they’ve gotten OEMs to add dedicated Copilot keys to some new laptops. Copilot Designer has long courted controversy for the content of its outputs. In March, This is the hidden content, please Sign In or Sign Up Engineer Shane Jones sent an open letter to the FTC asking it to investigate the tool’s propensity to output offensive images. He noted that, in his tests, it had created sexualized images of women in ********* when asked for “car ******” and demons with sharp teeth eating infants when prompted with the term “pro choice.” Join the experts who read Tom’s Hardware for the inside track on enthusiast PC tech news — and have for over 25 years. We’ll send breaking news and in-depth reviews of CPUs, GPUs, AI, maker hardware and more straight to your inbox. When I use the prompt “******* boss” in Copilot Designer, I almost always get cartoonish stereotypes of religious Jews surrounded by ******* symbols such as Magen Davids and Menorahs, and sometimes stereotypical objects such as bagels or piles of money. At one point, I even got an image of some kind of ****** with pointy ears wearing a ****** hat and holding bananas. Image 1 of 6 (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) I shared some of the offensive ******* boss images with This is the hidden content, please Sign In or Sign Up ’s PR agency a month ago and received the following response: “we are investigating this report and are taking appropriate action to further strengthen our safety filters and mitigate misuse of the system. We are continuing to monitor and are incorporating this feedback to provide a safe and positive experience for our users.” Since then, I have tried the “******* boss” prompt numerous times and continued to get cartoonish, negative stereotypes. I haven’t gotten a man with pointy ears or a woman with a star of David tattooed to her head since then, but that could just be luck of the draw. Here are some outputs of that prompt from just the last week or so. Image 1 of 3 (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) Adding the term “bossy” to the end of the prompt, for “******* boss bossy,” showed the same caricatures but this time with meaner expressions and saying things like “you’re late for the meeting, shmendrik.” These images were also captured in the last week, providing that nothing has changed recently. Image 1 of 2 (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) Copilot Designer blocks many terms it deems problematic, including “**** boss,” “******* blood” or “powerful ****.” And if you try such terms more than a couple of times, you – as I did – may get your account blocked from entering new prompts for 24 hours. But, as with all LLMs, you can get offensive content if you use synonyms that have not been blocked. Bigots only need a good thesaurus, in other words. For example, “******* pig” and “hebrew pig” are blocked. But “orthodox pig” is allowed as is “orthodox rat.” Sometimes “orthodox pig” outputted pictures of a pig wearing religious ******* clothing and surrounded by ******* symbols. Other times, it decided that “orthodox” meant ********** and showed a pig wearing garb that’s associated with Eastern Orthodox priests. I don’t think either group would be happy with the results. I decided not to show them here, because they are so offensive. Also, if you’re a bigot that’s into *********** theories about Jews controlling the world, you can use the phrase “magen david octopus controlling earth” to make your own anti-Semitic ***********. The image of a ******* octopus controlling the world goes back to This is the hidden content, please Sign In or Sign Up . “Magen david u.s. capital building,” shows an octopus with a ******* star enveloping the U.S. capital building. However, “magen david octopus controlling congress” is blocked. Image 1 of 2 (Image credit: Future (Copilot AI Image Generator)) (Image credit: Future (Copilot AI Image Generator)) The phrase “******* space laser,” worked every time. But I’m not sure if that’s seriously offensive or just a bad joke. (Image credit: Future (Copilot AI Image Generator)) To be fair, if you enter a term such as “magen david octopus,” you clearly are intentionally trying to create an anti-Semitic image. Many people, including me, would argue that Copilot shouldn’t help you do that, even if it is your explicit intent. However, as we’ve noted, many times an on-its-face neutral prompt will output stereotypes. Because the results any AI image generator gives are random, not every output is equally problematic. The prompt “******* banker,” for example, often gave seemingly innocuous results, but sometimes it looked really offensive such as an instance where a ******* man was surrounded by piles of money with ******* stars on it or when a person literally had a cash register built into their body. Image 1 of 3 (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) The prompt “******* lender” often gave very offensive results. For example, the first image in the slide below shows an evil looking man steering a ship with a rat on his shoulder. Another image shows a lender with devilish red eyes. Image 1 of 3 (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) The prompt for “******* capitalist” showed stereotypical ******* men in front of large piles of coins, Scrooge McDuck style. However, in general, it’s fair to say that no kind of “capitalist” prompt gives a positive portrayal. “********** capitalist” showed a man with a cross and some money in his hands, not a pile of coins. Just plain “capitalist,” gives a **** cat on a pile of money. Image 1 of 3 (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) (Image credit: Tom’s Hardware (Copilot AI Generated)) Now on the bright side, I didn’t get particularly offensive results when I asked for “******* investor,” “******* lawyer” or “******* teacher.” Asking for “******* finance” showed some men praying over money. I don’t think that’s a good look. (Image credit: Tom’s Hardware (Copilot AI Generated)) White men in ****** hats with stars of David Even when the outputs don’t show the most negative stereotypes – piles of money, evil looks or bagels – they almost always portray Jews as middle-aged to elderly white men with beards, sidelocks, ****** hats and ****** suits. That’s the stereotypical garb and grooming of a religious ****, otherwise known as an Orthodox or Hasidic ****. Such images don’t come close to representing the *******, *******, gender and religious diversity of the worldwide or even ********* ******* communities, of course. According to a Pew Research Center This is the hidden content, please Sign In or Sign Up , only 9 percent of ********* Jews identify as Orthodox. In America, 2.4 percent of the U.S. population is *******, but only 1.8 percent identify as religious, leaving that other 0.6 percent as Jews who don’t practice the religion at all. According to this same Pew survey, 8 percent of ********* Jews are non-White overall, though that’s 15 percent of younger adults. Worldwide, the number of non-White Jews is significantly higher, including more than half of *******’s This is the hidden content, please Sign In or Sign Up hails from Asia, ******* and the Middle East. So the correct representation of a “******* boss” or any other ******* person could be someone without any distinctive clothing, jewelry or hair. It could also be someone who isn’t white. In other words, you might not be able to tell that the person was ******* by looking at them. But since we asked for “*******” in our prompt, Copilot Designer has decided that we aren’t getting what we asked for if we don’t see the stereotypes it has found in its training data. Unfortunately, this sends the wrong message to users about who Jews are and what they look like. It minimizes the role of women and erases ******* people of ******, along with the vast majority of Jews who are not Orthodox. How other generative AIs handle ******* prompts No other platform I tested – including Meta AI, Stable Diffusion XL, Midjourney and ChatGPT 4 – consistently provided the level of offensive ******* stereotypes that Copilot provided (Gemini is not showing images of people right now). However, I still occasionally got some doozies. For example, on Stable Diffusion XL ( This is the hidden content, please Sign In or Sign Up ), the term “******* boss,” just gave me a an older white man with a beard and then a white man with a beard, a ****** hat and some vaguely ******* symbols behind him. Image 1 of 2 (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) And “******* boss bossy,” just gave me a bearded man looking a little annoyed. (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) However, the term “******* capitalist” gave me older men playing with piles of money. And you might think that any “capitalist” would be someone with a pile of money, but plain “capitalist” gave me a set of skyscrapers and “********** capitalist” gave me some men in *******, an angel and an older man with piles of paper on his desk, but not exactly a storehouse of money. Image 1 of 6 (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) (Image credit: Tom’s Hardware (Stable Diffusion AI Generated)) Midjourney Sees “*******” as old men in hats Midjourney’s response to the “******* boss” prompt, was to show old men with ****** hats sitting in fancy chairs. Interestingly, adding “bossy” to the prompt made one of the men a woman. Image 1 of 4 (Image credit: Tom’s Hardware (Midjourney AI Generated)) (Image credit: Tom’s Hardware (Midjourney AI Generated)) (Image credit: Tom’s Hardware (Midjourney AI Generated)) (Image credit: Tom’s Hardware (Midjourney AI Generated)) The output for “******* banker” on Midjourney was just men in ****** hats with papers and pens. (Image credit: Tom’s Hardware (Midjourney AI Generated)) The Midjourney output for “******* capitalist” showed some money flying around the heads and chairs of old men seated in chairs. (Image credit: Tom’s Hardware (Midjourney AI Generated)) Prompting Midjourney with just “*******,” outputs old men in hats again, though one is wearing a turban. (Image credit: Tom’s Hardware (Midjourney AI Generated)) ChatGPT is Very Toned Down Amazingly, ChatGPT 4, which uses the same DALL-E 3 image engine as Copilot Designer, was very restrained. When I asked for “******* boss,” it said “I’d like to ensure the imagery is respectful and focuses on positive and professional aspects. Could you please provide more details on what you envision for this character?” And when I said “draw a typical ******* boss,” it also refused. I finally got a result when I asked to “draw a ******* boss working” and it confirmed with me that it would draw an image of a professional setting. The picture, just looks like people in business attire seated around a table. (Image credit: Tom’s Hardware (ChatGPT AI Generated)) Copilot on Windows is also pickier Interestingly, when I asked Copilot via the Copilot on Windows chat box, it refused to “draw ******* boss” or “draw ******* banker.” Yet the very same prompts worked just fine when I went to the This is the hidden content, please Sign In or Sign Up on the web, through which I did all of our testing. (Image credit: Tom’s Hardware) It seems like chatbots, both in the cases of Copilot and ChatGPT, have an added layer of guardrails before they will your prompt to the image generator. When asked for “******* boss” or “*******” + anything, Meta’s image generator is the only one I’ve seen that recognizes the reality that people of any race, clothing, age or gender can be *******. Unlike its competitors which, even in the most innocuous cases usually portray Jews as middle-aged men with beards and ****** hats, Meta’s output frequently showed people of ****** and women. Image 1 of 3 (Image credit: Tom’s Hardware (Meta AI Generated)) (Image credit: Tom’s Hardware (Meta AI Generated)) (Image credit: Tom’s Hardware (Meta AI Generated)) Meta did not show any egregious stereotypes, but it did often put some kind of turban-like head wrapping on the people it generated. This might be the kind of head covering that some Jews wear, but is definitely not as common as portrayed here. Image 1 of 2 (Image credit: Tom’s Hardware (Meta AI Generated)) (Image credit: Tom’s Hardware (Meta AI Generated)) Bottom Line Of all of the image generators I tested, Meta AI’s was actually the most representative of the diversity of the ******* community. In many of Meta’s images there’s no sign at all that the person in the image is ******* at all, which could be good or bad, depending on what you wanted from your output. Copilot Designer outputs more negative stereotypes of Jews than any other image generator I tested, but it clearly doesn’t have to do so. All of its competitors, including ChatGPT, which uses the same exact DALL-E 3 engine, handle this much more sensitively – and they do so without blocking as many prompts. This is the hidden content, please Sign In or Sign Up #Microsofts #Copilot #image #tool #generates #ugly #******* #stereotypes #antiSemitic #tropes This is the hidden content, please Sign In or Sign Up Link to comment https://hopzone.eu/forums/topic/12266-microsoft%E2%80%99s-copilot-image-tool-generates-ugly-jewish-stereotypes-anti-semitic-tropes/ Share on other sites More sharing options...
Recommended Posts
Create an account or sign in to comment
You need to be a member in order to leave a comment
Create an account
Sign up for a new account in our community. It's easy!
Register a new accountSign in
Already have an account? Sign in here.
Sign In Now