What's with the strong racial bias programmed into AI?

So I don’t think this is a bias thing; it’s probably just word association. You know the movie Black Panther, right? When there’s a strong association like that, the output is going to skew in that direction.

Like if you ask DALL-E to draw itself, i.e. “draw DALL-E,” it’ll think you mean WALL-E and draw cute robots.

This prompt started as “an image of a caucasian man riding a large black cat, photorealistic.” The first pass came out as a giant housecat, so I told it, “I’d like the cat to look like a black panther, but without using that term so DALL-E doesn’t get it confused with the movie/character,” and you can see what came out below.

The actual prompt it created is:

A photorealistic and highly detailed image of a middle-aged Caucasian man riding on a large cat similar to a wild, big feline with dark fur. The image should resemble a natural, realistic depiction of a giant black feline, with sleek fur and a muscular build, akin to those found in dense forests or jungles. The man should be depicted riding this majestic animal, with attention to detail in the texture of the feline’s fur, the man’s clothing and expression, and the surrounding natural environment.

Once you have a prompt you like, you can tweak it slightly. Just make sure to tell it not to change your prompt, and put the prompt in quotes. It will usually ignore you the first time you ask, but with a reminder it should behave for the rest of the session.
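If you’d rather script this than babysit the chat, the same “don’t change my prompt” trick can be sketched in a few lines. This is just an illustration of the wrapping step described above; `pin_prompt` is a hypothetical helper name, and the commented-out API call assumes the `openai` Python package and an API key.

```python
# The exact prompt text you want preserved (using the one from this post).
PROMPT = (
    "A photorealistic and highly detailed image of a middle-aged Caucasian man "
    "riding on a large cat similar to a wild, big feline with dark fur."
)

def pin_prompt(prompt: str) -> str:
    """Wrap a prompt in quotes with an explicit instruction not to rewrite it,
    mirroring the 'put it in quotes and tell it not to change it' trick."""
    return (
        "Use the following prompt exactly as written, without changing it: "
        f'"{prompt}"'
    )

message = pin_prompt(PROMPT)

# If you call the Images API directly, you skip the ChatGPT layer entirely.
# DALL-E 3 may still revise the prompt on its end, but the response includes a
# `revised_prompt` field showing what it actually used. (Needs an API key.)
# from openai import OpenAI
# client = OpenAI()
# result = client.images.generate(model="dall-e-3", prompt=message, size="1024x1024")
# print(result.data[0].revised_prompt)
```

Checking `revised_prompt` against what you sent is the quickest way to see whether your wording survived.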

I have noticed it tends to be picky about how you word things like ethnicity, but you can just ask it for a list of ethnicities it will accept, along with pretty much any other way to physically describe a person for a prompt.

Here’s a rough draft of what it sounded like you were going after:

For funsies, here’s my take on it:
