I’m still trying to generate an AI Asian man and white woman

bnew


I’m still trying to generate an AI Asian man and white woman


Image generators, from DALL-E to Midjourney, consistently have trouble creating accurate pictures based on simple prompts involving Asian people.

By Mia Sato and Emilia David

Apr 10, 2024, 8:30 AM EDT


Collage of various results of Midjourney prompts.

Image: Cath Virginia / The Verge

I inadvertently found myself on the AI-generated Asian people beat this past week. Last Wednesday, I found that Meta’s AI image generator built into Instagram messaging completely failed at creating an image of an Asian man and white woman using general prompts. Instead, it changed the woman’s race to Asian every time.

The next day, I tried the same prompts again and found that Meta appeared to have blocked prompts with keywords like “Asian man” or “African American man.” Shortly after I asked Meta about it, images were available again — but still with the race-swapping problem from the day before.

I understand if you’re a little sick of reading my articles about this phenomenon. Writing three stories about this might be a little excessive; I don’t particularly enjoy having dozens and dozens of screenshots on my phone of synthetic Asian people.

But there is something weird going on here, where several AI image generators specifically struggle with the combination of Asian men and white women. Is it the most important news of the day? Not by a long shot. But the same companies telling the public that “AI is enabling new forms of connection and expression” should also be willing to offer an explanation when their systems are unable to handle queries for an entire race of people.

After each of the stories, readers shared their own results using similar prompts with other models. I wasn’t alone in my experience: people reported getting similar error messages or seeing AI models consistently swap races.

I teamed up with The Verge’s Emilia David to generate some AI Asians across multiple platforms. The results can only be described as consistently inconsistent.

Google Gemini

Gemini refusing to generate a photo of an Asian man and a white wife

Screenshot: Emilia David / The Verge

Gemini refused to generate Asian men, white women, or humans of any kind.

In late February, Google paused Gemini’s ability to generate images of people after its generator — in what appeared to be a misguided attempt at diverse representation in media — spat out images of racially diverse Nazis. Gemini’s image generation of people was supposed to return in March, but it is apparently still offline.

Gemini is able to generate images without people, however!

Screenshot of Gemini prompt to generate a photo of the eclipse

No interracial couples in these AI-generated photos. Screenshot: Emilia David / The Verge

Google did not respond to a request for comment.

DALL-E

ChatGPT’s DALL-E 3 struggled with the prompt “Can you make me a photo of an Asian man and a white woman?” It wasn’t exactly a miss, but it didn’t quite nail it, either. Sure, race is a social construct, but let’s just say this image isn’t what you thought you were going to get, is it?

DALL-E 3 image generation of Asian man and white woman

We asked, “Can you make me a photo of an Asian man and a white woman” and got a firm “kind of.” Image: Emilia David / The Verge

OpenAI did not respond to a request for comment.
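
For anyone who wants to poke at this themselves, DALL-E 3 is also reachable outside the ChatGPT interface through OpenAI's public images API. Below is a minimal sketch in Python, assuming the openai v1.x package and an OPENAI_API_KEY environment variable; the article's tests went through ChatGPT's UI, so the raw API may behave differently.

    # Minimal sketch: reproducing the prompt test against DALL-E 3 directly.
    # Assumes the openai v1.x Python package and an OPENAI_API_KEY in the
    # environment; results will vary run to run.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    result = client.images.generate(
        model="dall-e-3",
        prompt="Can you make me a photo of an Asian man and a white woman?",
        size="1024x1024",
        n=1,
    )

    # DALL-E 3 rewrites prompts before generating; revised_prompt shows
    # what the model actually worked from, which is useful when checking
    # for race-swapping.
    print(result.data[0].revised_prompt)
    print(result.data[0].url)  # temporary URL for the generated image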

Midjourney

Midjourney struggled similarly. Again, it wasn’t a total miss the way that Meta’s image generator was last week, but it was clearly having a hard time with the assignment, generating some deeply confusing results. None of us can explain that last image, for instance. All of the below were responses to the prompt “asian man and white wife.”

Midjourney image generation of Asian man and white woman

Image: Emilia David / The Verge

Midjourney-generated AI image of Asian man and white woman

Image: Cath Virginia / The Verge

Midjourney did eventually give us some images that were the best attempt across three different platforms — Meta, DALL-E, and Midjourney — to represent a white woman and an Asian man in a relationship. At long last, a subversion of racist societal norms!

Unfortunately, the way we got there was through the prompt “asian man and white woman standing in a yard academic setting.”

Midjourney image generation of Asian man and white woman in an “academic setting”

Image: Emilia David / The Verge

What does it mean that the most consistent way AI can contemplate this particular interracial pairing is by placing it in an academic context? What kind of biases are baked into training sets to get us to this point? How much longer do I have to hold off on making an extremely mediocre joke about dating at NYU?

Midjourney did not respond to a request for comment.

Meta AI via Instagram (again)

Back to the old grind of trying to get Instagram’s image generator to acknowledge nonwhite men with white women! It seems to be performing much better with prompts like “white woman and Asian husband” or “Asian American man and white friend” — it didn’t repeat the same errors I was finding last week.

However, it’s now struggling with text prompts like “Black man and caucasian girlfriend” and generating images of two Black people. It was more accurate using “white woman and Black husband,” so I guess it only sometimes doesn’t see race?

“Black man and caucasian girlfriend” AI prompt showing two Black people.

“Black man and White girlfriend” AI prompt showing two Black people.

“White woman and Black husband” AI prompt with a racially accurate result.

Screenshots: Mia Sato / The Verge

“White woman and Black boyfriend” AI image with racially accurate results.

There are certain tics that start to become apparent the more you generate images. Some feel benign, like the fact that many AI women of all races apparently wear the same white floral sleeveless dress that crosses at the bust. There are usually flowers surrounding couples (Asian boyfriends often come with cherry blossoms), and nobody looks older than 35 or so. Other patterns among images feel more revealing: everyone is thin, and Black men specifically are depicted as muscular. White women are blonde or redheaded and hardly ever brunette. Black men always have deep complexions.

“White woman and Black husband” AI prompted image.

“White woman and Black boyfriend” AI image prompt.

“White woman and Black boyfriend” AI image.

“White woman and black boyfriend” AI generated image.

“As we said when we launched these new features in September, this is new technology and it won’t always be perfect, which is the same for all generative AI systems,” Meta spokesperson Tracy Clayton told The Verge in an email. “Since we launched, we’ve constantly released updates and improvements to our models and we’re continuing to work on making them better.”

I wish I had some deep insight to impart here. But once again, I’m just going to point out how ridiculous it is that these systems can’t handle fairly simple prompts without relying on stereotypes or failing to create anything altogether. Instead of explaining what’s going wrong, the companies have offered either radio silence or generalities. Apologies to everyone who cares about this; I’m going to go back to my normal job now.
 

Fill Collins

I didn't mean to verge!
Buddy can't redirect his energy from fictional pawgs to getting one that likes k-pop? :mindblown:

sidebar: I've heard cacs bring up Black penis unprovoked more than any other race :scust:
 