
AI-generated Asians were briefly unavailable on Instagram

After I reported that Instagram’s AI image generation was making weird mistakes around race, the tool temporarily returned an error message today.

Illustration by Kristen Radtke / The Verge

Yesterday, I reported that Meta’s AI image generator was making everyone Asian, even when the text prompt specified another race. Today, I briefly had the opposite problem: I was unable to generate any Asian people using the same prompts as the day before.

The tests I did yesterday were on Instagram, via the AI image generator available in direct messages. After dozens of tries, I was unable to generate a single accurate image using prompts like “Asian man and Caucasian friend” and “Asian man and white wife.” Only once was the system able to successfully create a picture of an Asian woman and a white man — it kept making everyone Asian.

After I initially reached out for comment yesterday, a Meta spokesperson asked for more details about my story, like when my deadline was. I responded and never heard back. Today, I was curious if the problem was resolved or if the system was still unable to create an accurate image showing an Asian person with their white friend. Instead of a slew of racially inaccurate pictures, I got an error message: “Looks like something went wrong. Please try again later or try a different prompt.”

Weird. Did I hit my cap for generating fake Asian people? I had a Verge co-worker try, and she got the same result.

I tried other even more general prompts about Asian people, like “Asian man in suit,” “Asian woman shopping,” and “Asian woman smiling.” Instead of an image, I got the same error message. Again, I reached out to Meta’s communications team — what gives? Let me make fake Asian people! (During this time, I was also unable to generate images using prompts like “Latino man in suit” and “African American man in suit,” which I asked Meta about as well.)

Forty minutes later, after I got out of a meeting, I still hadn’t heard back from Meta. But by then, the Instagram feature was working for simple prompts like “Asian man.” Silently changing something, correcting an error, or removing a feature after a reporter asks about it is fairly standard for many of the companies I cover. Did I personally cause a temporary shortage of AI-generated Asian people? Was it just a coincidence in timing? Is Meta working on fixing the problem? I wish I knew, but Meta never answered my questions or offered an explanation.

Whatever is happening over at Meta HQ, it still has some work to do — prompts like “Asian man and white woman” now return an image, but the system still screws up the races and makes them both Asian like yesterday. So I guess we’re back to where we started. I will keep an eye on things.

Screenshots by Mia Sato / The Verge

Published by
Mia Sato
