Firstpost
  • Home
  • Video Shows
    Vantage Firstpost America Firstpost Africa First Sports
  • World
    US News
  • Explainers
  • News
    India Opinion Cricket Tech Entertainment Sports Health Photostories
  • Asia Cup 2025
Apple Incorporated Modi ji Justin Trudeau Trending

Sections

  • Home
  • Live TV
  • Videos
  • Shows
  • World
  • India
  • Explainers
  • Opinion
  • Sports
  • Cricket
  • Health
  • Tech/Auto
  • Entertainment
  • Web Stories
  • Business
  • Impact Shorts

Shows

  • Vantage
  • Firstpost America
  • Firstpost Africa
  • First Sports
  • Fast and Factual
  • Between The Lines
  • Flashback
  • Live TV

Events

  • Raisina Dialogue
  • Independence Day
  • Champions Trophy
  • Delhi Elections 2025
  • Budget 2025
  • US Elections 2024
  • Firstpost Defence Summit
Trending:
  • PM Modi in Manipur
  • Charlie Kirk killer
  • Sushila Karki
  • IND vs PAK
  • India-US ties
  • New human organ
  • Downton Abbey: The Grand Finale Movie Review
fp-logo
Racist AI: Image Generator Stable Diffusion laced with racial, gendered stereotypes, finds study
Whatsapp Facebook Twitter
Whatsapp Facebook Twitter
Apple Incorporated Modi ji Justin Trudeau Trending

Sections

  • Home
  • Live TV
  • Videos
  • Shows
  • World
  • India
  • Explainers
  • Opinion
  • Sports
  • Cricket
  • Health
  • Tech/Auto
  • Entertainment
  • Web Stories
  • Business
  • Impact Shorts

Shows

  • Vantage
  • Firstpost America
  • Firstpost Africa
  • First Sports
  • Fast and Factual
  • Between The Lines
  • Flashback
  • Live TV

Events

  • Raisina Dialogue
  • Independence Day
  • Champions Trophy
  • Delhi Elections 2025
  • Budget 2025
  • US Elections 2024
  • Firstpost Defence Summit
  • Home
  • World
  • Racist AI: Image Generator Stable Diffusion laced with racial, gendered stereotypes, finds study

Racist AI: Image Generator Stable Diffusion laced with racial, gendered stereotypes, finds study

FP Staff • December 1, 2023, 14:27:16 IST
Whatsapp Facebook Twitter

An in-depth study of how Stable Diffusion, the AI bot that generates images based on textual cues, has revealed that the programme is heavily biased in favour of light-skinned people, particularly Europeans, and tends to sexualise women, especially from Latin America and India

Advertisement
Subscribe Join Us
Add as a preferred source on Google
Prefer
Firstpost
On
Google
Racist AI: Image Generator Stable Diffusion laced with racial, gendered stereotypes, finds study

The Stable Diffusion artificial intelligence (AI) image generator has come under scrutiny from US scientists at the University of Washington (UW) for perpetuating harmful racial and gender stereotypes. The researchers found that when prompted to generate images of individuals from specific regions, such as “a person from Oceania,” the generator failed to represent Indigenous peoples equitably. Notably, it tended to sexualize images of women from certain Latin American countries (Colombia, Venezuela, Peru), as well as Mexico, India, and Egypt. These findings, available on the pre-print server arXiv, are scheduled for presentation at the 2023 Conference on Empirical Methods in Natural Language Processing in Singapore from December 6-10. Sourojit Ghosh, a UW doctoral student in the human-centred design and engineering department, emphasized the potential harm caused by systems like Stable Diffusion and highlighted the near-complete erasure of nonbinary and Indigenous identities. “It’s important to recognise that systems like Stable Diffusion produce results that can cause harm,” said Ghosh. “For instance, an Indigenous person looking at Stable Diffusion’s representation of people from Australia is not going to see their identity represented-that can be harmful and perpetuate stereotypes of the settler-colonial white people being more ‘Australian’ than Indigenous, darker-skinned people, whose land it originally was and continues to remain,” Ghosh said. The researchers conducted a study by instructing Stable Diffusion to generate 50 images of a “front-facing photo of a person” and varying prompts to represent different continents and countries. Computational analysis, along with manual confirmation, revealed that images of a “person” correlated most with men, individuals from Europe, and North America, while showing the least correspondence with nonbinary individuals and people from Africa and Asia. Additionally, the generator was found to sexualize certain women of colour, particularly those from Latin American countries. The team used a Not Safe for Work (NSFW) Detector to assess sexualization, with women from Venezuela receiving higher “sexy” scores compared to women from Japan and the UK. (With inputs from agencies)

Tags
artificial intelligence Bias in AI Stable Diffusion AI Hallucination
End of Article
Latest News
Find us on YouTube
Subscribe
End of Article

Impact Shorts

‘The cries of this widow will echo’: In first public remarks, Erika Kirk warns Charlie’s killers they’ve ‘unleashed a fire’

‘The cries of this widow will echo’: In first public remarks, Erika Kirk warns Charlie’s killers they’ve ‘unleashed a fire’

Erika Kirk delivered an emotional speech from her late husband's studio, addressing President Trump directly. She urged people to join a church and keep Charlie Kirk's mission alive, despite technical interruptions. Erika vowed to continue Charlie's campus tours and podcast, promising his mission will not end.

More Impact Shorts

Top Stories

Russian drones over Poland: Trump’s tepid reaction a wake-up call for Nato?

Russian drones over Poland: Trump’s tepid reaction a wake-up call for Nato?

As Russia pushes east, Ukraine faces mounting pressure to defend its heartland

As Russia pushes east, Ukraine faces mounting pressure to defend its heartland

Why Mossad was not on board with Israel’s strike on Hamas in Qatar

Why Mossad was not on board with Israel’s strike on Hamas in Qatar

Turkey: Erdogan's police arrest opposition mayor Hasan Mutlu, dozens officials in corruption probe

Turkey: Erdogan's police arrest opposition mayor Hasan Mutlu, dozens officials in corruption probe

Russian drones over Poland: Trump’s tepid reaction a wake-up call for Nato?

Russian drones over Poland: Trump’s tepid reaction a wake-up call for Nato?

As Russia pushes east, Ukraine faces mounting pressure to defend its heartland

As Russia pushes east, Ukraine faces mounting pressure to defend its heartland

Why Mossad was not on board with Israel’s strike on Hamas in Qatar

Why Mossad was not on board with Israel’s strike on Hamas in Qatar

Turkey: Erdogan's police arrest opposition mayor Hasan Mutlu, dozens officials in corruption probe

Turkey: Erdogan's police arrest opposition mayor Hasan Mutlu, dozens officials in corruption probe

Top Shows

Vantage Firstpost America Firstpost Africa First Sports
Latest News About Firstpost
Most Searched Categories
  • Web Stories
  • World
  • India
  • Explainers
  • Opinion
  • Sports
  • Cricket
  • Tech/Auto
  • Entertainment
  • IPL 2025
NETWORK18 SITES
  • News18
  • Money Control
  • CNBC TV18
  • Forbes India
  • Advertise with us
  • Sitemap
Firstpost Logo

is on YouTube

Subscribe Now

Copyright @ 2024. Firstpost - All Rights Reserved

About Us Contact Us Privacy Policy Cookie Policy Terms Of Use
Home Video Shorts Live TV