Microsoft "deeply sorry" for Tay the chatbot's racist and sexist comments
FP Archives • March 28, 2016, 09:03:20 IST
Microsoft is “deeply sorry” for the racist and sexist Twitter messages generated by the chatbot it launched this week, a company official wrote on Friday, after the artificial intelligence program went on an embarrassing tirade.

The bot, known as Tay, was designed to become “smarter” as more users interacted with it. Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users started feeding the program, forcing Microsoft Corp to shut it down on Thursday.

Following the setback, Microsoft said in a blog post it would revive Tay only if its engineers could find a way to prevent Web users from influencing the chatbot in ways that undermine the company’s principles and values. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” wrote Peter Lee, Microsoft’s vice president of research.

Microsoft created Tay as an experiment to learn more about how artificial intelligence programs can engage with Web users in casual conversation. The project was designed to interact with and “learn” from the young generation of millennials. Tay began its short-lived Twitter tenure on Wednesday with a handful of innocuous tweets. Then its posts took a dark turn. In one typical example, Tay tweeted “feminism is cancer” in response to another Twitter user who had posted the same message.

Lee, in the blog post, called Web users’ efforts to exert a malicious influence on the chatbot “a coordinated attack by a subset of people.” “Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack,” Lee wrote. “As a result, Tay tweeted wildly inappropriate and reprehensible words and images.”

Microsoft has enjoyed better success with a chatbot called XiaoIce, which the company launched in China in 2014. XiaoIce is used by about 40 million people and is known for “delighting with its stories and conversations,” according to Microsoft. As for Tay? Not so much.

“We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an Internet that represents the best, not the worst, of humanity,” Lee wrote.

Reuters
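Tay's core failure mode described above — learning directly from user messages with no moderation of the training input — can be illustrated with a toy sketch. Microsoft has not published Tay's implementation, so everything below (the `ParrotBot` class, its `blocklist` safeguard) is purely hypothetical and meant only to show why unfiltered learning invites the coordinated poisoning Lee describes:

```python
import random

class ParrotBot:
    """Toy chatbot that 'learns' by storing user messages and replaying them.

    This is an illustrative sketch, not Tay's actual design: without any
    moderation of the training input, whatever users feed in comes back out.
    """

    def __init__(self, blocklist=None):
        self.memory = []
        # Hypothetical mitigation: refuse to learn messages containing
        # blocked terms, so poisoned input never enters the bot's memory.
        self.blocklist = set(blocklist or [])

    def learn(self, message: str) -> bool:
        """Store a message for later replay; reject it if blocklisted."""
        if set(message.lower().split()) & self.blocklist:
            return False  # poisoned input is dropped at training time
        self.memory.append(message)
        return True

    def reply(self) -> str:
        """Echo a random learned message (the 'parroting' behavior)."""
        if not self.memory:
            return "hello!"
        return random.choice(self.memory)

# An unfiltered bot absorbs whatever it is fed:
naive = ParrotBot()
naive.learn("offensive slogan")

# A filtered bot rejects the same input before it can be repeated:
guarded = ParrotBot(blocklist={"offensive"})
assert guarded.learn("offensive slogan") is False
assert guarded.reply() == "hello!"
```

A keyword blocklist is of course far too crude for a real deployment; it stands in here for whatever input-screening Lee's post implies Tay lacked against this specific attack.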

Tags
Twitter, Microsoft, Tay, chatbot, racist, sexist, Microsoft apologises