ChatGPT has excellent ‘Emotional Intelligence’, often prefers emotions over facts, finds researcher
Mehul Reuben Das • May 1, 2023, 13:24:10 IST

A researcher has found that ChatGPT and GPT-4 have great EQ, or emotional intelligence. However, applications based on ChatGPT and GPT-4 also tend to emphasise emotions over facts, in addition to hallucinating and getting facts wrong.

A new study has found that ChatGPT and GPT-4 have strong emotional intelligence and could serve as useful tools for teaching children. Bill Gates has even predicted that teachers will be using ChatGPT and GPT-4-based applications to teach students within about 18 months. There are some glaring issues as well, however: the study found that not only do ChatGPT and GPT-4 display strong emotional intelligence, they also tend to prefer emotions over facts in a number of cases.

Stumbling upon ChatGPT’s EQ

Arvind Narayanan, a computer science professor at Princeton, set up a speech interface to ChatGPT for his nearly four-year-old daughter. It was partly an experiment, and partly a reflection of his belief that artificial intelligence (AI) agents will one day play a significant role in her life. Narayanan’s daughter was naturally curious, frequently asking about animals, plants and the human body, and he believed ChatGPT might provide helpful answers. To his amazement, the OpenAI chatbot showed remarkable empathy once he told the system it was conversing with a small child.

Microsoft Corp and Alphabet Inc’s Google are racing to improve their search engines using the large language model technology that powers ChatGPT, but there is reason to believe the technology works better as an emotional companion than as a source of information. That may sound strange, but stranger still is that Google’s Bard and Microsoft’s Bing, both built on similar large language model technology, are being positioned as search tools despite a history of embarrassing factual errors.

Bard and ChatGPT’s hallucinations

In its very first demonstration, Bard provided incorrect information about the James Webb Space Telescope, while Bing made a series of financial errors in its own introduction.
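The tone-steering Narayanan describes comes down to prepending a system message before the user's question. A minimal sketch of that idea, assuming the `openai` Python package (v1.x) and an `OPENAI_API_KEY`; the wording of the system prompt and the model name are illustrative, not taken from his experiment:

```python
def build_child_friendly_messages(question: str) -> list[dict]:
    """Prepend a system prompt telling the model its audience is a young child."""
    return [
        {"role": "system",
         "content": ("You are talking to a curious four-year-old. "
                     "Answer gently, simply, and kindly.")},
        {"role": "user", "content": question},
    ]

# The actual call requires network access and an API key:
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(
#     model="gpt-4",
#     messages=build_child_friendly_messages("Why do leaves fall off trees?"),
# )
# print(reply.choices[0].message.content)
```

The system message shapes every subsequent reply in the conversation, which is why a single instruction is enough to shift the model's register from encyclopaedic to empathetic.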
When a chatbot is used for search, the cost of factual errors is substantial. According to Eugenia Kuyda, creator of the AI companion app Replika, which has been downloaded over 5 million times, the cost is significantly lower when the bot is intended as a companion. “It won’t ruin the experience, unlike with search, where small mistakes can break trust in the product,” Kuyda explained.

Margaret Mitchell, a former Google AI researcher who co-authored a paper on the dangers of large language models, has said they are “not fit for purpose” as search engines. Language models produce mistakes because the data they are trained on frequently contains inaccuracies, and the models have no ground truth against which to validate what they say. Their designers may also prioritise fluency over accuracy.

That is one reason these tools are so good at simulating empathy. After all, they learn from content scraped from the internet: expressive replies posted on social networking platforms like Twitter and Facebook, and personal advice offered to members of forums like Reddit and Quora.

ChatGPT’s EQ makes it a great therapist. Or does it?

Conversations from film and television scripts, dialogue from novels, and emotional intelligence research papers are all thrown into the training pot to make these tools appear compassionate. According to a piece published this month in Bloomberg Businessweek, some people are using ChatGPT as a kind of robot therapist. One user said they turned to it to avoid being a burden on others, including their human therapist.
Clinical psychologist Thomas Ward of King’s College London, who has studied the role of such software in therapy, warns against assuming that AI can properly fill the gap for people who need mental health support, especially if their problems are serious. A chatbot, for example, is unlikely to recognise that a person’s emotions are too complex for it to comprehend. ChatGPT, in other words, rarely says “I don’t know,” because it was built to provide responses with confidence rather than caution.

People should also be wary of relying on chatbots to express their emotions on a regular basis. “Subtle aspects of human connection, like the touch of a hand or knowing when to speak and when to listen, could be lost in a world that sees AI chatbots as a solution for human loneliness,” Ward adds. That could create more problems than it solves. For the time being, these tools are more dependable for their emotional talents than for their knowledge of facts.

Tags
OpenAI, ChatGPT, GPT-4, Bard AI, ChatGPT Hallucinations