Firstpost

Artificial Imposter: 47% of Indian phone users have experienced AI voice scams, highest in the world
Mehul Reuben Das • May 2, 2023, 11:59:30 IST

About half of Indian adults, or 47%, have experienced AI voice scams, nearly double the global average of 25%. Furthermore, nearly 69% of adults say they can’t differentiate between a real and an AI-generated voice.


With the advances researchers and developers have made in making AI more accessible and powerful, it was only a matter of time before scammers and other bad actors started using the technology to defraud people. The extent to which scammers are already using AI, especially in voice scams, is astounding, as is the number of Indian phone users falling for them. According to a recent McAfee report, about 47 per cent of Indian phone users have experienced AI voice scams in recent years, the highest share of any country. The global average, for comparison, is about 25 per cent.

Voice-based AI models, or rather AI voice generators, need very little in the way of a prompt. With only three seconds of audio required to clone a person's voice, these models are driving an increase in online voice scams.

Voice cloning has become very easy thanks to AI

As part of their analysis of this emerging trend, McAfee researchers spent three weeks studying the accessibility, ease of use and effectiveness of AI voice-cloning tools, and discovered more than a dozen publicly available on the internet. Both free and commercial tools exist, and many require only a basic degree of skill to use. In one case, three seconds of audio was enough to produce an 85 per cent voice match, and with additional time and effort the accuracy can be increased: by training the data models on a small number of video clips, McAfee researchers were able to achieve a 95 per cent voice match. The more realistic the clone, the better a cybercriminal's chances of duping someone into handing over money, and with these hoaxes built on exploiting the emotional bonds of close relationships, a scammer can make thousands of dollars in a matter of hours.
“Advanced artificial intelligence tools are altering the playing field for cybercriminals,” said Steve Grobman, McAfee CTO. “They can now clone a person’s voice and trick a close contact into sending money with very little effort. It’s critical to stay vigilant and take proactive measures to keep yourself and your loved ones safe.”

“If you receive a call from your spouse or a family member in need of money, verify the caller by using a codeword or asking a question only they would know. Identity and privacy services can also help minimise the digital trail of personal information that a criminal could use to build a persuasive story when producing a voice clone,” he added.

McAfee’s researchers found that the tools had no trouble mimicking accents from around the world, whether from the US, UK, India or Australia, but that more distinctive voices were harder to replicate. Cloning a voice with an uncommon cadence, rhythm or style takes more work, and such speakers are consequently less likely to be targeted.

The study team’s overarching conclusion was that artificial intelligence has already changed the game for cybercriminals: the barrier to entry has never been lower, making it easier than ever to commit cybercrime.

Indians are being targeted disproportionately

Everyone’s voice is distinctive, like a biometric fingerprint, which is why hearing someone speak is such a widely trusted way of establishing confidence. But with 86 per cent of Indian adults sharing their voice data online or in recorded notes at least once a week (via social media, voice notes and other means), cloning how someone sounds has become a powerful weapon in a cybercriminal’s arsenal. That is one of the major reasons Indians are being targeted disproportionately.
Making matters worse, about 69 per cent of Indian adults said they were unsure whether they could tell the difference between an AI-generated clone and a real person. A majority of Indian respondents (66 per cent) said they would respond to a phone call or voice message claiming to be from a friend or loved one in need of money, especially if they believed the request came from a parent (46 per cent), partner or spouse (34 per cent), or child (12 per cent). Messages saying the sender had been robbed (70 per cent), been in a car accident (69 per cent), lost their phone or wallet (65 per cent), or needed help while travelling overseas (62 per cent) were the most likely to elicit a response.

The cost of falling for an AI voice scam can be significant: 48 per cent of Indians who lost money said it cost them more than INR 50,000. The survey also found that the growth of deepfakes and disinformation has made people more sceptical of what they see online, with 27 per cent of Indian adults saying they are now less trusting of social media than ever before, and 43 per cent concerned about the rise of misinformation or disinformation.

What can we do about this?

There are steps we can take to make sure we do not fall for AI voice scammers, or at least not easily. First, agree on a verbal ‘codeword’ with your children, family members or trusted close friends that only they would know, and make it a rule to ask for it whenever they phone, text or email for help, especially if they are elderly or vulnerable.

Also, make a habit of always checking the source: pause and consider whether a call, text or email is from an unknown sender, or even from a number you recognise. Does the person sound like themselves? Hang up and call the individual directly, or try to confirm the facts before replying or sending money. And think before you click and share.
Ask yourself who the people in your social media network really are. Do you truly know and trust them? Consider your online acquaintances and connections carefully: the more contacts you have and the more information you share, the more likely it is that your identity could be cloned for malicious purposes.

Finally, identity theft protection services can help ensure your personally identifiable information is not exposed, or notify you if it is discovered on the dark web. Take control of your personal data to stop a cybercriminal from impersonating you.

Tags: AI in Scams, Voice Cloning Scams, AI Voice Cloning, AI Voice Scam