Deepfakes have become a growing concern across the world. While the use of artificial intelligence (AI) to create fabricated videos and images to spread political misinformation is much talked about, deepfakes pose another, far bigger problem.
Deepfake pornography, which primarily targets women, is becoming disturbingly common. As per a recent study, 98 per cent of deepfake videos circulated online were pornographic, and 99 per cent of them targeted women or girls.
United States Representative Alexandria Ocasio-Cortez has recently opened up about the horror she experienced after discovering a deepfake porn image of her that was posted on X.
Here’s what is going on.
The growing deepfake menace
Deepfakes use a type of AI called deep learning to generate manipulated media – video, audio, or photos – either creating completely computer-generated content or superimposing a person’s likeness on existing material.
The term ‘deepfake’ emerged in 2017, when a Reddit user started posting videos that superimposed celebrities’ faces onto bodies in existing pornographic clips.
Although deepfakes were once easy to identify, advances in AI technology are making it much harder to distinguish between real and fake.
With AI tools becoming more accessible now, it has also become easier and cheaper to produce non-consensual sexually explicit content. There has also been a proliferation of “nudifying” apps that transform existing clothed pictures and videos of women and girls into nudes.
A 60-second deepfake video can be produced for free in under half an hour from a single clear image. According to The Guardian, over 100,000 sexually explicit deepfake images and videos are disseminated on the internet every day, mostly using images taken from private social media accounts.
A 2023 State of Deepfakes report found that adult content, mostly targeting women, accounts for 98 per cent of all deepfake videos online.
Today, deepfake porn has become a problem for celebrities and ordinary citizens alike. It is being used to target female students, and often the perpetrators are their own male classmates.
A website named MrDeepFakes, which is one of the most popular websites dedicated to sexualised deepfakes, gets 17 million (1.7 crores) visitors a month, NBC News reported citing web analytics firm SimilarWeb.
People can pay as little as $5 (about Rs 416) to download thousands of deepfake pornographic videos of celebrities. The sites accept payments via Visa, Mastercard and cryptocurrency, as per NBC News.
How politicians have become a target
Recently, sexualised deepfakes of celebrities like Taylor Swift and Alia Bhatt were circulated online.
While female stars have long been targeted by having their images superimposed on existing sexually explicit pictures or videos, female politicians are now also falling prey to such non-consensual pornography.
Ocasio-Cortez told Rolling Stone in an interview that seeing a digitally altered image of herself performing a sex act was “not as imaginary as people want to make it seem”.
The Queens Democrat said she was in a discussion with her aides during a car journey in February when she came across the sexually explicit deepfake image of herself. “There’s a shock to seeing images of yourself that someone could think are real.”
“As a survivor of physical sexual assault, it adds a level of dysregulation. It resurfaces trauma, while I’m trying to — in the middle of a f***ing meeting,” she told the magazine.
The Congresswoman said that the distressing image stayed with her the entire day, citing scientific research on how difficult it is for the brain to separate visceral images seen on a phone from reality, even when one knows they are fake. “There are certain images that don’t leave a person, they can’t leave a person.”
She told Rolling Stone that “digitising violent humiliation” is like physical rape, and forewarned that “people are going to kill themselves over this”.
Deepfake porn has also hit other women politicians.
Italy’s prime minister, Giorgia Meloni, will testify in July in a civil lawsuit against two men accused of creating pornographic videos featuring her by placing her face on someone else’s body and then posting them online.
Meloni is seeking compensation of €100,000 (roughly Rs. 90.6 lakh) in damages from a 40-year-old and his 73-year-old father for the deepfake that was posted on a US porn website in 2020 and viewed millions of times, BBC reported citing the indictment.
According to an AD report in March, deepfake porn videos of dozens of Dutch celebrities, parliamentarians, and members of the Royal Family – all women – surfaced online, garnering tens of thousands of views.
How deepfakes harm victims
Non-consensual deepfake videos can cause deep damage to victims, triggering stress and even physiological symptoms such as heart palpitations and panic attacks, as per Healthnews.
Victims can experience trauma and in extreme cases, even lose touch with reality.
“It really makes you feel powerless, like you’re being put in your place. Punished for being a woman with a public voice of any kind. That’s the best way I can describe it. It’s saying, ‘Look: we can always do this to you’”, Helen Mort, a survivor of deepfake pornography, told MIT Technology Review in 2021.
Need for strict laws
Strict regulations are needed across the world to outlaw not only the distribution but also the creation of deepfake porn.
Clare McGlynn, Professor of Law at Durham University, argued in her piece for The Conversation that criminalising the production of deepfake pornography “can impose greater obligations on internet platforms. If creation of pornographic deepfakes was unlawful, it would be difficult for payment providers to continue to prop up the deepfake ecosystem, difficult for Google to continue returning deepfake porn sites at the top of searches and difficult for social media companies such as X (formerly Twitter) or the app stores to continue to advertise nudify apps.”
Generative AI is being increasingly used to abuse women, and the problem will only escalate if it is not tackled now. “If governments, regulators and businesses do not act, the scale of the harm inflicted on women across the world will be immense,” The Guardian article warned.
It is high time women were made to feel safe in this ever-evolving world of technology.
With inputs from agencies