The misuse of artificial intelligence (AI) for malicious purposes, such as revenge porn or spreading fake news, has been raising fears worldwide. Now, researchers say that apps and websites that use AI to "undress" women in photos are becoming increasingly popular. According to social network analysis company Graphika, as many as 24 million (2.4 crore) people visited such sites in September alone, reported Bloomberg.

How has the use of "undressing" sites and apps increased? Let's find out.

Apps that 'undress' women

These undressing, or "nudify", sites and apps take existing clothed pictures and videos of real individuals and use AI to generate fake nude images of them, noted Business Insider. Most of these "nudifying" apps work only on women.

Graphika said in its December report, "A Revealing Picture", that it analysed 34 websites offering these services, which it described as non-consensual intimate imagery (NCII).

The researchers found that the number of links advertising undressing apps on social media, including on X and Reddit, has risen by over 2,400 per cent since the beginning of this year, reported Bloomberg. The report also mentioned that 53 Telegram groups used to access these services have at least 1 million users.

ALSO READ: How big is the deepfake problem in India and beyond
Dangers of deepfake pornography

The use of AI to generate non-consensual pornography has been a worrying trend for years now. Websites that create deepfake nudes, images digitally manipulated to make someone appear naked, have surfaced in the past few years.

The menace is all the more concerning as it is also being used to target minors. Last month, deepfake images of female students of a New Jersey high school were circulated online.

In September, more than 20 girls fell prey to deepfake photos generated by the AI-powered app 'Clothoff', which lets users 'undress girls for free', reported Daily Mail. These fake nude images, made using fully clothed pictures posted on the girls' Instagram accounts, were then shared in WhatsApp groups.

[Image: The menace of deepfake pornography mostly targets women. Pixabay (Representational Image)]

Earlier this year, a teenage boy allegedly used AI apps to create
deepfake images of female students at a high school in Seattle, Washington, as per the Daily Mail report. And according to a Vice report in 2019, software called DeepNude could produce a convincing nude image of a woman in 30 seconds.

Speaking to Bloomberg, Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, said: "We are seeing more and more of this being done by ordinary people with ordinary targets. You see it among high school children and people who are in college."

Psychotherapist Lisa Sanfilippo, whose expertise includes sexual trauma, told Business Insider earlier that creating false nude images "is a major violation" of people's privacy that can bring intense trauma to the victim.

She said that for the victim, "seeing images of yourself — or images that are falsified to look like you, in acts that you might find reprehensible, scary or that would be only for your personal life can be very destabilising — even traumatising. There is no ability to give consent there."

"It's abuse when someone takes something from another person that has not been freely given to them," Sanfilippo added.

With AI tools becoming more accessible, it has also become easier and cheaper to produce non-consensual sexually explicit content. From celebrities to laypersons, deepfakes can be deployed against anyone with just a few clicks. The availability of multiple open-source diffusion models has also made it simpler to alter images, leading to the launch of these "undressing" websites and apps, Graphika's report said.

"Bolstered by these AI services, synthetic NCII providers now operate as a fully-fledged online industry, leveraging many of the same marketing tactics and monetisation tools as established e-commerce companies. This includes advertising on mainstream social media platforms, influencer marketing, deploying customer referral schemes, and the use of online payment technologies," Forbes cited the report as saying.

The researchers warned: "We assess the increasing prominence and accessibility of these services will very likely lead to further instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse material."
[Image: AI tools have become more accessible in recent years. Pixabay (Representational Image)]

What have companies said?

As per Bloomberg, one of the apps has paid for sponsored content on Google's YouTube, making it the first result to appear when users search for "nudify".

A Google spokesperson said that the tech giant does not allow ads "that contain sexually explicit content". "We've reviewed the ads in question and are removing those that violate our policies," the company told Bloomberg. Google has since taken down some ads for undressing sites and apps.

A Telegram spokesperson told Daily Mail, "Since its creation, Telegram has actively moderated harmful content on its platform, including nonconsensual pornography. Telegram's moderators actively monitor public parts of the platform and accept user reports in order to remove content that breaches our terms of service."

TikTok and Meta have reportedly blocked the search word "undress" to tackle the problem.

A Reddit spokesperson told Bloomberg that the site does not allow any non-consensual sharing of synthetic sexually explicit material and that it had banned several domains following the research.

With inputs from agencies