tech2 News Staff | Jun 28, 2019 13:17:32 IST
Update: Citing server overload and the potential for harm, the creator of DeepNude has since pulled the application down.
Unless you have been living under a rock, you must have heard or read about deepfakes in recent times. And every second that you and I spend making sense of what the technology is, and of its potential harms and benefits, it somehow keeps getting more complex and roots itself deeper into our lives. Now there's DeepNude, which has essentially evolved from deepfakes but is faster and easier to use.
Using this technology, a programmer has created an application that uses neural networks to turn images of clothed women into realistic-looking nudes!
The software behind the application takes a regular photo of a person (one where they are wearing clothes!) and creates a new, naked image of them, swapping the clothes for naked breasts and a vulva. And guess what, it only works on images of women. Apparently, when Motherboard tried using an image of a man, the app simply replaced his pants with a vulva.
The software also tends to work better on images where there is already some skin showing, and the results vary with the resolution and lighting of the source image. In high-resolution images with the woman looking into the camera, however, the results are hauntingly lifelike.
The DeepNude website was launched last month, along with Windows and Linux versions of the application. The app is easily available and accessible over the internet; it downloads and installs like any other app, and you do not require any technical expertise to use it either.
The app comes in two versions. The free version's output images are partially covered with a large watermark. There is also a paid version, which costs $50, whose output carries no watermark; it does have a stamp that says "FAKE", but its placement in the corner makes it easy to crop or edit out.
For a few years now, deepfakes have been spotted in various use cases, both hilarious and horrifying. Using deepfakes, you can turn an image into a talking video of yourself, your friends, or people who aren't even around anymore, and that can be fun. But there are also instances where fake videos have been used to defame a person, as in the cases of US House Speaker Nancy Pelosi and Facebook CEO Mark Zuckerberg.
While we focus on deepfakes as a disinformation tool, we are still not paying enough attention to their more devastating use: how they are being used against women!
This ranges from experimenting with the technology on images of women without their consent to maliciously spreading nonconsensual porn on the internet. Take the Motherboard article, for example: to demonstrate how the app works, it features images of Taylor Swift and Tyra Banks.
Beyond that, there are tons of explicit videos online where celebrities' faces have been grafted onto someone else's body, all of them put up by anonymous "creators". One such fake video of Scarlett Johansson, described as real "leaked" footage, was watched over 1.5 million times on a major porn site. Such videos, by the way, have also been spotted on PornHub, which claimed to have banned AI-generated deepfake porn in February 2018.