
Inside the problematic world of 'deepfakes', AI-made videos that morph celebrities' faces onto porn stars' bodies

By Taruni Kumar

“Computers driving our cars, beating humans at Go.
Nonsense! We all know what they are for
AI is for PORN!”

This is the only explanation available on the home page of deepmindy.com, a website to which you can upload a photo of a person and have similar faces found in pornographic photos. You can upload your own (if that’s what you’re into), a friend's, a celebrity's or just about anybody's at all, and artificial intelligence (AI) will find you the one to f*p to. Fun fact: Deep Mindy is a play on DeepMind, Google’s AI company, which describes itself on its website as “the world leader in artificial intelligence research and its application for positive impact”.

Well, positive impact is definitely not what AI is being used for at the moment. As creepy as it is that Deep Mindy lets you find a porn star who resembles your desired sexual partner, these are at least photos of women who consented to being photographed (unless their photos were leaked online, which is a whole other vile revenge-porn genre). But creepier still is the new genre of AI-made fake porn video doing the rounds of the Internet, in which celebrities' faces are morphed onto the bodies of actual porn actors. It’s better than the average Photoshop job, and it happens in moving video, not just static images. Sure, there’s the occasional glitch and a weird uncanny-valley effect does play out, but for the most part, it’s scarily accurate.

Representational image. Reuters


Let’s go back to the very beginning of this phenomenon. In December 2017, tech website Motherboard posted an article about the first AI-assisted porno, made by a Reddit user who called himself ‘deepfakes’. The video was an incest-themed porno starring a woman with Wonder Woman star Gal Gadot’s face. Aside from some glitches here and there, it was a scarily convincing video. As is the way of the Internet, deepfakes’ Reddit handle took on a noun value of its own, and as more people began using AI to morph faces into porn, the videos themselves came to be called deepfakes.

And before anybody says, “Yeh western phenomenon hai, hamara Indian culture nahin” (this is a western phenomenon, not our Indian culture), there are already deepfakes of Deepika Padukone, Priyanka Chopra and Shraddha Kapoor on PornHub. And I assure you, they’re incredibly disturbing.

What’s even more disturbing is that the technology being used in the creation of these deepfakes isn’t coming from a sophisticated tech lab with insane hardware.

Deepfakes are being made by regular people on computers only slightly more powerful than average, using open-source (that is, freely available) tools from the Internet, including Google’s own TensorFlow machine-learning framework. In fact, a Reddit user called deepfakeapp even built, as the handle suggests, a deepfake app that anybody can use without the technical know-how of TensorFlow, Python or any of the other tools otherwise required to create deepfake videos.

As a fantastic headline from Quartz sums it up, “Google gave the world powerful AI tools and the world made porn with them.” Then again, this isn't exactly new: many of the Internet’s most cutting-edge innovations, from payment gateways to streaming video, came out of the porn industry.

In keeping with how the world usually works, most of these deepfake porn videos are of famous women. There’s at least one well-known deepfake subject who is a man, Nicolas Cage, but those aren't porn videos; they're compilations of movie scenes with his face morphed in. Really, deepfakes are just another way for men to fulfill their fantasies of taking control of women’s agency.

Let’s be clear: I am not saying, "Haaw, sex!" I am not anti-porn. I just like my pro-porn and pro-pleasure positions to come with a pro-consent position, and deepfakes have no consensual element whatsoever. The celebrity women whose faces are being used did not consent to these videos, and the actors who did consent are being erased from them. This hasn’t gone unnoticed: in February, Reddit banned deepfakes, and PornHub began taking down videos flagged as deepfakes. But once something is on the World Wide Web, it’s nearly impossible to get rid of it.

Here’s the thing. While deepfakes are awful and really need to go away, the technology behind the videos may have some interesting implications for the world of media. For example:

In one comparison from Star Wars: Rogue One that has circulated online, the top frame shows the film's original CGI version of a young Princess Leia, while the bottom frame, made with the deepfakes tech, is almost as good but far cheaper and faster to produce.

So, the question is: is all AI-assisted video bad, or is it just deepfake porn? The answer isn’t so simple, because in these times of fake news, morphed videos that are harder to spot don’t bode well, especially when we’re still struggling to deal with doctored static images and false text. There is already a video doing the rounds of an AI-generated Barack Obama lip-syncing to an audio track of the former US president speaking, and it looks surprisingly real. And Face2Face, a real-time face-tracking and re-enactment system, can make its output look like anybody you desire, from George Bush to Vladimir Putin.

But perhaps deepfakes will start a conversation, and a counter-tech movement to build tools that make modified videos easier to spot. That would be a development we could all welcome, and one that might prevent incidents like the doctored JNU videos aired by Zee News in early 2016, which led to sedition charges and the arrest of students Kanhaiya Kumar, Anirban Bhattacharya and Umar Khalid.

It does make one wonder, though. When Elon Musk talked about AI’s potential to become a digital dictator, I don’t think he was talking about revolutionising the porn industry. But then again, this is the same man who “unwittingly” attended a sex party last summer, where he claimed he saw no sex.

The Ladies Finger (TLF) is a leading online women's magazine


Updated Date: Apr 16, 2018 11:17 AM
