Culture

FakeApp is an app where you can make fake porn with anyone's face

By Maya Khamala

It’s not technically illegal to take a person’s face (a woman’s face), place it on a porn star’s body, and pass it off as real. The fact that AI technology is now capable of creating convincing yet false footage, implicating people in scenes they never actually took part in, should make anyone with a pulse feel queasy, at the very least.

It’s endlessly disturbing that people (men) have spent otherwise unencumbered stretches of time and a perfectly usable life force tinkering away at the ever-so-helpful, humanity-enhancing task of making this AI technology easily accessible to any questionable specimen under the sun. And to be clear, any person wanting to use it is inherently questionable.

FakeApp is a user-friendly application that allows anyone to recreate these videos with their very own datasets. It’s bad enough that this app’s very existence tramples the underenforced, underprotected principle of consent that every person should hold near and dear, but where the hell does this leave us with regard to recognizing fake news? Or evidence in a murder trial? Is blackmail obsolete yet? Or just a lot easier? Where on earth are we again, and what year is it? Do rules and reality both mean a lot less than they ever have before?

FakeApp

Motherboard discovered a redditor named “deepfakes” in December, just a random dude who happened to enjoy playing the swap game with celebrity faces, with the help of publicly available porn videos, a machine learning algorithm, and his trusty home computer, of course. He made several convincing fake porn videos of celebrities, including Gal Gadot, Maisie Williams, Taylor Swift, Scarlett Johansson, Daisy Ridley, and Aubrey Plaza. Anyone who looks closely won’t be fooled, so they say, but who scrutinizes every video for evidence of fakery? Plus, the technology will only become more advanced; of that we can be sure.

While FakeApp was indeed made possible by deepfakes’ algorithm, it was a second Reddit user, “deepfakeapp,” who actually created the app, without the help of deepfakes himself. Still with me? Good. The app allows users without any computer science background to speak of (or any particularly redeeming skills or charm, I assume) to create AI-assisted fake porn from the comfort of their own homes. All the tools one needs to create said videos are free, readily available, and come with instructions that could walk a kid through the process, step-by-dubious-step.

Samantha Cole from Motherboard extracted the following gem from deepfakeapp himself:

“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks. Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”

I don’t know about you, but I’ll sleep better now that I know deepfakeapp is working tirelessly to improve my ability to make fake sex tapes of people and spread them around.

An assault on the senses

This ain’t the first time poor Taylor has been a target; last year, Kanye West featured a naked wax lookalike of her in his “Famous” music video.

“Taylor is furious after seeing Kanye’s video. She feels like she’s been assaulted,” a source told HollywoodLife.com.

Celebrity women may be the main targets right now, but anyone with a bone to pick or a few screws loose, anyone mad at their ex, anyone competing for a corporate job, now officially has the tools to play dirty. I thought the zero-class era had already arrived, but this seals the deal. Legal though the app may be, I can’t imagine how one might put forward an ethical argument in its favour. Every time such a video is made, the consent of the person whose face has been swapped in goes entirely unconsidered, yet they are opened up to a deeply violating kind of exposure, commentary, and potential harm. Really not cool. Scary.

Deborah Johnson, Professor Emeritus of Applied Ethics at the University of Virginia’s School of Engineering, says it well:

“…we’re getting to the point where we can’t distinguish what’s real—but then, we didn’t before. What is new is the fact that it’s now available to everybody, or will be... It’s destabilizing. The whole business of trust and reliability is undermined by this stuff.”

Only time will tell how FakeApp fares amid the #metoo cultural revolution that happens to be placing consent in the limelight on any stage that will listen the hell up already.

<3

Image Source: Tony Futura
