Recently, Reddit has been making news again with a subreddit in which people use a machine learning tool called “Deep Fake” to automatically replace one person’s face with another in a video. Obviously, since this is the internet, people are using it for two things: fake celebrity porn and inserting Nicolas Cage into random movies.
While swapping someone’s face in a photograph has always been relatively easy, swapping someone’s face in a video used to be time-consuming and difficult. Up until now, it’s mainly been done by VFX studios for big-budget Hollywood movies, where an actor’s face is swapped onto their stunt double’s. But now, with Deep Fake, anyone with a computer can do it quickly and automatically.
Before going any further, you need to know what a Deep Fake looks like. Check out the SFW video below, which is a compilation of different celebrity face swaps, mainly involving Nic Cage.
[Video: a compilation of celebrity Deep Fake face swaps]
The Deep Fake software works using machine learning. It’s first trained on a target face: distorted images of the target are run through the algorithm, and it learns how to correct them so they resemble the unaltered target face. When the algorithm is then fed images of a different person, it assumes they’re distorted images of the target and attempts to correct them. To produce video, the Deep Fake software simply operates on every frame individually.
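If you’re curious what that looks like in practice, here’s a minimal sketch of the shared-encoder, twin-decoder autoencoder idea that face-swap tools of this kind are built on, written in PyTorch. Everything here (the layer sizes, the 64x64 crops, the learning rate) is an illustrative assumption, not the actual Deep Fake source code:

```python
# Minimal sketch of the shared-encoder / twin-decoder face-swap idea.
# All architecture choices below are hypothetical, for illustration only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 face crop into a compact representation."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),  # pose/expression "essence" of the face
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs one specific person's face from the shared representation."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        z = self.fc(z).view(-1, 64, 16, 16)
        return self.net(z)

encoder = Encoder()    # shared by both identities
decoder_a = Decoder()  # learns to reconstruct face A
decoder_b = Decoder()  # learns to reconstruct face B
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=5e-5,
)

def train_step(distorted_a, target_a, distorted_b, target_b):
    """One step: each decoder learns to 'correct' warped crops of its own face."""
    opt.zero_grad()
    loss = (loss_fn(decoder_a(encoder(distorted_a)), target_a)
            + loss_fn(decoder_b(encoder(distorted_b)), target_b))
    loss.backward()
    opt.step()
    return loss.item()

# The actual swap: feed face B through decoder A. The shared encoder reads
# B's pose and expression as if they were "distortions" of face A, and
# decoder A renders them as face A.
# swapped_frame = decoder_a(encoder(face_b_crop))
```

The trick is that single shared encoder: because it’s trained on both faces, it learns a representation of pose and expression that works for either, and plugging the “wrong” decoder onto the end is what produces the swap.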
The reason that Deep Fakes have largely involved actors is that there’s a lot of footage of them available from different angles, which makes training more effective (Nicolas Cage has 91 acting credits on IMDb). However, given the number of photos and videos people post online, and the fact that you really only need about 500 images to train the algorithm, there’s no reason ordinary people can’t be targeted too, although probably with a little less success.
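Gathering that training data isn’t hard, either. As a purely hypothetical sketch of how simple it is, here’s how someone could pull a few hundred face crops out of a single video using OpenCV’s stock face detector (the filename and crop size are made up for illustration):

```python
# Hypothetical sketch: harvesting ~500 face crops from one video
# to build a training set, using OpenCV's bundled Haar-cascade detector.
import os
import cv2

os.makedirs("faces", exist_ok=True)
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("interview.mp4")  # hypothetical source clip
saved = 0
while saved < 500:  # roughly the number of images needed to start training
    ok, frame = cap.read()
    if not ok:
        break  # ran out of video
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        crop = cv2.resize(frame[y:y + h, x:x + w], (64, 64))
        cv2.imwrite(f"faces/{saved:04d}.png", crop)
        saved += 1
cap.release()
```

A single long interview or vlog is enough raw material, which is exactly why this technique isn’t limited to celebrities.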
How to Spot a Deep Fake
Right now, Deep Fakes are pretty easy to spot, but it will get harder as the technology gets better. Here are some of the giveaways.
Weird-Looking Faces. In a lot of Deep Fakes, the faces just look weird. The features don’t line up perfectly, and everything appears a bit waxy, like in the image below. If everything else looks normal but the face appears weird, it’s probably a Deep Fake.
Flickering. A common feature of bad Deep Fake videos is the face appearing to flicker, with the original features occasionally popping into view. It’s normally most obvious at the edges of the face or when something passes in front of it. If you see weird flickering, you’re looking at a Deep Fake.
Different Bodies. Deep Fakes are only face swaps. Most people try to get a good body match, but it’s not always possible. If the person seems noticeably heavier, lighter, taller, or shorter, or has tattoos they don’t have in real life (or doesn’t have tattoos they do have in real life), there’s a good chance it’s fake. You can see a really obvious example below, where Patrick Stewart’s face has been swapped onto J.K. Simmons in a scene from the movie Whiplash. Simmons is significantly smaller than Stewart, so it just looks odd.
Short Clips. Right now, even when the Deep Fake software works perfectly and creates an almost indistinguishable face swap, it can only do so for a short stretch of time. Before too long, one of the problems above will start to appear. That’s why most Deep Fake clips that people share are only a couple of seconds long; the rest of the footage is unusable. If you’re shown a very short clip of a celebrity doing something, and there’s no good reason for it to be so short, that’s a clue it’s a Deep Fake.
No Sound or Bad Lip Syncing. The Deep Fake software only adjusts facial features; it doesn’t magically make one person sound like another. If there’s no sound with the clip, and there’s no reason for there not to be, it’s another clue you’re looking at a Deep Fake. Similarly, even if there is sound, if the spoken words don’t match up with the moving lips (or the lips look strange while the person talks, like in the clip below), you might have a Deep Fake.
Unbelievable Clips. This one kind of goes without saying, but if you’re shown a truly unbelievable clip, there’s a good chance you shouldn’t actually believe it. Nicolas Cage has never starred as Loki in a Marvel movie. That’d be cool, though.
Dubious Sources. As with fake photos, where the video supposedly comes from is often a big clue to its authenticity. If the New York Times is running a story on it, it’s far more likely to be true than something you discover in a random corner of Reddit.
For the time being, Deep Fakes are more of a horrifying curiosity than a major problem. The results are easy to spot, and while it’s impossible to condone what’s being done, no one is yet trying to pass off Deep Fakes as genuine videos.
As the technology gets better, however, Deep Fakes are likely to become a much bigger issue. For example, convincing fake footage of Kim Jong Un declaring war on the USA could cause a major panic.