The real price of deepfakes

Speed read
  • Deepfake technologies allow people to create bogus video, audio, or images
  • Fake news is a problem, but deepfakes prompt a wider distrust of reality
  • Social media platforms aren’t doing enough to keep deepfakes in check

Have you ever felt like you’re living in a post-truth society? Does the prospect of deepfake videos make you want to pack up everything and move to a hermitage in the woods?

Don’t fear, we’ve been here before—sort of.

Can you trust what you see? Probably not, if you’re looking at a deepfake. These altered media use machine learning to pair someone’s voice or appearance with another person’s likeness, sometimes literally putting words in their mouth. Courtesy The New York Times.

The Soviet Union routinely doctored photographs to serve Stalin's interests. And two young English cousins famously fooled the public by passing off faked photographs of fairies, taken in 1917, as real.

That said, we are in a unique moment in the world of fake news. Deepfakes (altered media in which a person's likeness in a video, image, or audio clip is replaced with someone else's) are becoming increasingly convincing, and their power reaches far beyond fooling people into thinking a fabricated video is real.

To better understand the current situation, we talked with Hany Farid, an expert in digital forensics and professor at the University of California, Berkeley. While he isn't too worried about deepfakes destroying society as we know it, he does have some words of wisdom for those who want to uphold the importance of truth.

DIY fakes

“It's not like up until deepfakes everything was fine, that there were no misinformation campaigns, there were no fake images, no fake videos – of course there were,” says Farid. “The difference now, besides the scary-sounding name, is the democratization of access to technology.”

Depending on your end goal, deepfake technology is disturbingly accessible. For example, the website thispersondoesnotexist.com can instantly synthesize a face that doesn’t exist, which could then be used for a phony Facebook profile photo.

It’s a little bit more involved for audio and video. “For the so-called face-swap deepfakes you can go to GitHub and download a code and run it. Can absolutely anybody do that? No, of course not. Can a lot of people do it? Sure,” says Farid. “And the other thing we know is that there are now services popping up that will create deepfakes for you.”

We may also be thinking too narrowly when it comes to deepfakes. Counterfeiting a politician’s speech is alarming, but Farid points out that non-consensual pornography is also an enormous concern with this technology. This is where a person’s face is grafted onto sexually explicit content without their approval.

“If you look at the vast majority of deepfakes they are in the non-consensual porn space,” says Farid. “If you examine where the harm is being done it is absolutely to women in the form of non-consensual pornography.”

The real danger of deepfakes, says digital forensics expert Hany Farid, is that we lose our ability to tell truth from fiction—or manipulation. If we want to actively fight back against deepfakes, we need to understand the full scope of their effects. There's been a fair amount of discussion on how this tech could influence politics, but the media has been largely silent about the pornographic use. Women are the primary victims of this kind of deepfake, and the technology offers a new and widespread threat to the right to bodily autonomy.

What is truth?

Even so, Farid seems more worried about our perception of deepfakes than about the actual hoaxes themselves. He describes how politicians might use this technology to get away with outright lies.

“You'd have politicians who can simply say ‘oh, there's a damaging video, audio, or image of me saying or doing something – it’s a deepfake!’” says Farid. “We are becoming less informed with the more information that's being sent to us. We've confused the information age with the knowledge age.”

Farid also cautions that people shouldn't count on technology to detect which videos are real. He acknowledges that such detection tools exist and are being improved, but for the most part they aren't something most people will have access to.

“The average person, when they look at a video, is not particularly well-equipped to tell whether something is real or not,” says Farid. “Something I can suggest is the old adage that extraordinary claims require extraordinary evidence. I think some common sense is always good, so forget about pixels.”

However, this isn’t to say that our only weapon is vigilance. Farid also discusses how social media platforms need to be held responsible for disseminating this sort of content. Sadly, the current environment doesn’t seem to favor this view.

“Social media’s core business model is basically engagement. It’s not about getting it right more often than getting it wrong,” says Farid. “The real issue is the targeted advertising. I can create downright lies objectively speaking, and I can target advertise them to very specific audiences.”

If you don’t like the sound of that, Farid says it’s up to us to initiate the change. Companies aren’t going to do it on their own. “Now you have to turn toward Capitol Hill and say ‘Guys, get your stuff together and start regulating this space.’”

Despite all the hurdles presented by deepfakes, Farid acknowledges that the technology itself isn’t to blame.

“I would say that the average researcher out there is just trying to do something cool. And it is inarguably cool!” says Farid. “But there are serious consequences. It’s imperative that we think about how we're developing it, how we deploy it and how it’s made available to the public.”

Copyright © 2020 Science Node ™  |  Privacy Notice  |  Sitemap