Are You Worried About 'Deepfakes'?
Are you worried about deepfakes in the 2020 election?
by Countable | 9.4.19
What's the story?
Can you believe your eyes and ears?
Countable is compiling a dossier on all the issues that may affect the 2020 election, answering your questions and concerns about what to expect on Tuesday, November 3rd, 2020.
We're (naturally) calling the series "Foresight 2020". And we begin with "deepfakes."
What are “deepfakes”?
- Deepfakes are videos, images, and audio that use artificial intelligence (AI) to create false evidence of people saying or doing things that they actually didn’t do. (Like President Donald Trump and Kim Jong-un fist-bumping.)
- The name is a combination of “deep learning” and “fake.” AI needs source material to “learn”—and politicians provide ample images and video for AI to access.
- One widely circulated deepfake appears to show North Korean leader Kim Jong-un shaking hands with Elvis Presley.
- Another appears to show Barack Obama delivering a completely fabricated public-service announcement about the dangers of deepfakes.
What’s the concern?
Manipulated videos can spread widely online before they are labeled as fake.
- This is what happened in May 2019, when a faked video of House Speaker Nancy Pelosi (D-CA) appearing drunk went viral across social media.
A fake video of a world leader or politician making an incendiary remark could set off a trade war—or even a conventional one.
- An ambiguous tweet by Trump regarding the trade war with China can send the markets reeling. Now imagine the effects on financial markets if Chinese President Xi Jinping were to appear in a video saying: “We will never reach a trade deal with the U.S. and are raising tariffs to 50%, effective immediately.”
Deepfake manipulations could become so ubiquitous that people are unwilling to trust video or audio evidence.
- Politicians could claim that a compromising video or audio recording was not actually them but an AI-crafted deepfake.
Timing is everything
- It’s Tuesday, November 3rd, 2020. You wake up to a trending video: a politician engaged in sexually abusive behavior. You don’t know whether the video is legitimate or a deepfake. Pundits on both sides of the aisle insist that it’s authentic or that it’s a fabrication. You only have 12 hours to vote—do you pull the lever for that politician?
Report highlights Instagram & deepfakes as key disinformation threats in 2020 elections
- Deepfake videos of candidates will pose a threat during the 2020 election, according to a September 2019 report by New York University’s Stern Center for Business and Human Rights.
- The report - Disinformation and the 2020 Election: How Social Media Should Prepare - predicts that deepfakes will be unleashed across the media landscape "to portray candidates saying and doing things they never said or did" and, as a result, "unwitting Americans could be manipulated into participating in real-world rallies and protests."
- The NYU report pinpointed Russia, China, and Iran as countries likely to disseminate fabrications in an attempt to sway public opinion regarding the next occupant of the White House.
- However, and perhaps more alarmingly, the report finds that "domestic disinformation will prove more prevalent than false content from foreign sources."
How is social media reacting?
- Most social media companies do not have a specific policy against deepfakes.
- Instagram recently launched a tool that allows users to report misinformation on the platform.
- Facebook CEO Mark Zuckerberg said the company is evaluating how it should handle deepfakes.
- Twitter said its rules “clearly prohibit coordinated account manipulation, malicious automation, and fake accounts.”
What can Congress do?
- The Deepfake Report Act of 2019 would require the Department of Homeland Security to conduct an annual study of deepfakes and related content. DHS would be required to assess: 1) the technology used to generate deepfakes, 2) deepfakes’ uses by foreign and domestic entities, 3) countermeasures to deepfakes, and 4) potential statutory and regulatory authorities to address this threat.
- The Honest Ads Act of 2019, a bipartisan bill reintroduced earlier this year, aims to improve transparency around who purchases political ads.
- The PAID AD Act would make it illegal for foreign nationals to purchase broadcast, cable, satellite, or digital communications naming a candidate for office at any point in time. It would also prevent foreign governments and foreign lobbyists from buying issue ads.
- Virginia in July 2019 expanded its nonconsensual pornography ban to include deepfakes and “deepnudes.” The law makes it illegal to share nude photos or videos without the subject's permission, regardless of whether the images are real or fake.
What do you think?
Are you concerned about deepfakes in the 2020 election? Should Congress pass any of the above bills? Should social media outlets adopt policies against deepfakes? Take action above and tell your reps, then share your thoughts below.
(Photo Credit: Screen Capture from Deepfake "Donald Trump Russell White deepfake")