Your Turn: What Should We Do About Foreign Influence on Our Elections?
Vote to see how others feel about this issue
The story
Facebook said on Tuesday that it identified a political influence campaign that was potentially built to disrupt the midterm elections, with the company detecting and removing 32 pages and fake accounts that had engaged in activity around divisive social issues.
Today, the Senate Intelligence Committee held a hearing with social media experts on foreign influence operations aimed at disrupting U.S. elections.
Renee DiResta, Director of Research at New Knowledge, testified:
“This is one of the defining threats of our generation… The social media platforms cannot and should not bear the sole responsibility for defending our democracy and public discourse.”
Background
The experts provided an extensive history of foreign influence on public discourse.
In the 21st century, the use and impact of disinformation have become widespread. In 2014, the Russian government used disinformation to create counter-narratives after it annexed Crimea, and again after Russian-backed separatists shot down Malaysia Airlines Flight 17 over eastern Ukraine.
Since then, the experts testified today, Russia has weaponized U.S. social media platforms. John Kelly, Founder and CEO of Graphika, said:
“The Russian government is now doing to us what they did at home and in Eastern Europe a decade ago.”
Laura Rosenberger, Director of the Alliance for Securing Democracy at The German Marshall Fund of the United States, testified:
“Our authoritarian adversaries are using these platforms because controlling the means of information is a powerful tool in advancing their agendas around the world.”
Fake news became a global concept, introduced to billions of people largely through its role in the 2016 U.S. presidential election. Over the course of the election cycle, fake news stories were shared more widely on Facebook than legitimate news stories, which analysts attributed to fake news pandering to readers' expectations or simply being more exciting than legitimate news.
Philip Howard, Director of the Oxford Internet Institute, studied web traffic in the U.S. in the lead-up to the presidential election. He found that roughly half of all news on Twitter directed at the swing state of Michigan was fake. The other experts confirmed that disinformation was targeted much more heavily at swing states.
Are some people more susceptible to disinformation than others?
In his testimony today, Howard said:
“We’ve found junk news to be particularly appetizing to the far right, white supremacists, and Trump supporters, though notably, not conservatives.”
According to BuzzFeed, during the final three months of the presidential campaign, 17 of the top 20 fake election-related articles on Facebook were anti-Clinton or pro-Trump.
However, Kim LaCapria of the fact-checking website Snopes has argued that in the United States, fake news is a bipartisan phenomenon, saying:
"There has always been a sincerely held yet erroneous belief misinformation is more red than blue in America, and that has never been true.”
Jeff Green of The Trade Desk agrees the phenomenon affects both sides. Green’s company found that affluent, well-educated people in their 40s and 50s are the primary consumers of fake news. He told 60 Minutes that this audience tends to live in an “echo chamber,” and that these are the people who vote.
How does it work?
In a practice known as “computational propaganda,” fake news publishers use “bots” to make their content appear more popular than it is. Bots are fake social media accounts programmed to automatically like and/or retweet a particular message. This triggers elements of social media algorithms that prioritize popular posts, making it more likely that the article will gain wide traction.
Kelly testified:
“The automated accounts of the far left and the far right of our political spectrum produce 20 to 30 times the messaging of legitimate accounts.”
The articles, memes, and other content are designed to inflame existing divisions. At today’s hearing, Howard testified that it’s not so much about creating a single counter-narrative, but rather about creating multiple, often ridiculous stories to sow discord, division, and confusion along existing political and social fault lines.
Kelly said that Russian disinformation propagators would often put out content designed to appeal to both sides of a tense social divide, such as Black Lives Matter and Blue Lives Matter, or Muslim and Christian groups. According to Kelly:
“The goal is to get angry citizens to confront each other in the streets.”
During the 2016 presidential election, Russian operatives working for the St. Petersburg-based Internet Research Agency used multiple social media accounts — including ones called “Woke Blacks” and “Blacktivists” — to urge black Americans to vote for third-party candidates or sit out the election entirely.
Kelly testified that Russian disinformation efforts have not stopped since the 2016 election, and have even accelerated in the lead-up to the midterms.
DiResta said:
“We’re in the midst of an arms race in which responsibility for the integrity of public discourse is largely in the hands of private social platforms.”
What to do?
Dr. Todd Helmus, Senior Behavioral Scientist at the RAND Corporation, said today that we need media literacy training in public schools.
Rosenberger argued that social media users need more information about what they’re seeing and why they’re seeing it. She applauded Facebook’s announcement yesterday, saying we need more such transparency.
Rosenberger also said that the Honest Ads Act and the Secure Elections Act are important legal frameworks for addressing these challenges.
The experts generally agreed that public-private partnerships are necessary, and that social media companies cannot manage this threat alone. DiResta noted that this does not necessarily mean someone has to become the arbiter of truth:
“One problem with the platforms is that they’ve thought they need to address the core of the narrative, when they really need to target the process of dissemination.”
What do you think?
How should the U.S. government combat foreign influence on elections? Hit Take Action to tell your reps, then share your thoughts below.
—Sara E. Murphy
(Photo credit: iStock.com / BigFishDesign)