Q&A: 10 Questions About Facebook and Russian Propaganda

Siva Vaidhyanathan is the Robertson Professor of Media Studies and the director of UVA’s Center for Media and Citizenship. (Photo by Dan Addison, University Communications)
November 02, 2017

This week, lawyers from Facebook, Google and Twitter testified before three congressional hearings on how Russian agents are using their platforms to influence American politics. They confirmed that Russia-backed social media ads reached at least 126 million Facebook users and 20 million Instagram users. Russian actors also sent hundreds of thousands of tweets and created at least 1,000 YouTube videos, lawmakers said.

Most of the content was aimed at fomenting division among the American electorate by promoting angry rhetoric during the 2016 presidential election and exploiting contentious issues like race, police brutality and religion.

As examples, lawmakers released more than 3,000 Facebook ads purchased by Russian operatives. In one example, published by a Russia-linked account called “Army of Jesus,” the ad blared “Satan: If I win, Clinton wins!” Another, published by a “Blacktivist” account, accused the government of supporting the Ku Klux Klan. Still others stoked fears on both sides of the political spectrum around topics like border protection, police brutality, the Black Lives Matter movement and the growth of Islam.

In a particularly striking example, the Senate Intelligence Committee highlighted two Facebook posts from the Russian propaganda group Internet Research Agency. One, shared by the Russia-linked Heart of Texas Facebook page, encouraged followers to attend an anti-Islamic rally in Houston. Another, from a page called United Muslims of America, promoted a pro-Muslim rally at the same time and place. The virtual ads resulted in a real shouting match at the Islamic Center in Houston, where both groups gathered on May 21, 2016.

To learn more about the debate surrounding these fake ads and what can be done about them, we spoke with UVA’s Robertson Professor of Media Studies Siva Vaidhyanathan, who has studied social media for years and directs UVA’s Center for Media and Citizenship. He is exploring social media’s effects on democracy in an upcoming book, “Anti-Social Media,” and spoke on the topic at this week’s Obama Foundation Summit.

Q. Much of the congressional hearings focused on Facebook ads purchased by Russian agents. Where do these ads appear on our newsfeeds?

A. On Facebook, it is very hard to tell the difference between an advertisement and user-generated content, because anything that comes across your newsfeed is presented in the same way.

Most of the ads that we are talking about right now are memes – images with superimposed text – that show up in people’s newsfeeds. They have been shared from fake Facebook pages or groups set up by Russian actors to promote causes like Texan independence or Black Lives Matter. These pages are fronts, and they have produced hundreds of images meant to generate alarm among some segment of the population. It seems that their goal is to spread cynicism about democracy and push on pressure points in our society.

Q. How are these fake ads targeted?

A. Facebook has a rich dossier on every one of their 2.1 billion users worldwide. Not only do they have the information you put in your profile; they can track which publications you read and which links you click on. Facebook also purchases consumer information from database companies that monitor credit card usage, so they can correlate your profile information with what you purchase at Target, for example. Taken together, it is the most powerful profiling tool out there.

Any organization – including the Russian actors Congress is looking at – can pay to promote a Facebook post to specific users with a checklist of attributes based on that data. For example, UVA’s Center for Media and Citizenship has a Facebook page. I paid $75 to promote our podcast on psychology to people who live in North America and the United Kingdom who have a master’s degree and have expressed interest in psychology. It reached about 20,000 people.

Q. Political ads on radio or broadcast networks have long been governed by a specific set of regulations. Do similar regulations apply on social media?

A. Right now, campaign ads on social media have no effective regulation. A candidate can accuse another candidate of shoplifting 48 hours before an election and target the ad so precisely and quietly that the victim of the ad has no response. Facebook has pledged to change this before the 2018 midterms, but right now it could factor into the current Virginia governor’s race, for example.

Q. Congress has identified about 3,000 Facebook ads purchased by Russian agents so far. Is this just the tip of the iceberg?

A. Absolutely. Right now, they are only looking at one player in the Russian propaganda game, called the Internet Research Agency. I believe Sen. [Mark] Warner [the Virginia Democratic senator on the Senate Intelligence Committee] is convinced it is a much bigger problem.

There are also domestic forces doing the same thing. White nationalist groups, anti-Semitic groups, anti-government groups and just plain pranksters are running similar campaigns sowing cynicism and distrust.

Q. Representatives from Twitter and Google were also questioned. How has political propaganda played out on those platforms?

A. Twitter is a bit different because it does not have anything close to the influence of Facebook. It has about 300 million users worldwide – probably a third of which are fake “bot” accounts – and that number is not growing. Twitter also does not have nearly as much data on its users, meaning that its advertisements are less targeted. 

Google, on the other hand, is a tremendous player in this issue, largely because of YouTube, which Google owns. YouTube is the most influential source of video in the world, and all sorts of nefarious groups post video there. Google also has a rich dossier on its users, based on all of the search data generated every day.

Q. Have other countries faced similar issues regarding the use of social media sites by political groups?

A. Yes, there are several examples, including some I talked about at the Obama Summit. The ruling Bharatiya Janata Party in India has run several campaigns almost entirely on social media, and the prime minister, Narendra Modi, is a master of social media. Armies of people there have constructed and distributed propaganda on Facebook and WhatsApp, the messaging platform that Facebook owns. Much of it is anti-Muslim.

In the Philippines, Rodrigo Duterte ran his campaign with a similar playbook. In Myanmar, anti-Muslim sentiment has been promoted by a small group of Buddhists, but spread far and wide on Facebook.

[See Vaidhyanathan’s full talk with Dutch politician Marietje Schaake at the Obama Foundation Summit below.]

Q. Are there any steps Facebook can take to reduce this problem?

A. I believe most of the talk of reform is cosmetic. Facebook is working exactly as it is supposed to. It is designed to be an algorithmic system that precisely targets both user-generated content and advertisements. It creates and enforces ideological bubbles and pushes our emotional buttons. It amplifies both our best and worst qualities.

I believe we will continue to see this kind of manipulation, and these threats to democracy, as long as Facebook has more than 2 billion users and continues making more than $10 billion per quarter. 

Q. Can you identify any regulatory solutions that could help?

A. Yes, but I am not confident they will be strong enough. I think we need to take another look at antitrust laws, which of course go way beyond Facebook. I would consider breaking Facebook up into its constituent parts – WhatsApp, Instagram, Facebook Messenger and Oculus Rift, its virtual reality product.

I also think the United States should consider data protection laws like those in Europe, where companies must get explicit permission to access user-generated data. That has not decreased Facebook’s influence in Europe, but it has created more awareness about the power that companies like Facebook have.

Q. How can individual users identify fake ads or propaganda on their newsfeeds?

A. I don’t believe that burden should fall to users, because it is not an easy thing to do. Still, we should all recognize that Facebook has built-in biases that favor highly emotional content.

The angrier something makes you, the farther it will go on Facebook, and the less likely it is to be a true and full story. We need to understand that we are all potential victims of propaganda and see the risks of getting our news on Facebook.

Q. Do you have any other suggestions for users wanting to better control how social media platforms influence their lives and politics?

A. For better or for worse, for much of the last decade we have increasingly lived through our phones. The apps on our phones are designed to be addictive and to make things more convenient.

I think we need to start valuing inconvenience. Start appreciating quiet times, start reaching out to people in the real world. Focus on getting to know each other, getting to know our neighbors. “Re-humanize” ourselves, if you will.

Media Contact

Caroline Newman

Senior Writer and Assistant Editor of Illimitable, Office of University Communications