Put Yourself In A
Russian Bot's Shoes

An Interactive Dive into Russian Interference Tactics in the 2016 U.S. Elections
Note: We use the terms "bot" and "troll" to refer generally to fake accounts used for Russian interference.
*All authors contributed equally to this article.
May 22, 2020

Network of Russian Twitter bots (red) mentioning real accounts (green) and one another from 2014-2017.

Prelude

It’s difficult to think about democracy in the midst of a global pandemic. In addition to dominating headlines and the mental space of most citizens, the coronavirus has led to the postponement of various state primaries and halted campaign events, largely placing the 2020 presidential election in the backseat of our minds and calendars.
However, one effect of the pandemic is not so removed from the November elections. Forced to stay home and observe social distancing, Americans are online now more than ever. According to the ACA (America's Communication Association), download usage in the U.S. has increased by 27% during the pandemic, with upload usage up by 36%. For those of us confined at home, time spent on social media and streaming sites has replaced our day-to-day rhythms outside the house. Even more than before, online media has become a vital channel for accessing information about the world. With critical health outcomes depending on our collective knowledge of and response to the pandemic, online misinformation and fake news have become all the more dangerous.
In this article, we seek to understand one particular source of such misinformation: foreign actors using Twitter to pose as Americans and spread polarizing opinions. To do this, we take a deep dive into the strategies Russian agents used to deceive the American public in the 2016 presidential elections. Importantly, we show that election interference is a pervasive, ongoing force, one that is likely at work at this very moment. At a time when Americans are online more than ever, we must remain vigilant against such threats, not only for the sake of our next election but for the integrity of our online dialogue as a whole.

Introduction

Almost a year after taking office, President Donald Trump retweeted a Russian bot. The original tweet was authored by a fake account called @10_gop, which billed itself as the “Unofficial Twitter of Tennessee Republicans.” Before its removal, the tweet was favorited by nearly 45,000 people and retweeted by thousands more. By the time it was published in 2017, the American public had already been informed about Russian interference in the 2016 presidential elections; yet this Russian bot still slipped effortlessly into the president’s own feed. How, then, did this fake account succeed so spectacularly, and the integrity of American online dialogue fail so badly?
Since 2016, Russian interference in the presidential election has become public knowledge. What is perhaps less obvious is the series of targeted strategies Russian actors used to achieve remarkable success in hacking American trust. In this article, we explore the subtle strategies Russian trolls used to push and pull our dialogue during the elections: how they established authenticity as “genuine Americans,” mimicked plausible authority, and magnified one another’s deceptions through retweets and mentions. For our analyses, we draw on a dataset of 200,000 Russian troll tweets, collected by NBC, posted around the time of the 2016 elections. By understanding how these fake accounts worked and succeeded, we seek to identify the biggest vulnerabilities in American digital trust and offer insight into possible defenses against similar interference in upcoming elections.

Mimicking the American Voice

The first key success of the 2016 Russian interference was in setting up Twitter accounts that impersonated credible or prominent American voices. Bots prepared years in advance of the 2016 elections, building their credibility either as run-of-the-mill American neighbors or as U.S. news sources. We begin by exploring these two strategies for mimicking the American voice: establishing authenticity and establishing authority.

Establishing Authenticity

Russian agents used a variety of rhetorical strategies to establish their fake Twitter accounts as authentic American voices and subsequently inject divisive dialogue into the American election debate. In the following wordclouds, we display the bots’ most frequently used hashtags in 2014, 2015, 2016, and 2017.
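As a rough sketch of how wordclouds like these can be generated, consider the following; the column names `text` and `created_str`, the filename, and the use of the `wordcloud` package are our assumptions, not details of the original pipeline.

```python
import re
from collections import Counter

import pandas as pd
from wordcloud import WordCloud  # pip install wordcloud

# Assumed column names ("text", "created_str"); adjust to the actual CSV schema.
tweets = pd.read_csv("tweets.csv", parse_dates=["created_str"])

def hashtag_counts(df: pd.DataFrame, year: int) -> Counter:
    """Count hashtags across all tweets posted during the given year."""
    texts = df.loc[df["created_str"].dt.year == year, "text"].dropna()
    tags = (tag.lower() for text in texts for tag in re.findall(r"#(\w+)", text))
    return Counter(tags)

for year in (2014, 2015, 2016, 2017):
    counts = hashtag_counts(tweets, year)
    cloud = WordCloud(width=800, height=400).generate_from_frequencies(counts)
    cloud.to_file(f"hashtags_{year}.png")  # one wordcloud image per year
```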

In 2014, just before election activity begins to stir, we see the Russian bots establishing themselves as authentic “American” accounts. Hashtags such as #Love and #USA dominate tweets in this year, as well as interesting references to music (#Itunes, #rap, #RAPCORE), famous Americans (#Kurt_Vonnegut, #SteveJobs), and emotions (#happy, #sadness, #lonely). This early setup stage— two years before the 2016 elections— indicates significant and intentional forethought on the part of Russian agents as they prepared their bots to later inject divisive dialogue during the U.S. presidential election.

In 2015, the Russian bots publish increasingly politically-charged tweets, masquerading as Americans as they react to hot topics and current events (#SanBernardino, #Prayers4California, #GOPDebate).

In 2016, the Russian bots peak in their divisive rhetoric, frequently supporting Donald Trump (#maga, #Trump2016) and antagonizing Hillary Clinton (#NeverHillary, #CrookedHillary).

Then, even after President Trump wins the 2016 elections, Russian bots continue to pose as Americans. In 2017, we see these fake accounts transition from politically-charged tweets back to ones about everyday American topics, much like their 2014 tweets that mimicked the everyday American voice. Alarmingly, this suggests that Russian actors sought to preserve the perceived authenticity of their fake accounts, likely to leave the door open for another interference attempt in the future.

Interactive Wordcloud

In the interactive wordcloud below, we present the hashtags used by Russian bots for a given range of time. Adjust the slider to change the time range.

Establishing Authority

In addition to mimicking an authentic American voice, Russian actors also sought to establish authority by garnering followers and posing as influential American news sources and institutions. To show this, we begin with a plot of popularity, placing the number of followers along the x-axis and the total number of favorites along the y-axis. We also encode the total number of tweets in the size of each point: the bigger the circle, the more tweets that account posted.
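A minimal matplotlib sketch of such a plot, assuming a per-account summary table with `followers`, `favorites`, and `tweet_count` columns (our naming, not the dataset's):

```python
import matplotlib.pyplot as plt
import pandas as pd

# One row per troll account; column names and filename are our own assumption.
accounts = pd.read_csv("troll_accounts.csv")

plt.scatter(
    accounts["followers"],           # x-axis: follower count
    accounts["favorites"],           # y-axis: total favorites across all tweets
    s=accounts["tweet_count"] / 10,  # marker area encodes how many tweets the account posted
    alpha=0.5,
)
plt.xlabel("Followers")
plt.ylabel("Total favorites")
plt.title("Troll account popularity")
plt.show()
```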

Interestingly, most of the less active trolls (smaller circles) fall into a dense column near the y-axis; these bots post tweets that garner many favorites, despite having fewer followers. Many of these accounts pose as regular people, with handles like @KarenParker93, @MsM0re, and @RealRobert1987.

However, we also see a band of more active accounts (larger circles) along the x-axis that do not get many favorites, but do have many followers. Interestingly, these low-favorited, well-followed bots are almost all posing as American news sources.

The bots’ account handles demonstrate how closely they targeted specific U.S. audiences. Many, like @TodayNYCity, focused on densely populated American cities. Others copied the names of real news outlets but replaced the word “news” with “new,” as in @ChicagoDailyNew or @DetroitDailyNew, to mislead readers into believing they were the actual outlets. As the previous plot indicates, these fake news accounts had high follower counts, testifying to their success at impersonation.
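This naming trick is simple enough to screen for programmatically; a toy sketch, where the handle list is illustrative rather than drawn from the dataset:

```python
# Illustrative handles; the real list would come from the dataset.
handles = ["ChicagoDailyNew", "DetroitDailyNew", "TodayNYCity", "KarenParker93"]

# Flag handles that imitate a news outlet but drop the final "s" from "News".
suspicious = [h for h in handles if h.lower().endswith("new")]
print(suspicious)  # ['ChicagoDailyNew', 'DetroitDailyNew']
```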

Notice that @ChicagoDailyNew and @DetroitDailyNew have the exact same account description, aside from the location mentioned. These cookie-cutter fake news accounts provide a glimpse into the strategic and systematic spread of misinformation by Russian actors in the 2016 U.S. elections.

Legitimacy for Russian troll accounts meant more than appearing American; it also meant appearing to be an authoritative source.

In the upcoming interactive graph, we present a subset (indicated by the box) of troll accounts by number of followers, favorites, and posts. Explore bios of different troll accounts by clicking on a dot.

Interactive Popularity Plot

In the interactive plot below, we plot each fake account's follower count against the total favorites on its tweets to show trends.
Hover over accounts and click them to see more information.

Highlight fake news accounts

User Bio

By mimicking the American voice and establishing authority as fake news outlets, Russian actors were able to fool others into believing their tweets came from real Americans. In the following section, we detail some of the impacts of this interference, showing how real users unknowingly interacted with these Russian bots on Twitter and amplified their voices.

The Mentions Network

In addition to mimicking the American voice, Russian bots actively worked to mention other accounts and grow their following. To explore this, we examined how bots interacted with one another and with real Twitter accounts through a network graph analysis of troll mentions.



We initially visualized the full network of users that Russian bots mentioned using a directed graph. In the graph above, trolls appear in red and real users in green. While the graph displays around 14,000 accounts, only about 200 of these are troll accounts, meaning each troll reached roughly 70 real accounts on average. Next, we took a closer look at the accounts that were mentioned most frequently.
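A sketch of how such a mentions network can be assembled with networkx, assuming the dataset provides a troll handle column (here called `user_key`) and the tweet text; these names are our assumptions:

```python
import re

import matplotlib.pyplot as plt
import networkx as nx
import pandas as pd

tweets = pd.read_csv("tweets.csv").dropna(subset=["user_key", "text"])
troll_handles = set(tweets["user_key"].str.lower())

# Directed graph: an edge troll -> account for every @mention in a troll's tweet.
G = nx.DiGraph()
for _, row in tweets.iterrows():
    source = row["user_key"].lower()
    for mention in re.findall(r"@(\w+)", row["text"]):
        target = mention.lower()
        if G.has_edge(source, target):
            G[source][target]["weight"] += 1  # count repeat mentions
        else:
            G.add_edge(source, target, weight=1)

# Trolls in red, real (mentioned-only) accounts in green.
colors = ["red" if node in troll_handles else "green" for node in G]
nx.draw(G, node_color=colors, node_size=10, arrows=False)
plt.show()
```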

Here, we see accounts that were mentioned by Russian trolls at least ten times. Notice that a clearer pattern now emerges: there are clusters of topics and users that each troll or group of trolls targeted, indicating that these mentions were strategic rather than random.
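This filtered view corresponds to keeping accounts whose weighted in-degree is at least ten; a sketch, continuing from the graph `G` built above:

```python
# Accounts mentioned at least ten times (edge weights count repeat mentions).
popular = {n for n, deg in G.in_degree(weight="weight") if deg >= 10}

# Keep those accounts plus the trolls that mention them.
keep = popular | {src for tgt in popular for src in G.predecessors(tgt)}
H = G.subgraph(keep)
```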

In the three main groups of mentions shown, we see differences in the content of mentions from trolls to real users as well as the identity of the real users. We labeled each of the three groups with the most common content category of the tweets.

Group 1 was the least dense of the groups, with a total of about 400 tweets and only 9 real accounts. We found that the majority of the real users in this group tweeted left-leaning content; only one user supported Trump. Interestingly, the tweets from these trolls were not pushing a single agenda; instead, the trolls retweeted whatever the real accounts had posted. For instance, @toneporter, a fake account, mentioned the left-leaning real account @chiefplan1 by tweeting “RT @chiefplan1: Debate watch: Notice how, w/every lie Delusional Donnie tells, bags under his eyes get bigger--fuller of baggage...” Meanwhile, other troll accounts retweeted a real pro-Trump account saying “RT @Shane_Rodenbeck: #IHaveARightToKnow if Obama is planning on moving back to his birthplace, Kenya.” Contrary to the expectation that Russian trolls would primarily retweet pro-Trump content, this group primarily retweeted anti-Trump content, indicating a more complex strategy of polarizing American dialogue online.

Looking at the most common words within this group’s tweets, “Hillary” comes up first with 77 occurrences and “Trump” second with 70. The numbers drop sharply for all other words, though we see a focus on the presidential debates through words such as “delusional” (36 occurrences), “vote” (32), “healthcare” (26), and “RejectedDebateTopics” (14).
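Word counts like these come from simple token frequencies over a group's tweets; a sketch, where `group1_texts` is a hypothetical list holding this cluster's tweet strings:

```python
import re
from collections import Counter

def top_words(texts, n=10):
    """Rank case-insensitive word frequencies across a group's tweets."""
    stop = {"rt", "the", "a", "an", "to", "of", "and", "is", "in",
            "on", "for", "with", "https", "co"}
    counts = Counter(
        word
        for text in texts
        for word in re.findall(r"[a-z]\w+", text.lower())
        if word not in stop
    )
    return counts.most_common(n)

# group1_texts is assumed to hold this cluster's tweet strings.
print(top_words(group1_texts))  # expected to resemble [('hillary', 77), ('trump', 70), ...]
```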

Next, group 2 was considerably denser than group 1, consisting of about 1,300 tweets and 25 real accounts. As in group 1, most of the real accounts here were liberal, but they were much more well-known public figures, with a particularly high number of activists. For instance, we see @shaunking, @jamilsmith, @revjjackson, and @deray, all African American activists with between 150K and more than 1M followers. The trolls in this group occasionally expressed opinions different from the real account they mentioned and, at times, used mentions to attack that account’s statements. For instance, @ten_gop tweeted “WOW! @Deray and @POTUS [referring to Obama] plotting against America for nearly 4.5 hours! So how many plans for riots?!” This exemplifies one of group 2’s trolling strategies: using mentions of real public figures to magnify the trolls’ own voices.

The topics discussed in group 2 centered on news headlines speculating about Trump’s policies. For example, @javonhidp, a fake account, tweeted “RT @blicqer: ▶ On Trump’s Education Policy https://t.co/AgECMmglkY @NatCounterPunch”. Whether or not the tweets supported the mentioned headline, we noticed significantly more visibility for Donald Trump in this group and fewer mentions of Hillary Clinton: the word “Trump” showed up 441 times and “Donald” 154 times, while “Clinton” showed up only 117 times and “Hillary” 69 times.

Lastly, group 3 was the densest, with about 5,000 tweets and 60 real accounts. This group of trolls mentioned the real accounts of prominent Trump supporters and public figures from the GOP, including @tedcruz, @seanhannity, @erictrump, and @mike_pence. Unlike groups 1 and 2, group 3 had fewer trolls than real accounts.

In this group, Trump and Clinton were mentioned a similar number of times, at 946 and 845 occurrences respectively. The tweets themselves were mostly retweets of strong attacks on Hillary Clinton or vigorous endorsements of Trump. Common attacks on Clinton referred to her emails and Benghazi, and hashtags such as #NeverHillary were among the most used. One of the most frequently retweeted tweets was “RT @DonaldJTrumpJr: Bernie Delegate: DNC is Replacing Sanders Supporters With Paid Seat Fillers to Create Fake Unity https://t.co/dOQpqVruV…”. Tweets like these targeted Bernie supporters who were suspicious of the DNC’s treatment of Sanders. Generally, these trolls retweeted posts that aimed to steer voters away from Hillary Clinton and toward other candidates: “RT @Tevorbowles: @PrisonPlanet @DrJillStein Please urge your supporters to go to Trump then Ms. Stein. Hillary is a dangerous sociopath.”

To investigate the bots’ strategies more closely, we removed all the real accounts from this graph and examined only the trolls’ interactions with one another. One thing becomes immediately clear: most trolls did not mention other trolls. More specifically, only 27% of the trolls interacted with other trolls.
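Continuing with the graph `G` and the set `troll_handles` from the earlier sketch, the trolls-only view and the 27% figure can be approximated as:

```python
# Restrict the graph to troll accounts only.
T = G.subgraph(troll_handles & set(G))

# Fraction of trolls that mention, or are mentioned by, at least one other troll.
active = sum(1 for n in T if T.degree(n) > 0)
print(f"{active / len(troll_handles):.0%} of trolls interacted with other trolls")
```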

One node that stands out in this graph is @ten_gop. In the graphs that included real users, @ten_gop did not seem particularly significant. Here, however, we can see that @ten_gop reached a wide audience through other trolls, likely creating perceived social proof for its ideas. As detailed by the Russia Tweets project, this fake account garnered over 145,000 followers, and its backup account @10_gop was retweeted by @realdonaldtrump himself.

In addition to @ten_gop, another fake account stands out when we visualize the full mentions network with all accounts and edges. Notice how the single bot @ameliebaldwin connects a swath of real users to the central cluster of bots; this account is the most connected troll in the full graph. On closer inspection, we found that this bot took the approach of mentioning many real accounts a few times each. Other bots follow a similar pattern, but @ameliebaldwin is the clear front-runner in sheer volume of mentions. Note the contrast between the two accounts: @ameliebaldwin is barely noticeable in its interactions with other trolls (see the trolls-only network) but very noticeable in its interactions with real users, whereas @ten_gop is prominent in the trolls-only network, actively mentioning and being mentioned by other trolls, yet far less prominent in networks that contain both real and troll accounts.
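One way to surface a connector like @ameliebaldwin is to rank each troll by the number of distinct real accounts it mentions, again continuing from the sketch above:

```python
# For each troll, count distinct real accounts among its out-neighbors.
real_reach = {
    troll: sum(1 for nbr in G.successors(troll) if nbr not in troll_handles)
    for troll in troll_handles & set(G)
}
top_troll = max(real_reach, key=real_reach.get)
print(top_troll, real_reach[top_troll])  # we would expect @ameliebaldwin to rank first
```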

Interactive Mentions Network

In the interactive network below, we visualize Twitter accounts as nodes and mentions as edges.
Hover over nodes and click to reveal more information (you can multi-select using shift+click).

Start Date:

End Date:

Only show nodes with at least

connections

*Note: Minimum value for the above is 10

Remove non-trolls


Limitations: While there are useful insights to be gained from this analysis, note that this graph is limited because it lacks the corresponding mentions made by real accounts.
Future work could look at real accounts’ mentions of Russian trolls and analyze how such connections developed over time.

Digging into the Tweets

We’ve seen two drastically different approaches to election interference: the affected authenticity of @ten_gop, which aimed to create human-sounding content, and the retweet machine of @ameliebaldwin.

Even the frequency of their activity differs vastly; @ten_gop posts regularly and consistently, while @ameliebaldwin remains dormant for long periods of time and suddenly tweets en masse.

For instance, on September 17, 2016 alone, she tweeted 834 times.
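Bursts like this are easy to detect by resampling each account's tweets into daily counts; a sketch, assuming the same `user_key` and `created_str` columns as before:

```python
import pandas as pd

tweets = pd.read_csv("tweets.csv", parse_dates=["created_str"])

# Daily tweet counts per account; large spikes reveal burst-tweeting bots.
daily = (
    tweets.set_index("created_str")
          .groupby("user_key")
          .resample("D")
          .size()
)
burst = daily.loc["ameliebaldwin"]
print(burst.idxmax(), burst.max())  # the September 17, 2016 spike of 834 tweets
```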

In the upcoming interactive feed, explore the tweets of fake accounts @ameliebaldwin and @ten_gop during and after the 2016 election period, and compare their trolling approaches.

Interactive Fake Tweets

In this interactive feed, we show tweets from the @ameliebaldwin and @ten_gop accounts.
Amelie Baldwin
@ameliebaldwin
Wife, Mother, Patriot, Friend
Tennessee
@ten_gop
Unofficial Twitter of Tennessee Republicans. Covering breaking news, national politics, foreign policy and more. #MAGA #2A

The Staggering Impact of these
Fake Accounts, in Numbers

The Attack

454
fake accounts
203,482
tweets
33,162
mentions
1,170
days

The Impact

2,061,661
favorites
2,302,521
retweets
=
times Russian agents manipulated U.S. elections

Final Thoughts

In this article, we’ve demonstrated that the Russian attacks on the 2016 U.S. presidential elections were strategic, multifaceted, and alarmingly successful. Troll accounts were systematically built to mimic individual Americans and reputable news outlets. After establishing these accounts, trolls used mentions and retweets, often of each other, to amplify their messages.

Perhaps most importantly, Russian interference continued after President Trump was sworn into office, and is likely happening to this very day. This calls for serious efforts by social media platforms and the U.S. government to prepare for foreign interference once more in the upcoming 2020 presidential elections. Each of us also has a part to play. By understanding how Russian trolls behave and acknowledging their presence in our social media feeds, we can all help defend our online dialogue by reading with caution and reporting accounts that seem suspicious. Four years ago, we failed to notice and correct for foreign interference in our online conversations. Political preferences aside, Americans deserve to know when the opinions they read online come not from their fellow neighbors, but from a Russian bot. By presenting this analysis, we hope to shed light on the strategic and intentional deception tactics Russian agents used to attack the integrity of the 2016 elections.

Hopefully, we will be better prepared next time.