Advanced Persistent Threat
A Politico story published today reports on a ‘sustained and ongoing’ disinformation assault targeting Democratic presidential candidates.
A wide-ranging disinformation campaign aimed at Democratic 2020 candidates is already underway on social media, with signs that foreign state actors are driving at least some of the activity.
The main targets appear to be Sens. Kamala Harris (D-Calif.), Elizabeth Warren (D-Mass.) and Bernie Sanders (I-Vt.), and former Rep. Beto O’Rourke (D-Texas), four of the most prominent announced or prospective candidates for president.
Not all of the activity is organized. Much of it appears to be organic, a reflection of the politically polarizing nature of some of the candidates. But there are clear signs of a coordinated effort of undetermined size that shares similar characteristics with the computational propaganda attacks launched by online trolls at Russia’s Internet Research Agency in the 2016 presidential campaign, which special counsel Robert Mueller accused of aiming to undermine the political process and elevate Donald Trump.
But the researchers caution that the organized and the organic are not always easy to tease apart in the face of ongoing misinformation efforts.
These findings are consistent with our own data and we will report on them in more detail over the next few days. We will start tomorrow by discussing a series of racially-oriented attacks on Kamala Harris.
Tracking the Problem
No single organization has responsibility for tracking these misinformation campaigns. Even worse, researchers often disagree widely on basic questions of definition and process when confronted with the task of amelioration. If we want a political environment free of these influences, it is entirely up to us to make that happen. Politico quotes Brett Horvath, a Guardians.ai founder:
“Moderates and centrists and Democratic candidates still don’t understand what happened in 2016, and they didn’t realize, like Hillary Clinton, that she wasn’t just running a presidential campaign, she was involved in a global information war,” Horvath said. “Democratic candidates and presidential candidates in the center and on the right who don’t understand that aren’t just going to have a difficult campaign, they’re going to allow their campaign to be an unwitting amplifier of someone else’s attempts to further divide Americans.”
This is correct. We’ll be adding their list of suspect accounts to our bias visualization.
At Marvelous AI, we are most interested in understanding the messages and their target audiences. In other words, we tend to focus on defense, regardless of the identity of the attacker. So I will spend a little more time than usual on background in this post. That way we can talk in much more detail about our own research later with the proper context in mind.
Viral Deception in Online Media
When we started working on Marvelous AI, one of the core motivators was the 2016 US Presidential Election media environment. We were especially concerned with misinformation and viral deception promoting misleading narratives. The role of Russian state propaganda campaigns garners substantial attention. But the deceptive persuasion deployed against Hillary Clinton more broadly leveraged a wide variety of vulnerabilities in political discourse and media. Like the maelstrom of activity in the Trump era more broadly, the techniques and actors at play are too numerous to fully catalog. In many cases domestic activists and foreign operatives are nearly impossible to distinguish. And the challenges presented by these deceptive techniques continue unabated into the 2020 political environment.
Regardless of their origins, understanding the contours and impact of these narrative campaigns is fundamental to their amelioration.
Cyberwar: Jamieson Sets the Terms
The term viral deception owes its provenance to Kathleen Hall Jamieson’s comprehensive analysis of Russian interference in 2016, entitled Cyberwar. Like Jamieson, we like this term because it has a broader coverage than “misinformation” and because its initialism “V.D.” is both descriptive and potentially prophylactic.
The Cyberwar is Cognitive
Jamieson argues that influence operations did impact the outcome in 2016. She points to a number of factors, most notably the role of stolen content and online amplification in setting the agenda for the televised debates. The techniques, tactics, and procedures at play are wide-ranging and numerous:
Because much of the troll messaging was consistent with content available elsewhere in the social media stream, its existence matters not so much for the injection of new ideas into the campaign dialogue (although there are instances in which that was done) but rather to the extent that it increased the visibility of anti-Clinton and pro-Trump content (an amplifying effect); drove memes into traditional news outlets (an agenda-setting effect); signaled social media users that its sentiments were widely shared (a normative effect); helped the trolls identify users susceptible to subsequent mobilizing or demobilizing appeals (target identification); increased likelihood that, rather than sitting out, a person would decide to cast a vote for Trump (a mobilizing effect); was shared by those not already exposed to the message (a two-step flow effect); changed the relative amount of anti-Clinton content or negative emotion in the feeds of susceptible individuals (with weighting, contagion and spiral of silence effects); and increased perception of the accuracy of the messages (a familiarity effect). Past research and available data suggest that these are all plausible outcomes. (Cyberwar p75, emphasis mine)
Most people active on social media during the lead-up to the 2016 election will recognize all of these effects,1 even if their provenance was often less clear at the time.
All Media is Social Media Now
Harvard researchers who looked at the role of social media in the spread of misinformation prior to the 2016 election2 found that messages spread differently, depending on properties specific to each social network.
The online political environment in the United States is polarized, but the “filter bubbles” are best characterized as 1) the Fox News bubble and 2) everyone else. And the corrective sanctions at play in the two environments are highly asymmetric. The latter bubble penalizes for straying from the truth while the former penalizes for straying from the accepted narrative. You can see these effects pretty clearly in our own bias distributions.
Even worse, Benkler and his coauthors argue that a propaganda feedback loop reinforces right-wing narratives in the Fox bubble and promotes those narratives to the mainstream via a loosely organized propaganda pipeline. They describe this pipeline (the attention backbone) as a battle for attention:
The process by which peripheral nodes in a network compete for attention with subnetworks of like-minded users. This is the process by which the details of the narrative are selectively promoted to the central nodes.
Social Media Agenda-Setting Drives Corporate Media Narratives
Jamieson also shows how the Wikileaks release of Clinton’s Wall Street speeches mere days before the debates reinforced existing media narratives about her.
The ongoing Clinton-Sanders to-and-fro over those addresses primed the salience of the issue with political reporters. As a result, when WikiLeaks released partial texts for some of them two days before the second debate, it didn’t take much for Trump, Trump-aligned media, and Russian operatives to assert that Clinton had had good reasons to conceal them.
In the second presidential debate, on October 9, the moderator, Martha Raddatz, mentioned WikiLeaks but neither the Russian hacking that put the heretofore hidden speech segments in that site’s possession nor the fact that the content was ill-gotten. At the same time, she reminded viewers that Clinton had refused to release the material.
Ignoring the nominations that had elicited mass support, Raddatz opted instead for one that garnered only thirteen votes but fit the frame set by the Sunday shows earlier that day: “Is it okay for politicians to be two-faced?” To that question the veteran journalist added, “Is it acceptable for a politician to have a private stance on issues?” Unmentioned was the discussion of Steven Spielberg’s film Lincoln, which had contextualized Clinton’s original remarks about public and private positions. (pp179-80)
She describes a similar dynamic around the third debate as well, but centered on comments about trade with Latin America.
Voter Suppression in 2016
Numerous sources, including several Trump campaign officials, cite two related factors when talking about the surprising outcome.
- the targeting of black voters with viral deception; and
- the somewhat depressed African American turnout in Rust Belt battlegrounds.
Certainly many other factors contributed. But we note these two in particular because they suggest striking parallels to 2020-related activity already underway today.3
Wedging the Voters
In 2016, Russian social media actors sought to wedge various constituencies with inflammatory messages. Jamieson (p86) notes:
They also magnified conflicts between those in the black community and the police while priming forms of black identification with the black panther movement and aligning the police with the Ku Klux Klan (KKK).
Coupling this activity with similar campaigns targeted at rural white communities betrays a generic strategy of stoking divides in the American electorate, both within and across parties.
Dampening Voter Enthusiasm
Most analysts would agree that the larger impact of Russian influence campaigns during the 2016 election was to weaken Clinton, rather than to strengthen Trump. The amplifying effect of their efforts did strengthen Trump. But the damage that ongoing viral deception efforts4 did to her campaign, especially to its credibility, far outstripped those effects. So how did they target her directly with this framing?
Super predator: Targeting the Candidate
Certainly the generic “backlash” effects that were being whipped up by accounts such as Blacktivist and TEN_GOP had an impact on voter behavior. But the real anti-Clinton attack was extremely precise.
The Kremlin’s vote-suppression maneuvers were many and various. Prominent among them was not only priming a specific facet of Bill Clinton’s record but also doing the same for remarks by Hillary Clinton that vexed liberals in general and black voters in particular. Days before the South Carolina primary, a BLM activist raised both when she demanded at a Clinton fundraiser that the candidate apologize for her husband’s support of mass incarceration and her own 1996 characterization of gang members as “super predators.” In late August (August 21, 2016), the Republican nominee retweeted the CSPAN-2 video of Hillary Clinton’s 1996 statement in support of sentencing reform that referred to “super predators” who need to be “brought to heel.”
Russian tweets translated these concerns into statements such as “Black families are divided and destroyed by mass incarceration and death of black men. Facts don’t lie” and “U.S. prisons now hold more black men than slavery ever did.” Troll posts also claimed that “Black people continue to make up more than 30 percent of the people dying from police misconduct, though we make up only 13 percent of [sic] nation’s population.” In early 2016 an IRA Tumblr account posted the video of Clinton using the term “super predators.”
Part Two: Defining Kamala Harris
In tomorrow’s post, we will take a closer look at how online actors are attempting to preemptively frame Kamala Harris. And we will show that early awareness of these campaigns offers much more than an ounce of prevention.
1. Clinton supporters likely feel a familiar pang at mention of the spiral of silence effect.
2. Network Propaganda, Benkler, et al. 2018
3. We will discuss the specific attacks on Kamala Harris in Part 2 tomorrow.
4. Not just from the Russians but also other anti-Clinton actors.