Facebook accounts run by Russian trolls repeatedly called for violence against different social and political groups in the U.S., including police officers, Black Lives Matter activists and undocumented immigrants.
Posts from three now-removed Facebook groups created by the Russian Internet Research Agency suggest Russia sought not only to meddle in U.S. politics but to encourage ideologically opposed groups to act out violently against one another. The posts are part of a database compiled by Jonathan Albright, the research director at Columbia University’s Tow Center for Digital Journalism, who tracks and analyzes Russian propaganda.
For example, “Being Patriotic,” a group that regularly posted content praising Donald Trump’s candidacy, stated in an April 2016 post that Black Lives Matter activists who disrespected the American flag should “be immediately shot.” The account accrued about 200,000 followers before it was shut down.
Another Russia-linked group, “Blacktivist,” described police brutality in a November 2016 post weeks after the election, and stated, “Black people have to do something. An eye for an eye. The law enforcement officers keep harassing and killing us without consequences.”
The group “Secured Borders” had the most violent rhetoric, some of it well after the presidential election. A post in March 2017 described the threat of “dangerous illegal aliens” and said, “The only way to deal with them is to kill them all.” Another post about immigrants called for a draconian new law, saying, “if you get deported that’s your only warning. You come back you get shot and rolled into a ditch… BANG, problem solved.” And a post about refugees said, “the state department needs to be burned to the ground and the rubble reduced to ashes.”
More than two dozen messages encouraging violence are among thousands of controversial posts from Russia-linked Facebook accounts that analysts say sought to increase hostility — both ideological and physical — in the U.S. in an effort to further divide American society along political, religious or racial lines.
Mark R. Jacobson, a Georgetown University professor and expert on Russian influence operations, said Russia strategically seeks to undermine U.S. political cohesion by promoting extremist views within opposing political or social groups, hoping that chaos, and ultimately violence, will ensue.
“The Russians don’t want groups like Black Lives Matter [and] the Alt-Right to sit there and have discussions and debates about the future of America. They want violent clashes,” Jacobson said.
Jacobson noted that, during the Cold War, Russia sought to amplify extremist ideas within the civil rights movement in hopes of sparking race-based warfare in the U.S.
“If we start to see violent rallies… we should start to look for the hidden hand of Russian influence behind it,” he said.
Columbia University’s Albright said even if only a fraction of the accounts’ posts called for physical violence, the overall messaging sought to push audiences toward more radical viewpoints on which they might act.
“These posts contained psychological calls to action toward both online and physical behavior,” he said.
Some of the violent posts received tens of thousands of likes, comments, shares, or reactions, according to a database of messages Albright compiled from six now-deleted Russia-linked accounts, which included the accounts that posted the violent messages reviewed by CNN.
One post by Secured Borders shared in October 2016, which drew more than 100,000 interactions, stated, “if Killary wins there will be riots nationwide, not seen since the times of Revolutionary war!!”
Albright said this post was likely amplified through paid advertising because the overwhelming majority of Secured Borders’ messages received only a few thousand interactions.
Facebook has said it identified 3,000 ads tied to the Russian troll farm that ran between June 2015 and May 2017, though it’s unclear if those ads included any of the messages calling for violence. Facebook shared those ads with Congress, but they have not yet been publicly released.
Susan Benesch, director of the Dangerous Speech Project and a faculty associate at Harvard’s Berkman Klein Center for Internet & Society, said violent messages like this could increase the possibility of audiences condoning or participating in violence against members of targeted groups.
“People can be heavily influenced by content online even when they don’t know where it comes from,” Benesch said. “In these cases, we can’t know if anyone was actually influenced toward violence, but this type of speech could increase that risk.”
Facebook’s terms of service prohibit content that is “hate speech, threatening, or… incites violence.”
Asked for comment, a Facebook spokesperson told CNN, “We don’t allow the promotion of violence on Facebook but know we need to do better. We are hiring thousands of new people to our review teams, building better tools to keep our community safe, and investing in new technologies to help locate more banned content and bad actors.”
Facebook’s Vice President of Policy and Communications, Elliot Schrage, has said the company is working to develop greater safeguards against election interference and other forms of abuse. In a blog post earlier this month, Schrage said Facebook is “still looking for abuse and bad actors on our platform — our internal investigation continues.”
The Internet Research Agency, a secretive company based in St. Petersburg that the US intelligence community has linked to the Kremlin, appears to be the source of 470 inauthentic Facebook accounts that shared a wide range of controversial messages. Documents obtained by CNN show the IRA included a “Department of Provocations” that sought to spread fake news and sow social divisions in the West.
— CNN’s Drew Griffin contributed reporting.
CNNMoney (New York) First published October 31, 2017: 10:37 AM ET