Facebook Blocks 115 Accounts After Alert From U.S. Authorities

Facebook and Instagram ads linked to a previous Russian effort to disrupt the U.S. elections.

Facebook says it has blocked 115 user accounts after U.S. authorities alerted the social network to suspicious activity that may be linked to a foreign country.

The company's move, announced in a blog post late on November 5, came hours after U.S. law enforcement and intelligence agencies warned that, as U.S. voters go to the polls on November 6, they should be wary of attempts by Russia, Iran, and other countries to spread fake news on social media.

"At this time we have no indication of compromise of our nation's election infrastructure that would prevent voting, change vote counts, or disrupt the ability to tally votes," said a joint statement by Department of Homeland Security Secretary Kirstjen Nielsen; Attorney General Jeff Sessions, Director of National Intelligence Dan Coats, and FBI Director Christopher Wray.

"But Americans should be aware that foreign actors - and Russia in particular - continue to try to influence public sentiment and voter perceptions through actions intended to sow discord," it said. "They can do this by spreading false information about political processes and candidates, lying about their own interference activities, disseminating propaganda on social media, and through other tactics."

"The American public can mitigate these efforts by remaining informed, reporting suspicious activity, and being vigilant consumers of information," the statement said.

Russian Foreign Minister Sergei Lavrov dismissed the warning, saying on November 6 that "accusations of [Russian] attempts to influence the outcomes of elections in the United States" are "no longer interesting."

Speaking at a news conference after talks in Madrid with his Spanish counterpart, Josep Borrell, Lavrov asserted that "not a single piece of evidence proving our meddling in any elections or referendums, either in the United States or in Catalonia, Macedonia or Montenegro, has ever been given" -- a statement that seemed to contradict evidence cited in indictments issued by U.S. Special Counsel Robert Mueller's office.

A study published last week found that misinformation on social media in the days leading up to the November 6 midterm elections was spreading more rapidly than it did during the run-up to the 2016 presidential election.

Facebook said further analysis was needed to determine whether the blocked accounts are linked to Russia or to the Internet Research Agency, both of which have been accused of involvement in past election-meddling efforts.

U.S. prosecutors have accused the so-called troll farm in St. Petersburg and other Russian operatives of seeking to influence the 2016 presidential election, including by spreading misinformation and sowing discord on social networks.

Facebook said 85 of the removed accounts were posting items in English on Facebook's Instagram service. Some of the Instagram accounts "were focused on celebrities" and others on "political debate," it said.

Facebook said the other 30 removed accounts were active on Facebook itself and were associated with pages that posted in French and Russian.

Facebook received the tip from U.S. law enforcement on November 4, Nathaniel Gleicher, the company's head of cybersecurity policy, wrote in the blog post.

The company decided to announce its actions right away, "given that we are only one day away from important elections in the U.S.," he said.

The midterm elections will determine which party controls the U.S. Congress, with polls showing that both chambers remain up for grabs.

Social-media companies such as Facebook are striving to act more quickly against foreign disinformation efforts than they did during the 2016 U.S. presidential election.

Twitter said on November 3 that it had deleted a "series of accounts" that attempted to share disinformation, but it did not provide details.

With reporting by AFP, Reuters, TASS, and Interfax