
Facebook, Twitter Remove More Russia-Linked Accounts

Review Found Links to Internet Research Agency Troll Farm

Facebook and Twitter have removed dozens of suspicious accounts after investigations found that many of them, operating out of Ghana and Nigeria, had ties to Russian groups attempting to spread disinformation to U.S. voters in the months before the November presidential election.


On Thursday, Facebook announced that it had removed 49 accounts and 69 pages from its platform, as well as 85 Instagram accounts, all of which were spreading disinformation. In some cases, Facebook's internal security team found a connection between these pages and the notorious Internet Research Agency, a Russia-based troll farm that employs trolls to create bogus social media accounts. The agency was named as one of the major disruptors of the 2016 presidential election (see: Report: US Struggled to Counter 2016 Election Interference).

Last October, the U.S. imposed sanctions against six members of the Internet Research Agency along with Yevgeniy Prigozhin, who allegedly provided financing for the organization (see: Russian Troll Farm Targeted With Fresh US Sanctions).

Facebook had previously banned the Internet Research Agency from its platforms.

Meanwhile, Twitter said this week that it removed 71 Russia-linked accounts that were also spreading disinformation, including content pushing hot-button social issues such as race relations and civil rights.

With the U.S. presidential election coming up in November, social media firms are stepping up their efforts to shut down foreign interference and disinformation campaigns related to U.S. politics (see: FBI's Elvis Chan on Election Cybersecurity).

Fake Pages and Disinformation

Facebook says that the networks it shut down this week were in the early stages of building an audience, with about 13,500 accounts and 265,000 individuals following one or more of the pages and Instagram accounts. About 65 percent of the followers were from the U.S., the company notes.

"This network was in early stages of audience building and was operated by local nationals - witting and unwitting - in Ghana and Nigeria on behalf of individuals in Russia," says Nathaniel Gleicher, Facebook’s head of security policy.

The threat actors used fake accounts to post in groups and to manage pages that pretended to be not-for-profit organizations or personal blogs, according to Facebook. The company's internal systems rejected attempts by these groups to run political ads, the company says.

Facebook says the networks operated out of Ghana and Nigeria "frequently posted about U.S. news and attempted to grow their audience by focusing on topics like black history, black excellence and fashion, celebrity gossip, news and events related to famous Americans, like historical figures and celebrities, and LGBTQ issues."

While the activity did not appear to focus directly on the election, these networks posted content about oppression, injustice and police brutality, Facebook notes. In one example from September 2019, a group named "Black Facts Untold" posted an image of three Stanford marching band members taking a knee during the national anthem.

Twitter says that the threat actors largely tweeted in English and presented themselves as being based in the U.S.

Joint Investigation

An investigation by CNN and two professors at Clemson University helped track down some of these groups on Facebook and Twitter. For example, they discovered a group called "Eliminating Barriers to the Liberation of Africa," which described itself as a nonprofit organization but was actually part of the disinformation campaign.

Graphika, a social network analysis company that also worked with Facebook to uncover these networks, noted that this group used a mix of fake accounts and real people to help spread disinformation. In its report, CNN found that much of the activity on these accounts had similarities to the Russian troll campaign of 2016.


About the Author

Ishita Chigilli Palli

Senior Correspondent, Global News Desk

As senior correspondent for Information Security Media Group's global news desk, Ishita covers news worldwide. She previously worked at Thomson Reuters, where she specialized in reporting breaking news stories on a variety of topics.




