Joe Osborne, a Facebook spokesperson, said in a statement that the company “had already been investigating these topics” at the time of Allen’s report. “Since that time, we have stood up teams, developed new policies and collaborated with industry peers to address these networks. We’ve taken aggressive enforcement actions against these kinds of foreign and domestic inauthentic groups and have shared the results publicly on a quarterly basis.”
In the process of fact-checking this story shortly before publication, MIT Technology Review found that five of the troll-farm pages mentioned in the report remained active.
The report found that troll farms were reaching the same demographic groups singled out by the Kremlin-backed Internet Research Agency (IRA) during the 2016 election, which had targeted Christians, Black Americans, and Native Americans. A 2018 BuzzFeed News investigation found that at least one member of the Russian IRA, indicted for alleged interference in the 2016 US election, had also visited Macedonia around the emergence of its first troll farms, though it didn’t find concrete evidence of a connection. (Facebook said its investigations hadn’t turned up a connection between the IRA and Macedonian troll farms, either.)
“This is not normal. This is not healthy,” Allen wrote. “We have empowered inauthentic actors to accumulate huge followings for largely unknown purposes … The fact that actors with possible ties to the IRA have access to huge audience numbers in the same demographic groups targeted by the IRA poses an enormous risk to the US 2020 election.”
As long as troll farms found success in using these tactics, any other bad actor could too, he continued: “If the Troll Farms are reaching 30M US users with content targeted to African Americans, we should not at all be surprised if we discover the IRA also currently has large audiences there.”
Allen wrote the report as the fourth and final installment of a year-and-a-half-long effort to understand troll farms. He left the company that same month, in part because of frustration that leadership had “effectively ignored” his research, according to the former Facebook employee who supplied the report. Allen declined to comment.
The report reveals the alarming state of affairs in which Facebook leadership left the platform for years, despite repeated public promises to aggressively tackle foreign-based election interference. MIT Technology Review is making the full report available, with employee names redacted, because it is in the public interest.
Its revelations include:
As of October 2019, around 15,000 Facebook pages with a majority US audience were being run out of Kosovo and Macedonia, known bad actors during the 2016 election.
Collectively, those troll-farm pages—which the report treats as a single page for comparison purposes—reached 140 million US users monthly and 360 million global users weekly. Walmart’s page reached the second-largest US audience at 100 million.
The troll-farm pages also combined to form:
the largest Christian American page on Facebook, 20 times larger than the next largest—reaching 75 million US users monthly, 95% of whom had never followed any of the pages.
the largest African-American page on Facebook, three times larger than the next largest—reaching 30 million US users monthly, 85% of whom had never followed any of the pages.
the second-largest Native American page on Facebook, reaching 400,000 users monthly, 90% of whom had never followed any of the pages.
the fifth-largest women’s page on Facebook, reaching 60 million US users monthly, 90% of whom had never followed any of the pages.
Troll farms primarily affect the US but also target the UK, Australia, India, and Central and South American countries.
Facebook has conducted multiple studies confirming that content more likely to receive user engagement (likes, comments, and shares) is more likely to be of a type known to be bad. Still, the company has continued to rank content in users’ newsfeeds according to what will receive the highest engagement.
Facebook forbids pages from posting content merely copied and pasted from other parts of the platform but does not enforce the policy against known bad actors. This makes it easy for foreign actors who do not speak the local language to post entirely copied content and still reach a massive audience. At one point, as many as 40% of page views on US pages went to those featuring primarily unoriginal content or material of limited originality.
Troll farms previously made their way into Facebook’s Instant Articles and Ad Breaks partnership programs, which are designed to help news organizations and other publishers monetize their articles and videos. At one point, thanks to a lack of basic quality checks, as many as 60% of Instant Article reads were going to content that had been plagiarized from elsewhere. This made it easy for troll farms to mix in unnoticed, and even receive payments from Facebook.
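To illustrate the kind of “basic quality check” the report says these programs lacked, here is a minimal sketch, not Facebook’s actual code, of a near-duplicate detector: it flags a submission as likely plagiarized when its word 5-gram “shingles” overlap heavily with an already-known article. The function names and the 0.8 threshold are hypothetical choices for the example.

```python
# Illustrative sketch of an originality check of the sort the report says
# was missing: flag a candidate article whose word 5-gram "shingles"
# closely overlap those of a known article. Not Facebook's actual system.

def shingles(text: str, n: int = 5) -> set:
    """Return the set of word n-grams (shingles) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two shingle sets, from 0.0 to 1.0."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def looks_copied(candidate: str, known_articles: list, threshold: float = 0.8) -> bool:
    """Flag the candidate if it closely matches any known article."""
    cand = shingles(candidate)
    return any(jaccard(cand, shingles(known)) >= threshold
               for known in known_articles)

if __name__ == "__main__":
    original = "Scientists announced a breakthrough in battery storage today after years of research"
    repost = "Scientists announced a breakthrough in battery storage today after years of research"
    print(looks_copied(repost, [original]))  # True: near-verbatim copy
```

Even a check this crude would catch verbatim reposts; the report’s point is that nothing of the sort stood between plagiarized content and monetization.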
How Facebook enables troll farms and grows their audiences
The report looks specifically at troll farms based in Kosovo and Macedonia, which are run by people who don’t necessarily understand American politics. Yet because of the way Facebook’s newsfeed reward systems are designed, they can still have a significant impact on political discourse.
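A toy example makes the reward-system dynamic concrete. The sketch below, a hypothetical ranker rather than Facebook’s actual algorithm, scores posts purely on predicted engagement, as described above; because origin and originality carry no weight, a copied post that provokes reactions outranks original reporting. The weights are invented for illustration.

```python
# Toy illustration (not Facebook's actual system) of why ranking purely by
# engagement rewards troll-farm content: posts are scored only on likes,
# comments, and shares, so originality and origin never enter the ranking.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    shares: int
    original: bool  # tracked here, but ignored by the ranker below

def engagement_score(post: Post) -> int:
    # Hypothetical weights: comments and shares count more than likes,
    # a common pattern in engagement-optimized feeds.
    return post.likes + 2 * post.comments + 3 * post.shares

feed = [
    Post("local newsroom", likes=120, comments=10, shares=5, original=True),
    Post("troll farm", likes=900, comments=400, shares=700, original=False),
]

# Pure engagement ranking puts the copied, high-engagement post first.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(post.author, engagement_score(post))
```

Under this kind of objective, an operator who has never followed American politics only needs content that reliably draws reactions; the ranking system does the distribution for them.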