- Troll farms were building massive audiences and peddling propaganda ahead of the 2020 presidential election, according to a report.
- Facebook’s design and algorithms helped disseminate content from the troll farms, MIT Technology Review said.
- The social media platform has struggled to crush disinformation campaigns since the 2016 presidential election.
Troll farms spreading propaganda and disinformation on Facebook reached an estimated 140 million Americans per month in the run-up to the 2020 presidential election, according to an internal company report obtained by MIT Technology Review.
The report was produced in 2019 as part of a nearly two-year effort to understand troll farm activity on the platform, and was recently handed over to MIT Technology Review by a former employee who was not involved in its research.
The social media company failed to crack down on disinformation and troll farm content after the 2016 election, according to the report.
The company pursued a whack-a-mole strategy “that involved monitoring and canceling the activity of bad actors when they engaged in political discourse, and adding safeguards that prevented ‘the worst of the worst,’” MIT Technology Review said.
By 2020, troll farms, or groups that work together to create and disseminate fake news online, “were still building up a massive audience by running networks of Facebook pages,” according to the MIT report.
Some 15,000 Facebook pages with a predominantly American audience were run out of Kosovo and Macedonia, countries that mounted disinformation campaigns during the 2016 election. Collectively, the troll farm pages reached 140 million American users per month and 360 million global users per week at the end of 2019, as Americans prepared to vote in one of the most turbulent US presidential elections in history.
“It’s not normal. It’s not healthy,” wrote Jeff Allen, the former senior Facebook data scientist who authored the report. “We have allowed inauthentic actors to accumulate huge followings for largely unknown purposes.”
The troll farm pages included the platform’s largest Christian American page, reaching 75 million US users per month; the largest African-American page, with 30 million monthly users; the second-largest Native American page, with 400,000 monthly users; and the fifth-largest women’s page, with 60 million monthly users.
A majority of users who viewed this content had never followed these pages.
Facebook did not immediately respond to Insider’s request for comment on the report. Joe Osborne, a spokesperson for Facebook, told MIT Technology Review that the company has “formed teams, developed new policies and worked with industry peers to combat these networks. We have taken aggressive enforcement action against these kinds of groups and have shared the results publicly on a quarterly basis.”
Since 2016, bad actors have repeatedly succeeded in spreading US election conspiracies and disinformation about COVID-19. Politicians and regulators have criticized the inability of Facebook and other platforms to limit foreign interference. Researchers and tech rights advocates have offered their own tools for tackling disinformation online, but most companies have chosen to rely on their in-house algorithms.
Facebook itself has tried to offer more transparency in its content moderation practices, but its current approach – using algorithms to flag potentially bad content and asking human reviewers to assess that content on a case-by-case basis – has been criticized as more of a band-aid than a permanent solution.