Facebook Unleashed 2 Tools That Helped Suppress Conservative Content - WJ Among Its Victims
The Wall Street Journal has run a series of articles as part of a project called “The Facebook Files.” One installment, published Sunday, is titled “Facebook’s Internal Chat Boards Show Politics Often at Center of Decision Making.”
The Journal reviewed company documents and scoured internal employee communications to gain an understanding of how Facebook decides which news stories will be promoted and which will be suppressed.
The importance of these decisions cannot be overstated. Citing Pew Research Center, the Journal reported that over a third of Americans use Facebook as a regular source of news. The Big Tech company’s ability to determine the content its users read gives it the power to shape users’ views.
The Journal’s Sunday article focused on the simmering tension between Facebook employees and management over two tools the platform uses to detect “misinformation.” Surprisingly enough, employees are far more eager to censor conservative sites than managers, who know the company would face backlash if its methods became too obvious.
The Journal reported that these tools — called “Sparing Sharing” and “Informed Engagement” — weren’t developed for the purpose of stifling conservative content. But that’s what they did.
In 2019, the social media giant conducted an analysis of the tools and found that if they were eliminated, traffic on “very conservative” sites would increase significantly — Breitbart’s would rise 20 percent, The Washington Times’ 18 percent, The Western Journal’s 16 percent and The Epoch Times’ 11 percent.
Facebook had essentially labeled conservative content as misinformation and prevented stories from going viral.
Heated debate followed. One company researcher wrote, “We could face significant backlash for having ‘experimented’ with distribution at the expense of conservative publishers.”
Much of the company’s attention was directed at Breitbart, a site many employees believed was a prolific publisher of misinformation.
For example, when riots spread across the U.S. in June 2020 following the death of George Floyd, an employee wrote a message on a Facebook chat board saying, “Get Breitbart out of News Tab.” The News Tab feature “aggregates and promotes articles from various publishers, chosen by Facebook,” according to the Journal.
The employee cited several Breitbart headlines: “Minneapolis Mayhem: Riots in Masks,” “Massive Looting, Buildings in Flames, Bonfires!” and “BLM Protesters Pummel Police Cars on 101.”
These headlines, the employee said, were “emblematic of a concerted effort at Breitbart and similarly hyperpartisan sources (none of which belong in News Tab) to paint Black Americans and Black-led movements in a very negative way.” The Journal reported that many employees agreed.
A Facebook researcher responded by saying that removing Breitbart from News Tab could face internal opposition due to concerns about political backlash: “At best, it would be a very difficult policy discussion.”
Another employee wrote, “My argument is that allowing Breitbart to monetize through us is, in fact, a political statement. … It’s an acceptance of extreme, hateful, and often false news used to propagate fear, racism and bigotry. On a daily basis, it publishes articles that I believe insult our values as a company.”
In the end, Facebook kept Breitbart on News Tab. It eliminated the Informed Engagement tool but kept Sparing Sharing.
“We make changes to reduce problematic or low-quality content to improve people’s experiences on the platform, not because of a page’s political point of view,” Facebook spokesman Andy Stone said. “When it comes to changes that will impact public pages like publishers, of course we analyze the effect of the proposed change before we make it.”
Yes, Facebook obviously analyzes the effect of its algorithms. It wants to know how far it can go without getting caught. What can we do to undermine conservative sites before they figure it out? Management seems far more cognizant of the potential consequences than the company’s more aggressive employees.
Over the past few years, Big Tech has come under tremendous public scrutiny — and with good reason. The CEOs of Facebook, Google and Twitter currently wield more power over political discourse than the federal government.
Unfortunately for us, they work in tandem with the federal government. When you throw the establishment media into the mix, we are in the fight of our lives.