If you want to get the truth, you go to the people you trust. That’s been a basic tenet for determining who succeeds and who doesn’t in journalism for decades.
Mark Zuckerberg believes the trust factor should also determine whose news you see or don’t see on Facebook.
The problem is, someone other than you is going to be determining who you should trust.
Facebook’s CEO said Tuesday that his company is ranking news organizations based on trustworthiness, and will promote or suppress content from those organizations based on the scores they receive.
It’s the latest tool the company is using to combat what it perceives as an increase in the number of “fake news” stories on its site in the past few years.
So who or what determines which news is trustworthy? Zuckerberg said the company will compile data on how users perceive news brands by asking them to identify whether they have heard of various publications and if they trust them.
“We put [that data] into the system, and it is acting as a boost or a suppression, and we’re going to dial up the intensity of that over time,” he said, as reported by BuzzFeed. “We feel like we have a responsibility to further [break] down polarization and find common ground.”
Zuckerberg said the company will invest “billions” of dollars in a combination of artificial intelligence and tens of thousands of human moderators to keep fake news and deliberate propaganda away from users. He said Facebook will step up that effort during elections, in an attempt to avoid a repeat of the influx of Russian propaganda posted on the site prior to the 2016 presidential election.
“There’s no guarantee that we get this right,” Zuckerberg said. “We will make mistakes, and there will be consequences, and we will need to fix them. But what I can guarantee is that if we don’t work on this, the world isn’t moving in this direction by itself.”
Putting the decision of what rates as trustworthy news or fake news into the hands of individuals has plenty of critics within journalism circles.
As Adrienne LaFrance of The Atlantic wrote, “Deciding what to believe based on other people’s opinions is not only not journalistic, it’s arguably hostile to the press as a democratic institution. The truth may be nuanced, but reportable facts are often quite straightforward. As any journalist can tell you, the best answer to the question ‘What happened?’ is not ‘Why don’t you ask a bunch of your friends what they think, organize their views along a spectrum, and then decide where to plant yourself.'”
Shepard Ambellas of Intellihub said the rating system will have an even more chilling effect.

“It looks like this could be the beginning of the end for any independent journalist or news outlet who dares to challenge the official narrative,” Ambellas said.
Putting the ratings in the hands of users is also a way for Facebook to try to distance itself from allegations of bias in how news content is distributed.
As the Western Journal reported in March, Facebook’s January implementation of a new algorithm that determines what stories appear in users’ news feeds has had a negative impact on user traffic for conservative news sites compared to those that lean more to the left. Portions of that report were used by Republican Rep. Steve Scalise of Louisiana during his questioning of Zuckerberg about news bias last month on Capitol Hill.
Prior to his speech Tuesday, Zuckerberg told Wired he was somewhat surprised by how many questions he received during his congressional testimony regarding bias and censorship of conservative views on Facebook.
“There were more questions about bias than I had expected,” Zuckerberg said. “I think that that reflects a real concern that a lot of people have about the tech companies, which are predominantly based in Silicon Valley and Seattle and these extremely liberal places. That depth of concern that there might be some political bias really struck me. I really want Facebook to be a platform for all ideas.”
To that end, the company announced this week it has formed an advisory board to look into allegations of bias against conservatives.
According to Axios, the review of alleged anti-conservative bias will be led by former Republican Sen. Jon Kyl of Arizona, along with his team at Covington & Burling, a Washington law firm.
“Kyl will examine concerns about alleged liberal bias on Facebook, internally and on its services. They will get feedback directly from conservative groups and advise Facebook on the best way to work with these groups moving forward,” the report said. “The Heritage Foundation, a conservative public policy think tank, will convene meetings on these issues with Facebook executives.”