Although Facebook’s fact-checking program is worthwhile, one of its third-party fact-checking partners says there is still a lot of work to be done to curb misinformation on the social media site.
Full Fact, a nonprofit company that has been part of Facebook’s fact-checking program for the past six months, released a report detailing its involvement in the program with recommendations on how it can be improved.
For one, Full Fact says that Facebook is not transparent about the impact of the fact-checking, so fact-checkers don’t know how many users the fact-check reaches compared with the number of people who received or spread the misinformation.
As part of the fact-checking process, Facebook sends the third-party fact-checkers a “queue” of content it has identified as possibly false.
The London-based Full Fact says that although the fact-checkers do not know exactly how things are determined to go into the queue, “we do know that it is a combination of Facebook users flagging the content as suspicious, and Facebook’s algorithms proactively identifying other signals that might suggest it is false.”
Each queue contains content the fact-checker specializes in. For example, Full Fact’s queue “is supposed to prioritise UK-centric content.”
From there, the fact-checkers can write a fact check on their own website and attach it with their rating: false, mixture, false headline, true, not eligible, satire, opinion, prank generator or not rated.
Depending on the rating given to the post, Facebook can take further action such as reducing the distribution of a false-rated post.
“We believe that Facebook’s current rating system for the Third Party Fact Checking programme needs to change, and we have made other specific recommendations about how the programme can be strengthened,” Full Fact wrote in its report.
Some of those recommendations include “developing tools that can better identify potentially harmful false content,” “add a ‘mixture’ rating which does not reduce the reach of content,” “develop clearer guidance on how to differentiate between several claims within a single post” and “be explicit about plans for machine learning.”
Other fact-checkers have voiced similar concerns in the past year.
“They’ve essentially used us for crisis PR,” Brooke Binkowski, the former managing editor of Snopes, told The Guardian. “They’re not taking anything seriously. They are more interested in making themselves look good and passing the buck. … They clearly don’t care.”
Facebook responded to Full Fact’s report and said, “We are encouraged that many of the recommendations in the report are being actively pursued by our teams as part of continued dialogue with our partners, and we know there’s always room to improve.”
Criticism of Facebook’s enforcement extends beyond misinformation. The U.K. consumer group Which? says the platform has failed to stop groups that trade in fake product reviews.

“Our latest findings demonstrate that Facebook has systematically failed to take action while its platform continues to be plagued with fake review groups generating thousands of posts a day,” Natalie Hitchins, head of products and services for Which?, said, according to the New York Post.
Which? found that in just one day, there were more than 3,500 new incentivized review posts from 10 different groups. Across 30 days, that number rose to 55,000 new incentivized posts.
“It is unacceptable that Facebook groups promoting fake reviews seem to be reappearing,” George Lusty, senior director of the U.K.’s Competition and Markets Authority, said. “Facebook must take effective steps to deal with this problem by quickly removing the material and stopping it from resurfacing.
“Lots of us rely on reviews when shopping online to decide what to buy. It’s important that people are able to trust they are genuine, rather than something someone has been paid to write.”