
Facebook Introduces New Tool To 'Combat Hate Speech,' Removes Function Immediately


Facebook tested an unusual feature for several minutes on Tuesday, before it disappeared without explanation.

The tool asked users “Does this post contain hate speech?” and gave the option to click “Yes” or “No.”

The feature appeared on every post but was gone less than an hour later.

There was no announcement from Facebook about the new feature, and when users clicked “Yes” on a post, the pop-up indicated the tool wasn’t complete and may have been rolled out accidentally by a Facebook developer.


https://twitter.com/jkurr/status/991336610510856192

Several users on Twitter expressed concern about the new feature and the likelihood that it would be abused.



https://twitter.com/cardamomaddict/status/991337151781580801


Although the tool was up for less than an hour, Facebook founder and CEO Mark Zuckerberg has already voiced plans to develop artificial intelligence tools to flag and ban hate speech.

During testimony before Congress in April, Zuckerberg said Facebook has developed intelligent software tools to root out terrorist propaganda and will continue developing them for hate speech.

“Hate speech, I am optimistic that over a five to ten year period we’ll have AI tools that can get into some of the nuances, the linguistic nuances of different types of content to be more accurate in flagging things for our systems, but today is just not there on that,” he said.

Senator Ben Sasse, R-Neb., expressed his concern that Facebook feels the need to “police a whole bunch of speech that I think America might be better off not having policed by one company that has a really big and powerful platform.”


“Can you define hate speech?” he asked Zuckerberg.

“Senator, I think that this is a really hard question,” Zuckerberg replied, “and I think it’s one of the reasons why we struggle with it.”

Sasse continued to press Zuckerberg on the issue, particularly the possibility that certain viewpoints could be censored.

“I am worried about the psychological categories,” Sasse said. “You used language of safety and protection earlier. We have seen this happen on college campuses. It’s dangerous. Forty percent of Americans under age 35 tell pollsters they think the First Amendment is dangerous because you might use your freedom to say something that hurts somebody else’s feelings.”

“There are some passionately held views about the abortion issue on this panel,” Sasse continued. “Can you imagine a world where you might decide that pro-lifers are prohibited from speaking about their abortion views on your platform?”

“I certainly would not want that to be the case,” Zuckerberg replied.

Watch the entire exchange below:



As of this writing, Facebook has not addressed the briefly unveiled tool.
