Twitter CEO Jack Dorsey reportedly weighs in personally on content decisions at his social media platform, to the frustration of those who work for him.
“In policing content on the site and punishing bad actors, Twitter relies primarily on its users to report abuses and has a consistent set of policies so that decisions aren’t made by just one person, its executives say,” The Wall Street Journal reported.
However, in some cases, “Dorsey has weighed in on content decisions at the last minute or after they were made, sometimes resulting in changes and frustrating other executives and employees, according to people familiar with the matter,” The Journal added.
Twitter has denied that Dorsey overrules his staff, such as in a recent decision to allow Alex Jones to remain on the platform after Facebook and YouTube banned him.
“Any suggestion that Jack made or overruled any of these decisions is completely and totally false,” Twitter’s chief legal officer, Vijaya Gadde, said in a statement. “Our service can only operate fairly if it’s run through consistent application of our rules, rather than the personal views of any executive, including our CEO.”
Sources told The Journal that Dorsey gets involved in high-profile account issues, presumably including whether to ban users, but isn't the final word.
The CEO is slated to appear with Facebook Chief Operating Officer Sheryl Sandberg before the House Commerce Committee on Wednesday to address how his company treats conservative voices.
The hearing “is about pulling back the curtain on Twitter’s algorithms, how the company makes decisions about content, and how those decisions impact Americans,” said Commerce Chairman Rep. Greg Walden, R-Ore.
Dorsey waded into the “shadowbanning” controversy in July involving Republican National Committee Chairwoman Ronna McDaniel and GOP Reps. Jim Jordan and Mark Meadows, USA Today reported.
Their names did not appear in auto-complete suggestions when users searched for their accounts on Twitter.
Kayvon Beykpour, product lead at Twitter, said the problem was not based on the users’ political views but on a glitch that was being fixed.
“Some accounts weren’t being auto-suggested even when people were searching for their specific name,” wrote Beykpour in a tweet in July.
“Our usage of the behavior signals within search was causing this to happen & making search results seem inaccurate. We’re making a change today that will improve this.”
A short thread addressing some issues folks are encountering as a result of our conversational health work, specifically the perception of “shadowbanning” based on content or ideology. It suffices to say we have a lot more work to do to earn people’s trust on how we work. https://t.co/MN97l7w7RF
— jack 🌍🌏🌎 (@jack) July 25, 2018
Dorsey tweeted in a thread on the subject, “It suffices to say, we have a lot more work to do to earn people’s trust on how we work.”
In April, Dorsey apologized for Twitter labeling Turning Point USA media director Candace Owens “far right.”
Hi Candace. I want to apologize for our labeling you “far right.” Team completed a full review of how this was published and why we corrected far too late (12 hrs after). There was a clear break in our curation process and understanding, and we’re fixing. Thanks for calling out.
— jack 🌍🌏🌎 (@jack) April 27, 2018
“Far right?” Owens questioned. “Allow me to clarify: I believe the black community can do it without hand-outs. I believe the Democrats have strapped us to our past to prevent us from our futures. And I won’t stop fighting until all black Americans see that. I’m not far right—I’m free.”
CORRECTION: An earlier version of this article incorrectly identified the chief operating officer of Facebook as Sheryl Mills. Facebook’s COO is Sheryl Sandberg. We apologize for the error.