YouTube might be inadvertently brainwashing young people, turning them into far-right ideologues and then back into left-leaning activists, The New York Times reported Saturday.
The company’s business model rewards extremist videos with advertising dollars, while its algorithms are aimed at keeping viewers keyed into such content, which is changing users’ ideological positions, the report notes.
Conservatives on Twitter, meanwhile, blasted The Times report for depicting some moderate pundits as alt-right.
“Yes, I know I’m on the front page of today’s New York Times, in a story about YouTube and far-right radicals. The story doesn’t mention me, so I can’t ask for a correction, but much like Milton Freidman, I’m obviously not ‘far-right,’” Candice Malcolm, a writer at the Toronto Sun, wrote on Twitter Sunday.
Malcolm was referring to the front-page presentation of The Times’ story, which appeared to paint moderate conservative voices as extremist ideologues.
Others were similarly upset. “The ‘newspaper of record’ publishing a front page story about one guy who watched YouTube videos. There is no data, its an anecdote, the framing inverts the conclusion of the story, the core premise is easily debunked,” freelance journalist Tim Pool told his Twitter followers Sunday.
Pool frequently reports on what he and many conservatives believe is big tech’s penchant for censorship.
The Times report comes as other conservatives argue Google, which owns YouTube, has too much influence on human behavior.
YouTube tweaked its artificial intelligence in 2015 and created a type of “addiction machine,” according to the report.
The new technology was the brainchild of Google Brain, the company’s AI division, which helped transform YouTube’s recommendation system using neural networks, a type of AI loosely modeled on the human brain.
The report focuses on a 26-year-old man named Caleb Cain, who told The Times that he “fell down the alt-right rabbit hole” on YouTube five years ago.
He now distances himself from the movement, but he described what he says was a gradual radicalization, as the algorithm pulled him ever deeper into so-called right-wing content.
“I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging,” Cain said. “I was brainwashed.”
Things began changing again for Cain sometime in 2018, after a group of liberal trolls began hijacking the recommendation algorithm, the report notes.
Cain found videos by Natalie Wynn, a former philosopher who goes by the name ContraPoints. She is part of a group of YouTubers who call themselves BreadTube, a reference to the 1892 book “The Conquest of Bread” by the Russian anarcho-communist Peter Kropotkin.
The group also includes people like British philosopher Oliver Thorn, who hosts shows about topics like transphobia, racism and Marxist economics.
BreadTube’s strategy is a kind of algorithmic hijacking. By talking about many of the same topics that far-right creators do — and, in some cases, by responding directly to their videos — left-wing YouTubers are able to get their videos recommended to the same audience.
Cain was a Donald Trump supporter in 2016, but the shift to Wynn and other content transformed him again, according to the report.
Cain started his own YouTube channel recently where he talks about politics from a left-wing perspective.
“You have to reach people on their level, and part of that is edgy humor, edgy memes,” he told a reporter. “You have to empathize with them, and then you have to give them the space to get all these ideas out of their head.”
Cain said he has no intention of pulling back his YouTube viewing habits.
Some academics worry Google has too much influence. Former Google Design Ethicist Tristan Harris, for one, warns that big tech companies’ algorithms are effectively controlling people’s minds.
“A handful of people working at a handful of technology companies, through their choices, will steer what a billion people are thinking today,” Harris said at a TED talk in 2017. Others worry such influence could affect elections.
Psychologist Robert Epstein has a name for the kind of manipulation The Times suggests influenced Cain: the Search Engine Manipulation Effect.
“These are new forms of manipulation people can’t see,” he told the Los Angeles Times in March.
Algorithms “can have an enormous impact on voters who are undecided. … People have no awareness the influence is being exerted.”
Epstein tracked 47,300 searches by undecided voters in the California districts of Democratic Reps. Katie Porter, Harley Rouda and Mike Levin, all of whom won their elections in 2018.
Mainstream news outlets like the Los Angeles Times and The New York Times dominated the Google search results, his research found. Searches on Yahoo and Bing, on the other hand, linked people to outlets like Breitbart.
Epstein’s model estimated that 35,455 voters were persuaded to vote for a Democrat entirely because of the sources Google fed them.
Google CEO Sundar Pichai dismissed Epstein’s premise in a congressional hearing in 2018, telling lawmakers that his methods were flawed.
A version of this article appeared on The Daily Caller News Foundation website.