YouTube CEO Susan Wojcicki presents the company's new television subscription service
Examining millions of recommendations from YouTube over the course of a year, two researchers have found that the platform is actually fighting against political radicalization.
The researchers said that, as of late 2019, YouTube's recommendation algorithm appears designed to favor mainstream media and cable news content over independent YouTube creators.
The report, released Tuesday, also says YouTube's algorithm favors left-leaning and politically neutral channels.
The study was conducted by independent data scientist Mark Ledwich and UC Berkeley postdoctoral researcher Anna Zaitsev, who concluded that while the platform includes radical content, the recommendation algorithm currently does not direct users to such videos.
In the report, they said:
“There is clearly plenty of content on YouTube that one might view as radicalizing or inflammatory. However, the responsibility of that content is with the content creator and the consumers themselves. Shifting the responsibility for radicalization from users and content creators to YouTube is not supported by our data.”
The research follows a series of New York Times articles published earlier this year about radicalization on Google-owned YouTube. In one of the stories, Caleb Cain, 26, recounted how, years earlier, he fell into what he described as an “alt-right rabbit hole.”
YouTube has changed the way it recommends content since Cain’s experience.
Like all social media platforms, YouTube has been grappling in recent years with the issue of content moderation.
Earlier this month the company wrote in a blog post: “There will always be content on YouTube that brushes up against our policies, but doesn’t quite cross the line. So over the past couple of years, we’ve been working to raise authoritative voices on YouTube and reduce the spread of borderline content and harmful misinformation.”
Ledwich published a Medium post on Friday explaining his research findings and criticizing news coverage of YouTube's recommendation algorithm.
In the post he said, “Contrary to the narrative promoted by the New York Times, the data suggests that YouTube’s recommendation algorithm actively discourages viewers from visiting content that one could categorize as radicalizing or otherwise questionable.”