YouTube Inc. Chief Executive Susan Wojcicki presents the company's new television subscription service.
After examining millions of YouTube recommendations over the course of a year, two researchers have found that the platform actually works against political radicalization.

The researchers said that, as of late 2019, YouTube's recommendation algorithm appears designed to favor mainstream media and cable news content over independent YouTube creators.

The report, released on Tuesday, also says YouTube's algorithm favors left-leaning and politically neutral channels.

The study was conducted by independent data scientist Mark Ledwich and UC Berkeley postdoctoral researcher Anna Zaitsev, who concluded that while the platform includes radical content, the recommendation algorithm currently does not direct users to such videos.

In the report, they said:
“There is clearly plenty of content on YouTube that one might view as radicalizing or inflammatory. However, the responsibility of that content is with the content creator and the consumers themselves. Shifting the responsibility for radicalization from users and content creators to YouTube is not supported by our data.”

The research follows a series of New York Times articles published earlier this year about radicalization on Google-owned YouTube. In one of those stories, Caleb Cain, 26, recounted how, years earlier, he had fallen into what he described as an “alt-right rabbit hole.”

YouTube has changed the way it recommends content since Cain’s experience.

Like all social media platforms, YouTube has been grappling in recent years with the issue of content moderation.
In a blog post earlier this month, the company wrote: “There will always be content on YouTube that brushes up against our policies, but doesn’t quite cross the line. So over the past couple of years, we’ve been working to raise authoritative voices on YouTube and reduce the spread of borderline content and harmful misinformation.”

Ledwich published an article on Medium on Friday explaining his research findings and criticizing news coverage of YouTube's recommendation algorithm.

In the post, he said, “Contrary to the narrative promoted by the New York Times, the data suggests that YouTube’s recommendation algorithm actively discourages viewers from visiting content that one could categorize as radicalizing or otherwise questionable.”

