YouTube ‘Going off the Rails’ as Algorithm Reportedly ‘Pushes Hateful Content, Misinformation’

The largest-ever crowdsourced investigation into YouTube’s algorithm, launched by the Mozilla Foundation and spanning July 2020 to May 2021, flagged 3,362 regrettable videos from 91 countries and identified what the nonprofit sees as a major source of the problem.

Google-owned YouTube is not only failing miserably to live up to its stated intention of limiting the spread of hateful diatribes and misinformation. The world’s second-most visited website has also been found complicit in pushing such “disturbing” video content via its recommendation algorithm, according to a report by the Mozilla Foundation published on 7 July.

From conspiracy theories about the 9/11 terror attacks on the US and the ongoing coronavirus pandemic, to the promotion of so-called “white supremacy” and inappropriate “children’s” cartoons, YouTube’s algorithm was implicated in recommending some 71 percent of the content flagged as regrettable, according to the research.

The nonprofit found that a majority of the problematic videos had been recommended by the video-sharing platform’s own algorithm. Social media platforms like YouTube have long rejected calls to share information about their algorithms, citing user privacy.

However, to address the growing body of evidence that social media recommendation algorithms amplify the spread of misinformation and violent content, in 2020 Mozilla empowered users to take part in a crowdsourced study.

The nonprofit launched…

Continue Reading This Article At Sputnik News
