YouTube's algorithm is pushing violent videos and misinformation despite the US-based company's rules meant to limit their spread, according to a new report.
A recently published Mozilla Foundation report found that 71 percent of all videos volunteers flagged as disturbing had been recommended by the video-sharing platform's algorithm, Politico reported.
The videos recommended on YouTube, which is owned by Google, included conspiracies about the coronavirus pandemic, as well as the promotion of white supremacy.
Researchers also found that YouTube's efforts to police its platform have been uneven: users in non-English-speaking countries were more likely to encounter videos they considered disturbing.
“YouTube’s algorithm is working in amplifying some really harmful content and is putting people down disturbing pathways,” said Brandi Geurkink, the foundation's senior manager of advocacy. “It really shows that its recommendation algorithm is not even functioning to support its own platform policies, it's actually going off the rails.”
A YouTube spokesperson, however, claimed that the platform "constantly" works to improve users' experience and has launched 30 different changes over the past year to reduce recommendations of harmful content.
"The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone,” the spokesperson said.
YouTube and other American platforms like it have long refused to share information about their algorithms, claiming that doing so would expose business secrets and violate user privacy, despite evidence implicating social media recommendation algorithms in the spread of misinformation and violent content.
Mozilla's investigation is based on data from a browser extension through which more than 37,000 users in 91 countries reported “regrettable recommendations.”