Mozilla: ‘YouTube algorithm recommends videos that go against policy’

YouTube’s own recommendation algorithm suggests videos that violate the platform’s community guidelines. That is the conclusion of a study by researchers at Mozilla, who analyzed the algorithm’s behavior using a data-collecting browser extension.

The study found that 71 percent of all videos labeled “deplorable” had been recommended by YouTube’s own algorithm. Recommended videos were also 40 percent more likely to receive the “deplorable” label than videos users had searched for themselves.

The study also found that the rate of videos labeled “deplorable” is 60 percent higher in countries where English is not the primary language than in countries where it is. Of all reported “deplorable” videos, 9 percent were eventually removed from the platform, but only after they had accumulated a combined 160 million views, according to the researchers.

“Our research confirms that YouTube not only hosts videos that go against its own Community Guidelines, but also endorses them,” said Mozilla’s Brandi Geurkink. “The platform must admit that its algorithm is designed in a way that misinforms and harms people,” she added.

The researchers collected their data via the RegretsReporter browser extension. This open-source extension lets people report YouTube videos with inappropriate content to Mozilla’s researchers. According to Mozilla, more than 37,000 volunteers installed the extension, and 1,662 of them reported at least one such video. In total, about 3,300 reports came in from 91 countries.

Update, 21:18: Mozilla quote corrected. With thanks to user109731.