Joan E. Solsman | CNET.com
Troy Warren for CNT
When YouTube viewers volunteered to report videos they regretted watching, 71% of those videos had been recommended by YouTube’s own algorithms, according to a Mozilla study.
YouTube’s almighty recommendations surfaced most of the videos that a crowdsourced army of volunteers said they regretted watching, according to a study released Wednesday by Mozilla based on “regret” reports from YouTube users. Of the videos people said they regretted, 71% were recommended by YouTube’s artificial intelligence. YouTube also recommended videos that it later removed for breaking its own rules, the report said, and the videos people regretted were more popular on YouTube than those that drew no objections from volunteers.
In response, YouTube said its own surveys find users are satisfied with its recommendations, which it says generally direct people to authoritative or popular videos. The company added that it can’t properly review Mozilla’s definition of “regrettable” or the validity of its data, and it noted that it works constantly to improve recommendations, including 30 changes in the past year to reduce recommendations of harmful videos.
Google’s massive video site is the world’s biggest source of online video, reaching more than 2 billion viewers every month who watch more than 1 billion hours there every day. For years, YouTube has vaunted its algorithmic recommendations for driving more than 70% of the time people spend watching YouTube. But Mozilla’s report provides a peek into some of those recommendations’ possible shortcomings.
About 9% of recommended “regrettable” videos — a total of 189 videos in this study — were later taken down from YouTube. YouTube videos can be removed for a variety of reasons, including breaking rules against offensive or dangerous content or infringing copyrights. Sometimes, the person who posted the video takes it down. But the study confirmed YouTube removed some videos for violating its policies after it had previously recommended them.
“That is just bizarre,” Brandi Geurkink, Mozilla’s senior manager of advocacy and coauthor of the study, said in an interview Tuesday. “The recommendation algorithm was actually working against their own abilities to … police the platform.”
The study also found that videos people reported as regrettable seemed to thrive on YouTube: videos that volunteers reported generated 70% more views per day than other videos the volunteers watched.
YouTube — like Facebook, Twitter, Reddit and many other internet companies that give users a platform to post their own content — has wrestled with how to balance freedom of expression with effective policing of harmful material posted there. Over the years, YouTube has grappled with misinformation, conspiracy theories, discrimination, hate and harassment, videos of mass murder and child abuse and exploitation, all at an unprecedented global scale.
Mozilla’s study, for example, found that YouTube videos with misinformation were the kind most frequently reported as regrettable. And the rate of regretted YouTube videos is 60% higher in countries that don’t speak English as a primary language, particularly in Brazil, Germany and France.
The study is based on voluntary reports sent through a special RegretsReporter extension that Mozilla developed for the Chrome and Firefox web browsers. Tens of thousands of people downloaded the extension, and 1,662 submitted at least one report on a YouTube video they regretted watching, for a total of 3,362 reports from 91 countries between July 2020 and May 2021.
The study has several limitations. The people reporting these regrettable videos aren’t a random sample — they’re volunteers whose willingness to participate may mean they’re not representative of YouTube’s audience as a whole. The report acknowledges that limitation, as well as the fact that many factors may affect whether a volunteer reports a particular video and that the concept of a regrettable video may differ among volunteers.
The study is also based solely on regret reports filed from desktop web-browser extensions, which excludes any viewing on mobile devices or connected TVs. Mobile devices, for example, account for more than 70% of time spent watching YouTube.
What’s next
Mozilla’s report makes several recommendations for YouTube, for lawmakers and for people like you.
For individual YouTube viewers, Mozilla recommended checking your data settings for YouTube and Google and reviewing your “watch” and “search” history to edit out anything you don’t want influencing your recommendations.
YouTube and other platforms should set up independent audits of their recommendation systems, Mozilla said. It also called for more transparency about borderline content and for greater user control over how personal data feeds into recommendations, including the ability to opt out of personalization.
YouTube said it welcomes more research and is exploring options to bring in external researchers to study its systems.
Mozilla also recommended that policymakers require YouTube and others to release information and build tools that let researchers scrutinize their recommendation algorithms. Regulations should also ensure platforms account for the risks of designing and running automated systems that amplify content at scale.
Mozilla is a software company best known for its unit that operates the Firefox web browser. Google, YouTube’s parent company, is also one of Mozilla’s biggest sources of revenue, through royalties Google pays Mozilla for integrating its search engine into Firefox. Google is the default search engine in Firefox in many regions of the world.