AndronETalksNews
Epoch Times
A new study by the software nonprofit Mozilla Foundation found that 71 percent of the videos study participants deemed objectionable had been suggested to them by YouTube’s own recommendation algorithm.
“Research volunteers encountered a range of regrettable videos, reporting everything from COVID fear-mongering to political misinformation to wildly inappropriate ‘children’s’ cartoons,” Mozilla Foundation wrote in a post.
The largest-ever crowdsourced probe into YouTube’s controversial recommendation algorithm found that the automated software continues to recommend videos viewers considered “disturbing and hateful,” Mozilla said, including ones that violate YouTube’s own content policies.