|
|
{{Short description|none}}
|
|
|
|
{{See also|Criticism of Google#YouTube|Censorship by Google#YouTube|Content moderation}}
|
|
|
|
YouTube, a video-sharing platform, has faced various criticisms over the years, particularly regarding content moderation, offensive content, and monetization.<ref name="demonetization">{{cite web |last=Alexander |first=Julia |date=May 10, 2018 |title=The Yellow $: a comprehensive history of demonetization and YouTube's war with creators |url=https://www.polygon.com/2018/5/10/17268102/youtube-demonetization-pewdiepie-logan-paul-casey-neistat-philip-defranco |access-date=November 3, 2019 |website=Polygon |language=en}}</ref> It has been criticized for its recommendation algorithm perpetuating conspiracy theories and falsehoods,<ref>{{cite news |last1=Wong |first1=Julia Carrie |author-link=Julia Carrie Wong |last2=Levin |first2=Sam |date=January 25, 2019 |title=YouTube vows to recommend fewer conspiracy theory videos |language=en-GB |work=The Guardian |url=https://www.theguardian.com/technology/2019/jan/25/youtube-conspiracy-theory-videos-recommendations |access-date=November 3, 2019 |issn=0261-3077}}</ref> hosting videos ostensibly targeting children but containing violent or disturbing content,<ref>{{cite news |last=Orphanides |first=K. G. |date=March 23, 2018 |title=Children's YouTube is still churning out blood, suicide and cannibalism |magazine=Wired UK |url=https://www.wired.co.uk/article/youtube-for-kids-videos-problems-algorithm-recommend |access-date=November 3, 2019 |issn=1357-0978}}</ref> videos of minors attracting pedophilic activity in their comment sections,<ref>{{cite news |last=Orphanides |first=K. G. |date=February 20, 2019 |title=On YouTube, a network of paedophiles is hiding in plain sight |magazine=Wired UK |url=https://www.wired.co.uk/article/youtube-pedophile-videos-advertising |access-date=November 3, 2019 |issn=1357-0978}}</ref> and fluctuating policies on the types of content that are eligible to be monetized with advertising.<ref name="demonetization" />
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
YouTube has also been blocked by several countries. As of 2018, public access to YouTube was blocked by countries including ], ], ], Turkmenistan,<ref>{{cite web |title=Turkmenistan |url=https://rsf.org/en/news/turkmenistan-1 |website=] |language=en |date=March 11, 2011}}</ref> Uzbekistan,<ref>{{cite news |last1=Syundyukova |first1=Nazerke |title=Uzbekistan has blocked YouTube social network |url=https://qazaqtimes.com/en/article/48743 |access-date=January 23, 2019 |work=The Qazaq Times |date=October 9, 2018}}</ref><ref>{{cite news |title=Маҳаллий ОАВ: Ўзбекистонда Facebook ва YouTube яна ўчириб қўйилди |trans-title=Local Media: YouTube and Facebook once again blocked in Uzbekistan |url=https://www.ozodlik.org/a/29713088.html |access-date=January 23, 2019 |work=Radio Free Europe/Radio Liberty's Uzbek Service |date=January 16, 2019 |language=uz}}</ref> ], ], ] and ].
|
|
|
|
|
|
|
|
|
==History==
|
|
|
|
Controversial content has included material relating to Holocaust denial and the Hillsborough disaster, in which 96 football fans from Liverpool were crushed to death in 1989.<ref>{{cite news |title=YouTube criticized in Germany over anti-Semitic Nazi videos |agency=Reuters |url=https://www.haaretz.com/hasen/spages/898004.html |access-date=May 28, 2008 |archive-date=May 17, 2008 |archive-url=https://web.archive.org/web/20080517001126/http://www.haaretz.com/hasen/spages/898004.html |url-status=dead}}</ref><ref>{{cite web |title=Fury as YouTube carries sick Hillsboro video insult |url=https://icliverpool.icnetwork.co.uk/0100news/0100regionalnews/tm_headline=fury-as-youtube-carries-sick-hillsboro-video-insult%26method=full%26objectid=18729523%26page=1%26siteid=50061-name_page.html |url-status=dead |archive-url=https://web.archive.org/web/20120320021147/https://icliverpool.icnetwork.co.uk/0100news/0100regionalnews/tm_headline%3Dfury-as-youtube-carries-sick-hillsboro-video-insult%26method%3Dfull%26objectid%3D18729523%26page%3D1%26siteid%3D50061-name_page.html |archive-date=March 20, 2012 |access-date=November 29, 2015 |publisher=icLiverpool}}</ref> In July 2008, the Culture and Media Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content". YouTube responded by stating:
|
|
|
|
|
|
|
|
|
{{blockquote|We have strict rules on what's allowed, and a system that enables anyone who sees inappropriate content to report it to our 24/7 review team and have it dealt with promptly. We educate our community on the rules and include a direct link from every YouTube page to make this process as easy as possible for our users. Given the volume of content uploaded on our site, we think this is by far the most effective way to make sure that the tiny minority of videos that break the rules come down quickly.<ref>{{cite news |first1=James |last1=Kirkup |first2=Nicole |last2=Martin |title=YouTube attacked by MPs over sex and violence footage |url=https://www.telegraph.co.uk/technology/3358061/YouTube-attacked-by-MPs-over-sex-and-violence-footage.html |archive-url=https://ghostarchive.org/archive/20220110/https://www.telegraph.co.uk/technology/3358061/YouTube-attacked-by-MPs-over-sex-and-violence-footage.html |archive-date=2022-01-10 |url-access=subscription |url-status=live |website=] |date=July 31, 2008 |access-date=March 26, 2017}}{{cbignore}}</ref> (July 2008)}} |
|
|
|
|
|
In October 2010, U.S. Congressman Anthony Weiner urged YouTube to remove from its website videos of imam Anwar al-Awlaki.<ref>{{cite news |date=October 25, 2010 |title=Al-Awlaki's YouTube Videos Targeted by Rep. Weiner |publisher=Fox News |url=https://www.foxnews.com/politics/al-awlakis-youtube-videos-targeted-by-rep-weiner/ |access-date=November 13, 2010}}</ref> YouTube pulled some of the videos in November 2010, stating they violated the site's guidelines.<ref>{{cite news |last1=F. Burns |first1=John |last2=Helft |first2=Miguel |date=November 4, 2010 |title=YouTube Withdraws Cleric's Videos |newspaper=The New York Times |url=https://www.nytimes.com/2010/11/05/world/05britain.html |access-date=March 26, 2017 |id={{ProQuest|1458411069}}}}</ref> In December 2010, YouTube added the ability to flag videos for containing terrorism content.<ref>{{cite news |last=Bennett |first=Brian |date=December 12, 2010 |title=YouTube is letting users decide on terrorism-related videos |work=Los Angeles Times |url=https://www.latimes.com/news/nationworld/nation/la-na-youtube-terror-20101213,0,3375845.story |access-date=November 29, 2015}}</ref>
|
|
|
|
|
In 2018, YouTube introduced a system that would automatically add information boxes to videos that its algorithms determined may present conspiracy theories and other fake news, filling the infobox with content from Encyclopædia Britannica and Wikipedia as a means of informing users, to minimize misinformation propagation without impacting freedom of speech.<ref>{{cite web |last=Newton |first=Casey |date=March 13, 2018 |title=YouTube will add information from Wikipedia to videos about conspiracies |url=https://www.theverge.com/2018/3/13/17117344/youtube-information-cues-conspiracy-theories-susan-wojcicki-sxsw |access-date=April 15, 2019 |work=]}}</ref><ref>{{Cite news |last=Brown |first=David |date=March 14, 2018 |title=YouTube uses Wikipedia to fight fake news |language=en |work=] |url=https://www.thetimes.co.uk/article/youtube-fights-fake-news-with-wikipedia-frkpc8nm2 |url-status=live |access-date=July 13, 2023 |archive-url=https://archive.today/20210927105159/https://www.thetimes.co.uk/article/youtube-fights-fake-news-with-wikipedia-frkpc8nm2 |archive-date=September 27, 2021 |issn=0140-0460}}</ref> The Wikimedia Foundation said in a statement that "neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube. We were not given advance notice of this announcement."<ref>{{Cite magazine |last=Matsakis |first=Louise |date=March 16, 2018 |title=Don't Ask Wikipedia to Cure the Internet |url=https://www.wired.com/story/youtube-wikipedia-content-moderation-internet/ |access-date=July 21, 2024 |magazine=] |language=en-US |issn=1059-1028}}</ref>
|
|
|
|
|
In the wake of the Notre-Dame de Paris fire on April 15, 2019, several user-uploaded videos of the landmark fire were automatically flagged by YouTube's system with an Encyclopædia Britannica article on the false conspiracy theories around the September 11 attacks. Several users complained to YouTube about this inappropriate connection. YouTube officials apologized, stating that their algorithms had misidentified the fire videos and added the information block automatically, and that they were taking steps to remedy this.<ref>{{cite news |last=Bergen |first=Mark |date=April 15, 2019 |title=YouTube Flags Notre-Dame Fire as 9/11 Conspiracy, Says System Made 'Wrong Call' |url=https://www.bloomberg.com/news/articles/2019-04-15/youtube-flags-notre-dame-fire-as-9-11-conspiracy-in-wrong-call?srnd=technology-vp |access-date=April 15, 2019 |publisher=]}}</ref>
|
|
|
|
|
To limit the spread of misinformation and fake news, YouTube rolled out a comprehensive policy on how it plans to deal with technically manipulated videos.<ref>{{cite news |last=Alba |first=Davey |author-link=Davey Alba |date=February 3, 2020 |title=YouTube Says It Will Ban Misleading Election-Related Content |work=] |url=https://www.nytimes.com/2020/02/03/technology/youtube-misinformation-election.html |access-date=February 10, 2020}}</ref>
|
|
|
|
|
On April 18, 2023, YouTube announced changes to its handling of content associated with eating disorders. The platform's Community Guidelines now prohibit content that could encourage emulation by at-risk users, including behavior such as severe calorie tracking and purging after eating. However, videos featuring positive behavior, such as in the context of recovery, are permitted on the platform under two conditions: the user must have a registered (logged-in) account and must be older than 18.
|
|
This policy was created in collaboration with nonprofit organizations such as the National Eating Disorder Association. Garth Graham, YouTube's Global Head of Healthcare, said in an interview with CNN that the policy change was geared at ensuring that the platform provides an avenue for "community recovery and resources" while ensuring continued viewer protection.<ref>{{cite news |url=https://edition.cnn.com/2023/04/18/tech/youtube-eating-disorder-policies/index.html |title=YouTube rolls out new policies for eating disorder content |publisher=CNN}}</ref>
|
|
|
|
|
==Moderators==
|
|
YouTube contracts companies to hire content moderators, who view content flagged as potentially violating YouTube's content policies and determine whether it should be removed. In September 2020, a class-action suit was filed by a former content moderator who reported developing post-traumatic stress disorder (PTSD) after an 18-month period on the job. The former content moderator said that she was regularly made to exceed YouTube's stated limit of four hours per day of viewing graphic content. The lawsuit alleges that YouTube's contractors gave little to no training or support for its moderators' mental health, made prospective employees sign NDAs before showing them any examples of content they would see while reviewing, and censored all mention of trauma from its internal forums. It also purports that requests for extremely graphic content to be blurred, reduced in size or made monochrome, per recommendations from the ], were rejected by YouTube as not a high priority for the company.<ref>{{cite web |last=Kimball |first=Whitney |date=September 22, 2020 |title=Content Moderator Exposed to Child Assault and Animal Torture Sues YouTube |url=https://gizmodo.com/youtube-moderator-sues-over-ptsd-symptoms-lack-of-work-1845143110 |access-date=October 11, 2020 |work=Gizmodo}}</ref><ref>{{cite news |last=Vincent |first=James |date=September 22, 2020 |title=Former YouTube content moderator sues the company after developing symptoms of PTSD |url=https://www.theverge.com/2020/9/22/21450477/youtube-content-moderator-sues-lawsuit-ptsd-graphic-content-exposure |access-date=October 11, 2020 |work=The Verge}}</ref><ref>{{cite web |last=Elias |first=Jennifer |date=September 22, 2020 |title=Former YouTube content moderator describes horrors of the job in new lawsuit |url=https://www.cnbc.com/2020/09/22/former-youtube-content-moderator-describes-horrors-of-the-job-in-lawsuit.html |access-date=October 11, 2020 |publisher=CNBC}}</ref>
|
|
|
|
|
== Homophobia and transphobia ==
|
|
Five leading content creators whose channels were based on LGBTQ+ materials filed a federal lawsuit against YouTube in August 2019, alleging that YouTube's algorithms divert discovery away from their channels, impacting their revenue. The plaintiffs claimed that the algorithms discourage content with words like "lesbian" or "gay", which would be predominant in their channels' content, and that because of YouTube's near-monopolization of online video services, it is abusing that position.<ref>{{cite news |last1=Bensinger |first1=Greg |last2=Albergotti |first2=Reed |date=August 14, 2019 |title=YouTube discriminates against LGBT content by unfairly culling it, suit alleges |url=https://www.washingtonpost.com/technology/2019/08/14/youtube-discriminates-against-lgbt-content-by-unfairly-culling-it-suit-alleges/ |access-date=August 14, 2019 |newspaper=]}}</ref> In early 2021 the lawsuit was dismissed based on the plaintiffs' inability to prove YouTube acted on behalf of the government and because of Section 230 of the Communications Decency Act.<ref>{{Cite web |last=Lang |first=Nico |date=2021-01-08 |title=This Lawsuit Alleging YouTube Discriminates Against LGBTQ+ Users Was Just Tossed Out |url=https://www.them.us/story/lawsuit-alleging-youtube-discriminates-against-lgbtq-users-tossed-out |access-date=2024-11-01 |website=Them |language=en-US}}</ref>
|
|
|
|
|
In June 2022, Media Matters for America, a media watchdog group, reported that homophobic and transphobic content calling LGBT people "groomers" was becoming more common on YouTube.<ref name="lawton_20220623">{{cite web |url=https://www.mediamatters.org/google/right-wing-clickbait-pushing-anti-lgbtq-groomer-smears-are-increasingly-popular-youtube |title=Right-wing clickbait pushing anti-LGBTQ 'groomer' smears are increasingly popular on YouTube |website=Media Matters |last1=Lawton |first1=Sophie |date=June 23, 2022 |access-date=October 23, 2022}}</ref> The report also referred to common accusations in YouTube videos that LGBT people are grooming children.<ref name="lawton_20220623" /> The report stated the content appeared to be in violation of YouTube's hate speech policy.<ref name="lawton_20220623" />
|
|
|
|
|
== Animal torture ==
|
|
In late 2020, animal welfare charity ''Lady Freethinker'' identified 2,053 videos on YouTube in which they stated animals were "deliberately harmed for entertainment or were shown to be under severe psychological distress, physical pain or dead."<ref>{{cite news|title=YouTube must remove videos of animal cruelty, says charity |url=https://www.theguardian.com/world/2020/dec/19/youtube-must-remove-videos-of-animal-cruelty-says-charity |work=The Guardian |access-date=December 5, 2023|date=December 19, 2020}}</ref>
|
|
|
|
|
In 2021, ''Lady Freethinker'' filed a lawsuit accusing YouTube of a breach of contract in allowing a large number of videos on its site showing animal abuse and failing to remove them when notified. YouTube responded by stating that they had "expanded its policy on animal abuse videos" in 2021, and since the introduction of the new policy "removed hundreds of thousands of videos and terminated thousands of channels for violations."<ref>{{cite news|title=Monkeys killed in blenders by sadistic torture ring that films abuse to sell online |url=https://www.nytimes.com/2021/10/19/technology/youtube-sued-animal-abuse.html |access-date=December 4, 2023 |work=The New York Times|date=October 19, 2021}}</ref> |
|
|
|
|
|
In 2022, Google defeated the ''Lady Freethinker'' lawsuit, with a judge ruling that YouTube was protected by Section 230 of the Communications Decency Act, which shields internet platforms from lawsuits based on content posted by their users.<ref>{{cite news|title=Google Defeats Lawsuit Decrying Animal Abuse Videos on YouTube|url=https://www.bloomberg.com/news/articles/2022-08-04/google-defeats-lawsuit-decrying-animal-abuse-videos-on-youtube |access-date=December 5, 2023 |work=Bloomberg News|date=August 4, 2022}}</ref>
|
|
|
|
|
In 2023, YouTube stated that animal abuse "has no place on their platforms, and they are working to remove content (of that nature)".<ref name="telegraph/monkey-torture-women-arrested">{{cite news |last1=Newey |first1=Sarah |title=Monkeys killed in blenders by sadistic torture ring that films abuse to sell online |url=https://www.telegraph.co.uk/world-news/2023/06/20/monkey-torture-ring-three-women-arrested-britain-bbc/ |access-date=June 23, 2023 |work=The Telegraph |date=June 20, 2023 |archive-url=https://web.archive.org/web/20230620211907/https://www.telegraph.co.uk/world-news/2023/06/20/monkey-torture-ring-three-women-arrested-britain-bbc/ |archive-date=June 20, 2023}}</ref><ref name="bbc/Iot1dIWVS5">{{cite web |title=Hunting the monkey torturers |url=https://www.bbc.co.uk/news/extra/Iot1dIWVS5/hunting-the-monkey-torturers |website=BBC News |access-date=June 23, 2023}}</ref><ref name="express.co.uk/1503840">{{cite news |last1=Pritchard-Jones |first1=Oliver |title=YouTube hosts HUNDREDS of 'disgusting' animal cruelty videos |url=https://www.express.co.uk/news/uk/1503840/youtube-news-baby-monkey-torture-animal-cruelty|work=]|date=October 10, 2021|access-date=June 23, 2023}}</ref><ref name="bbc.co.uk/m001n32l">{{cite web |title=The Monkey Haters |url=https://www.bbc.co.uk/programmes/m001n32l |website=BBC Three |publisher=BBC |access-date=June 23, 2023}}</ref><ref name="bbc.co.uk/w3ct5j1p">{{cite web |title=The monkey haters |url=https://www.bbc.co.uk/programmes/w3ct5j1p |website=The Documentary |publisher=BBC World Service |access-date=June 23, 2023}}</ref><ref name="vice/custom-baby-monkey-torture-videos">{{cite web |last1=Geiger |first1=Gabriel |title=People Buy Custom Baby Monkey Torture Videos on World's Worst Forum |url=https://www.vice.com/en/article/7kvqgx/people-buy-custom-baby-monkey-torture-videos-on-worlds-worst-forum |website=Vice |access-date=June 23, 2023 |language=en |date=August 13, 2021}}</ref> |
|
|
|
|
|
== Conspiracy theories and far-right content {{anchor|Promotion_of_conspiracy_theories_and_fringe_discourse|Conspiracy_theories_and_fringe_discourse}}==
|
|
YouTube has been criticized for using an algorithm that gives great prominence to videos that promote conspiracy theories, falsehoods and incendiary fringe discourse.<ref name="Darkest">{{cite news |last=Nicas |first=Jack |date=February 7, 2018 |title=How YouTube Drives People to the Internet's Darkest Corners |language=en-US |work=] |url=https://www.wsj.com/articles/how-youtube-drives-viewers-to-the-internets-darkest-corners-1518020478 |access-date=June 16, 2018 |issn=0099-9660}}</ref><ref>{{cite news |title=As Germans Seek News, YouTube Delivers Far-Right Tirades |newspaper=The New York Times |date=September 7, 2018 |language=en |url=https://www.nytimes.com/2018/09/07/world/europe/youtube-far-right-extremism.html |access-date=September 8, 2018 |last1=Fisher |first1=Max |last2=Bennhold |first2=Katrin}}</ref><ref name="secret life">{{cite news |last1=Ingram |first1=Matthew |title=YouTube's secret life as an engine for right-wing radicalization |language=en |work=Columbia Journalism Review |issue=September 19, 2018 |url=https://www.cjr.org/the_media_today/youtube-conspiracy-radicalization.php |access-date=March 26, 2019}}</ref> According to an investigation by ''The Wall Street Journal'', "YouTube's recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content. 
When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints."<ref name="Darkest" /><ref>{{cite web |last1=Lewis |first1=Rebecca |date=September 2018 |title=Alternative Influence: Broadcasting the Reactionary Right on YouTube |url=https://datasociety.net/wp-content/uploads/2018/09/DS_Alternative_Influence.pdf |access-date=March 26, 2019 |website=datasociety.net |publisher=Data and Society}}</ref> When users search for political or scientific terms, YouTube's search algorithms often give prominence to hoaxes and conspiracy theories.<ref name="secret life" /><ref>{{cite news |title=YouTube wants the news audience, but not the responsibility |language=en |work=Columbia Journalism Review |url=https://www.cjr.org/innovations/youtube-wants-the-news-audience-but-not-the-responsibility.php |access-date=September 23, 2018}}</ref> After YouTube drew controversy for giving top billing to videos promoting falsehoods and conspiracy when people made breaking-news queries during the 2017 Las Vegas shooting, YouTube changed its algorithm to give greater prominence to mainstream media sources.<ref name="Darkest" /><ref>{{cite news |last=Nicas |first=Jack |date=October 6, 2017 |title=YouTube Tweaks Search Results as Las Vegas Conspiracy Theories Rise to Top |language=en-US |work=] |url=https://www.wsj.com/articles/youtube-tweaks-its-search-results-after-rise-of-las-vegas-conspiracy-theories-1507219180 |access-date=June 16, 2018 |issn=0099-9660}}</ref><ref>{{cite news |title=Here's How YouTube Is Spreading Conspiracy Theories About The Vegas Shooting |language=en |work=BuzzFeed |url=https://www.buzzfeed.com/charliewarzel/heres-how-youtube-is-spreading-conspiracy-theories-about |access-date=June 16, 2018}}</ref><ref>{{cite news |title=The Big Tech Platforms Still Suck During Breaking News |language=en |work=BuzzFeed
|url=https://www.buzzfeed.com/charliewarzel/the-big-tech-platforms-are-still-botching-breaking-news |access-date=June 16, 2018}}</ref> In 2018, it was reported that YouTube was again promoting fringe content about breaking news, giving great prominence to conspiracy videos about Anthony Bourdain's death.<ref>{{cite news |last=Alba |first=Davey |date=June 16, 2018 |title=YouTube Is Spreading Conspiracy Theories about Anthony Bourdain's Death |language=en |work=] |url=https://www.buzzfeednews.com/article/daveyalba/conspiracy-theories-about-anthony-bourdains-death-are |access-date=June 16, 2018}}</ref>
|
|
|
|
|
In 2017, it was revealed that advertisements were being placed on extremist videos, including videos by rape apologists, anti-Semites, and hate preachers who received ad payouts.<ref name="apologises">{{cite news |date=March 20, 2017 |title=Google apologises as M&S pulls ads |language=en-GB |work=BBC News |url=https://www.bbc.com/news/business-39325916 |access-date=June 16, 2018}}</ref> After firms started to stop advertising on YouTube in the wake of this reporting, YouTube apologized and said that it would give firms greater control over where ads got placed.<ref name="apologises" /> |
|
|
|
|
|
Alex Jones, known for far-right conspiracy theories, had built a massive audience on YouTube.<ref>{{cite news |last=Lewis |first=Paul |date=February 2, 2018 |title='Fiction is outperforming reality': how YouTube's algorithm distorts truth |language=en-GB |work=The Guardian |url=https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth |access-date=June 16, 2018 |issn=0261-3077}}</ref> YouTube drew criticism in 2018 when it removed a video from Media Matters compiling offensive statements made by Jones, stating that it violated its policies on "harassment and bullying".<ref>{{cite news |last=Levin |first=Sam |date=April 23, 2018 |title=YouTube under fire for censoring video exposing conspiracy theorist Alex Jones |language=en |newspaper=The Guardian |url=https://www.theguardian.com/technology/2018/apr/23/youtube-alex-jones-sandy-hook-media-matters-video |access-date=June 16, 2018}}</ref> On August 6, 2018, however, YouTube removed Alex Jones' YouTube page following a content violation.<ref>Salinas, Sara (August 6, 2018). ]. Retrieved October 15, 2018.</ref>
|
|
|
|
|
University of North Carolina professor Zeynep Tufekci has referred to YouTube as "The Great Radicalizer", saying "YouTube may be one of the most powerful radicalizing instruments of the 21st century."<ref>{{cite news |title=Opinion {{!}} YouTube, the Great Radicalizer |newspaper=The New York Times |date=March 10, 2018 |language=en |url=https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html |access-date=June 16, 2018 |last1=Tufekci |first1=Zeynep |id={{ProQuest|2610860590}}}}</ref> Jonathan Albright of the Tow Center for Digital Journalism at Columbia University described YouTube as a "conspiracy ecosystem".<ref name="secret life" /><ref>{{cite news |title=Parkland shooting 'crisis actor' videos lead users to a 'conspiracy ecosystem' on YouTube, new research shows |url=https://www.washingtonpost.com/news/the-switch/wp/2018/02/25/parkland-shooting-crisis-actor-videos-lead-users-to-a-conspiracy-ecosystem-on-youtube-new-research-shows/ |access-date=September 23, 2018 |newspaper=The Washington Post |language=en}}</ref>
|
|
|
|
|
In January 2019, YouTube said that it had introduced a new policy, starting in the United States, intended to stop recommending videos containing "content that could misinform users in harmful ways." YouTube gave flat earth theories, miracle cures, and 9/11 trutherism as examples.<ref>{{cite news |last=Weill |first=Kelly |date=January 25, 2019 |title=YouTube Tweaks Algorithm to Fight 9/11 Truthers, Flat Earthers, Miracle Cures |language=en |url=https://www.thedailybeast.com/youtube-tweaks-algorithm-to-fight-911-truthers-flat-earthers-miracle-cures |access-date=January 29, 2019}}</ref> Efforts within YouTube engineering to stop recommending borderline extremist videos falling just short of forbidden hate speech, and to track their popularity, were originally rejected because they could interfere with viewer engagement.<ref>{{cite news |last1=Bergen |first1=Mark |date=April 2, 2019 |title=YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant |work=Bloomberg News |url=https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant |access-date=April 2, 2019}}</ref>
|
|
|
|
|
In January 2019, the site announced it would be implementing measures directed towards "raising authoritative content and reducing borderline content and harmful misinformation."<ref name="neo-Nazi">{{Cite web |last=Brodkin |first=Jon |date=June 5, 2019 |title=YouTube bans neo-Nazi and Holocaust-denial videos in push against hate speech |url=https://arstechnica.com/tech-policy/2019/06/youtube-bans-neo-nazi-and-holocaust-denial-videos-in-push-against-hate-speech/ |access-date=February 2, 2024 |website=] |language=en-us}}</ref> That June, YouTube announced it would be banning neo-Nazi and Holocaust-denial content.<ref name="neo-Nazi" /> YouTube has blocked the neo-Nazi propaganda film '']'' from being uploaded.<ref>{{Cite web |date=October 13, 2021 |title=Antisemitism in the Digital Age: Online Antisemitic Hate, Holocaust Denial, Conspiracy Ideologies and Terrorism in Europe |url=https://hopenothate.org.uk/wp-content/uploads/2021/10/google-report-2021-10-v3.pdf |url-status=live |archive-url=https://web.archive.org/web/20231116055750/https://hopenothate.org.uk/wp-content/uploads/2021/10/google-report-2021-10-v3.pdf |archive-date=November 16, 2023 |access-date=September 23, 2023 |website=Hope not Hate |page=34}}</ref>
|
|
|
|
|
Multiple research studies have investigated cases of misinformation on YouTube. In a July 2019 study based on ten YouTube searches using the Tor browser related to climate and climate change, the majority of videos communicated views contrary to the scientific consensus on climate change.<ref>{{cite journal |last=Allgaier |first=Joachim |date=July 25, 2019 |title=Science and Environmental Communication on YouTube: Strategically Distorted Communications in Online Videos on Climate Change and Climate Engineering |journal=Frontiers in Communication |volume=4 |doi=10.3389/fcomm.2019.00036 |issn=2297-900X |doi-access=free}}</ref> A May 2023 study found that YouTube was monetizing and profiting from videos that included misinformation about climate change.<ref>{{Cite web |date=May 4, 2023 |title=Google profiting from climate misinformation on YouTube, report finds |url=https://www.independent.co.uk/climate-change/news/google-youtube-climate-disinformation-ads-b2331573.html |access-date=August 27, 2023 |website=The Independent |language=en}}</ref> A 2019 BBC investigation of YouTube searches in ten different languages found that YouTube's algorithm promoted health misinformation, including fake cancer cures.<ref>{{cite news |last1=Carmichael |first1=Flora |last2=Gragani |first2=Juliana |others=Beyond Fake News & BBC Monitoring |title=How YouTube makes money from fake cancer cure videos |work=BBC News |date=September 12, 2019 |url=https://www.bbc.com/news/blogs-trending-49483681 |access-date=September 27, 2019 |language=en}}</ref> In Brazil, YouTube has been linked to pushing pseudoscientific misinformation on health matters, as well as elevating far-right fringe discourse and conspiracy theories.<ref>{{cite news |last1=Fisher |first1=Max |last2=Taub |first2=Amanda |date=August 11, 2019 |title=How YouTube Radicalized Brazil |language=en-US |work=The New York Times |url=https://www.nytimes.com/2019/08/11/world/americas/youtube-brazil.html |access-date=August 12, 2019 |issn=0362-4331}}</ref> In the Philippines, numerous channels disseminated misinformation related to the 2022 Philippine elections.<ref>{{cite news |last=Tuquero |first=Loreben |title=Red flag for 2022: Political lies go unchecked on YouTube showbiz channels |url=https://www.rappler.com/nation/elections/political-lies-unchecked-youtube-showbiz-channels-red-flag-candidates-2022 |access-date=September 23, 2021 |work=] |publisher=Rappler Inc. |date=September 22, 2021 |location=], Philippines}}</ref> Additionally, research on the dissemination of flat earth beliefs in social media has shown that networks of YouTube channels form an echo chamber that polarizes audiences by appearing to confirm preexisting beliefs.<ref>{{cite journal |last1=Diaz Ruiz |first1=Carlos |last2=Nilsson |first2=Tomas |date=August 8, 2022 |title=Disinformation and Echo Chambers: How Disinformation Circulates on Social Media Through Identity-Driven Controversies |journal=Journal of Public Policy & Marketing |volume=42 |language=en |pages=18–35 |doi=10.1177/07439156221103852 |s2cid=248934562 |issn=0743-9156 |doi-access=}}</ref>
|
|
|
|
|
=== Use among white supremacists ===
|
|
Before 2019, YouTube took steps to remove specific videos or channels related to white supremacist content that had violated its acceptable use policies, but otherwise did not have site-wide policies against hate speech.<ref name="youtubeblog june2019">{{cite web |date=June 5, 2019 |title=Our ongoing work to tackle hate |url=https://youtube.googleblog.com/2019/06/our-ongoing-work-to-tackle-hate.html |access-date=April 9, 2020 |via=YouTube}}</ref>
|
|
|
|
|
In the wake of the March 2019 ], YouTube and other sites like Facebook and Twitter that allowed user-submitted content drew criticism for doing little to moderate and control the spread of hate speech, which was considered a contributing factor in the attacks.<ref>{{cite web |last=Robertson |first=Adi |date=March 15, 2019 |title=Questions about policing online hate are much bigger than Facebook and YouTube |url=https://www.theverge.com/2019/3/15/18267638/new-zealand-christchurch-mass-shooting-online-hate-facebook-youtube |access-date=April 9, 2020 |work=]}}</ref><ref>{{cite news |last1=Timberg |first1=Craig |last2=Harwell |first2=Drew |last3=Shaban |first3=Hamza |last4=Ba Tran |first4=Andrew |last5=Fung |first5=Brian |date=March 15, 2019 |title=The New Zealand shooting shows how YouTube and Facebook spread hate and violent images – yet again |url=https://www.washingtonpost.com/technology/2019/03/15/facebook-youtube-twitter-amplified-video-christchurch-mosque-shooting/ |access-date=April 9, 2020 |newspaper=]}}</ref> These platforms were pressured to remove such content, but in an interview with '']'', YouTube's chief product officer Neal Mohan said that unlike content such as ] videos, which follow a particular format and are thus easy to detect through computer-aided algorithms, general hate speech was more difficult to recognize and handle, and thus could not readily be removed without human intervention.<ref>{{cite web |last=Roose |first=Kevin |date=March 29, 2019 |title=YouTube's Product Chief on Online Radicalization and Algorithmic Rabbit Holes |url=https://www.nytimes.com/2019/03/29/technology/youtube-online-extremism.html |access-date=April 9, 2020 |work=]}}</ref>
|
|
|
|
|
In May 2019, YouTube joined an initiative led by France and New Zealand, together with other countries and tech companies, to develop tools to block ] and to develop national-level regulations penalizing technology firms that failed to take steps to remove such speech, though the United States declined to participate.<ref>{{cite web |last=Browne |first=Ryan |date=May 15, 2019 |title=New Zealand and France unveil plans to tackle online extremism without the US on board |url=https://www.cnbc.com/2019/05/15/new-zealand-france-unveil-plans-to-tackle-online-extremism-without-us.html |access-date=April 9, 2020 |publisher=]}}</ref><ref>{{cite web |last=Willsher |first=Kim |date=May 15, 2019 |title=Leaders and tech firms pledge to tackle extremist violence online |url=https://www.theguardian.com/world/2019/may/15/jacinda-ardern-emmanuel-macron-christchurch-call-summit-extremist-violence-online |access-date=April 9, 2020 |work=]}}</ref> Subsequently, on June 5, 2019, YouTube announced a major change to its terms of service, "specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status." YouTube identified specific examples of such videos as those that "promote or glorify Nazi ideology, which is inherently discriminatory". YouTube further stated it would "remove content denying that well-documented violent events, like the Holocaust or ], took place."<ref name="youtubeblog june2019" /><ref>{{cite web |last=Newton |first=Casey |date=June 5, 2019 |title=YouTube just banned supremacist content, and thousands of channels are about to be removed |url=https://www.theverge.com/2019/6/5/18652576/youtube-supremacist-content-ban-borderline-extremist-terms-of-service |access-date=April 9, 2020 |work=]}}</ref>
|
|
|
|
|
In August 2019, the channel of the white nationalist website ] was banned. The ban was later reversed.<ref>{{Cite web |last=Holt |first=Jared |date=August 30, 2019 |title=YouTube Reverses Course, Apologizes to Far-Right Channels & Unbans Them |url=https://www.rightwingwatch.org/post/youtube-reverses-course-apologizes-to-far-right-channels-unbans-them/ |url-status=live |archive-url=https://web.archive.org/web/20230322154216/https://www.rightwingwatch.org/post/youtube-reverses-course-apologizes-to-far-right-channels-unbans-them/ |archive-date=March 22, 2023 |access-date=June 25, 2024 |website=] |language=en-US}}</ref> The channel was permanently banned in August 2020 for violating YouTube's policies against ].<ref>{{Cite web |last=Holt |first=Jared |date=August 10, 2020 |title=White Nationalist VDARE Suspended From YouTube. This Time It's Permanent |url=https://www.rightwingwatch.org/post/vdare-suspended-from-youtube-and-this-time-its-permanent/ |url-status=live |archive-url=https://web.archive.org/web/20240428233756/https://www.rightwingwatch.org/post/vdare-suspended-from-youtube-and-this-time-its-permanent/ |archive-date=April 28, 2024 |access-date=June 25, 2024 |website=Right Wing Watch |language=en-US}}</ref> |
|
|
|
|
|
In September 2018, YouTube limited some videos by ], a white supremacist multimedia company, after it posted a video claiming that white women were being "pushed" into interracial relationships.<ref>{{Cite news |last=Sommer |first=Will |author-link=Will Sommer |date=2018-09-06 |title=YouTube Won't Ban 'They Want You Dead, White Man!' Channel |url=https://www.thedailybeast.com/youtube-wont-ban-they-want-you-dead-white-man-channel |access-date=2024-09-11 |work=] |language=en}}</ref> In October 2019, YouTube banned Red Ice's main channel for hate speech violations. The channel had about 330,000 subscribers. ] and Red Ice promoted a backup channel in an attempt to circumvent the ban.<ref>{{cite news |last1=Ramirez |first1=Nikki McCann |date=October 18, 2019 |title=White nationalist Red Ice TV is promoting a backup channel to skirt its YouTube ban |language=en |work=] |url=https://www.mediamatters.org/white-nationalism/how-white-nationalist-red-ice-tv-working-around-its-youtube-ban |url-status=live |access-date=October 20, 2019 |archive-url=https://web.archive.org/web/20191020215745/https://www.mediamatters.org/white-nationalism/how-white-nationalist-red-ice-tv-working-around-its-youtube-ban |archive-date=October 20, 2019}}</ref><ref>{{cite news |last1=Gais |first1=Hannah |date=October 21, 2019 |title=YouTube Takes Down Red Ice's Main Channel |language=en |work=HateWatch |publisher=] |url=https://www.splcenter.org/hatewatch/2019/10/21/youtube-takes-down-red-ices-main-channel |url-status=live |access-date=October 22, 2019 |archive-url=https://web.archive.org/web/20191022081513/https://www.splcenter.org/hatewatch/2019/10/21/youtube-takes-down-red-ices-main-channel |archive-date=October 22, 2019}}</ref> A week later, the backup channel was also removed by YouTube.<ref>{{cite news |last1=Gais |first1=Hannah |date=October 23, 2019 |title=YouTube Yanks Second Red Ice Channel |language=en |work=HateWatch |publisher=Southern Poverty Law Center 
|url=https://www.splcenter.org/hatewatch/2019/10/23/youtube-yanks-second-red-ice-channel |url-status=live |access-date=October 27, 2019 |archive-url=https://web.archive.org/web/20191025010112/https://www.splcenter.org/hatewatch/2019/10/23/youtube-yanks-second-red-ice-channel |archive-date=October 25, 2019}}</ref><ref name="DailyDot">{{cite news |last1=Katzowitz |first1=Josh |title=Red Ice, a popular white supremacist YouTube channel, has been shut down|work=]|url=https://www.dailydot.com/layer8/red-ice-youtube-ban/ |url-status=live|archive-url=https://web.archive.org/web/20191028205415/https://www.dailydot.com/layer8/red-ice-youtube-ban/ |archive-date=October 28, 2019|date=October 24, 2019|access-date=November 25, 2019}}</ref> |
|
|
|
|
|
In June 2020, YouTube was criticized for having allowed white supremacist content on its platform for years, after it announced it would pledge $1 million to fight racial injustice.<ref>{{Cite web |last=Hamilton |first=Isobel Asher |date=June 1, 2020 |title=YouTube has pledged $1 million in solidarity with Black Lives Matter protesters, but critics note the site has allowed white supremacist videos for years |url=https://www.businessinsider.com/youtube-pledges-1-million-to-fight-racial-injustice-draws-criticism-2020-6 |access-date=May 11, 2024 |website=Business Insider |language=en-US}}</ref> Later that month, it banned several channels associated with white supremacy, including those of ], ], and ], asserting that the channels had violated its policies on hate speech. The ban occurred the same day that ] announced bans on several hate speech sub-forums, including ].<ref>{{cite web |last=Alexander |first=Julia |date=June 29, 2020 |title=YouTube bans Stefan Molyneux, David Duke, Richard Spencer, and more for hate speech |url=https://www.theverge.com/2020/6/29/21307303/youtube-bans-molyneux-duke-richard-spencer-conduct-hate-speech |access-date=June 29, 2020 |work=]}}</ref>
|
|
|
|
|
== Handling of COVID-19 pandemic and other misinformation == |
|
|
Following the dissemination via YouTube of ] that ] communications technology was responsible for the spread of ], which led to multiple 5G towers in the United Kingdom being attacked by arsonists, YouTube removed all videos linking 5G to the coronavirus.<ref name="guardian-youtube-to-suppress-content-spreading-coronavirus-5g-conspiracy-theory">{{cite news |last=Hern |first=Alex |date=April 5, 2020 |title=YouTube moves to limit spread of false coronavirus 5G theory |newspaper=] |url=https://www.theguardian.com/world/2020/apr/05/youtube-to-suppress-content-spreading-coronavirus-5g-conspiracy-theory |access-date=April 5, 2020}}</ref>
|
|
|
|
|
In September 2021, YouTube extended this policy to cover videos disseminating misinformation related to any vaccine approved by local health authorities or the ], including long-approved vaccines against measles or hepatitis B.<ref name="WaPo20210929">{{cite news |last=Pannett |first=Rachel |date=September 29, 2021 |title=Russia threatens to block YouTube after German channels are deleted over coronavirus misinformation |newspaper=The Washington Post |url=https://www.washingtonpost.com/world/2021/09/29/russia-ban-youtube-german-coronavirus/ |access-date=September 30, 2021}}</ref><ref name="NYT20210929">{{cite news |last=Alba |first=Davey |author-link=Davey Alba |date=September 29, 2021 |title=YouTube bans all anti-vaccine misinformation |work=The New York Times |url=https://www.nytimes.com/2021/09/29/technology/youtube-anti-vaxx-ban.html |archive-url=https://ghostarchive.org/archive/20211228/https://www.nytimes.com/2021/09/29/technology/youtube-anti-vaxx-ban.html |archive-date=December 28, 2021 |url-access=limited |access-date=September 30, 2021}}{{cbignore}}</ref> The platform proceeded to remove the accounts of anti-vaccine campaigners such as ] and ].<ref name="NYT20210929" />
|
|
|
|
|
Google and YouTube implemented policies in October 2021 to deny monetization or revenue to advertisers or content creators that promoted ], which "includes content referring to climate change as a hoax or a scam, claims denying that long-term trends show the global climate is warming, and claims denying that greenhouse gas emissions or human activity contribute to climate change."<ref>{{cite web |last=Peters |first=Jay |date=October 7, 2021 |title=Google and YouTube will cut off ad money for climate change deniers |url=https://www.theverge.com/2021/10/7/22715102/google-youtube-climate-change-deniers-ads-monetization |work=] |access-date=October 7, 2021}}</ref> In January 2024, the ] reported that climate change deniers were instead pushing other forms of climate change denial that have not yet been banned by YouTube, including false claims that global warming is "beneficial or harmless", and which undermined climate solutions and ].<ref>{{Cite web |last=Belanger |first=Ashley |title=Climate denialists find new ways to monetize disinformation on YouTube |url=https://arstechnica.com/tech-policy/2024/01/youtube-profits-from-videos-claiming-global-warming-is-beneficial/ |website=Ars Technica|date=January 16, 2024|access-date=January 31, 2024}}</ref><ref>{{Cite news |date=January 17, 2024 |title=YouTube making money off new breed of climate denial, monitoring group says |url=https://www.reuters.com/sustainability/climate-energy/youtube-making-money-off-new-breed-climate-denial-monitoring-group-says-2024-01-16/ |access-date=January 31, 2024 |work=]}}</ref> |
|
|
|
|
|
In July 2022, YouTube announced policies to combat misinformation surrounding ], such as videos with instructions to perform abortion methods that are considered unsafe and videos that contain misinformation about the ].<ref>{{cite web |last=Elias |first=Jennifer |date=July 21, 2022 |title=YouTube says it will crack down on abortion misinformation and remove videos with false claims |url=https://www.cnbc.com/2022/07/21/youtube-says-it-will-crack-down-on-abortion-misinformation.html |access-date=July 21, 2022 |publisher=CNBC |language=en}}</ref> |
|
|
|
|
|
=== Election misinformation === |
|
|
|
|
|
YouTube has extended the moderation of misinformation to non-medical areas. In the weeks following the ], the site added policies to remove or label videos promoting election fraud claims;<ref>{{cite news |url=https://apnews.com/article/youtube-election-misinformation-removal-74ca3738e2774c9a4cf8fbd1e977710f |title=Weeks after election, YouTube cracks down on misinformation |first=Barbara |last=Ortutay |date=December 9, 2020 |access-date=June 2, 2023 |work=]}}</ref><ref>{{Cite web |last=Lee |first=Timothy B. |date=December 9, 2020 |title=YouTube bans videos claiming Trump won |url=https://arstechnica.com/tech-policy/2020/12/youtube-bans-videos-claiming-trump-won/ |access-date=January 31, 2024 |website=] |language=en-us}}</ref> however, it reversed this policy in June 2023, saying the reversal was necessary to allow people to "openly debate political ideas, even those that are controversial or based on disproven assumptions".<ref>{{cite news |date=June 1, 2023 |title=YouTube changes policy to allow false claims about past US presidential elections |url=https://apnews.com/article/youtube-election-misinformation-policy-42a6c1b7623c485dbc04eb76ad443247 |access-date=June 2, 2023 |work=]}}</ref><ref>{{Cite web |last=Brodkin |first=Jon |date=June 2, 2023 |title=YouTube now allows videos that falsely claim Trump won 2020 election |url=https://arstechnica.com/tech-policy/2023/06/youtube-now-allows-videos-that-falsely-claim-trump-won-2020-election/ |access-date=January 31, 2024 |website=Ars Technica |language=en-us}}</ref>
|
|
|
|
|
In the wake of the ], YouTube reported that it had been working to remove content that promoted ], misled voters, or encouraged ]. The platform also vowed to remove election misinformation generated by ].<ref>{{cite news |last=Capoot |first=Ashley |date=November 5, 2024 |title=How your favorite social media apps are preparing for election conspiracies |url=https://www.cnbc.com/2024/11/05/on-election-day-social-media-companies-meta-x-youtube-in-spotlight-.html |access-date=November 11, 2024 |work=]}}</ref> |
|
|
|
|
|
=== Child safety and wellbeing === |
|
|
{{See also|FamilyOFive|Fantastic Adventures scandal|Elsagate}} |
|
|
Leading into 2017, there was a significant increase in the number of videos related to children, driven both by the popularity of parents vlogging their family's activities and by established content creators moving away from often-criticized or demonetized content toward family-friendly material. In 2017, YouTube reported that time spent watching family vloggers had increased by 90%.<ref>{{cite magazine |last=Luscombe |first=Belinda |date=May 18, 2017 |title=The YouTube Parents Who are Turning Family Moments into Big Bucks |url=https://time.com/4783215/growing-up-in-public/ |access-date=June 21, 2019 |magazine=]}}</ref><ref>{{cite web |last=Alexander |first=Julia |date=June 21, 2019 |title=YouTube can't remove kid videos without tearing a hole in the entire creator ecosystem |url=https://www.theverge.com/2019/6/21/18651223/youtube-kids-harmful-content-predator-comments-family-vlogging |access-date=June 21, 2019 |work=]}}</ref> However, with the increase in videos featuring children, the site began to face several controversies related to ]. During Q2 2017, the owners of the popular channel ], who featured themselves playing "pranks" on their children, were accused of ]. 
Their videos were eventually deleted, and two of their children were removed from their custody.<ref name="Ohlheiser2017">{{cite news |last=Ohlheiser |first=Abby |date=April 26, 2017 |title=The saga of a YouTube family who pulled disturbing pranks on their own kids |url=https://www.washingtonpost.com/news/the-intersect/wp/2017/04/25/the-saga-of-a-youtube-family-who-pulled-disturbing-pranks-on-their-own-kids/ |newspaper=]}}</ref><ref name="Cresci2017">{{cite news |last=Cresci |first=Elena |date=May 7, 2017 |title=Mean stream: how YouTube prank channel DaddyOFive enraged the internet |language=en-GB |work=] |url=https://www.theguardian.com/technology/shortcuts/2017/may/07/when-youtube-pranks-go-horribly-wrong |access-date=June 7, 2017 |issn=0261-3077}}</ref><ref name="Dunphy2017">{{cite web |last=Dunphy |first=Rachel |date=April 28, 2017 |title=The Abusive 'Pranks' of YouTube Family Vloggers |url=https://nymag.com/selectall/2017/04/daddyofive-youtube-abuse-controversy-explained.html|work=]|access-date=July 9, 2017}}</ref><ref name="Gajanan2017">{{cite magazine |last=Gajanan |first=Mahita |date=May 3, 2017 |title=YouTube Star DaddyOFive Loses Custody of 2 Children Shown in 'Prank' Videos |url=https://time.com/4763981/daddyofive-mike-martin-heather-martin-youtube-prank-custody/ |access-date=July 9, 2017 |magazine=]}}</ref> A similar case happened in 2019 when the owner of the channel ] was accused of abusing her adopted children. Her videos would later be deleted.<ref>{{cite web |first1=Eric |last1=Levenson |first2=Mel |last2=Alonso |title=A mom on a popular YouTube show is accused of pepper-spraying her kids when they flubbed their lines |url=https://www.cnn.com/2019/03/20/us/youtube-fantastic-adventures-mom-arrest-trnd/index.html |publisher=CNN |date=March 20, 2019}}</ref> |
|
|
|
|
|
Later that year, YouTube came under criticism for showing inappropriate videos targeted at children and often featuring popular characters in violent, sexual or otherwise disturbing situations, many of which appeared on ] and attracted millions of views. The term "]" was coined on the Internet and then used by various news outlets to refer to this controversy.<ref>Ben Popper, , ''The Verge'', February 4, 2017</ref><ref>{{cite web |author=<!--Staff writer(s); no by-line.--> |date=March 31, 2017 |title=Report: Thousands of videos mimicking popular cartoons on YouTube Kids contain inappropriate content |url=https://news10.com/2017/03/31/report-thousands-of-videos-mimicking-popular-cartoons-on-youtube-kids-contain-inappropriate-content/ |access-date=April 30, 2017 |website=NEWS10 ABC |archive-date=August 19, 2017 |archive-url=https://web.archive.org/web/20170819234642/http://news10.com/2017/03/31/report-thousands-of-videos-mimicking-popular-cartoons-on-youtube-kids-contain-inappropriate-content/ |url-status=dead }}</ref><ref name="NYT">{{cite web |last=Maheshwari |first=Sapna |date=November 4, 2017 |title=Child Friendly? Startling Videos Slip Past Filters |url=https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html |url-access=limited |website=The New York Times |id={{ProQuest|2463387110}}}}</ref><ref name="forbes">Dani Di Placido, , '']'', November 28, 2017</ref> On November 11, 2017, YouTube announced it was strengthening site security to protect children from unsuitable content. Later that month, the company started to mass delete videos and channels that made improper use of family-friendly characters. As part of a broader concern regarding child safety on YouTube, the wave of deletions also targeted channels that showed children taking part in inappropriate or dangerous activities under the guidance of adults. 
Most notably, the company removed '']'', a channel with over 8.5 million subscribers, that featured a father and his two daughters in odd and upsetting situations.<ref name="auto">Todd Spangler, , '']'', November 17, 2017</ref><ref name="verge">{{cite news |last=Popper |first=Ben |date=November 9, 2017 |title=YouTube says it will crack down on bizarre videos targeting children |work=] |url=https://www.theverge.com/2017/11/9/16629788/youtube-kids-distrubing-inappropriate-flag-age-restrict |url-status=live |archive-url=https://web.archive.org/web/20171116090955/https://www.theverge.com/2017/11/9/16629788/youtube-kids-distrubing-inappropriate-flag-age-restrict |archive-date=November 16, 2017 |quote=In August of this year, YouTube announced that it would no longer allow creators to monetize videos which "made inappropriate use of family-friendly characters." Today it's taking another step to try to police this genre.}}</ref><ref>Sarah Templeton, , '']'', November 22, 2017</ref><ref>, '']'', November 22, 2017</ref><ref>Charlie Warzel, ], November 22, 2017</ref> According to analytics specialist SocialBlade, it earned up to $11.2 million annually prior to its deletion in November 2017.<ref>{{cite news |last1=Bridge |first1=Mark |last2=Mostrous |first2=Alexi |date=November 18, 2017 |title=Child abuse on YouTube |newspaper=The Times |url=https://www.thetimes.co.uk/article/child-abuse-on-youtube-q3x9zfkch |url-access=subscription |access-date=November 28, 2017}}</ref> |
|
|
|
|
|
Even for videos that appear to be aimed at children and to contain only child-friendly content, YouTube's system allows uploaders to remain anonymous. This has raised concerns in the past, as YouTube has had to remove channels whose children's content, after becoming popular, suddenly included inappropriate material masked as children's content.<ref name="WSJ kids love">{{cite news |last1=Koh |first1=Yoree |last2=Morris |first2=Betsy |date=April 11, 2019 |title=Kids Love These YouTube Channels. Who Creates Them Is a Mystery. |newspaper=The Wall Street Journal |url=https://www.wsj.com/articles/kids-love-these-youtube-channels-who-creates-them-is-a-mystery-11554975000 |url-access=registration |url-status=live |archive-url=https://web.archive.org/web/20190814180500/https://www.wsj.com/articles/kids-love-these-youtube-channels-who-creates-them-is-a-mystery-11554975000 |archive-date=August 14, 2019 |access-date=August 14, 2019}}</ref> Additionally, some of the most-watched children's programming on YouTube comes from channels with no identifiable owners, raising concerns about their intent and purpose. One channel that had been of concern was "]", which provided numerous mass-produced animated videos aimed at children. Up through 2019, it had drawn up to {{USD|10 million}} a month in ad revenue and was one of the largest kid-friendly channels on YouTube before 2020. 
Ownership of Cocomelon was unclear outside of its ties to "Treasure Studio", itself an unknown entity, raising questions as to the channel's purpose,<ref name="WSJ kids love" /><ref>{{cite magazine |last=Martineau |first=Paris |title=YouTube Has Kid Troubles Because Kids Are a Core Audience |url=https://www.wired.com/story/youtube-kid-troubles-kids-core-audience/ |url-status=live |archive-url=https://web.archive.org/web/20190811205146/https://www.wired.com/story/youtube-kid-troubles-kids-core-audience/ |archive-date=August 11, 2019 |access-date=August 14, 2019 |magazine=]}}</ref><ref>{{cite web |last=Graham |first=Jefferson |date=June 22, 2019 |title=Why YouTube's kid issues are so serious |url=https://www.usatoday.com/story/tech/talkingtech/2019/06/22/nursery-rhymes-i-toy-story-porn-youtube-thats-kid-problem/1529724001/ |url-status=live |archive-url=https://web.archive.org/web/20190814181002/https://www.usatoday.com/story/tech/talkingtech/2019/06/22/nursery-rhymes-i-toy-story-porn-youtube-thats-kid-problem/1529724001/ |archive-date=August 14, 2019 |access-date=August 14, 2019 |website=USA Today}}</ref> but in February 2020 '']'' was able to confirm Cocomelon's ownership and interview its small team of American owners, who stated that their goal for the channel was simply to entertain children and that they kept to themselves to avoid attention from outside investors.<ref>{{cite news |last1=Bergan |first1=Mark |last2=Shaw |first2=Lucas |date=February 10, 2020 |title=YouTube's Secretive Top Kids Channel Expands Into Merchandise |url=https://www.bloomberg.com/news/articles/2020-02-10/popular-youtube-kids-channel-cocomelon-gets-into-merch-and-toys |access-date=June 15, 2020 |work=]}}</ref> The anonymity of such channels raises concerns because of the lack of knowledge about what purpose they are trying to serve.<ref name="vice kids content">{{cite web |last=Haskins |first=Caroline |date=March 19, 2019 |title=YouTubers Are Fighting Algorithms to Make Good Content for Kids |url=https://www.vice.com/en_us/article/mbznpy/youtubers-are-fighting-algorithms-to-make-good-content-for-kids |url-status=live |archive-url=https://web.archive.org/web/20190814182839/https://www.vice.com/en_us/article/mbznpy/youtubers-are-fighting-algorithms-to-make-good-content-for-kids |archive-date=August 14, 2019 |access-date=August 14, 2019 |website=]}}</ref> The difficulty of identifying who operates these channels "adds to the lack of accountability", according to Josh Golin of the ], and educational consultant Renée Chernow-O'Leary found the videos were designed to entertain with no intent to educate, leading critics and parents to be concerned about their children becoming too enraptured by the content from these channels.<ref name="WSJ kids love" /> Content creators who earnestly make child-friendly videos have found it difficult to compete with larger channels, as they cannot produce content at the same rate and lack the same promotion through YouTube's recommendation algorithms that the larger animated channel networks share.<ref name="vice kids content" />
|
|
|
|
|
In January 2019, YouTube officially banned videos containing "challenges that encourage acts that have an inherent risk of severe physical harm" (such as the ]) and videos featuring pranks that "make victims believe they're in physical danger" or cause emotional distress in children.<ref>{{cite web |last=Palladino |first=Valentina |date=January 16, 2019 |title=YouTube updates policies to explicitly ban dangerous pranks, challenges |url=https://arstechnica.com/gadgets/2019/01/youtube-updates-policies-to-explicitly-ban-dangerous-pranks-challenges/ |access-date=January 16, 2019 |website=Ars Technica |language=en-us}}</ref> |
|
|
|
|
|
== Sexualization of children and pedophilia == |
|
|
{{See also|Elsagate}} |
|
|
|
|
|
Also in November 2017, it was revealed in the media that many videos featuring children—often uploaded by the minors themselves, and showing innocent content such as the children playing with toys or performing gymnastics—were attracting comments from ]<ref>, '']'', November 15, 2017</ref><ref name="habits">{{cite news |last1=Mostrous |first1=Alexi |last2=Bridge |first2=Mark |last3=Gibbons |first3=Katie |date=November 24, 2017 |title=YouTube adverts fund paedophile habits |newspaper=The Times |url=https://www.thetimes.co.uk/article/youtube-adverts-fund-paedophile-habits-fdzfmqlr5 |url-access=subscription |access-date=November 28, 2017}}</ref> with predators finding the videos through private YouTube playlists or typing in certain keywords in Russian.<ref name="habits" /> Other child-centric videos originally uploaded to YouTube began propagating on the ], and uploaded or embedded onto forums known to be used by pedophiles.<ref>{{cite news |last=Tait |first=Amelia |date=April 24, 2016 |title=Why YouTube mums are taking their kids offline |url=https://www.newstatesman.com/culture/observations/2016/04/why-youtube-mums-are-taking-their-kids-offline |access-date=June 21, 2019 |work=]}}</ref> |
|
|
|
|
|
As a result of the controversy, which added to the concern about "Elsagate", several major advertisers whose ads had been running against such videos froze spending on YouTube.<ref name="forbes" /><ref>Todd Spangler, , '']'', November 25, 2017</ref> In December 2018, '']'' found more than 100 grooming cases in which children were manipulated into sexually explicit behavior (such as taking off clothes, adopting overtly sexual poses and touching other children inappropriately) by strangers.<ref name="Paedophiles">{{cite news |author1=Harry Shukman |author2=Mark Bridge |date=December 10, 2018 |title=Paedophiles grooming children live on YouTube |language=en |work=The Times |url=https://www.thetimes.co.uk/article/paedophiles-grooming-children-live-on-youtube-3fv8gt730 |issn=0140-0460 |url-status=dead |archive-url=https://web.archive.org/web/20181210055232/https://www.thetimes.co.uk/article/paedophiles-grooming-children-live-on-youtube-3fv8gt730 |archive-date=December 10, 2018 |access-date=February 3, 2024}}</ref> After a reporter flagged the videos in question, half of them were removed, and the rest were removed after ''The Times'' contacted YouTube's PR department.<ref name="Paedophiles" />
|
|
|
|
|
In February 2019, YouTube vlogger Matt Watson identified a "wormhole" that would cause the YouTube recommendation algorithm to draw users into this type of video content and then fill their recommendations exclusively with such videos.<ref>{{cite web |last1=Lieber |first1=Chavie |title=YouTube has a pedophilia problem, and its advertisers are jumping ship |url=https://www.vox.com/the-goods/2019/2/27/18241961/youtube-pedophile-ring-child-safety-advertisers-pulling-ads |website=vox.com |date=March 1, 2019}}</ref> Most of these videos drew comments from sexual predators noting timestamps of moments when the children appeared in compromising positions, or making other indecent remarks. In some cases, other users had re-uploaded the videos in unlisted form but with incoming links from other videos, and then monetized them, propagating this network.<ref name="bloomberg mwatson">{{cite news |last1=Bergen |first1=Mark |last2=de Vynck |first2=Gerrit |last3=Palmeri |first3=Christopher |date=February 20, 2019 |title=Nestle, Disney Pull YouTube Ads, Joining Furor Over Child Videos |url=https://www.bloomberg.com/news/articles/2019-02-20/disney-pulls-youtube-ads-amid-concerns-over-child-video-voyeurs |access-date=February 20, 2019 |work=]}}</ref> In the wake of the controversy, the service reported that it had deleted over 400 channels and tens of millions of comments, and reported the offending users to law enforcement and the ]. A spokesperson explained that "any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. 
There's more to be done, and we continue to work to improve and catch abuse more quickly."<ref>{{cite web |last=Alexander |first=Julia |date=February 21, 2019 |title=YouTube terminates more than 400 channels following child exploitation controversy |url=https://www.theverge.com/2019/2/21/18234494/youtube-child-exploitation-channel-termination-comments-philip-defranco-creators |access-date=February 21, 2019 |work=]}}</ref><ref>{{cite web |last=Brodkin |first=Jon |date=February 21, 2019 |title=YouTube loses advertisers over 'wormhole into pedophilia ring' |url=https://arstechnica.com/tech-policy/2019/02/youtube-loses-advertisers-over-wormhole-into-pedophilia-ring/ |access-date=February 22, 2019 |website=Ars Technica |language=en-us}}</ref> Despite these measures, ], ], ], ], and ] all pulled their advertising from YouTube.<ref name="bloomberg mwatson" /><ref>{{cite web |last1=Haselton |first1=Todd |last2=Salinas |first2=Sara |date=February 21, 2019 |title=As fallout over pedophilia content on YouTube continues, AT&T pulls all advertisements |url=https://www.cnbc.com/2019/02/21/att-pulls-all-ads-from-youtube-pedophilia-controversy.html |access-date=February 21, 2019 |publisher=]}}</ref> |
|
Subsequently, YouTube began to demonetize and block advertising on the types of videos that had drawn these predatory comments, explaining that this was a temporary measure while it explored other ways to eliminate the problem.<ref>{{cite web |last=Ingraham |first=Nathan |date=February 22, 2019 |title=YouTube is proactively blocking ads on videos prone to predatory comments |url=https://www.engadget.com/2019/02/22/youtube-blocking-ads-on-videos-predatory-comments/ |access-date=February 22, 2019 |work=Engadget}}</ref> YouTube also began to flag channels that predominantly feature children and to preemptively disable their comment sections. "Trusted partners" can request that comments be re-enabled, but the channel then becomes responsible for moderating them. These actions mainly target videos of toddlers, but videos of older children and teenagers may be protected as well if they contain actions that could be interpreted as sexual, such as gymnastics. YouTube stated it was also working on a better system to remove comments on other channels that matched the style of child predators.<ref>{{cite news |last=Fox |first=Chris |date=February 28, 2019 |title=YouTube bans comments on all videos of kids |language=en-GB |url=https://www.bbc.com/news/technology-47408969 |access-date=March 2, 2019}}</ref><ref>{{cite web |last=Alexander |first=Julia |date=February 28, 2019 |title=YouTube is disabling comments on almost all videos featuring children |url=https://www.theverge.com/2019/2/28/18244954/youtube-comments-minor-children-exploitation-monetization-creators |access-date=February 28, 2019 |work=The Verge}}</ref>
|
A related attempt to algorithmically flag videos containing the string "CP" (an abbreviation of child pornography) produced several prominent false positives involving unrelated topics that use the same abbreviation, including videos related to the mobile video game ''Pokémon Go'' (which uses "CP" as an abbreviation of the statistic "Combat Power") and ''Club Penguin''. YouTube apologized for the errors and reinstated the affected videos.<ref>{{cite web |last=Gerken |first=Tom |date=February 19, 2019 |title=YouTube backtracks after Pokemon 'child abuse' ban |url=https://www.bbc.com/news/technology-47278362 |work=BBC News |access-date=February 20, 2019}}</ref> Separately, online trolls attempted to have videos flagged for takedown by posting comments similar to those the child predators had left; this activity became an issue during the PewDiePie vs T-Series rivalry in early 2019. YouTube stated that it does not take action on videos because of such comments alone, only on those it has flagged as likely to draw child predator activity.<ref>{{cite web |last=Alexander |first=Julia |date=February 28, 2019 |title=Trolls are lying about child porn to try to get YouTube channels taken down |url=https://www.theverge.com/2019/2/28/18241925/youtube-creator-comments-weaponized-trolling-child-exploitation-lies-controversy-lies |access-date=February 28, 2019 |work=The Verge}}</ref>
|
In June 2019, ''The New York Times'' cited researchers who found that users who watched erotic videos could be recommended seemingly innocuous videos of children.<ref>{{cite web |last1=Fisher |first1=Max |last2=Taub |first2=Amanda |date=June 3, 2019 |title=On YouTube's Digital Playground, an Open Gate for Pedophiles |url=https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html |access-date=June 6, 2019 |work=The New York Times}}</ref> As a result, Senator Josh Hawley announced plans to introduce federal legislation that would ban YouTube and other video-sharing sites from including videos that predominantly feature minors as "recommended" videos, excluding those that were "professionally produced", such as videos of televised talent shows.<ref>{{cite web |last=Ingraham |first=Nathan |date=June 6, 2019 |title=A Senator wants to stop YouTube from recommending videos featuring minors |url=https://www.engadget.com/2019/06/06/senator-youtube-bill-stop-featuring-minors-in-recommendations/ |access-date=June 6, 2019 |work=Engadget}}</ref> YouTube suggested that it might remove all videos featuring children from the main YouTube site and transfer them to the YouTube Kids site, where it would have stronger controls over the recommendation system, and that it was considering other major changes to the main site's recommendation and autoplay systems.<ref>{{cite web |last=Copeland |first=Rob |date=June 19, 2019 |title=YouTube, Under Fire, Considers Major Changes to Kids' Content |url=https://www.wsj.com/articles/youtube-under-fire-considers-major-changes-to-kids-content-11560953721 |access-date=June 19, 2019 |work=The Wall Street Journal}}</ref>
|
== Misogyny == |
|
|
An August 2022 report by a British think tank found that harassment against women was flourishing on YouTube. It noted that channels espousing an ideology similar to that of influencer Andrew Tate were using YouTube to grow their audience, despite Tate being banned from the platform.<ref name="misogyny">{{cite news |last=Lorenz |first=Taylor |author-link=Taylor Lorenz |date=September 18, 2022 |title=YouTube remains rife with misogyny and harassment, creators say |language=en-US |newspaper=The Washington Post |url=https://www.washingtonpost.com/technology/2022/09/18/you-tube-mysogyny-women-hate/ |access-date=December 26, 2022 |issn=0190-8286}}</ref> In his 2022 book ''Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination'', Bloomberg reporter Mark Bergen wrote that many female content creators were dealing with harassment, bullying, and stalking.<ref name="misogyny" />
|
==References== |
|
|
{{reflist}} |
|
{{YouTube navbox}} |
|