
Offensive content on YouTube


See also: Criticism of Google § YouTube, Censorship by Google § YouTube, and Content moderation

YouTube has a set of community guidelines aimed at reducing abuse of the site's features. Uploading videos containing defamation, pornography, or material encouraging criminal conduct is forbidden by YouTube's "Community Guidelines". Generally prohibited material includes sexually explicit content, videos of animal abuse, shock videos, content uploaded without the copyright holder's consent, hate speech, spam, and predatory behavior. YouTube relies on its users to flag videos as inappropriate, and a YouTube employee reviews each flagged video to determine whether it violates the site's guidelines. Despite the guidelines, YouTube has faced criticism over aspects of its operations, including recommendation algorithms that perpetuate videos promoting conspiracy theories and falsehoods, the hosting of videos ostensibly targeting children but containing violent or sexually suggestive content involving popular characters, videos of minors attracting pedophilic activity in their comment sections, and fluctuating policies on the types of content eligible to be monetized with advertising.

YouTube contracts companies to hire content moderators, who view content flagged as potentially violating YouTube's content policies and determine whether it should be removed. In September 2020, a class-action suit was filed by a former content moderator who reported developing post-traumatic stress disorder (PTSD) after an 18-month period on the job.

Controversial moderation decisions have included material relating to Holocaust denial, the Hillsborough disaster, Anthony Bourdain's death, and the Notre-Dame fire. In July 2008, the Culture and Media Committee of the House of Commons of the United Kingdom stated that it was "unimpressed" with YouTube's system for policing its videos, and argued that "proactive review of content should be standard practice for sites hosting user-generated content".

In June 2022, Media Matters, a media watchdog group, reported that homophobic and transphobic content calling LGBT people "predators" and "groomers" was becoming more common on YouTube. The report also referred to common accusations in YouTube videos that LGBT people are mentally ill. The report stated the content appeared to be in violation of YouTube's hate speech policy.

An August 2022 report by the Center for Countering Digital Hate, a British think tank, found that harassment against women was flourishing on YouTube. In his 2022 book Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination, Bloomberg reporter Mark Bergen said that many female content creators were dealing with harassment, bullying, and stalking.

Conspiracy theories and far-right content

YouTube has been criticized for using an algorithm that gives great prominence to videos that promote conspiracy theories, falsehoods, and incendiary fringe discourse. According to an investigation by The Wall Street Journal, "YouTube's recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints." After YouTube drew controversy for giving top billing to videos promoting falsehoods and conspiracy theories when people made breaking-news queries during the 2017 Las Vegas shooting, YouTube changed its algorithm to give greater prominence to mainstream media sources.

In 2017, it was revealed that advertisements were being placed on extremist videos, including videos by rape apologists, anti-Semites, and hate preachers who received ad payouts. After firms started to stop advertising on YouTube in the wake of this reporting, YouTube apologized and said that it would give firms greater control over where ads got placed.

University of North Carolina professor Zeynep Tufekci has referred to YouTube as "The Great Radicalizer", saying "YouTube may be one of the most powerful radicalizing instruments of the 21st century." Jonathan Albright of the Tow Center for Digital Journalism at Columbia University described YouTube as a "conspiracy ecosystem".

Use among white supremacists

Before 2019, YouTube took steps to remove specific videos or channels with supremacist content that violated its acceptable-use policies, but otherwise had no site-wide policies against hate speech.

In the wake of the March 2019 Christchurch mosque attacks, YouTube and other sites that allow user-submitted content, such as Facebook and Twitter, drew criticism for doing little to moderate and control the spread of hate speech, which was considered to be a factor in the rationale for the attacks. These platforms were pressured to remove such content, but in an interview with The New York Times, YouTube's then chief product officer Neal Mohan said that unlike content such as ISIS videos, which take a particular format and are thus easy to detect through computer-aided algorithms, general hate speech was more difficult to recognize and handle, and could not readily be removed without human review.

In May 2019, YouTube joined an initiative led by France and New Zealand, along with other countries and tech companies, to develop tools to block online hate speech and to draft national-level regulations penalizing technology firms that failed to take steps to remove such speech; the United States declined to participate. Subsequently, on June 5, 2019, YouTube announced a major change to its terms of service, stating it would "remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place."

In June 2020, YouTube was criticized for having allowed white supremacist content on its platform for years when it announced it would pledge $1 million to fight racial injustice. Later that month, it banned several channels associated with white supremacy, including those of Stefan Molyneux, David Duke, and Richard B. Spencer, asserting that these channels violated its policies on hate speech.

Misinformation and handling of the COVID-19 pandemic

Multiple research studies have investigated cases of misinformation on YouTube. In a July 2019 study based on ten YouTube searches using the Tor Browser related to climate and climate change, the majority of videos communicated views contrary to the scientific consensus on climate change. A May 2023 study found that YouTube was monetizing and profiting from videos that included misinformation about climate change. A 2019 BBC investigation of YouTube searches in ten different languages found that YouTube's algorithm promoted health misinformation, including fake cancer cures. In Brazil, YouTube has been linked to pushing pseudoscientific misinformation on health matters, as well as elevating far-right fringe discourse and conspiracy theories. In the Philippines, numerous channels disseminated misinformation related to the 2022 Philippine elections. Additionally, research on the dissemination of Flat Earth beliefs in social media has shown that networks of YouTube channels form an echo chamber that polarizes audiences by appearing to confirm preexisting beliefs.

In 2018, YouTube introduced a system that automatically adds information boxes to videos that its algorithms determine may present conspiracy theories and other fake news, filling the infobox with content from Encyclopædia Britannica and Wikipedia as a means of informing users and minimizing the propagation of misinformation without impacting freedom of speech. In 2023, YouTube announced changes to its handling of content associated with eating disorders; the platform's Community Guidelines now prohibit content that could encourage imitation by at-risk users.

In January 2019, YouTube said that it had introduced a new policy, starting in the United States, intended to stop recommending videos containing "content that could misinform users in harmful ways", giving flat earth theories, miracle cures, and 9/11 conspiracy theories as examples. Earlier efforts within YouTube engineering to stop recommending borderline extremist videos that fell just short of forbidden hate speech, and to track their popularity, had been rejected because they could interfere with viewer engagement. In July 2022, YouTube announced policies to combat misinformation surrounding abortion, such as videos with instructions to perform abortion methods considered unsafe and videos containing misinformation about the safety of abortion. Google and YouTube implemented policies in October 2021 to deny monetization or revenue to advertisers or content creators promoting climate change denial. In January 2024, the Center for Countering Digital Hate reported that climate change deniers were instead pushing other forms of climate change denial that had not yet been banned by YouTube.

After misinformation disseminated via YouTube during the COVID-19 pandemic claimed that 5G communications technology was responsible for the spread of coronavirus disease 2019, leading arsonists to attack multiple 5G towers in the United Kingdom, YouTube removed all videos linking 5G and the coronavirus in this manner.

In September 2021, YouTube extended this policy to cover videos disseminating misinformation about any vaccine approved by local health authorities or the World Health Organization, including long-approved vaccines against measles or hepatitis B. The platform proceeded to remove the accounts of anti-vaccine campaigners such as Robert F. Kennedy Jr. and Joseph Mercola. YouTube also extended this moderation to non-medical areas: in the weeks following the 2020 United States presidential election, the site added policies to remove or label videos promoting election fraud claims. However, it reversed this policy in June 2023, citing the need to "openly debate political ideas, even those that are controversial or based on disproven assumptions".

Child safety and wellbeing

See also: FamilyOFive, Fantastic Adventures scandal, and Elsagate

Leading into 2017, there was a significant increase in the number of videos related to children, driven both by the popularity of parents vlogging their families' activities and by established content creators moving away from content that was often criticized or demonetized toward family-friendly material. In 2017, YouTube reported that time spent watching family vloggers had increased by 90%. However, with the increase in videos featuring children, the site began to face several controversies related to child safety, including those involving the popular channels FamilyOFive and Fantastic Adventures.

Later that year, YouTube came under criticism for showing inappropriate videos targeted at children, often featuring popular characters in violent, sexual or otherwise disturbing situations, many of which appeared on YouTube Kids and attracted millions of views. The term "Elsagate" was coined on the Internet and then used by various news outlets to refer to this controversy. Following the criticism, YouTube announced it was strengthening site security to protect children from unsuitable content, and the company began mass-deleting videos and channels that made improper use of family-friendly characters. As part of a broader concern regarding child safety on YouTube, the wave of deletions also targeted channels that showed children taking part in inappropriate or dangerous activities under the guidance of adults.

Even for videos that appear to be aimed at children and to contain only child-friendly content, YouTube's system allows uploaders to remain anonymous. These questions have been raised in the past, as YouTube has had to remove channels with children's content which, after becoming popular, suddenly included inappropriate content masked as children's content. The anonymity of such channels raises concerns because it is unclear what purpose they serve. The difficulty of identifying who operates these channels "adds to the lack of accountability", according to Josh Golin of the Campaign for a Commercial-Free Childhood, and educational consultant Renée Chernow-O'Leary found the videos were designed to entertain with no intent to educate, leading critics and parents to worry about their children becoming too enraptured by the content from these channels. Content creators who earnestly make child-friendly videos have found it difficult to compete with larger channels, as they cannot produce content at the same rate and lack the promotion through YouTube's recommendation algorithms that the larger animated channel networks share.

In January 2019, YouTube officially banned videos containing "challenges that encourage acts that have an inherent risk of severe physical harm" (such as the Tide Pod Challenge) and videos featuring pranks that "make victims believe they're in physical danger" or cause emotional distress in children.

Sexualization of children and pedophilia

See also: Elsagate

In November 2017, it was revealed in the media that many videos featuring children, often uploaded by the minors themselves and showing innocent content such as the children playing with toys or performing gymnastics, were attracting comments from pedophiles, with predators finding the videos through private YouTube playlists or by typing certain keywords in Russian. Other child-centric videos originally uploaded to YouTube began propagating on the dark web and were uploaded or embedded onto forums known to be used by pedophiles.

As a result of the controversy, which added to the concern about "Elsagate", several major advertisers whose ads had been running against such videos froze spending on YouTube. In December 2018, The Times found more than 100 grooming cases in which children were manipulated into sexually implicit behavior (such as taking off clothes, adopting overtly sexual poses and touching other children inappropriately) by strangers.

In February 2019, YouTube vlogger Matt Watson identified a "wormhole" in which the YouTube recommendation algorithm would draw users into this type of video content and then make all of that user's recommended content feature only these types of videos. Most of these videos carried comments from sexual predators giving timestamps of when the children were shown in compromising positions, or otherwise making indecent remarks. In the wake of the controversy, the service reported that it had deleted over 400 channels and tens of millions of comments, and had reported the offending users to law enforcement and the National Center for Missing and Exploited Children. Despite these measures, several large advertisers pulled their advertising from YouTube.

Subsequently, YouTube began to demonetize and block advertising on the types of videos that have drawn these predatory comments. YouTube also began to flag channels that predominantly feature children, and preemptively disable their comments sections.

A related attempt to algorithmically flag videos containing references to the string "CP" (an abbreviation of child pornography) resulted in some prominent false positives involving unrelated topics that use the same abbreviation, such as Pokémon videos discussing the game's "combat power" statistic. YouTube apologized for the errors and reinstated the affected videos.

In June 2019, The New York Times cited researchers who found that users who watched erotic videos could be recommended seemingly innocuous videos of children.
