Elsagate

Elsagate thumbnails featured familiar children's characters in inappropriate or disturbing situations, depicted explicitly or merely implied. Examples included injections, mutilation, childbirth, urination, fellatio, and chemical burning.

Elsagate (derived from Elsa and the -gate scandal suffix) is a controversy surrounding videos on YouTube and YouTube Kids that were categorized as "child-friendly" but contained themes inappropriate for children. These videos often featured fictional characters from family-oriented media, sometimes via crossovers, used without legal permission. The controversy also encompassed channels, such as Toy Freaks, that focused on real-life children and raised concerns about possible child abuse.

Most videos in this category were produced either with live action or Flash animation, but some used claymation or computer-generated imagery.[1] The videos were sometimes tagged in such a way as to circumvent YouTube's child safety algorithms, and some appeared on YouTube Kids. These videos were difficult to moderate due to the large scale of YouTube.[2] To capture search results and attract attention from users, their keyword-laden titles and descriptions featured the names of the fictional characters, as well as terms such as "education", "learn colors", and "nursery rhymes".[3][4][5] They also carried automatically placed commercials, making them lucrative to their owners and to YouTube.[3]

Public awareness of the phenomenon grew in late 2017. That year—after reports on child safety on YouTube by several media outlets—YouTube adopted stricter guidelines regarding children's content. In late November, the platform deleted channels and videos falling into the Elsagate category, as well as large amounts of other inappropriate videos or user comments relating to children.[6] While these efforts, intended to wipe the platform of Elsagate-related media, curbed much of the older content, as of the 2020s similar videos and channels (this time utilizing video games popular with children) have been found to remain pervasively accessible to children on YouTube.[7][8]

History

Early history (2016–2017)

In June 2016, The Guardian published an article about the channel Webs and Tiaras, which had been created in March of the same year. The channel showed people dressed as characters like Spider-Man, Elsa, and the Joker engaging in bizarre or nonsensical actions. The videos had background music but no dialogue, which removed the language barrier that would otherwise hinder worldwide distribution. The article also reported that several nearly identical channels, named Toy Monster, The Superheroes Life, and The Kids Club, had appeared on YouTube.[9]

In January 2017, one channel under the control of a YouTube partner in Vietnam, Spiderman Frozen Marvel Superhero Real Life, blocked their Vietnamese subscribers after complaints from parents regarding the content of their videos.[10] The channel's owner was later fined by Vietnamese authorities.[11]

The increasing number of views led some to voice concerns that such channels were gaming the system by using bots or click farms to inflate viewing figures; however, there is no evidence for this.[9] In early February 2017, Tubefilter interviewed one of Webs and Tiaras' creators, Eric, who insisted that the team had "never used any bots or any other method to increase our views." Given the channel's substantial numbers, he said, "If we were not respecting the rules, YouTube would certainly have taken notice." He also said that his team was based in Canada, that there was no larger company behind his channels, and that they represented a grassroots project among friends.[12]

In February 2017, The Verge commented that "adults dressing up in costume and acting out weird, wordless skits has become a booming industry on the world's biggest video platform" and that while many videos were "puerile but benign", others featured more questionable content, such as scatological humor and violent or sexual situations. The article noted that most videos were made with a very limited budget and "a few Halloween costumes", which made them easy to produce and contributed to their multiplication. It also attributed their success to the frequent use of "Freudian concerns", which young children may find fascinating, amusing, or frightening, such as "peeing, pooping, kissing, pregnancy, and the terrifying notion of going to the doctor and getting a shot".[1]

Also in February, The Awl published an article on Webs and Tiaras and similar channels, describing their content as "nonsensically nightmarish" and "pretty twisted for children's content: some videos involve Elsa giving birth, and in some others, Spider-Man injects Elsa with a brightly colored liquid. You half expect the scenarios to be porn setups." In most videos, the like and dislike options were disabled, which made it difficult to gauge how many users were actually engaging with them. Many videos featured hundreds of positive comments written by similar channels in an apparent attempt to attract more clicks.[13]

In March, the BBC ran a piece titled "The disturbing YouTube videos that are tricking children". The article focused on a Peppa Pig imitation in which the titular character's teeth are painfully pulled out by a dentist, and on a video featuring the character burning down an occupied house. The article also mentioned the existence of "hundreds" of similar videos, ranging from unauthorized but otherwise harmless copies of authentic animations to frightening and gory content.[14]

CTV News also reported in March about YouTube's "fake toons problem", with adult-themed imitations of popular children's shows frequently appearing on YouTube Kids: "In some cases, the video will feature a kid-friendly thumbnail, while the video itself might be entirely different" and be very unsuitable for small children. The network commented that such videos were "often nightmares to behold, with lots of frightening scenes involving monsters and blood. Many of these videos venture into dark territory, with the characters often being chased, attacked, or injured in a bloody manner."[15]

The term "Elsagate" was coined on the Internet in 2017. During the summer of that year, it became a popular hashtag on Twitter as users called attention to the presence of such material on YouTube and YouTube Kids.[16] On Reddit, an Elsagate subreddit (r/ElsaGate) was created on June 23 to discuss the phenomenon, soon attracting tens of thousands of users.[17]

Discovery of Elsagate videos (2017)

In November 2017, several newspapers published articles about the YouTube channel Toy Freaks, which had been created two years earlier by a single father named Greg Chism. Toy Freaks had a total of 8.54 million subscribers and was among the 100 most-viewed channels before it was shut down that month. The channel often featured Chism's daughters, and in most cases showed them scared or crying.[18][19]

Several individuals, including the rapper B.o.B and commentary channel h3h3Productions, discussed Elsagate on social media during this time.[20]

On November 4, The New York Times published an article about the "startling" videos slipping past YouTube's filters and disturbing children, "either by mistake or because bad actors have found ways to fool the YouTube Kids' algorithms".[3] On November 6, author James Bridle wrote on Medium about his concern over videos aimed at scaring, hurting, and exploiting children, which he said were common on YouTube. He observed that many of these videos were difficult to classify, as obvious parodies and imitations interacted with algorithm-driven content creators, producing content that mashed together popular tropes, characters, and keywords until even videos featuring real people resembled automated content.[4] On November 17, Internet commentator Philip DeFranco posted a video addressing the issue.[21]

The New York Times found that one of the channels featuring counterfeit cartoons, Super Zeus TV, was linked to a website called SuperKidsShop.com, registered in Ho Chi Minh City, Vietnam. A man working for SuperKidsShop.com confirmed that his partners were responsible for the videos, which were being produced by "a team of about 100 people". Subsequent requests for an interview went unanswered.[3]

On November 9, members of the satirical sound collage group Negativland presented an episode of their weekly radio show Over the Edge dedicated to Elsagate. "'Modern Animal Kids'[22] threads Elsagate through a remix of three '90s episodes of Over the Edge which focused on media for children, all broadcast in the final years before Teletubbies pioneered marketing to the 6- to 18-month-old demographic".[23]

On November 22, BuzzFeed News published an article about unsettling videos that depicted children in disturbing and abusive situations. The article drew on the work of journalist and human rights activist Matan Uziel, whose investigation and report on the matter had been sent to the FBI on September 22, informing its leadership about "tens of thousands of videos available on YouTube that we know are crafted to serve as eye candy for perverted, creepy adults and online predators to indulge in their child fantasies".[24]

On November 23, French-Canadian outlet Tabloïd released a video investigation about Toy Monster, a channel linked to Webs and Tiaras. They confronted the videos' creators – based on the south shore of Quebec City – who refused to be interviewed. One of the actors featured in the videos anonymously stated that he was contractually obligated to refrain from commenting. The investigation revealed that identical content was being posted on numerous channels apparently operated by the same people.[25]

2020s

Although YouTube initiated a crackdown on Elsagate content in 2017, videos containing similar content have continued to surface on the website in the years since. Much of this content is based on video game franchises popular with children, such as Minecraft, Among Us, or Poppy Playtime, and is both marketed towards, and freely accessible to, children. While YouTube Kids disallows inappropriate content and is intended to steer children away from the main app, the efficacy of that approach has been called into question.[7][8]

An investigation published by Wired on 30 March 2021 found dozens of "disturbing" or "grotesque" animated videos, targeting Minecraft and Among Us fans, that were featured under YouTube's "Topics" or "hashtags" pages for the games. The magazine argued that these newer discoverability features lacked moderation and allowed "opportunistic" channels to display questionable or inappropriate content. One livestream showcased an animated video of "a female Minecraft avatar with bare breasts opening a present full of the poop emoji", as well as a thumbnail featuring "two inflated breasts [holding] up a poop Minecraft brick"; other inappropriate or disturbing Minecraft- or Among Us-style thumbnails were easily found via the Topics or hashtags sections. Still, the magazine acknowledged the 2017 purge and stated that its findings did not represent "a direct Elsagate repeat", noting that these videos were not on YouTube Kids, that much of their shocking content was limited to the thumbnails, and that many of the more obvious ways of targeting children had been curtailed by the purge.[7]

In 2022, Newsweek also reported on channels and videos containing similarly inappropriate content or thumbnails, featuring characters from horror games such as Five Nights at Freddy's and Poppy Playtime that were popular with children. The magazine described suggestive thumbnails, as well as content that featured drugs (including date rape drugs) and violence, such as murder, school shootings, or physical abuse towards children and female characters. YouTube responded, stating that "the content shared has not been found in the YouTube Kids app which is our recommended experience for children under 13." However, Sonia Livingstone, professor at the London School of Economics and expert on children's digital safety, argued that this was "insufficient" given that the main app still had the content in question accessible to children, with poor "moderation, labeling," and "age-gating." Newsweek further contended that, while YouTube encourages creators to age-restrict content intended for older users and disallows "misleading" content that targets minors while displaying inappropriate themes, none of the videos they found had been age-restricted.[8]

Effect on children

The New York Times quoted pediatrics professor Michael Rich, who considered it upsetting that "characters [children] thought they knew and trusted" were shown behaving in an improper or violent manner.[3]

Response from YouTube

In response to the controversy, on 1 June 2017, YouTube changed its guidelines, specifically banning depictions of children's characters in inappropriate situations. The ban had little effect, however, and in the months that followed, the subreddit r/ElsaGate became a reliable place for amateur investigators to raise the alarm about videos that slipped through YouTube's moderation system.[26]

In August 2017, YouTube announced its new guidelines on content and monetization. In an ongoing series of efforts to demonetize controversial and offensive videos, it was announced that creators would no longer be able to monetize videos that "made inappropriate use of family-friendly characters". In November of the same year, it announced that it would implement "a new policy that age restricts this content in the YouTube main app when flagged".[27]

The controversy extended to channels that featured not necessarily children's characters but actual children, who sometimes performed inappropriate or dangerous activities under the guidance of adults. As part of a broader action, YouTube terminated the channel Toy Freaks, which featured a father (Greg Chism) and his two daughters in potentially abusive situations.[28][29][30] Chism was subsequently investigated by child-protection officials in Illinois and Missouri for alleged child abuse.[18][31][32] In December 2017, authorities announced that Chism would not face criminal charges.[33] Before its removal, the channel had over 8.5 million subscribers.[28][29][30]

It was also revealed in the media that many videos featuring minors – frequently uploaded by the children themselves and showing innocent content – had attracted comments from pedophiles and other groups. Some of these videos were monetized. As a result of the controversy, several major advertisers froze their spending on YouTube.[34][35][36]

On November 22, 2017, YouTube announced that it had deleted over 50 channels and thousands of videos that did not fit the new guidelines.[37] On November 27, the company said in a statement to BuzzFeed News that it had "terminated more than 270 accounts and removed over 150,000 videos", "turned off comments on more than 625,000 videos targeted by child predators" and "removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content".[38] Forbes contributor Dani Di Placido wrote that many problematic videos could still be seen on the platform, and that "the sheer volume of videos hastily deleted from the site prove that YouTube's algorithms were utterly ineffective at protecting young children".[34]

In December 2017, as advertisers pulled ads, YouTube CEO Susan Wojcicki announced new moderation guidelines, removing inappropriate children's content and bringing the total number of moderators up to 10,000.[26]

References

  1. ^ a b Popper, Ben (2017-02-20). "Adults dressed as superheroes is YouTube's new, strange, and massively popular genre". The Verge. Archived from the original on 2017-11-13. Retrieved 2024-03-27.
  2. ^ "Does 'Elsagate' prove YouTube is too big to control?". The Week. 25 November 2017. Archived from the original on 1 December 2017. Retrieved 21 November 2017.
  3. ^ a b c d e Maheshwari, Sapna (2017-11-04). "On YouTube Kids, Startling Videos Slip Past Filters". The New York Times. ISSN 0362-4331. Archived from the original on 2019-06-26. Retrieved 2024-03-28.
  4. ^ a b Bridle, James (2018-06-21). "Something is wrong on the internet". Medium. Archived from the original on 2017-11-23. Retrieved 2024-03-27.
  5. ^ "Fall of linear TV turns kids towards dark corners of YouTube". Berkeley High Jacket. 2023-01-28. Archived from the original on 2024-03-28. Retrieved 2024-03-28.
  6. ^ "5 ways we're toughening our approach to protect families on YouTube and YouTube Kids". blog.youtube. Archived from the original on 2024-03-27. Retrieved 2024-03-27.
  7. ^ a b c D'Anastasio, Cecilia. "Blood, Poop, and Violence: YouTube Has a Creepy Minecraft Problem". Wired. Retrieved 5 July 2024.
  8. ^ a b c Hajjaji, Danya. "Violent YouTube Cartoons Exploit Children's Favorite Horror Characters". Newsweek. Retrieved 5 July 2024.
  9. ^ a b Dredge, Stuart (2016-06-29). "YouTube's latest hit: neon superheroes, giant ducks and plenty of lycra". The Guardian. ISSN 0261-3077. Archived from the original on 2017-11-23. Retrieved 2024-03-27.
  10. ^ Nam, Dinh (18 January 2017). "Vietnamese parents shocked over adult content in YouTube videos for children". VnExpress International. Archived from the original on 1 December 2017. Retrieved November 24, 2017.
  11. ^ "Vietnam man fined for suggestive YouTube parody channel". Vietnam Life. 2017-01-19. Archived from the original on 2024-03-27. Retrieved 2024-03-27.
  12. ^ "YouTube's Latest Bizarre Trend Has Adults Dressing Up In Spider-Man And Elsa Costumes". Tubefilter. 2017-02-02. Archived from the original on 2023-06-02. Retrieved 2024-03-28.
  13. ^ "The Ballad Of Elsa And Spiderman - The Awl". 2017-12-01. Archived from the original on 2017-12-01. Retrieved 2024-03-27.
  14. ^ "The disturbing YouTube videos that are tricking children". BBC News. 2017-03-27. Archived from the original on 2019-06-26. Retrieved 2018-04-27.
  15. ^ "Fake toons: Kids falling prey to adult parodies of popular children's shows". CTVNews. 2017-03-28. Archived from the original on 2024-03-27. Retrieved 2024-03-27.
  16. ^ "'ElsaGate' hashtag calling attention to disturbing YouTube content". WGN-TV. 2017-07-26. Archived from the original on 2024-03-27. Retrieved 2024-03-27.
  17. ^ "Elsagate: The sick YouTube videos targeting your kids". Newshub. Archived from the original on 2024-03-27. Retrieved 2024-03-27.
  18. ^ a b "YouTuber with ties to St. Louis allegedly under investigation for questionable videos". KSDK. 1 December 2017. Archived from the original on 12 March 2023. Retrieved 3 December 2017.
  19. ^ Koerber, Brian (11 November 2017). "YouTube bans the creepy and weird kid channel 'Toy Freaks'". Mashable. Archived from the original on 19 January 2018. Retrieved 25 January 2018.
  20. ^ Schroeder, Audra (6 July 2017). "B.o.B. discovers the weird world of pregnant Elsa videos". The Daily Dot. Archived from the original on 11 October 2017. Retrieved 25 November 2017.
  21. ^ DeFranco, Philip (2017-11-17). Why We Need To Talk About The Insane YouTube Kids Problem… #Elsagate (video). Archived from the original on 2017-11-23. Retrieved 2017-11-28.
  22. ^ Episode downloadable at negativland.com's OTE files Archived 2018-01-07 at the Wayback Machine.
  23. ^ "Negativland - SIX HOURS OF CONTENT TOGETHER THROUGH LIFE OVER THE EDGE "Living Monitress" Broadcast November 30, 2017 Jon Leidecker Gamify your self-employment. Forced to take a look at all the good things your telephone can do. OVER THE EDGE "Modern Animal Kids" Broadcast November 9, 2017 Jon Leidecker Many of the strangest Elsagate videos have now been scrubbed, but if it weren't clear before, there's no one in charge. Pregnant Elsa Injection Finger Spiderman Scream Face Daddy Indoor Fun Family Song Minnie Mouse Spider Cartoon. Tonight's episode threads Elsagate through a remix of three 90's episodes of Over The Edge which focused on media for children, all broadcast in the final years before Teletubbies pioneered marketing to the 6 to 18 month old demographic". Facebook. Archived from the original on March 27, 2024. Retrieved 2024-03-27.
  24. ^ Warzel, Charlie (22 November 2017). "YouTube Is Addressing Its Massive Child Exploitation Problem". BuzzFeed. Archived from the original on 22 November 2017. Retrieved 22 November 2017.
  25. ^ Derrière le masque de Spiderman Archived 2021-03-08 at the Wayback Machine (in French), Tabloïd, 23 November 2017.
  26. ^ a b Brandom, Russell (2017-12-08). "Inside Elsagate, the conspiracy-fueled war on creepy YouTube kids videos". The Verge. Archived from the original on 2018-04-13. Retrieved 2024-03-28.
  27. ^ Popper, Ben (9 November 2017). "YouTube says it will crack down on bizarre videos targeting children". The Verge. Archived from the original on 2017-11-16. In August of this year, YouTube announced that it would no longer allow creators to monetize videos which "made inappropriate use of family-friendly characters". Today it's taking another step to try and police this genre.
  28. ^ a b Spangler, Todd (2017-11-17). "YouTube Terminates Toy Freaks Channel Amid Broader Crackdown on Disturbing Kids' Content". Variety. Archived from the original on 2018-01-29. Retrieved 2024-03-27.
  29. ^ a b "Disturbing, 'abusive' 'Toy Freaks' video channel removed from YouTube". Newshub. Archived from the original on 2024-04-01. Retrieved 2024-03-27.
  30. ^ a b Warzel, Charlie (2017-11-22). "YouTube Is Addressing Its Massive Child Exploitation Problem". BuzzFeed News. Archived from the original on 2024-03-30. Retrieved 2024-03-27.
  31. ^ Gibbons, Katie (30 November 2017). "Toy Freaks YouTube video father Greg Chism faces abuse inquiry". The Times. Archived from the original on 3 December 2017. Retrieved 3 December 2017.
  32. ^ Smidt, Remy (30 November 2017). "Authorities Say YouTube's Toy Freaks Dad Is Under Investigation — But They Won't Say Who's In Charge". BuzzFeed News. Archived from the original on 3 December 2017. Retrieved 3 December 2017.
  33. ^ Gutelle, Sam (2017-12-07). "After Investigation, Father Behind Terminated YouTube Channel Toy Freaks Won't Face Criminal Charges". Tubefilter. Archived from the original on 2018-06-30. Retrieved 2018-06-30.
  34. ^ a b Placido, Dani Di. "YouTube's "Elsagate" Illuminates The Unintended Horrors Of The Digital Age". Forbes. Archived from the original on 2017-12-01. Retrieved 2024-03-27.
  35. ^ Spangler, Todd (2017-11-25). "YouTube Faces Advertiser Boycott Over Videos With Kids That Attracted Sexual Predators". Variety. Archived from the original on 2018-01-29. Retrieved 2024-03-27.
  36. ^ Mostrous, Alexi; Bridge, Mark; Gibbons, Katie (24 November 2017). "YouTube adverts fund paedophile habits". The Times. Archived from the original on 26 November 2017. Retrieved 28 November 2017.
  37. ^ "YouTube to crack down on videos showing child endangerment". ABC News. Archived from the original on 2017-11-24. Retrieved 2024-03-27.
  38. ^ Montgomery, Blake (2017-11-28). "YouTube Has Deleted Hundreds Of Thousands Of Disturbing Kids' Videos". BuzzFeed News. Archived from the original on 2024-02-24. Retrieved 2024-03-27.