YouTube should be a safer platform
January 17, 2018
With countless videos uploaded every day, YouTube is the most popular video platform in the world, and in its more than a decade of existence it has left a permanent mark on digital culture. Many YouTube users, also known as YouTubers, make a living by monetizing their videos. But where YouTube falls short is in disciplining creators who break its rules, which can be dangerous for a site where many of its users are young children.
On Jan. 1, famous YouTuber Logan Paul uploaded a video of himself venturing into the Aokigahara Forest in Japan, a location well known for the number of suicides that take place there. During his vlog, Paul came across the hanging corpse of a man who had recently committed suicide. Rather than deleting the footage, Paul kept the corpse in the video and even used an image of it as the thumbnail, paired with a clickbait title, sparking outrage across the internet. Paul removed the video and gave, in my opinion, a half-hearted apology that showed little regard for the impact of what he did.
Paul initially faced zero consequences from YouTube for showing the corpse, even though the video violated the site’s guidelines, and he was only punished once outrage spread across the internet. Meanwhile, videos condemning Paul’s actions, such as one uploaded by YouTuber Nathan Zed, were age-restricted, demonetized or deleted by YouTube.
This is not the first time something like this has happened. Last year, popular YouTuber Pewdiepie faced backlash several times for repeatedly making antisemitic remarks and using racial slurs. In that case, Pewdiepie’s upcoming YouTube Red series was canceled, but that is about all that happened. Pewdiepie still makes money off his videos and remains one of the most subscribed channels on the site. On the flip side, YouTube has demonetized videos with LGBT themes even when they contain nothing inappropriate.
What’s even more concerning is that a majority of both Paul’s and Pewdiepie’s audiences are children. Surely you know or have met a child who obsessively watches YouTube videos on a tablet or phone. What happens when children are exposed to this disturbing content? As it turns out, the problem goes deeper. A look at the Elsagate community on Reddit reveals a whole other world of inappropriate content, packaged in a rather bizarre way.
The term “Elsagate” has come to refer to a genre of YouTube videos that depict popular cartoon characters, such as Elsa from Frozen or Spider-Man, in inappropriate situations, all hidden behind an innocent-looking thumbnail. These “inappropriate situations” involve public urination, giving birth, needles, gore, sexual activities and more. Viewing this material can be damaging and even traumatizing to a child, if not now, then when they are older. Many of these videos come from some of the most popular children’s channels on YouTube and also appear on the YouTube Kids app. Parents remain unaware of what their kids are watching because the videos usually include happy, upbeat music.
Perhaps these videos cannot easily be found with a simple search, but when kids sit at a tablet all day clicking video after video, the videos eventually start to show up. Some suspect that those who create these disturbing videos are simply after views or have a sick sense of humor. Others believe it to be a form of brainwashing or behavior modification. Kids are prone to imitate what they see, after all, through a psychological process known as modeling.
“They’re using what kids find attractive and what YouTube will promote more to children,” Elsagate Reddit moderator Lfodder said in an article on The Verge. “You have kids, maybe four years old, bombarded all day long in some cases, with these videos of chopping people’s fingers off and burning people and defecating on people.”
Does this mean that kids exposed to these videos will start grabbing knives and cutting off fingers? Not necessarily, but it does normalize the behavior, much like the way we become desensitized to violence on television.
In 2017, YouTube implemented stricter guidelines and moderation for content aimed at children, but videos still slip past the filters. All one can do is report the videos and hope they are taken down. Parents should also be aware of these videos and monitor what their children are watching. Some parents have started watching YouTube videos with their children, or have prohibited the use of YouTube altogether.
More importantly, users who upload inappropriate content should be held accountable for their actions, no matter how many subscribers they have or whether their audience includes children. Logan Paul showed no respect for the suicide victim, Pewdiepie has spewed blatant hate speech and Elsagate videos have the potential to permanently traumatize children.
Even if this were the first time something like this had happened (which it isn’t), there should be some sort of strike system in place. After enough strikes, users should have their videos removed or demonetized, or their channel terminated, depending on the severity of the offense. YouTube is becoming more and more unsafe for young audiences, and that cannot be ignored any longer.