YouTube has existed since 2005, and only now, in 2019, has it finally decided to ban videos that promote Nazi ideology or claim that 9/11 didn’t happen. The video platform says it has always generally banned content promoting hate, despite the fact that one of its biggest stars, gaming streamer PewDiePie – who is consistently pulled up for spouting anti-Semitic abuse – still flourishes on the site with over 96 million followers.
Swathes of accounts are now set to be deleted, however, after the platform reviewed its terms of service. In a statement, YouTube said: “Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.”
YouTube did not provide specific examples of accounts that will be caught out by the new terms, and admitted that it could take some time for the ban to take effect: “We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we’ll be gradually expanding coverage over the next several months."
The fact that a platform of YouTube’s size and influence has actively allowed such content to be uploaded and circulated for so many years is extremely worrying, especially given the instances in which its influence bleeds into real life: the Christchurch mosque shooter who killed 51 worshippers in New Zealand referred to PewDiePie in the live stream of the attack.
Yes, the shooter’s manifesto referenced many other alt-right memes, but YouTube content forms part of this extremely online, hate-filled language. The site, along with Facebook and Twitter, was also criticised at the time for being too slow to remove footage of the shooting.