Updated on September 3, 2019 to reflect new data about the number of videos and channels removed for exhibiting hateful content.
YouTube announced that it’s banning extremist videos that promote white supremacy, neo-Nazi ideology, and conspiracy theories.
In a blog post, YouTube said its new policy would ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion.”
The changes to YouTube’s hate speech policy come after it was criticized for refusing to ban videos by right-wing content creator Steven Crowder, who had been harassing Vox journalist Carlos Maza by repeatedly using racist and homophobic language in his videos. At the time, YouTube defended its decision, telling The New York Times, “Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site.”
However, just two days later, YouTube, which is owned by Google, blocked ads on the far-right commentator’s channel, and issued new hate speech guidelines.
YouTube is the latest social media company to block controversial or offensive speech. In May, Facebook banned Sandy Hook-denier Alex Jones and five others for spreading hate and inciting violence. And in the last couple of years, Twitter has temporarily or permanently banned individuals who spew offensive or hateful speech.
In a blog post published on September 3rd, the company announced that it had removed over 100,000 videos and 17,000 channels between April 2019 and June 2019 for exhibiting hateful or abusive content. The numbers represent a five-fold increase over the previous quarter: between January 2019 and March 2019, YouTube removed only 19,927 videos and 3,379 channels for this reason.
“The spikes in removal numbers are in part due to the removal of older comments, videos and channels that were previously permitted,” the video-sharing giant explained in its blog post. Nevertheless, the numbers demonstrate the significant impact the new hate speech policy has had on the platform.