

5th Jun 2019

YouTube has found a stance on hate speech and supremacist content

Alan Loughnane


Changes are coming.

Just one day after initially balking at the prospect of taking action against YouTuber Steven Crowder over what video host and writer Carlos Maza said were repeated homophobic and racial slurs against him, YouTube has done a 180 on its decision.

YouTube had initially said that Crowder did not violate any of its policies and that his channel would stay up.

But today, in a blog post, the company announced that it is changing its community guidelines to ban videos promoting the superiority of any group as a justification for discrimination against others based on their age, gender, race, caste, religion, sexual orientation, or veteran status.

The company, which is owned by Google, confirmed it would no longer host videos that glorified fascist views or material that denied the existence of the Holocaust, following years of criticism over its role in spreading far-right hate and conspiracy theories.

Users are no longer allowed to post videos claiming that well-documented events, such as the Sandy Hook massacre and 9/11, did not happen, YouTube said.

YouTube has traditionally taken a standoffish approach to controversial policy matters, but relentless public pressure forced it to act last year, when it banned a handful of extremists from its platform, including Alex Jones of InfoWars.

While the sheer volume of content uploaded to the platform makes moderation difficult, YouTube's AI has proven adept at rooting out content that breaches the platform's community guidelines. The real problems have emerged around real-time policy controversies, where YouTube has been sluggish to take action.
