YouTube announced on Wednesday that it will change its community guidelines to ban extremist content and videos denying “well-documented events,” such as those claiming that the Sandy Hook mass shooting was a false flag.
The video-hosting platform also announced that it would demonetize channels that repeatedly skirt its rules against hate speech. The company did not name any specific channels that would be banned, but cited videos glorifying Nazi ideology as an example.
In a statement, YouTube said it was the company’s “responsibility to protect, and prevent our platform from being used to incite hatred, harassment, discrimination and violence.”
At first glance, the news seems to be refreshing evidence of Silicon Valley finally cracking down on hate. But the timing of the announcement, coupled with the lack of information about how these rules will be enforced, makes it feel more like a PR exercise.
Days ago, Vox journalist Carlos Maza drew attention to the persistent racist and homophobic harassment he has faced from right-wing YouTube creator Steven Crowder. Maza posted a supercut of Crowder’s show in which Crowder repeatedly attacks him over his ethnicity and sexual orientation. That harassment has in turn led Crowder’s fans to dox and harass the journalist.
As Maza noted, this activity is in direct violation of YouTube’s harassment policy, specifically its rule against “content that incites others to harass or threaten individuals off YouTube.”
YouTube, however, has refused to punish Crowder.
“Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site,” the company said Tuesday, responding to Maza’s tweets. “Even if a video remains on our site, it doesn’t mean we endorse/support that viewpoint.”
In follow-up remarks to Gizmodo, Google (which owns YouTube) clarified that since the videos in question were made to “respond to opinion,” they were fine — even the ones in which Crowder refers to Maza as a “lispy queer,” “token Vox gay atheist sprite,” and a “gay Mexican.”
“YouTube’s new anti-supremacy policy is a joke,” Maza tweeted Wednesday. “[It’s] a shiny prop meant to distract reporters and advertisers from the reality, which is that @YouTube doesn’t actually enforce any of these documents.”
The gap between what YouTube says it will enforce and what it actually enforces echoes an incident on Facebook earlier this spring. In March, Facebook declared that it would ban white nationalist content from its platform. Soon after, however, HuffPost drew the company’s attention to Faith Goldy, a prominent far-right figure who, in one Facebook video, said that whites were being “replaced” and that Jews and people of color should pay reparations for “invading” European countries.
According to a Facebook spokesperson, that video did not violate the company’s new policies.
YouTube’s content headache goes far beyond white nationalism and far-right harassment. Earlier this week, The New York Times reported that YouTube’s algorithm was recommending innocuous videos of children in bathing suits to users who were searching for sexualized content. And in February, Wired reported that YouTube’s comment section was being used by pedophiles to guide other predators towards child-centric videos.