Facebook Will Ban White Nationalist and White Supremacist Content

Facebook will begin banning white nationalist, white separatist, and white supremacist content, and will direct users who attempt to post such content to the website of the nonprofit Life After Hate, which works to de-radicalize people drawn into hate groups.

The change, first reported Wednesday by Vice’s Motherboard, comes less than two weeks after Facebook was heavily criticized for its role in the Christchurch mosque attack. The gunman went live on the platform for several minutes before the attack began, showing off his guns and at one point saying, “Remember lads, subscribe to PewDiePie,” referring to the Swedish YouTuber connected to several racist and anti-Semitic controversies.

BuzzFeed News has reached out to Facebook for more information on how the ban will work. According to a blog post titled “Standing Against Hate,” which Facebook published today, the ban takes effect next week. As of midday Wednesday, the feature did not yet appear to be live, based on searches by BuzzFeed News for terms like “white nationalist,” “white nationalist groups,” and “blood and soil.”

“It’s clear that these concepts are deeply linked to organized hate groups and have no place on our services,” the blog post reads. “Over the past three months our conversations with members of civil society and academics who are experts in race relations around the world have confirmed that white nationalism and separatism cannot be meaningfully separated from white supremacy and organized hate groups.”

Earlier this week, a French Muslim advocacy group filed a lawsuit against Facebook and YouTube for not removing footage of the attack quickly enough.

Facebook did not respond to an inquiry from BuzzFeed News last week on whether white nationalism and neo-Nazism were being moderated using the same image-matching and language-understanding technology it uses to police ISIS-related content. According to internal training documents that were leaked last year, Facebook has typically not considered white nationalism intrinsically linked to racism.

Based on information in Motherboard’s report, the platform will use content-matching to delete images previously flagged as hate speech. There was no further elaboration on how that might work, including whether URLs to websites like the Daily Stormer might be affected by the ban.

The progressive civil rights advocacy nonprofit Color Of Change called Facebook’s new moderation policy a critical step forward.

“Color Of Change alerted Facebook years ago to the growing dangers of white nationalists on its platform, and today, we are glad to see the company’s leadership take this critical step forward in updating its policy on white nationalism,” the statement reads. “We look forward to continuing our work with Facebook to ensure that the platform’s content moderation guidelines and trainings properly support the updated policy and are informed by civil rights and racial justice organizations.”

In another change to Facebook’s moderation policy following public outcry, the platform announced last month that anti-vax misinformation would appear less frequently across people’s News Feeds, public pages and groups, private pages and groups, search predictions, and recommendation widgets around the site. The announcement came after weeks of pressure from lawmakers and public health advocates to crack down on anti-vax content.