Facebook Says Its Employees Will View Your Nudes If You Use Its Anti-Revenge Porn Program

In an attempt to combat the rise of revenge porn on its platform, Facebook is asking users to upload any nude photos they think may be distributed without consent, a process that involves a Facebook employee reviewing the uploaded images.

Piloting the program in Australia, Facebook has teamed with the Australian government’s eSafety division, aiming to prevent intimate images from being shared without consent on all of its platforms (this includes Messenger, Instagram, and Facebook Groups).

The entire process is as follows:

  • A person worried that intimate photos of themselves are being shared online fills out a form on the eSafety Commissioner’s website;
  • The user then sends the photo(s) to themselves on Facebook Messenger;
  • While this is happening, the eSafety Commissioner’s office notifies Facebook of the person’s submission;
  • Facebook’s community operations team uses “image matching technology” to prevent the image from being uploaded or shared online. At least one “specially-trained representative” will review your image(s) before hashing them;
  • Hashing an image converts it into a digital fingerprint, a series of numbers, which is used to block attempts to upload the image to Facebook’s platforms;
  • The user is then prompted by Facebook to delete the image they have sent to themselves.
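The hashing step above can be sketched in a few lines. This is an illustrative sketch only: it uses an ordinary cryptographic hash (SHA-256), whereas Facebook has not published its “image matching technology,” which likely relies on perceptual hashing so that resized or re-encoded copies also match. The function names and sample bytes below are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Convert an image into a "digital fingerprint": a fixed-length
    string of hex digits derived from the raw bytes, from which the
    original image cannot be reconstructed."""
    return hashlib.sha256(image_bytes).hexdigest()

# The platform would store only the hash of the reported photo...
blocked = {fingerprint(b"\x89PNG...raw bytes of the reported photo...")}

# ...then hash each new upload and block any match.
def is_blocked(upload: bytes) -> bool:
    return fingerprint(upload) in blocked
```

The key property is that the stored fingerprint is one-way: it is enough to recognize an exact copy of the image, but it does not contain the image itself.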

In a blog post on Thursday, Facebook confirmed that at least one company employee will view the nude photos users upload.

In a post on its Newsroom portal, Facebook’s global head of safety, Antigone Davis, wrote that a “specially-trained representative” from the social network’s Community Operations team will review the image before “hashing” it.

Facebook then stores the hash, which it says “creates a human-unreadable, numerical fingerprint of it,” but not the photo itself. This helps prevent future uploading, if you’re comfortable with an employee seeing your nudes first.

This new system from Facebook builds on an announcement in April, when the company first said it would be introducing new tools to help people who had images shared on Facebook without their consent.

Previously, users were encouraged to use Facebook’s “report” feature to block already-uploaded images from being shared further.

The new hashing program gives users the ability to notify Facebook themselves and stop the image from being uploaded in the first place.

“The safety and well-being of the Facebook community is our top priority,” said Davis in a statement.

“These tools, developed in partnership with global safety experts, are one example of how we’re using new technology to keep people safe and prevent harm.”
