At Facebook, hand-wringing over a fix for fake content


SAN FRANCISCO — In June, Mark Zuckerberg spoke about “community” at a gathering of influential Facebook users in Chicago.

It was an important moment for the 33-year-old chief executive of the social media company. He was promoting Facebook Groups, a product that millions of people on Facebook use to talk about shared interests, debate, discuss and maybe debate some more.

This type of activity, he believed, was one of the keys to his sprawling company’s future. The goal of Facebook, he told his audience, which included many Groups leaders, was to “give people the power to build community and bring the world closer together.”

Inside Mr. Zuckerberg’s company, however, there was already growing concern among employees that some of that content was having the opposite effect. Foremost among the offending material: posts and memes touching on hot-button issues like race, gender and sexuality that were secretly created by Russian organizations with ties to the Kremlin to influence the 2016 presidential election.

There is an ongoing debate among Facebook employees over how to handle this so-called organic content, or posts from users that are not advertisements and can be freely shared across Facebook, according to a dozen current and former Facebook employees. These people spoke on condition of anonymity because they were prohibited by nondisclosure agreements from talking about the company.

On one side are employees who idealize Facebook’s lofty goals of unfettered speech and do not think the company should be in the business of censoring what its users have to say. On the other side are workers who worry that the company’s hands-off approach has already caused problems, ones that will grow worse if nothing is done.

“The algorithms Facebook sets up that prioritize and control what’s shown in its powerful newsfeed are a lot more consequential than the ads,” said Zeynep Tufekci, an associate professor at the University of North Carolina at Chapel Hill who closely follows social media and technology. “The company seems stuck between exercising this massive power as it wishes, but also terrified about the conversation about its power and how to handle it.”

Next week, Facebook’s general counsel will be among several tech industry executives expected to testify at a series of Congressional hearings about the role the technology industry played in Russian interference in last year’s election.

Facebook has acknowledged that an organization with ties to the Kremlin purchased $100,000 worth of ads related to the election and has promised to crack down on such advertising.

Since Facebook disclosed the existence of those ads and posts with Russian ties last month, the company has attempted to tamp down fears that it abetted interference in the election. It has also added rules meant to improve disclosures of political advertising, in an attempt to show users exactly who is behind the ads that run through their newsfeeds.

And on Friday, the company began a test of new features designed to give users a better understanding of the people and organizations buying advertising on Facebook. That included providing users with a searchable database of ads being served to them.

But misleading ads were only a modest component of the misinformation campaign.

Investigators believe the Internet Research Agency, a so-called troll farm that has been linked to the Kremlin, amassed enormous followings for various Facebook Pages that masqueraded as destinations for discussion about all sorts of issues, from the Black Lives Matter movement to gun ownership.

Aided by Facebook’s finely tuned ad-targeting tools, the Russian firm would pay to place posts in the News Feeds of users. The ad product, called a “promoted post,” was designed to look little different from the rest of the content flowing through the News Feed.

Users who responded positively to the advertisements were prompted to subscribe to related Facebook Pages or Groups run by the Russians. If they did, nonpaid, “organic” posts would begin to appear in the users’ News Feeds. From there they spread, shared and reshared among the users’ networks of friends.

The tactic was effective. Some of the pages, like “Blacktivists,” which focused on racial issues, had more than 360,000 users who “liked” the page — even more than the main “Black Lives Matter” page.

Facebook is not the only big internet company wrestling with the issue. But at Mr. Zuckerberg’s company, the issue has been particularly troublesome, given how easy it is to spread messages to tens of millions of Facebook users.

Whether something is removed from Facebook is often dictated by its terms of service, which define standards of behavior on the network. Those standards prohibit posting nudity and threats of violence. But misleading users, even outright lying, is not necessarily against the rules. And that is hard to police.

So far, Facebook has focused on the issue of authenticity and identity on the platform. Facebook removed hundreds of ads last month, not because of the content they contained, but because the Russians running the pages did not disclose their real identities.

“We want people to be able to come to Facebook to talk about issues of public interest, and we know that means people will sometimes disagree and the issues they discuss will sometimes be controversial,” Monika Bickert, head of product policy and counterterrorism at Facebook, said in a statement. “That’s O.K. But it’s important to us that these conversations happen in an authentic way, meaning we have to be speaking as ourselves, with our real identities.”

That line of reasoning may not hold up for long, as Facebook is forced to deal with policy discussions outside the United States. In Myanmar, Facebook is caught between the government and a persecuted Muslim minority group, the Rohingya, who face a misinformation campaign on Facebook in posts often from top government leaders.

Facebook has said little publicly about the situation, but there is intensifying pressure to respond.

One of the solutions discussed internally at Facebook has been “whitelisting,” in which algorithms would decide which content makers would be allowed to publish or advertise on Facebook, according to two people familiar with the company’s internal deliberations. Employees have also discussed “blacklisting,” in which the algorithms would decide which content makers could not post.

But in closed-door meetings at Facebook’s Menlo Park, Calif., headquarters and in Washington, Facebook employees have expressed concern that such a move could affect other publications and content makers that are not spreading false or divisive news.

Others worry that acting too hastily could establish precedents that would lead to a situation, for example, where human rights activists using Facebook to coordinate protests in Syria would be forced to identify themselves. They also worry that any effort to quash certain content in the United States could only aid censors in other countries.

As for a technical solution, some hope artificial intelligence can help Facebook sift fact from fiction. But today’s A.I. technology is not advanced enough to do the work on its own.

Perhaps unsurprisingly, Mr. Zuckerberg’s solution is to double down on his community concept. He has said publicly that strengthening social bonds on Facebook will lead to a positive outcome, despite whatever reservations his employees and the general public may have.

“Every day, I say to myself, ‘I don’t have much time here on Earth, how can I make the greatest positive impact?’ Some nights I go to bed and I’m not sure I made the right choices that day,” Mr. Zuckerberg said at the June conference. “I can tell you, those doubts don’t go away, no matter who you are. But every day you just get up and try to make the world a little better.”

Follow Mike Isaac on Twitter @MikeIsaac. Nicole Perlroth contributed reporting.
