Facebook will stop flagging content that’s been declared false by external fact-checkers and will instead surface fact-checks as related articles in the News Feed, the social media giant announced Wednesday.
The move represents the biggest outward-facing change to Facebook’s year-old partnership with fact-checkers. The company said the new approach will be more effective at stopping the spread of misinformation, while also making it easier to scale the effort to other markets and content types.
Tessa Lyons, a News Feed product manager, told BuzzFeed News that surfacing fact-checks as related articles proved more effective in tests than applying a disputed flag to stories in the News Feed.
“Related articles outperformed disputed flags in giving people more information so they could understand what was true or false,” she said. “Hoaxes that had related article fact checks had fewer shares than those with the disputed flag.”
The related articles module, which appears underneath a piece of content in the News Feed, was first rolled out in April.
Lyons wrote in a blog post announcing the change that “putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs — the opposite effect to what we intended. Related Articles, by contrast, are simply designed to give more context, which our research has shown is a more effective way to help people get to the facts.”
Brendan Nyhan, a political scientist at Dartmouth College who researches misinformation, said he was “optimistic” about the change.
“I have always been concerned that [disputed flags] were not being widely deployed and were not politically feasible to scale up,” he told BuzzFeed News. “Surfacing fact-checks in related articles seems less intrusive and therefore is likely to be more palatable to users.”
Aside from the difference users will see in the News Feed, the change could make it easier for Facebook to surface different kinds of fact-checks. The disputed flag was only applied to links deemed completely false by at least two fact-checkers. Related articles can show fact-checks with more nuanced ratings, such as “mostly true,” and can immediately surface a single fact-check. Internal data from Facebook found that on average it took three days for the multi-party disputed label to be applied to a piece of content.
“The disputed flag was difficult to scale to other markets, to scale where there are not as many fact checkers,” Lyons said.
The flag also only worked with links shared on Facebook. The company recently indicated that in the coming year it plans to tackle misinformation in the form of photos and videos on its platform. Surfacing a fact-check as a piece of related content to a meme is more feasible than applying a disputed flag, according to the company.
As part of the announcement, Facebook published a Medium post by three employees who over the last year “traveled around the world, from Germany to Indonesia, talking with people about their experiences with misinformation.” They cited four reasons why the disputed label was ineffective, including the fact that it took several clicks for users to get to the fact-check. The post also noted that Facebook’s fact-checking partners will be highlighted with a special “badge” in the related articles module.
Since its launch was announced almost exactly a year ago, the disputed flag has been a source of scrutiny and criticism. Some reports suggested it could actually make people more inclined to share an article. A study by two Yale researchers also questioned whether the label had any effect on users’ perceptions of the accuracy of the flagged content.
With the new change, Facebook has acknowledged its initial attempt wasn’t the best solution. Lyons told BuzzFeed News the related articles approach is an improvement, but one that will need to be tested and studied over time.
“The fight against misinformation is an ongoing one that’s going to take us doing our own research,” she said. “It’s an iterative process.”