Fact-Checking Facebook Was Like Playing A Doomed Game Of Whack-A-Mole

Trying to stem the tsunami of fake news was like battling the Hydra — every time we cut off a virtual head, two more would grow in its place.

Posted on February 8, 2019, at 3:59 p.m. ET

Facebook has always struggled to comprehend the scale of its fake news and propaganda problem. Now, it's struggling to retain the fact-checkers it paid to try and deal with the crisis. Last week both Snopes and the Associated Press ended their partnerships with the social network, after a tense couple of years trying, without success, to tackle the epidemic.

But those partnerships should never have existed in the first place, and I say this as the former managing editor of Snopes, whom Facebook first contacted in 2016. When they first emailed me about a potential partnership, I knew it would bring much more attention to the work of our little newsroom — and much more scrutiny.

But what I didn't realize was that we were entering a full-blown crisis, not just of "fake news," but of journalism, democracy, and the nature of reality itself — one we're all still trying to sort out in 2019, and one with more twists and turns than I'd ever thought possible. Looking back, my overwhelming impression of the years since 2016 is how surreal everything became.

It turned out that trying to fact-check a social media service used by a huge chunk of the globe's population is no easy task. We tried to make it easier by showing where disinformation was coming from, but there were just too many stories. Trying to stem the tsunami of hoaxes, scams, and outright fake stories was like playing the world's most doomed game of whack-a-mole, or like battling the Hydra of Greek myth. Every time we cut off a virtual head, two more would grow in its place. My excellent but exhausted and overworked team did as much as we could, but we soon felt like we were floating around in a beat-up old skiff, trying to bail out the ocean with a leaky bucket.

Things soon got worse. Because of my own history reporting on refugee rights, I had contacts with groups all over the world working on migration and humanitarian crises. Since early 2015, I'd been hearing bits and pieces about Myanmar and the Rohingya Muslims, and how activists on the ground — exhausted, dispirited activists who were begging any reporter they could find to help spread the word — were saying the crisis had been fueled and spread by social media. The people of Myanmar had only had unfettered access to the internet since around 2012, and now Facebook, through its Internet.org program that provided free mobile internet access to its site, had quickly become the only source of news for a large portion of the population. News feeds in Myanmar were pushing a narrative that helped justify ethnic cleansing and other human rights violations on a massive scale. I took it to my editorial team and we put out some stories, and then I took it to Facebook.

Nothing happened, and I came to see Myanmar as something of a model for the damage algorithms and disinformation can do to our world. That's when the migraines started. I became obsessed with this connection — I dreamed about it at night, woke up thinking about it, and felt responsible for stopping a problem that few others even knew existed.

In case you're curious, here's what it was like to be an official Facebook fact-checker. We were given access to a tool that hooked into our personal Facebook accounts and was accessed that way (strike one, as far as I was concerned), and it spat out a long list of stories that had been flagged for checks. We were free to ignore the list, or to mark stories as "true," "false," or "mixture." (Facebook later added a "satire" category after what I like to call "the Babylon Bee incident," where a satirical piece was incorrectly labeled false.)

It was clear from the start that this list was generated via algorithm. It contained headlines and URLs, along with a graph showing their popularity and how long they had been on the site. There were puzzling aspects to it, though. We would often get the same story over and over again from different sites, which is to be expected to a certain degree, because many of the most lingering stories have been recycled again and again. This is what Facebook likes to call "engagement."

But no matter how many times we marked them "false," stories would keep resurfacing with nothing more than a word or two changed. This happened often enough to make it clear that our efforts weren't genuinely helping, and that we were being directed toward a certain type of story — and, we presumed, away from others.

What were the algorithmic criteria that generated the lists of articles for us to check? We never knew, and no one ever told us.

There was a pattern to these repeat stories, though: They were almost all "junk" news, not the highly corrosive stuff that should have taken priority. We'd be asked to check whether a story about a woman who was arrested for leaving her children in the car for hours while she ate at a buffet was true; meanwhile, a flood of anti-Semitic false George Soros stories never showed up on the list. I could never figure out why, but perhaps it was a feature, not a bug.

And here we are today, with Snopes and the Associated Press pulling out of their partnerships within days of each other. It doesn't surprise me to see this falling apart, because it was never a sufficient solution to a crisis that still poses a real threat to our world. If Facebook is serious about undoing some of the damage it has done, here is what it should be doing (Twitter, which is by no means innocent in this, should follow suit):

First, Facebook must jettison this idea of influencing individual emotions or crowd behavior. Mass communication comes with a huge moral responsibility; so far it has shown itself completely incapable of living up to it.

Second, it should make the algorithms that select what shows up in our news feeds absolutely transparent, and require users to opt in, not opt out. Let us all see the forces that underpin our perception of the world. We have been experimented on for far too long at this point, and that needs to change, and change now. It may sound like dystopian science fiction to say this, or perhaps the ravings of an overworked woman who has been swimming in the waters of conspiracy theories for far too long, but to the skeptics I will say this: Disinformation isn't necessarily meant for you. It's meant for the people who lean authoritarian, the fearful conformists, and the perennially anxious. It's for weapons hoarders and true believers and the scary uncle that no one in the family talks to anymore.

It's the reason why Americans are still relitigating 2016 and Britons are still arguing over Brexit. It's why Kenya had to have an election do-over. It's why Myanmar's Rohingya Muslims were ethnically cleansed. It's how people can look at the misery and suffering of children ripped from their parents and placed into detention camps on American soil, where they're sexually assaulted and drugged, and simply shrug. It's redirecting every single important national and international conversation we've been having, for years now. It needs to end.

Finally, and most importantly: Social media companies should establish a foundation for journalism, to give back some of what they have taken from us. This foundation must be open, transparent, and governed by reputable independent directors. A portion of the profits earned at the expense of the news industry should be dispersed across local newsrooms around the world. In today's media landscape, Silicon Valley has vacuumed up the news industry's revenue while simultaneously using its newfound power to push around what's left of the newsrooms it's destroying — just look at how Facebook's wildly false metrics caused organizations to "pivot to video," with predictable results.

There's another way the windfall revenues of social media should be invested: Hire moderators, armies of them. Facebook should have the capability to beat back the disinformation it spreads, and if it claims this is impossible at the scale it operates at, then it should not be allowed to operate at that scale.

Moderators should have the resources needed to get the job done — not hundreds of low-paid contractors given a few seconds per post to make assessments that can literally mean life or death. Thousands of journalists are currently looking for work; hiring them to enthusiastically root out the lies and propaganda that are ruining so much of public life — and to identify who is deliberately spreading them — would be a good start.

Brooke Binkowski is the managing editor of TruthOrFiction.com and formerly served as managing editor of Snopes. She is a consulting expert witness for the Sandy Hook families in their lawsuit against Infowars.