The Logan Paul Suicide Video Shows YouTube Is Facing A Crucial Turning Point


Nicholas Hunt / Getty Images

Logan Paul attends “The Thinning” Meet & Greet during the 2016 New York Comic Con.

YouTube has a content crisis — again. On the heels of the company’s child exploitation problem, it finds itself facing a new wave of criticism after high-profile YouTuber Logan Paul posted a video of a dead body while filming in Aokigahara, Japan’s so-called “Suicide Forest.” The Logan Paul controversy is just the latest for a company that has increasingly had to contend with criticism over what kind of content is appropriate on its platform — and how unevenly it applies its own community guidelines.

logan paul exploiting a suicide victim in Japan to the tune of 6M+ views while youtube demonetizes students protest… https://t.co/ylcI8kY58k

Suicide isn’t a joke. Suicide isn’t clickbait. Suicide is a serious matter. Logan Paul has taken it too far. He jus… https://t.co/l5i9pJ67Lo

YouTube, after a decade of being the pioneer of internet video, is at an inflection point as it struggles to control the vast stream of content flowing across its platform, balancing the need for moderation with an aversion toward censorship. In the past 12 months alone, it has been embroiled in controversies including anti-Semitic rhetoric found in videos from its biggest star, PewDiePie; an advertiser exodus over videos with hate speech or extremist content; and the disturbing and potentially child-exploitative content promoted by its algorithm. With every new misstep, it has alternately angered the creators it depends on for content, turned off advertisers, and confused users about how, exactly, it makes decisions about which videos can remain on its platform, what should be taken down, and what can be monetized. The Paul video is just the latest manifestation of that struggle.

In this case, the sensational video of a dead body, an apparent death by suicide, was live for more than 24 hours before being taken down by Paul himself after mounting public backlash. (Paul’s PR representative did not return a request for comment.) In that time span it was viewed more than 6.3 million times, according to New York Magazine. The video fits within a larger pattern of controversial content and highlights how YouTube has created a system of incentives for creators on its platform to push boundaries.

“To what extent is YouTube overtly and tacitly encouraging individuals to push on the outrageousness factor?”

“Let’s be honest, this flare-up on Logan Paul is going to die out eventually,” Sarah Roberts, an assistant professor at UCLA who has been studying content moderation for seven years, told BuzzFeed News. “But there’s a bigger conversation to be had: To what extent is YouTube overtly and tacitly encouraging individuals to push on the outrageousness factor [in producing content]? Do they need that to keep the engagement going?”

YouTube on Tuesday acknowledged that the video did violate its policies for being a graphic video posted in a “shocking, sensational or disrespectful manner.” “If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated,” a company spokesperson wrote in an emailed statement to BuzzFeed News.

But the Logan Paul incident highlights the consistently inconsistent application of YouTube’s content moderation rules. YouTube did not respond when asked if it had initially reviewed and approved the video to remain on the platform. According to a member of YouTube’s Trusted Flagger program, however, when the company manually reviewed Paul’s video, it decided that the video could remain online and didn’t need an age restriction.

Logan Paul’s video was reported and YouTube manually reviewed it; they decided to leave it up without even an age r… https://t.co/ciYl1AzPAx

YouTube also said that when it removes a video for violating community guidelines, it applies a “strike” to the channel. It was unclear whether it did so with Logan Paul’s channel, because Paul deleted his own video. If a channel accrues three strikes within a three-month period, YouTube shuts the channel down, per the company’s community guidelines. (YouTube did not respond to follow-up questions from BuzzFeed News asking whether it had indeed applied a strike to Paul’s account.) Notably, Paul had demonetized the video when he first posted it — meaning neither he nor YouTube earned any advertising revenue from it.
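YouTube does not publish its enforcement code, but the rule as stated above is simple enough to model. The sketch below is a toy illustration only, with an assumed 90-day window standing in for “three months” and invented function and field names; it is not YouTube’s actual system.

```python
from datetime import datetime, timedelta

STRIKE_WINDOW = timedelta(days=90)  # assumption: "three-month period" modeled as 90 days
STRIKE_LIMIT = 3                    # per the community guidelines cited above

def channel_terminated(strike_dates, now=None):
    """Toy model of the three-strike rule: True if a channel accrued
    three or more strikes within a rolling three-month window."""
    now = now or datetime.utcnow()
    recent = [d for d in strike_dates if timedelta(0) <= now - d <= STRIKE_WINDOW]
    return len(recent) >= STRIKE_LIMIT

# Example: three strikes inside six weeks would shut the channel down.
strikes = [datetime(2017, 11, 20), datetime(2017, 12, 15), datetime(2018, 1, 2)]
print(channel_terminated(strikes, now=datetime(2018, 1, 3)))  # True
```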

Paul’s video isn’t something artificially intelligent moderation could catch on its own, two experts who focus on content moderation told BuzzFeed News. “What is obscene is having shown and been disrespectful about the body of a suicide victim,” said Tarleton Gillespie, who studies the impact of social media on public discourse at Microsoft Research. “This is the kind of contextual and ethical subtlety that automated tools are likely never to be able to approximate.”

What’s more, the decision that Logan Paul crossed the line is one that fundamentally involves an exercise of moral judgment, according to James Grimmelmann, a professor of law who studies social networks at Cornell. “You have to look at what’s considered decent behavior in the user community YouTube has and wants to have,” Grimmelmann said. “You can’t just turn a crank and have the algorithm figure out your morality for you.” In that sense, YouTube did ultimately make a value judgment on the Logan Paul video, based on the reaction of its own community, by publicly saying it violated its policies.

“You can’t just turn a crank and have the algorithm figure out your morality for you.”

Of course, that’s not how the company wants the public to view its role. YouTube has remained largely silent on the fiasco, while Logan Paul has issued two apologies. “Firms have done such a great job of positioning themselves to ensure that when something like this happens, they can wash their hands of it and say, ‘We’re just the dissemination channel,’” says Roberts. “But I would push on that and ask — what’s YouTube’s relationship with Logan Paul?”

Paul is a marquee YouTube star. He is a main character in The Thinning and Foursome, two ongoing YouTube Red Original series — high-quality exclusive shows that YouTube distributes on its paid subscription service, YouTube Red. Paul has had a YouTube channel since 2015, and in that time he’s accumulated 15 million subscribers and nearly 3 billion views. YouTube knows Paul’s irreverent style of video, and Paul knows what does well on the platform. “In this case, this guy is a top producer for YouTube,” said Roberts. “It becomes harder to argue the video wasn’t seen in-house.”

Compounding the problem is that YouTube itself likely has no way of knowing exactly what content is on its platform at all times — especially with users uploading nearly 600,000 hours of new video to YouTube daily. “The problem with current digital distribution platforms is the micro-targeting of content to users,” said Bart Selman, a Cornell University professor of artificial intelligence. “In fact, a well-tuned ranking algorithm will make sure that extreme content is only shown to people who will not feel offended — or may even welcome it — and won’t be shown to others.” The bubble of micro-targeting is pierced when disturbing videos go viral and attract a lot of public attention and media scrutiny. But that’s the exception, not the norm.
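Selman’s point can be made concrete with a toy ranking function: if a recommender penalizes content that is more extreme than a given user is predicted to tolerate, the same video ranks first for one viewer and effectively disappears for another. Everything below is hypothetical; the field names, scores, and penalty weight are invented for illustration and bear no relation to YouTube’s real models.

```python
def rank_for_user(videos, user_tolerance):
    """Toy illustration: extreme content surfaces only for users
    predicted to tolerate it, without ever being removed."""
    def score(video):
        # 'predicted_engagement' and 'extremeness' are hypothetical 0-1
        # scores; real systems infer these from behavioral signals.
        mismatch = max(0.0, video["extremeness"] - user_tolerance)
        return video["predicted_engagement"] - 10.0 * mismatch
    return sorted(videos, key=score, reverse=True)

videos = [
    {"id": "calm",    "predicted_engagement": 0.6, "extremeness": 0.1},
    {"id": "extreme", "predicted_engagement": 0.9, "extremeness": 0.9},
]
# A tolerant viewer sees the extreme video first; a typical viewer never does.
print([v["id"] for v in rank_for_user(videos, user_tolerance=0.9)])  # ['extreme', 'calm']
print([v["id"] for v in rank_for_user(videos, user_tolerance=0.2)])  # ['calm', 'extreme']
```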

And that leaves the public to exert pressure on YouTube. Still, exactly how YouTube’s complex system of human moderators, automated algorithms, policy enforcement, and revenue generation works together to police and promote videos remains a black box — and that’s an issue. “Those key ingredients are under lock and key,” UCLA’s Roberts said. “One positive outcome of these incidents is that the public asks new questions of YouTube.”

“We are all beta testers and a focus group, including how content moderation is applied,” Roberts continued. Now, YouTube will likely throw even more resources at its content moderation problem and communicate its strategy even more loudly to the public — something it has already begun to do — in an effort to outpace any regulation that might come down on the platform.

Davey Alba is a senior technology reporter for BuzzFeed News and is based in New York.

Contact Davey Alba at davey.alba@buzzfeed.com.

