YouTube Is Still Struggling To Rein In Its Recommendation Algorithm



How many clicks through YouTube’s “Up Next” recommendations does it take to go from an anodyne PBS clip about the 116th United States Congress to an anti-immigrant video from a designated hate organization? Thanks to the site’s recommendation algorithm, just nine.

The video in question is “A Day in the Life of an Arizona Rancher.” It features a man named Richard Humphries recalling an incident in which a crying woman begged him not to report her to Border Patrol, though, unbeknownst to her, he had already done so. It’s been viewed over 47,000 times. Its top comment: “Illegals are our enemies , FLUSH them out or we are doomed.”

The Center for Immigration Studies, a think tank the Southern Poverty Law Center classified as an anti-immigrant hate group in 2016, posted the video to YouTube in 2011. But that designation didn’t stop YouTube’s Up Next from recommending it earlier this month after a search for “us house of representatives” conducted in a fresh search session with no viewing history, personal data, or browser cookies. YouTube’s top result for this query was a PBS NewsHour clip, but after clicking through eight of the platform’s top Up Next recommendations, it offered the Arizona rancher video alongside content from the Atlantic, the Wall Street Journal, and PragerU, a right-wing online “university.”

YouTube recommended a video posted by the Center for Immigration Studies after a video from the Wall Street Journal:



YouTube does not have an information panel linking to contextual information about the Center for Immigration Studies, even though the Southern Poverty Law Center (SPLC) designated it a hate group in 2016 and YouTube has had a partnership with the SPLC since 2018.

(The Center for Immigration Studies has denied that it promotes hate against immigrant groups, and it recently filed a lawsuit against the SPLC in federal court alleging that the nonprofit has been scheming to destroy CIS for two years. According to YouTube, the CIS video doesn’t violate its community guidelines; YouTube didn’t respond to questions about whether it considers CIS a hate group.)

That YouTube can be a petri dish of divisive, conspiratorial, and sometimes hateful content is well-documented. Yet the recommendation systems that surface and promote videos to the platform’s users, the majority of whom report clicking on recommended videos, are frustratingly opaque. YouTube’s Up Next recommendations are algorithmically personalized, the result of calculations that weigh keywords, watch history, engagement, and a proprietary slurry of other undisclosed data points. YouTube declined to provide BuzzFeed News with information about what inputs the Up Next algorithm considers, but said it’s been working to improve the experience for users seeking news and information.

“Over the last year we’ve worked to better surface news sources across our site for people searching for news-related topics,” a company spokesperson wrote via email. “We’ve changed our search and discovery algorithms to surface and recommend authoritative content and introduced information panels to help give users more sources where they can fact check information for themselves.”

To better understand how Up Next discovery works, BuzzFeed News ran a series of searches on YouTube for news and politics terms popular during the first week of January 2019 (per Google Trends). We played the first result and then clicked the top video recommended by the platform’s Up Next algorithm. We made each query in a fresh search session with no personal account or watch history data informing the algorithm, except for geographical location and time of day, effectively demonstrating how YouTube’s recommendations operate in the absence of personalization.
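To make that procedure concrete, here is a minimal Python sketch of a “down the rabbit hole” crawl of this kind. It is not BuzzFeed News’ actual tooling: YouTube exposes no official API for Up Next recommendations, so the sketch scrapes the undocumented JSON embedded in YouTube’s HTML, and the “secondaryResults” marker and video-ID regex it relies on are assumptions about page structure that can break without notice.

```python
import re
import requests

# A YouTube video ID is 11 characters drawn from letters, digits, "-", and "_".
VIDEO_ID = re.compile(r'"videoId":"([\w-]{11})"')

def first_video_id(html: str) -> str:
    """Return the first video ID embedded in a YouTube page's HTML."""
    match = VIDEO_ID.search(html)
    if match is None:
        raise ValueError("no video ID found; the page layout may have changed")
    return match.group(1)

def crawl_up_next(search_term: str, jumps: int = 9) -> list[str]:
    """From a fresh, cookie-less session, take the top search result for
    `search_term`, then follow the top Up Next recommendation `jumps` times."""
    session = requests.Session()  # fresh session: no account, history, or cookies
    results = session.get("https://www.youtube.com/results",
                          params={"search_query": search_term})
    video_id = first_video_id(results.text)
    path = [video_id]
    for _ in range(jumps):
        watch = session.get("https://www.youtube.com/watch", params={"v": video_id})
        # Assumption: in the watch page's embedded JSON, the first video ID
        # after the "secondaryResults" key is the top Up Next recommendation.
        sidebar = watch.text.split('"secondaryResults"', 1)[1]
        video_id = first_video_id(sidebar)
        path.append(video_id)
    return path

if __name__ == "__main__":
    for step, vid in enumerate(crawl_up_next("us house of representatives")):
        print(step, f"https://www.youtube.com/watch?v={vid}")
```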

In the face of ongoing scrutiny from the public and legislators, YouTube has repeatedly promised to do a better job of policing hateful and conspiratorial content. Yet BuzzFeed News’ queries show the company’s recommendation system continues to promote conspiracy videos, videos produced by hate groups, and pirated videos published by accounts that YouTube itself sometimes bans. These findings — the product of 147 total “down the rabbit hole” searches for 50 unique terms, resulting in a total of 2,221 videos played — reveal little in the way of overt ideological bias.

But they do suggest that the YouTube users who turn to the platform for news and information — more than half of all users, according to the Pew Research Center — aren’t well served by its haphazard recommendation algorithm, which seems to be driven by an id that demands engagement above all else.


One of the defining US political news stories of the first weeks of 2019 has been the partial government shutdown, now the longest in the country’s history. In searching YouTube for information about the shutdown between January 7 and January 11, BuzzFeed News found that the path of YouTube’s Up Next recommendations followed a common pattern for the first few recommendations, but then tended to pivot from mainstream cable news outlets to popular, provocative content on a wide variety of topics.

After first recommending a few videos from mainstream cable news channels, the algorithm would often make a decisive but unpredictable swerve in a certain content direction. In some cases, that meant recommending a series of Penn & Teller magic tricks. In other cases, it meant a series of anti–social justice warrior or misogynist clips featuring conservative media figures like Ben Shapiro or the contrarian professor and author Jordan Peterson. In still other cases, it meant watching back-to-back videos of professional poker players, or late-night TV shows, or episodes of Lauren Lake’s Paternity Court. Here’s one example:

The first query for “government shutdown 2019 explained” returned a straightforward news clip titled “Partial US government shutdown set to continue into 2019” from a channel with nearly 700,000 subscribers called Euronews.

Next, the recommendation algorithm suggested a live Trump press conference video from a news aggregation channel. From there, YouTube pushed videos about the Mexican border and Trump’s proposed border wall from outlets like USA Today and Al Jazeera English. Then, Up Next recommended a video from an immigration news aggregator featuring a compilation of news clips, titled “Border Patrol Arrests, Deportations, Border Wall and Mexico Sewage.”

From there, the algorithm took an abrupt turn, recommending a video about Miami International Airport, and then a series of episodes of a National Geographic TV show called Ultimate Airport Dubai — “Snakes” (382,007 views), “Firefighters” (179,471 views), “Customs Officers” (179,471 views), “Crystal Meth” (453,011 views), and “Faulty Planes” (241,290 views) — posted by a YouTube channel called Ceylon Aviator.

YouTube’s recommendation system can also lead viewers searching for news into a partisan morass of shock jocks and videos touting misinformation. Between January 7 and January 9, 2019, BuzzFeed News ran queries on the day’s most popular Google trending terms as well as major news stories, including Donald Trump’s primetime address on the border wall and the swearing in of the new members of the 116th Congress.

One query BuzzFeed News ran on the term “impeach the mother” — a reference to a remark by a newly sworn-in member of Congress regarding President Trump — highlighted Up Next’s ability to quickly jump to partisan content. The initial result for “impeach the mother” was a CNN clip of a White House press conference, after which YouTube’s Up Next recommended two more CNN videos in a row. From there, YouTube’s recommendations led to three more generic Trump press conference videos. At the sixth recommended video, the query veered unexpectedly into partisan territory with a clip from the conservative site Newsmax titled “Bill O’Reilly Explains Why Nancy Pelosi Will Fail as House Leader.” From there, the subsequent clips YouTube recommended escalated from Newsmax to increasingly partisan channels like YAFTV, pro-Trump media pundit Dinesh D’Souza’s channel, and finally channels like True Liberty and “TRUE AMERICAN CONSERVATIVES.”

Here’s an example of what YouTube recommended after a search for “impeach the mother”:



YouTube recommended watching a video posted by pro-Trump media pundit Dinesh D’Souza after playing a video from the conservative think tank Young America’s Foundation, then recommended a video posted by True Liberty. True Liberty posts videos with titles like “Ben Shapiro makes SNOWFLAKES run for safespace with stunning FACTS” and “Can Justin Trudeau be any dumber than this?”

Here are the videos, as recommended by Up Next:

  1. “Bill O’Reilly’s Talking Points on Newsmax” (Newsmax TV)

  2. “DIGITAL EXCLUSIVE: Bill O’Reilly Warns Against A RISING EVIL In America | Huckabee” (Huckabee)

  3. “Dinesh D’Souza LIVE at Texas A&M” (YAFTV)

  4. “D’Souza embarrasses leftists at Brandeis U” (D’Souza’s channel)

  5. “Dinesh D’Souza EXPLOSIVE Q&A at UNC Greensboro” (True Liberty)

  6. “Are you calling all liberals ignorant? Dinesh D’Souza faces Tough questions @Yale University” (True Liberty)

  7. “Snowflake Student tries to bait Dinesh D’Souza on immigration, Gets SCHOOLED instead” (True Liberty)

  8. “(MUST WATCH) Dinesh D’souza HUMILIATES a student in a HEATED word Exchange” (True American Conservatives)

That YouTube’s recommendation algorithm occasionally pushed toward hyperpartisan videos during BuzzFeed News’ testing is concerning because people do use the platform to learn about current events. Compounding this issue is the high percentage of users who say they’ve accepted suggestions from the Up Next algorithm — 81%, according to Pew. One in 5 YouTube users between ages 18 and 29 say they watch recommended videos regularly, which makes the platform’s demonstrated tendency to jump from reliable news to outright conspiracy all the more worrisome.

For example, here’s the list of consecutively recommended videos on one January 7 query for “nancy pelosi speech” that goes from a BBC News clip to a series of QAnon conspiracy videos after 10 jumps:

  1. “Pelosi quotes Reagan in Speaker remarks — BBC News” (22,475 views at the time of the experiment)

  2. “Watch the full, on-camera shouting match between Trump, Pelosi and Schumer” — Washington Post (5,314 views)

  3. “Anderson Cooper exposes Trump’s false claims in cabinet meeting” — CNN (919,758 views)

  4. “President Trump Holds Press Conference 1/4/19” — Live On-Air News (210,047 views) [No longer available]

  5. “The Wall: A 2,000-mile border journey” — USA Today (414,019 views)

  6. “President Donald Trump’s border wall with Mexico takes shape” — CBC News (5,482 views)

  7. “9 Things That WOULD Happen if Trump Builds ‘The Wall!’” — Pablito’s Way (8,071 views)

  8. “First Look at Trump’s Border Wall With Patrol Road on Top” — Hedgehog (1,545 views)

  9. “Why did Jeb Bush get Scared after He Saw The Note from Secret Service?” — Most News (1,594 views)

  10. “WHAT DID LAURA SEE? REEXAMINED” — Hengist Mountebank Presents (725,142 views)

  11. “Part III [Q]-H.R.C. & Pence Exchange ‘Note’;Hussein-Obama receive ‘🎁’ as Well(!)” — Reality NotFiction (895,307 views; the video is tagged with the QAnon catchphrase “WWG1WGA”)

  12. “WHAT DID LAURA SEE? REEXAMINED” — Hengist Mountebank Presents (725,295 views) [Video was recommended a second time.]

  13. “Part III [Q]-H.R.C. & Pence Exchange ‘Note’;Hussein-Obama receive ‘🎁’ as Well(!)” — Reality NotFiction (895,307 views)

  14. “Will President GW Bush’s Funeral Be Next?” — Theresa (37,332 views)

  15. “Some participants of the Bush Funeral receive a shocking letter!” — The Unknown (146,189 views)

Here’s where YouTube recommended a video marked [Q] for QAnon:



The screenshot shows that an ad for DonaldTrump.com played before the QAnon video.

In an emailed statement, YouTube said “none of the videos mentioned by Buzzfeed violated [its] policies” but that it’s continuing to work to improve its recommendations on searches for news and information. YouTube said it’s made changes intended to surface more reliable news content in search results and on its homepage. It also includes contextual information panels on some videos, including those posted by state-run media or about common conspiracies. Those panels did not appear on any of the videos BuzzFeed News shared with Google.

YouTube has admitted that its Up Next algorithm is imperfect and has promised to improve the quality of its recommendations, but these problems persist. In early 2018, after the US Senate began questioning Google executives about the platform’s possible role in Russian interference in the 2016 election, a spokesperson for YouTube told the Guardian, which was about to publish an investigation into YouTube’s recommendation system, “We know there is more to do here, and we’re looking forward to making more announcements in the months ahead.”

In March of that year, YouTube CEO Susan Wojcicki told Wired that the year and a half after Trump’s election had taught her “how important it is for us … to be able to deliver the right information to people at the right time. Now that’s always been Google’s mission, that’s what Google was founded on, and this year has shown it can be hard. But it’s so important to do that. And if we truly focus, we can get it done.”

And, just last month, while testifying before Congress, Google CEO Sundar Pichai was asked about the promotion of conspiracy theories on YouTube. In response, Pichai said, “We are constantly undertaking efforts to deal with misinformation. We have clearly stated policies and we have made lots of progress in many of the areas over the past year. … We are looking to do more.”

But as of the beginning of 2019, even policing pirated content still seems to flummox YouTube’s recommendation system. For example, on January 7, after searches for terms including “why is the government shut down,” “government shutdown explained,” and “government shutdown,” in fresh sessions not tied to any personal data, one of the videos YouTube most commonly recommended after a few clicks was titled “Joe BRILLIANT HUMILIATES Trump After He Facts Checking Trump’s False Border Wall Claims.”

That video, which had 96,763 views by 7:10 p.m. PT on January 7, consisted only of pirated footage from a January episode of MSNBC’s Morning Joe. It was posted to a channel called Ildelynn Basubas that had once exclusively posted videos of nail art, until it abruptly pivoted to posting ripped videos of cable TV shows with incendiary headlines. Ildelynn Basubas’s videos were among those most frequently recommended by YouTube following searches for government shutdown–related terms on January 7. However, less than two days later, YouTube had removed the channel entirely “for violating Google’s Terms of Service.”

Similarly, footage of popular CNN shows — with thousands of views — posted by the channel Shadowindustry was recommended repeatedly by YouTube during BuzzFeed News’ queries. For example, when we searched “116th congress swearing in” and followed YouTube’s recommendations for 14 jumps, CNN’s video “CNN reporter presses Trump: You promised Mexico would pay for wall” appeared in the Up Next recommended video column a total of 10 times, while a ripped video of CNN’s Anderson Cooper 360° posted by Shadowindustry was recommended a total of 14 times. Despite YouTube’s initial algorithmic push for the channel, it had terminated Shadowindustry within a couple of days.

Given the plethora of conspiracies and hate group content on its platform, pirated cable news content is perhaps the least of YouTube’s problems. The platform does, at least, seem capable of finding and removing it — though not, it would seem, until after its own recommendation system has helped these videos accrue tens of thousands of views. Clearly, the people behind these channels have figured out how to game YouTube’s recommendation algorithm faster than YouTube can chase them down, leading the company to recommend their pirated videos before eventually deleting them.

Researchers have described YouTube as “the great radicalizer.” After the 2017 Las Vegas shooting, the site consistently recommended conspiracy theories to users searching mundane terms like “Las Vegas shooting.” Similarly, a 2017 report from the Guardian illustrated that YouTube reliably pushed users toward conspiratorial political videos. And according to a recent report, just weeks before the 2018 midterm elections, far-right reactionaries were able to “hijack” search terms to manipulate YouTube’s algorithms so that queries for popular terms dredged up links to reactionary content.

In some cases, queries run by BuzzFeed News support the claims of past reports. And yet other queries — run on the same day with the same search terms under the same conditions — offer different, more mundane results. Attempts to fully understand YouTube’s recommendation algorithms are complicated by the fact that each viewer’s experience is not only unique, but tailored to their specific online experience. No two users watch the same videos, nor do they watch them the same way, and there may be no way to reliably chart a “typical” viewer experience. YouTube’s rationale when deciding what content to show its viewers is frustratingly inscrutable.

But as demonstrated by BuzzFeed News’ more than 140 journeys through YouTube’s recommendation system, the outcome of that decision-making process can be difficult to reverse engineer. In the end, what’s clear is that YouTube’s recommendation algorithm isn’t a partisan monster — it’s an engagement monster. That’s why its recommendations veer unpredictably from cable news to pirated reality shows to QAnon conspiracy theories. Its only governing ideology is to wrangle content — no matter how tenuously linked to your watch history — that it thinks might keep you glued to your screen for another few seconds.

This story is a collaboration with the BuzzFeed News Tech Working Group.