Facebook Suggests Questionable Comments on Live Videos About Shootings and Sexual Assaults

Facebook appears to be testing a new tool that prompts users to comment on live video streams, including those involving sensitive situations like shootings and sexual assault, using suggested text and emojis.

On Monday, a handful of Facebook users noticed that the social media platform was offering them preset responses for live videos about a series of news stories. On one MSNBC stream about an ongoing officer-involved shooting at a Chicago hospital, NBCUniversal contractor Stephanie Haberman noticed Facebook was prompting her to comment with phrases like "This is so sad" and "so sorry," along with emojis including the prayer hands.

"So I'm just noticing that Facebook has a thoughts and prayers autoresponder on our Chicago Hospital shooting livestream and I have thoughts," Haberman tweeted along with photos of the suggested responses from Facebook. She declined to comment for this story.

While autoreply prompts are not an entirely new concept for Silicon Valley products (Google's Gmail recently unveiled a pre-populated response tool called "Smart Reply," and Instagram sometimes suggests emoji responses), this appears to be the first time Facebook has tested such a tool on live video, where content can be sensitive, unpredictable, and sometimes violent. And while Facebook seems to be aiming to improve engagement on live video, critics have called the prompts insensitive and further evidence that the company has not thought out the human impact or consequences of its products.

A Facebook spokesperson did not immediately respond to a request for comment.

BuzzFeed News examined other Facebook livestreams on Monday and found that the platform was testing prompted responses on a variety of videos, including ones from local news outlets, the shopping network QVC, and gamers. On one video from Phoenix's Fox 10 station about a sexual assault and possible shooting in a Catholic supply store, Facebook's algorithm suggested that the user comment with "respect" or "take care." On a different stream about the Chicago hospital shooting from NBC News, the suggested responses included a crying-tears-of-joy emoji and another making a kissing face.

In testing the product, a BuzzFeed News reporter only had to click a suggested response once for it to appear in the comment feed of a given live video. Once one response was selected, the prompted comment menu disappeared as an option. It's unclear when Facebook rolled out the prompted response tool on live videos, or how widely available it is.

"Facebook has bigger things to worry about right now than rolling out response prompts on live video," Caroline Sinders, principal designer at Convocation Design+Research, told BuzzFeed News. "And given that it's suggesting inappropriate responses, I would say it's probably best to turn it off now or allow users to turn it off."

Sinders, a former fellow at the BuzzFeed Open Lab, explained that offering autoreplies to live video can be especially hard given that current machine learning technology has a hard time "sussing out context in video and audio, just as it does with text." For example, since it debuted live video in December 2015, Facebook has struggled with using algorithms to filter out violent content from users' feeds.

Facebook's prompted reply tool also appeared on live video for the QVC shopping channel; it suggested that users comment "pretty" and "cute" as two hosts showed off a dress. On a livestream of a gamer playing Battlefield V, the feature suggested that viewers greet others with "yo" and "hey again!"

The most frequent blunders, however, seemed to happen on live news segments. During an ABC7 stream of a police pursuit in Los Angeles, Facebook's algorithm suggested some questionable responses, including "Go" and "Agree."

A source close to NBCUniversal said the company had never seen the prompts before and that its news outlets had not opted in to them.

"This wins for most dystopian thing I've seen all day (and I live in the smoke-drenched Bay Area where everyone is wearing masks, so that's saying a lot)," one person tweeted in response to seeing screenshots of the prompted responses on Twitter.
