updated: 3/16/2018 12:42 PM

They were searching for videos. Facebook thought they wanted videos of child abuse.

Facebook users said Thursday night that they were confronted with disturbing and sexually graphic search recommendations on the social network, leading the company to apologize without offering a full explanation.

After typing "video of" into the Facebook search bar, some users said the tool suggested obscene terms that included sex acts and child abuse. Facebook's search predictions are designed to reflect popular searches, but it's unclear why the offensive terms appeared.

"We're very sorry this happened," Facebook said in a statement Friday. "As soon as we became aware of these offensive predictions we removed them. Facebook search predictions are representative of what people may be searching for on Facebook and are not necessarily reflective of actual content on the platform."

The company does not permit sexually explicit imagery on its platform. Facebook indicated that it doesn't know why those search terms were suggested to users. The company said it's looking into the matter and working to improve the search feature.

Facebook just last week faced an outcry after the Guardian reported that it ran a survey asking users if they thought men should be allowed to ask children for sexual pictures through the site. Facebook said the survey was a mistake.

The video search episode is the latest example of autocomplete features drawing criticism. Tech companies, including Facebook and Google, designed them to provide a shortcut for people wanting to look something up. They are based in part on previous searches by users. While tech companies use search predictions to help their users, the features can also elevate repugnant behavior, highlighting the sometimes toxic consequences of automation and reflexive decision-making.

In 2016 Google, the world's dominant search engine, had to change its autocomplete search tool when users received offensive results after looking up information about women and Jewish people. And last year, Google surfaced unsubstantiated claims in suggested search terms about the shooter behind the mass killing in Sutherland Springs, Texas.

Facebook did not immediately respond to questions about its search prediction feature and what information the suggestions are based on.
