Survey: Americans lack trust in social networks' judgment to remove offensive posts

Most Americans believe social media companies have a responsibility to remove offensive content from their platforms. But how tech firms decide what posts should be taken down is where public confidence fades.

In a survey published by the Pew Research Center on Wednesday, 66% of Americans say social networks should take down offensive posts and videos. But how should companies decide which objectionable content should stay online or get the ax?

Answering that question has presented a tremendous challenge in recent years for companies such as Facebook, YouTube and Twitter, exposing them to criticism that they've been too slow and reactive in addressing the abuse and harassment found on their platforms. The way social media companies determine which posts to reject has also drawn accusations of selective enforcement colored by political bias, forcing the social networks to reiterate that they remain politically neutral.

When it comes to judgment calls to curb offensive content, the American public does not have much confidence in social networks, according to Pew.

Forty-two percent of Americans say they don't have too much confidence in social media companies to decide what content to take down. And nearly 1 in 4 Americans say they have no confidence at all in the companies to make these calls.

Pew, which broke down the survey results by political affiliation, found that this particular lack of faith in social media companies was shared by both sides of the aisle.

Among Democrats, 37% said they have confidence in the companies to determine what content should be taken down, compared to 63% who said they had not too much confidence or none at all, the study found.

Republicans were even more skeptical: 23% of GOP supporters said they trusted the decisions of the companies to remove offensive posts, compared to 76% who largely lacked confidence.

Earlier this month, YouTube, the Google-owned video site, announced that it will take a more aggressive stance against hate speech, including the removal of videos that falsely deny the Holocaust and other major historical events took place. In response to criticism that social media companies are doing too little to combat hateful ideologies that rely on their platforms to gain an audience, YouTube and its peers have begun to take a broader view of what constitutes hate speech. Critics have zeroed in on the prevalence of posts that promote discrimination, peddle conspiracy theories about world events and that harm children.

On Wednesday, The Washington Post reported that the Federal Trade Commission is in the late stages of an investigation into YouTube, for allegedly violating children's privacy. YouTube declined to comment on the FTC probe. Consumer groups and youth advocates have claimed that the video site fails to filter content by appropriate age levels, exposing children to a near-endless stream of troubling content.
