updated: 8/21/2014 7:54 AM

Twitter caught in censorship dilemma over graphic images on site

Bloomberg News

Twitter Inc. decided last year to make images more prominent on its site. Now, the social network is finding itself caught between being an open forum and patrolling for inappropriate content.

The pattern goes like this: a major public death spreads graphic images across Twitter. Users express outrage, forcing the company to decide what to remove.


Two recent incidents illustrate the difficulty of the choice. While Twitter is taking pains to remove images of the death of James Foley, the journalist who was beheaded by Islamic militants, some photos of the body of Michael Brown, the teenager who was killed by police in Ferguson, Missouri, remain in users' streams. To many on Twitter, images of violence against Foley can be seen as spreading a terrorist's message, while publicizing Brown's death shines a light on a perceived injustice.

"They're letting the masses decide what should be up and what should not be up," said Ken Light, a professor of photojournalism at the University of California, Berkeley. "When it's discovered it needs to be dealt with promptly. The beheading video should never go viral."

The dilemma faced by Twitter, a proponent of free speech and distributor of real-time information, isn't much different from that of a newspaper or broadcaster, according to Bruce Shapiro, executive director of the Dart Center for Journalism & Trauma at Columbia Journalism School.

"Twitter's situation is exactly like that of a news organization," Shapiro said. "Freedom of the press and freedom of expression doesn't mean that you should publish every video no matter how brutal and violent."

Crossing Lines

The incidents also happened just after Robin Williams' daughter Zelda said she was quitting Twitter after receiving abusive messages following his death.

"In order to respect the wishes of loved ones, Twitter will remove imagery of deceased individuals in certain circumstances," the San Francisco-based company said in a policy that was enacted last week. "When reviewing such media removal requests, Twitter considers public interest factors such as the newsworthiness of the content and may not be able to honor every request."

Twitter's software isn't designed to automatically filter all inappropriate content. The company's Trust and Safety team works in all time zones to stamp out issues once they're discovered, according to Nu Wexler, a spokesman for the company. Twitter uses image-analysis technology to track and report child exploitation images, Wexler said.

Twitter doesn't specifically prohibit violent or graphic content on its site -- only "direct, specific threats of violence" and "obscene or pornographic images," according to its terms of service. It may need to go further, if Facebook Inc.'s experience is any guide.

Facebook Policy

In October, around the time Twitter started displaying images automatically in people's timelines, Facebook was dealing with an uproar over a separate beheading video that was spreading around its site. The company resisted taking it down until user complaints intensified, including from U.K. Prime Minister David Cameron. Then, Facebook changed its policies.

"When we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video," the Menlo Park, California-based company said at the time. Facebook said it "will remove content that celebrates violence."

Now that Twitter is encouraging images and video, it will also need to take another look at its rules, according to Columbia's Shapiro.

"I don't think a blanket rule is the point," Shapiro said. "You do need a company policy that recognizes that violent images can have an impact on viewers, can have an impact on those connected to the images, and can have an impact on the staff that have to screen this stuff. You can't ignore Twitter's role in spreading these images."
