posted: 3/3/2018 1:00 AM

Analysis: YouTube's mistaken 'purge' highlights new peril for video giant

[Photo: Bloomberg photo by Patrick T. Fallon.]
YouTube said Wednesday that its moderators had mistakenly removed videos in recent days during what some critics had called an ideological "purge," highlighting the ongoing challenge for a video giant now hiring thousands of new employees in a push to rein in shocking and dangerous content.

Viewers and producers had recently complained that the site was targeting right-wing voices -- including some gun-related channels that had posted content in the days since the Parkland, Florida, school shooting -- with suspensions, video removals and "warning strikes."

The video giant has faced intensifying scrutiny in recent weeks over the content it hosts and promotes, including one video, listed in the site's top "Trending" ranking, that suggested students who had survived the shooting were "crisis actors."

Tim Harmsen, the head of the Military Arms Channel on YouTube, said in a video Saturday that he had temporarily disabled all of his videos after moderators gave him "strikes" for three firearms-related videos, including one about an exploding rifle target, that they said had violated the site's guidelines. YouTube bans accounts after three strikes.

He posted a notice from moderators that said, "We don't allow content that encourages illegal activities or incites users to violate YouTube's guidelines." "Right now we're under attack" by YouTube employees, whom he called "far-leftist lunatics," he said in a video that has been viewed 260,000 times.

YouTube said in a statement that some of the videos were "removed in error" and would be reinstated. "As we work to hire rapidly and ramp up our policy enforcement teams throughout 2018, newer members may misapply some of our policies resulting in mistaken removals," the statement said.

After other examples of the misuse of its platform last year, YouTube in December said that its parent company, Google, would increase the number of people working on content moderation to 10,000 this year.

A source close to YouTube's operations said the mistaken removals included not only right-wing videos but also left-wing and mainstream ones. The errors, the source said, were due to the company ratcheting up its oversight of some of the more than 400 hours of video uploaded to the site every minute.

"We're flagging content at a much higher volume, so we're having more false positives because more content is being reviewed," the source said. "As we dramatically step up hiring, we will see mistakes. This isn't the last time this will happen."

Sarah Roberts, a University of California Los Angeles assistant professor who studies content moderation, said the mistaken removals were part of YouTube's "come-to-Jesus moment around understanding their own values and economic model." The site, she said, had long sought to avoid publicly making content-moderation decisions because it didn't want to be held responsible for deciding which speech should be protected or which videos go too far.

With "content moderation having been an absolute afterthought for so long, but with it now suddenly gaining importance and prominence in the public eye and the eye of regulators, they are really reckoning with their need to communicate to the public late in the game," Roberts said.

The moderators, she added, can be "easy targets" for YouTube to blame in moments of pushback. "When they do do something that the public can perceive and respond to negatively, they are able to gesture at human moderators and hang the blame largely on them," she said. "But there's a lack of clarity around what the values are ... where those policies emanate, and on whose behalf."

Alex Jones, the conservative conspiracy theorist who publicly raised the possibility that the Parkland shooting was a "false flag" attack just hours after it happened, said his videos were among those targeted by YouTube's efforts to crack down on content that violated its policy against harassment. He was told he had been given "two strikes" by YouTube and was at risk of being banned from the platform.

But Jones loudly complained, posting videos to his own InfoWars website alleging that YouTube was wrongly blocking content that did not in fact violate policies. Jones said he did not dispute that the Parkland shooting happened or that the surviving students were real; he merely alleged that they were being coached in their public appearance as part of a political drive for gun control and questioned other aspects of the mainstream reporting about the attack.

Jones said in an interview Tuesday, "The good news is that they're now having to back off ... because I didn't say what they said I said."
