Facebook or Fakebook?

How can Facebook become a more reliable source of information while not shutting down the vitality of the web or shutting out voices that deserve to be heard?

As the Age of Trump dawns, that's an important question. A Pew study found that 67 percent of Americans use Facebook, and 44 percent of those users get news from the site. That makes Facebook "the most influential and powerful publisher in the world," according to Emily Bell in the Columbia Journalism Review.

Moreover, fake news stories spread through Facebook were used effectively by Trump supporters to undermine Hillary Clinton's candidacy. Indeed, Facebook is a key factor in creating a much larger and more insidious conviction - that facts themselves don't really matter anymore. Or even exist.

As President Obama said during a campaign speech in Michigan: "And people, if they just repeat attacks enough, and outright lies over and over again, as long as it's on Facebook and people see it, and as long as it's on social media, people start believing it. ... And it creates this cloud of nonsense."

Facebook founder Mark Zuckerberg has been eager to profit from the power of his creation, but reluctant to take responsibility for the consequences of that power. He's long insisted that he runs a "technology company," not a "media company," sort of like a phone company that simply conveys content without shaping it.

He's totally wrong about that. Facebook is not Verizon or T-Mobile. But he's right about something else. As Zuckerberg puts it, Facebook should not be the final "arbiter of truth." It should be an editor, not a censor; a guide, not a dictator. The last thing we need is a Ministry of Veracity.

What we do need is a sense of balance. Facebook has to embrace its influence, but not misuse it. It has to be part of the solution, not part of the problem. That said, it can't be the only solution. And there are encouraging signs that Facebook is facing up to that truth.

"We really value giving people a voice," Facebook Vice President Adam Mosseri told The New York Times, "but we also believe we need to take responsibility for the spread of fake news on our platform."

Let's be clear about what he - and we - mean by "fake news." The term has been hijacked by conservatives who are using it as one more weapon to attack the mainstream media. And it's certainly true that even the best reporters make mistakes, or have blind spots. But that's not fake news.

Fake news is deliberately fabricated to generate clicks, make money and, in some cases, alter the political debate. Pew reports that 23 percent of American adults have shared fake news stories with others, and 64 percent said made-up news has caused "a great deal of confusion" among voters.

So it's a serious issue, and as a first step, Facebook is crowdsourcing the problem, "testing several ways to make it easier to report a hoax if you see one on Facebook," says Mosseri.

Those reports will be forwarded to third-party fact-checking organizations like Snopes and PolitiFact. If those services "identify a story as fake, it will get flagged as disputed," explains Mosseri. You'll still have the choice to share a flagged story, but it will carry a clear warning.

In addition, Facebook is "doing several things to reduce the financial incentives" for hoaxers by cutting off their ability to sell ads through the site.

These are good steps, but small ones, and they do nothing to solve another huge problem: Facebook algorithms that create "echo chambers" by sending readers only news articles that mirror the choices and preferences they've expressed in the past.

"Because Facebook tailors your News Feed based on your own behavior, you inadvertently become victim of your own biases," Nelson Granados, a professor of information systems at Pepperdine, writes in Forbes.

The power of fake news is reinforced by these echo chambers. People who are insulated from dissent or contradiction develop "tunnel vision," says Granados, and are more likely to believe fake stories that comport with their worldview - no matter how outlandish.

One answer: Facebook could mimic a well-edited op-ed page. Alter the algorithms to make sure a certain number of "crosscutting" stories are provided in every News Feed. People cannot be forced to read them, but at least they'll see another light in their tunnel.

Facebook is a great innovation, but it has to make sure it doesn't become Fakebook.

Steve and Cokie Roberts can be contacted by email.

© 2016, Universal