
What you don't know about Internet algorithms is hurting you

Until about three weeks ago, Pinterest was - hands down, no contest - my absolute favorite social network.

Yeah, I know it lacks the cachet of Twitter or Snapchat. I understand its associations with the bored and middle-aged. But I also really love to cook, and Pinterest is a never-ending font of new and unusual recipes. The site, frequently described as a "social pinboard," lets you follow other users much like Facebook or Twitter do; but instead of news articles or pithy observations on the weather, the people you follow post craft ideas, wedding plans or - best of all (!) - pictures of food.

And yet, recently, the food has gotten ... boring. Every time I open the app, I get the same handful of recommendations: Brussels sprouts, lentil soup, ricotta with honey. All my favorite foods, but such repetitive recipes.

The shift was not, incidentally, just in my head. In a blog post published Friday, Pinterest explained that it's been tweaking the algorithm behind its home feed - the bit of code that decides which recipes to show me and which to hide. Pinterest, like virtually every other website and social network, is trying to make its site more personalized. And so, because I like Brussels sprouts and lentil soup and ricotta with honey, the site is showing me those foods, to the exclusion of all other foods, all the time.

Neither I nor democracy will suffer too much from this particular change, of course. But one could hardly ask for a more self-evident (or delicious!) example of the dangers of algorithmic filtering and personalization, dangers that critics have only grown more alarmed about over the past year.

See, Pinterest is far from the only site on which computer code decides what content you see and what you do not. And on sites like Google and Facebook, now primary news sources for people under 35, algorithms aren't just keeping you from the next great cake recipe - they could be isolating you from opposing views, exacerbating your own biases, or, as recent and more disturbing examples have shown, even perpetuating racial and gender biases of their own.

In fact, algorithms are now so widespread, and so subtle, that some sociologists worry that they function as a form of "social control." (That is, at least, the title of a keynote at an upcoming academic conference called Theorizing the Web, where technologists and sociologists will discuss "algorithms as a type of social engineering.")

"Quite literally, this one algorithm has changed millions of lives," write OkCupid's founders of their site's matching algorithm, which - as if to prove exactly how important algorithms are to modern culture - just went up for sale as part of an "algorithm auction" in New York.

Admittedly, this is not the most accessible stuff. Even the word algorithm - a throwback to high-school math class that no one asked for - threatens to scare off the casual Web user. But essentially, an "algorithm" is just a piece of computer code that makes a decision or recommendation of some kind.

When you Google something, an algorithm decides which results appear on the first page and which plummet to the hundredth. When you log into Facebook, you don't see everything your friends post; that would be too much stuff, Facebook figures. So instead, Facebook - like Amazon and Netflix and Pandora and OkCupid - tries to personalize the content that it shows you.
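To make that concrete, here is roughly what the skeleton of such a system looks like - a toy sketch in Python, with invented recipes and interest weights, and certainly not any company's actual (and closely guarded) code:

    # A toy sketch of personalized ranking, with invented data.
    # Real feeds weigh far more signals; nothing here is any company's actual code.

    def score(item, interests):
        # Rate an item by how strongly its topics match what the user clicked before.
        return sum(interests.get(topic, 0) for topic in item["topics"])

    def build_feed(items, interests, n=3):
        # Keep only the top-n highest-scoring items; everything else is hidden.
        return sorted(items, key=lambda it: score(it, interests), reverse=True)[:n]

    items = [
        {"title": "Roasted Brussels sprouts", "topics": ["brussels sprouts"]},
        {"title": "Lentil soup", "topics": ["lentils", "soup"]},
        {"title": "Ricotta with honey", "topics": ["ricotta"]},
        {"title": "Thai green curry", "topics": ["curry"]},
    ]
    interests = {"brussels sprouts": 3, "soup": 2, "lentils": 2, "ricotta": 1}

    for item in build_feed(items, interests):
        print(item["title"])  # the curry never makes the cut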

Unfortunately, personalization isn't always everything it's cracked up to be. Because personalization algorithms try to predict content you will like, they tend to surface only things that agree with your established preferences; over time, and through lots of clicks, you gradually work your way into an online world where all news articles are fiercely liberal, or all recipes contain Brussels sprouts. (The Internet activist Eli Pariser has famously dubbed this the "filter bubble" - a phenomenon in which personalization algorithms effectively cut people off from the cultural and ideological mainstream.)
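The narrowing is a feedback loop, and you can see it in a few lines of toy code. In this invented example, each click strengthens the very interests that produced the feed, so within a few rounds the same topics win every time:

    # A toy feedback loop: the feed shows what you already like, you click it,
    # and the click makes the feed even narrower. Invented numbers throughout.

    interests = {"brussels sprouts": 2, "soup": 1, "curry": 1, "dessert": 1}

    def top_picks(interests, n=2):
        return sorted(interests, key=interests.get, reverse=True)[:n]

    for week in range(1, 5):
        shown = top_picks(interests)
        for topic in shown:          # the user clicks what's put in front of her...
            interests[topic] += 1    # ...and each click reinforces that preference
        print(f"Week {week}: {shown}")

    # The output settles on the same two topics every week - the filter bubble.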

Even worse, there's growing evidence that algorithms contain their own biases, built right into their supposedly impartial code. Sometimes those biases are glaring, as in the case of Google's algorithms surfacing ads about background checks and arrest records when you search black-sounding names.

Sometimes they're more mercenary in nature; while Facebook claims that its algorithm is personalized for your benefit, for instance, that's not entirely true. Facebook's algorithm is optimized for Facebook, and thus for advertisers - not you. The communications researcher Christian Sandvig has called this "corrupt personalization," the idea that personalization pretends to serve you, when it's actually serving some corporate motive at your expense.
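To picture what "corrupt personalization" means in practice, imagine a ranking score that quietly blends what you'd enjoy with what the platform earns. The sketch below is purely hypothetical - invented posts, invented weights, and not Facebook's actual formula, which no one outside the company gets to see:

    # Hypothetical illustration of "corrupt personalization": the feed is ordered
    # by a blend of what's relevant to you and what earns money for the platform.
    # Invented posts and weights, not any real ranking formula.

    def feed_score(relevance_to_user, revenue_to_platform, revenue_weight=0.7):
        return (1 - revenue_weight) * relevance_to_user + revenue_weight * revenue_to_platform

    posts = [
        ("Your friend's vacation photos", 0.9, 0.0),  # what you want, earns nothing
        ("Sponsored diet-pill ad",        0.2, 0.9),  # what pays, barely relevant
    ]

    for title, relevance, revenue in sorted(posts, key=lambda p: feed_score(p[1], p[2]), reverse=True):
        print(f"{feed_score(relevance, revenue):.2f}  {title}")

    # Crank revenue_weight high enough and the ad outranks the post you actually wanted.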

In either case, it's almost impossible to see those biases until personal examples arise by accident, because algorithms are proprietary black boxes. Neither Facebook nor Google nor Pinterest explains the intricacies of its code. Even though algorithms arguably shape how we think and what we know, no one gets to open them up and see how they work.

And yet - in the past year, particularly - cracks have begun to show in the black box. Last February, the Internet exploded over a study from Facebook's data team showing that algorithmic changes to the news feed could actually manipulate users' feelings without their knowledge. Shortly after that, the sociologist Zeynep Tufekci pilloried Facebook for algorithmically obscuring breaking news from Ferguson, Missouri, apparently in favor of lighter material with more pop-cultural appeal.

This year, 2015, is the year we finally "get creeped out by algorithms," Tufekci later wrote - a prediction that has, thus far, been pretty prescient. In Britain just last week, a female pediatrician made international news when she was barred from entering a women's locker room because the computer code behind her gym's security system automatically regarded all "doctors" as male.

"Algorithms are not always neutral," Motherboard's Victoria Turk warned. "They're built by humans, and used by humans, and our biases rub off on the technology. Code can discriminate."

All the mounting drama and discussion has sparked a new field of academic research called "algorithm auditing": probing Internet algorithms from the outside to figure out how they tick and what may be wrong with them. A manifesto on the algorithm audit, published by the Open Technology Institute last October, envisioned an Internet where platforms would come with warnings about their algorithms and whom those algorithms serve.
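Concretely, an audit means feeding a black-box system controlled inputs - the same query from different simulated users, say - and comparing what comes back. Here is a schematic sketch of that idea; the fake search engine and the profiles are stand-ins, not any real platform:

    # Schematic of an external "algorithm audit": probe a black-box system with
    # controlled inputs and compare what different simulated users get back.
    # The search engine and profiles below are stand-ins, not any real platform.

    def audit(search, query, profiles):
        results = {name: set(search(query, profile)) for name, profile in profiles.items()}
        baseline = set.intersection(*results.values())   # what everyone sees
        for name, seen in results.items():
            print(f"{name} alone saw: {sorted(seen - baseline)}")

    def fake_search(query, profile):
        common = [f"{query}: result A", f"{query}: result B"]
        return common + [f"{query}: ad targeted at {profile['group']}"]

    profiles = {"profile_1": {"group": "group X"}, "profile_2": {"group": "group Y"}}
    audit(fake_search, "job listings", profiles)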

Alas, even with that type of information, it's unclear if or how anyone would push back against the machines. Facebook and Google, the most-referenced examples in this space, are virtual monopolies in their respective fields. And on both sites, casual users seem unaware that algorithms even exist: According to one study, less than 40 percent of Facebook users realize they're not seeing everything that their friends put on the site.

And so, we end up with feeble opposition: apps like Random, a news reader that tries to turn up stories you wouldn't usually see, or Bobble, a browser extension that shows the difference between your Google results and everyone else's. Eli Pariser, the guy who coined the term "filter bubble," has attempted to beat the algorithm-informational complex from the inside - he's now head of the viral media company Upworthy, which draws attention to important issues by couching them in Facebook-friendly silliness and hyperbole.

"Would I love to see Facebook trending topics promote civically important stuff?" he asked. "Yeah ... But Facebook is many orders of magnitude bigger than any social network. If you want to play, you have to play there."

Meanwhile, I'm awaiting the enterprising media entrepreneur who can sneak through the holes in Pinterest's algorithm to give me back my unfiltered recipe feed. I like the Brussels sprouts, don't get me wrong. But I'd much prefer to choose what I like - rather than have a bit of code dictate it to me.
