
Beware of Internet Filter Bubbles

A healthy warning about information gathering on the internet is coming from an unlikely source—Eli Pariser of MoveOn.org. He helped develop phone-banking tools and precinct programs in 2004 and 2006 that laid the groundwork for Obama’s web-powered presidential campaign. Pariser is also a fellow at the Roosevelt Institute and author of “The Filter Bubble,” which exposes how personalized search might be skewing our worldview.

I recently listened to a TED talk Pariser gave a couple of years ago about online filter bubbles, and it put the "low-information voter" in a new context for me. It's not just that voters aren't interested in finding out what's going on in the country; they're not getting access to the information in the first place, because algorithms are editing it out based on their personalized interests.

Pariser explains that the internet is supposed to be a tool to bring people together, that it can be great for democracy as new voices are heard and different ideas shared. But he says there’s been a shift in how information is flowing online, and that “it’s invisible.” He also warns that “if we don’t pay attention to it, it could be a real problem.”

Pariser recounts his own wake-up moment: "So I first noticed this in a place I spend a lot of time—my Facebook page. I'm progressive, politically—big surprise—but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared."

Facebook isn't the only place doing algorithmic editing of the web. Google's doing it too. So is Yahoo. So is Netflix. So are a lot of companies and organizations. Results are based on personal interests and habits. Even if we all search for the same thing, each of us will get different results (and you won't have the opportunity to see my results, just as I won't see yours).
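Pariser's anecdote boils down to a simple mechanism: score each post by the reader's past engagement with its author, keep the top of the list, and let everything else quietly drop out of view. Here is a minimal sketch in Python of that general idea (all names are hypothetical illustrations; this bears no resemblance to any platform's actual ranking system):

```python
# Hypothetical sketch of engagement-based feed filtering:
# friends whose links you rarely click sink below the cutoff
# and silently disappear from your feed.
from collections import Counter

def rank_feed(posts, click_history, feed_size=10):
    """posts: list of (friend, link) pairs; click_history: friends
    whose links the reader clicked in the past."""
    clicks = Counter(click_history)
    # Sort by past engagement with each friend, most-clicked first.
    ranked = sorted(posts, key=lambda p: clicks[p[0]], reverse=True)
    # Only the top feed_size posts are ever shown to the reader.
    return ranked[:feed_size]

posts = [("liberal_friend", "link1"),
         ("conservative_friend", "link2"),
         ("liberal_friend", "link3")]
history = ["liberal_friend", "liberal_friend", "conservative_friend"]
print(rank_feed(posts, history, feed_size=2))
```

Note what the reader never sees: the post that fell below the cutoff. Nothing flags its absence, which is exactly the invisibility Pariser warns about.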

Pariser explains that even when we're logged out, there are "57 signals that Google looks at—everything from what kind of computer you're on to what kind of browser you're using to where you're located." All of it is designed to tailor our search results to us personally. Pariser tells of asking some friends to google "Egypt" and send him the results; he was shocked by how different they were. One friend didn't get any results about the protests in Egypt, while another's results were full of them, even though it was the big story of the day.

This kind of personalization of searches “moves us very quickly toward a world in which the internet is showing us what it thinks we want to see, but not necessarily what we need to see.” Pariser calls this the “filter bubble.”

Your filter bubble is your own personal, unique universe of information that you live in online. What's in it depends on who you are and on what you do. But you don't decide what gets in, and more importantly, you don't actually see what gets edited out.

Pariser says this reveals that we've gotten the story about the internet wrong. Before the internet, we were subject to information gatekeepers in the form of Walter Cronkite and Peter Jennings, and of the editors at the New York Times and the Washington Post. With the rise of the internet, the information floodgates opened and swept away those traditional gatekeepers.

But, according to Pariser, that flood of information isn’t flowing like we think it is. “What we’re seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did.”

While Pariser recognizes the wonders of the web, he calls for individuals to exercise more control over their search results. He also calls for programmers to encode algorithms with a sense of public life, a sense of civic responsibility. They need to be “transparent enough” to allow us to “see what the rules are that determine what gets through our filters.”

I think we really need the internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it’s not going to do that if it leaves us all isolated in a Web of one.

Do you search out different views and perspectives on the internet, or do you live in a filter bubble? If you’ve broken out of your bubble, what advice can you offer others to break out of theirs?