[Photo: Eli Pariser]

Beware of Internet Filter Bubbles

A healthy warning about information gathering on the internet is coming from an unlikely source—Eli Pariser of MoveOn.org. He helped develop phone-banking tools and precinct programs in 2004 and 2006 that laid the groundwork for Obama’s web-powered presidential campaign. Pariser is also a fellow at the Roosevelt Institute and author of “The Filter Bubble,” which exposes how personalized search might be skewing our worldview.

I recently listened to a TED talk Pariser gave a couple of years ago about online filter bubbles, and it put the “low-information voter” in a new context for me: it’s not just that voters aren’t interested in finding out what’s going on in the country; they’re not getting access to the information in the first place, because it’s being edited out by algorithms based on their personalized interests.

Pariser explains that the internet is supposed to be a tool to bring people together, that it can be great for democracy as new voices are heard and different ideas shared. But he says there’s been a shift in how information is flowing online, and that “it’s invisible.” He also warns that “if we don’t pay attention to it, it could be a real problem.”

Pariser recounts in his talk: “So I first noticed this in a place I spend a lot of time—my Facebook page. I’m progressive, politically—big surprise—but I’ve always gone out of my way to meet conservatives. I like hearing what they’re thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends’ links than on my conservative friends’ links. And without consulting me about it, it had edited them out. They disappeared.”

Facebook isn’t the only place doing algorithmic editing of the web. Google’s doing it too. So is Yahoo. So is Netflix. So are a lot of companies and organizations. Results are based on personal interests and habits. Even if we all search for the same thing, each of us will get different results (and you won’t have the opportunity to see my results, just as I won’t see yours).
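The mechanism Pariser describes is simple to state: the feed watches which links you click and silently drops sources you click less. Here is a toy sketch of that idea (purely illustrative; none of these names, fields, or thresholds come from Facebook’s or anyone else’s actual system):

```python
# Toy illustration of click-based feed filtering as Pariser describes it.
# This is NOT any platform's real algorithm; the threshold and field
# names are invented for the sketch.
from collections import Counter

def filter_feed(posts, click_history, min_clicks=1):
    """Keep only posts from sources the reader has clicked at least
    min_clicks times; everything else is silently edited out."""
    clicks = Counter(click_history)
    return [p for p in posts if clicks[p["source"]] >= min_clicks]

posts = [
    {"source": "liberal_friend", "link": "article-a"},
    {"source": "conservative_friend", "link": "article-b"},
]
# The reader clicked liberal links three times, conservative links never.
history = ["liberal_friend", "liberal_friend", "liberal_friend"]

visible = filter_feed(posts, history)
# conservative_friend's post disappears without the reader being consulted
```

The point of the sketch is the last line: nothing informs the reader that a post was filtered out, which is exactly the invisibility Pariser warns about.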

Pariser explains that even when we’re logged out, there are “57 signals that Google looks at—everything from what kind of computer you’re on to what kind of browser you’re using to where you’re located.” All of it is designed to personally tailor our search results. Pariser tells of how he asked some friends to google “Egypt” and send him the results; he was shocked by how different they were. One friend didn’t get any results about the protests in Egypt while another’s were full of them—and it was the big story of the day.

This kind of personalization of searches “moves us very quickly toward a world in which the internet is showing us what it thinks we want to see, but not necessarily what we need to see.” Pariser calls this the “filter bubble.”

Pariser continues: “And your filter bubble is your own personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don’t decide what gets in. And more importantly, you don’t actually see what gets edited out.”

Pariser says this reveals that we’ve gotten the story about the internet wrong. Before the internet, we were subject to information gatekeepers in the form of Walter Cronkite and Peter Jennings, and of the editors at the New York Times and the Washington Post. With the rise of the internet, the information floodgates opened and swept away the traditional gatekeepers.

But, according to Pariser, that flood of information isn’t flowing like we think it is. “What we’re seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did.”

While Pariser recognizes the wonders of the web, he calls for individuals to exercise more control over their search results. He also calls for programmers to encode algorithms with a sense of public life, a sense of civic responsibility. They need to be “transparent enough” to allow us to “see what the rules are that determine what gets through our filters.”

I think we really need the internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it’s not going to do that if it leaves us all isolated in a Web of one.

Do you search out different views and perspectives on the internet, or do you live in a filter bubble? If you’ve broken out of your bubble, what advice can you offer others to break out of theirs?

  1. dittoheadadt

    “…and the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did.”

    He had me until that.  What a joke.  “Editors” and “ethics” in the same sentence.

    I’d rather have algorithms show me what they think I want to see than have editors show me what THEY want me to see.

  2. D.C. McAllister
    dittoheadadt: “…and the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did.”

    He had me until that.  What a joke.  “Editors” and “ethics” in the same sentence.

    I’d rather have algorithms show me what they think I want to see than have editors show me what THEY want me to see.

    I know what you’re saying. I had that reaction too at first. But I think in the bigger context, you did have news editors who were at least dealing with newsworthy items (though, of course, oftentimes from their own perspectives). With the kinds of filters he’s talking about, we’re often not even getting something newsworthy. Like Zuckerberg said, “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” At least the editors of old weren’t writing about squirrels or stupid online videos, even if they were biased about the news. I think this is what he’s getting at.

  3. Z in MT

    I think this may be more true for people on the left than on the right.  I am pretty comfortable in my filter bubble on the web because I get exposed to the left through people at work and catching the network news once in a while.

    One area that I think has always been a blind spot for most of the elites (both liberal and conservative) is the rural-urban divide.  The vast majority of the elites in this country are from a few enclaves on the east and west coasts and have no understanding of the politics, motivations, and culture of people who live in small cities and rural areas of the interior.

  4. D.C. McAllister
    Z in MT: I think this may be more true for people on the left than on the right.  

    I don’t know. I know some people on the right who don’t have a clue about a variety of issues–important issues. They might vote the way I do, and stand for solid principles, but they’re missing something essential to engaging the culture. But that’s an issue for another time. I don’t mean to be critical, but I do think if some on the right got out of their bubbles more, they might be able to effect more change in our country, turn back the tide better in the culture war. Build more bridges. (I just read what I wrote; does that make me sound squishy?) But you’re right, as for the masses in the middle, they definitely live in bubbles–made up of TMZ and Entertainment Tonight.

  5. Midget Faded Rattlesnake
    Denise McAllister:

    This kind of personalization of searches “moves us very quickly toward a world in which the internet is showing us what it thinks we want to see, but not necessarily what we need to see.” Pariser calls this the “filter bubble.”

    I don’t feel constrained by it.

    To the extent that search services guess right about the information I need, that just saves me time. But I’m a weird enough person that search services often don’t guess right, so it’s habitual for me to be unsatisfied with the easy suggestions and dig deeper when I really want to find out about something. If I succeed in seeing only what I want to see, then that is my fault, not the internet’s.

    Online searches can be done so fast that it only takes a matter of minutes to do multiple searches on a given topic, using different word combinations. I find that’s one way to get around my preconceptions. Additionally, I almost never use my Facebook account, which probably helps.

    I do, however, find myself mildly baffled by all the Google ads I get for Mormon undergarments.

  6. KC Mulville

    You don’t know you’re in a bubble. That’s what makes it a bubble.

    Conservatives have less of a problem with this, not because conservatives are any more virtuous, but because the liberal perspective is so prevalent throughout the media. Conservatives are constantly smacking face-first into other perspectives. The one perspective we don’t encounter is conservatism.

  7. D.C. McAllister
    Midget Faded Rattlesnake

    Denise McAllister:

    This kind of personalization of searches “moves us very quickly toward a world in which the internet is showing us what it thinks we want to see, but not necessarily what we need to see.” Pariser calls this the “filter bubble.”

    I don’t feel constrained by it.

    To the extent that search services guess right about the information I need, that just saves me time. But I’m a weird enough person that search services often don’t guess right, so it’s habitual for me to be unsatisfied with the easy suggestions and dig deeper when I really want to find out about something. If I succeed in seeing only what I want to see, then that is my fault, not the internet’s.

    I do, however, find myself mildly baffled by all the Google ads I get for Mormon undergarments.

    Sadly, the low-information voters aren’t as responsible. They “see only what they want to see.” I think this explains a lot about the low-information voter.

    I didn’t know there was such a thing as “Mormon undergarments.” Do I even dare search to see what they look like?

  8. Skyler

    I’ve always believed that the internet will eventually be used to control us by controlling what information we receive and by reducing our laws to the least free jurisdiction.  I’m amazed to find that even that belief was optimistic.

  9. D.C. McAllister
    KC Mulville: You don’t know you’re in a bubble. That’s what makes it a bubble.

    Conservatives have less of a problem with this, not because conservatives are any more virtuous, but because the liberal perspective is so prevalent throughout the media. Conservatives are constantly smacking face-first into other perspectives. The one perspective we don’t encounter is conservatism.

    Good points. Sadly, it’s the people who aren’t political, not really thoughtfully liberal or conservative, who are the corralled masses. These are the ones truly controlled by this (and impacting our elections). Though I do think we are all affected. Like you said, you don’t know you’re in a bubble when you’re in a bubble. Sometimes I force myself out to read sites I never go to. I see that I am, in a sense, in a bubble. Not that I’m going to change my views by reading these other perspectives, but it does give me understanding, awareness. I think that’s important. But as far as issues like pure news, too many low-infos aren’t getting that at all. Zuckerberg’s right about the squirrels.

  10. Douglas
    dittoheadadt: “…and the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did.”

    He had me until that.  What a joke.  “Editors” and “ethics” in the same sentence.

    I’d rather have algorithms show me what they think I want to see than have editors show me what THEY want me to see.

    TED Talks are high up there on “Stuff White People Like” anyway.

  11. Cpad12

    You could try using a search engine that does not personalize search results.  One that comes to mind is http://www.duckduckgo.com.

  12. Larry3435

    Filters only affect you if you use search engines to get general information.  If you read a variety of right and left leaning sites, you find out what everyone is thinking.  It is easy enough to bookmark Huffington Post, the NY Times, etc., and find out what the other side is talking about.

    Aggregation sites, like Real Clear Politics, are also helpful.  Yes, they exercise editorial judgment, but you know where they are coming from so you can use what they offer without relying on them to carry through on some idiot promise, such as “All the news that’s fit to print.”

    I only use search engines when I already know what I am looking for.  The text of a speech, for example, or a particular video.  

  13. Midget Faded Rattlesnake
    Denise McAllister

    I didn’t know there was such a thing as “Mormon undergarments.” Do I even dare search to see what they look like?

    Oh, they’re very modest — a tailored tee shirt and “long johns” that end at the knee.

    The worst you’re likely to see during a search is a Photoshop job where some wag has pasted Mitt Romney’s head to the body of someone wearing a corset.

  14. JavaMan

    I see his point, but the fact of the matter is we need some sort of curation, be it algorithmic or human selection. I question whether someone as young as Eli can even remember what the internet was like before Google began to sort and sift it for relevance. It was basically crap. The drinking-from-the-firehose metaphor applied back then, and without services like Google it would be much worse now. Of course, algorithms are not just things that pop into existence without planning or purpose; they are the work of engineers who are intelligent and therefore have opinions and biases. It would be foolish to think the logic systems of the creation would not, to some extent, reflect the mind(s) of the creator(s). These companies bring us services we want in exchange for our attention. If they started ignoring our tastes and began presenting us with raw information and/or what they’ve determined socially responsible people should know, we’d get annoyed, stop paying attention, and they’d go out of business.

    Bottom line: we all need curation and interpretation of data because we have limited time and expertise.

  15. D.C. McAllister
    Douglas

    dittoheadadt: “…and the thing is that the algorithms don’t yet have the kind of embedded ethics that the editors did.”

    He had me until that.  What a joke.  “Editors” and “ethics” in the same sentence.

    I’d rather have algorithms show me what they think I want to see than have editors show me what THEY want me to see.

    TED Talks are high up there on “Stuff White People Like” anyway.

    Hmmm. I’m not exactly the elitist type, but there are some things I’ve learned from the TED talks. (They’re just one way out of my bubble). 

  16. D.C. McAllister
    Larry3435: Filters only affect you if you use search engines to get general information.  If you read a variety of right and left leaning sites, you find out what everyone is thinking.  It is easy enough to bookmark Huffington Post, the NY Times, etc., and find out what the other side is talking about.

    Aggregation sites, like Real Clear Politics, are also helpful.  Yes, they exercise editorial judgment, but you know where they are coming from so you can use what they offer without relying on them to carry through on some idiot promise, such as “All the news that’s fit to print.”

    I only use search engines when I already know what I am looking for.  The text of a speech, for example, or a particular video.

    I think the “filter bubble” really affects people who spend a lot of time on Facebook.

  17. genferei

    First, I am highly suspicious of claims that there was a time in the past when there wasn’t a bubble. In the past, the sources of information for the general public were so few that the bubble was hermetically sealed.

    Second, I’m wondering what the point of just being aware of things is. I know you say that’s a topic for another time, but, really, is the person who stopped reading at Aristotle (and Paul) really missing out on something compared to the person who religiously reads the NYT and National Review?

  18. D.C. McAllister
    JavaMan: I see his point, but the fact of the matter is we need some sort of curation be it algorithmic or human selection.

    Bottom line we all need curation and interpretation of data because we have limited time and expertise.

    I think, aside from programmers, it does start with personal responsibility. But that means informing people that they need to take responsibility on the web. The low-infos don’t. They don’t even know the information is being edited. They don’t know about filter bubbles. And sadly, it’s a vicious cycle. All these people we’re trying to get to vote Republican have their heads buried not in sand but in bubbles. But how do you get them out?

  19. Sisyphus

    I use a meta-search engine, startpage.com, that filters identifying information from the backend search engines. Startpage is privacy-centric and does not log IP data and the like.

    If you want to go a little more hard core, do an Internet search for the Tor Project. They provide a privacy hardened Firefox and a network of relays and internet gateways that make your packets appear to be coming from some random location across the globe.

  20. John Walker

    Always flush your browser of cookies before searching for something (I’d say disable storing them entirely, but many sites no longer work unless you compromise your security by allowing them) and do your searches through Tor or (more simply) the Tor Browser Bundle.

    You may be astonished to discover how different the world looks when seen from Google Singapore or another site chosen randomly based on which Tor relay your query popped out of.

    Note that some control-freak sites try to detect accesses through the Tor network and may do bad things to you if you identify yourself.  For example, the Mt.Gox Bitcoin exchange says that accounts may be suspended pending re-certification through their anti-money-laundering process if you log in through an IP address identified as a Tor relay.