The Filter Bubble: What the Internet is Hiding From You by Eli Pariser


Title : The Filter Bubble: What the Internet is Hiding From You
Author : Eli Pariser
ISBN-10 : 1594203008
ISBN-13 : 9781594203008
Language : English
Format Type : Hardcover
Number of Pages : 294
Publication : First published January 1, 2011
Awards : Goodreads Choice Award Nonfiction (2011)

An eye-opening account of how the hidden rise of personalization on the Internet is controlling - and limiting - the information we consume.

In December 2009, Google began customizing its search results for each user. Instead of giving you the most broadly popular result, Google now tries to predict what you are most likely to click on. According to MoveOn.org board president Eli Pariser, Google's change in policy is symptomatic of the most significant shift to take place on the Web in recent years - the rise of personalization. In this groundbreaking investigation of the new hidden Web, Pariser uncovers how this growing trend threatens to control how we consume and share information as a society - and reveals what we can do about it.

Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook - the primary news source for an increasing number of Americans - prioritizes the links it believes will appeal to you so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like "The Washington Post" devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos.

In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs - and because these filters are invisible, we won't know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.

While we all worry that the Internet is eroding privacy or shrinking our attention spans, Pariser uncovers a more pernicious and far-reaching trend on the Internet and shows how we can - and must - change course. With vivid detail and remarkable scope, The Filter Bubble reveals how personalization undermines the Internet's original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world.


The Filter Bubble: What the Internet is Hiding From You Reviews


  • Daniel M.

    I read this book because it’s very well-known, because he gave a famous talk about this at a recent TED conference, and because I work and do research on how people think about the information they get from the internet. In the end, Pariser and I both think about these things a great deal—he worries deeply and writes a book that has essentially one complaint in it. His complaint? Internet companies provide personalization services that distort/affect/limit what you can see and it’s hard to know what’s NOT being shown to you.

    He’s right in some ways, and even I worry about this. But the book feels to me like a collection of essay fragments that’s been amplified to book length.

    Here’s my outline of his book (chapter by chapter). You can see there are a number of repeated themes, but not a book-length argument that’s developed.

    1. The race for relevance
    - personalizing software agents and personalized results are bad
    - Why? Results might be manipulated
    - there’s a bigger problem with companies you don’t know collecting data about you (e.g. Acxiom)
    2. User is the content
    - user behavioral data (what you click on, what you read) is being collected
    - this info is used to drive personalized views of your internet experience
    - this causes the reading audience to split into many smaller camps
    - crowd decisions about what’s good are NOT very smart (dull and boring topics get filtered out)
    - how will the important stuff get covered?
    3. Adderall Society
    - confirmation bias exists (if you live in an info bubble, isn’t everything you see confirmation?)
    - filter bubble eliminates all variant views
    - this gives you a very biased view of the world
    - it gives you *focus* (which is good), but it’s like someone taking Adderall (implicitly bad)
    4. The You Loop
    - there is an identity problem—behavior tracking doesn’t always give a rich model of you
    - as a consequence, info is filtered for you and tends to lock in on one particular model of you
    5. The public is irrelevant
    - surprise! The news is manipulated.
    - the cloud is run by a small number of companies
    - outreach (e.g., in political campaigns) is limited to those who can be influenced
    6. Hello, World!
    - programming is important; you need to understand how algorithms work
    - internet use is voluntary, except when you need to compete against people who use it (then you're sort of pushed into it for competitive reasons)
    7. What you want, whether you want it or not
    - advertisers are really good at figuring how to get past your defenses
    8. Escape from the city of the ghettos
    - some ideas about ways to get around the filter bubble

    It’s irksome that the book is fundamentally a fairly haphazard collection of mini-essays on a small number of topics that don’t make strong arguments. The book has section titles like “The robot with Gaydar,” and then never says anything about “Gaydar” in the section. What should the reader take away from that? Or take a chapter like “The Adderall Society,” where the argument is guilt by association: he argues that increased focus on a task (such as might be provided by a filtering mechanism) is a bad thing because drugs like Adderall help some people do that. (Really? That’s his argument??) Or consider that Google’s image recognition technology is slammed even though Google did NOT launch it. (He seems worried that such technology exists at all, and drags Google into it not because they use it, but because it might be possible.)

    I also have to object to his style of writing. Page 201: “Google Research has captured most of the scholarly articles in the world.” Did he really mean “captured” in the sense of “to take control over”? Google Scholar (not Google Research) provides links to much of the world’s scholarly research literature, but that literature isn’t even stored on Google servers—the service is to provide an easily searchable index that gives links to the documents. They’re not “captured” in any sense.

    But this is the way the entire book is written: the language is negatively nuanced to make you feel that you’re being given an inside scoop on the evils of information filtering.
    If you take a step back, you realize that Pariser is fundamentally interested in how political ideas get munched in the filtering and personalization software. He’s worried (and in this I agree with him) that important stuff—laws, policy, regulations...all that boring, but deeply important political content—will be left out in a consumer-interest-driven information world.

    Pariser is longing for the days when a really great editor would pick and choose what you really need to know and put it on the front page for your edification.

    He seems to have forgotten all of the yellow journalism that preceded the golden age of “objective” journalism, and has an optimistic view that before automated information filtering and content tailoring we somehow could all easily detect sources of bias and we lived a life of pure objectivity and knowledge.

    That is, of course, nonsense. Everyone has always lived in a highly mediated world. Libraries (which we tend to think of as ultimately open and bias-free information sources) have ALWAYS been highly curated, selectively choosing what gets included in their stacks and offerings. Newspapers ALWAYS have had a political bias, sometimes evident, sometimes not. Compare France’s Le Monde with the New York Times, or with the Dallas Morning News, or with the LA Times, and you’ll see four very different takes on the world.

    Pariser longs for the day when we all read the same canon of literature and daily news. But note the fundamental contradiction—he worries that we’re all being pulled into separate information cells that are re-confirming our beliefs and diverse in the extreme, but at the same time he wants us to live in HIS filter bubble where the *important* (that is, important to him) information is force-fed to us whether we want it or not.

    Is this an important book? Probably, if only because it has surfaced some important issues. We DO need to be aware of the filtering that is being baked into all of our information services. But this has forever been thus: his book reminds us that we need to take this filtering seriously, especially now that the filtering is constantly changing. In the end, I actually agree with his recommendations that we become aware of the filters and that we take conscious action to not be simple passive consumers of everything that is wafted our way.

    I just wish he’d written a more organized argument about it and been less rhetorically inflamed by the whole thing.

  • Felicia

    Well, if you want to be terrified about how the web is scooping up information about us, stereotyping us, pigeonholing us, basically doing the opposite of what we thought the web was GOING to do for society, then read this book. At the very least, it helps you become informed about exactly what we do when we surf the web. Nothing is safe online. Everything you do online is defining you in ways you never thought you'd be defined. Everything you do is hackable. The future is even worse in those respects.

    Lots of fun paranoia-inspiring information for the tech-savvy!

  • Laura (Kyahgirl)

    3.5/5; 4 stars; B+

    The first half of this book is a solid 5 star read and I'd recommend it to anyone who wants to learn more about social engineering and the Internet and some of the ways we are heavily manipulated through our searches, likes, clicks.

    The author got a bit carried away and dragged the story out, taking away some of the impact so that is why I didn't give it a 5 star all the way through.

    I think an important message is that people have to be diligent about looking for information: don't believe everything you read, and don't let someone else's filters and algorithms lead you down the garden path. I think a lot of people are fundamentally lazy, and therein lies the danger in the filter bubble. Many, many people are perfectly content to sit back in their figurative easy chair and soak in whatever 'truth' is being fed to them. The real strength of this book is that it points out, in a clear and understandable way, all the many things being done to our information stream that can alter the truth we perceive.

    Well worth the read. I'd recommend that anyone who uses the Internet give this book a read.

  • Kristen

    The big message in this book is that the way "curators" of information on the Internet, like Google and Facebook, use personalization has significant negative consequences. If I search for something on Google, I am going to get results tailored to where I am and who Google "thinks" I am. Pariser argues that we are less and less confronted with ideas we don't agree with or new and surprising ideas.

    The biggest issue is not even that the personalization is happening, but that it is completely opaque and there is no way to "opt out" and say, "Hey, I want to see some new stuff!" He also actually provides suggestions for how to deal with this that don't involve saying "let's go back to not doing it," which I really appreciated.

    I was never quite sure that Google did this, but I did see that when I was looking for research, results from the local university where I got my graduate degrees came up more often than I would expect. I have no reason to want those results to come up; I want the thought leaders, which they may or may not be. On the other hand, I certainly appreciate the efforts to prioritize results for me; there's too much out there to do it myself. But, as Pariser points out, it is not clear what their model of me is, and I have no way to judge it.

    Good read!

  • Atila Iamarino

    I started reading this on a recommendation from @luisbrudna and didn't regret it. I expected a whole book on this topic to be somewhat repetitive, or to retread what has already been said in other books of the genre, such as The Information. Not even close. Excellent argumentation, great examples, a solid grounding in cognitive science. So much so that, even though it's from 2011, it isn't outdated. And best of all, whenever Eli Pariser cites a theory from cognitive science, he explains it briefly without re-describing everything, pointing to a more complete source. Easily among the best books of this year.

  • Matt Maldre

    Very interesting book. Here are the notes I wrote in the margins while reading it on the Kindle.

    -------------------
    Page 15
    Note: This is why I love going to libraries. The chance encounter of a new topic you never thought of exploring. (256)
    -------------------
    Page 17
    Note: I need to go to town hall meetings (279)
    -------------------
    Page 20
    Notes on this intro: I don't mind companies targeting me, as I live my life with a largely transparent attitude. However, the author makes a very good point that we each end up in our own bubble. Now, all this info gathering is what I totally want our strategy to be at work; we need to know our customer. It blows my mind that ppl don't see that at work, or don't care. (316)
    -------------------
    Page 29
    Quote: “When you log in after a day reading Kindle e-books at the beach, Amazon is able to subtly customize its site to appeal to what you’ve read:”

    Note: Ha! I'm reading this on a kindle app now! Hi Amazon! (423)
    -------------------
    Page 49
    Quote: Now all that was changing. One executive in the marketing session was especially blunt. “The publishers are losing,” he said, “and they will lose, because they just don’t get it.”

    Note: So true. This carries over to the syndication world as well. Ppl don't get that you have to make your content reach a demographic. Your content can't just be general generalness anymore, like how newspapers behave. (661)
    -------------------
    Page 83
    Quote: Stumbling on Happiness author Dan Gilbert presents volumes of data to demonstrate that we’re terrible at figuring out what makes us happy.

    Note: But God knows (1080)
    -------------------
    Page 85
    Quote: ensure that we aren’t constantly seeing the world anew:

    Note: I wonder if I have poor schema, cuz I often see the world anew. Or perhaps my schemata is flexible. Or maybe I have a bunch able to be referenced like a library. Hmmm I think it's the flexible one. I don't have a deep library brain. My brain is not strong in brute force, but more nimble and flexible. (1099)
    -------------------
    Page 85
    Quote: Schemata can actually get in the way of our ability to directly observe what’s happening.

    Note: What we called in art critiques, "baggage" the viewer brings (1105)
    -------------------
    Page 91
    Quote: contents. But to feel curiosity, we have to be conscious that something’s being hidden.

    Note: Kinda like how I'm going to start hanging the "what is your treasure" tag outside the treasure chests (1175)
    -------------------
    Page 91
    Quote: Stripped of the surprise of unexpected events and associations, a perfectly filtered world would provoke less learning. And there’s another mental balance that personalization can upset: the balance between open-mindedness and focus that makes us creative.

    Note: Yes! The flexible mind for creativity! It makes me thankful that right after college I made it a goal to live a creative life and to show others how to live creatively. I researched and devoured books on creativity. It makes me glad that I did that at a young age, it set me up to be where I'm now, and where I'm going. (1185)
    -------------------
    Page 92
    Quote: Stripped of the surprise of unexpected events and associations, a perfectly filtered world would provoke less learning. And there’s another mental balance that personalization can upset: the balance between open-mindedness and focus that makes us creative.

    Note: Whoa. Adderall cuts down creativity? AWAY ADDERALL AWAY! (1191)
    -------------------
    Page 93
    Quote: Farah, the director of the University of Pennsylvania’s Center for Cognitive Neuroscience, has bigger worries: “I’m a little concerned that we could be raising a generation of very focused accountants.”

    Note: Lol our poor accountants! (1202)
    -------------------
    Page 93
    Quote: definition, ingenuity comes from the juxtaposition of ideas that are far apart, and relevance comes from finding ideas that are similar.

    Note: Long live wikipedia and its abilities to make us curious about random topics (1206)
    -------------------
    Page 93
    Quote: “By definition, ingenuity comes from the juxtaposition of ideas that are far apart, and relevance comes from finding ideas that are similar.”

    Note: Hmmm interesting point. I agree on that definition of ingenuity/creativity, but couldn't creativity also be bringing together relevant things? Or am I watering down what creativity is? (1210)
    -------------------
    Page 99
    Quote: One could tie a string to the barometer, lower it, and measure the string—thinking of the instrument as a “thing with weight.” The unamused instructor

    Note: Ha! (1286)
    -------------------
    Page 100
    Quote: Avoid smartass physicists. But the episode also explains why Bohr was such a brilliant innovator: His ability to see objects and concepts in many different ways made it easier for him to use them to solve problems.

    Note: It's too bad there are so many narrow-minded people. But that doesn't bother me much, I can still do my creative thing. (1293)
    -------------------
    Page 100
    Quote: The kind of categorical openness that supports creativity also correlates with certain kinds of luck. While science has yet to find that there are people whom the universe favors—ask people to guess a random number, and we’re all about equally bad at it—there are some traits that people who consider themselves to be lucky share. They’re more open to new experiences and new people. They’re also more distractable.

    Note: Lol. I'm distractible. I wonder how many of the hundreds of people here so intently focused on the music are distractable. (1296)
    -------------------
    Page 102
    Quote: Creative environments often rely on “liquid networks” where different ideas can collide in different configurations.

    Note: This is SO why I want flexible workspaces at work (1315)
    -------------------
    Page 121
    Quote: an awful day in the near future, Pandora might know to preload Pretty Hate Machine for you when you arrive.

    Note: If I'm in a bad mood, I want to hear something positive. I don't want to continue in my rut. That's like being stuck in a pit, and not coming out (there's a Scripture reference about the pit in Psalms) (1552)
    -------------------
    Page 121
    Quote: it can also be used to take advantage of your psychology.

    Note: It goes to show that you choose your mood. Circumstances can be hard, but ultimately you choose how to handle it. Like in James: are you a ship that is tossed to and fro by every prevailing wind? Your attitude is determined by choices made over a period of time. (1552)
    -------------------
    Page 122
    Quote: spring for the slicer-dicer that they’d never purchase in the light of day.

    Note: I see what he's saying, but 3am is the least likely time I would buy. I want to do research first. I don't fear this at all. But it does suck for those who are prone. (1558)
    -------------------
    Page 122
    Quote: it’s not such a stretch to imagine political campaigns targeting voters at times

    Note: Notice the proliferation of political tv shows on Sunday mornings, the time when we spend worshipping what is most important to us. (1565)
    -------------------
    Page 123, 124
    Quote: your phone should figure out what you would like to be searching for before you do. In the fast-approaching

    Note: Ha. Right, my mind is way too all over the place when I'm walking around in public. BUT I wouldn't mind hearing interesting trivia about certain places. It would also be neat to have a record of all the places I've placed public art, for people to see even after it's gone. (1579)
    -------------------
    Page 124
    Quote: Westen, a neuropsychologist whose focus is on political persuasion, demonstrates the strength of this priming effect by asking a group of people to memorize a list of words that include moon and ocean. A few minutes later, he changes topics and asks the group which detergent they prefer. Though he hasn’t mentioned the word, the group’s show of hands indicates a strong preference for Tide.

    Note: Is that because Tide has a logo shaped like the moon or because it's the moon/tide moving thing? Maybe I use Tide because I tend to do my laundry at night. +rolls eyes+ (1590)
    -------------------
    Page 126
    Quote: Though there are people whose posts you’re far more interested in, it’s her posts that you see.

    Note: I totally want to be able to control this. That's why I set up lists, but it doesn't show all that I want. :-/ (1606)
    -------------------

  • Ignacio Izquierdo

    If you and I run exactly the same search on Google, we will get different results. In the tangle of information the internet has become, finding exactly what you want can be tremendously complicated, so websites have set out to get to know you better. From the information you leave behind as you browse, a fairly precise profile can be built of who you are and what your interests are. The breadcrumbs you leave behind are clicks on Google, "likes" on social networks, and a long list of cookies in your history. Say, for example, you search for "panther." If you are a nature lover, it will prioritize results about the big cat; if rock is among your main interests, the first result will probably be the heavy metal band; and if instead you are a sports fan living in North Carolina, Google has no doubt: you want information about the football team.

    Let's set aside for a moment the somewhat unsettling matter of everything we reveal involuntarily and what a machine can know about us. Let's also set aside the fact that we are neither what we click nor what we share (all of that will be refined with the next technological advances). Thinking about it coldly, the truth is that, as an idea, it's not bad at all.

    But it has a series of drawbacks worth stopping to reflect on. The first is that the better the algorithms and the more precise the search results, the fewer chances we will have of stumbling upon things we weren't looking for, things that are part of our personal enrichment. After all, when the web took shape, for many it was the best way to share ideas, to learn how people thought on the other side of the world. But if our interests stay confined to only what interests us at this moment, how will we be able to open ourselves up to new ideas?

    Let me explain with an example: the news, digital newspapers. Media outlets want you to spend as much time as possible on their pages, so that this can be turned into powerful statistics to sell to their advertisers. For them it may be an excellent idea that, as soon as you open your newspaper, the front page is already personalized for you according to your interests. Why take up space with economics stories you never click on? Or with distant wars you don't pay much attention to either? Maybe you like national politics, or science and technology news, so... why show you the rest? It would only bore you and drive you to another site. I think the consequences of this are easy to grasp, something that is already happening but could get even worse: certain genuinely important news stories from around the world will be lost, without even getting two lines and a headline that resonates in the slightest.

    Little by little, the web, that vast, unfathomable ocean of information, ends up filtered into a bubble in which you are not exposed to new ideas. You become an echo chamber of yourself. Debate disappears. "You click on a link, which indicates an interest in something, and that means that in the future you will be more likely to read articles about that particular topic, which will have been prepared for you in advance. You are trapped in a loop about yourself."

    Another consideration: we have started from the assumption that the algorithms exist to help you filter information, relying on the good faith of companies like Google or Facebook. You will agree that this assumption is rather dubious, and that this information is flowing in torrents across the web, being sold to advertisers. What happens if there are companies capable of getting inside that bubble? What if they are political parties? "Democracy only works if we, as citizens, are capable of thinking beyond our narrow self-interest. For that we need a shared view of the world we live in together."

    I found this a tremendously interesting book, one that opens the door to a host of debates, not only now but in the years to come. (And by the way, I thought it pairs very well with many of the ideas laid out in Yuval Noah Harari's Homo Deus.)

  • Richard

    It's ironic how I became aware of this book and read it, given the topic of filtering and personalization. I found this book serendipitously. I was in the public library waiting for a workstation to open up, standing at the beginning of the non-fiction book section. This book has Dewey decimal number 004.678, right at eye-level where I happened to be standing, idly waiting. "Oh," I thought, "This looks interesting." I flipped through it and decided to check it out and read it. Just the kind of chance encounter the author says will NOT happen in the future, when every aspect of our lives is filtered and personalized for us.

    It's ironic even further as I discovered that the author is (or was) the board president of MoveOn.org, therefore, has a political view very different from my own. Well, I thought, this book isn't about politics, so I'll invest the time and see what he has to say. I was rewarded for that time. A few incidental references excepted, the author Eli Pariser treated his subject in a very even-handed and thoughtful way.

    At first, I took "you" in the subtitle of the book "What the Internet Is Hiding from You" to mean the "collective you", in other words, all of us. But no, he means each person's online displays are different from those of everyone else, therefore, preempting what some other people would see.

    In a nutshell (and I don't really think this is a spoiler), even as of 2 years ago when this was written, personalization is more ubiquitous than you might think, and the ramifications are far more widespread. Pariser poses interesting questions, including, how can we have a national culture when we no longer have common experiences, common information, and common frames of reference?

    It's kind of interesting, at the end of the book the author doesn't really have any prescriptions to fix the problem or deal with it. He talks about a few things he thinks won't work, like the national Do Not Track registry. And he suggests generic things like contacting your Congressman to express your concern about the issue. To be fair, I don't expect every author to know how to remedy problems that they write about, but it did make the book a little anti-climactic.

    This book shares a flaw with most other sourced non-fiction books today. The author makes use of end notes instead of footnotes. I'd rather see the source of the information on the page on which it appears rather than at the end of the book with no visual indication in the text that a note even exists at that point.

  • Angie Boyter

    NOTE: A month after writing my original review I changed my rating from 4 to 5 because of how it has stayed with me and the number of interesting conversations I have had about it.
    In the introduction to The Filter Bubble, Eli Pariser delivers a very thought-provoking message: the internet is getting better and better at knowing what we want and personalizing what we see, and that is not necessarily a good thing. We all want searches and websites to show us what we are after, but the more our computer experience is personalized, what will we NOT see? And what are the consequences of that? After reading his introduction, he had me convinced that this is a tricky issue, and I wondered what was left to say in the rest of the book. The answer is, “A lot”, and Pariser says it very well.
    Pariser explores current-day influences on the internet, from Google, Facebook and Amazon to lesser-known but important players like Acxiom, and explores possible future “enhancements”, with their advantages and their dangers. He does an excellent job of explaining the cognitive biases and other thought mechanisms that make personalization a problem and of describing the effects on various aspects of our lives and society. His research was broad and impressive; he quotes sources from John Stuart Mill to Walter Lippman to Dan Ariely (WARNING: This book cites so many other interesting-sounding works that your Want to Read list is likely to grow) .
    If there is a weakness it is in the relative lack of solutions, but that is not surprising. I wouldn’t expect an easy answer to a question as complex as this one, but at least he has done a good job of raising it. This is the kind of book I will recommend to a number of my friends, all for different reasons, and if enough people become aware of the issue and all of its ramifications I am hopeful that we can maximize the utility of the internet while avoiding the worst of the pitfalls.
    Read again December 2014 for The Sunday Philosophers

  • Betsy

    A very important book for anyone who uses the internet. The big providers -- Facebook and Google especially -- filter the content they present to you, without telling you and without your permission, even if you think you've elected to receive everything. They do it in the name of personalization, but it's largely to serve advertisers, and it affects your online experience in insidious ways.

    This book is short, well-written, and easy to understand. Although written by a well-known liberal activist, it is not a political book. Anyone who is concerned about freedom and control over his or her own life should read this. At the very least you should be aware of the issues. But also consider carefully the solutions Pariser suggests -- they are logical and reasonable and entirely within our means if we don't wait too long.

  • Raghu

    The Mosaic browser unleashed the internet boom of the 1990s. The National Center for Supercomputing Applications (NCSA) at the University of Illinois in Urbana–Champaign developed it in late 1992 and released it in 1993. It was a 'killer app' that brought a graphical user interface to our quest to search and navigate the exploding wealth of distributed information on offer. Edward Snowden says that the Internet was mostly made of, by, and for the people until about 1998. Its purpose was to enlighten and not to monetize. It was administered more by a provisional cluster of perpetually shifting collective norms. Snowden believes that this was the most pleasant and successful anarchy he had ever experienced. Fast-forward twenty-five years, and we have this book by Eli Pariser. It tells us that there are algorithms at work which sabotage our access to the open and free Internet of the 1990s. What we get now is what the algorithms think we want to get. How did we get to this bizarre world in just two decades? Is it true that this is what has become of the Internet, or is it another false futuristic projection?

    Before answering the above questions, let us look into what Pariser says about 'the Filter Bubble.' The Internet explosion of the 1990s ushered in a new era of open access to information as it was, direct from the producer to the consumer. Independent online magazines and websites flourished, and it looked as though we could break out of a filtered universe managed by the media companies. Alas, Pariser says that, on the contrary, it is getting a lot worse than before. Inscrutable algorithms have replaced the human sentinels of the past with even more control. They scrutinize what kind of searches we make and what search results we click on. They know what websites we visit and what we buy online. They listen in on what we say on various issues in our emails and blogs, what news stories we read, and what books we buy. Based on these and many of our other online actions, they generate a composite profile of who we are and then use it to filter our online experience. As a consequence, when we search on Google for 'Climate Change Issues,' we get a different set of results from our contrarian friend.

    The filters operate on a three-step process. First, the algorithms figure out who we are and what we like. Then, they provide us with content and services that they think best fit us. Finally, with more and more online activity, they refine it to get the fit just right. The end product is that your identity shapes your media. Is this a problem? Pariser says that this is dangerous for our democracy. The reason is that democracy requires citizens to see things from more than one point of view. Instead, these filters enclose us in our bubbles and offer us parallel but separate universes. Personalization filters influence us with ideas which we already hold. They amplify our desire for things that are familiar and leave us oblivious to the contrarian world beyond.
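    The three-step loop the review describes (infer a profile, serve matching content, refine with each click) can be sketched in a few lines of Python. The data and scoring below are invented for illustration; real recommenders use far richer signals and undisclosed models.

```python
from collections import Counter

def recommend(history, catalog, k=2):
    """Toy sketch of the three-step personalization loop.

    1. Infer a profile from past clicks.
    2. Serve the catalog items that best match the profile.
    3. Each new click feeds back into history, reinforcing
       the profile -- the 'filter bubble' effect.
    """
    profile = Counter(topic for _, topic in history)           # step 1
    ranked = sorted(catalog, key=lambda item: profile[item[1]],
                    reverse=True)                              # step 2
    return ranked[:k]

# A user who mostly clicks politics stories...
history = [("story-a", "politics"), ("story-b", "politics"), ("story-c", "sports")]
catalog = [("story-d", "politics"), ("story-e", "science"), ("story-f", "politics")]
picks = recommend(history, catalog)
# ...is shown still more politics; step 3 is the user clicking these picks,
# which would append them to history and narrow the profile further.
```

    The point of the sketch is the feedback: nothing ever pushes the science story into view, because the profile is built only from what was already clicked.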

    The author also offers a solution. He says that the Internet must provide us not only with what we want to see but also with what we need to see. We should get a plethora of information that includes ideas that challenge our notions and make us uncomfortable. This requires that the filtering algorithms embed in themselves a broader sense of social responsibility. That is the only way to do justice to the original goal of the Internet: flooding us with new ideas, new contacts, and contrarian outlooks.

    When I finished reading the book, I found that I am not at all in agreement with either the author's thesis or his solutions. I have been an active user of the Internet for the past twenty-five years. Contrary to the author's belief, my experience is that we now hear more diverse voices than ever before. Independent studies also show that most people do not live in echo chambers and 'filter bubbles' created by Facebook or Twitter. I shall try to put forward the reasons for my conclusions below.

    This book suggests that we once had a filter-free world and that the Internet giants have taken it away. The truth is that we never had a golden age of a 'filter-free information world.' Before the Internet of the 1990s, in democratic societies, information was disseminated to the public mainly through newspapers, radio, TV, and magazines. However, the owners of these media and their journalists controlled and edited what we saw and read. Hence, we always got only a filtered view of the world, depending on which parts of the media we favored. For example, a liberal might read one or more of the NY Times, the Washington Post, The Guardian, The New Yorker, Le Monde, and so on. In visual media, the choice would have been CNN, MSNBC, or BBC for news.
    Similarly, people on the conservative side chose their options in the media. Our social circles consisted mostly of family, friends, colleagues at work, and neighbors who probably served partially as echo chambers. A majority of us lived in bubbles like this, created by ourselves. These were perhaps the reasons why it was easy for the government and the media to convince us that Saddam Hussein had WMDs. Weapons inspectors like Hans Blix and Scott Ritter had repeatedly stated in public that they had disarmed Saddam already. But they did not get through the filters. Going back to the 1950s, we had the media creating echo chambers to enable Senator McCarthy to create the paranoia of 'a communist under every bed.' In the next decade, it was the panic that the Soviet Union had opened up a dangerous lead over the United States in the deployment of intercontinental ballistic missiles. In the 1980s, it was the Japanese who were supposed to take over our electronics and chip industry. To get exposed to diametrically opposite and different ideas, one had to read various newspapers or magazines or books. All the contrarian views cost money to access, and so most people ended up just spending on media that reflected the world view they already had.

    Contrary to the author's thesis, there is evidence to support the view that people find themselves less and less in a filter bubble since the new internet era began in the 1990s. Three independent studies, according to the BBC, have been conducted since 2013 to evaluate the claims of echo chambers and filter bubbles. Seth Flaxman and colleagues at Oxford University examined the browsing histories of 50,000 users based in the US in 2013. They found that social media and search users who landed on Breitbart and Fox News were also more likely to visit sites expressing opposing viewpoints. Their media diet was more varied, overall. Flaxman says that social media, by its nature, exposes you to several other sources, increasing diversity.
    Similarly, a Pew survey around the 2016 US Presidential Election broadly agreed with Flaxman's findings, with the majority of people reporting a range of opinions in their social media feeds. And the University of Ottawa's Dubois came to similar conclusions with her studies as well. In an ironic twist, a team led by Christopher Bail at Duke University measured a group of more than 1,600 Twitter users' political positions and came to some startling results. They found evidence that well-meaning attempts to counter the echo chamber and filter bubble could lead to more political polarization!

    Moreover, it is not as though we are in a vice-like grip of the Filter Bubble. As a political liberal, I can access conservative and right-wing perspectives easily and without cost by visiting websites that promote them. I can also check the veracity of their claims immediately by going to a fact-checking website. For example, Duke University, NC, maintains a database of fact-checking organizations. It tracks more than 100 non-partisan organizations around the world. The criteria for inclusion in the database are also specified, so that we can make our own decisions about trusting the websites.
    As for getting diverse views on a given question, today's Internet offers a number of ways.
    When we read a news story, we can also read the Readers' comments column. Often, this section exposes us to critical views and comments from people of different backgrounds, outside our social circle. It is especially so in news sites like NYTimes, WP, and The Guardian. Such a possibility did not exist in the pre-internet era. In the past, the readers' responses were posted only a day or two later, when the story was already cold. Even then, we got to read only a few of them. Nowadays, we get exposed to hundreds of opinions on each news story.
    Secondly, Google itself has course-corrected after criticism of the filter bubble. A year ago, to give readers a full range of perspectives, the Google News app added a 'top stories' option in addition to the personalized news. Articles included there are selected according to how trusted their source is rather than the user's preferences.
    Thirdly, Twitter is unfiltered. We can follow people on Twitter who are active in areas of our interest but hold opinions different from ours. Similarly, on Facebook, we can keep friends whose contributions we do not like but which represent a good counterpoint, and use the "Show First" function to keep them prominent in the news feed.
    Fourthly, we could use DuckDuckGo to search instead of Google. DuckDuckGo does not track us. However, in my experience, search results from Google are superior to the ones from DuckDuckGo.
    Fifthly, we can stop using 'Sign in with Facebook/Google/Twitter' while logging on to other sites, to prevent profiling.

    In conclusion, I think the fears of filter bubbles and echo chambers are overblown. Media has always shaped our identity. In the past, once we chose the media to tune into, its editorial board decided on the filter. Now, based on our online behavior, an algorithm selects the filter. In my view, this is a better situation for us. We have control over how the algorithm profiles us. If it correctly profiles me as a liberal, then I can watch a few NPR videos on Steve Bannon and his interviews to broaden my profile. I could search for papers on 'Climate change skepticism' and read them. I could regularly read the London Times. Such online actions would diversify my profile. I can even throw a wrench in the works by googling something nutty to mess up the personalization algorithm and thereby randomize my profile. The simple fact is that those who always strove for diversity in their world view would always seek it and find it, whether filter bubbles exist or not.
    On the other hand, those who preferred to live in filter bubbles and echo chambers will continue to do so even if the Internet offered diversity. It is not the job of the filtering algorithms to shoulder a broader sense of social responsibility. The original goal of the Internet is to flood us with new ideas, new contacts, and contrarian outlooks. It is still so, and it is our responsibility to tap it.

  • Alan Calvillo

    I feel that we sometimes fail to grasp the power the big tech companies hold and how positive a view we have of them. Anywhere we turn, we can see someone interacting with them.

    I think this book is a good starting point for beginning to grasp the dangers, both individual and collective, of the automation and algorithmization of our lives.

  • Ani Hakobyan

    The late Anton Nossik said of this book that it is a Bible for everyone who wants to be scared to death of the Internet, of Google along with Facebook, and to believe in a global conspiracy. Every paranoiac should read it and learn it by heart)) Pariser wrote about the filter wall and personalization about ten years ago. Ten years ago the American internet had exactly the same stuffy atmosphere we have today in Facebook-Armenia.

    In short, we're going on a digital diet and waiting for the emergence of the digital eco-activists Pariser predicted.

    More seriously, though, I should probably go clear my cookies and write a serious piece about this book.

  • Mark

    Eli Pariser argues in The Filter Bubble that the "rise of pervasive, embedded filtering is changing the way we experience the internet and ultimately the world." Now that companies can aggregate our web behaviors, likes, and purchases, online profiles of web users can be built that can be profitably sold to interested parties. This book therefore covers two issues: the total personalization of delivered web data, and the nature of these created web personas.

    Regarding the first issue, I'm not as concerned as Mr. Pariser. He clearly describes the architecture underlying the web personalization process, and demonstrates how it can result in vastly different web experiences for different people, based on their interests expressed on the web. In one example, he suggests that two people googling the exact same term could receive different results custom-built to their web-perceived selves. Great, on the one hand, if you are looking for a local restaurant, but maybe not so much if you are looking up, say, a politician.

    But this presupposes an absolutely passive approach to the web. I don't get my news from the Google news reader, where articles will be served up for my interest. I get my news from the New York Times, the Washington Post, and the Wall Street Journal, where I will be exposed (at least so far) to the same stories as everyone else. If the day comes that they start tailoring their home pages according to what I've clicked on and read, I guess I will need a new strategy.

    The second issue is of much greater concern. As the author points out, "While the internet has the potential to decentralize knowledge and control, in practice it's concentrating control over what we see and what opportunities we're offered in the hands of fewer people than ever before." Not just fewer people, but data-aggregating businesses that are largely invisible to the public. Mr. Pariser calls for the creation of a federal body to oversee the use of private information, and legislation along the lines of the Fair Credit Reporting Act. Achieving this kind of regulation, though, will take an educated electorate, and The Filter Bubble does a great job of laying out the issues.

  • محمود أغيورلي

    "The Filter Bubble" by Eli Pariser discusses the problems and downsides that result today from the filtering algorithms used by search engines and social media sites, and how those algorithms change the nature of social relationships and the way we think and analyze. This remarkable book offers hundreds of examples and incidents in which massive data-collection operations intervened in economic, political, and social decision-making, leaving the author astonished at the vast hidden world expanding rapidly behind search and social media sites. The book focuses broadly on three central downsides of seeing and following only what matches the user's own choices or the prevailing popular trend. First, the algorithms give teenagers a false picture of the world, a peaceful one resembling the user's own choices, while the real world is not like that. Second, because the algorithms undercut ideas that run counter to the user's, they limit creativity, thinking, and the contest of ideas. Third, by making available only a sliver of a vast and varied body of information, the algorithms delude users into believing they have attained knowledge, so they settle for that and slacken in the pursuit of genuine inquiry. Overall this is a very distinguished book that richly deserves to be read, especially since the world of algorithms will soon become the only world available to coming generations, and it may one day determine right from wrong, truth from falsehood, the suitable from the unsuitable! My rating: 5/5

  • Mara

    Please, please, guys, read this book. It's your future and your data we're talking about.

  • Kerry

    This book holds up even though it was published several years ago. While some info may be outdated, the book focuses on the psychology and other larger issues behind our use of technology/the internet/social media, and so it still has a lot to offer by way of encouraging critical thinking about what companies want out of us and how it shapes our behavior. The book is well-written, engaging, and largely still relevant, and where it isn't, it's curious to see how the future diverged from what was predicted and where we are, in fact, today since the time of the book's publication.

  • Nancy Mills

    Interesting and important topic. Well written.

  • Hector Bosch

    A very good book. I recommend it to everyone, whatever their background. It not only explains how the network works today and how transparent we are; its theoretical grounding is tremendous, and you learn from it in every sense. I have no doubt I will return to it again at some point in my life.

  • Jacob

    A better entry than The Googlization of Everything (this one actually references that book), but it still can't escape the "being alarmist but not having any real catastrophic consequences to point to" trap. It's getting kind of easy to recognize the arguments: first there's this fact which is kind of unsettling, then that fact which is kind of unsettling, and then there's the launch into a fear of something extreme which doesn't really follow from the basic facts. And, one of my pet peeves, it certainly doesn't account for any human or societal reaction which might occur to prevent or respond to such catastrophes.

    A lot of the basic issues which have arisen are good ones for people to be aware of; people should know, for example, that when they use a free service it's because the company behind the service wants to make money marketing (or helping other companies market) to them. People should be aware that their search content is being tailored to their previous browsing experience and what the search engine thinks it knows about them. And people should be aware that decisions are being made about them based on their data which they are unlikely to find out about.

    However, it's a long way from the beginning of these trends to the "sky falling" scenario of companies ruthlessly controlling or manipulating us. The first reason is because, although companies are collecting data so they can display ads you are more likely to click on and make a purchase, for example, they're... not great at it yet. And it will be quite some time before they really are. The next reason is because the author, like others, doesn't factor in what human response might be as this change occurs. There are a lot of reactions that might change the effect of companies tracking our behavior in the future, and very little time is spent anticipating that.

  • Naum

    Admittedly, upon initial reading, I began by sharpening the cutlery, prepared to launch into critical invective about this book. But it was not a terrible read at all, and Mr. Pariser struck salient notes at a number of points.

    I just reject the overt thesis that personalized filtering is the great 21st-century media Satan. Yes, the lack of serendipity is of some concern, but it is not the petrifying bogeyman the book makes it out to be; the fear that occupies most of the book is way overblown in an age where a discerning user can still discover and ferret out a panoply of diverse perspectives, reports, and views.

    However, this tidbit (which I shall condense in a single quote from the book) should trigger an alarm for all -- how code embeds then dictates the parameters of societal interaction:

    We live in an increasingly algorithmic society, where our public functions, from police databases to energy grids to schools, run on code. We need to recognize that societal values about justice, freedom, and opportunity are embedded in how code is written and what it solves for. Once we understand that, we can begin to figure out which variables we care about and imagine how we might solve for something different.


    But, still, this is at best a long essay that's been extended into book form. The argument would have been better served by further editing.

  • Kate Woods Walker

    Although much-discussed in the past year and oft-quoted amongst the websites, blogs and message boards I frequent, The Filter Bubble by Eli Pariser, for me, was a rather plodding look at internet "personalization" trends. I found myself putting the book aside and forgetting to take it up again, perhaps due to the immediacy of the internet itself, which made much of what Pariser presented already old news to his intended audience.

    But it was a good, solid book about an important subject, so perhaps the fault was more in the attitude of the consumer rather than the presenter.

    With the continuing death of the old school publishing that feeds internet news, it's too soon to know what will become of news content. I can't help but agree with the author's conclusions that ideological ghettoization will continue, and people will continue to enjoy having their beliefs reinforced.

    His solution? We should take care to consume information off our usual beaten tracks. Be unpredictable online. Run zig-zag, so to speak. And I'd also add to that list, lie boldly, frequently and colorfully about who you really are. But only online, and only to the right people.



  • Margaret Sankey

    Pariser dissects the dark side of the algorithms that allow search engines to guess what we want--the results aren't just tailored to what we want, but to what advertisers and perhaps more nefarious editors want us to see, not to mention the extremely easy habit of only reading what we agree with or what back-fills our own confirmation biases. While I am not sure that there is a technological or regulatory solution for even the privacy aspects of this, it speaks to a drum I am constantly pounding: all media, from cave paintings to old-school newspapers to the Huffington Post, are edited in some way. Unless you know where information comes from, how to evaluate it, and even how to seek out quality information that you disagree with or that is in a style you don't like, you're adding to the problem, no matter how self-righteous you feel about being well-informed.

  • Maria

    Who doesn't like something individualized for them? Facebook, Google, and Amazon, not to mention every other website, are busy trying to tailor and personalize their customers' experiences. But if your search engine gives you different results than everyone else's, how do we build a public community of shared facts?

    Why I started this book: It was a short audio, and as a librarian, the topic of information access is always interesting to me.

    Why I finished it: I didn't know about some of these algorithms that the websites and software are using. And I didn't realize just how much companies are tracking me online. If you want to freak yourself out... read this book.

  • Amanda

    Such an interesting book. If you use the Internet in any capacity (hint: if you’re reading this, you do), you should read this or something like it. Hopefully there is more awareness around this than there was when the book was written, but sadly many people don’t realize how much personalization has closed their worlds in on them. I think most people see it with targeted ads, but thanks to confirmation bias they are oblivious to it when searching for news or forming a general world view, especially in regard to Facebook. Another reminder that you need to be active, not passive, while searching for information about the world.

  • Huda Ak

    Everyone should read this.

    Delete your web cookies and web history often.

  • Luis Brudna

    From the title, and knowing a bit about the subject, I expected the arguments to be weak. I was wrong. The book was a pleasant surprise.

  • David Dinaburg

    The appeal of The Filter Bubble isn’t in the oft-disheartening revelations about internet companies tracking data; that’s established and not surprising to most people. “When you read your Kindle, the data about which phrases you highlight, which pages you turn, and whether you read straight through or skip around are all fed back to Amazon’s servers and can be used to indicate what books you might like next.” What is revealing is how that targeted personalization is beginning to edge out what most of us consider “the Internet” to the point that there is no “the Internet”; there is “your internet.” The Filter Bubble concerns itself with the social—not technological—consequences of the increasingly unique user experience of the Internet.

    SimCity was released recently, and it had a terrible launch. Customers couldn’t connect to the game’s server network, a requirement to play. It’s hard to say why the game is having so many problems, but SimCity’s inability to maintain a workable online network in this modern era of cloud computing seems to have upset some of their consumers. When a possible non-technical reason for the current kerfuffle was suggested, the counter-proposal was a further explanation of the technical aspects of cloud computing.

    If your business runs in the cloud, you don’t need to buy more hardware when your processing demands expand: You just rent a greater portion of the cloud. Amazon Web Services, one of the major players in the space, hosts thousands of Web sites and Web servers and undoubtedly stores the personal data of millions. On the one hand, the cloud gives every kid in his or her basement access to nearly unlimited computer power to quickly scale up a new online service.
    The point that was apparently unclear about a possible non-technical reason for SimCity’s current woes relies on recognizing the importance of “if your business runs in the cloud.” Well, why wouldn’t your multinational conglomerate make use of cloud services, if it would alleviate all the technical problems surrounding your game launch? It seems like “the cloud,” particularly software-as-a-service functionality from a company like Amazon, is now taken for granted as baseline functionality, because it is technically the most efficient way to scale up or scale down to meet the demands of an unknown quantity of simultaneous users.
    On the other, as Clive Thompson pointed out to me, the cloud “is actually just a handful of companies.” When Amazon booted the activist Web site WikiLeaks off its servers under political pressure in 2010, the site immediately collapsed.
    Similar to how personalization might be the most technologically adept solution to filtering out the unthinkably large amount of data on the internet, using another company’s web services isn’t always the best solution.

    It isn’t financially sound—even if it is technologically de rigueur—to hand over the keys to the kingdom on all your players’ metadata. It’s a valuable commodity to which—if you contract out your servers to Amazon’s SaaS utility—you lose exclusive access and complete control. It’s like Ford solving the automobile pollution problem by agreeing to sell cars designed and built by current electric-car leader Tesla Motors. It would make Ford’s cars exhaust-free, but at the cost of control and financial independence over its own infrastructure.
    Personal data stored in the cloud is also actually much easier for the government to search than information on a home computer. The FBI needs a warrant from a judge to search your laptop. But if you use Yahoo or Gmail or Hotmail for your email, you “lose your constitutional protections immediately,” according to a lawyer for the Electronic Frontier Foundation. The FBI can just ask the company for information—no judicial paperwork needed, no permission required—as long as it can argue later that it’s part of an emergency. “The cops will love this,” says privacy advocate Robert Gellman about cloud computing. “They can go to a single place and get everybody’s documents.”
    Cloud computing brings its own set of problems; it is technologically versatile and useful in many situations, but it is not a one-size-fits-all solution to every networking problem. It comes with some very harsh financial, political, and social ramifications; that a company doesn’t apply cloud functionality to every problem doesn’t mean it fails to understand how cloud computing works on a technical level.
    The Filter Bubble is a book about the non-technical problems that have begun to arise from the current trend in internet technical solutions. The Internet is vast; without some sort of filter, it would become exponentially more difficult to find relevant data as billions of new words and images are uploaded every day. “Most of us assume that when we google a term, we all see the same results--the ones that the company’s famous PageRank algorithm suggests are the most authoritative based on other pages’ links. But since 2009, this is no longer true.” What happens is your user profile (or your IP address, or your ISP) helps winnow down your search results, so if you type in “Thai restaurant” it will be weighted more to your geographical proximity (among other factors). That sounds useful, and it is: a technical solution to the absurdly large amount of online data about the Thai restaurants that are completely useless to you. But what about the non-technical problems that come bundled with building an immense dossier on all internet users? “Search for a word like “depression” on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants.”

    That is unsettling, but it is advertising; maybe there is an overall increase in utility where, instead of a broad blast of products, people see more things they like. But—and here’s where The Filter Bubble begins its parade of horribles—framing the internet as inevitably sliding into a worse version of what it does now isn’t very convincing. “For now, retargeting is being used by advertisers, but there’s no reason to expect that publishers and content providers won’t get in on it. After all, if the Los Angeles Times knows that you’re a fan of Perez Hilton, it can front-page its interview with him in your edition, which means you’ll be more likely to stay on the site and click around.” What is happening is bad enough: a recursive loop built around a caricature of you that marketing companies surreptitiously put together from data you had every reason to expect was private. “If we knew (or even suspected, for that matter) that purchasers of 101 Ways to Fix Your Credit Score tend to get offered lower-premium credit cards, we’d avoid buying the book.”
    The personality traits that serve us well when we’re at dinner with our family might get in the way when we’re in a dispute with a passenger on the train or trying to finish a report at work. The plasticity of self allows for social situations that would be impossible or intolerable if we always behaved exactly the same way. Advertisers have understood this phenomenon for a long time. In the jargon, it’s called day-parting, and it’s the reason that you don’t hear many beer ads as you’re driving to work in the morning. People have different needs and aspirations at eight A.M. than they do at eight P.M. Personalization doesn’t capture the balance between your work self and your play self, and it can also mess with the tension between your aspirational and current self. How we behave is a balancing act between our future and present selves. In the future, we want to be fit, but in the present, we want the candy bar. In the future, we want to be a well-rounded, well-informed intellectual virtuoso, but right now we want to watch the Jersey Shore.
    The Internet is still heralded as a great equalizer in the same Utopian sense in which it began. As it has shifted from research to commerce, the internet as a whole grows increasingly segmented. The thrilling part of a global marketplace of ideas was that it allowed people to find others who shared their interests; now, as the technology advances, those ideas are crystallizing around each user, locking them into a web presence as immutable as a physical locale. “In a postmaterial world where your highest task is to express yourself, the public infrastructure that supports this kind of expression falls out of the picture.” If you have a coterie of web-friends who accept you and your worldview, there is little reason to broaden your opinions or hear other voices. The webpages your search results turn up, the advertising you see, and the products you are offered will all support your pre-established opinion. It is confirmation bias via algorithm.

    Much like cloud computing has technical benefits and economic problems, the increasingly personalized web changes the nature of the internet in more ways than just easing the functional difficulty in pulling relevant data from an overwhelmingly large source. Not knowing whether you are being pandered to is the most troubling part, and shedding light on that fact is where The Filter Bubble excels. The author visits Google and poses a question: “If a 9/11 conspiracy theorist searches for “9/11,” was it Google’s job to show him the Popular Mechanics article that debunks his theory or the movie that supports it? Which was more relevant?” That sums up the importance of this book and illustrates the questions that we all need to be asking right now.

    As for a government-mandated “Do Not Track” list—semi-analogous to the “Do Not Call” list for telemarketing—the current situation is bleak. Google’s own information page, last updated at least a year after The Filter Bubble was first published, states:
    Enabling ‘Do Not Track’ means that a request will be included with your browsing traffic. Any effect depends on whether a website responds to the request, and how the request is interpreted. For example, some websites may respond to this request by showing you ads that aren't based on other websites you've visited. Many websites will still collect and use your browsing data - for example to improve security, to provide content, services, ads and recommendations on their websites, and to generate reporting statistics.

    Does Chrome provide details of which websites and web services respect Do Not Track requests and how they interpret them?

    No. At this time (last updated October 2012), most web services, including Google's, do not alter their behavior or change their services upon receiving Do Not Track requests.
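    The reason Do Not Track is so toothless is visible in its mechanics: it is nothing more than an extra HTTP request header that servers are free to ignore. A minimal sketch (the URL is a placeholder) of what enabling it actually amounts to on the client side:

```python
import urllib.request

# "Do Not Track" is just one extra header sent with each request.
# Whether any server honors it is entirely up to that server; nothing
# in the response is required to confirm the preference was respected.
req = urllib.request.Request(
    "https://example.com/",    # placeholder URL
    headers={"DNT": "1"},      # 1 = user prefers not to be tracked
)

# The header rides along with the request -- and that is the whole mechanism.
print(req.get_header("Dnt"))
```

Seen this way, the Google help text quoted above is simply an honest description of the protocol: the request is expressed, and everything after that is voluntary.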
    There is no opt-out, and no way to know what is being tracked, recorded, and used as a signifier of you. Some techno-determinists, like Facebook’s Mark Zuckerberg, have stated that “you have one identity.” Though patently false, the claim becomes true as social networking makes it so. The more you “like,” the more you’re prompted to “like” similar things, until you truly do have one singularly bland identity. The Filter Bubble argues that it isn’t the tracking of your data that should worry you, but that the profile built from this unapproved, unregulated, and unknown data is the only “you” being consulted in an increasingly important aspect of modern life.

  • Eva

    Some great ideas and sentences, but this would have been better, I think, as a really thoughtful article in The Atlantic -- not a full book.

    Kindle quotes:

    A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa. —Mark Zuckerberg, Facebook founder - location 77


    Starting that morning, Google would use fifty-seven signals—everything from where you were logging in from to what browser you were using to what you had searched for before—to make guesses about who you were and what kinds of sites you’d like. Even if you were logged out, it would customize its results, showing you the pages it predicted you were most likely to click on. Most of us assume that when we google a term, we all see the same results—the ones that the company’s famous PageRank algorithm suggests are the most authoritative based on other pages’ links. But since December 2009, this is no longer true. - location 87
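    The shift the quote describes can be made concrete with a toy model (this is an illustration, not Google's actual algorithm; all weights and data are invented): results are scored by a blend of global authority, PageRank-style, and per-user signals, so two users issuing the same query see different orderings.

```python
# Toy illustration of personalized ranking. Not Google's algorithm;
# all names, weights, and numbers here are invented for demonstration.

def personalized_rank(results, user_signals, personal_weight=0.7):
    """Score = (1 - w) * global authority + w * predicted user interest."""
    def score(r):
        interest = user_signals.get(r["topic"], 0.0)
        return (1 - personal_weight) * r["authority"] + personal_weight * interest
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "news-site.example/a", "topic": "politics", "authority": 0.9},
    {"url": "blog.example/b",      "topic": "sports",   "authority": 0.4},
]

# Two users, same query, different click histories -> different top result.
sports_fan = personalized_rank(results, {"sports": 0.9})
news_buff  = personalized_rank(results, {"politics": 0.8})
```

In this sketch the sports fan's bubble surfaces the lower-authority sports link first, while the politics reader gets the news site: the "most authoritative" ordering no longer exists as a shared fact, which is precisely the book's point.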


    As sociologist danah boyd said in a speech at the 2009 Web 2.0 Expo: Our bodies are programmed to consume fat and sugars because they’re rare in nature.... In the same way, we’re biologically programmed to be attentive to things that stimulate: content that is gross, violent, or sexual and that gossip which is humiliating, embarrassing, or offensive. If we’re not careful, we’re going to develop the psychological equivalent of obesity. - location 242


    Researchers at the University of Minnesota recently discovered that women who are ovulating respond better to pitches for clingy clothes and suggested that marketers “strategically time” their online solicitations. - location 258


    As a consumer, it’s hard to argue with blotting out the irrelevant and unlikable. But what is good for consumers is not necessarily good for citizens. What I seem to like may not be what I actually want, let alone what I need to know to be an informed member of my community or country. “It’s a civic virtue to be exposed to things that appear to be outside your interest,” technology journalist Clive Thompson told me. “In a complex world, almost everything affects you—that closes the loop on pecuniary self-interest.” Cultural critic Lee Siegel puts it a different way: “Customers are always right, but people aren’t.” - location 293


    At Amazon, the push for more user data is never-ending: When you read books on your Kindle, the data about which phrases you highlight, which pages you turn, and whether you read straight through or skip around are all fed back into Amazon’s servers and can be used to indicate what books you might like next. - location 420


    Amazon users have gotten so used to personalization that the site now uses a reverse trick to make some additional cash. Publishers pay for placement in physical bookstores, but they can’t buy the opinions of the clerks. But as Lanier predicted, buying off algorithms is easy: Pay enough to Amazon, and your book can be promoted as if by an “objective” recommendation by Amazon’s software. For most customers, it’s impossible to tell which is which. - location 424


    By getting people to log in, Google got its hands on an enormous pile of data—the hundreds of millions of e-mails Gmail users send and receive each day. And it could cross-reference each user’s e-mail and behavior on the site with the links he or she clicked in the Google search engine. Google Apps—a suite of online word-processing and spreadsheet-creation tools—served double duty: It undercut Microsoft, Google’s sworn enemy, and it provided yet another hook for people to stay logged in and continue sending click signals. All this data allowed Google to accelerate the process of building a theory of identity for each user—what topics each user was interested in, what links each person clicked. - location 473


    On Friendster and MySpace, to find out what your friends were up to, you had to visit their pages. The News Feed algorithm pulled all of these updates out of Facebook’s massive database and placed them in one place, up front, right when you logged in. Overnight, Facebook had turned itself from a network of connected Web pages into a personalized newspaper featuring (and created by) your friends. It’s hard to imagine a purer source of relevance. - location 512


    As soon as the hijackers’ names had been publicly released, Acxiom had searched its massive data banks, which take up five acres in tiny Conway, Arkansas. And it had found some very interesting data on the perpetrators of the attacks. In fact, it turned out, Acxiom knew more about eleven of the nineteen hijackers than the entire U.S. government did—including their past and current addresses and the names of their housemates. - location 583


    But here’s what Acxiom knows about 96 percent of American households and half a billion people worldwide: the names of their family members, their current and past addresses, how often they pay their credit card bills, whether they own a dog or a cat (and what breed it is), whether they are righthanded or left-handed, what kinds of medication they use (based on pharmacy records) ... the list of data points is about 1,500 items long. - location 587


    Loopt is working on an ad system whereby stores can offer special discounts and promotions to repeat customers on their phones—right as they walk through the door. - location 610


    The Internet had delivered a number of mortal blows to the newspaper business model, any one of which might be fatal. Craigslist had made classified advertisements free, and $18 billion in revenue went poof. - location 643


    most of the actual reporting and story generation happens in newspaper newsrooms. They’re the core creators of the news economy. Even in 2010, blogs remain incredibly reliant on them: according to Pew Research Center’s Project for Excellence in Journalism, 99 percent of the stories linked to in blog posts come from newspapers and broadcast networks, and the New York Times and Washington Post alone account for nearly 50 percent of all blog links. - location 671


    At Yahoo’s popular Upshot news blog, a team of editors mine the data produced by streams of search queries to see what terms people are interested in, in real time. Then they produce articles responsive to those queries: When a lot of people search for “Obama’s birthday,” Upshot produces an article in response, - location 928


    Critics on the left frequently argue that the nation’s top media underreport the war. But for many of us, myself included, reading about Afghanistan is a chore. The story is convoluted, confusing, complex, and depressing. In the editorial judgment of the Times, however, I need to know about it, and because they persist in putting it on the front page despite what must be abominably low traffic rates, I continue to read about it. (This doesn’t mean the Times is overruling my own inclinations. It’s just supporting one of my inclinations—to be informed about the world—over the more immediate inclination to click on whatever tickles my fancy.) There are places where media that prioritize importance over popularity or personal relevance are useful—even necessary. - location 963


    personalized filters limit what we are exposed to and therefore affect the way we think and learn. - location 1059


    Fifteen percent of Americans stubbornly held on to the idea that Obama was a Muslim. That’s not so surprising—Americans have never been very well informed about our politicians. What’s perplexing is that since the election, the percentage of Americans who hold that belief has nearly doubled, and the increase, according to data collected by the Pew Charitable Trusts, has been greatest among people who are college educated. People with some college education were more likely in some cases to believe the story than people with none—a strange state of affairs. Why? According to the New Republic’s Jon Chait, the answer lies with the media: “Partisans are more likely to consume news sources that confirm their ideological beliefs. People with more education are more likely to follow political news. Therefore, people with more education can actually become mis-educated.” - location 1133


    “Learning is by definition an encounter with what you don’t know, - location 1160


    Personalization is about building an environment that consists entirely of the adjacent unknown—the sports trivia or political punctuation marks that don’t really shake our schemata but feel like new information. - location 1163


    It’s not just artists and writers who use wide categories. As Cropley points out in Creativity in Education and Learning, the physicist Niels Bohr famously demonstrated this type of creative dexterity when he was given a university exam at the University of Copenhagen in 1905. One of the questions asked students to explain how they would use a barometer (an instrument that measures atmospheric pressure) to measure the height of a building. Bohr clearly knew what the instructor was going for: Students were supposed to check the atmospheric pressure at the top and bottom of the building and do some math. Instead, he suggested a more original method: One could tie a string to the barometer, lower it, and measure the string—thinking of the instrument as a “thing with weight.” The unamused instructor gave him a failing grade—his answer, after all, didn’t show much understanding of physics. Bohr appealed, this time offering four solutions: You could throw the barometer off the building and count the seconds until it hit the ground (barometer as mass); you could measure the length of the barometer and of its shadow, then measure the building’s shadow and calculate its height (barometer as object with length); you could tie the barometer to a string and swing it at ground level and from the top of the building to determine the difference in gravity (barometer as mass again); or you could use it to calculate air pressure. Bohr finally passed, and one moral of the story is pretty clear: Avoid smartass physicists. But the episode also explains why Bohr was such a brilliant innovator: His ability to see objects and concepts in many different ways made it easier for him to use them to solve problems. - location 1263


    Eckles noticed that when buying products—say, a digital camera—different people respond to different pitches. Some people feel comforted by the fact that an expert or product review site will vouch for the camera. Others prefer to go with the product that’s most popular, or a money-saving deal, or a brand that they know and trust. Some people prefer what Eckles calls “high cognition” arguments—smart, subtle points that require some thinking to get. Others respond better to being hit over the head with a simple message. And while most of us have preferred styles of argument and validation, there are also types of arguments that really turn us off. Some people rush for a deal; others think that the deal means the merchandise is subpar. Just by eliminating the persuasion styles that rub people the wrong way, Eckles found he could increase the effectiveness of marketing materials by 30 to 40 percent. While it’s hard to “jump categories” in products—what clothing you prefer is only slightly related to what books you enjoy—“persuasion profiling” suggests that the kinds of arguments you respond to are highly transferrable from one domain to another. A person who responds to a “get 20% off if you buy NOW” deal for a trip to Bermuda is much more likely than someone who doesn’t to respond to a similar deal for, say, a new laptop. If Eckles is right—and research so far appears to be validating his theory—your “persuasion profile” would have a pretty significant financial value. - location 1506
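    Persuasion profiling, as the quote describes it, is essentially click-through bookkeeping: record which pitch style each person responds to, then reuse the winning style in unrelated domains. A hypothetical sketch (class name, styles, and data are all invented for illustration):

```python
# Hypothetical sketch of a per-person "persuasion profile". Everything
# here (names, styles, observations) is invented for illustration.

from collections import defaultdict

class PersuasionProfile:
    def __init__(self):
        self.shown = defaultdict(int)    # times each pitch style was shown
        self.clicked = defaultdict(int)  # times it led to a click

    def record(self, style, clicked):
        self.shown[style] += 1
        if clicked:
            self.clicked[style] += 1

    def best_style(self):
        # Pick the style with the highest observed click-through rate.
        return max(self.shown, key=lambda s: self.clicked[s] / self.shown[s])

profile = PersuasionProfile()
# Observed while pitching travel deals...
profile.record("discount-urgency", clicked=True)
profile.record("expert-review", clicked=False)
# ...and the same winning style is then reused to pitch, say, a laptop.
chosen = profile.best_style()
```

The transferability claim is what makes the profile valuable: the bookkeeping is per-person, not per-product, so a style learned on Bermuda trips is applied to laptops for free.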


    Consider the implications, for example, of knowing that particular customers compulsively buy things when stressed or when they’re feeling bad about themselves, or even when they’re a bit tipsy. If persuasion profiling makes it possible for a coaching device to shout “you can do it” to people who like positive reinforcement, in theory it could also enable politicians to make appeals based on each voter’s targeted fears and weak spots. Infomercials aren’t shown in the middle of the night only because airtime then is cheap. - location 1529


    “Participants exposed to a steady stream of news about defense or about pollution came to believe that defense or pollution were more consequential problems,” Iyengar wrote. - location 1561


    But combine them with personalized media, and troubling things start to happen. Your identity shapes your media, and your media then shapes what you believe and what you care about. You click on a link, which signals an interest in something, which means you’re more likely to see articles about that topic in the future, which in turn prime the topic for you. You become trapped in a you loop, and if your identity is misrepresented, strange patterns begin to emerge, like reverb from an amplifier. - location 1572
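    The "you loop" in the quote is a positive-feedback process, and a tiny simulation (all numbers invented) shows how quickly it narrows a feed: each click on a topic raises that topic's weight, which makes it appear more, which invites more clicks.

```python
# Minimal simulation of the "you loop". All numbers are invented;
# this models the feedback dynamic, not any real platform's feed.

def step(weights, clicked_topic, boost=0.5):
    """One loop iteration: a click boosts the topic, then renormalize."""
    weights = dict(weights)
    weights[clicked_topic] += boost
    total = sum(weights.values())
    return {t: w / total for t, w in weights.items()}

feed = {"politics": 0.5, "science": 0.5}
for _ in range(5):               # the user keeps clicking politics links
    feed = step(feed, "politics")

# After five clicks the feed has tilted heavily toward politics.
politics_share = feed["politics"]
```

Starting from an even 50/50 split, five clicks push the politics share above 90 percent: the "reverb" the author describes is just compounding, applied to attention.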


    Marketers are already exploring the gray area between what can be predicted and what predictions are fair. According to Charlie Stryker, an old hand in the behavioral targeting industry who spoke at the Social Graph Symposium, the U.S. Army has had terrific success using social-graph data to recruit for the military—after all, if six of your Facebook buddies have enlisted, it’s likely that you would consider doing so too. Drawing inferences based on what people like you or people linked to you do is pretty good business. And it’s not just the army. Banks are beginning to use social data to decide to whom to offer loans: If your friends don’t pay on time, it’s likely that you’ll be a deadbeat too. “A decision is going to be made on creditworthiness based on the creditworthiness of your friends,” Stryker said. “There are applications of this technology that can be very powerful,” another social targeting entrepreneur told the Wall Street Journal. “Who knows how far we’d take it?” - location 1651


    Gerbner called this the mean world syndrome: If you grow up in a home where there’s more than, say, three hours of television per day, for all practical purposes, you live in a meaner world—and act accordingly—than your next-door neighbor who lives in the same place but watches less television. “You know, who tells the stories of a culture really governs human behavior,” Gerbner later said. - location 1853


    That Facebook chose Like instead of, say, Important is a small design decision with far-reaching consequences: The stories that get the most attention on Facebook are the stories that get the most Likes, and the stories that get the most Likes are, well, more likable. - location 1865


    One especially striking chart shows a strong correlation between level of life satisfaction and comfort with living next door to someone who’s gay. - location 1968


    In the early 2000s, Pabst was struggling financially. It had maxed out among the white rural population that formed the core of its customer base, and it was selling less than 1 million barrels of beer a year, down from 20 million in 1970. If Pabst wanted to sell more beer, it had to look elsewhere, and Neal Stewart, a midlevel marketing manager, did. Stewart went to Portland, Oregon, where Pabst numbers were surprisingly strong and an ironic nostalgia for white working-class culture (remember trucker hats?) was widespread. If Pabst couldn’t get people to drink its watery brew sincerely, Stewart figured, maybe they could get people to drink it ironically. Pabst began to sponsor hipster events—gallery openings, bike messenger races, snowboarding competitions, and the like. Within a year, sales were way up—which is why, if you walk into a bar in certain Brooklyn neighborhoods, Pabst is more likely to be available than other low-end American beers. - location 1978


    According to his biographer, Robert A. Caro, Moses’s vision for Jones Beach was as an island getaway for middle-class white families. He included the low bridges to make it harder for low-income (and mostly black) New Yorkers to get to the beach, as public buses—the most common form of transport for inner-city residents—couldn’t clear the overpasses. - location 2166


    Facebook describes itself as a “social utility,” as if it’s a twenty-first-century phone company. But when users protest Facebook’s constantly shifting and eroding privacy policy, Zuckerberg often shrugs it off with the caveat emptor posture that if you don’t want to use Facebook, you don’t have to. It’s hard to imagine a major phone company getting away with saying, “We’re going to publish your phone conversations for anyone to hear—and if you don’t like it, just don’t use the phone.” - location 2210


    When he heard the first word of the World Trade Center attacks on 9/11, he ran up to his lower-Manhattan rooftop and watched in horror. “I talked to more strangers in the next three days,” he says, “than in the previous five years of living in New York.” Shortly after the attacks, Heiferman came across the blog post that changed his life. It argued that the attacks, as awful as they were, might bring Americans back together in their civic life, and referenced the bestselling book Bowling Alone. Heiferman bought a copy and read it cover to cover. “I became captivated,” he says, “by the question of whether we could use technology to rebuild and strengthen community.” MeetUp.com, a site that makes it easy for local groups to meet face-to-face, was his answer, - location 2321


    his statement is now known as Kranzberg’s first law: “Technology is neither good nor bad; nor is it neutral.” - location 2341


    In January 2009, if you were listening to one of twenty-five radio stations in Mexico, you might have heard the accordion ballad “El más grande enemigo.” Though the tune is polka-ish and cheery, the lyrics depict a tragedy: a migrant seeks to illegally cross the border, is betrayed by his handler, and is left in the blistering desert sun to die. Another song from the Migra corridos album tells a different piece of the same sad tale: To cross the border I got in the back of a trailer There I shared my sorrows With forty other immigrants I was never told That this was a trip to hell. If the lyrics aren’t exactly subtle about the dangers of crossing the border, that’s the point. Migra corridos was produced by a contractor working for the U.S. Border Patrol, as part of a campaign to stem the tide of immigrants along the border. The song is a prime example of a growing trend in what marketers delicately call “advertiser-funded media,” or AFM. - location 2535

  • Youghourta

    A frightening book. It takes up the idea that on the Internet we each live in a bubble of our own, into which content filters through after being sifted to match only our own ideas and beliefs, until a person comes to believe that their opinion is the prevailing one, because it is the only opinion they can see wherever they go on the Internet in general, and on social networks and online platforms in particular.

    When we search Google for a particular term, the results can differ dramatically from person to person. If, say, you care about environmental issues and you search for the name of an oil company, it is quite likely you will get results about the environmental damage that company has caused. If you work for that company, however, your results will probably be far more positive.
    The same applies on social networks, especially Facebook. If you hold a particular orientation (religious, political, or otherwise), you will naturally interact more with posts that share it (or at least engage less with content that contradicts it). Over time Facebook learns this and shows you only posts that match your convictions, until eventually you come to believe that no opinion exists other than the one you hold to be true.

    To deliver results that closely match what you search for and who you are, these companies (search engines, social networks, e-commerce platforms, and others) gather a great deal of information about you: the pages you browse, your interests. Nor is the collection limited to the traces you leave while browsing those sites; the platforms also buy other data that might never cross your mind. Many companies are built entirely on aggregating users' data (your address, the purchases tied to your credit card, your job, your interests…) and selling it to the highest bidder.


    Among the useful ideas in the book is that exposure to opposing views is healthy, and that healthy societies cannot be built without it. The trouble with these bubbles is that they keep us in an environment homogeneous with, and closely matched to, who we already are, never exposing us to new ideas or currents that run against our own. It is like a child whose parents insist on raising him in an obsessively clean environment: on his first outing into the outside world, his immune system suffers a severe shock.

    The great problem these bubbles create is how easy they make it to influence and steer public opinion: a user's data can be analyzed from different sources to find the most effective way to influence them, even when that influence is harmful. For example, if we know that women are more receptive to certain ideas or products during fertility or pregnancy, they can easily be targeted during precisely that window. How could anyone know when a woman is pregnant? It is simpler than you might imagine: what she buys from stores during that period, or the one before it, carries clues that can easily be analyzed and tracked (the same goes for what she searches for on Google, the pages she follows on Facebook, and the posts she interacts with or publishes there). And if those purchases are made with a credit card and the store roughly knows her address, targeting her becomes easier and more precise still.

    The danger is that this precise targeting has become hard to detect: you will not even know you are being targeted, and so cannot protect yourself from it. We can survey all the advertisements on television or in the newspapers, spot the misleading ones, and try to stop them (or at least know what kinds of ads and content other people are exposed to). But how can such ads be detected when they are shown to one specific audience and no one else?

    Unfortunately, the book offers no real solutions to these problems. It does suggest a few, but not at the individual level.

    You may find some solutions in the book "The Art of Invisibility"; you will find my review of it here:

    https://www.goodreads.com/review/show...