cost of communicating over large distances and to large groups of people has plummeted, we’re increasingly unable to attend to it all. Our focus flickers from text message to Web clip to e-mail. Scanning the ever-widening torrent for the precious bits that are actually important or even just relevant is itself a full-time job.

So when personalized filters offer a hand, we’re inclined to take it. In theory, anyway, they can help us find the information we need to know and see and hear, the stuff that really matters among the cat pictures and Viagra ads and treadmill-dancing music videos. Netflix helps you find the right movie to watch in its vast catalog of 140,000 flicks. The Genius function of iTunes calls new hits by your favorite band to your attention when they’d otherwise be lost.

Ultimately, the proponents of personalization offer a vision of a custom-tailored world, every facet of which fits us perfectly. It’s a cozy place, populated by our favorite people and things and ideas. If we never want to hear about reality TV (or a more serious issue like gun violence) again, we don’t have to—and if we want to hear about every movement of Reese Witherspoon, we can. If we never click on the articles about cooking, or gadgets, or the world outside our country’s borders, they simply fade away. We’re never bored. We’re never annoyed. Our media is a perfect reflection of our interests and desires.

It’s an appealing prospect—a return to a Ptolemaic universe in which the sun and everything else revolves around us. But it comes at a cost: in making everything more personal, we may lose some of the traits that made the Internet so appealing to begin with.

When I began the research that led to the writing of this book, personalization seemed like a subtle, even inconsequential shift. But when I considered what it might mean for a whole society to be adjusted in this way, it started to look more important. Though I follow tech developments pretty closely, I realized there was a lot I didn’t know: How did personalization work? What was driving it? Where was it headed? And most important, what will it do to us? How will it change our lives?

In the process of trying to answer these questions, I’ve talked to sociologists and salespeople, software engineers and law professors. I interviewed one of the founders of OkCupid, an algorithmically driven dating Web site, and one of the chief visionaries of the U.S. information warfare bureau. I learned more than I ever wanted to know about the mechanics of online ad sales and search engines. I argued with cyberskeptics and cybervisionaries (and a few people who were both).

Throughout my investigation, I was struck by the lengths one has to go to in order to fully see what personalization and filter bubbles do. When I interviewed Jonathan McPhie, Google’s point man on search personalization, he suggested that it was nearly impossible to guess how the algorithms would shape the experience of any given user. There were simply too many variables and inputs to track. So while Google can look at overall clicks, it’s much harder to say how it’s working for any one person.

I was also struck by the degree to which personalization is already upon us—not only on Facebook and Google, but on almost every major site on the Web. “I don’t think the genie goes back in the bottle,” Danny Sullivan told me. Though concerns about personalized media have been raised for a decade—legal scholar Cass Sunstein wrote a smart and provocative book on the topic in 2000—the theory is now rapidly becoming practice: Personalization is already much more a part of our daily experience than many of us realize. We can now begin to see how the filter bubble is actually working, where it’s falling short, and what that means for our daily lives and our society.

Every technology has an interface, Stanford law professor Ryan Calo told me, a place where you end and the technology begins. And when the technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens. That’s a powerful position, Calo says. “There are lots of ways for it to skew your perception of the world.” And that’s precisely what the filter bubble does.

THE FILTER BUBBLE’S costs are both personal and cultural. There are direct consequences for those of us who use personalized filters (and soon enough, most of us will, whether we realize it or not). And there are societal consequences, which emerge when masses of people begin to live a filter-bubbled life.

One of the best ways to understand how filters shape our individual experience is to think in terms of our information diet. As sociologist danah boyd said in a speech at the 2009 Web 2.0 Expo: “Our bodies are programmed to consume fat and sugars because they’re rare in nature…. In the same way, we’re biologically programmed to be attentive to things that stimulate: content that is gross, violent, or sexual and that gossip which is humiliating, embarrassing, or offensive. If we’re not careful, we’re going to develop the psychological equivalent of obesity. We’ll find ourselves consuming content that is least beneficial for ourselves or society as a whole.”

Just as the factory farming system that produces and delivers our food shapes what we eat, the dynamics of our media shape what information we consume. Now we’re quickly shifting toward a regimen chock-full of personally relevant information. And while that can be helpful, too much of a good thing can also cause real problems. Left to their own devices, personalization filters serve up a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.

In the filter bubble, there’s less room for the chance encounters that bring insight and learning. Creativity is often sparked by the collision of ideas from different disciplines and cultures. Combine an understanding of cooking and physics and you get the nonstick pan and the induction stovetop. But if Amazon thinks I’m interested in cookbooks, it’s not very likely to show me books about metallurgy. It’s not just serendipity that’s at risk. By definition, a world constructed from the familiar is a world in which there’s nothing to learn. If personalization is too acute, it could prevent us from coming into contact with the mind-blowing, preconception-shattering experiences and ideas that change how we think about the world and ourselves.

And while the premise of personalization is that it provides you with a service, you’re not the only person with a vested interest in your data. Researchers at the University of Minnesota recently discovered that women who are ovulating respond better to pitches for clingy clothes and suggested that marketers “strategically time” their online solicitations. With enough data, guessing this timing may be easier than you think.

At best, if a company knows which articles you read or what mood you’re in, it can serve up ads related to your interests. But at worst, it can make decisions on that basis that negatively affect your life. After you visit a page about Third World backpacking, an insurance company with access to your Web history might decide to increase your premium, law professor Jonathan Zittrain suggests. Parents who purchased EchoMetrix’s Sentry software to track their kids online were outraged when they found that the company was then selling their kids’ data to third-party marketing firms.

Personalization is based on a bargain. In exchange for the service of filtering, you hand large companies an enormous amount of data about your daily life—much of which you might not trust friends with. These companies are getting better at drawing on this data to make decisions every day. But the trust we place in them to handle it with care is not always warranted, and when decisions are made on the basis of this data that affect you negatively, they’re usually not revealed.

Ultimately, the filter bubble can affect your ability to choose how you want to live. To be the author of your life, professor Yochai Benkler argues, you have to be aware of a diverse array of options and lifestyles. When you enter a filter bubble, you’re letting the companies that construct it choose which options you’re aware of. You may think you’re the captain of your own destiny, but personalization can lead you down a road to a kind of informational determinism in which what you’ve clicked on in the past determines what you see next—a Web history you’re doomed to repeat. You can get stuck in a static, ever-narrowing version of yourself—an endless you-loop.

And there are broader consequences. In Bowling Alone, his bestselling book on the decline of civic life in America, Robert Putnam looked at the problem of the major decrease in “social capital”—the bonds of trust and allegiance that encourage people to do each other favors, work together to solve common problems, and collaborate. Putnam identified two kinds of social capital: There’s the in-group-oriented “bonding” capital created when you attend a meeting of your college alumni, and then there’s “bridging” capital, which is created at an event like a town meeting when people from lots of different backgrounds come together to meet each other. Bridging capital is potent: Build more of it, and you’re more likely to be able to find that next job or an investor for your small business, because it allows you to tap into lots of different networks for help.

Everybody expected the Internet to be a huge source of bridging capital. Writing at the height of the dot-com

You are reading The Filter Bubble