for your work friends or coworkers and for the other people you know are probably coming to an end pretty quickly…. Having two identities for yourself is an example of a lack of integrity.”

A year later, soon after the book had been published, twenty-six-year-old Zuckerberg sat onstage with Kirkpatrick and NPR interviewer Guy Raz at the Computer History Museum in Mountain View, California. “In David’s book,” Raz said, “you say that people should have one identity…. But I behave a different way around my family than I do around my colleagues.”

Zuckerberg shrugged. “No, I think that was just a sentence I said.”

Raz continued: “Are you the same person right now as when you’re with your friends?”

“Uh, yeah,” Zuckerberg said. “Same awkward self.”

If Mark Zuckerberg were a standard mid-twenty-something, this tangle of views might be par for the course: Most of us don’t spend too much time musing philosophically about the nature of identity. But Zuckerberg controls the world’s most powerful and widely used technology for managing and expressing who we are. And his views on the matter are central to his vision for the company and for the Internet.

Speaking at an event during New York’s Ad Week, Facebook COO Sheryl Sandberg said she expected the Internet to change quickly. “People don’t want something targeted to the whole world—they want something that reflects what they want to see and know,” she said, suggesting that in three to five years that would be the norm. Facebook’s goal is to be at the center of that process—the singular platform through which every other service and Web site incorporates your personal and social data. You have one identity, it’s your Facebook identity, and it colors your experience everywhere you go.

It’s hard to imagine a more dramatic departure from the early days of the Internet, in which not exposing your identity was part of the appeal. In chat rooms and online forums, your gender, race, age, and location were whatever you said they were, and the denizens of these spaces exulted about the way the medium allowed you to shed your skin. Electronic Frontier Foundation (EFF) founder John Perry Barlow dreamed of “creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth.” The freedom that this offered anyone who was interested to transgress and explore, to try on different personas for size, felt revolutionary.

As law and commerce have caught up with technology, however, the space for anonymity online is shrinking. You can’t hold an anonymous person responsible for his or her actions: Anonymous customers commit fraud, anonymous commenters start flame wars, and anonymous hackers cause trouble. To establish the trust that community and capitalism are built on, you need to know whom you’re dealing with.

As a result, there are dozens of companies working on deanonymizing the Web. PeekYou, a firm founded by the creator of RateMyProfessors.com, is patenting ways of connecting online activities done under a pseudonym with the real name of the person involved. Another company, Phorm, helps Internet service providers use a method called "deep packet inspection" to analyze the traffic that flows through their servers; Phorm aims to build nearly comprehensive profiles of each customer to use for advertising and personalized services. And where ISPs are leery of that approach, BlueCava is compiling a database of every computer, smartphone, and online-enabled gadget in the world, which can be tied to the individual people who use them. Even if you're using the highest privacy settings in your Web browser, in other words, your hardware may soon give you away.
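To see why hardware can give you away, consider a toy sketch of device fingerprinting. The attributes and hashing below are illustrative assumptions rather than BlueCava's actual method; the point is only that a handful of ordinary, non-secret configuration details combine into a signature stable enough to recognize the same machine across visits, cookies or no cookies.

```python
import hashlib

def device_fingerprint(attributes: dict) -> str:
    """Combine ordinary device attributes into a stable signature.

    The attribute list is hypothetical; real fingerprinting services draw on
    many more signals (installed fonts, plugins, rendering quirks, and so on).
    """
    # Sort keys so the same device always yields the same canonical string.
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Two visits with cookies cleared and "private" browsing turned on still
# match, because the hardware and configuration haven't changed.
visit_1 = {
    "user_agent": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.30",
    "screen": "1366x768x24",
    "timezone": "UTC-5",
    "language": "en-US",
}
visit_2 = dict(visit_1)  # the same machine, later that week

print(device_fingerprint(visit_1) == device_fingerprint(visit_2))  # True
```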

These technological developments pave the way for a more persistent kind of personalization than anything we've experienced to date. They also mean that we'll increasingly be forced to trust the companies at the center of this process to properly express and synthesize who we really are. When you meet someone in a bar or a park, you look at how they behave and form an impression accordingly. Facebook and the other identity services aim to mediate that process online; if they don't do it right, things can get fuzzy and distorted. To personalize well, you have to have the right idea of what represents a person.

There’s another tension in the interplay of identity and personalization. Most personalized filters are based on a three-step model. First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media. There’s just one flaw in this logic: Media also shape identity. And as a result, these services may end up creating a good fit between you and your media by changing… you. If a self-fulfilling prophecy is a false definition of the world that through one’s actions becomes true, we’re now on the verge of self-fulfilling identities, in which the Internet’s distorted picture of us becomes who we really are.
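The three-step loop, and the drift it produces, can be made concrete with a toy simulation. Everything here is an assumption for illustration—a few topic labels, a crude preference score nudged by simulated clicks—not any real filter's code; it simply shows how tuning for a better fit can end up narrowing the person being fitted.

```python
import random

random.seed(0)
topics = ["politics", "science", "celebrities", "sports"]

# Step 1: figure out who the person is -- here, a crude preference profile.
profile = {"politics": 0.3, "science": 0.3, "celebrities": 0.2, "sports": 0.2}

def recommend(profile, k=2):
    """Step 2: serve the content that best fits the current profile."""
    return sorted(topics, key=profile.get, reverse=True)[:k]

def tune(profile, shown, learning_rate=0.05):
    """Step 3: tune the fit. Simulated clicks reinforce the topics shown --
    and since only shown topics can be clicked, the profile drifts toward
    whatever the filter already believed about the person."""
    for topic in shown:
        if random.random() < profile[topic] + 0.5:  # shown items get clicked often
            profile[topic] += learning_rate
    total = sum(profile.values())
    for topic in profile:  # renormalize so the scores stay comparable
        profile[topic] /= total

for day in range(30):
    tune(profile, recommend(profile))

print({topic: round(score, 2) for topic, score in profile.items()})
# The two topics that started out slightly ahead now dominate: the "fit"
# improved in part by changing the person it was fitting.
```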

Personalized filtering can even affect your ability to choose your own destiny. In "Of Sirens and Amish Children," a much-cited tract, information law theorist Yochai Benkler describes how more-diverse information sources make us freer. Autonomy, Benkler points out, is a tricky concept: To be free, you have to be able not only to do what you want, but to know what's possible to do. The Amish children in the title are the subjects of a famous court case, Wisconsin v. Yoder, in which their parents sought to keep them out of public school so that they wouldn't be exposed to modern life. Benkler argues that this is a real threat to the children's freedom: Not knowing that it's possible to be an astronaut is just as much a prohibition against becoming one as knowing and being barred from doing so.

Of course, too many options are just as problematic as too few—you can find yourself overwhelmed or paralyzed by the paradox of choice. But the basic point remains: The filter bubble doesn't just reflect your identity. It also shows what choices you have. Students who go to Ivy League colleges see targeted advertisements for jobs that students at state schools are never even aware of. The personal feeds of professional scientists might feature articles about contests that amateurs never hear about. By illustrating some possibilities and blocking out others, the filter bubble has a hand in your decisions. And in turn, it shapes who you become.

A Bad Theory of You

The way that personalization shapes identity is still becoming clear—especially because most of us still spend more time consuming broadcast media than personalized content streams. But by looking at how the major filterers think about identity, it’s becoming possible to predict what these changes might look like. Personalization requires a theory of what makes a person—of what bits of data are most important to determine who someone is—and the major players on the Web have quite different ways of approaching the problem.

Google’s filtering systems, for example, rely heavily on Web history and what you click on (click signals) to infer what you like and dislike. These clicks often happen in an entirely private context: The assumption is that searches for “intestinal gas” and celebrity gossip Web sites are between you and your browser. You might behave differently if you thought other people were going to see your searches. But it’s that behavior that determines what content you see in Google News, what ads Google displays—what determines, in other words, Google’s theory of you.

The basis for Facebook’s personalization is entirely different. While Facebook undoubtedly tracks clicks, its primary way of thinking about your identity is to look at what you share and with whom you interact. That’s a whole different kettle of data from Google’s: There are plenty of prurient, vain, and embarrassing things we click on that we’d be reluctant to share with all of our friends in a status update. And the reverse is true, too. I’ll cop to sometimes sharing links I’ve barely read—the long investigative piece on the reconstruction of Haiti, the bold political headline—because I like the way it makes me appear to others. The Google self and the Facebook self, in other words, are pretty different people. There’s a big difference between “you are what you click” and “you are what you share.”
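The same bookkeeping makes the contrast visible. The two logs below are invented for illustration, but they dramatize the asymmetry: build a frequency-weighted profile from private clicks and another from public shares, and the same person comes out looking like two rather different people.

```python
from collections import Counter

def profile_from(events):
    """Turn a list of categorized events into a frequency-weighted profile."""
    counts = Counter(events)
    total = sum(counts.values())
    return {category: round(n / total, 2) for category, n in counts.most_common()}

# Hypothetical logs for one person: what they click in private vs. what they share.
clicks = ["celebrity gossip", "celebrity gossip", "health worry", "investigative piece on Haiti"]
shares = ["investigative piece on Haiti", "bold political headline", "investigative piece on Haiti"]

google_self = profile_from(clicks)    # "you are what you click"
facebook_self = profile_from(shares)  # "you are what you share"

print(google_self)    # gossip-heavy
print(facebook_self)  # long-form, political
```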

Both ways of thinking have their benefits and drawbacks. With Google's click-based self, the gay teenager who hasn't come out to his parents can still get a personalized Google News feed with pieces from the broader gay community that affirm that he's not alone. But by the same token, a self built on clicks will tend to draw us even more toward the items we're predisposed to look at already—toward our most Pavlovian selves. Your perusal of an article on TMZ.com is filed away, and the next time you're looking at the news, Brad Pitt's marriage drama is more likely to flash onto the screen. (If Google didn't persistently downplay porn, the problem would presumably be far worse.)

Facebook's share-based self is more aspirational: Facebook takes you more at your word, presenting you as you'd like to be seen by others.
