as you’d like to be seen by others. Your Facebook self is more of a performance, less of a behaviorist black box, and ultimately it may be more prosocial than the bundle of signals Google tracks. But the Facebook approach has its downsides as well—to the extent that Facebook draws on the more public self, it necessarily has less room for private interests and concerns. The same closeted gay teenager’s information environment on Facebook might diverge more from his real self. The Facebook portrait remains incomplete.
Both are pretty poor representations of who we are, in part because there is no one set of data that describes who we are. “Information about our property, our professions, our purchases, our finances, and our medical history does not tell the whole story,” writes privacy expert Daniel Solove. “We are more than the bits of data we give off as we go about our lives.”
Digital animators and robotics engineers frequently run into a problem known as the
To start with, Zuckerberg’s statement that we have “one identity” simply isn’t true. Psychologists have a name for this fallacy: fundamental attribution error. We tend to attribute people’s behavior to their inner traits and personality rather than to the situations they’re placed in. Even in situations where the context clearly plays a major role, we find it hard to separate how someone behaves from who she is.
And to a striking degree, our characteristics are fluid. Someone who’s aggressive at work may be a doormat at home. Someone who’s gregarious when happy may be introverted when stressed. Even some of our closest-held traits—our disinclination to do harm, for example—can be shaped by context. Groundbreaking psychologist Stanley Milgram demonstrated this when, in an oft-cited experiment at Yale in the 1960s, he led decent, ordinary people to deliver what they believed were dangerous electric shocks to other subjects, simply because a man in a lab coat gave them the nod.
There is a reason that we act this way: The personality traits that serve us well when we’re at dinner with our family might get in the way when we’re in a dispute with a passenger on the train or trying to finish a report at work. The plasticity of the self allows for social situations that would be impossible or intolerable if we always behaved exactly the same way. Advertisers have understood this phenomenon for a long time. In the jargon, it’s called
On his own Facebook page, Zuckerberg lists “transparency” as one of his top Likes. But there’s a downside to perfect transparency: One of the most important uses of privacy is to manage and maintain the separations and distinctions among our different selves. With only one identity, you lose the nuances that make for a good personalized fit.
Personalization doesn’t capture the balance between your work self and your play self, and it can also mess with the tension between your aspirational and your current self. How we behave is a balancing act between our future and present selves. In the future, we want to be fit, but in the present, we want the candy bar. In the future, we want to be a well-rounded, well-informed intellectual virtuoso, but right now we want to watch
The phenomenon explains why there are so many unwatched movies sitting in your Netflix queue. When researchers at Harvard and the Analyst Institute looked at people’s movie-rental patterns, they were able to watch as people’s future aspirations played against their current desires. “Should” movies like
At its best, media help mitigate present bias, mixing “should” stories with “want” stories and encouraging us to dig into the difficult but rewarding work of understanding complex problems. But the filter bubble tends to do the opposite: Because it’s our present self that’s doing all the clicking, the set of preferences it reflects is necessarily more “want” than “should.”
The one-identity problem isn’t a fundamental flaw. It’s more of a bug: Because Zuckerberg thinks you have one identity and you don’t, Facebook will do a worse job of personalizing your information environment. As John Battelle told me, “We’re so far away from the nuances of what it means to be human, as reflected in the nuances of the technology.” Given enough data and enough programmers, the context problem is solvable—and according to personalization engineer Jonathan McPhie, Google is working on it. We’ve seen the pendulum swing from the anonymity of the early Internet to the one-identity view currently in vogue; the future may look like something in between.
But the one-identity problem illustrates one of the dangers of turning over your most personal details to companies that have a skewed view of what identity is. Maintaining separate identity zones is a ritual that helps us deal with the demands of different roles and communities. And something’s lost when, at the end of the day, everything inside your filter bubble looks roughly the same. Your bacchanalian self comes knocking at work; your work anxieties plague you on a night out.
And when we’re aware that everything we do enters a permanent, pervasive online record, another problem emerges: The knowledge that what we do affects what we see, and how companies see us, can create a chilling effect. Genetic-privacy expert Mark Rothstein describes how lax regulations around genetic data can actually reduce the number of people willing to be tested for certain diseases: If you might be discriminated against or denied insurance for having a gene linked to Parkinson’s disease, it’s not unreasonable just to skip the test and the “toxic knowledge” that might result.
In the same way, when our online actions are tallied and added to a record that companies use to make decisions, we might decide to be more cautious in our surfing. If we knew (or even suspected, for that matter) that purchasers of
In theory, the one-identity, context-blind problem isn’t impossible to fix. Personalizers will undoubtedly get better at sensing context. They might even be able to better balance long-term and short-term interests. But when they do—when they are able to accurately gauge the workings of your psyche—things get even weirder.
Targeting Your Weak Spots
The logic of the filter bubble today is still fairly rudimentary: People who bought the
Eckles noticed that when buying products—say, a digital camera—different people respond to different pitches. Some people feel comforted by the fact that an expert or product review site will vouch for the camera. Others prefer to go with the product that’s most popular, or a money-saving deal, or a brand that they know and trust. Some people prefer what Eckles calls “high cognition” arguments—smart, subtle points that require some thinking to get. Others respond better to being hit over the head with a simple message.