debate over his veracity continued to haunt the CIA for decades, with “master plot” theorists sparring with those who believed he was telling the truth. In the end, six separate investigations were conducted into Nosenko’s case. When he passed away in 2008, the news of his death was relayed to the New York Times by a “senior intelligence official” who refused to be identified.

One of the officials most affected by the internal debate was an intelligence analyst by the name of Richards Heuer. Heuer had been recruited to the CIA during the Korean War, but he had always been interested in philosophy, and especially the branch known as epistemology—the study of knowledge. Although Heuer wasn’t directly involved in the Nosenko case, he was required to be briefed on it for other work he was doing, and he’d initially fallen for the “master plot” hypothesis. Years later, Heuer set out to analyze the analysts—to figure out where the flaws were in the logic that had led to Nosenko’s lost years in a CIA prison. The result is a slim volume called The Psychology of Intelligence Analysis, whose preface is full of laudatory comments by Heuer’s colleagues and bosses. The book is a kind of Psychology and Epistemology 101 for would-be spooks.

For Heuer, the core lesson of the Nosenko debacle was clear: “Intelligence analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves.”

Despite evidence to the contrary, Heuer wrote, we have a tendency to believe that the world is as it appears to be. Children eventually learn that a snack removed from view doesn’t disappear from the universe, but even as we mature we still tend to conflate seeing with believing. Philosophers call this view naive realism, and it is as seductive as it is dangerous. We tend to believe we have full command of the facts and that the patterns we see in them are facts as well. (Angleton, the “master plot” proponent, was sure that Nosenko’s pattern of factual errors indicated that he was hiding something and breaking under pressure.)

So what’s an intelligence analyst—or anyone who wants to get a good picture of the world, for that matter—to do? First, Heuer suggests, we have to realize that our idea of what’s real often comes to us secondhand and in a distorted form—edited, manipulated, and filtered through media, other human beings, and the many distorting elements of the human mind.

Nosenko’s case was riddled with these distorting factors, and the unreliability of the primary source was only the most obvious one. Voluminous as the file the CIA had compiled on Nosenko was, it was incomplete in certain important ways: The agency knew a lot about his rank and status but had learned very little about his personal background and internal life. This led to a basic unquestioned assumption: “The KGB would never let a screw-up serve at this high level; therefore, he must be deceiving us.”

“To achieve the clearest possible image” of the world, Heuer writes, “analysts need more than information…. They also need to understand the lenses through which this information passes.” Some of these distorting lenses are outside of our heads. Like a biased sample in an experiment, a lopsided selection of data can create the wrong impression: For a number of structural and historical reasons, the CIA record on Nosenko was woefully inadequate when it came to the man’s personal history. And some of them are cognitive processes: We tend to convert “lots of pages of data” into “likely to be true,” for example. When several of them are at work at the same time, it becomes quite difficult to see what’s actually going on—a funhouse mirror reflecting a funhouse mirror reflecting reality.

This distorting effect is one of the challenges posed by personalized filters. Like a lens, the filter bubble invisibly transforms the world we experience by controlling what we see and don’t see. It interferes with the interplay between our mental processes and our external environment. In some ways, it can act like a magnifying glass, helpfully expanding our view of a niche area of knowledge. But at the same time, personalized filters limit what we are exposed to and therefore affect the way we think and learn. They can upset the delicate cognitive balance that helps us make good decisions and come up with new ideas. And because creativity is also a result of this interplay between mind and environment, they can get in the way of innovation. If we want to know what the world really looks like, we have to understand how filters shape and skew our view of it.

A Fine Balance

It’s become fashionable to pick on the human brain. We’re “predictably irrational,” in the words of behavioral economist Dan Ariely’s bestselling book. Stumbling on Happiness author Dan Gilbert presents volumes of data to demonstrate that we’re terrible at figuring out what makes us happy. Like audience members at a magic show, we’re easily conned, manipulated, and misdirected.

All of this is true. But as Being Wrong author Kathryn Schulz points out, it’s only one part of the story. Human beings may be a walking bundle of miscalculations, contradictions, and irrationalities, but we’re built that way for a reason: The same cognitive processes that lead us down the road to error and tragedy are the root of our intelligence and our ability to cope with and survive in a changing world. We pay attention to our mental processes when they fail, but that distracts us from the fact that most of the time, our brains do amazingly well.

The mechanism for this is a cognitive balancing act. Without our ever thinking about it, our brains tread a tightrope between learning too much from the past and incorporating too much new information from the present. The ability to walk this line—to adjust to the demands of different environments and modalities—is one of human cognition’s most astonishing traits. Artificial intelligence has yet to come anywhere close.

In two important ways, personalized filters can upset this cognitive balance between strengthening our existing ideas and acquiring new ones. First, the filter bubble surrounds us with ideas with which we’re already familiar (and already agree), making us overconfident in our mental frameworks. Second, it removes from our environment some of the key prompts that make us want to learn. To understand how, we have to look at what’s being balanced in the first place, starting with how we acquire and store information.

Filtering isn’t a new phenomenon. It’s been around for millions of years—indeed, it was around before humans even existed. Even for animals with rudimentary senses, nearly all of the incoming information is meaningless; only a tiny sliver is important and sometimes life-preserving. One of the primary functions of the brain is to identify that sliver and decide what to do about it.

In humans, one of the first steps is to massively compress the data. As Nassim Nicholas Taleb says, “Information wants to be reduced,” and every second we reduce a lot of it—compressing most of what our eyes see and ears hear into concepts that capture the gist. Psychologists call these concepts schemata (one of them is a schema), and researchers are beginning to identify particular neurons or sets of neurons that correlate with each one—firing, for example, when you recognize a particular object, like a chair. Schemata ensure that we aren’t constantly seeing the world anew: Once we’ve identified something as a chair, we know how to use it.

We don’t do this only with objects; we do it with ideas as well. In a study of how people read the news, researcher Doris Graber found that stories were relatively quickly converted into schemata for the purposes of memorization. “Details that do not seem essential at the time and much of the context of a story are routinely pared,” she writes in her book Processing the News. “Such leveling and sharpening involves condensation of all features of a story.” Viewers of a news segment on a child killed by a stray bullet might remember the child’s appearance and tragic background, but not the reportage that overall crime rates are down.

Schemata can actually get in the way of our ability to directly observe what’s happening. In 1981, researcher Claudia Cohen instructed subjects to watch a video of a woman celebrating her birthday. Some were told that she was a waitress, while others were told she was a librarian. Later, the groups were asked to reconstruct the scene. The people who were told she was a waitress remembered her having a beer; those told she was a librarian remembered her wearing glasses and listening to classical music (the video showed her doing all three). The information that didn’t jibe with her profession was more often forgotten. In some cases, schemata are so powerful they can even lead to information being fabricated: Doris Graber, the news researcher, found that up to a third of her forty-eight subjects had added details to their memories of twelve television news stories shown to them, based on the schemata those stories activated.

Once we’ve acquired schemata, we’re predisposed to strengthen them. Psychological researchers call this

You are reading The Filter Bubble