For that matter (and no, I'm not making this up) office workers are more likely to pay for coffee from a communal coffee machine if the coffee machine is positioned under a poster featuring a pair of eyes — which somehow makes people feel that they are accountable — than under a poster that has a picture of flowers.

10. Distance yourself. Buddhists tell us that everything seems more important in the moment, and for the most part, they're right. If an out-of-control car is bearing down on you, by all means, drop everything and focus all of your energies on the short-term goal of getting out of the way. But if I want to top off the meal with that chocolate cake, I should ask myself this: am I overvaluing my current goals (satisfying my sweet tooth) relative to my long-term goals (staying healthy)? It'll feel good now to send that email excoriating your boss, but next week you'll probably regret it.

Our mind is set up to ponder the near and the far in almost totally different ways, the near in concrete terms, the far in abstract terms. It's not always better to think in more distant terms; remember the last time you promised to do something six months hence, say, attend a charity event or volunteer at your child's school? Your promise probably seemed innocuous at the time but might have felt like an imposition when the date came to actually fulfill it. Whenever we can, we should ask, How will my future self feel about this decision? It pays to recognize the differences in how we treat the here and now versus the future, and try to use and balance both modes of thinking — immediate and distant — so we won't fall prey to basing choices entirely on what happens to be in our mind in the immediate moment. (A fine corollary: wait awhile. If you still want it tomorrow, it may be important; if the need passes, it probably wasn't.) Empirical research shows that irrationality often dissipates with time, and complex decisions work best if given time to steep.

11. Beware the vivid, the personal, and the anecdotal. This is another corollary to 'distancing ourselves,' also easier said than done. In earlier chapters we saw the relative temptation prompted by cookies that we can see versus cookies that we merely read about. An even more potent illustration might be Timothy Wilson's study of undergraduates and condom brands, which yielded a classic 'do as I say, not as I do' result. Subjects in the experiment were given two sources of information, the results of a statistically robust study in Consumer Reports favoring condoms of Brand A and a single anecdotal tale (allegedly written by another student) recommending Brand B, on the grounds that a condom of Brand A had burst in the middle of intercourse, leading to considerable anxiety about possible pregnancy. Virtually all students agreed in principle that Consumer Reports would be more reliable and also that they would not want their friends to choose on the basis of anecdotal evidence. But when asked to choose for themselves, nearly a third (31 percent) still yielded to the vivid and anecdotal, and went with Brand B. Our four-legged ancestors perhaps couldn't help but pay attention to whatever seemed most colorful or dramatic; we have the luxury to take the time to reflect, and it behooves us to use it, compensating for our vulnerability to the vivid by giving special weight to the impersonal but scientific.

12. Pick your spots. Decisions are psychologically, and even physically, costly, and it would be impossible to delay every decision until we had complete information and time to reflect on every contingency and counteralternative. The strategies I've given in this list are handy, but never forget the tale of Buridan's Ass, the donkey that starved to death while trying to choose between two equally attractive, equally close patches of hay. Reserve your most careful decision making for the choices that matter most.

13. Try to be rational. This last suggestion may sound unbelievably trivial, on par with the world's most worthless stock market advice ('Buy low, sell high' — theoretically sound yet utterly useless). But reminding yourself to be rational is not as pointless as it sounds.

Recall, for example, 'mortality salience,' a phenomenon I described earlier in the chapter on belief: people who are led in passing to think about their own death tend to be harsher toward members of other groups. Simply telling them to consider their answers before responding and 'to be as rational and analytic as possible' (instead of just answering with their 'gut-level reactions') reduces the effect. Another recent study shows similar results.

One of the most important reasons why it just might help to tell yourself to be rational is that in so doing, you can, with practice, automatically prime yourself to use some of the other techniques I've just described (such as considering alternatives or holding yourself accountable for your decisions). Telling ourselves to be rational isn't, on its own, likely to be enough, but it might just help in tandem with the rest.

Every one of these suggestions is based on sound empirical studies of the limits of the human mind. Each, in its own way, addresses a different weakness in the human mind and each, in its own way, offers a technique for smoothing out some of the rough spots in our evolution.

With a properly nuanced understanding of the balance between the strengths and weaknesses of the human mind, we may have an opportunity to help not only ourselves but society. Consider, for example, our outmoded system of education, still primarily steeped in ideas from nineteenth-century pedagogy, with its outsized emphasis on memorization echoing the Industrial Revolution and Dickens's stern schoolmaster, Mr. Gradgrind: 'Now, what I want is, Facts. Teach these boys and girls nothing but Facts ... Plant nothing else, and root out everything else.' But it scarcely does what education ought to do, which is to help our children learn how to fend for themselves. I doubt that such a heavy dose of memorization ever served a useful purpose, but in the age of Google, asking a child to memorize the state capital has long since outlived its usefulness.

Deanna Kuhn, a leading educational psychologist and author of the recent book Education for Thinking, presents a vignette that reminds me entirely too much of my own middle-school experience: a seventh-grader at a considerably above-average school asked his (well-regarded) social studies teacher, 'Why do we have to learn the names of all thirteen colonies?' The teacher's answer, delivered without hesitation, was 'Well, we're going to learn all fifty states by June, so we might as well learn the first thirteen now.' Clear evidence that the memorization cart has come before the educational horse. There is value, to be sure, in teaching children the history of their own country and — especially in light of increasing globalization — the world, but a memorized list of states casts no real light on history and leaves a student with no genuine skills for understanding (say) current events. The result, in the words of one researcher, is that

many students are unable to give evidence of more than a superficial understanding of the concepts and relationships that are fundamental to the subjects they have studied, or of an ability to apply the content knowledge they have acquired to real-world problems ... it is possible to finish 12 or 13 years of public education in the United States without developing much competence as a thinker.

In the information age, children have no trouble finding information, but they have trouble interpreting it. The fact (discussed earlier) that we tend to believe first and ask questions later is truly dangerous in the era of the Internet — where anyone, even someone with no credentials, can publish anything. Yet studies show that teenagers frequently take whatever they read on the Internet at face value.
