efforts in building more sophisticated models without regard to the ability of such models to more accurately predict real-life data”, Makridakis and Hibon write.
Someone may counter with the following argument: Perhaps economists’ forecasts create feedback that cancels their effect (this is called the Lucas critique, after the economist Robert Lucas). Let’s say economists predict inflation; in response to these expectations the Federal Reserve acts and lowers inflation. So you cannot judge the forecast accuracy in economics as you would with other events. I agree with this point, but I do not believe that it is the cause of the economists’ failure to predict. The world is far too complicated for their discipline.
When an economist fails to predict outliers he often invokes the issue of earthquakes or revolutions, claiming that he is not into geodesy, atmospheric sciences, or political science, instead of incorporating these fields into his studies and accepting that his field does not exist in isolation. Economics is the most insular of fields; it is the one that quotes least from outside itself! Economics is perhaps the subject that currently has the highest number of philistine scholars – scholarship without erudition and natural curiosity can close your mind and lead to the fragmentation of disciplines.
“OTHER THAN THAT”, IT WAS OKAY
We have used the story of the Sydney Opera House as a springboard for our discussion of prediction. We will now address another constant in human nature: a systematic error made by project planners, coming from a mixture of human nature, the complexity of the world, and the structure of organizations. In order to survive, institutions may need to give themselves and others the appearance of having a “vision”.
Plans fail because of what we have called tunneling, the neglect of sources of uncertainty outside the plan itself.
The typical scenario is as follows. Joe, a nonfiction writer, gets a book contract with a set final date for delivery two years from now. The topic is relatively easy: the authorized biography of the writer Salman Rushdie, for which Joe has compiled ample data. He has even tracked down Rushdie’s former girlfriends and is thrilled at the prospect of pleasant interviews. Two years later, minus, say, three months, he calls to explain to the publisher that he will be a little late.
Let’s look at the source of the biographer’s underestimation of the time for completion. He projected his own schedule, but he tunneled, as he did not forecast that some “external” events would emerge to slow him down. Among these external events were the attacks of September 11, 2001, which set him back several months; trips to Minnesota to assist his ailing mother (who eventually recovered); and many more, like a broken engagement (though not with Rushdie’s ex-girlfriend). “Other than that”, it was all within his plan; his own work did not stray from schedule in the least. He does not feel responsible for his failure.[30]
We can run experiments and test for repeatability to verify whether such errors in projection are part of human nature. Researchers have tested how students estimate the time needed to complete their projects. In one representative test, they split a group into two categories, optimistic and pessimistic. Optimistic students promised twenty-six days; the pessimistic ones, forty-seven. The actual average time to completion turned out to be fifty-six days – note that even the pessimists underestimated.
The example of Joe the writer is a mild one. I selected it because it concerns a repeatable, routine task – for such tasks our planning errors are milder. With projects of great novelty, such as a military invasion, an all-out war, or something entirely new, errors explode upward. In fact, the more routine the task, the better you learn to forecast. But there is always something nonroutine in our modern environment.
There may be incentives for people to promise shorter completion dates – in order to win the book contract, or in order for the builder to get your down payment and use it for his upcoming trip to Antigua. But the planning problem exists even where there is no incentive to underestimate the duration (or the costs) of the task. As I said earlier, we are too narrow-minded a species to consider the possibility of events straying from our mental projections, but, in addition, we are too focused on matters internal to the project to take into account external uncertainty, the “unknown unknown”, so to speak, the contents of the unread books.
There is also the nerd effect, which stems from the mental elimination of off-model risks, or the focus on what lies within the model at the expense of what lies outside it.
We cannot truly plan, because we do not understand the future – but this is not necessarily bad news. We could plan while bearing in mind such limitations. It just takes guts.
In the not too distant past, say the precomputer days, projections remained vague and qualitative, one had to make a mental effort to keep track of them, and it was a strain to push scenarios into the future. It took pencils, erasers, reams of paper, and huge wastebaskets to engage in the activity. Add to that an accountant’s love for tedious, slow work. The activity of projecting, in short, was effortful, undesirable, and marred with self-doubt.
But things changed with the intrusion of the spreadsheet. When you put an Excel spreadsheet into computer-literate hands you get a “sales projection” effortlessly extending ad infinitum! Once on a page or on a computer screen, or, worse, in a PowerPoint presentation, the projection takes on a life of its own, losing its vagueness and abstraction and becoming what philosophers call reified, invested with concreteness; it takes on a new life as a tangible object.
My friend Brian Hinchcliffe suggested the following idea when we were both sweating at the local gym. Perhaps the ease with which one can project into the future by dragging cells in these spreadsheet programs is responsible for the armies of forecasters confidently producing longer-term forecasts (all the while tunneling on their assumptions). We have become worse planners than the Soviet Russians thanks to these potent computer programs given to those who are incapable of handling their knowledge. Like most commodity traders, Brian is a man of incisive and sometimes brutally painful realism.
A classical mental mechanism, called anchoring, seems to be at work here. You lower your anxiety about uncertainty by producing a number, then you “anchor” on it, like an object to hold on to in the middle of a vacuum. This anchoring mechanism was discovered by the fathers of the psychology of uncertainty, Danny Kahneman and Amos Tversky, early in their heuristics and biases project. It operates as follows. Kahneman and Tversky had their subjects spin a wheel of fortune. The subjects first looked at the number on the wheel, which they knew was random, then they were asked to estimate the number of African countries in the United Nations. Those who saw a low number on the wheel produced low estimates; those who saw a high number, higher ones.
Similarly, ask someone to provide you with the last four digits of his social security number. Then ask him to estimate the number of dentists in Manhattan. You will find that by making him aware of the four-digit number, you elicit an estimate that is correlated with it.
We use reference points in our heads, say sales projections, and start building beliefs around them because less mental effort is needed to compare an idea to a reference point than to evaluate it in the absolute.
So the introduction of a reference point in the forecaster’s mind will work wonders. This is no different from a starting point in a bargaining episode: you open with a high number (“I want a million for this house”); the bidder will answer “only eight-fifty” – the discussion will be determined by that initial level.