hyperinsulinemia, was the problem, which is consistent with the carbohydrate hypothesis of chronic disease. Both of these, Neel suggested, would be triggered by the “composition of the diet, and more specifically the use of highly refined carbohydrates.”
It wasn’t until the late 1970s, just a few years before Neel himself publicly rejected his hypothesis, that obesity researchers began invoking thrifty genes as the reason why putting on weight seems so much easier than losing it. Jules Hirsch of Rockefeller University was among the first to do so, and his logic is noteworthy, because his primary goal was to establish that humans, like every other species of animal, had apparently evolved a homeostatic system to regulate weight, and one that would do so successfully against fluctuations in food availability. We eat during the day, for example, and yet have to supply nutrients to our cells all night long, while we sleep, so we must have evolved a fuel storage system that takes this into account. “To me, it would be most unthinkable if we did not have a complex, integrated system to assure that a fraction of what we eat is put aside and stored,” Hirsch wrote in 1977. To explain why this storage system might cause obesity so often in modern societies, he assumed as fact something that Neel had never considered more than speculation. “The biggest segment of man’s history is covered by times when food was scarce and was acquired in unpredictable amounts and by dint of tremendous caloric expenditure,” Hirsch suggested. “The long history of food scarcity and its persistence in much of the world could not have gone unnoticed by such an adaptive organism as man. Hoarding and caloric miserliness are built into our fabric.”
This was one of the first public statements of the notion that would evolve into the kind of unconditional proclamation made by Kelly Brownell a quarter century later, that the human body is an “exquisitely efficient calorie conservation machine.” But it now depended on an assumption about human evolution that was contradicted by the anthropological evidence itself—that human history was dominated by what Jared Diamond had called the “conditions of unpredictably alternating feast and famine that characterized the traditional human lifestyle.” Reasonable as this may seem, we have no evidence that food was ever any harder to come by for humans than for any other organism on the planet, at least not until our ancestors began radically reshaping their environment ten thousand years ago, with the invention of agriculture.
Both the anthropological remains and the eyewitness testimony of early European explorers suggest that much of the planet, prior to the last century or two, was a “paradise for hunting,” in the words of the Emory University anthropologist Melvin Konner and his collaborators, with a diversity of game, both large and small, “present in almost unimaginable numbers.”* Though famines have certainly been documented among hunter-gatherer populations more recently, there’s little reason to believe that this happened prior to the industrial revolution. Those isolated populations that managed to survive as hunter-gatherers well into the twentieth century, as the anthropologist Mark Nathan Cohen has written, were “conspicuously well-nourished in qualitative terms and at least adequately nourished in quantitative terms.”
Hunter-gatherers lived in equilibrium with their environment just as every other species does. The oft-cited example is the !Kung Bushmen of the semi-arid Kalahari desert, who were studied by Richard Lee of the University of Toronto and a team of anthropologists in the mid-1960s. Their observations, Lee noted, were made during “the third year of one of the most severe droughts in South Africa’s history.” The United Nations had instituted a famine-relief program for the local agriculturalists and pastoralists, and yet the Bushmen still survived easily on “some relatively abundant high-quality foods,” and they did not “have to walk very far or work very hard to get them.” The !Kung women would gather enough food in one day to feed their families for the next three, Lee and his colleagues reported; they would spend the remaining time resting, visiting, or entertaining visitors from other camps.
The prevailing opinion among anthropologists, not to be confused with that of nutritionists and public-health authorities, is that hunting and gathering allow for such a varied and extensive diet, including not just roots and berries but large and small game, insects, scavenged meat (often eaten at “levels of decay that would horrify a European”), and even occasionally other humans, that the likelihood of the simultaneous failure of all nutritional resources is vanishingly small. When hunting failed, these populations could still rely on foraging of plant food and insects, and when gathering failed “during long-continued drought,” as the missionary explorer David Livingstone noted of a South African tribe in the mid-nineteenth century, they could relocate to the local water holes, where “very great numbers of the large game” also congregated by necessity. This resiliency of hunting and gathering is now thought to explain why it survived for two million years before giving way to agriculture. In those areas where human remains span the transition from hunter-gatherer societies to farmers, anthropologists have reported that both nutrition and health declined, rather than improved, with the adoption of agriculture. (It was this observation that led Jared Diamond to describe agriculture as “the worst mistake in the history of the human race.”)
Famines were both common and severe in Europe until the nineteenth century, which would suggest that those with European ancestry should be the most likely to have thrifty genes, and so the most susceptible to obesity and diabetes in our modern toxic environments. Yet among Europeans there is “a uniquely low occurrence of Type 2 diabetes,” as Diamond puts it, further evidence that the thrifty-gene hypothesis is incorrect.
Species adapt to their environment over successive generations. Those that don’t, die off. When food is abundant, species multiply; they don’t get obese and diabetic.
When earlier generations of obesity researchers discussed the storage of fat in humans and animals, they assumed that avoiding excessive fat is as important to the survival of any species as avoiding starvation. Since the average 150-pound man with a body fat percentage of only 10 percent is still carrying enough fat calories to survive one month or more of total starvation, it seems superfluous to carry around more if it might have negative consequences. “Survival of the species must have depended many times both on the ability to store adequate yet
The thrifty-gene hypothesis, on the other hand, implies that we (at least some of us) are evolutionarily adapted to survive extreme periods of famine, but assigns to humans the unique concession of having evolved in an environment in which excess fat accumulation would not be a burden or lead to untimely death—by inhibiting our ability to escape from predators or enemies, for instance, or our ability to hunt or perhaps even gather. It presupposes that we remain lean, or at least some of us do, only as long as we remain hungry or simply lack sufficient food to indulge our evolutionary drive to get fat—an explanation for leanness that the British metabolism researchers Nancy Rothwell and Michael Stock described in 1981 as “facile and unlikely,” a kind way of putting it. The “major objection” to the thrifty-genotype hypothesis, noted Rothwell and Stock, “must be based on the observation that most wild animals are in fact very lean” and that this leanness persists “even when adequate food is supplied,” just as we’ve seen in hunter-gatherers. If the thrifty-gene hypothesis were true of any species, it would suggest that all we had to do was put them in a cage with plentiful food available and they would fatten up and become diabetic, and this is simply not the case.
Proponents of the thrifty-gene hypothesis, however, will invoke a single laboratory model—the Israeli sand rat—to support the notion that at least some wild animals will get fat and diabetic if caged with sufficient food. “When this animal is removed from the sparse diet of its natural environment and given an abundant, high-calorie diet,” wrote the Australian diabetologist Paul Zimmet in a 2001 article in