Few stories that begin, “Many Americans clueless…” can really be called “news.” Nonetheless, a recent study made headlines earlier this month by confirming what research has shown time and again: most people don’t know how many calories they supposedly burn. The 2010 Food & Health Survey by Cogent Research asked respondents (1,024 adults “nationally representative of the US population based on the Census”) to estimate how many calories someone of their age, height, weight, and activity levels “should consume” per day. Only 12% came within 100 calories, plus or minus, of their Estimated Energy Requirement (or EER, the formula currently used by the USDA), and 25% wouldn’t even venture a guess. The remaining 63% were just wrong. This seems to pose a problem for the claim that publishing calorie counts on menus will improve public health. Logically, if people don’t know whether they burn 10 or 10,000 calories in a day (the range of estimates collected in another survey, conducted in 2006 at the University of Vermont; full text with UMich login), knowing how many calories a particular menu item contains probably isn’t going to do them much good.
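For anyone curious what the EER formula the survey graded people against actually looks like, here’s a sketch of the adult equations from the Institute of Medicine’s Dietary Reference Intakes. The coefficients and activity values are the published ones as best I can reconstruct them, so treat the numbers as illustrative rather than authoritative:

```python
# Sketch of the adult (19+) Estimated Energy Requirement equations from the
# IOM Dietary Reference Intakes. Coefficients reproduced from memory of the
# published tables; illustrative, not a substitute for the real DRI report.

PA = {  # physical activity coefficients
    "male":   {"sedentary": 1.00, "low": 1.11, "active": 1.25, "very": 1.48},
    "female": {"sedentary": 1.00, "low": 1.12, "active": 1.27, "very": 1.45},
}

def eer(sex, age_yr, weight_kg, height_m, activity="sedentary"):
    """Estimated Energy Requirement in kcal/day."""
    pa = PA[sex][activity]
    if sex == "male":
        return 662 - 9.53 * age_yr + pa * (15.91 * weight_kg + 539.6 * height_m)
    return 354 - 6.91 * age_yr + pa * (9.36 * weight_kg + 726 * height_m)

# e.g. a sedentary 30-year-old man, 70 kg, 1.75 m:
print(round(eer("male", 30, 70, 1.75)))  # → 2434
```

Notice how many inputs the formula takes, and how sensitive the answer is to the activity coefficient alone; guessing your own number to within 100 calories is genuinely hard.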
The new calorie publishing policy actually includes a provision to help address this problem: in addition to the calorie counts of all menu items, menus will also have to publish the average daily calorie requirement for adults (2,000 kcal). New York City also attempted to address the problem of calorie ignorance when it instituted its calorie count requirement, launching an ad campaign aimed at drilling the 2,000-calorie-a-day figure into people’s heads.
But that’s not the kind of calorie ignorance I’m concerned about. For one, I don’t think the success of calorie counts in reducing obesity or improving public health depends on people keeping strict caloric budgets. Enough people have internalized the belief that they ought to eat fewer calories that the numbers could be useful as a point of comparison regardless of how many people can accurately estimate how many calories they supposedly burn based on their age, height, weight, and activity level. Even if you’re under the mistaken impression that you’re Michael Phelps, if your goal is to consume less energy, choosing between the 250-calorie sandwich and the 350-calorie one is a simple matter of figuring out which number is smaller. IF calorie counts were accurate, and they inspired at least some people to consistently choose lower-calorie items, and at least some of those people didn’t compensate for those choices by eating more later or being less active, and at least some of them continued to burn the same number of calories despite eating fewer of them, then the counts would actually have the intended effect. The magnitude of the effect might be small, but it would be in the right direction.
Of course, that’s a big “if.” I already addressed the first condition (calorie counts are often wrong), and will be looking at the next two (people don’t order fewer calories, and if they think they have, they are likely to compensate later) in more detail in later entries. The problem of most people not knowing how many calories they burn is related to the last condition: the mistaken assumption that people will continue to burn the same number of calories even if they reduce the number of calories they eat.
In other words, the problem isn’t that too few people know that the average adult probably burns something in the vicinity of 2,000 calories per day. The problem is that metabolism varies. It doesn’t stick to the formula based on height, weight, age, and activity levels. Most people don’t know how many calories they burn because they can’t know: it depends on many factors that the formulas don’t and can’t account for. And one of the things that usually causes people to burn fewer calories per day is eating fewer of them. This starts to get at one of the other reasons I don’t think posting calorie counts will have the desired effect: it’s true that eating fewer calories often leads to short-term weight loss, but the vast majority of people either get hungry and can’t sustain the energy deficit, or their bodies adjust to burning fewer calories and erase the deficit. Either way, almost all of them regain all of the weight they lost, and often more.
The Rise, Fall and Return of the Calories-in/Calories-out Myth
The idea that weight gain and loss is a simple matter of calories in versus calories out also dates back to Wilbur Atwater (the turn-of-the-20th-C. USDA scientist who was into burning food and excrement). Before Atwater, most people believed that the major nutrients in food were used in entirely different ways: proteins were thought to be “plastic” and used exclusively for tissue repair and growth, like little band-aids that the body could extract from food and simply insert where necessary; fats were similarly thought to be extracted from food and stored basically intact; only carbohydrates were thought to be transformed by digestion as they were burned for fuel. The discoveries that protein could be converted to glucose by the liver and that carbohydrates could be transformed into body fat were both seen as wildly counterintuitive and controversial. Some physicians continued to give advice based on the earlier principles as late as 1910.
However, in the last few decades of the 19th C., Atwater and others managed to convince an increasing number of people that a calorie was a calorie was a calorie: all of the major nutrients could be burned for fuel, and any fuel not immediately consumed in heat or motion would be stored as fat. The idea of seeking an equilibrium between calories ingested and calories used was first advocated by Irving Fisher, a Yale economist who drew a parallel between Atwater’s new measure of food energy and the laws of thermodynamic equilibrium and market equilibrium. This theory had widespread appeal in the age of Taylorism and scientific management, which coincided with the first major national trend of weight-loss dieting and the aesthetic ideal of thinness represented by the Gibson Girl and the flapper.* Caloric equilibrium was a way to apply to the body the same universal, rational logic thought to govern the laws of chemistry and the market. From the 1890s through the 1920s, the calorie reigned supreme. As historian Hillel Schwartz says:
“The calorie promised precision and essence in the same breath. It should have been as easy to put the body in order as it was to put the books in order for a factory” (Never Satisfied: A Cultural History of Diets, Fantasies, and Fat, 1986, 135).
That human bodies don’t reliably obey this logic in practice didn’t matter then any more than it seems to matter to most contemporary advocates of caloric algebra. Skeptics noted, even then, that many fat people seemed to eat much smaller meals than thin people, and that some people could reduce their intake to practically nothing without losing weight while others seemed to eat constantly without gaining weight. But the theory of caloric equilibrium is powerfully seductive, not just because of its simple, elegant logic, but also because it seems to “work,” at least in the short term. People who reduce the number of calories they eat do tend to lose weight initially, often at approximately the predicted rate of 1 lb per 3,500 calories. That offers a kind of intermittent reinforcement. When it doesn’t work or stops working, people scramble to come up with excuses: either the dieter’s estimates of how much they were eating must have been wrong, or they were “cheating” and eating too much (more on this in the entry on why calorie-cutting diets fail).
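The “predicted rate” is just the bookkeeping of the 3,500-calories-per-pound rule. A minimal sketch of why the arithmetic looks so convincing at first, and how even a modest metabolic adjustment (the 300 kcal/day figure below is purely hypothetical, for illustration) quietly erodes the predicted deficit:

```python
# The naive 3,500 kcal = 1 lb bookkeeping behind the "predicted rate."
KCAL_PER_LB = 3500

def predicted_loss_lb(daily_deficit_kcal, days):
    """Straight-line prediction: every 3,500 kcal of cumulative deficit = 1 lb lost."""
    return daily_deficit_kcal * days / KCAL_PER_LB

# A 500 kcal/day deficit "should" yield about a pound a week:
print(predicted_loss_lb(500, 7))        # → 1.0

# But if daily expenditure drifts down by, say, 300 kcal (a made-up
# illustration of metabolic adjustment), the same menu choices now
# produce a much smaller real deficit:
print(predicted_loss_lb(500 - 300, 7))  # → 0.4
```

The straight line only holds as long as expenditure stays fixed, which is exactly the assumption bodies refuse to honor.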
However, caloric math hasn’t always been the dominant nutritional theory (despite what many people claim). In the 1930s and 1940s, as weight-loss dieting became less popular and feminine ideals got a little plumper again, nutrition science became more concerned with the psychology of appetite—often relying on Freudian-influenced theories about how traumatic childhood experiences and sexual dysfunction might manifest as insatiable hunger—and a new theory of body types.
The theory of somatotypes was initially developed by William Sheldon in the 1940s as part of an attempt to use measurements of the body to predict personality types and behaviors, like criminality. He proposed a sort of three-part continuum among three extremes: the thin ectomorph, the fat endomorph, and the muscular mesomorph, based on the three layers of tissue observed in mammalian embryos. It was similar to the medieval medical theory of different physical constitutions based on the balance of humors (blood, phlegm, bile, etc.) but with a new sciencey gloss and some nationalist overtones. Sheldon noted, for example, that Christ had traditionally been portrayed as an ectomorph (supposed to be cerebral and introspective), and suggested that Christian America would therefore have a military advantage over the mesomorphic Nazis (supposed to be constitutionally bold and arrogant). Somatotypes were later used to customize diet and exercise plans, but at the time, they were primarily embraced as a way to describe and justify the apparent differences in people’s ability to be thin. Unlike the algebra of calories in/calories out, somatotyping suggested that no matter what they did, endomorphs could never become ectomorphs. They simply did not burn calories at the same rate, and their bodies would cling stubbornly to fat, especially in the abdominal region.
Sheldon’s theory, like many projects with eugenicist overtones, fell out of favor somewhat after WWII, especially after the embryonic tissue theory was discredited. However, his somatotypes live on, primarily among bodybuilders and competitive weightlifters, perhaps because they still need some way to explain individual differences in outcomes for identical (and rigorously monitored) inputs. There are also subtler echoes in the idea that people have individual “set points” or genetic predispositions towards certain body types. That isn’t to say those theories have no validity: it seems far more likely that there are genetic components to body size than that all family resemblances are environmental. However, as the new calorie labeling policy exemplifies, the universalizing logic of calories in/calories out is back with a vengeance. Almost every popular diet plan today, with the exception of paleo/low-carb/grain-free diets, is based on creating a calorie deficit (and in practice, many low-carb diets also “work,” to the extent that they do, at least partially by reducing caloric intake).
The point of this little history lesson is that the extent to which people subscribe to either the theory of calories in/calories out or the theory of intransigent body types seems to have more to do with what they want to believe than with the available evidence. Calories-in/calories-out may appeal to Americans today for different reasons than it appealed to the Enlightenment-style rationalist seeking to find and apply universal laws to everything. I suspect that it has a lot to do with normative egalitarianism and faith in meritocracy, e.g., anyone can be thin if they eat right and exercise. The idea of predetermined body types, on the other hand, appealed to mid-century Americans eager to identify and justify differences and hierarchies of difference. But in every case, the evidence is either cherry-picked or gathered specifically to support the theory, rather than the theory emerging from the evidence, which is complicated and contradictory.
*Before the 1880s, the practice of “dieting” and various regimens like Grahamism (inspired by Sylvester Graham), the water cure, and temperance were concerned more with spiritual purity or alleviating the discomforts of indigestion and constipation than with achieving a particular body shape or size. Graham’s followers actually weighed themselves to prove that they weren’t losing weight, because thinness was associated with poor health.
Even if most people can now estimate with some accuracy how many calories they burn on an average day, and the calorie counts help them eat fewer calories than they did before or would have otherwise, there’s no guarantee that they’ll continue burning the same number of calories if they continue to eat fewer of them, which they would have to do for the policy to have long-term effects. In fact, after more than six months of calorie restriction, most people appear to burn fewer calories or start eating more, and any weight lost is regained. So either the calorie counts will change nothing about how people order at restaurants, and there will be no effect on their weight or health; or they will have the desired effect on how people order… but there still won’t be any effect on their weight or health.
But boy am I glad we have easier access to that critical information.