Meet the Pawpaw, aka the Michigan Banana

Tied down with brown twine
Up past the tree line
Up by I hope where
The King of Spain don’t care.
I’m sitting up in my pawpaw tree
Wait they make mango mush outta me.
The Fiery Furnaces

A Taste of the Tropics in the Midwest

Photo by flickr user Voxphoto, who writes Petravoxel and Silverbased

Discovering the pawpaw has been a lot like discovering the ground cherry. In both cases, part of the thrill was getting to try something you can’t buy at a typical grocery store. However, they’re both pretty unpleasant when they’re green—sour, bitter, and mildly astringent—so based on my very first taste of both, I wasn’t 100% convinced they were safe for human consumption. But once they ripen, they’re just glorious. They’re as sweet as strawberries or the ripest melon or peach and fill the kitchen with a beautiful citrusy, floral perfume. And they’re both strangely reminiscent of vanilla custard. Despite whatever challenges are involved in cultivating and transporting them, they’re both so appealing that I’m genuinely surprised no one’s managed to make them commercially viable, at least in some kind of frozen or preserved form.

However, the pawpaw seems just slightly more incredible than the ground cherry. The latter at least has familiar local analogs like cherry tomatoes and blueberries. The pawpaw, on the other hand, seems like nothing that ought to grow anywhere within fifteen degrees latitude of Michigan. They’re shaped like champagne mangos, but the flesh is paler and much creamier, almost like ripe avocado. They smell like a cross between orange blossom, pear, pineapple, and honey. The flavor is milder than the aroma, but still unmistakably tropical.

I hate when that word is used as an independent flavor descriptor—normally, as far as I’m concerned, “tropical” is a climate or a region, not a flavor—but whatever the common denominator is between mangos and guavas and papayas, whatever combination of volatile esters and carbonyls and alcohols artificially-flavored “tropical” candies attempt to mimic, pawpaws have it too. As the common name implies, their closest gustatory cousin is probably the banana. In addition to the “Michigan banana,” the pawpaw has also been called the prairie banana, Indiana (Hoosier) banana, West Virginia banana, Kansas banana, Kentucky banana, Missouri banana, Ozark banana, poor man’s banana, and banango.

Flourishing, it looks a bit like a magnolia bush

Even the tree looks like it belongs somewhere with winters that could be described as “balmy.” It has broad, glossy leaves and big reddish-purple flowers. My supplier, a generous friend with a pawpaw tree in his backyard, says it looks like a tree that “took a wrong turn at Panama and ended up in the US midwest.”

Where to Find Them, and Why It’s So Hard

Unfortunately, unlike ground cherries, pawpaws haven’t shown up at my local farmer’s market (at least not that I’ve seen). They may be available at some markets in Ohio and Indiana or farther south. If you do come across someone selling them, don’t be turned off by bruising or soft spots. That often indicates that the fruit was ripened on the tree, which is what you want. Unlike tomatoes, which are merely lackluster when they’re picked while green and artificially “ripened,” a pawpaw picked early may remain unpleasantly bitter even after it softens into mush. Many people actually think they taste best when they’re a little spotted and soggy. However, once they’re ripe, they’re very delicate—just like a brown banana or a super-ripe pear. Probably the number one reason the pawpaw hasn’t been cultivated for mass distribution is that it would be virtually impossible to transport tree-ripened pawpaws in bulk without damaging them so much that most people wouldn’t want to buy them.

the flower, which even kind of looks a little like raw meat, doesn’t it? from wikimedia commons

Another reason they’re hard to find is that they’re apparently not especially easy to grow. The flowers don’t self-pollinate, and their evolutionary adaptation to that is to emit an odor similar to rotting meat to attract flies. However—as I’m sure people who grow them near their homes appreciate—the odor is quite faint, so it doesn’t always do an especially good job of attracting the desired pollinator. Some growers actually drape roadkill on the branches of the tree or around the base to attract more flies in hopes of getting more fruit out of their trees (CSMonitor).

They’re not really vulnerable to any insect pests, but foxes, possums, squirrels, raccoons, and bears are very fond of them, and may even knock nearly-ripe fruit off the tree and then leave it to ripen on the ground for a day or two before coming back for it. Just to complicate things even further, although the trees will tolerate the midwestern winter, they don’t like it if the ground stays too cold for too long, so depending on how far north of the equator they are, they may only grow in low-lying areas near running water.

On top of all of that, it can take half a dozen years or more for a seedling to reach fruit-bearing maturity, and even if you manage to get a pawpaw tree or two to thrive, flower, and fruit, there’s a good chance it will only yield a substantial harvest for about four years. They will propagate—like aspens, a single root base can send up multiple trees—so if you manage to get one tree going, you’ll probably eventually have a small thicket (which is also why references in early American literature are almost always to a “paw paw patch” rather than to a single tree). Presumably, the younger trees (okay, technically, just newer branches of the same clonal colony) will mature as the older ones pass into dormancy, providing you with a steady stream of fruit. However, that lifecycle doesn’t lend itself well to sustained, large-scale pawpaw production.

There is a research program at Kentucky State University dedicated to studying and promoting pawpaw cultivation, which should come as especially good news for anyone concerned about the future or ethics of banana consumption. And there’s some guy named John Vukmirovich in Chicago who claims to be part of a “pawpaw conspiracy,” which essentially involves planting pawpaw seeds around the south side of the city. The Christian Science Monitor interviewed him for their 2008 story about the pawpaw’s “comeback,” and his rationale was:

We need to create more than we consume. We need to reverse the process and green the world more.

I’m not sure if planting such a fussy little tree is really going to “green the world,” or what planting seeds has to do with any “conspiracy,” but I would be delighted to discover a pawpaw growing on a tree in the middle of a city sidewalk so by all means, conspire on. 

If either of those projects succeed, perhaps someday we’ll be able to buy pawpaw pulp in the frozen section of any supermarket or forage fresh fruit even in urban areas. But for now, your best bet is to make friends with someone who has a tree in their yard, try to find a commercial grower near where you live, or order a seedling and start your own pawpaw patch. Unfortunately, I missed the 12th annual Pawpaw Festival held in Albany, Ohio the weekend before last, but consider this your advance notice—if you really want to try pawpaw and can’t find them anywhere else, start planning now to attend the 13th annual Pawpaw Festival, which I imagine will probably be held in mid-September next year.

How to Eat Them

You basically just want to get them into your pawpaw-hole any which way you can, but here are some suggestions:

1) Halve or quarter the fruit and scoop out the flesh with a spoon, avoiding the large, lima bean-shaped seeds.

2) Cut off the top and bottom and then cut the peel off in long strips, like a mango, and slice or cut into wedges, prying the seeds out with the knife as you go.

3) Just take a bite. The skin is slightly bitter, but it’s not as tough as mango skin and it won’t hurt you.

Or, if you’re the lucky sort who has too many pawpaws to eat plain and raw, there seem to be two classic ways of preserving them:

4) Make pawpaw bread by substituting pawpaw puree for banana puree in any banana bread recipe (or for the pear puree in this recipe). According to the Wikipedia article about pawpaws, in West Virginia, this style of bread is traditionally baked in canning jars and then processed in a hot water bath to seal, which will preserve it for up to a year. Presumably you could do that with any kind of snack cake, although the Ball Mason Jar folks don’t endorse it because the variable heat in the oven could cause the jars to break and the very center of the bread might not reach safe internal temperatures. It seems to me like it would be pretty low-risk as long as you test the center to make sure it’s baked through. The worst that might happen is when you went to open it, you’d find mold or it would smell rotten and you’d have to throw it away. But I suppose just to be on the safe side, you could just bake the loaves in normal pans and freeze or give away any excess.

5) Puree the flesh and freeze it, and then thaw it slightly before serving. Frozen pawpaw was apparently one of George Washington’s favorite desserts, and I imagine it would taste even more like ice cream than it does at room temperature. You could also blend frozen pawpaw pulp with other fruits and juices or milk and ice cream to make a smoothie or milkshake.

The aforementioned Kentucky State University pawpaw research group also offers recipes for pawpaw pie, custard, cake, cookies, muffins, ice cream, preserves, zabaglione, sherbet, or punch. The fruit also ferments easily, which some people take advantage of to make pawpaw wine, and I imagine if you distilled fermented pawpaw mash, you’d end up with a powerful pawpaw brandy. But I don’t expect to be trying that, or any of the other recipes anytime soon because I honestly can’t imagine wanting to fuss with the original. It tastes like tropical fruit custard right off the tree. Some people compare it to banana cream pie filling, which is like a fantasy out of Charlie and the Chocolate Factory: these bananas don’t taste like bananas, they taste like what people do to bananas to make them into dessert!

All of which means you don’t even have to be a locavore to lament how much harder it is to get your hands on pawpaws than bananas. If you do manage to get your hands on some and you like them too, perhaps you should join Vukmirovich and me in the “pawpaw conspiracy,” and save the seeds and plant them wherever you roam. Check back in a few years for advice on step 2: how to gather and drape roadkill all over town without anyone noticing.

Buckeyes, Schmuckeyes, or if you prefer, Peanut Butter Bon-bons

When I first set out to make these chocolate-covered peanut-butter balls, I intended not to refer to them by their traditional Midwestern moniker. Surely, I thought, neither the State of Ohio nor its flagship public university can claim any special relationship to sweetened peanut butter in a chocolate shell. There’s no reason I have to invoke tOSU’s mascot in the middle of football season in Michigan. But then I found some pictures of actual buckeye nuts, and I’ll be damned if they don’t look uncannily like their namesake.

shown here popping out of the big, spiny, smelly balls that grow on the trees

and here, looking almost indistinguishable from the chocolate variety

really, the only difference is that the candy version has a flat edge

and yes, I posed these specifically to mimic the above picture

I’ll eat YOUR eyes! Whitetail buck from flickr user key lime pie yumyum

Real buckeyes are the seeds of trees in the genus Aesculus, which includes between 13 and 19 species (depending on how you count) that grow all across the Northern Hemisphere. The name “buckeye” is generally attributed to an American Indian word for the seeds and the nutritious mash they made from them after roasting—“hetuck,” which means “eye of a buck.” One species in particular, Aesculus glabra, became commonly known as the “Ohio buckeye,” even though it grows throughout the American Midwest and Great Plains regions, ranging from southern Ontario to northern Texas, apparently because the botanist who gave the tree its English name first encountered it on the banks of the Ohio River.

However, there’s also a California buckeye and a Texas buckeye and even a Japanese buckeye. And the seeds of all the trees in the genus—including Aesculus glabra—are also commonly known as horse chestnuts, after the larger family they belong to (Hippocastanaceae). So there doesn’t seem to be any simple botanical or taxonomical reason why the “buckeye” became so firmly associated with the state of Ohio.

How the Buckeye Became Ohioan and Ohioans Became Buckeyes

According to one story, it all goes back to the spectacularly-named Ebenezer Sproat (or Sprout), who was a Colonel of the Continental Army in the Revolutionary War. After an unsuccessful post-war stint as a merchant, he became a surveyor for the state of Rhode Island and bought stock in the Ohio Company of Associates, which sent him west with the group led by Rufus Putnam that founded Marietta, Ohio, the first permanent American settlement in the Northwest Territory. There, Sproat became the first sheriff in the Northwest Territory. And aside from being a relatively prominent citizen, he also happened to be quite tall and “of perfect proportions,” according to Wikipedia, whatever that’s supposed to mean. The Indians in Ohio were impressed with his height and/or his importance, and thus came to refer to him as “Hetuck” or “Big Buckeye.” A similar account suggests that it was mostly his height—claiming he was 6’4” (which would have been tall indeed in the 18th century) and that he earned the sobriquet on September 2, 1788, when he was leading a procession of judges to the Marietta courthouse. Indians watching the giant of a man walk by began calling out “Hetuck, hetuck.”

E. G. Booz’s Log Cabin whiskey bottle, c. 1860–1890, from Cornell University Library

But it’s not entirely clear why that nickname would have ever been generalized to the shorter residents of the region. The more commonly-accepted theory is that the association between buckeyes and Ohio(ans) has something to do with William Henry Harrison.

Harrison was a resident of Ohio in 1840 when he made his first, successful presidential run. According to the Wikipedia article about him, he had already acquired the nickname “Buckeye,” as a “term of affection” when he served in the U.S. Congress, first as a representative of the Northwest Territory and then as one of Ohio’s Senators—presumably because of the prevalence of the tree in the regions he represented. However, the general consensus elsewhere is that Harrison and his presidential campaign advisors carefully cultivated the buckeye mascot and nickname to bolster Harrison’s image as a “man of the people.” Particularly in Ohio, log cabins were frequently made from the wood of buckeye trees and people in rural areas used to string up the nuts that would accumulate wherever the trees grew, so the buckeye was a useful symbol of the kind of rustic frontier populism that Harrison was trying to project.

Meanwhile, they portrayed the Democratic incumbent, Martin Van Buren, as an elitist, or even as a royalist intent on the restoration of the British crown, largely by publicizing the fact that he had hired a French chef for the White House and purportedly enjoyed French wine. Van Buren was actually the son of small upstate New York farmers and educated in rural schoolhouses, whereas Harrison was the son of wealthy Virginia slaveholders and educated in elite New England academies—he even studied medicine with the renowned Dr. Benjamin Rush before deciding he didn’t want to be a doctor. But Harrison successfully managed to convince people he was one of them with the help of bottles of whiskey shaped like log cabins and campaign propaganda like this pull card:

From The Granger Collection: Martin Van Buren smiles when drinking “A Beautiful Goblet of White House Champagne”;
pull the string, and he frowns with “An Ugly Mug of Log-Cabin Hard Cider”

Shortly after that, popular songs and texts started to show up that refer to “Buckeye boys” and “Buckeye girls” and to Ohio as “the Buckeye State.” In the 1850s, Samuel Sullivan Cox wrote a series of letters based on his travels to Europe and the Ottoman Empire, which he published under the title “A Buckeye Abroad.” The association obviously continued to the point that now, there are probably almost as many drycleaners, diners, and car repair shops named Buckeye Blank in Ohio as there are Empire Blanks in New York City.

it’s not even really “anthropomorphic,” because that would be a nut with arms and legs…this one has a separate torso

Brutus the Buckeye, the bizarre nut-headed mascot that dances on the sidelines at football and basketball games, wasn’t invented until 1965. But students, alumni, and athletes from the Ohio State University [awkward definite article sic] were always called “Buckeyes.” The name is older than the University itself, which was founded in 1870, and was seemingly applied to sports teams from the very beginning. The short-lived AA professional baseball team that existed in Columbus from 1883–84 was also named the Buckeyes. And Jesse Owens, who won four gold medals at the 1936 Olympics while he was a student at OSU, was sometimes called “the Buckeye Bullet.”

But What a Stupid Reason That Would Be Not to Make Them

So even though it probably originated with a dishonest political campaign (is there any other kind?), I still feel like I have to cede the name “buckeye” to Ohio—after all, it’s older than the UM v. tOSU rivalry itself. And it just seems foolish to deny the resemblance. But it would be a real shame to let the apparent legitimacy of a name that happens to be associated with any state or school bias you against the salt-studded awesomeness of homemade chocolate-covered, sweetened balls of nut butter. Sure, they’re basically just Reese’s peanut butter cups, but you shouldn’t underestimate the difference that good chocolate, flaky salt, and having personal control over the level of sweetness can make.

the toothpicks make for easier dipping, and it's easy enough to smooth away the holes

I probably wouldn’t normally bother with something so…I don’t know, cliché? Pedestrian? It’s not that I don’t like simple foods or classic flavor combinations, but somehow anything consisting primarily of peanut butter and chocolate just seems like cheating. Just like it seems like cheating whenever the contestants on Chopped use bacon if it’s not one of the secret ingredients, and like a petty perversion of justice that the bacon-cheater almost always wins.

However, this recipe popped up on Serious Eats just as I was musing about how maybe I should throw together some sort of sweet nibble in case we happened to have people over this weekend—something I could make in advance and that would keep relatively well in case we didn’t have people over. These seemed to fit the bill because like most cookies, you can make them well in advance of serving, but like most candies, they won’t get stale. But what really sold me was the description of the crunchy flakes of salt in the peanut butter mixture—“like little mouth-fireworks,” the author said.

If they seem too boring as is, you could mix up the nut butter/chocolate coating combination or add a third or fourth flavor element. You could make a Thai coconut version with a little chili pepper, powdered ginger, and dried coconut. Or mix in bits of toffee, puffed rice, or crumbled cookies for a different flavor or texture. You could use cashew butter or almond butter instead of peanut butter, powdered honey for some of the powdered sugar, and white or milk chocolate if any of those is more to your liking. You could even freeze little drops of fruit preserves or caramel and roll the nut butter around them so at room temperature, they’d melt into a sweet, gooey center. Now I’m dreaming of white chocolate-covered sunflower butter balls with vanilla caramel centers. You could even make a whole buffet of different buckeyes…and if you really can’t get past the name, just call them bon-bons or schmuckeyes instead. If you cede them to tOSU, I think that’s just another victory for “tWorst State Ever.”

Recipe: Peanut Butter Bon-Bons (from Serious Eats)
halved from the original, to make approximately 3 dozen

Ingredients:

  • 12 T. salted butter (or coconut oil)
  • 1 1/2 c. unsalted, unsweetened peanut butter (or any other nut or seed butter)
  • 3 c. confectioner’s sugar
  • 1 1/2 t. kosher salt (or more to taste)
  • 1 bag chocolate chips (or ~2 cups chopped bar chocolate, I used a 70% cacao)

1. Leave the butters at room temperature to soften.

2. Beat them together with a spatula or the paddle attachment of a stand mixer until completely smooth and well-combined.

the butters alone will be pretty liquidy

first addition of powdered sugar

but by the last addition it will be fairly stiff and should be able to be handled

3. Add the powdered sugar 1 cup at a time, mixing until it forms a thick, malleable dough.

4. Stir in the kosher salt just until evenly distributed—you want to add the salt at the end so it doesn’t dissolve into the butter. Put the bowl in the freezer for about 10 minutes.

5. Roll heaping tablespoons of the peanut butter mixture into balls about the size of walnuts (or buckeyes) and place on a cookie sheet lined with waxed paper or parchment paper. Place a toothpick in each ball and return to the freezer for 30 minutes.

6. Meanwhile, reserve a few pieces of chocolate and melt the rest in 15-second bursts in a microwave or a double-boiler just until it’s about 75% molten. You don’t want the chocolate to get too warm or it will burn.

7. Remove from the heat and stir occasionally until it’s entirely melted and slightly cooled, and then stir in the reserved pieces. Wrap the pot in a kitchen towel—you want to keep the chocolate around 88°F. I didn’t bother pulling out a candy thermometer, because that’s right around body temperature, so it should feel just barely warm to the touch. Otherwise, it won’t temper correctly: it will set slightly soft and greasy to the touch and may develop a white “bloom” on the surface. The reserved chips “seed” the melted chocolate with the right crystalline structure to make it harden.

8. Dip each ball in the chocolate to coat and place on waxed paper or parchment paper until firm. Remove the toothpicks and gently smooth over the hole. Store in an air-tight container in a cool place or refrigerate until ready to serve.

Price, Sacrifice, and the Food Movement’s “Virtue” Problem

I'm not elitist, I just think you should reconsider whether your cell phones or Nike shoes or whatever it is you fat fucks spend your money on is really more important than eating heirloom beets. I just want you to make what I believe would be the more satisfying choice for you. Because I am the authority on what you find satisfying.

Urging others to eat better (and thus more expensive) food is not elitist, [Alice Waters] said. It is simply a matter of quality versus quantity and encouraging healthier, more satisfying choices. “Make a sacrifice on the cellphone or the third pair of Nike shoes,” she said.

The Price Paradox

One of the most frequent critiques of what has been called the “food revolution” and especially its de facto spokespeople, Alice Waters & Michael Pollan, is that the kind of food they want people to eat—fresh, organic, free-range, grass-fed, local, slow, “healthy” etc.—is generally more expensive than the alternatives: processed, conventional, caged, corn-fed, industrially-farmed, fast, “junk” food. For example, in an interview with DCist to promote his newest book, Anthony Bourdain said:

I’ll tell you. Alice Waters annoys the living shit out of me. We’re all in the middle of a recession, like we’re all going to start buying expensive organic food and running to the green market.

Pollan and Waters have responded to this critique numerous times, and their standard defense goes something like this:

Pollan, nomming something virtuous, I'd wager

Well, $4 for a single peach or $8 for a dozen eggs isn’t really that expensive. The real problem is that government subsidies have made junk food artificially cheap and confused us about the real price of food. Many people have discretionary income that should be spent on more expensive food that’s better for their health, the environment, animal welfare, etc. If consumers demand it, producers will find a way to provide it.

Michael Pollan, demonstrating his undeniable talent for reducing complicated issues to pithy sayings, has summarized this in his rule: “Pay more, eat less.” In essence, they suggest that good food should cost more. But then, on the other hand, they argue:

Local, organic, [yadda yadda] food is so self-evidently superior that the primary reason most people continue to choose crappy, industrially-produced fast food that destroys their health and the environment is because it’s just so much cheaper. Many people don’t have discretionary income, and therefore something needs to be done on a structural level—possibly an entire overhaul of the agricultural subsidy system—to make “real” food affordable enough for everyone. In other words, good food should cost less.

These aren’t wholly incompatible, and indeed, I suspect that many proponents of the “food revolution” support both: people who can should be willing to pay more for fresh, local, organic food now. At the same time, we should collectively pursue policies that make that kind of food cheaper until everyone can buy it.

But what if the problem isn’t cost?

As James Williams points out in a recent article in The Atlantic, “Should We Really Pay $4 for a Peach?”, what he calls “healthy food” like apples, dry beans, carrots, and celery has declined in price right along with cookies, ice cream, and potato chips over the last two and a half decades. According to the Economic Research Service of the USDA, from 1980 to 2006—precisely the period when many people claim that fast food overtook our national diet and made us into the fattest people on the planet—food declined in price across the board, and crucially, “the price of a healthy diet has not changed relative to an unhealthy diet.” As Williams says:

Evidently, consumers have chosen to take advantage of the declining prices for the cookies rather than the apples, thereby undermining the claim that we choose cheap unhealthy food because it’s cheap. As it turns out, we also choose it because we appear to like it better than cheap healthy food.

I take issue with the rhetorical move of collapsing American consumers—a diverse lot—into a single “we,” but even if what he says isn’t true for everyone, it must be true for a lot of people. I suspect that many proponents of food reform don’t want to believe that’s really the reason people continue eating “bad” industrially-processed junk because they have the special conviction of born-again religious zealots. Being converts themselves, many of them believe that all the unconverted masses need is to be enlightened the same way they were. They assume that once other people are “educated” about how superior organic, local, yadda yadda food is, they too will see the light.

But what if that’s not true? What if most people will remain skeptical about the supposed superiority of natural, organic, local, etc. food (often with good reason) or, more often, be simply indifferent to claims about its superiority, no matter how cheap and accessible it is? What if buying organic food really won’t be more satisfying to many people than a third pair of Nikes? (And could Waters have chosen a more racist or classist example of conspicuous consumption? Seriously, why not a flat-screen television or granite countertops?)

When Price IS King

from Sociological Images

There’s an important caveat about the price issue that Williams left out: calorie for calorie, soda, candy, chips, and fast foods made with cheap meat, soybean oil, and white flour are significantly cheaper than apples and dried beans. For the roughly one in five Americans who lacked the money to buy the food they needed at some point in the last year, or the more than 49 million Americans categorized as “food insecure,” price may still be the dominant factor guiding their food choices.

I don’t think most food insecure people necessarily stand around in grocery store aisles looking at nutritional labels and crunching the numbers to figure out what will give them the highest caloric bang for their buck, the way Adam Drewnowski did. However, many of them probably stick to cheap, processed combinations of corn, wheat, and soy because they know they can afford enough of that to get by on, and because it tastes good to them.

In the somewhat-dated account of urban poverty There Are No Children Here, Alex Kotlowitz describes the monthly shopping ritual of a single mother on food stamps. She goes to the office where she gets her food stamps, and then goes directly to the store where she buys the same array of canned, boxed, bagged, and frozen foods every month. She has it down to a science. She knows exactly how many loaves of white bread, boxes of macaroni and cheese, cans of soup, pounds of ground beef and bologna, and bricks of generic American cheese it takes to feed her family for a month on her government-allotted budget.

However, as those damned hipsters on food stamps have demonstrated, even people of limited means can produce the kinds of meals Alice Waters might smile upon. So why don’t they?

The Elision of Virtue and Sacrifice

As much as its proponents like to portray eating local, organic, yadda yadda food as a purely joyful and delicious celebration, there’s basically no getting around the fact that it takes a lot more work to turn fresh produce into a meal than it does to go through a drive-thru or microwave something frozen. I suspect that even most of the “food revolution” faithful rely on prepared foods and cheap take-out now and then, even if it’s more likely to be from Trader Joe’s or a local organic pizzeria instead of Walmart or Little Caesar’s. The problem is not that the ingredients of a home-cooked meal, like $1.29/lb broccoli, are so much more expensive than dollar menu meals at McDonald’s, which is what Food, Inc. implies. The problem is that broccoli isn’t a ready-made meal.

My friend Patti made a similar point on her blog recently, noting that the recent initiatives to get fresh food into Detroit smack of race/class privilege. As a friend of hers said:

If he has $3 left til payday and payday is two days away, he’s going to the Golden Arches. There’s no denying that you can get a meal for $3 versus some tomatoes and a banana.

Eating “right” according to the (shifting and frequently conflicted) priorities of the “food revolution” requires effort and sacrifice—perhaps not a sacrifice of objective deliciousness for people who’d rather have a bowl of homemade chili than a burger any day of the week, but a sacrifice of familiar tastes and habits and the instant gratifications of foods composed primarily of sugar, starch, fat, and salt.

For the food revolution faithful, that sense of sacrifice isn’t a deterrent, but actually seems to be central to their perception that eating local, organic, etc. food is morally superior. I first started thinking about this at a roundtable on “Food Politics, Sustainability and Citizenship” at the 2008 Annual Meeting of the American Studies Association. The panelists acknowledged that local, organic, and/or “natural” foods were not always objectively superior in the ways people want to think they are—they often require more energy to produce and transport even if they have a much shorter distance to travel, there’s no consensus on whether or not they’re healthier than the conventional, processed alternatives, and they are often labor-intensive and rely on child labor, unpaid interns, and the willingness of farmers to self-exploit. In short, they admitted that “bad” industrial food is often more sustainable, just as healthy, and possibly sometimes more ethical. But they all insisted that regardless of its real impact, what was more important was that consumers of local and organic foods were “trying.” And they all emphasized the importance of narratives—the stories we tell ourselves about why we do what we do, which is one realm where “natural,” local, etc. food has the indisputable upper hand. Their recommendation for the “food revolution” was to focus on fostering those narratives and encouraging people to keep trying, as if the intentions and the effort involved in eating “better”—futile or not—were more important than the actual carbon footprint or nutritional ramifications of the behavior.

And that seems to me like a serious confusion of intention and effect. If the ideal that you’re pursuing is sustainability—a slippery term to be sure, but for the moment, let’s just say it means a practice that could be continued indefinitely without making the environment inhospitable for human life—and you’re advocating a new practice in service of that goal, but it turns out to be worse for the environment than your previous course of action, I don’t think you should just shrug and say, “well, at least our hearts are in the right place” and continue with the new, worse practice. However, that’s exactly what the food revolution faithful do all the time. No matter how many studies show that food miles are far less important than efficiencies of scale and growing things in optimal climates, or that organic food is no healthier than conventional, or that people can’t tell the difference in blind taste tests, they’ll either say “that’s not the point” or insist that you must be wrong. The practice precedes the evidence: they seize on whatever evidence justifies the course of action they’ve already decided on and systematically ignore any evidence to the contrary.

All of which suggests that eating “better” isn’t driven by evidence-based beliefs about what’s really healthier, more sustainable, more humane, or even better-tasting—which are often conflicting ideals anyhow. The main appeal of natural, organic, local, yadda yadda food is a deep, often inchoate, feeling that it’s superior, which precedes and trumps reason or any objective weighing of the evidence. I think what reinforces that feeling of superiority most is the experience of sacrifice, which channels good old-fashioned Protestant Work Ethic values like the satisfactions of hard work and delaying immediate gratification. The relationship between virtue and self-sacrifice actually long predates Protestantism—in the Nicomachean Ethics, Aristotle argues that people achieve eudemonia, which essentially means “self-actualization” or a virtuous happiness, through hard work and mastering their bodily desires (usually by denying them). The underlying belief seems almost instinctual—it is good to work hard and resist immediate pleasures. And we might like to think that what makes the hard work and sacrifice good is some long-term goal or “greater good” they’re done in service of, but that’s not necessary to produce the sensation of virtue.

Ergo, making a special trip to the farmer’s market during the few hours per week it’s open must be a good thing to do simply because it’s so much less convenient than shopping at a grocery store that’s open all the time. And turning a box of locally-grown produce you might be totally unfamiliar with into edible foods and acceptable meals must be better—morally, if not nutritionally—than microwaving a Lean Cuisine. And spending more money on something labeled “natural” or “environmentally friendly” must be better. People don’t even need to know what it’s better for to reap the psychic rewards. Things that are difficult, inconvenient, or require sacrifices just feel virtuous.

"If you care, buy our environment-friendly disposable baking cups. Buy those other disposable baking cups if you're some asshole who doesn't give a shit." From www.passiveagressivenotes.com

The Limits to the Virtuous Appeal

The problem for the “food revolution” is that the virtuous effort and sacrifice in service of questionable returns doesn’t appeal to everyone. Some people feel like they do enough hard work and make enough sacrifices in other areas of their life that they don’t need the extra moral boost of buying morally-superior food. And I suspect it’s no coincidence that the people who are most likely to make buying “superior” food a priority and get satisfaction from that are people who identify as “middle class” and often express a lot of guilt about how much they (and “their society”) consume in general—i.e. the politically left-leaning, (sub)urban elite. (The politically right-leaning elite might consume just as much or more, they just don’t feel as guilty about it.)

Many working-class people might technically have the time and money to eat the kind of food Alice Waters and Michael Pollan prescribe, but they don’t want to. They’re not looking to put a lot of effort into and make a lot of sacrifices for their diet. There’s no incentive.

It reminds me of Barbara Ehrenreich’s epiphany about smoking in Nickel & Dimed:

Because work is what you do for others; smoking is what you do for yourself. I don’t know why the antismoking crusaders have never grasped the element of defiant self-nurturance that makes the habit so endearing to its victims.

As she discovers, the real reason so many working class people smoke isn’t because of some rational calculation—like the fact that they might not get as many breaks if they don’t—just like the real reason more people don’t buy “healthy” food isn’t (mostly) because of the price. Smoking and familiar, convenient, “junk” foods appeal to working-class people for exactly the same reason they don’t appeal to the left-leaning, progressive, urban and suburban middle-class. They like them because they’re “bad,” because they’re self-indulgent, because they don’t do anything for anyone else—not the environment, not the animals, not third-world coffee-growers, not even their own health. For a lot of people, the narrative of virtuous effort and self-sacrifice isn’t just irrelevant, it’s an active deterrent.

Change I Can Believe In 

I see nothing elitist about campaigning for greater availability of fresh, local food in low-income neighborhoods, public schools, and prisons. But when you start telling people what you think they ought to be willing to give up in order to make what you’ve decided will be “more satisfying choices” for them, I think you’ve gone too far. I really like—but don’t totally believe—the concession Michael Pollan made in the WSJ interview:

To eat well takes a little bit more time and effort and money. But so does reading well; so does watching television well. Doing anything with attention to quality takes effort. It’s either rewarding to you or it’s not. It happens to be very rewarding to me. But I understand people who can’t be bothered, and they’re going to eat with less care.

He claims to understand why some people can’t be bothered, and even equates it with hobbies like reading and watching television—but his entire oeuvre is devoted to making the opposite claim. For example, in The Omnivore’s Dilemma, he says:

“Eating is an agricultural act,” as Wendell Berry famously said. It is also an ecological act, and a political act, too. Though much has been done to obscure this simple fact, how and what we eat determines to a great extent the use we make of the world—and what is to become of it. To eat with a fuller consciousness of all that is at stake might sound like a burden, but in practice few things in life can afford quite as much satisfaction. By comparison, the pleasures of eating industrially, which is to say eating in ignorance, are fleeting. (p. 11)

That doesn’t sound like the opinion of someone who thinks choosing junk food over fresh, local, organic food is an innocent predilection, like preferring trashy pulp fiction to James Joyce or watching reality television instead of Mad Men. The moral stakes in our steaks—a pun I’m stealing from Warren Belasco—are higher.

Even if the stakes of our food choices are higher (which is debatable), that doesn’t mean that it’s okay for food revolutionaries to try to push their priorities on anyone else, least of all the working class or disproportionately black urban poor. If they really want more people to make the choice to eat what they think is “better” food (and I think that should be based on evidence about the actual impact and not just intentions), they’re going to have to work on making healthy, sustainable, humane, etc. foods answer the masses’ needs and desires. They’re not going to get anywhere by declaring the masses ignorant for wanting cheap, convenient, reliable, good-tasting food or trying to convince them that what they should want is expensive, inconvenient, unfamiliar, less immediately palate-pleasing food.

I’m not actually sure it’s possible to create a food system that would satisfy both the desires of the “food revolution” and the needs of the working class, but if it is, it will require letting go of the narratives about sacrifice and virtue. As long as eating “better” is constructed as dependent on hard work and self-sacrifice, the “food revolution” is going to continue to appeal primarily to the left-leaning elite and efforts to get other people to join them will be—rightly—portrayed as “elitist.” And until that changes, natural, local, sustainable foods will continue to serve primarily as markers of belonging to the progressive, urban, coastal elite rather than the seeds of any real “revolution.”

Fresh Tomato Soup Two Ways: Clean and spicy or Creamy and comforting

no grilled cheese required 

Not Your Famous Pop Artist’s Tomato Soup

I love Campbell’s classic condensed tomato soup. I know, me and half the rest of the Western world, right? There’s a reason it’s an icon. I’m not saying it makes me special or anything. But I really, really love it. My freshman year of high school, I made a Campbell’s Tomato Soup can costume for Halloween out of two giant macramé rings and some red, white, and mustard-yellow felt. That was before everyone had cell phones and all cell phones were cameras, so you’ll just have to believe me when I say it was a decent facsimile. I wore it again on Halloween my sophomore year of college, and my roommate went as Andy Warhol. No one got it. We didn’t care.

I’ve mostly abandoned the other processed foods I loved as a kid, like Kraft Mac & Cheese, Nissin Cup O’Noodles, and the generic versions of Lucky Charms and Apple Jacks. But I always have tomato soup in my cupboard. I had never even thought about making fresh tomato soup before, because why bother? Campbell’s is so good.

And then, this summer, I just kept finding myself stuck with mountains of too-ripe tomatoes (and a lot of rotting ones that ended up in the trash bin). I did the fresh tomato pasta sauce thing. I did the fresh salsa thing. But eventually, all I could think about when I looked at the endlessly-refilling pile was the tomato juice I made a couple weeks ago for the tomato bars. There were a few ounces left over, and as I drank them, I regretted the fact that I didn’t have gallons of it. It reminded me of a velvety, chilled version of Campbell’s soup.

You Can Never Go Home Again

Looking back, I feel like that juice was my first tentative bite of a forbidden apple. Now that I’ve fallen to temptation and made fresh tomato soup not once, but twice, I’m a little afraid that I won’t be able to go back to my old canned stand-by. What if I don’t like it as much anymore? What if I’ve spoiled myself?

this version is rich enough to be a satisfying dinner with a salad and some bread

It would be one thing if it was entirely different—the way baked macaroni and cheese is so unlike Kraft, the two don’t even really compete. Apples and oranges. But it turns out that if you cut up some really ripe tomatoes and then simmer them for 15-20 minutes, maybe with some sautéed onion and a dash of sugar and salt, maybe with some jalapeno and/or ginger, and then you strain out all the solids…it’s almost exactly like Campbell’s, except better. Transcendent. It’s the purest, richest tomato flavor I’ve ever tasted, permeated by sweet onion and spice. It feels like a warm hug in a bowl, like a last gift from Summer delivering you gently into the Fall.

I didn’t even want to put crackers in it because they just would have diluted it. I might eat it alongside a grilled cheese sandwich someday, perhaps something snooty like port salut with mango chutney, or raclette with ham and pickle slices…but there will be no dipping. This soup is the platonic ideal of cooked tomato all by itself. 

Since I figured I’d already ruined myself for Campbell’s, I decided: why not make a creamy version? Same process, but I cooked the tomato down a little more before straining it and then made a quick roux to thicken the milk before whisking the tomato back in. This time, I garnished it with a few parmesan curls and some chopped fresh parsley. As I was eating it, I got to thinking that what would really put it over the top is some grilled cheese croutons: sharp cheddar or gruyere melted between two thin slices of bread and then cut into bite-sized pieces, tossed in a little oil or melted butter and sprinkled with dried parsley, garlic powder, and maybe some grated parmesan and then toasted for 10-15 minutes in a hot oven until golden brown and crisp all over. Of course, that’s really just gilding the lily.

Both ways, it’s fantastic and deceptively comforting, given that it may be robbing you of one of your most enduring childhood pleasures. I just hope I don’t find myself tomato soup-less when the first cold front hits and I’m out of fresh tomatoes. When I have to go running sheepishly back to Campbell’s, which I inevitably will, I hope its familiarity will overwhelm the inevitable disappointment. And maybe that will be the time to break out the grilled cheese croutons.

just over 2 lbs

Recipe: Fresh Tomato Soup
makes approximately 16 oz, or 2 bowls and 4 small cups

  • 4-5 large tomatoes (~3”+ diameter) or 6-10 medium ones (~2”); 2 to 2 1/2 lbs total 
  • 1 T. olive oil
  • 2 shallots or 1 small onion
  • 1 small jalapeno, including seeds (optional)
  • 1” piece of fresh ginger (optional)
  • 1 T. sugar
  • 1 1/2 t. kosher salt (or 1 t. regular)

1. Heat the olive oil in a medium saucepan while you dice the shallot or onion; add it to the hot oil. Slice the jalapeno (seeds and all if you like) and the ginger, and add them, too. Sweat them until the onion is cooked through and golden.

I didn't have any fresh ginger, and can't even imagine how awesome it would have been with it

will look nothing like soup to start off

2. Meanwhile, core and roughly chop the tomatoes. Add them to the pan with all their juice.

3. Add the sugar and salt and cook over medium to medium high heat for 15-20 minutes, or until the tomatoes are entirely broken down.

a little hard to tell because of all the steam, but it looks kind of like a very soupy pasta sauce

I ended up with about 1/2 cup solids to discard

4. Press the mixture through a fine mesh sieve. Don’t forget to scrape the solids off the bottom of the sieve. Stir well, taste and adjust seasoning.

about 2 1/2 lbs--it looks like a lot more, but these were much smaller

Recipe: Cream of Fresh Tomato Soup (adapted from Allrecipes)
makes approximately 30 oz—2-3 big servings or 5-6 small ones

  • 4-5 large tomatoes (~3”+ diameter) or 6-10 medium ones (~2”); 2 to 2 1/2 lbs total
  • 1 T. cooking oil (olive, peanut, canola, etc.)
  • 1 small onion, or half of a large one
  • 3-4 cloves garlic
  • 3 T. sugar
  • 1 t. salt
  • 2-3 T. fresh or 2-3 t. dried herbs like parsley, oregano, basil, sage, tarragon, and/or rosemary
  • 2 T. butter
  • 2 T. flour
  • 2 cups milk
  • more salt and pepper to taste

1. Heat the oil in a medium saucepan while you dice the onion and crush the garlic. Sweat them in the hot oil until the onion is cooked through and golden.

2. Meanwhile, core and roughly chop the tomatoes. Add them to the pan with all their juice.

3. Add the sugar, salt, and herbs and cook over medium to medium high heat for 30-40 minutes, or until the tomatoes are entirely broken down and beginning to get thick and saucy.

everything in the pot after 30 minutes of simmering

4. Press the mixture through a fine mesh sieve into a separate bowl.

5. Melt the butter in an empty saucepan. Whisk in the flour until no lumps remain and then gradually add the milk, whisking after each addition until smooth—add just a few tablespoons at a time for about the first cup, and then add the rest in a steady stream. Heat until steaming, and beginning to bubble gently at the edges but not yet boiling, stirring constantly. Remove from the heat.

flour whisked into the butter

a few additions of milk, roux will be very thick

6. Whisk the strained tomato into the milk, taste and adjust seasoning.

still swirling together

In Praise of Fast Food by Rachel Laudan

is this really inferior...

to this?

I don’t normally just post links to other content, but I’m making an exception for this smart and elegant polemic against “Culinary Luddism,” the idea that “traditional” foodways (many of which aren’t actually traditional) are sacrosanct and modern industrialism has just ruined everything. I found myself nodding enthusiastically throughout and felt called to arms when I reached the conclusion:

What we need is an ethos that comes to terms with contemporary, industrialized food, not one that dismisses it; an ethos that opens choices for everyone, not one that closes them for many so that a few may enjoy their labor; and an ethos that does not prejudge, but decides case by case when natural is preferable to processed, fresh to preserved, old to new, slow to fast, artisanal to industrial. Such an ethos, and not a timorous Luddism, is what will impel us to create the matchless modern cuisines appropriate to our time.

YES. It is naive—at best—to believe that the world before industrial, processed food was idyllic in terms of nutrition, the distribution of agricultural and culinary labor, or the quality and variety of culinary experiences. At worst, ideas about what counts as “natural” or “real” with little basis in historical or scientific fact are used to reinforce social inequalities and make people feel either guilty or morally superior to others because of how they eat. Laudan corrects mistaken assumptions about the “halcyon days of yore” and critiques common beliefs about the inherent superiority of local, slow, artisanal, etc. foods.

At least for now, the whole article is available for free. If you want to know why “modern, fast, processed” food isn’t such an unmitigated disaster after all, and why we may actually need mass-production, even if it’s a more considered, regulated, ethical form of mass-production, this is a good place to start:

In Praise of Fast Food

Labor Day Lemon & Herb Chicken Drumsticks

I was a tiny bit afraid they'd gotten too dark, but they turned out perfect. If anything, we almost wanted them with a little more char.  

My Most Ambivalent Holiday

I was raised in a Union family. We checked clothing labels and only bought the ones that said “Made in the U.S.” We didn’t buy grapes because of César Chávez. Every year, my dad went to the Eugene V. Debs Memorial Kazoo Night where he watched a Tigers game from the bleachers and between innings, hummed “Solidarity Forever” in unison with a bunch of other Union guys. He would bring home magnets that said “Stick it to capitalist tools” and sponges that said “Wipe up capitalist scum” and t-shirts emblazoned with a twist on Debs’ most famous quote:

While there is a lower class, I am in it, and 
while there is a criminal element I am of it, and 
while there is a soul in prison, I am not free, and
while there is a game in Tiger’s Stadium,
I am in the bleachers.

I stole this at some point in college because I just had to have it--a keychain is something you keep with you all the time, and I wanted one that would remind me of my dad

 I think he asked about it at some point and I pretended not to know where it had gone to. I wonder sometimes about whether that makes me a bad daughter...or a good one

My feelings about labor organization have gotten more complicated over the years. I’ve had to reckon with the fact that unions are fallible and that labor history is marred by strategic missteps and ugly bigotry. The current popularity of anti-union sentiment can’t be entirely attributed to Reaganomics and right-wing campaigns—unions themselves bear at least some responsibility. However, that awareness—the idea that little-u unions can be wrong—seems to exist on a different spatio-temporal plane than my belief that the idea of Unions, or Unions qua Unions, is good. That thought/feeling is deeper and somehow prior to my ability to think about why unions make mistakes or the erosion of labor organization in the U.S. I guess it’s something like an article of faith.

That’s not to say I don’t have reasons for being pro-Union. I think all workers deserve a say in their conditions of employment. I think more egalitarian resource distribution is both morally and practically a good thing (for some of the same reasons that Robert Reich mentioned in his recent NYTimes op-ed). I believe that protections against some of the worst abuses of workers in the name of profit wouldn’t exist without labor organization, like the minimum wage and child labor laws. But ultimately, it’s impossible for me to separate those beliefs, which might be subjected to rational debate and supported or contested with evidence, from a more inchoate “Union = good” thought/feeling that precedes and undergirds them.

Ultimately, that faith eclipses my cynicism about how the holiday was only established to try to placate workers who were (justifiably) outraged about the fact that federal troops called in to put an end to the Pullman Strike had killed 13 workers and wounded 57. Or how the September date was set to distance it from International Workers’ Day, which commemorates the Haymarket Massacre and tends towards far more radical agitation and demonstration. Or how those injustices and the accomplishments of organized labor have largely slipped from our national memory. In many ways, Labor Day itself is a far better symbol of how unions are pacified and convinced to delay—often indefinitely—their pursuit of more radical demands than it is of the victories of organized labor.

I need bigger cages or stakes to tie them to. Also, I should probably check on them more than once a week.

And then there’s the fact that it’s also the symbolic end of summer. The end of sundresses and afternoons when it’s too hot to do anything but take a nap near an open window and hope for the occasional breeze. The end of my annual half-hearted attempt to control a small tomato jungle. It is the official point when I can no longer pretend I’ll ever make up for the gap between everything I had intended to do and the summer that has actually elapsed—with too few meandering walks, too little of my dissertation written, and far too few mint juleps.

Despite all of that, I love Labor Day for basically the same reason I love Thanksgiving and remain grudgingly fond of Memorial Day and the Fourth of July and, to a lesser extent, Mother’s and Father’s Days. For one, they remind me to appreciate and celebrate things that I am truly grateful for. And moreover, my primary association with them has way less to do with the ostensible reasons for the holidays than it does with the way I celebrate them: by taking a day off work, getting together with friends and family, and eating some great food.

om nom nom nom nom

Like Sunday dinner, taken outside and designed for a crowd

For me, Labor Day food is less about specific dishes than a general principle—get outside and grill while you still can. And while I’m a big fan of all manner of proteins stuffed into sausage casings or mashed into patties, sometimes by September I’ve had enough of that. This is a recipe to turn to if you’ve also hit burger fatigue.

These drumsticks remind me of my favorite roast chicken. They’re marinated in a salty lemon and herb dressing that’s almost like a brine and then seared over the hottest part of the grill until the skin crisps and bathes the flesh in rendered chicken fat. Then, you move them to a cooler part of the grill where they slowly finish cooking, so the flesh stays almost indecently moist and succulent. Meanwhile you can roast some veggies or whatever else you like on the hot side.

Unlike roasting a whole chicken, the whole process—marinating and grilling—takes less than an hour. It’s also cheaper than roasting a whole chicken because you can often get drumsticks for as little as $1/lb, thanks largely to the national anxiety about fat. Ever since boneless, skinless chicken breasts became the protein of choice for many weight-conscious Americans, the more delicious parts have been practically free for the taking. And unlike whole roasted chickens, this recipe scales up or down effortlessly and doesn’t require any carving. You can whip this up in practically no time whether you’re just trying to get a weeknight dinner on the table or you’re cooking for the neighborhood block party, and you don’t even need silverware to eat them. If it’s the main protein you’re serving I’d estimate 2-4 drumsticks per person.

Solidarity Forever

Whatever your feelings on Unions/unions, whatever your current employment situation, and however you choose to celebrate (or not): I wish you dignity and justice in all your endeavors, the respect of anyone you work with or for, the pleasures and rewards of meaningful labor, and a good meal at the end of every day. Many of us who enjoy any of those things have unions to thank somewhere down the line. Happy Labor Day.

Recipe: Lemon & Herb Chicken Drumsticks (adapted from Epicurious)

lemon & herbs, post-zesting, pre-chopping

  • one medium lemon, zest (~2 t.) and juice (~1/4 cup)
  • 2 T. olive oil
  • 2 t. kosher salt (or 1 1/2 t. regular)
  • half a dozen grinds of pepper (~1/2 t.)
  • 2 fresh garlic cloves, minced, or 1 t. garlic powder
  • a small handful of fresh herbs (~1/4 cup) or 2 t. dried (oregano, rosemary, sage, parsley, tarragon and/or basil)
  • 12 chicken drumsticks 

1. Whisk together the lemon juice, zest, olive oil, salt, pepper, garlic, and herbs.

2. Place the drumsticks in a plastic bag, preferably with a zip-top, and add the marinade. Squish all over to coat.

I didn't have a zip-top bag so I just used a twist tie to seal it, which worked just fine

there's no need to wash the chicken first--any nasty bacteria on the outside will get killed when you cook it; if you wash it, you just risk getting that bacteria all over your kitchen, which you're not going to cook

3. Let sit at room temperature for 30 minutes, or up to 2 days in the refrigerator.

4. Start the grill or preheat the oven to 450F.

5. Place the marinated drumsticks on the hottest part of the grill or on a foil-covered baking sheet in the preheated oven. If grilling, cook until they have a nice char on both sides and then move to the cooler part of the grill. If baking, turn the oven heat down to 200F.

moved to the cool side of the grill, making way for corn on the cob

6. Cook until the internal temperature is 150-155F or until the juices run clear and the flesh at the bone is opaque. (Some people recommend cooking chicken to 165 or even 180F, but salmonella dies after 30 minutes at 140F, so there’s really no reason to.)

Why Posting Calorie Counts Will Fail, Part III: Calorie-restriction dieting doesn’t work long-term for most people

Previously in this series: Intro, Part I, and Part II.

The article on "Making Weight Loss Stick" is by Bob Greene, the personal trainer and "fitness guru" Oprah first started consulting with in 1996. Sadly, I don't think that's *meant* to be ironic. Oprah 2005/2009

To test whether turning [fat people] into thin people actually improves their health, or is instead the equivalent of giving bald men hair implants, it would be necessary to take a statistically significant group of fat people, make them thin, and then keep them thin for long enough to see whether or not their overall health then mirrored that of people who were physiologically inclined to be thin. No one has ever successfully conducted such a study, for a very simple reason: No one knows how to turn fat people into thin people.
Paul Campos, The Obesity Myth (2004)

Diets do cause weight loss…in the short term

People who think calorie restriction dieting “works” haven’t necessarily been duped by the diet industry or seduced by the prevailing nutritional “common sense” that weight loss and gain are a simple matter of calories in vs. calories out. Many of them believe it because their personal experience seems to confirm it, often repeatedly. Of course, “repeatedly” is part of the problem. Weight cycling—losing and re-gaining 5% or more of one’s total body weight—isn’t what dieters or public health policy makers are shooting for. Even people dieting with a specific occasion in mind, like a wedding or a high school reunion, would generally prefer to achieve permanent weight-loss.

But almost a century of research has shown that dieting—which usually involves calorie restriction—is not the way to do that. Studies repeatedly find that while eating less causes weight-loss in the short term, a majority of participants in weight-loss interventions focused on diet gain most of the weight back within 1 year and the vast majority (90-95%) gain all of it back within 3-5 years. Approximately 30% gain back more than they initially lost, and there’s some evidence that people who’ve lost and regained weight have more health problems than people who weigh the same, but have never lost and regained a significant amount of weight.

This is not controversial. Virtually every study of weight-loss dieting that has followed participants for longer than 6 months has found that the majority of dieters regain all the weight they lose initially, if not more. In other words, Oprah’s high-profile weight fluctuations are not the unfortunate exception to most dieters’ experience, they are the rule. A gallery of pictures of Oprah through the years illustrates the most frequent and reliable outcome of dieting:

Oprah in The Color Purple

Screen shot of the infamous "fat wagon" episode first aired in the fall of 1988, when Oprah strode on set in what she proudly declared were size 10 Calvin Klein jeans after an Optifast diet, wheeling a Radio Flyer wagon full of lard representing how much weight she'd lost

At the Emmy Awards, holding her third for "Outstanding Talk/Service Show Host"

Holding yet another Emmy at the end of that impressively-muscled arm, shaped with the help of trainer Bob Greene

             1985                           1988                             1992                             1996

At the party celebrating the first anniversary of O Magazine

At the Academy Awards, wearing Vera Wang

Presenting at the Emmy Awards

Presenting at the 2010 Oscars, possibly on the way back down again?

              2001                            2005                          2008                            2010        

I am not concerned (in this entry) with why calorie restriction diets fail—there are competing theories and perhaps I’ll try to tackle them some other time. However, when it comes to evaluating public health policies aimed at the general population, like posting calorie counts on menus, it doesn’t really matter why the kind of behavior it’s designed to encourage fails, especially when it fails so spectacularly. Whether the problem is that 90-95% of people don’t have the willpower to stick to calorie-restricted diets or that most people’s metabolic rates eventually adjust or both or something else entirely, continuing to prescribe calorie restriction to individuals seeking to lose weight is futile at best. Given the health problems associated with weight cycling and psychological distress caused by diet “failure,” it’s probably also dangerous and cruel. More on that another day, too.

The goal of this entry is to provide a condensed-but-comprehensive overview of the evidence that convinced me that weight-loss dieting—and particularly calorie-restriction dieting or eating less—does not “work” for most people. By “work” I mean lead to significant weight loss—at least 10% of starting body weight—that lasts for more than 3 years (in keeping with the clinical definition of “weight loss success” proposed by the 1998 National Heart, Lung, and Blood Institute [NHLBI] Obesity Education Initiative Expert Panel). I honestly tried to keep this as short as possible and bolded the “highlights” if you want to skim. However, if brevity is what you’re looking for, see this 2007 Slate article.


A Meta-Review of the Literature

Of course, I’m not the first person to try to figure out what kind of picture decades of weight-loss research was painting. I found 14 reviews of weight-loss research in peer-reviewed journals (Mann et al 2007, Jeffrey et al 2000, Perri & Fuller 1995, Garner & Wooley 1991, Jeffrey 1987, Bennett 1986, Brownell & Wadden 1986, Brownell 1982, Foreyt et al 1981, Wilson & Brownell 1980, Stunkard & Penick 1979, Wooley et al 1979, Foreyt 1977, Stunkard & Mahoney 1976). And they all say basically the same thing: calorie-restriction diets don’t work long-term. Here’s how three of the most recent ones came to that conclusion, along with one meta-analysis that claims to challenge the consensus, although it turns out that all it’s really done is redefine “success.”

Diets Don’t Work—Mann et al 2007 (free full text): This review of 31 weight-loss studies by a team of UCLA researchers was aimed at developing recommendations for Medicare regarding obesity prevention and treatment. They were only able to find 7 studies of weight-loss dieting that randomly assigned participants to diet or control groups and followed them for at least two years (the “gold standard” required to make causal claims about the effects of dieting). And the “gold standard” studies did not support the claim that dieting promotes significant or long-term weight loss:

Across these studies, there is not strong evidence for the efficacy of diets in leading to long-term weight loss. In two of the studies, there was not a significant difference between the amount of weight loss maintained by participants assigned to the diet conditions and those assigned to the control conditions. In the three studies that did find significant differences, the differences were quite small. The amount of weight loss maintained in the diet conditions of these studies averaged 1.1 kg (2.4 lb), ranging from a 4.7-kg (10.4-lb) loss to a 1.6-kg (3.5-lb) gain. (223)

They also examined 14 studies with long-term follow-ups that didn’t involve control groups. The average initial weight loss in those studies was 14 kg (30.8 lb), but in the long-term follow-ups, participants typically gained back all but 3 kg (6.6 lb). Of the eight studies that tracked how many participants weighed more at the follow-up than before they went on the diet, the average was 41% with a range of 29%-64%, and in every case was higher than the percentage of participants who maintained weight loss. In other words, participants were more likely to regain more weight than they initially lost than they were to maintain their initial weight loss. Although Mann et al note several problems with these studies, like low participation rates in the long-term follow-ups, heavy reliance on self-reporting as the primary or only measure of weight, and failure to control for the likelihood that some of the participants were already dieting again at the follow-up, those factors should have biased the results toward showing greater weight loss and better long-term maintenance, not worse.

Finally, they looked at 10 long-term studies that didn’t assign participants to “diet” or “non-diet” conditions randomly. In general, these were observational studies that assessed dieting behavior and weight at a baseline time and then followed up with participants to measure changes in behavior and weight over time. Of those studies, only 1 found that dieting at the baseline led to weight loss over time, 2 showed no relationship between dieting at the baseline and weight gain, and 7 showed that dieting at the baseline led to weight gain.

Their recommendation to Medicare:

In the studies reviewed here, dieters were not able to maintain their weight losses in the long term, and there was not consistent evidence that the diets resulted in significant improvements in their health. In the few cases in which health benefits were shown, it could not be demonstrated that they resulted from dieting, rather than exercise, medication use, or other lifestyle changes. It appears that dieters who manage to sustain a weight loss are the rare exception, rather than the rule. Dieters who gain back more weight than they lost may very well be the norm, rather than an unlucky minority. If Medicare is to fund an obesity treatment, it must lead to sustained improvements in weight and health for the majority of individuals. It seems clear to us that dieting does not. (230)

Long-term Maintenance of Weight Loss: Current Status—Jeffrey et al 2000 (free abstract or full text with umich login): A review of 20 years of long-term weight loss studies that describes the weight loss and regain among patients who participate in behavioral treatments for obesity as “remarkably consistent” (7), which is visually represented by lots of graphs from different long-term weight loss studies that all pretty much look the same:

[Graphs compared: very low calorie diets vs. low calorie diets (Wadden et al 1993); fat restriction vs. calorie restriction (Jeffrey et al 1995); diet only vs. diet + exercise (Sikand et al 1988); people who were paid $25/wk for successful weight loss vs. people who weren’t paid (Jeffrey et al 1993)]

Basically no matter what researchers do, most dieters achieve their maximum weight loss at 6 months and then gradually regain all or almost all of the initial weight lost within 3-5 years, if not faster. They conclude:

The experience of people trying to control their weight is a continuing source of fascination and frustration for behavioral researchers. Overweight people readily initiate weight control efforts and, with professional assistance, are quite able to persist, and lose weight, for several months. They also experience positive outcomes in medical, psychological, and social domains (NHLBI Obesity Education Initiative Expert Panel, 1998). Nevertheless, they almost always fail to maintain the behavior changes that brought them these positive results. Moreover, as we hope we have shown, efforts to date to change this weight loss-regain scenario have not been very successful.

Confronting the Failure of Behavioral and Dietary Treatments for Obesity—Garner and Wooley 1991 (free abstract or full text with umich login): Like Mann et al, Garner and Wooley were seeking to translate the available evidence about weight-loss dieting into recommendations for treatment—in this case, best practices for mental health practitioners seeking to counsel and treat overweight and obese patients. They note that short-term behavioral studies consistently show that modifications in eating and exercise behaviors lead to weight-loss, but that as the duration of studies increases, “over and over again the initial encouraging findings are eroded with time” (734).

The authors are particularly perturbed that poor results are often presented by study authors as positive. For example, a 1981 study comparing standard behavioral therapy, a weight-loss drug, and the therapy and drug combined found that all of the treatment groups lost a significant amount of weight in the first 6 months, and then all of the treatment groups showed significant re-gain by the end of the 18-month follow-up.

[Graph caption: the consistency in the curves is really eerie after a while…the 6-month nadir, the gradual incline; also, it is completely baffling to me how someone could look at this graph and think the most notable part is the gap between the three treatments at 18 months]

Instead of concluding that all of the treatments had failed to produce lasting weight loss, the authors conclude that these results provide hope for behavioral therapy, because that group showed the slowest rate of weight re-gain:

This most recent study provides grounds for optimism as to the future of behavioral treatment of obesity…over the long run, behavior therapy clearly outperformed the most potent alternative treatment with which it has yet been compared. (734 in Garner and Wooley, 135 in the original)

This pattern is nearly as consistent as the finding that weight is gradually regained and many individuals eventually weigh more than they did at the start of the treatment. After four years, nearly all participants in nearly all studies gain back nearly all the weight they initially lost (Adams, Grady, Lund, Mukaida, & Wolk 1983; Dubbert & Wilson 1984; Kirschenbaum, Stalonas, Zastowny, & Tomarken 1985; Murphy, Bruce, & Williamson 1985; Rosenthal, Allen, & Winter 1980; Bjorvell & Rossner 1985; Graham, Taylor, Hovell, & Siegel 1983; Jordan, Canavan, & Steer 1985; Kramer, Jeffery, Forster, & Snell 1989; Murphy et al. 1985; Stalonas, Perri, & Kerzner 1984; Stunkard & Penick 1979). And yet, the authors of those studies insist that the diet interventions are “effective,” sometimes claiming that if the subjects had not dieted they would weigh even more. They almost never admit that the treatments completely failed to do what they set out to do, which is produce a clinically significant weight loss that can be maintained long-term. When they do admit that the results are “disappointing,” they frequently call for more “aggressive” treatments like very low calorie diets (VLCD, or <800 kcal/day) or supervised fasting (which is no longer approved because of the risk of mortality).

Garner and Wooley also evaluate studies that used VLCD, some of which involved Optifast, the protein shake that Oprah used to achieve her 67 lb weight loss in 1988. Just like with other calorie-restriction diets, people on VLCD generally lose weight initially, although drop-out rates are much higher than in other weight loss studies (50% or more). Participants who stick to the diet typically maintain the weight loss for about a year, but regain most if not all of the weight they lost within three years and many gain more than they initially lost (Swanson & Dinello 1970; Sohar & Sneh 1973; Stunkard & Penick 1979; Johnson & Drenick 1977; Drenick & Johnson 1980; Wadden et al. 1983; Wadden, Stunkard, & Liebschutz 1988; Hovell et al. 1988). Based on all of those studies, they conclude:

Although the rate and magnitude of weight loss have been the basis for recommending the VLCD, its most remarkable feature is the speed of weight regain following treatment. (740)

Garner and Wooley found only two studies of weight-loss dieting that reported better long-term results, and both had extremely low rates of participation in the follow-up and relied on self-reported weights. For example, Grinker et al (1985) reported that 55% of the participants in a residential treatment program had maintained a 5-kg weight loss based on the responses of only 38% of the original participants. They suggest that it seems far more likely that the low participation in the follow-up biased the results than that those studies are right and all the other ones are wrong, and conclude:

It is only the rate of weight regain, not the fact of weight regain, that appears open to debate. While this may be discouraging to the individual intent on weight loss, it should also provide some solace to the many individuals who have failed at dieting and have attributed the failure to a personal lack of will power. (740)

It is difficult to find any scientific justification for the continued use of dietary treatments of obesity. Regardless of the specific techniques used, most participants regain the weight lost. (767)

They make the following recommendation to mental health practitioners:

We suggest that at the least, if weight loss is offered, it should be done with full disclosure of the lack of long-term efficacy and of the possible health risks [which, as they explain, include physical and psychological risks correlated with weight fluctuation]. It is further recommended that alternative nondieting approaches aimed at improving the physical and psychological well-being of the obese individual be given priority over dietary treatments as a subject of research and that such treatments be offered on an experimental basis. (767)

Long-term weight-loss maintenance: a meta-analysis of US studies—Anderson et al 2001 (free full text): As the title suggests, this is a meta-analysis rather than a review article, meaning rather than summarizing and evaluating what other studies found, they lumped together the data from 29 different studies. 13 of the studies involved “very low energy diets” (VELDs), 14 involved “hypoenergetic balanced diets” (HBDs) and 2 involved both—in other words, they were all calorie-restriction diets, and about half of them required participants to eat less than 800 kcal/day. The authors claim that no long-term randomized, controlled studies were available, and it’s unclear why they didn’t think studies like Jeffrey and Wing 1995 (discussed below) should count.

They don’t provide details for any of the studies individually, but do disclose that the number of participants ranged from 6 to 504, the length of treatment ranged from 8 to 30 weeks, average initial weight loss ranged from 3.5 to 37.9 kg for women and 6.2 to 44.2 kg for men, and follow-up participation rates ranged from 50% to 100% with a median of 82%. In other words, these were very different studies. Here are the results of their aggregation of the data:

[Graph caption: again, what they’re focusing on is the relatively small loss maintained by year 5 rather than, say, the precipitous drop from year 1 to year 2]

The average weight loss at 5 years for both VELDs and HBDs was 3.0 kg, or ~3.2% of the participants’ starting weight and 23.4% of their initial weight loss. Anderson et al conclude:

These average values are higher than those reported in earlier studies and indicate that most individuals who participate in structured weight-loss programs in the United States of the type reported in the literature do not regain all of the weight lost at 5 y. of follow-up.

Sure, not all of the weight, only 76.6% of it. It still seems to me like a perversion of the idea of “success” to claim that these results show that calorie-restriction diets are “effective.” The average initial weight loss was 14 kg. If you lost almost 31 lbs and then regained 25 lbs, would you consider your diet a long-term success? Mann et al wouldn’t. In the 14 long-term studies without control groups that Mann et al evaluated, they also note an average maintenance of ~3 kg. They just don’t think that’s very impressive:

It is hard to call these obesity treatments effective when participants maintain such a small weight loss. Clearly, these participants remain obese. (Mann et al 223)
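The arithmetic behind that judgment is easy to check. Here’s a quick sketch using the figures quoted above—the 14 kg average initial loss is Mann et al’s, and the 3.0 kg / 23.4% maintenance figures are Anderson et al’s (which imply a somewhat smaller average initial loss in their pooled sample):

```python
# Back-of-the-envelope check of the maintenance figures quoted above.
# Values come from the text of this post, not from the papers directly.
KG_TO_LB = 2.20462

maintained_kg = 3.0      # average loss still maintained at 5 years (Anderson et al)
maintained_pct = 23.4    # percent of the initial loss maintained (Anderson et al)

# The implied average initial loss in Anderson et al's pooled sample:
initial_loss_kg = maintained_kg / (maintained_pct / 100)
print(f"implied initial loss: {initial_loss_kg:.1f} kg")         # ~12.8 kg
print(f"regained: {100 - maintained_pct:.1f}% of initial loss")  # 76.6%

# The "lost ~31 lb, regained ~25 lb" example uses Mann et al's 14 kg average:
mann_initial_kg = 14.0
lost_lb = mann_initial_kg * KG_TO_LB
regained_lb = (mann_initial_kg - maintained_kg) * KG_TO_LB
print(f"lost {lost_lb:.1f} lb, regained {regained_lb:.1f} lb")
```

Either way you slice it, roughly three-quarters or more of the initial loss comes back by year 5.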

Interpretation/equivocation aside, there are still some discrepancies between their analysis and the consensus in the other reviews, which I wish I could explain. It’s not like this was a study of a new treatment—they relied exclusively on existing studies, at least some of which were also included in the reviews of the literature discussed above. However, some of the studies they included must have reported (possibly significantly) better results to bring up the average. Since they didn’t evaluate the studies individually, it’s impossible to tell from their write-up whether those studies involved some sort of strategy that made calorie-restriction dieting “work” (and somehow didn’t attract widespread attention) or whether the results in those studies were biased by low participation rates in follow-ups, self-reporting, or some other factor(s).

A Closer Look at the Studies Themselves

I have not read every single study referenced in the review articles, although I have at least glanced at many of them. The ones I chose to explore in further depth here either 1) meet the “gold standard” of randomized assignment to diet/non-diet conditions and at least 2 years of follow-up or 2) are too recent to be included in the review articles.

Long-term Effects of Interventions for Weight Loss—Jeffrey and Wing 1995 (free abstract or full text with umich login): This is one of the seven studies included in the first part of the Mann review. 202 participants between the ages of 25 and 45 who were between 14-32 kg above the MetLife standards for the “ideal weight” for their height were randomly assigned to one of five experimental groups:

  • a control group which received no intervention
  • a standard behavioral therapy group (SBT) that received instruction on diet (including advice on how to follow a 1000-1500 calorie/day diet), exercise (including the recommendation to walk or bike 5 days/wk with an initial goal of burning 250 kcal/wk and gradually increasing that to 1000 kcal/wk), and behavior modification (including keeping food and exercise diaries). This advice was given in weekly counseling sessions for the first 20 weeks and monthly sessions thereafter for a period of 18 months.
  • a SBT + food group, which received the same counseling along with premeasured and prepackaged breakfasts and dinners for 5 days/week for 18 months
  • a SBT + $ incentive group, which received the same counseling along with up to $25/week  for achieving and maintaining weight loss
  • a SBT + food + $ incentive group, which got the counseling, meals, and money

In addition to the 18 months of the study, the participants were contacted at 30 months (a full year after the study ended) for an additional follow-up, which was completed by 177 (88%) of the original participants. Here are the results:

[Graph of the results. Is this shape getting familiar?]

All the treatment groups lost weight during the intervention, achieving their maximum results at 6 months. However, by 12 months, even though they were all still receiving the treatment, they were beginning to regain weight. By 30 months, there was no significant difference between any of the treatment groups and the control group. The authors wheedle a bit, claiming the difference “approaches levels of statistical significance” (.08), but are honest enough to admit in the end:

The overall results of this evaluation reemphasize the important point that maintaining weight loss in obese patients is a difficult and persistent problem.

Preventing Weight Gain in Adults: The Pound of Prevention Study—Jeffrey & French 1999 (free full text): This is more of a “failure of low-cost educational interventions designed to encourage weight loss” than a failure of weight-loss dieting per se, but it’s still relevant because 1) the experimental group “got the message” communicated in the educational intervention but gained the same amount of weight over 3 years as the control group and 2) calorie labeling is essentially a large-scale, low-cost educational intervention. The idea that education will make people thinner relies on the assumption that people would not be (as) obese if they only knew they were gaining weight, that they should eat more fruits and vegetables, that they should reduce their consumption of high-fat foods, and/or that they should get more exercise.

But most people do know all those things. In this study, 228 men and 594 women employed by the University of Minnesota and 404 low-income women, all between the ages of 20-45, were recruited to participate in a 3-year study. Half of the participants were assigned randomly to a control group and the other half were assigned to the “intervention” group, which received a 2-4 page monthly newsletter called Pound of Prevention. The newsletter emphasized five themes:

1) weighing yourself regularly (at least once a week)
2) eating at least 2 servings of fruit per day
3) eating at least 3 servings of vegetables per day
4) reducing the consumption of high-fat foods
5) increasing exercise, especially walking

In other words, “common sense” nutritional advice, although not explicitly calorie reduction. The newsletter included recipes, suggested particular areas/routes in the local areas for walking, and included a return-addressed, stamped postcard asking participants to report their current weight and also answer whether they had walked for 20 minutes or more, eaten 2 servings of fruit, eaten 3 servings of vegetables, or weighed themselves in the last 24 hours. Intervention participants were also invited to take part in a variety of activities during the three years, including 4-session weight control classes, aerobic dance classes, free 1-month memberships to community exercise facilities, walking groups, and a walking competition. Additionally, half of the “intervention” group was assigned randomly to an “incentive” group who were eligible for a monthly $100 lottery drawing for members who returned the postcards.

All participants were evaluated in annual physicals where they were weighed, their height was measured, their dietary intake evaluated using a standard 60-item Food Frequency Questionnaire, and they were asked about behaviors like exercising, eating fruits and vegetables, decreasing fat intake, using “unhealthy diet practices” like laxatives and diet pills or liquid diet supplements, weighing themselves, and smoking. At some point in the study, a questionnaire was administered to test “message recognition.”

Participation in the “intervention” group was high—68% of postcards were returned, 80% of the participants reported having read most or all of the newsletters at their annual visits, and 25% participated in one or more of the extra activities. The “message recognition” test was somewhat successful—the intervention group was significantly more likely to identify the 5 targeted treatment messages as being among the best ways to prevent weight gain; however, even 66% of the control group endorsed the treatment messages. The intervention groups were slightly-but-significantly more likely to weigh themselves and more likely to continue practicing “healthy weight loss practices” as measured by a 23-item questionnaire. However, changes in BMI, energy intake, percent of calories from fat, and rates of physical activity were not significantly different between the control and intervention groups. All participants gained an average of 3.5 lbs over the course of the 3 years.

In short, the intervention was a failure. The authors conclude:

It is easier to teach people what to do than to persuade them to actually do it…. The overall impact on weight itself…was very weak, indicating that stronger educational strategies are needed or, alternatively education alone is insufficient to deal effectively with this important problem.

Weight Maintenance, Behaviors and Barriers—Befort et al 2007 (free abstract or full text with umich login): Based on the abstract, this study sounds like a success, but under closer examination, not so much. The data was collected at a university weight loss clinic where participants were recruited to follow low-calorie or very low-calorie (500 kcal/day) weight-loss diets followed by a maintenance program. The “weight-loss” phase lasted for 3 months during which participants consumed prepackaged meals and/or shakes. The maintenance programs ranged from 6 to 21 months and consisted of weekly or bi-weekly meetings at the clinic during which participants were counseled to follow a structured diet plan with a daily calorie goal and exercise 150-300 minutes per week. In 3 out of 4 trials, the participants were also encouraged to continue consuming the shakes/prepackaged meals.

Out of 461 participants who started treatment, 44 dropped out during the 3-month weight loss phase and 211 dropped out during the maintenance phase. They sent follow-up surveys to everyone who completed the 3-month weight loss phase (n=417), and got 179 back (46.6%). The more recently participants had been part of one of the studies, the more likely they were to respond to the follow-up survey. Responders had only been out of treatment for an average of 14 months.

Their claim that a “majority” of the participants maintained their initial weight loss is based on them lumping together respondents who had only been out of treatment for 6 months with people who had been out of treatment for 24 months or more, despite the fact that—just like in every other study of calorie-restriction weight loss—the results showed that most participants gradually regain weight. As they admit:

Compared to participants who were out from treatment for 24 months or longer, those who were out for less than 6 months (P<0.05) or for 6–12 months (P<0.01) had significantly greater weight loss maintenance, both in terms of kg and percent of baseline weight.

What they don’t say is that the percentage of respondents who report maintaining their initial weight loss drops off precipitously after 24 months.

[No graph provided; perhaps it would have been too damning?]

Of the 31 respondents who’d been out of treatment for 24+ months, only 25.8% had maintained a weight loss of 10% of their body weight or more and 48.4% had maintained a weight loss of 5% or more. That means out of the original pool of 417 who completed the 3-month diet, only 8 had proven capable of maintaining weight loss equal to 10% of their body weight for more than 2 years and only 15 had proven capable of maintaining a weight loss equal to 5% of their body weight. Other participants might be able to maintain their initial weight loss—that data isn’t available, but the trajectory certainly doesn’t look good. And that’s based on the half of the participants who participated in the follow-up—as Garner and Wooley note, the higher the rate of participation and the longer the follow up, the less weight loss on average is maintained.
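To make the drop-off concrete, those counts can be recomputed from the reported percentages—this is just arithmetic on the figures quoted in this section, not data from the paper:

```python
# Recomputing the Befort et al long-term follow-up counts from the
# percentages quoted above (illustrative arithmetic only).
respondents_24mo = 31    # respondents out of treatment for 24+ months
pct_kept_10 = 25.8       # % of those maintaining a loss of >=10% of body weight
pct_kept_5 = 48.4        # % of those maintaining a loss of >=5%
completed_diet = 417     # everyone who finished the 3-month weight-loss phase

kept_10 = round(respondents_24mo * pct_kept_10 / 100)   # -> 8
kept_5 = round(respondents_24mo * pct_kept_5 / 100)     # -> 15

print(f"{kept_10} of {completed_diet} kept off >=10% for 2+ years "
      f"({kept_10 / completed_diet:.1%})")
print(f"{kept_5} of {completed_diet} kept off >=5% ({kept_5 / completed_diet:.1%})")
```

That works out to under 2% and under 4% of the original dieters, respectively, which is hard to square with the paper’s claim of “majority” maintenance.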

What About the National Weight Control Registry?

Several of the studies and at least one person who commented on one of the earlier posts in this series mentioned the National Weight Control Registry (NWCR) as evidence that people can indeed lose weight and keep it off. I’ve never disputed that. Even in the studies that show the least hope for long-term maintenance, there are exceptions to the general trend. But that’s what they are: exceptions.

According to the NWCR website, they have over 5,000 members, all of whom have lost at least 30 lbs and kept it off for at least 1 year; however, most of them have done far better—registry members have lost an average of 66 lbs and kept it off for an average of 5.5 years. As the research above suggests, that’s not remotely “representative” of people who attempt to lose weight. On the contrary, the entire raison d’être of the registry is to figure out what’s different about the 5-10% of dieters who lose significant amounts of weight and keep it off. The goal is to identify strategies that might help other dieters, but as the researchers who run the registry admitted in a 2005 article (free abstract):

Because this is not a random sample of those who attempt weight loss, the results have limited generalizability to the entire population of overweight and obese individuals.

Indeed, the kinds of things the registry members do are generally the same things the participants in most weight loss studies are counseled to do (or, in clinical settings, forced to do): most of them follow a low calorie, low fat diet, eat breakfast every day, weigh themselves at least once a week, watch less than 10 hrs of TV per week, and engage in very high levels of activity—420 minutes per week on average. The NWCR has yet to figure out what makes those things work for them and/or makes them capable of sustaining those behaviors when for most people, they don’t/can’t.

Collecting 5,000 success stories does not prove that dieting “works” for most people, let alone that success is the norm. Somewhere between 45 million and 90 million Americans diet to lose weight every year, most of them by attempting to reduce their caloric intake. According to a survey conducted in April 2010 by a private consumer research firm on behalf of Nutrisystem, 30% of Americans have dieted repeatedly—an average of 20 times. Unsurprisingly, weight loss attempts are more common among overweight and obese people. If calorie-restriction dieting “worked,” America would be a nation of thin people.

Conclusion: Putting the burden of proof back where it belongs

Traditionally, researchers assume that a treatment is not effective until they have evidence that proves otherwise. The reverse is true in regard to weight-loss dieting: most people assume dieting is effective for long-term weight loss and challenge anyone who believes otherwise to prove that it doesn’t—not that that’s difficult, given the consistent failure of most weight-loss interventions to produce lasting results. I have not been able to find one long-term, randomized, controlled study that shows that dieting works (i.e. a statistically significant group of people following a calorie-reduction diet losing a clinically significant amount of weight and keeping it off for more than 3 years). Instead, what all the research to date shows is that the most reliable outcome of calorie-restriction dieting is short-term weight loss followed by weight regain.

I suspect the stubborn persistence in prescribing calorie-restriction dieting as a weight loss strategy in spite of the available evidence probably has a lot to do with dominant and deeply ingrained attitudes about fatness, meritocracy, virtue, and effort. People exhibit remarkable cognitive dissonance when it comes to the research on weight loss—they hold up exceptions as the rule and claim that the 90-95% of people for whom calorie restriction dieting does not produce lasting weight loss must simply not be trying hard enough.

Imagine this scenario playing out with any other condition—imagine that instead of weight, we were talking about some kind of rash that was widely considered unattractive and thought to be correlated with a variety of other health problems. There’s a treatment that showed promise in short-term trials. In virtually every study, most of the people who get the treatment experience significant improvement in their symptoms, with peak results around six months. However, in longer-term studies, there’s a reversal. Just as consistently, the vast majority of sufferers—at least 75% and usually closer to 90 or 95%—experience a gradual return of their symptoms. For approximately 30-40% of participants, their symptoms actually get worse than before they started the treatment. Only 5-10% show lasting improvement. Of course you would want to do more research to figure out why the treatment works for that 5-10%, but in the meantime, would you keep prescribing it to everyone with the same skin condition?

Even if the problem is simply that 90-95% of them fail to use the treatment as instructed (say, it’s a topical cream that only works if you apply it every hour on the hour, and people get fatigued, especially by trying to wake up at night to put it on), the outcome is the same. If 90% of the affected population can’t use the treatment effectively, the results are no better than if the treatment never worked in the first place. Well, except for the part where 30-40% of them end up worse off than before they started the treatment…

So even if the calorie counts on menus were accurate, and people could accurately and reliably estimate how many calories they burn, and they did choose lower-calorie options at least some of the time, and they didn’t compensate by eating more on other occasions…in other words, even if the calorie counts worked the way they were intended to, the best you could hope for would be short-term weight loss. There’s no reason to believe the policy—even under ideal conditions—would have a lasting effect on most Americans’ weight or health.