Category Archives: history

You can help kickstart the new America Eats project


If you like food history, or like reading the occasional blog post that references food history, you might be interested in supporting the America Eats digital archive project.

The Lansing-area non-profit Sustainable Farmer and the MSU departments of Journalism and History want to send food historian Helen Veit to digitize documents related to the Depression-era WPA program America Eats. The program put unemployed writers to work, including Saul Bellow and Zora Neale Hurston, by sending them around the country to write about regional food specialties. A selection of the essays, edited by Mark Kurlansky, was published a few years ago as The Food of a Younger Land. As you can tell from the cover and subtitle, “Before the National Highway System, Before Chain Restaurants, and Before Frozen Food, When the Nation’s Food Was Seasonal, Regional, and Traditional—from the lost WPA files,” Kurlansky is heavily invested in the myth that everyone ate fresh, local, authentic food back in the Good Old Days three decades after The Jungle. That’s despite the fact that the essays and other materials he included reveal that the people involved in America Eats were deeply divided over questions of what to include, particularly regarding recipes and events that involved industrial, commercial products. Counter to the pastoral image on the book’s cover, those products were a huge part of the inter-war American diet and the basis for many unique, regional practices like Coca-Cola parties in Atlanta. The decision to focus instead on ethnically and regionally-marked church suppers and first- and second-generation immigrant practices resurrected primarily for holiday meals was driven by particular ideologies about the nation and the goals of documentary work.

The planned WPA book series never materialized, partially because of the conflicts over what to include, but primarily because attention and resources were diverted by World War II. Many of the documents generated by the project were scattered to state archives or thrown away, but there are four boxes of material at the Library of Congress. The goal of the Kickstarter campaign is to send Dr. Veit to DC to scan all of that in and make it available online. I’ve never seen Kickstarter used for academic fund-raising before, but I guess for a project with potential public appeal like this one, why not?

As little as $1 gets your name on the website as a supporter and access to all the material they digitize (unclear if they’re planning on restricting access, which would make me a little cranky, but I guess if the price barrier stays as low as $1 that’s not too bad). For a little more, you can get heirloom tomato seeds, a tote bag, and/or a t-shirt. The America Eats Today site, where the archival materials will eventually be available, appears to be a work in progress, but there’s a short video up on the Kickstarter page where you can hear a little more about the project from Dr. Veit.

Please Just Shut Up About the “Family Meal”

The most remarkable thing about my mother is that
for thirty years she served the family nothing but leftovers.
The original meal has never been found.
-Calvin Trillin


Every few months, someone publishes another article claiming that the “family meal” is dead or dying, and that if we could only revive it, it would help us fix a laundry list of societal ills from obesity to teen pregnancy. These stories usually make four main rhetorical moves:

1) Invoking Tradition and History: They universally portray the family meal as a long-standing tradition that Americans lost sometime in the recent past. For example, a post on Epicurious last week was titled “Radical Call To Take Back The Family Dinner,” which not only implies that it’s something we had at some point but also that something or someone took it from us, possibly without our consent. Frequently, these articles conflate cooking from scratch, eating at home, and eating with the members of your nuclear family, which makes it difficult to pinpoint exactly which behavior they think has declined, or what counts as a “family meal” in the first place—if a nuclear family eats together at a restaurant, does that count? What about eating take-out at home? Is eating in a car an automatic disqualifier, even if the food is homemade and everyone talks about their day?

The historical timeline is also usually pretty fuzzy. It’s almost never clear when the author thinks the “family meal” supposedly prevailed or when we “lost” it—the 1920s? The 1960s? The 1980s? And yet they’re sure it existed at some point. Much like the missing “original” meal in Calvin Trillin’s joke about leftovers, it’s an absent referent.

2) Explaining Why the Family Meal Is a Panacea: Whatever they think it was, and whenever they think we lost it, the authors are clear about at least one thing: family meals are a good thing and we need to get them back. That’s often exemplified by the titles, like the Huffington Post article linked in the Epicurious post: “How Eating at Home Can Save Your Life” or Miriam Weinstein’s 2005 book, The Surprising Power of Family Meals: How Eating Together Makes Us Smarter, Stronger, Healthier and Happier. The authors often point—or at least gesture vaguely—towards studies that have shown correlations between how often children eat with their parents and higher GPAs, lower BMIs, lower rates of alcohol & drug use, and a decreased likelihood of developing an eating disorder. Because everyone knows correlation = causation, right?

3) Blaming Individuals Instead of Structural Changes: Authors sometimes initially point the finger at structural changes like the increasing reliance on industrial-scale agriculture and processing to feed a growing & urbanizing population (dated either to the post-Civil War or post-WW II era) and growing numbers of women in the workforce after the 1960s. However, they all find a way to shift blame, and thus responsibility, back onto individuals. That’s rhetorically necessary, because the goal of the articles is usually to change individuals’ behavior. Also, it would be futile and/or offensive to suggest that the “answer” is a massive population cull, giving up city living, and/or  women leaving the workforce en masse. Indeed, many feminists were rankled by Michael Pollan’s article “Out of the Kitchen, Onto the Couch,” even though he specifically notes that the amount of time spent on food preparation has declined dramatically even for women who don’t work outside the home and specifically calls on women and men to “make cooking a part of daily life.”

The factors they end up blaming instead are nonsense or, at best, totally unmeasurable. My favorite is when they claim that the problem is that Americans are busier than ever…but also lazier than ever. From Epicurious:

We’re certainly at a lazy point in history, though ironically, for all the conveniences at our disposal, we seem even shorter of actual time.

We’re working more hours than ever, but as a culture, we’ve gotten lazy. American kids are running around trying to do more extra-curriculars than ever, especially if they’re college-bound, but they also spend all their time on passive, mind-slushifying electronic entertainments. Chefs, farmers, and food writers have achieved celebrity status, the Food Network and shows like Jamie Oliver’s Food Revolution and Bravo’s Top Chef achieve stellar ratings, and there’s never been more interest in eating organic, natural, fresh, local food…but we’re a fast food nation addicted to HFCS and cheap, industrially-processed, “empty” calories. Yadda yadda, a dozen other unsubstantiated, contradictory clichés that tell us nothing about what is responsible for Americans cooking and/or eating together less (if indeed they are cooking and eating together less).

4) Calling for Change: All of which leads up to the same old rallying cry: Even if you think you don’t have time, you should make the time! If you don’t know how to cook, you should learn to cook! If you and/or your kids prefer fast food and convenience foods, you’re basically a failure at life and you should learn to roast Brussels sprouts and grow an adult palate, stat. Then they make perhaps the stupidest claim of all: not only is cooking at home and eating together a moral obligation to your children, family, and nation, the real reason you should do it is because it will make you happier. After all, there is no greater joy than sharing a home-cooked meal with the people you love.


The “family meal” lecture would be annoying enough as it is—smug, hectoring, and totally unoriginal—but worse, it’s almost entirely based on myths, lies, and logical fallacies.

You Cannot “Take Back” What You Never Had, or Never Lost

Again, the lack of a coherent definition makes it a little difficult to say with any real reliability whether or not the “family meal” has declined, or since when. But here’s why Mark Hyman, who wrote the Huffington Post article mentioned above, thinks it’s dying:

In 1900, 2 percent of meals were eaten outside the home. In 2010, 50 percent were eaten away from home and one in five breakfasts is from McDonald’s. Most family meals happen about three times a week, last less than 20 minutes and are spent watching television or texting while each family member eats a different microwaved "food." More meals are eaten in the minivan than the kitchen. (full article on The Huffington Post)

No sources cited, but let’s just assume that’s all true. Hyman seems to be setting a pretty high bar—his idealized meal has to be eaten inside the home, cooked without a microwave, last longer than 20 minutes, be more frequent than 3x/week and take place in the absence of television or texting. We know that in 1900 at least two of his criteria were being met—we had the eating at home part down and televisions and texting hadn’t yet been invented. But that doesn’t mean people were sitting down to eat together for 20+ minutes on a daily basis, or that the ones who did were cooking those meals themselves.

Actually, the idea of the “family meal” was invented in the mid-19th C. Before that, in wealthy families, children would eat in the nursery with a nanny or servants until they were old enough to be sent off to boarding schools or join their parents and other guests at formal meals, probably in what we’d now consider adolescence. In poorer families, everyday meals were casual affairs and often staggered. As Nancy Gibbs writes in Time magazine:

Back in the really olden days, dinner was seldom a ceremonial event for U.S. families. Only the very wealthy had a separate dining room. For most, meals were informal, a kind of rolling refueling; often only the men sat down. Not until the mid–19th century did the day acquire its middle-class rhythms and rituals; a proper dining room became a Victorian aspiration. When children were 8 or 9, they were allowed to join the adults at the table for instruction in proper etiquette. By the turn of the century, restaurants had appeared to cater to clerical workers, and in time, eating out became a recreational sport.

Food historians like Harvey Levenstein agree that it was only in the Victorian era that the “family meal” became a preoccupation—and it emerged first among bourgeois families, for whom eating “correctly” became a crucial way of distinguishing themselves from the working classes. It was increasingly seen as inappropriate to delegate the feeding of children to servants because mealtime was such a crucial opportunity for training in manners, conversation, and taste.

Much of the advice published in the 19th C. aimed at Victorian mothers focused on the proper care of the adolescent female body, including how daughters should eat and exercise to cultivate the correct social identity and moral character. Meat and spicy foods were thought to stimulate and signal sexual desire so their consumption was seen as unsuitable for girls and women. In a more general sense, eating, food preparation, digestion, and defecation all became constructed as coarse and un-ladylike. According to Joan Jacobs Brumberg, this combination of smothering maternal concern about eating and the elevation of restraint and physical delicacy prompted the emergence of anorexia nervosa among middle-class girls in the Victorian era (Fasting Girls p. 134).

Furthermore, concerns that the “family meal” is dying are almost as old as the idea of the “family meal” itself. According to nutrition policy and research analyst Paul Fieldhouse:

[The] nuclear concept of the family meal is a fairly modern phenomenon… and there is evidence that every generation has lamented its demise. Already in the 1920s there were worries being expressed about how leisure activities and the rise of the car were undermining family mealtimes! (From Eating Together: The Culture of the Family Meal)

The fact that the “family meal” is a modern idea, was historically limited mostly to the wealthy, and has apparently always been on the decline doesn’t mean its prevalence hasn’t been historically variable. The statistics I find the most convincing are the ones Robert Putnam cites in Bowling Alone:

The fraction of married Americans who say “definitely” that “our whole family usually eats dinner together” has declined by a third over the last twenty years, from about 50 percent to 34 percent…. The ratio of families who customarily dine together to those who customarily dine apart has dropped from more than three to one in 1977-78 to half that in 1998-99. (Bowling Alone p. 100)

That’s a considerably more conservative definition than Hyman’s—Putnam is interested in togetherness, not what kind of food you’re eating or whether the television is on—and even so, neither its original prevalence nor its decline is all that staggering. The phenomenon of nuclear families eating together (idealized by the Cleaver family, shown sitting down to dinner in a 1959 episode called “Teacher Comes to Dinner”) probably peaked sometime between the 1940s and 1970s, but it was still habitual for less than half of the American population and probably mostly limited to relatively affluent, dual-parent, single-income households. And despite how much more free time Americans supposedly had to cook, and how much harder-working they were back then, we know that most of those households relied on domestic servants, restaurant meals, take-out, and/or industrially-processed convenience foods at least some of the time. The heyday of the “family meal” was also the heyday of Jell-O salads and Peg Bracken’s The I Hate to Cook Book.

By the 1990s, the percentage of families “usually” eating together had declined by 16 percentage points, which is significant. However, recent trends point towards a revival since the 1990s, not further decline. According to the National Center on Addiction and Substance Abuse (CASA) at Columbia University, the number of adolescents who reported eating with their families “most nights” increased 23% between 1998 and 2005. In CASA’s 2010 survey of over 2000 teens and 456 parents, 60% said they eat dinner with their families at least five times a week. And there’s more good news for people who think the “family meal” has to happen at home: in the last two years, restaurant traffic in the U.S. has fallen, probably due to the recession.

In other words, it’s not at all clear that the family meal is dying or that taking it back would be all that “radical.” More coming soon on why it doesn’t really matter if the television is on or who did the cooking (at least in terms of benefits for children), why increasing the prevalence of family meals—even if you could achieve that just by scolding people about it—probably wouldn’t fix anything, and what could be done that might actually improve American eating habits…

2010 Year in Review, Part II: The Non-Recipes


A Record of Sticking Places

In September, Lauren Berlant wrote the following description of writing on her blog, Supervalent Thought:

Most of the writing we do is actually a performance of stuckness.  It is a record of where we got stuck on a question for long enough to do some research and write out the whole knot until the original passion and curiosity that made us want to try to say something about something got so detailed, buried, encrypted, and diluted that the energetic and risk-taking impulse became sealed and delivered in the form of a defense against thinking any more about it. Along the way, something might have happened to the scene the question stood for:  or not.

At first, I thought of that as something that applied only to “serious” writing—to articles or book chapters that unfold over months or years. But in retrospect, I think it’s actually one of the reasons I started this blog: to have a place to delve (even if only shallowly) into the kinds of questions that were distracting me from writing my dissertation and then seal them up so they’d stop cluttering my thought process. At some point in the process of writing most of the longer, essayish posts, I get sick of the topic and just want to be done with it. So I finish it, and even if I haven’t entirely resolved the question I started with, I feel released from thinking about it at least for a while.

However, the blog hasn’t quite had the intended effect of freeing me up to write the dissertation because, unsurprisingly, getting mentally “free” takes up a lot of the time and energy I ought to be spending on that other, more important “performance of stuckness.” And the whole idea of having a mentally “clean slate” before I deal with my dissertation was probably always a hopeless ambition.

So this part of the retrospective on the year is also a sort of penitent offering to anyone who’s come to appreciate or even maybe expect this kind of content. In the next six months, I need to finish and defend and submit my dissertation. Also, I’m getting married. Between the two, I’m probably not going to have the time to do a lot of longer posts on culture/history/politics. I’m toying with the idea of taking excerpts from the dissertation and editing them into blog-friendly essays on the weekends. But in case I don’t end up having the time to post much of anything substantial for at least the first half of 2011 and that makes you sad, maybe there will be something here that you missed or might be interested in revisiting.

Special Series

Hipsters on Food Stamps—A three-part look at the bogus “trend” piece published last March in Salon about college-educated people using food stamps to buy organic, ethnic, and otherwise non-subsistence-diet foods, and what it says about food & social class in America:

Part I: The New Generation of Welfare Queens—A critique of the article that places it in the longer history of concern about how the poor eat

Part II: Who Deserves Public Assistance?—An analysis of the comments and some of the myths about social class and poverty in America they reflect

Part III: Damned If You Do-ritos and Damned If You Don’t—An attempt to explain the contradictory trends of patronizing vs. romanticizing the poor and how they eat, and what kinds of contemporary anxieties the bogus trend of hipsters on food stamps might be a response to

Responses to Food, Inc.—Posts related to the film (and the broader agendas it gave voice to) and how they distort the picture of the American food system and confuse their audience. (I never got around to going through the list of suggestions at the end of the film. Perhaps I’ll get to it in 2011.)

Part I: No Bones in the Supermarket—An interrogation of the film’s premise that “looking” at the food system will lead everyone to the same conclusion

Part II: Is the Food More Dangerous?—The film suggests that industrial animal agriculture is responsible for the deadly strain of E. coli that killed at least one innocent child, but it turns out that’s not true. Grass-fed cattle have less of the generic, harmless E. coli but the same prevalence of O157:H7.

Price, Sacrifice, and the Food Movement’s “Virtue” Problem—Why a food “movement” predicated on spending more or making sacrifices is necessarily limited to the privileged few.

The Myth of the Grass-Fed Pig—Why not every farm animal can or should be “grass fed,” and the ecological argument for vegetarianism.

The Myth of the Grass-Fed Pig, Part II: Cornphobia—On the epidemic of irrational fears about corn inspired by Michael Pollan’s books and the documentaries he has appeared in.

Don’t Drink the Agave-Sweetened Kool-Aid—Why agave nectar isn’t “natural,” healthy, or (probably) more delicious than other sweeteners.

Part I: Natural, My Foot—Agave nectar isn’t an “ancient sweetener” used by Native Americans; it was invented in the 1990s and involves a process almost identical to the one used to make High Fructose Corn Syrup.

Part II: What’s Wrong With Any High-Fructose Sweetener—Why agave nectar, with up to 90% fructose, isn’t a healthier substitute for sugar.

Part III: The Mint Julep Taste Test and Calorie Comparison—The results of a comparison between agave and simple syrup-sweetened mint juleps and some number crunching that shows you could theoretically cut a small number of calories by substituting agave for sugar, but not if you use the recommended amount, which is calorically identical.

Why Posting Calorie Counts Won’t Work—Calorie counts are already appearing on menus across the country, and will soon be required for most chains. This series explores why they won’t make Americans thinner or healthier. (One thing I didn’t mention in the series: many of the calorie counts are being posted as “ranges” that take into account all the forms of customization, which makes the numbers even less useful. What are you supposed to do with the knowledge that a burrito has somewhere between 400 and 1,400 calories?)

Introduction—A brief run-down of the reasons I don’t think the policy will work as intended.

Part I: The Number Posted is Often Wrong—What you see on the label is not always what you get, and the difference isn’t entirely random. 

Part II: Most People Don’t Know How Many Calories They Burn—The problem of calorie ignorance isn’t one that can be fixed with an educational campaign—people don’t know how many calories they burn because they can’t know, because it changes, especially if they change their diets.

Part III: Calorie-restriction Dieting Doesn’t Work Long Term—A meta-literature review of three decades of research on calorie-restriction weight loss that shows again and again that by far the most common result of dieting is weight loss followed by regain. And an explanation of why the National Weight Control Registry isn’t a representative sample.


Health

When What I Want Isn’t What I Want: Temptation and Disordered Thinking/Eating—Not about nutrition, but about mental health and how easy it is to fall into negative thought patterns about food and body image, even if you think you’re “beyond” all that. (Probably my favorite post, because writing it helped me get over/through that rough patch.)

Salt Headlines That Make the Vein in My Forehead Throb—Irresponsible news media reporting about public health research, and especially comparisons between the relative merits of cutting salt  and quitting smoking, may be hazardous to my health

Stop Serving Assemblyman Felix Ortiz Salt in Any Form—A plea to the restaurateurs of New York to teach Mr. Ortiz a lesson handed down from fairytales about what it would be like to eat food without salt.

Things that Won’t Kill You Volume IV: Saturated Fat, Part II: Cholesterol Myths—No one, not even Ancel Keys, ever thought you should avoid dietary cholesterol. Unless you are a rabbit or a chicken, cholesterol in your food does not automatically translate to cholesterol in your veins. Volumes I: High Fructose Corn Syrup, II: Fruit Juice, III: MSG, and IV: Saturated Fat Part I went up in 2009.

Things That Might Kill You Volume I: Trans-fats—Why you might want to avoid trans fats, including things with “0 grams of trans fats per serving,” which still contain potentially non-trivial amounts.

HFCS Follow-up: What the Rats at Princeton Can and Can’t Tell Us—A review of the study claiming rats consuming HFCS gained more weight than rats consuming table sugar

Food Policy & Politics

I'm still sometimes uneasy trying to choose between better-for-the-environment and better-for-animals and often end up buying Omega-3 enriched eggs because so far at least it seems like those eggs might be measurably different and healthier.You’re All Good Eggs: New research shows that specialty eggs aren’t any better for the environment or  more delicious—A review of the evidence for and against specialty eggs, concluding that they might be marginally more humane but come at an environmental cost.

Good Egg Update: Someone’s Keeping Score—Explaining the Cornucopia Institute’s guide to specialty eggs

A Food Policy & Politics Christmas Wish List—Seven things that might improve the U.S. food system

Robots

Who Says Robots Can’t Taste? On Cooking Robots and Electronic Noses—A survey of cooking robots and  anxieties about electronic incursions on the acts of cooking and eating

Ingredient Spotlight

The first three listed below were stand-alone posts without recipes. The others were also collected in the 2010 recipe retrospective, but I thought they might merit inclusion here, too, because they involved some research beyond just looking at a few recipes and cooking something.

The Sweet Science of Artichokes—Why they make things taste sweet after you eat them. (I’m still not totally satisfied by what I was able to find—the active chemicals have been identified, but it’s still a bit of a mystery how they work the way they do.)

Morel Time in Michigan—How to identify morels and tell them apart from vaguely similar look-alikes.

Meet the Paw-Paw, aka the Michigan Banana—A tropical fruit for the American Midwest, with its very own Johnny Appleseed.

Two on the Tomato: The Official Verdict in the Fruit v. Vegetable Debate and The Case For Tomatoes as Dessert—On the Supreme Court case that ruled tomatoes a “vegetable,” and why there’s still a debate about them even though there are lots of other “vegetables” that are botanically fruits. And how to use them to substitute for sweeter fruits in dessert recipes.

Cheddar-Garlic Biscuits: In Defense of Garlic Powder—Why garlic powder is so maligned, and a culinary defense.

Jonathan Franzen and Joël Robuchon-inspired Rutabaga Purée—On the root vegetable’s biggest fans (some of whom use it as a curling rock), its many detractors, and its supporting role in Jonathan Franzen’s novel The Corrections.

Now in Season: Sour Cherry Pie—What makes sour cherries different from regular sweet cherries, and the science of flaky pie crusts.

Deviled Eggs with Saffron Aioli—On the history of deviled eggs and why saffron is so expensive. (Each saffron crocus bloom produces just three stigmas, which must be harvested by hand during the brief window when the flowers bloom, before sunrise, because they wilt in the sun.)

Pork Chops with Cider Reduction and Greens—A review of several theories on why pork is so often prepared or paired with apples.

Recipes with History

These were all in the recipe round-up, but again, they have something to offer aside from cooking instruction. The new annotations explain what else you might learn there.

Benedictines and Pimento Cheese Sandwiches for Derby Day—On Miss Jennie Carter Benedict of Louisville, Kentucky and the shaping of an “American” cuisine for the emerging middle class.

Jook (Chicken and Rice Porridge)—On the cross-cultural phenomenon of prescribing bone broths and particularly chicken broth-based soups as a healing or restorative food.

Lemon and Herb Chicken Drumsticks—On the history of Labor Day and the relationship between food and holidays

Sourdough-risen Whole Wheat Bagels and the Sweetness of the Old World—On the fetishization of a humble roll with a hole, its origins in the Jewish diaspora and why you don’t have to use “malt extract” to make it authentic (but why some people think you do).

Introducing Ezekiel and How and Why to Make a Sourdough Starter—A brief history of sourdough starters and why so many of them are named “Herman.”

Buckeyes, Shmuckeyes, or if you prefer, Peanut Butter Bon-Bons—How buckeyes became Ohioan and Ohioans became buckeyes, starring General Ebenezer Sproat and President William Henry Harrison.

Sourdough English Muffins: Of nooks and crannies and double-entendres—Muffin nationalism explained, and also how muffin became a slang term for women and various parts of their anatomy. (Not, I suspect, bluffin’ with her muffin.)

American Pumpernickel—Devil’s Fart Bread! The history of Old World and New World rye breads.

Baguettes, regular or whole wheat—On the history and Frenchification of long, skinny, crusty loaves of bread.

A Sourdough-risen Challah Trinity: Braid, Loaf, Knot—The history of challah from tithing to the temple to European decorative braided breads. 

Homemade Peeps and Chocolate-covered Marshmallow Eggs—On the history of the candy, from the therapeutic uses of the mallow flower to the contemporary, mallow-less confection.

Pork Chops with Cider Reduction and Greens and Why Pork Loves Apples


Our CSA subscription kept me so busy finding new ways to eat greens and green beans this summer that I haven’t done anything new with meat in a long time. But I recently found myself with a package of pork chops, more apple cider than I wanted to drink, and one last bunch of chard. Turns out pork chops are pretty easy—if you just season them with salt and pepper, sear them on both sides over high heat, and then finish them over lower heat until they’re just slightly pink inside (~155F), they turn out pretty tasty. So this was the meal I came up with: I reduced the apple cider to a glaze along with some whole-grain mustard, cooked the pork chops as described, and then served them both over a bed of sautéed shallots and chard. Quick, easy, elegant, delicious, and perfect for Fall.

I got the idea from a recent conversation I had about why bacon is so often smoked with applewood. The answer, as far as I can tell, is because pork loves apples. Applesauce or cooked spiced apples are a classic accompaniment for pork chops. You can buy apple-flavored pre-made sausages. Whole roasted pigs are traditionally presented with apples in their mouths. The pairing is at least as old as Apicius (a 1st Century Roman), whose writings include a recipe for minutal matianum, which was a sort of stew or ragout of pork and apples. In England, serving pork with applesauce was common by the Early Modern period, and may have started much earlier (according to The Food Timeline).

There seem to be three possible explanations:

1) They Are What They Eat: Wherever there are both orchards and pigs, the pigs have traditionally been allowed to graze on the windfall apples that cover the orchard floor during harvest season, which also happens to be pig-slaughtering season. Pigs like apples—especially ones that may be fermenting a bit—but that’s not the main reason they get to eat them. Instead, it’s because windfall produce is an ample source of omnivore-feed that’s generally not quite fit for human consumption. There’s a nod to both pigs’ affinity for apples and using windfall apples to keep pigs “in good health” in Orwell’s Animal Farm:

Now if there was one thing that the animals were completely certain of, it was that they did not want Jones back. When it was put to them in this light, they had no more to say. The importance of keeping the pigs in good health was all too obvious. So it was agreed without further argument that the milk and the windfall apples (and also the main crop of apples when they ripened) should be reserved for the pigs alone. (right at the end of Chapter 3)

Most livestock animals are ruminants and thus don’t compete with humans for food (at least traditionally—the shift from grass to grain as the mainstay of cattle feed is a recent development in the history of animal agriculture). Pigs are extraordinarily efficient at converting feed into flesh, but since they can’t survive on grass and alfalfa, if there aren’t enough “slops,” pigs sometimes eat at the expense of hungry people. Anthropologist Marvin Harris suggests that that’s likely part of the reason they’re the object of religious/cultural prohibitions originating in certain regions of the world.

It’s possible—likely, even—that apple-fed pigs taste a little bit like apples, especially if that’s what they’re gorging on right before they’re killed. Fancypants Iberico ham is supposed to taste like the acorns that pampered Spanish pigs are fed. Some artisan pork producers claim their pigs are fed exclusively with apricots, which supposedly imparts a uniquely sweet and floral taste. Cooking apple-fed pigs with apples or smoking their meat with applewood might have initially become popular at least in part because they would enhance and complement the apple flavor.

2) It’s the Time of the Season for Apples & Pork: The second and probably more important reason is that both pork and apples are Fall foods. The apple harvest coincides with pig slaughtering season—Fall was traditionally the time to put up enough cured pork products to last through the long winter, especially before over-wintering the animals was common. And although much of the pig could be preserved—the legs and the belly would be salted and/or smoked for ham and bacon and much of the meat could be ground up and preserved in some kind of sausage—some of the cuts were eaten right away. Apples would have been a natural component of those meals because they were plentiful at the same time. Further evidence: duck and goose, Fall game birds, are also often paired with apples.

3) Cutting the Fat: Pork is fatty and umami. Apples are sweet, acidic, and light. Applesauce is a condiment that makes the same kind of sense with pork as mint jelly with lamb or malt vinegar with fish & chips (I’m in Ireland this week, can you tell?)—it’s just something sharp and bright to cut something rich and meaty. However, the gustatory rationale probably explains why the tradition lasted more than why it started. Ultimately, bright and acidic condiments are sort of interchangeable. If not for pig affinity/seasonal considerations, people might have as easily paired pork with mint jelly.

I suspect it’s a combination of all three. And tradition aside, the cider-mustard glaze would also be excellent on baked winter squash, chicken (I can imagine it as a great dipping sauce for wings), or anywhere you might use a sweet, mild barbeque sauce.


Recipe: Pork Chops with Cider Reduction and Greens

  • 1 pork chop per person
  • 1 T. cooking oil
  • 1 bunch of cooking greens (about 4 cups raw) per person
  • 1-2 shallots per person
  • 2-3 T. whole grain Dijon mustard
  • 2-3 cups apple cider
  • 2 T. butter, divided
  • salt and pepper to taste
  • rubbed sage (optional)

1. Combine the mustard and cider in a saucepan over high heat and let cook until reduced by at least half (20-30 minutes).


2. Meanwhile, melt one tablespoon of the butter in a pot large enough to hold the greens. Dice the shallot and add to the butter and let cook until golden (5-10 minutes).

3. Pull the greens from their stems, rinse them, and tear them into 2-3” pieces. Add them to the pan with the shallots with the water still clinging to them. Stir, cover, and reduce heat to low. Cook until tender (5 minutes for chard, more for kale or mustard greens). Season with salt and pepper to taste.


(Note: you can cook the chard stems too if you want; just add them to the shallots with a little water about 5-10 minutes before adding the leaves and cook until tender.)

4. Heat the oil in a large skillet over medium-high heat and season the pork chops with salt and pepper and sage (if desired). Cook the pork chops for 2-3 minutes on each side to develop a golden-brown crust and then reduce the heat to low and cook for an additional 4-5 minutes on each side, or until done but still just slightly pink in the middle.


5. Whisk the remaining tablespoon of butter into the cider-mustard reduction.

6. To serve, place the pork chop on a bed of the greens and top with 2-3 tablespoons of the cider reduction.


Green Tomato Double-Feature: Fried Green Tomatoes and Green Tomato Mincemeat Bars


Green Tomatoes: Get Them While They’re Cold

We’re past due for a killing frost, and it’s virtually guaranteed before Halloween. According to Climate-charts.com, there’s a 10% chance of frost by September 30 in Ann Arbor and a 90% chance by October 30. You can, obviously, tempt fate and leave your tomatoes out to see how long you can stretch the caprese salad and BLT season, but even if we end up in the long tail this year, the end is nigh. Also, the end is delicious. Here are the two best ways I’ve found to use up the tomatoes that didn’t get a chance to ripen on the vine:

I. Fried green tomatoes: great on their own, or with any kind of mayonnaise-based dressing like Ranch or Thousand Island.

II. Green tomato mincemeat bars: if “green tomato mincemeat” squicks you out, just call them “spiced streusel bars.”

This should conclude Tomatofest 2010 (previous entries this year: Tomato Jam, Tomato Soup, and Sweet Tomato Curd Squares). However, I also have an article about tomatoes coming out in a community recipe and resource book by Edible Avalon, and I should have more details about that soon.

I. Fried Green Tomatoes

A friend mentioned recently that, knowing “fried green tomatoes” were a classic, he’d tried just slicing up some tomatoes and throwing them into a skillet with some rendered bacon fat. That actually doesn’t sound like a terrible idea, but you should be prepared to watch the tomatoes fall apart as they cook. So depending on how much bacon fat there is and what you’d planned on doing with them, it might not have the desired effect.

Raw green tomatoes are much firmer than ripe ones—coring them is almost like coring an apple. However, as they cook, the cell walls break down and the bitterness abates and whatever acids and glutamates and aromatic compounds the tomato accumulated before it got prematurely yanked from the vine will intensify. Once it’s cooked through, it will taste kind of like a ripe tomato, or at least like a roasted grocery store tomato, which is to say, not bad.

The classic way to prevent them from dissolving before they cook long enough to be palatable is to dredge them in egg and flour (or cornmeal or bread or cracker crumbs). Then, you fry them in about 1/4” of hot oil, melted lard, or shortening (not butter, unless it’s clarified, because the milk solids will burn and the water content will make them soggy). I find that medium heat is about right on my stovetop—you want them to get nice and brown in about 2-3 minutes on each side. When they’re golden brown on the outside and cooked through inside, they’re done.

Even if a few pieces of the breading fall off, they should stay together well enough to be crispy on the outside and soft and savory on the inside. However, you have to eat them immediately—fried tomatoes retain too much moisture to be kept crisp in an oven or re-crisped in a toaster, so only make as many as you want to eat right away. If you want to save some of your green tomatoes for later in the year, you can slice them, spread them out individually on a foil-lined sheet and freeze them for a few hours (just to keep them from freezing into one big hunk). Then transfer them to another container, like a gallon zip-top freezer storage bag. When you want to cook them, just pull them out of the freezer, bread them, and fry them. Don’t defrost them first, or they’ll turn to mush (that’s also why you need to slice and freeze them separately). But if you get them in the pan while they’re still frozen, the breading should keep them together once they cook through.

II. Green Tomato Mincemeat Bars


The other recipe that pops up the most in Google searches for “green tomatoes” is green tomato mincemeat. Mincemeat was originally one of those Early Modern dishes that seems pretty odd to most Americans now because it comes from a time and place before meat and sweets were firmly separated (with transgressors like bacon desserts merely reinforcing the binary by playing up how “wrong” it is to violate it). Mincemeat usually included less desirable cuts or leftover bits of meat and suet (raw beef or mutton fat) cooked with dried fruits, sugar, alcohol, and spices. It was a way to stretch the meat, make it palatable, and preserve it, and it was most often baked in a pastry crust, either as single-serving pockets or a double-crusted pie. Here’s an 18th C. recipe that calls for making a massive amount of the suet and dried fruit mixture to bake and eat over four months, with the option of adding a little boiled tongue or beef later:

To make Mince-Pies the best Way
Take three Pounds of Suet shread very fine, and chopped as small as possible, two Pounds of Raisins stoned, and chopped as fine as possible, two Pounds of Currans, nicely picked, washed, rubbed, and dried at the Fire, half a hundred of fine Pippins [apples], pared, cored, and chopped small, half a Pound of fine Sugar pounded fine, a quarter of an Ounce of Mace, a quarter of an Ounce of Cloves, a Pint of Brandy, and half a pint of Sack [sherry]; put it down close in a Stone-pot, and it will keep good four Months. When you make your Pies, take a little Dish, something bigger than a Soop-plate, lay a very thin Crust all over it, lay a thin Layer of Meat, and then a thin Layer of Cittron cut very thin, then a Layer of Mince meat, and a thin Layer of Orange-peel cut think over that a little Meat; squeeze half the Juice of a fine Sevile Orange, or Lemon, and pour in three Spoonfuls of Red Wine; lay on your Crust, and bake it nicely. These Pies eat finely cold. If you make them in little Patties, mix your Meat and Sweet-meats accordingly: if you chuse Meat in your Pies, parboil a Neat’s Tongue [ox tongue], peel it, and chop the Meat as finely as possible, and mix with the rest; or two Pounds of the Inside of a Surloin or Beef Boiled." From The Art of Cookery Made Plain and Easy, Hannah Glasse, 1747 (Prospect Books: Devon, 1995, p. 74). From The Food Timeline

Gradually over the 18th and 19th C., meat went from central to optional to uncommon, and the dried fruit & spice preparation eaten alone was still referred to as “mincemeat” or sometimes just “mince.” Green tomato mincemeat became a favorite way to use up unripe tomatoes, because their savory glutamates stood in well for the meat and because boiling them with sugar and dried fruits was a good way to flavor and preserve them, too. Just like the meaty versions, the mixture is usually baked into a pastry. Also, like most cooked tomato products, it can be preserved in canning jars processed in a boiling water bath.

I had 4 1/2 lbs of green tomatoes, which made enough mincemeat for two recipes. I froze half of it rather than canning it, and perhaps I’ll bake that into a mincemeat pie for Christmas. I decided to treat the other half like any standard fruit preserve and bake it into a simple streusel bar cookie. What’s great about this recipe is you use the same mixture for the crust and the topping, so it’s dead simple to throw together. You could also substitute any kind of pie filling or preserves for the tomato mincemeat, use any kind of nuts you want in the crust and topping, use any kind of fat, any kind of flour. It’s entirely customizable. Same goes for the mincemeat—add some crystallized ginger if you have it, add other spices like cardamom or mace if you want them or leave out the cloves or nutmeg if you’re not a fan, throw in a tart apple or two or some carrots or winter squash, use currants or cranberries in place of the golden raisins, etc. It’s a template, not a chemical formula.

The result is just a great, simple spiced bar cookie. The tomato mincemeat is salty-sweet and has a kind of savory umami funkiness, almost like a sweet tomato chutney. The spices evoke pumpkin pie and apple crisp and piles of raked leaves and itchy hay rides. The oats and nuts in the streusel give it a sort of rustic chew and crunch. If my tomato curd squares were Summer in a bar cookie, this is the same idea dressed in a sweater and scarf for Fall.


Recipe: Fried Green Tomatoes

Ingredients:

  • green tomatoes (one medium tomato per person)
  • 1 egg for every 4 tomatoes
  • 1/2 cup flour
  • 1/2 cup cornmeal, cracker crumbs, bread crumbs, panko, or something else with crunch
  • 2-3 t. seasoned salt, Old Bay, Bacon Salt, Jerk or Cajun seasoning blend, or whatever other herbs or spices you desire (just nothing that burns easily, like cinnamon or Chinese Five Spice)
  • 2 t. kosher salt, divided (or slightly less regular salt)
  • 1/2-1 cup oil, lard, or shortening for frying

Method:

1. Heat the oil in a wide skillet over medium heat.

2. Combine the flour, crunchy bits, seasonings, and salt in one bowl and lightly beat the egg in a second bowl.

3. Core the tomatoes and slice them into 1/4-1/2” rounds.


4. Test the oil for heat by flinging a few water droplets at it (mind the splatter). If it sizzles, it’s ready. Dip each slice of tomato in the egg and then the flour mixture, turning to coat, and place them gently in the hot oil.

5. Fry for 2-3 minutes on each side, or until golden brown and cooked through. Drain on paper towels. Sprinkle the hot tomatoes with a little more salt. Eat immediately. 


Recipe: Spiced Green Tomato Streusel Bars (adapted from CDKitchen and GardenTenders)

Ingredients:

For the filling:

  • 4 cups finely chopped green tomatoes (~2 lbs)
  • 1 cup golden raisins
  • 2 t. kosher salt
  • 1 cup brown sugar (or 1 cup white sugar with a glug of molasses)
  • a hearty glug of rum or brandy (optional)
  • 2 t. ground cinnamon
  • 1/2 t. ground cloves
  • 1/2 t. ground nutmeg
  • juice from one medium lemon (3-4 T.)
  • zest from one medium lemon (2-3 t.)

For the crust and topping:

  • 3/4 cup butter, softened
  • 1 cup brown sugar (or 1 cup white sugar with a glug of molasses)
  • 1 1/2 cup flour
  • 1/2 teaspoon baking soda
  • 1 t. kosher salt
  • 2 c. rolled oats
  • 1/2 cup chopped walnuts, pecans, hazelnuts, cashews, or macadamia nuts

Method:

1. Core and chop the tomatoes. I usually cut them in half first and then cut a wedge-shaped piece around the stem and the toughest white part in the center. I let the food processor do the chopping part.


2. Combine the tomatoes with the rest of the filling ingredients (raisins, salt, sugar, rum or brandy if using, spices, lemon juice, and zest) in a large pot and simmer until thickened, about 30 minutes, stirring occasionally. (I cooked it for almost an hour because I doubled the recipe. Some recipes call for cooking it for up to 3 hrs. Just keep an eye on it as it thickens to keep it from burning to the bottom of the pot.)

3. Meanwhile, whisk together the dry ingredients for the crust and topping and then mix in the softened butter until the mixture is crumbly and all of the flour is moistened.


4. Preheat the oven to 375F. Grease a 9×13 pan, and press 2 1/2 cups of the crumbs into the bottom. Spread the cooked tomato mixture over the crust, and sprinkle with the remaining crumbs.

5. Bake for 30-35 minutes. Let cool completely before slicing—or, for the cleanest cuts, chill. For the best flavor, let it come back to room temperature before serving.


Buckeyes, Schmuckeyes, or if you prefer, Peanut Butter Bon-bons

When I first set out to make these chocolate-covered peanut-butter balls, I intended not to refer to them by their traditional Midwestern moniker. Surely, I thought, neither the State of Ohio nor its flagship public university can claim any special relationship to sweetened peanut butter in a chocolate shell. There’s no reason I have to invoke tOSU’s mascot in the middle of football season in Michigan. But then I found some pictures of actual buckeye nuts, and I’ll be damned if they don’t look uncannily like their namesake.


Real buckeyes are the seeds of trees in the genus Aesculus, which includes between 13 and 19 species (depending on how you count) that grow all across the Northern Hemisphere. The name “buckeye” is generally attributed to an American Indian word for the seeds and the nutritious mash they made from them after roasting—“hetuck,” which means “eye of a buck.” One species in particular, Aesculus glabra, became commonly known as the “Ohio buckeye,” even though it grows throughout the American Midwest and Great Plains regions, ranging from southern Ontario to northern Texas, apparently because the botanist who gave the tree its English name first encountered it on the banks of the Ohio River.

However, there’s also a California buckeye and a Texas buckeye and even a Japanese buckeye. And the seeds of all the trees in the genus—including Aesculus glabra—are also commonly known as horse chestnuts, after the larger family they belong to (Hippocastanaceae). So there doesn’t seem to be any simple botanical or taxonomical reason why the “buckeye” became so firmly associated with the state of Ohio.

How the Buckeye Became Ohioan and Ohioans Became Buckeyes

According to one story, it all goes back to the spectacularly-named Ebenezer Sproat (or Sprout), who was a Colonel of the Continental Army in the Revolutionary War. After an unsuccessful post-war stint as a merchant, he became a surveyor for the state of Rhode Island and bought stock in the Ohio Company of Associates, which sent him west with the group led by Rufus Putnam that founded Marietta, Ohio, the first permanent American settlement in the Northwest Territory. There, Sproat became the first sheriff in the NW Territory. And aside from being a relatively prominent citizen, he also happened to be quite tall and “of perfect proportions,” according to Wikipedia, whatever that’s supposed to mean. The Indians in Ohio were impressed with his height and/or his importance, and thus came to refer to him as “Hetuck” or “Big Buckeye.” A similar account suggests that it was mostly his height—claiming he was 6’4” (which would have been tall indeed in the 18th C.) and that he earned the sobriquet on September 2, 1788 when he was leading a procession of judges to the Marietta courthouse. Indians watching the giant of a man walk by began calling out “Hetuck, hetuck.”

But it’s not entirely clear why that nickname would have ever been generalized to the shorter residents of the region. The more commonly-accepted theory is that the association between buckeyes and Ohio(ans) has something to do with William Henry Harrison.

Harrison was a resident of Ohio in 1840 when he made his first, successful presidential run. According to the Wikipedia article about him, he had already acquired the nickname “Buckeye,” as a “term of affection” when he served in the U.S. Congress, first as a representative of the Northwest Territory and then as one of Ohio’s Senators—presumably because of the prevalence of the tree in the regions he represented. However, the general consensus elsewhere is that Harrison and his presidential campaign advisors carefully cultivated the buckeye mascot and nickname to bolster Harrison’s image as a “man of the people.” Particularly in Ohio, log cabins were frequently made from the wood of buckeye trees and people in rural areas used to string up the nuts that would accumulate wherever the trees grew, so the buckeye was a useful symbol of the kind of rustic frontier populism that Harrison was trying to project.

Meanwhile, they portrayed the Democratic incumbent, Martin Van Buren, as an elitist, or even as a royalist intent on the restoration of the British crown, largely by publicizing the fact that he had hired a French chef for the White House and purportedly enjoyed French wine. Van Buren was actually the son of small upstate New York farmers and educated in rural schoolhouses, whereas Harrison was the son of wealthy Virginia slaveholders and educated in elite academies—he even studied medicine with the renowned Dr. Benjamin Rush before deciding he didn’t want to be a doctor. But Harrison successfully managed to convince people he was one of them with the help of bottles of whiskey shaped like log cabins and campaign propaganda like this pull card:

Martin Van Buren smiles when drinking “A Beautiful Goblet of White House Champagne”; pull the string, and he frowns with “An Ugly Mug of Log-Cabin Hard Cider” (image from The Granger Collection).

Shortly after that, popular songs and texts started to show up that referred to “Buckeye boys” and “Buckeye girls” and to Ohio as “the Buckeye State.” In the 1850s, Samuel Sullivan Cox wrote a series of letters based on his travels to Europe and the Ottoman Empire, which he published under the title “A Buckeye Abroad.” It obviously continued to the point that now, there are probably almost as many drycleaners, diners, and car repair shops named Buckeye Blank in Ohio as there are Empire Blanks in New York City.

Brutus Buckeye, the bizarre nut-headed mascot that dances on the sidelines at football and basketball games, wasn’t invented until 1965. But students, alumni, and athletes from the Ohio State University [awkward definite article sic] have always been called “Buckeyes.” The name is older than the University itself, which was founded in 1870, and was seemingly applied to sports teams from the very beginning. The short-lived AA professional baseball team that existed in Columbus from 1883-4 was also named the Buckeyes. And Jesse Owens, who won four gold medals at the 1936 Olympics while he was a student at OSU, was sometimes called “the Buckeye Bullet.”

But What a Stupid Reason That Would Be Not to Make Them

So even though it probably originated with a dishonest political campaign (is there any other kind?), I still feel like I have to cede the name “buckeye” to Ohio—after all, it’s older than the UM v. tOSU rivalry itself. And it just seems foolish to deny the resemblance. But it would be a real shame to let the apparent legitimacy of a name that happens to be associated with any state or school bias you against the salt-studded awesomeness of homemade chocolate-covered, sweetened balls of nut butter. Sure, they’re basically just Reese’s peanut butter cups, but you shouldn’t underestimate the difference that good chocolate, flaky salt, and having personal control over the level of sweetness can make.

I probably wouldn’t normally bother with something so…I don’t know, cliché? Pedestrian? It’s not that I don’t like simple foods or classic flavor combinations, but somehow anything consisting primarily of peanut butter and chocolate just seems like cheating. Just like it seems like cheating whenever the contestants on Chopped use bacon if it’s not one of the secret ingredients, and like a petty perversion of justice that the bacon-cheater almost always wins.

However, this recipe popped up on Serious Eats just as I was musing about how maybe I should throw together some sort of sweet nibble in case we happened to have people over this weekend—something I could make in advance and that would keep relatively well in case we didn’t have people over. These seemed to fit the bill because like most cookies, you can make them well in advance of serving, but like most candies, they won’t get stale. But what really sold me was the description of the crunchy flakes of salt in the peanut butter mixture—“like little mouth-fireworks,” the author said.

If they seem too boring as is, you could mix up the nut butter/chocolate coating combination or add a third or fourth flavor element. You could make a Thai coconut version with a little chili pepper, powdered ginger, and dried coconut. Or mix in bits of toffee, puffed rice, or crumbled cookies for a different flavor or texture. You could use cashew butter or almond butter instead of peanut butter, swap powdered honey for some of the powdered sugar, and use white or milk chocolate if any of those is more to your liking. You could even freeze little drops of fruit preserves or caramel and roll the nut butter around them so that at room temperature, they’d melt into a sweet, gooey center. Now I’m dreaming of white chocolate-covered sunflower butter balls with vanilla caramel centers. You could even make a whole buffet of different buckeyes…and if you really can’t get past the name, just call them bon-bons or shmuckeyes instead. If you cede them to tOSU, I think that’s just another victory for “tWorst State Ever.”

Recipe: Peanut Butter Bon-Bons (from Serious Eats)
halved from the original, to make approximately 3 dozen

Ingredients:

  • 12 T. salted butter (or coconut oil)
  • 1 1/2 c. unsalted, unsweetened peanut butter (or any other nut or seed butter)
  • 3 c. confectioner’s sugar
  • 1 1/2 t. kosher salt (or more to taste)
  • 1 bag chocolate chips (or ~2 cups chopped bar chocolate; I used a 70% cacao bar)

1. Leave the butters at room temperature to soften.

2. Beat them together with a spatula or the paddle attachment of a stand mixer until completely smooth and well-combined.

the butters alone will be pretty liquidy; first addition of powdered sugar; by the last addition it will be fairly stiff and should be able to be handled

3. Add the powdered sugar 1 cup at a time, mixing until it forms a thick, malleable dough.

4. Stir in the kosher salt just until evenly distributed—you want to add the salt at the end so it doesn’t dissolve into the butter. Put the bowl in the freezer for about 10 minutes.

5. Roll heaping tablespoons of the peanut butter mixture into balls about the size of walnuts (or buckeyes) and place on a cookie sheet lined with waxed paper or parchment paper. Place a toothpick in each ball and return to the freezer for 30 minutes.


6. Meanwhile, reserve a few pieces of chocolate and melt the rest in a microwave in 15-second bursts (or in a double boiler) just until it’s about 75% melted. You don’t want the chocolate to get too warm or it will burn.

7. Remove from the heat and stir occasionally until it’s entirely melted and slightly cooled, and then stir in the reserved pieces. Wrap the pot in a kitchen towel—you want to keep the chocolate around 88F. I didn’t bother pulling out a candy thermometer, because that’s just below body temperature, so the chocolate should feel nearly neutral, barely warm, to the touch. If it gets much warmer than that, it won’t temper correctly, and will set slightly soft and greasy to the touch and may develop a white “bloom” on the surface. The reserved chips “seed” the melted chocolate with the right crystalline structure so it hardens properly.

8. Dip each ball in the chocolate to coat and place on waxed paper or parchment paper until firm. Remove the toothpicks and gently smooth over the hole. Store in an air-tight container in a cool place or refrigerate until ready to serve.

In Praise of Fast Food by Rachel Laudan

is this really inferior...

to this?

I don’t normally just post links to other content, but this article is a smart and elegant polemic against “Culinary Luddism,” or the idea that “traditional” foodways (many of which aren’t actually traditional) are sacrosanct and that modern industrialism has just ruined everything. I found myself nodding enthusiastically throughout and felt called to arms when I reached the conclusion:

What we need is an ethos that comes to terms with contemporary, industrialized food, not one that dismisses it; an ethos that opens choices for everyone, not one that closes them for many so that a few may enjoy their labor; and an ethos that does not prejudge, but decides case by case when natural is preferable to processed, fresh to preserved, old to new, slow to fast, artisanal to industrial. Such an ethos, and not a timorous Luddism, is what will impel us to create the matchless modern cuisines appropriate to our time.

YES. It is naive—at best—to believe that the world before industrial, processed food was idyllic in terms of nutrition, the distribution of agricultural and culinary labor, or the quality and variety of culinary experiences. At worst, ideas about what counts as “natural” or “real” with little basis in historical or scientific fact are used to reinforce social inequalities and make people feel either guilty or morally superior to others because of how they eat. Laudan corrects mistaken assumptions about the “halcyon days of yore” and critiques common beliefs about the inherent superiority of local, slow, artisanal, etc.

At least for now, the whole article is available for free. If you want to know why “modern, fast, processed” food isn’t such an unmitigated disaster after all, and why we may actually need mass-production, even if it’s a more considered, regulated, ethical form of mass-production, this is a good place to start:

In Praise of Fast Food

Deviled Eggs with Saffron Aioli: A waste of a very expensive spice

they're prettier if you pipe the yolks back in with an icing tip, but I usually just can't be bothered

The Emperor’s New Spice

Saffron is well-known for being the most expensive spice in the world. In 2009, the average U.S. retail price was nearly $3000/lb. For comparison, vanilla beans, the second most expensive spice, retail around $150/lb, and even if the cheapest you can find them is $5 apiece, that works out to only about $450/lb. Although saffron is prized for its aroma, its subtle flavor, and its ability to dye dishes a rich golden hue, the cost is primarily due to how resource- and labor-intensive it is to produce.

the saffron crocus, from Wikipedia

Each thread of saffron is a stigma from a particular species of crocus, and each flower only produces three of them. It takes 170,000 flowers to produce a single kilogram of dried saffron. Furthermore, the stigmas have to be harvested by hand during the short window of time when they bloom in October, and the harvesting must happen before sunrise because the flowers are so delicate that they wilt in the sun.
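
Just doing the division on those figures makes the labor problem concrete (this is only back-of-the-envelope arithmetic on the numbers above, nothing more):

```latex
% back-of-the-envelope, using only the figures cited above
\frac{1\ \text{kg}}{170{,}000\ \text{flowers}} \approx 5.9\ \text{mg of dried saffron per flower},
\qquad 170{,}000 \times 3 = 510{,}000\ \text{threads per kilogram}
```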

As my friend Kevin recently pointed out, saffron is a great example of something priced at its economic or exchange value rather than at its intrinsic (or use) value. As lovely as it smells, its aromatic compounds are extremely volatile and especially vulnerable to light and oxidizing agents. It’s somewhat more resistant to heat, but I generally find the flavor all but impossible to discern in most dishes, including some of the classic applications like paella and bouillabaisse. Given that, I probably should have known better than to put it in deviled eggs, which get their name from the pungent spices combined with the yolks. I could barely even discern it in the aioli on its own.

However, it sure does make things sound fancier. Kevin also recalled a recipe he’d seen for a bean dish involving saffron that noted, “adding saffron to beans is a good way to tell your guests that you’re not just being cheap by serving them beans.” And this, I think, is the true function of saffron at least most of the time: it’s something you put in food to prove you know what it is and can afford it, and then everyone feels compelled to say they can taste it and maybe they even think they do. But really, you could get the same effect by just telling people you put saffron in the dish.

A Classic for a Reason

In the future, I’ll probably skip the saffron-soaking. And I’ll probably just use Hellmann’s/Best Foods mayonnaise with a little garlic and lemon juice mixed in instead of making my aioli from scratch. If, for some reason, you really want the aioli to be a vibrant yellow (not that you can tell anyway if you’re mixing it with egg yolks), you could always add some turmeric. Of course, then what you’ve got is just plain old deviled eggs. But there’s a reason the same basic preparation has been around for possibly as long as eggs and spices have been consumed.

I should have done half saffron aioli and half Hellmann's and seen if anyone could tell the difference.

According to The Food Timeline, recipes for boiled eggs topped with spicy sauces appear shortly after the Ancient Greeks and Romans domesticated egg-laying birds. There are recipes for spicy stuffed eggs in a 13th C. Andalusian cookbook, 15th C. Italian cookbooks, and 16th and 17th C. British cookbooks. Sometimes the recipes call for the yolks to be pounded with raisins, cheese, and spices like cinnamon and cloves, which might have produced something similar to mincemeat. However, mustard, onion, parsley, and cayenne are also common flavorings, and would probably have produced something virtually indistinguishable from the way most people “devil” their eggs today.

The association with the devil is apparently an 18th C. invention. As a culinary verb it was used for other hot & pungent preparations too—“devilled biscuits,” referring to shortbreads spread with anchovy paste, mustard, and cayenne and then grilled (doesn’t that sound fantastic?), and seafood preparations that usually sound something like a curry. At least one cookbook suggested “devilling” or broiling meat with cayenne as a way of dealing with “relics of poultry or game.” None of them, I should note, involve saffron, and in retrospect if there’s anywhere you could expect saffron to shine, a dish specifically noted for being devilishly spicy is probably not it. So here’s a very foolish recipe if you want to waste some saffron, too. Or just skip to the egg part:

Recipe: Deviled Eggs with Saffron Aioli (makes enough aioli for about 2 dozen eggs)

Ingredients:

For the aioli:

the water-solubility of the pigments is one of the reasons saffron is traditionally used to color/flavor grain dishes: if you diffuse it in the liquid first, it dramatically changes the appearance of the dish

  • a large pinch of saffron (about 20 threads)
  • 1 1/2 T. warm water
  • 1 egg yolk
  • 1 t. white wine vinegar
  • 2 t. lemon juice (plus more if needed)
  • 1 garlic clove, minced
  • 1/2 t. mustard powder
  • pinch of salt
  • pinch of white pepper
  • 1/2 c. canola oil
  • 1/4 c. olive oil

For the deviled eggs:

saffron tea

  • 1 dozen eggs
  • 1/2 cup saffron aioli
  • 1 T. Dijon mustard
  • 1 t. celery salt (or celery seed + salt)
  • 1/2 t. ground white pepper (or black pepper)
  • pinch of cayenne (optional)
  • paprika to garnish (optional)
  • pimento slices to garnish (optional)

Method

For the aioli:

1. Place the saffron and warm water in a small bowl and let soak for about 20 minutes.

2. Immersion blender method: Put the saffron tea and all of the other ingredients except for the oils in a 2 cup measure or the beaker that came with the blender and then place the blender flush against the bottom. Carefully pour the canola oil into the container so that it sits on top of the other ingredients and let the contents settle for a minute. Without lifting or moving the blender at all, begin pulsing it. A cloud of emulsified dressing should begin to bloom up from the bottom. Keep pulsing for about a minute, until at least half of the mixture is emulsified. Then, begin to slowly rock or rotate the blender to incorporate more of the oil. Once almost all of the mixture is emulsified, plunge the blender vertically through the mixture once or twice until the texture is homogeneous. Then, whisk in the olive oil by hand. Do not use a blender to combine the olive oil or substitute olive oil for the canola—blending olive oil releases bitter-tasting compounds that will ruin the aioli.

everything but the oil goes in the measuring cup; immersion blender flush against the bottom of the measure; pour the oil in after putting the blender in so it sits on top of the other ingredients; pulse to gradually incorporate and emulsify the oil

Food processor method: Put the saffron tea and all the other ingredients except for the oils in the bowl of a food processor and pulse to combine. With the processor running, add the oil slowly—start with just a few drops at a time, gradually working up to a thin stream. Once the emulsification has formed you can add the oil more quickly. After all the canola has been emulsified, stop the processor and whisk in the olive oil by hand.

Whisk method: Combine the egg yolk, vinegar, lemon juice, and spices and whisk together until the yolk begins to lighten in color. Whisking constantly and furiously, begin to add the oil one or two droplets at a time. Once the emulsification begins to form, you can add the oil in a thin, steady stream. After all the canola has been added and emulsified, whisk in the olive oil and the saffron tea.

3. Taste and adjust seasoning as desired. Let sit at room temperature for 4-6 hrs to help kill any unwanted bacteria. Then, refrigerate and use within a week.

this was slightly more liquid than the mayonnaise I've made before, I assume because of the 1.5 T. water required to soak the saffron

For the deviled eggs:

1. Place the eggs in a pot with enough cold water to cover by at least 1”.

2. Bring to a rapid boil and set a timer for 6 minutes.

3. Meanwhile, prepare an ice-water bath. As soon as the six minutes is up, drain the eggs and plunge them into the cold water.

4. When the eggs are cool enough to handle, peel them and cut them in half. Squeeze gently to remove the yolks.

yolks out; the aioli was actually almost exactly the same color as the yolks, so it's hard to discern here

5. Combine the yolks with the aioli (or 1/2 cup prepared mayonnaise combined with a clove of minced garlic and 2 t. lemon juice), mustard, and spices and stir until smooth. Taste and adjust seasoning as desired.

6. Fill the whites with a rounded scoop of the yolk mixture. Or, if you’re feeling fancy, pipe the mixture back into the whites with an icing bag and tip.

7. Garnish with a sprinkle of paprika and slice of pimento, if desired.

early 20th C. American cookbooks often suggest serving them on a bed of chopped cresses or cabbage, so that's always an option for presentation as well

Sourdough-risen Baguettes, Regular and Whole-Wheat

not quite as long as traditional baguettes, because my oven isn't as long as commercial ovens

A “French” Bread from Austria

There are conflicting accounts about the origins of the baguette—the thin rod of bread with a crisp and chewy crust and a soft, yielding inside with large, irregular holes that most Americans associate primarily with France. Indeed, baguettes, or at least something baguette-shaped, are usually what English-speaking people have in mind when they refer to “French bread.” Nonetheless, according to The Food Timeline and Elizabeth David’s English Bread and Yeast Cookery (1979), the baguette actually originated in Vienna, where steam ovens were invented in the 19th C. “True French bread,” according to David, is “the old round or cylindrical hand-shaped ‘pain de campagne’ [country bread] or ‘pain de ménage’ [bread of the household, or common bread], plump, and crossed with cuts so that when baked the crust is of many different shades, gradations and textures and the crumb rather open and coarse.” That explains why in France, and still occasionally elsewhere, things that look very like baguettes are called “Vienna bread.”

large-ish, irregular holes

However, baguette-shaped loaves were common in France nearly a century before Viennese steam-blasting ovens were adopted. According to Jim Chevallier, the author of a self-published book on the croissant, by the 18th C. “the default shape [for bread] was already long and narrow, and Malouin refers to the round shape as how ‘bread was shaped in former times’.”

Both David and Chevallier suggest that the shift from round balls to long batons was caused not by the steam oven, but instead by the increasing use of soft doughs (pâte molle, “soft,” or bâtarde, “in-between” or “bastard”), which relied on two inventions: a more refined flour sifted to remove most of the fibrous bran and germ, and the use of brewer’s barm or dried yeast. The resulting breads were much softer and lighter than the older style of bread made with whole grain flour and leavened with old dough (levain, which is basically a kind of sourdough starter). The older styles, called pâte briée or pâte broyée, were so dense and coarse that they were traditionally kneaded with the feet or pounded with long iron sticks.

the whole grain version has fewer large holes and is just slightly denser, but still soft in the middle, crusty on the outside, and flavorful and pleasant

The shift from hard, whole grain dough to soft, refined-flour dough also prompted a proliferation of interest in crust. Before the 18th C., the crust was considered the least desirable part of a loaf and often grated off and sold separately as bread crumbs. But the lighter loaves, when not burned by the uneven wood-burning ovens of the day, developed a golden-brown exterior with a rich, toasted flavor that was still soft enough to chew. Instead of getting rid of the crust, bakers started to develop ways to maximize it, including new shapes and slashing techniques, like the fluted pain long, which if not a “baguette” proper certainly looked a lot like one.

Ultimately, whether we believe David that the baguette is a 19th C. invention or Chevallier that it dates to the 18th C. may come down to the definition of “baguette.” If you take the name “baguette” to refer primarily to the shape of the loaf, it seems clear that it pre-dated the Industrial Revolution and the Viennese steam-blasting oven. However, if you think “baguette” refers only to the specific kind of baton that’s 2-3’ long and about 2” in diameter with barely-there insides and the kind of crust you can only achieve by blasting it with steam periodically during the baking process, then it’s a far more recent invention.

I No Can Haz Steam-Blasting Oven, Oh Noes!

seriously, how French does this kid look? I mean, he *is* French, but does he have to be SO FRENCH? From Salut! by Stacey in France, click for source

So, as suggested above, it’s true that the kind of baguettes that instantly make anyone holding one look impossibly-French get their characteristic crustiness from steam-blasting ovens. I’ve discussed this issue before.

I can’t create quite the same dramatic seam-splitting and crustiness in my standard dry-heat oven, and I imagine the best home results probably rely on a specially-shaped lidded ceramic baking dish like this La Cloche, which traps the moisture from the dough just like the covered pot used in Jim Lahey’s no-knead method. However, I have not been disappointed with the results I get from overnight refrigeration, a pizza stone, a cast iron pot, and a spray bottle. Mine turn out a little breadier than a traditional baguette, but they also last a bit longer without getting stale and still have a nice crisp, chewy crust.

Further blasphemy: even though the baguette was created specifically for the special characteristics of refined flour (the quick-rising, seam-splitting, ethereal insides and shattering outsides depend on the dough being composed almost exclusively of easily-digestible starches and not a lot of indigestible fiber), I think I get pretty good results even using almost-entirely whole wheat flour as long as I add a little more gluten and sugar. Sure, my whole wheat loaves are a little denser and a little chewier, but not, I think, unpleasantly so. As you can tell from the pictures, they rise almost as much as their refined-flour counterparts, although the crumb isn’t quite as open and irregular. They still seem unmistakably baguette-ish to me.

What follows should be in no way construed as a “traditional” baguette recipe—if anything, it’s probably closer to the 18th C. predecessors than the modern baguette. Nevertheless, it is shaped like a baton, crusty on the outside, soft and flavorful on the inside, and just right for serving alongside a few wedges of cheese or slicing on a bias and topping however you like for canapés.

Recipe: Sourdough-risen Baguette (makes 2 loaves about 2’ long)

both batches of dough mixed and ready to knead; I let them rest in the bowl instead of turning them out onto my silpat, so that's always an option too

  • 1 cup refreshed 100% hydration sourdough starter
  • 1 cup water
  • 3-4 cups bread flour
  • 2 t. kosher salt
  • 1 t. white sugar
  • extras: 1/4 to 1/2 cup more flour, wheat germ, cornmeal, rolled oats, seeds, fried shallots or garlic, salt, or a combination

Recipe: Sourdough-risen Whole Wheat Baguette (approximately 78-83% whole grain)

  • 1 cup refreshed 100% hydration sourdough starter
  • 1 cup water
  • 3-4 cups whole wheat flour
  • 4 T. vital wheat gluten
  • 1 T. malt extract or maple syrup or honey or any other sweetener
  • 1 T. white sugar
  • 2 t. kosher salt
  • extras: 1/4 to 1/2 cup more flour, wheat germ, cornmeal, rolled oats, seeds, fried shallots or garlic, salt, or a combination

Instant yeast adaptation: Instead of the sourdough starter, use 1 package (about 2 1/4 teaspoons) Active Dry or Rapid Rise yeast. Add an additional 2/3 cup flour and 2/3 cup water. The first rise should only take about one hour, and you can take it out of the refrigerator just 30 minutes before baking.

Method

ingredients in, ready to mix

1. Combine all ingredients except the “extras,” using the smallest amount of flour called for, and stir with a large spoon or spatula just until it comes together and starts to pull away from the bowl. If using whole wheat flour and added gluten, whisk the gluten into the flour before adding it to the moist ingredients.

2. Turn onto a lightly-floured surface, cover with the mixing bowl, and let sit for 5-15 minutes to let the flour absorb as much moisture as possible.

3. Knead for 10-15 minutes, adding as much of the additional cup of flour as needed to prevent the dough from being too sticky to work with. It should be sticky, just not so sticky that it sticks to you more than it sticks to itself.

4. Place in a bowl, cover with plastic wrap, and let rise until doubled in size (2-12 hrs, depending on your starter; 1 hr if using instant yeast)

before rising; 5-6 hours later

5. Generously dust a kitchen towel with flour or whatever else you want to use to coat the loaf—you must use something, or it will become permanently adhered to the towel and when you try to unroll it onto the pan, you’ll completely destroy the shape and be stuck trying to scrape the dough off with your fingernails. Sometimes when I’m dusting it with something coarser than flour, I still dust the towel with a layer of flour first just to be sure it won’t stick. Sticking is very bad.

towel generously dusted, dough rolled out; you can see it sticking to the silpat; imagine trying to get it off something other than silicone

hard to see here, but I did dust the towel with flour before sprinkling it with oats

6. Using a rolling pin, roll the dough into a long rectangle and then roll up, jelly-roll style, into a long tube. Place on the prepared towel and dust the top side with whatever you’re using to prevent the loaf from sticking to the towel.

7. Roll the loaf in the towel, place on a baking sheet, cover with plastic wrap and refrigerate overnight or up to a week.

ready to be rolled; all wrapped up and ready to spend the night in the refrigerator

8. Take the loaf or loaves out of the refrigerator 1 1/2 hrs before you want to bake them. 30 minutes before baking, preheat the oven to 450F with a baking tile on one of the racks and a cast iron pot on the oven floor.

9. Just before baking, gently unroll the loaf onto a piece of parchment paper, and slash diagonally every 3”-4”.

if you wanted, you could just bake one at a time. in fact, if you made up a double-batch of dough on the weekend, you could have a freshly-baked baguette 4 days of the week

the oats are a little harder to cut through; a super-sharp knife is invaluable for this

10. Slide the loaf, parchment and all, onto the baking tile and quickly pour 1/4 cup water into the cast iron pot and close the oven. Another optional step that will create even more steam (and thus a crisper crust) is to spritz the walls of the oven 3-4 times using a spray bottle full of water.

11. Bake for a total of 20-30 minutes or until the crust is golden-brown and the loaf sounds hollow when tapped on the bottom. After 5 minutes and 10 minutes in the oven, add another 1/4 cup water to the pot and/or spritz the oven walls again to create more steam.

does this make my blog look impossibly-French? no? bummer

Why Posting Calorie Counts Will Fail, Part II: Most People Don’t Know How Many Calories They Burn

Introduction and Part I of this series.

click for USA Today article

Few stories that begin, “Many Americans clueless…” can really be called “news.” Nonetheless, a recent study made headlines earlier this month by confirming what research has shown time and again: most people don’t know how many calories they supposedly burn. The 2010 Food & Health Survey by Cogent Research asked respondents (1,024 adults “nationally representative of the US population based on the Census”) to estimate how many calories someone of their age, height, weight, and activity levels “should consume” per day. Only 12% got within +/- 100 calories of their Estimated Energy Requirement (or EER, the formula currently used by the USDA), and 25% wouldn’t even venture a guess. The remaining 63% were just wrong. This seems to pose a problem for the claim that publishing calorie counts on menus will improve public health. Logically, if people don’t know whether they burn 10 or 10,000 calories in a day, which is the range of estimates collected in another survey, conducted in 2006 at the University of Vermont (full text with UMich login), knowing how many calories a particular menu item contains probably isn’t going to do them much good.

The campaign is called "Read 'em before you eat 'em" (the slogan in the little purple circle). Image from nyc.gov
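
For reference, here is a rough sketch of the EER benchmark the survey graded people against. The coefficients below are my recollection of the Institute of Medicine adult equations that, as far as I know, underlie the USDA number, so treat the exact figures as an assumption rather than gospel; the point is just that the “right answer” respondents were supposed to guess is itself a formula in sex, age, height, weight, and an activity fudge factor.

```python
# A minimal sketch of the adult Estimated Energy Requirement (EER), in kcal/day.
# Coefficients are my recollection of the Institute of Medicine equations that
# I believe underlie the USDA figure; illustrative, not authoritative.

# Physical activity coefficients (sedentary, low active, active, very active)
PA_MEN = {"sedentary": 1.00, "low": 1.11, "active": 1.25, "very": 1.48}
PA_WOMEN = {"sedentary": 1.00, "low": 1.12, "active": 1.27, "very": 1.45}

def eer_kcal_per_day(sex, age_yr, weight_kg, height_m, activity="low"):
    """Estimated Energy Requirement for adults (19+), in kcal/day."""
    if sex == "male":
        pa = PA_MEN[activity]
        return 662 - 9.53 * age_yr + pa * (15.91 * weight_kg + 539.6 * height_m)
    pa = PA_WOMEN[activity]
    return 354 - 6.91 * age_yr + pa * (9.36 * weight_kg + 726 * height_m)

# e.g., a lightly active 35-year-old woman, 5'5" (1.65 m), 150 lb (~68 kg):
print(round(eer_kcal_per_day("female", 35, 68, 1.65)))  # ~2167
```

Which just underscores the point: even the “right answer” is an estimate built from population averages, not a measurement of what any particular body actually burns.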

The new calorie publishing policy actually includes a provision to help address this problem—in addition to the calorie counts of all menu items, menus will also have to publish the average daily calorie requirement for adults (2,000 Kcal). New York City also attempted to address the problem of calorie ignorance when it instituted its calorie count requirement by launching an ad campaign aimed at drilling the 2000/day calorie requirement into people’s heads.

But that’s not the kind of calorie ignorance I’m concerned about. For one, I don’t think the success of calorie counts in reducing obesity or improving public health depends on people keeping strict caloric budgets. Enough people have internalized the belief that they ought to eat fewer calories that the numbers could be useful as a point of comparison regardless of how many people can accurately estimate how many calories they supposedly burn based on their age, height, weight, and activity level. Even if you’re under the mistaken impression that you’re Michael Phelps, if your goal is to consume less energy, choosing between the 250-calorie sandwich and the 350-calorie one is a simple matter of figuring out which number is smaller. IF calorie counts were accurate, and they inspired at least some people to consistently choose lower-calorie items, and at least some of those people didn’t compensate for those choices by eating more later or being less active, and some of them continued to burn the same number of calories despite eating fewer of them, then the counts would actually have the intended effect. The magnitude of the effect might be small, but it would be in the right direction.

Of course, that’s a big “if.” I already addressed the first condition (calorie counts are often wrong), and will be looking at the next two (people don’t order fewer calories, but if they think they have, they are likely to compensate later) in more detail in later entries. The problem of most people not knowing how many calories they burn is related to the last condition—the mistaken assumption that people will continue to burn the same number of calories even if they reduce the number of calories they eat.

In other words, the problem isn’t that too few people know that the average adult probably burns something in the vicinity of 2000 calories per day. The problem is that metabolism varies. It doesn’t stick to the formula based on height, weight, age, and activity levels. Most people don’t know how many calories they burn because they can’t know: it depends on lots of factors that formulas don’t and can’t account for. And one of the things that usually causes people to burn fewer calories per day is eating fewer of them. This starts to get at one of the other reasons I don’t think posting calorie counts will have the desired effect: it’s true that eating fewer calories often leads to short-term weight loss, but the vast majority of people either get hungry and can’t sustain the energy deficit, or their bodies adjust to burning fewer calories, which erases the deficit. Either way, almost all of them regain all of the weight they lost, and often more.

The Rise, Fall and Return of the Calories-in/Calories-out Myth

The idea that weight gain and loss is a simple matter of calories in versus calories out also dates back to Wilbur Atwater (the turn of the 20th C. USDA scientist who was into burning food and excrement). Before Atwater, most people believed that the major nutrients in food were used in entirely different ways—proteins were thought to be “plastic” and used exclusively for tissue repair and growth, like little band-aids that the body could extract from food and simply insert where necessary; fats were similarly thought to be extracted from food and stored basically intact; only carbohydrates were thought to be transformed by digestion as they were burned for fuel. The discoveries that protein could be converted to glucose by the liver and that carbohydrates could be transformed into body fat were both seen as wildly counterintuitive and controversial. Some physicians continued to give advice based on the earlier principles as late as 1910.

RMR = resting metabolism, which should probably be shaped more like a big empty question mark

However, in the last few decades of the 19th C., Atwater and others managed to convince an increasing number of people that a calorie was a calorie was a calorie—that all of the major nutrients could be burned for fuel and that any fuel not immediately consumed in heat or motion would be stored as fat. The idea of seeking an equilibrium between calories ingested and calories used was first advocated by Irving Fisher, a Yale economist who drew a parallel between Atwater’s new measure of food energy and the laws of thermodynamic equilibrium and market equilibrium. This theory had widespread appeal in the age of Taylorism and scientific management, which coincided with the first major national trend of weight-loss dieting and the aesthetic ideal of thinness represented by the Gibson Girl and the flapper.* Caloric equilibrium was a way to apply the same universal, rational logic thought to govern the laws of chemistry and the market to the body. From the 1890s through the 1920s, the calorie reigned supreme. As historian Hillel Schwartz says:

“The calorie promised precision and essence in the same breath. It should have been as easy to put the body in order as it was to put the books in order for a factory” (Never Satisfied: A Cultural History of Diets, Fantasies, and Fat, 1986, 135).

That human bodies don’t reliably obey this logic in practice didn’t matter then any more than it seems to matter to most contemporary advocates of caloric algebra. Skeptics noted, even then, that many fat people seemed to eat much smaller meals than thin people, and that some people could reduce their intake to practically nothing without losing weight while others seemed to eat constantly without gaining weight. But the theory of caloric equilibrium is powerfully seductive, not just because of its simple, elegant logic, but also because it seems to “work,” at least in the short term. People who reduce the number of calories they eat do tend to lose weight initially, often at approximately the predicted rate of 1 lb/3500 calories. That offers a kind of intermittent reinforcement. When it doesn’t work or stops working, people scramble to come up with excuses—either the dieter’s estimates of how much they were eating must have been wrong, or they were “cheating” and eating too much (more on this in the entry on why calorie-cutting diets fail).
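
Spelled out, the arithmetic behind that predicted rate looks something like this (to be clear, this is the textbook prediction, not an endorsement of it):

```latex
% the calories-in/calories-out prediction, e.g. for a 500 kcal/day deficit
\text{predicted loss} = \frac{\text{daily deficit} \times \text{days}}{3500\ \text{kcal/lb}}
= \frac{500 \times 7}{3500} = 1\ \text{lb per week}
```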

from Atlas of Men, Sheldon's most popular book

However, caloric math hasn’t always been the dominant nutritional theory (despite what many people claim). In the 1930s and 1940s, as weight-loss dieting became less popular and feminine ideals got a little plumper again, nutrition science became more concerned with the psychology of appetite—often relying on Freudian-influenced theories about how traumatic childhood experiences and sexual dysfunction might manifest as insatiable hunger—and a new theory of body types.

The theory of somatotypes was initially developed by William Sheldon in the 1940s as part of an attempt to use measurements of the body to predict personality types and behaviors, like criminality. He proposed a sort of three-part continuum between three extremes: the thin ectomorph, the fat endomorph, and the muscular mesomorph, based on the three layers of tissue observed in mammalian embryos. It was similar to the medieval medical theory of different physical constitutions based on the balance of humors (blood, phlegm, bile, etc.) but with a new sciencey gloss and some nationalist overtones—Sheldon noted, for example, that Christ had traditionally been portrayed as an ectomorph (supposed to be cerebral and introspective), and suggested that therefore Christian America would have a military advantage over the mesomorphic Nazis (supposed to be constitutionally bold and arrogant). Somatotypes were later used to customize diet and exercise plans, but at the time, they were primarily embraced as a way to describe and justify the apparent differences in people’s ability to be thin. Unlike the algebra of calories in/calories out, somatotyping suggested that no matter what they did, endomorphs could never become ectomorphs. They simply did not burn calories at the same rate, and their bodies would cling stubbornly to fat, especially in the abdominal region.

Sheldon’s theory, like many projects with eugenicist overtones, fell out of favor somewhat after WWII, especially after the embryonic tissue theory was discredited. However, his somatotypes live on, primarily among bodybuilders and competitive weightlifters, perhaps because they still need some way to explain individual differences in outcomes for identical (and rigorously-monitored) inputs. There are also subtler echoes in the idea that people have individual “set points” or genetic predispositions towards certain body types, which isn’t meant to imply that there’s no validity to those theories—I think it seems far more likely that there are genetic components to body size than that all family resemblances are environmental. However, as the new calorie labeling policy exemplifies, the universalizing logic of calories in/calories out is back with a vengeance. Almost every popular diet plan today, with the exception of paleo/low-carb/grain-free diets, is based on creating a calorie deficit (and in practice, many low-carb diets also “work” to the extent that they do at least partially by reducing caloric intake).

The point of this little history lesson is that the extent to which people subscribe to either the theory of calories in/calories out or the theory of intransigent body types seems to have more to do with what they want to believe than the available evidence. Calories-in/calories-out may appeal to Americans today for different reasons than it appealed to the Enlightenment rationalist seeking to find and apply universal laws to everything. I suspect that it has a lot to do with normative egalitarianism and faith in meritocracy, e.g. anyone can be thin if they eat right and exercise. The idea of predetermined body types, on the other hand, appealed to mid-century Americans eager to identify and justify differences and hierarchies of difference. But in every case, the evidence is either cherry-picked or gathered specifically to support the theory rather than the theory emerging from the evidence, which is complicated and contradictory.

*Before the 1880s, the practice of “dieting” and various regimens like Grahamism (inspired by Sylvester Graham), the water cure, and temperance were concerned more with spiritual purity or alleviating the discomforts of indigestion and constipation than achieving a particular body shape or size. Graham’s followers actually weighed themselves to prove that they weren’t losing weight, because thinness was associated with poor health.

So What?

Even if most people can now estimate how many calories they burn on an average day with some degree of accuracy, and the calorie counts help them eat fewer calories than they did before or would have otherwise, there’s no guarantee that they’ll continue burning the same number of calories if they continue to eat fewer calories, which they would have to do for the policy to have long-term effects. In fact, given more than 6 months of calorie restriction, most people appear to burn fewer calories or start eating more, and any weight lost is regained. So either the calorie counts will change nothing about how people order at restaurants and there will be no effect on their weight or health. Or they will have the desired effect on how people order… but there still won’t be any effect on their weight or health.

But boy am I glad we have easier access to that critical information.