Category Archives: myth

Please Just Shut Up About the “Family Meal”

The most remarkable thing about my mother is that
for thirty years she served the family nothing but leftovers.
The original meal has never been found.
–Calvin Trillin

Don't they look like they would talk in Comic Sans?   
Image Credit: She

Every few months, someone publishes another article claiming that the “family meal” is dead or dying, and that if we could only revive it, it would help us fix a laundry list of societal ills from obesity to teen pregnancy. These stories usually make four main rhetorical moves:

1) Invoking Tradition and History: They universally portray the family meal as a long-standing tradition that Americans lost sometime in the recent past. For example, a post on Epicurious last week was titled “Radical Call To Take Back The Family Dinner,” which not only implies that it’s something we had at some point but also that something or someone took it from us, possibly without our consent. Frequently, these articles conflate cooking from scratch, eating at home, and eating with the members of your nuclear family, which makes it difficult to pinpoint exactly which behavior they think has declined, or what counts as a “family meal” in the first place—if a nuclear family eats together at a restaurant, does that count? What about eating take-out at home? Is eating in a car an automatic disqualifier, even if the food is homemade and everyone talks about their day?

The historical timeline is also usually pretty fuzzy. It’s almost never clear when the author thinks the “family meal” supposedly prevailed or when we “lost” it—the 1920s? the 1960s? The 1980s? And yet they’re sure it existed at some point. Much like the missing “original” meal in Calvin Trillin’s joke about leftovers, it’s an absent referent.

2) Explaining Why the Family Meal Is a Panacea: Whatever they think it was, and whenever they think we lost it, the authors are clear about at least one thing: family meals are a good thing and we need to get them back. That’s often exemplified by the titles, like the Huffington Post article linked in the Epicurious post: “How Eating at Home Can Save Your Life” or Miriam Weinstein’s 2005 book, The Surprising Power of Family Meals: How Eating Together Makes Us Smarter, Stronger, Healthier and Happier. The authors often point—or at least gesture vaguely—towards studies that have shown correlations between how often children eat with their parents and higher GPAs, lower BMIs, lower rates of alcohol & drug use, and a decreased likelihood of developing an eating disorder. Because everyone knows correlation = causation, right?

3) Blaming Individuals Instead of Structural Changes: Authors sometimes initially point the finger at structural changes like the increasing reliance on industrial-scale agriculture and processing to feed a growing & urbanizing population (dated either to the post-Civil War or post-WW II era) and growing numbers of women in the workforce after the 1960s. However, they all find a way to shift blame, and thus responsibility, back onto individuals. That’s rhetorically necessary, because the goal of the articles is usually to change individuals’ behavior. Also, it would be futile and/or offensive to suggest that the “answer” is a massive population cull, giving up city living, and/or  women leaving the workforce en masse. Indeed, many feminists were rankled by Michael Pollan’s article “Out of the Kitchen, Onto the Couch,” even though he specifically notes that the amount of time spent on food preparation has declined dramatically even for women who don’t work outside the home and specifically calls on women and men to “make cooking a part of daily life.”

The factors they end up blaming instead are nonsense or, at best, totally un-measurable. My favorite is when they claim that the problem is that Americans are busier than ever…but also lazier than ever. From Epicurious:

We’re certainly at a lazy point in history, though ironically, for all the conveniences at our disposal, we seem even shorter of actual time.

We’re working more hours than ever, but as a culture, we’ve gotten lazy. American kids are running around trying to do more extra-curriculars than ever, especially if they’re college-bound, but they also spend all their time on passive, mind-slushifying electronic entertainments. Chefs, farmers, and food writers have achieved celebrity status, the Food Network and shows like Jamie Oliver’s Food Revolution and Bravo’s Top Chef achieve stellar ratings, and there’s never been more interest in eating organic, natural, fresh, local food…but we’re a fast food nation addicted to HFCS and cheap, industrially-processed, “empty” calories. Yadda yadda, a dozen other unsubstantiated, contradictory clichés that tell us nothing about what is responsible for Americans cooking and/or eating together less (if indeed they are cooking and eating together less).

4) Calling for Change: All of which leads up to the same old rallying cry: Even if you think you don’t have time, you should make the time! If you don’t know how to cook, you should learn to cook! If you and/or your kids prefer fast food and convenience foods, you’re basically a failure at life and you should learn to roast Brussels sprouts and grow an adult palate, stat. Then they make perhaps the stupidest claim of all: not only is cooking at home and eating together a moral obligation to your children, family, and nation, the real reason you should do it is because it will make you happier. After all, there is no greater joy than sharing a home-cooked meal with the people you love.

I make that face all the time, which may make me a snarky, whiny adolescent. But tell me how I'm wrong. American Beauty, dir. Sam Mendes, 1999 (from Orange Crate Art)

The “family meal” lecture would be annoying enough as it is—smug, hectoring, and totally unoriginal—but worse, it’s almost entirely based on myths, lies, and logical fallacies.

You Cannot “Take Back” What You Never Had, or Never Lost

Again, the lack of a coherent definition makes it a little difficult to say with any real reliability whether or not the “family meal” has declined, and since when. But here’s why Mark Hyman, author of the Huffington Post article mentioned above, thinks it’s dying:

In 1900, 2 percent of meals were eaten outside the home. In 2010, 50 percent were eaten away from home and one in five breakfasts is from McDonald’s. Most family meals happen about three times a week, last less than 20 minutes and are spent watching television or texting while each family member eats a different microwaved "food." More meals are eaten in the minivan than the kitchen. (full article on The Huffington Post)

No sources cited, but let’s just assume that’s all true. Hyman seems to be setting a pretty high bar—his idealized meal has to be eaten inside the home, cooked without a microwave, last longer than 20 minutes, happen more often than 3x/week, and take place in the absence of television or texting. We know that in 1900 at least two of his criteria were being met—we had the eating-at-home part down, and television and texting hadn’t yet been invented. But that doesn’t mean people were sitting down to eat together for 20+ minutes on a daily basis, or that the ones who did were cooking those meals themselves.

Actually, the idea of the “family meal” was invented in the mid-19th C. Before that, in wealthy families, children would eat in the nursery with a nanny or servants until they were old enough to be sent off to boarding schools or join their parents and other guests at formal meals, probably in what we’d now consider adolescence. In poorer families, everyday meals were casual affairs and often staggered. As Nancy Gibbs writes in Time magazine:

Back in the really olden days, dinner was seldom a ceremonial event for U.S. families. Only the very wealthy had a separate dining room. For most, meals were informal, a kind of rolling refueling; often only the men sat down. Not until the mid–19th century did the day acquire its middle-class rhythms and rituals; a proper dining room became a Victorian aspiration. When children were 8 or 9, they were allowed to join the adults at the table for instruction in proper etiquette. By the turn of the century, restaurants had appeared to cater to clerical workers, and in time, eating out became a recreational sport.

Other food historians like Harvey Levenstein also agree that it was only in the Victorian era that the “family meal” became a preoccupation—and it emerged first among bourgeois families, for whom eating “correctly” became a crucial way of distinguishing themselves from the working classes. It was increasingly seen as inappropriate to delegate the feeding of children to servants because mealtime was such a crucial opportunity for training in manners, conversation, and taste.

Much of the advice published in the 19th C. aimed at Victorian mothers focused on the proper care of the adolescent female body, including how daughters should eat and exercise to cultivate the correct social identity and moral character. Meat and spicy foods were thought to stimulate and signal sexual desire, so their consumption was seen as unsuitable for girls and women. More generally, eating, food preparation, digestion, and defecation all came to be constructed as coarse and un-ladylike. According to Joan Jacobs Brumberg, this combination of smothering maternal concern about eating and the elevation of restraint and physical delicacy prompted the emergence of anorexia nervosa among middle-class girls in the Victorian era (Fasting Girls p. 134).

Furthermore, concerns that the “family meal” is dying are almost as old as the idea of the “family meal” itself. According to nutrition policy and research analyst Paul Fieldhouse:

[The] nuclear concept of the family meal is a fairly modern phenomenon… and there is evidence that every generation has lamented its demise. Already in the 1920s there were worries being expressed about how leisure activities and the rise of the car were undermining family mealtimes! (From Eating Together: The Culture of the Family Meal)

The fact that the “family meal” is a modern idea, was historically limited mostly to the wealthy, and has apparently always been on the decline doesn’t mean its prevalence hasn’t been historically variable. The statistics I find the most convincing are the ones Robert Putnam cites in Bowling Alone:

The fraction of married Americans who say “definitely” that “our whole family usually eats dinner together” has declined by a third over the last twenty years, from about 50 percent to 34 percent…. The ratio of families who customarily dine together to those who customarily dine apart has dropped from more than three to one in 1977-78 to half that in 1998-99. (Bowling Alone p. 100)

That’s a considerably more conservative definition than Hyman’s—Putnam is interested in togetherness, not what kind of food you’re eating or whether the television is on—and even so, neither its original prevalence nor its decline is all that staggering. The phenomenon of nuclear families eating together probably peaked sometime between the 1940s and 1970s, but it was still habitual for less than half of the American population and probably mostly limited to relatively affluent, dual-parent, single-income households. And despite how much more free time Americans supposedly had to cook, and how much harder-working they supposedly were back then, we know that most of those households relied on domestic servants, restaurant meals, take-out, and/or industrially-processed convenience foods at least some of the time. The heyday of the “family meal” was also the heyday of Jell-O salads and Peg Bracken’s The I Hate to Cook Book.

The idealized “family meal”: the Cleaver family in an episode called “Teacher Comes to Dinner,” first aired in 1959.

By the 1990s, the share of families “usually” eating together had declined by 16 percentage points, which is significant. However, recent trends point towards a revival since the 1990s, not further decline. According to the National Center on Addiction and Substance Abuse (CASA) at Columbia University, the number of adolescents who reported eating with their families “most nights” increased 23% between 1998 and 2005. In CASA’s 2010 survey of over 2,000 teens and 456 parents, 60% said they eat dinner with their families at least five times a week. And there’s more good news for people who think the “family meal” has to happen at home: in the last two years, restaurant traffic in the U.S. has fallen, probably due to the recession.

In other words, it’s not at all clear that the family meal is dying or that taking it back would be all that “radical.” More coming soon on why it doesn’t really matter if the television is on or who did the cooking (at least in terms of benefits for children), why increasing the prevalence of family meals—even if you could achieve that just by scolding people about it—probably wouldn’t fix anything, and what could be done that might actually improve American eating habits…

In Praise of Fast Food by Rachel Laudan

is this really inferior...

to this?

I don’t normally just post links to other content, but this article is such a smart and elegant polemic against “Culinary Luddism”—the idea that “traditional” foodways (many of which aren’t actually traditional) are sacrosanct and that modern industrialism has just ruined everything—that I’m making an exception. I found myself nodding enthusiastically throughout and felt called to arms when I reached the conclusion:

What we need is an ethos that comes to terms with contemporary, industrialized food, not one that dismisses it; an ethos that opens choices for everyone, not one that closes them for many so that a few may enjoy their labor; and an ethos that does not prejudge, but decides case by case when natural is preferable to processed, fresh to preserved, old to new, slow to fast, artisanal to industrial. Such an ethos, and not a timorous Luddism, is what will impel us to create the matchless modern cuisines appropriate to our time.

YES. It is naive—at best—to believe that the world before industrial, processed food was idyllic in terms of nutrition, the distribution of agricultural and culinary labor, or the quality and variety of culinary experiences. At worst, ideas about what counts as “natural” or “real” with little basis in historical or scientific fact are used to reinforce social inequalities and to make people feel either guilty or morally superior to others because of how they eat. Laudan corrects mistaken assumptions about the “halcyon days of yore” and critiques common beliefs about the inherent superiority of the local, slow, artisanal, etc.

At least for now, the whole article is available for free. If you want to know why “modern, fast, processed” food isn’t such an unmitigated disaster after all, and why we may actually need mass-production, even if it’s a more considered, regulated, ethical form of mass-production, this is a good place to start:

In Praise of Fast Food

Hipsters on Food Stamps Part II: Who Deserves Public Assistance?

Man should not be ready to show that he can live like a badly-fed animal. He should decline to live like that, and should either steal or go on the rates, which is considered by many to be a form of stealing.–Oscar Wilde

avocado is another tricky one: relatively expensive and often considered delicious, but technically "fresh produce" and generally considered to be healthy despite being high in fat; how would the people who would have food stamps restricted to virtuous, non-luxury items feel about it? would it matter if it was organic? image from Look At This Fucking Hipster

It’s been a couple of weeks, so first a brief recap: in the first entry, I looked at the recent article on Salon about “hipsters” using food stamps to purchase luxury foods, which was maddeningly imprecise about the employment and financial circumstances of newly-qualified food stamp recipients and about what they’re actually buying, as opposed to merely sauntering past. Relying almost exclusively on anecdotal evidence and rumor, the original article seemed designed primarily to build on popular stereotypes about “hipsters” and elicit outrage about this potentially-apocryphal trend in food stamp use.

In less than a week, the article attracted nearly 500 comments (Salon closed it down at 473). I didn’t read all of them, partially because a few themes emerge pretty quickly and they all start to sound the same. Here are the primary camps:

1. Outraged sheeple—a lot of people were completely sold on the veracity of the trend and responded exactly the way the article primes them to, i.e. how dare people who receive food stamps shop at Whole Foods, purchase gourmet or exotic ingredients, or ever buy anything more expensive or pleasurable than the bare minimum required to ensure their survival. This camp is split between people who object only to food stamps being spent on non-“essential” foods and people who apparently believe that people receiving public assistance should not be able to purchase anything that might be construed as a “luxury,” even with their own money.

2. Better than Doritos—another group of people who believed the story thought it was a good thing, at least as long as the food being purchased is healthier. This was frequently accompanied by the suggestion that eating “better” food would prevent recipients from getting fat and becoming a drain on the health care system. Virtually no one defended the purchase of “premium” foods on the grounds that they might be more pleasurable than whatever kind of gruel or cabbage soup might be the cheapest way to fulfill your nutritional needs.

3. Critics of the article and the sheeple—a number of people brought up the work requirement for food stamps (which has been temporarily lifted in most states by the emergency relief act). Others noted that people qualify for a set monthly allotment of food stamps, so it’s not as if they get more assistance if they choose to purchase expensive things. This was often expressed as a hope that the article, or the idea of hipsters taking unfair advantage of public food assistance, wouldn’t be used as political leverage against food stamp programs or welfare in general.

4. Critics of welfare qua welfare—a lot of people who commented on the article seemed less concerned about what people are buying with food stamps than the fact that anyone who might be described as a “hipster” would qualify in the first place: 

My issue lies with the fact that young, healthy, educated people are receiving government assistance in the first place. Rather than sully their precious hipster cred with some dreaded, uncool job such as waiting tables or manning the counter at Borders, these spoiled, art-damaged infants decide to go on food stamps. –SadieG

I belong to the third camp, which is basically what the first entry covered. I’ll look at the first two responses in more depth some other time. In this entry, I look at the misconceptions and anxieties expressed by this last group of comments and explain why some people find the idea of educated, young people being the recipients of food stamps—whether they’re using them to buy ramen or rabbit—so very infuriating.

Choosing to Be Poor

One of the main factors in this flavor of outrage is the mistaken assumption, discussed in the first entry, that people have to be unemployed to receive food stamps. What seems to rankle the welfare critics is their belief that young, able-bodied, educated people must be unemployed by choice and thus responsible for their poverty. SadieG was far from alone on this:

These are able-bodied 20/30-somethings with education. Granted, the job market is extremely weak but I have a hard time believing these people truly exhausted their options. Did they look into fast-food, janitorial services, retail? Those types of jobs have lots of turnover, so there’s almost always something available. Did they look into picking up a trade? I would say the chances are No – these type of jobs don’t fit into their self-image as an artist or whatever. So, even though this is a situation of their own making – they’re expecting the government to subsidize their lifestyle. And its all being paid for by people who actually bite the bullet and work at jobs they don’t necessarily love do what they do in order to support themselves and their families and not be a burden to society. Yeah, its pretty appalling. –CBFE

I don’t like that food stamps and unemployment are so readily handed out to people who are arguably unemployed or underemployed by choice for years at a time. –ohthatkate

It amazes me that people can insist on saying they would never do anything they consider "beneath" them, that is never some kind of job that is not "art related", and therefore status-y, but still have no problem taking charity handouts. These people need to either find a way to make a living or face reality. –Luccianna

Maybe a degree in post feminist analysis of Sumerian Temple Prostitutes wasn’t such a wise choice after all. —Senator Neptune

The author of the original article didn’t actually specify whether the people she interviewed were unemployed, and none of them said anything about refusing jobs that were “beneath them.” However, her anecdotes certainly implied that this trend was largely driven by artists and people with humanities degrees. For many readers, the anecdotes clearly spoke louder than the dismal unemployment statistics she mentions, or the fact that taking a low-wage job you might be overqualified for wouldn’t actually disqualify you from receiving food stamps—even making $10/hr working 35 hrs/wk, a single wage-earner in a family of three would qualify for $288/month in food stamps.
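That last point can be made concrete. Here’s a rough sketch of how a figure like $288/month could arise under the 2010 SNAP rules. The parameter values below are the real FY 2010 figures (a $526 maximum allotment for a household of three, a $141 standard deduction, a 20% earned-income deduction, and a $459 cap on the excess shelter deduction), but the $815/month in rent and utilities is my own assumption, since the benefit depends on shelter costs that the statistic doesn’t specify, and the rounding conventions are simplified:

```python
# Rough sketch of the FY 2010 SNAP benefit calculation for a 3-person household.
# The allotment, deduction, and cap values are FY 2010 figures; the $815/month
# shelter cost in the example below is an assumption. Rounding is simplified.

MAX_ALLOTMENT = 526       # FY 2010 maximum monthly benefit, household of 3
STANDARD_DEDUCTION = 141  # FY 2010 standard deduction, households of 1-3
SHELTER_CAP = 459         # FY 2010 cap on the excess shelter deduction

def snap_benefit(gross_monthly, shelter_cost):
    """Estimate the monthly benefit for a 3-person household with earned income."""
    # 20% of earned income and the standard deduction come off the top.
    adjusted = gross_monthly * 0.80 - STANDARD_DEDUCTION
    # Shelter costs above half of adjusted income are deductible, up to the cap.
    shelter_deduction = min(max(0, shelter_cost - adjusted / 2), SHELTER_CAP)
    net_income = max(0, adjusted - shelter_deduction)
    # Households are expected to spend 30% of net income on food;
    # the benefit makes up the difference from the maximum allotment.
    return max(0, MAX_ALLOTMENT - round(0.30 * net_income))

# $10/hr at 35 hrs/wk is about $1,517/month gross
print(snap_benefit(10 * 35 * 52 / 12, shelter_cost=815))
```

With those assumed inputs, the sketch yields $288/month. The point is that earned income reduces the benefit but doesn’t eliminate it—working a low-wage job and receiving food stamps are entirely compatible. (The real program’s rounding and deduction rules are slightly different, so treat this as an illustration, not an eligibility calculator.)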

As the original article notes, unemployment rose by 176% between 2006 and 2009 for college-educated people between the ages of 20 and 24. The biggest caveat attached to the recent economic “recovery” has been the persisting unemployment disproportionately affecting young people. Increasingly, even for people with college degrees, unemployment or underemployment isn’t a choice right now.  

But even setting the reality of the job market aside, let’s take this brand of outrage to its logical conclusion: if it’s wrong for people with college degrees (or certain kinds of college degrees) to get food stamps, then presumably, that should be added to the list of disqualifying factors. In light of the specific ire directed at the arts and humanities, the exclusion could be limited to graduates with degrees in the actual analog of Senator Neptune’s “post feminist analysis of Sumerian Temple Prostitutes.” It wouldn’t even be difficult to enforce—I’m sure the administrators of the program could check for applicants’ degree history just as easily as they can verify that senior citizens in the program have no more than $2,000 in assets, and there wouldn’t even need to be a debate about what counts—states could just use the CIP codes for the “humanities” assigned by the U.S. government.

Based on that system, a low-income college graduate who majored in something like math, astronomy, or sociology could still get supplementary nutritional assistance, but one who majored in history, linguistics, or philosophy would be out of luck. Would that really make any more sense or better fulfill the goals of the Food Stamp program? Are people who study art history and end up working minimum wage jobs any more culpable for their poverty than sociology majors? Should the government really deny assistance to people with the naïveté or gumption to major in poetry writing, but extend benefits to journalism majors who chose to ignore the fact that the profession they were training for was in the middle of a precipitous decline?

Especially in the current economic climate, no one is guaranteed a job—let alone one that pays more than poverty wages—regardless of how much education they have and what kind. Aside from unfairly punishing people whose particular interests or talents might not have been well-served by one of the sciences or a pre-professional program, this kind of policy might well discourage people from finishing their degree if they don’t have guaranteed employment. College drop-outs would at least still have a safety net. It might also discourage students who don’t come from wealthy backgrounds from majoring in English or History.

The primary faulty assumption these comments seem to rely on is that if you have a degree, you should be able to get a job, and if you can’t, you have done something wrong—gotten the wrong degree, been unwilling to accept menial or low-paying work, failed to consider all your options, etc. And therefore, taxpayer dollars shouldn’t go towards making your life even marginally more tolerable.

The Meritocracy Myth

If white kids with educations – who should be entitled to the good life – end up on food stamps, that bodes ill for everyone. Clearly they must be cheating or bad people, because good people can avoid such problems.—softdog

People tend to attribute the especially fervent faith in meritocracy in the U.S. to the “Puritan work ethic” or the fact that from its founding, America was seen as a “land of opportunity” and mobility in contrast with class-bound Europe. Although opportunity has never been as universal as the American Dream suggests, the idea that hard work could get you farther in the U.S. wasn’t entirely a fiction for at least the first two centuries after the nation’s founding. According to Joseph Ferrie’s analysis of U.S. and British census records from the 1850s through the 1920s, more than 80% of the sons of unskilled men born in the U.S. during that period moved to higher-paying, higher-status positions while fewer than 60% of the sons born in Britain did so. The economic prospects for non-white, female and immigrant Americans were considerably bleaker, but throughout much of the 20th C., intergenerational income mobility in the U.S. increased regardless of race, gender, or nation of birth.

the closer the mobility percentages are to 20, the greater the mobility

Intergenerational income mobility—or the chances of making more money than your parents—began to fall sharply in the early 1980s and has been declining ever since, according to a 2008 study. Families have also become less likely to move out of their starting income quintile in recent decades—the panel study whose results are shown in the chart to the right found that between the 1970s and 1990s, the chances of a family moving up or down the income ladder decreased. As a result, contrary to popular belief, as of 2009 class structure in the United States is less fluid than it is in countries like France, Germany, Britain, Denmark, and Canada.

Nevertheless, survey research suggests that the vast majority of Americans not only still believe in the possibility of mobility—that belief has actually increased even as mobility has declined. In 1983, only fifty-seven percent of respondents claimed to believe it was “possible to start out poor in this country, work hard, and become rich,” and thirty-eight percent said it was “not possible.” In 2005, eighty percent said they thought it was possible, versus only nineteen percent who said it was not. And people are not only increasingly likely to believe that mobility is possible, they also tend to say that mobility is increasing rather than decreasing: the most popular response to a question in the 2005 survey about the “likelihood of moving up from one social class to another” now compared with 30 years ago was “greater” (forty percent).

from the great series on class published in the NYTimes in 2005

What these statistics suggest is that the myth of meritocracy isn’t just some historical holdover from America’s Puritan roots—it’s an ideology that has become far more dominant in the last few decades. It’s also worth noting that the “rags to riches” stories associated with Horatio Alger emerged in the Gilded Age, another period of dramatic income inequality and relatively low mobility. Of course, there are lots of reasons why people might want to believe that hard work pays off and talent and effort are reliably rewarded in any era. It enables people to take credit for their success and represents a kind of basic fairness. But I think the reason faith in meritocracy increases as the prospects of mobility and job security decline is that it also offers a form of false but powerful reassurance that becomes more compelling in periods of insecurity and stagnation.

The myth of meritocracy makes people think that they can insulate themselves from failure or poverty by simply making the “right” choices. The more threatening those things get, the more people cling to the myth. To accept the alternative—that making the right choices can’t protect you and systemic instabilities make everyone vulnerable—means that no one is safe.

But of course, that’s the whole point of social welfare programs. No one is safe. Especially since the recent recession, when even many elite law school graduates have been unable to find jobs—or at least ones that will ever enable them to pay back their debt—it’s not just artists and the mythical majors in “post feminist analysis of Sumerian Temple Prostitutes” faced with lingering unemployment or underemployment. The idea that people choose to be poor because they can get food stamps or that food stamps represent some excessive government largess is, at best, willful ignorance.

Coming in part III, more thoughts on the various ways people in the first two camps are basically in the same camp—although they disagree about the details, they both want to impose their own ideas about how people should eat on the recipients of public assistance.

Sourdough-Risen Whole Wheat Bagels and the Sweetness of the Old World

Happy day after St. Pat's! Can I offer you some carbohydrates? Perhaps slathered in some fat?  

“Authentic” Bagels: Boil, Bake, and Bluster

There are three things that distinguish bagels from other breads:

The first, perhaps obviously, is the shape. There are at least four different theories about the origin of the word “bagel,” and all of them refer to the shape (etymology notes below the recipe for fellow word geeks). However, you can’t just make a standard bread dough into rings, throw it in a hot oven, and expect it to develop the glossy crust and dense, chewy interior that most people associate with bagels.

The second difference is an issue of method: bagels are traditionally boiled before they’re baked, which causes the surface starch to gelatinize, producing their characteristic smooth, shiny crust. The same is true of pretzels, which originated in the same region and, according to Maria Balinska, who wrote a 2008 book about the history of the bagel, are probably related. She specifically calls them “cousins,” whatever that means in terms of food history. She also notes that the Polish obwarzanek—another boiled, ring-shaped bread often sprinkled with sesame or poppy seeds—is an “older and Christian relative,” so perhaps that’s the spinster aunt who devoted herself to Jesus. Or maybe the bagel married in, likely to the tacit (if not explicit) alarm of some of the older members of the Christian family. Google translates the Polish entry on “Obwarzanek” to “Bagel,” and this travel guide refers to them as “pretzel rings.” I’m sure different people have different ways of distinguishing between the three, but the boil-then-bake method they share probably makes them more alike than different. So, for example, some people might think pretzels have to be shaped like folded arms whereas other people accept rods or rings as “pretzels,” but either way they’re formed from ropes of dough that maximize the surface area exposed to the boiling water, just like their relatives.

The third difference is an ingredient—bagels are the only bread I know of whose recipes frequently call for malt extract. Pretzel recipes occasionally include it, but not nearly as often as bagel recipes, many of which claim that the malt extract is the key to making “authentic” bagels or achieving a truly “bagel-y” flavor.

The idealized referent of bagel authenticity is usually the “New York bagel,” rather than their Polish-Jewish ancestors. However, when I lived in New York City, I ate plenty of bagels—even at delis on the Lower East Side—that were indistinguishable from the ones available at chains like Bruegger’s and Einstein’s nationwide. Perhaps that’s just further evidence of the declining standard described here (accompanying a recipe that demands malt powder):

I can’t count how often expatriate New Yorkers would stop me on the street with tears in their eyes, telling me that mine were the best bagels they’d had since they left "The City," and that they were better than most in "The City" these days. The reasons are simple. I didn’t cut corners and used good ingredients. I don’t know why so many bakeries cut corners on making bagels these days, it’s really NOT that hard!

But I think it’s more likely that the idea of the superior New York bagel is primarily the product of nostalgic fantasies and social decline narratives—it’s something that never was and tells you more about contemporary anxieties and desires than anything real in the past. The tears in those expatriates’ eyes say more about contemporary feelings of depthlessness and transience, the desire for connections to the past and a sense of community, and the myriad dissatisfactions that make people want to think everything was better in the “good old days” than what makes a bagel delicious or “authentic” to anything.

Malt Extract: the Ancient Sweetener in your Bud Light

Given how the same bakers describe malted barley extract on their ingredients page, its presence is probably one of the so-infuriatingly-cut corners they’re talking about:

We wouldn’t dream of making bagels or kaiser rolls without barley malt extract, and neither should you! Barley malt extract improves the taste and texture of the breads it is used in. It goes by a number of names. barley malt extract and malt extract among them. If a malt extract doesn’t specify what grain it is made from, chances are pretty good it was made from barley. Barley is a grain used mostly in brewing beer and making Scotch Whisky. Barley makt [sic] extract adds a nice taste to breads where it is used. For our recipes, you can either liquid or dry, diastatic or non-diastatic malt extract and not worry about changing the recipe, any combination of these will work just fine. The important things to avoid are hopped malt extract which is really only useful for making beer and the malted milk powder sold in many grocery stores as a milk flavor enhancer which has too little malt in it and too much sugar.

From an 1896 Harper's Magazine

Malt extract is basically just sugar made from grain, usually starting with barley. According to Harold McGee, it’s “among the most ancient and versatile of sweetening agents, and was the predecessor of modern-day high-tech corn syrups.” Just like corn syrup and agave nectar, malt extract is produced by breaking starches into their constituent sugars. Rather than adding enzymes or acids, malting works by simply germinating or sprouting the grain. As a grain germinates, it produces enzymes that digest the grain’s starch to fuel its growth. Those enzymes can be dried and mixed with cooked grains (usually rice, wheat, and barley), which they can also digest, producing a sweet slurry containing lots of glucose, maltose (glucose+glucose), maltotriose (glucose+glucose+glucose), and some longer glucose chains.

It’s not as sweet as sugar, but before sugar colonialism, it was one of the primary sweeteners available in Europe and Asia (the other two were honey and molasses made from sorghum). According to McGee, it was the primary sweetener in China until around 1000 CE, and is still used in China and Korea for confections and the sweet, caramelized gloss on dishes like Peking Duck. Malt extract is also still frequently used in beer brewing—a friend who does home brewing told me recently that American brewers are especially likely to use it to adjust the alcohol content of their beers midway through the brewing process. Apparently the laws regarding how closely the alcohol percentage matches what’s on the label are fairly strict, and since the sugars in malt extract are highly available to yeast, it’s a good way to increase yeast activity quickly and reliably.


Possibly-Heretical Baking Substitutions

I'm not 100% sure what the label means. Is it malted wheat? Malted barley that was fed with cooked wheat? Malted wheat fed with cooked wheat?

McGee claims that malt extract is “frequently used in baking to provide maltose and glucose for yeast growth and moisture retention,” and that might be true for commercial bakers, but it’s not available at most grocery stores, where home bakers get their supplies (it can be found anywhere that carries home brewing supplies and many “natural foods” retailers, including some Whole Foods). However, before sugar was readily available and cheap, it seems likely that malt extract was used the way other sugars often are today—to speed up yeast activity, enhancing rise and oven spring—in many kinds of bread, not just bagels.

Some bagel recipes call for other sugars in place of the malt extract in bagel dough—the first recipe I tried called for maple syrup, perhaps because of its phonological similarity to “malt syrup,” the liquid form of malt extract, or because they’re both liquids, though malt syrup is much thicker—closer to unfiltered honey. Recipes that call for “malt powder” but also recommend a sugar substitution generally call for brown sugar. And I found at least one that suggests malt powder, malt syrup, honey, and maple syrup are all interchangeable. Of course, they all have slightly different flavors, but most recipes only call for 15-20 g for ~8 bagels, so any effect the sweetener’s flavor has on the final product is bound to be minimal.

It’s been a while since I made the maple syrup batch, but I honestly didn’t notice any major flavor difference in the batch pictured above, which used malt extract. Perhaps part of the problem was that I used a “wheat” malt, which may not have as malty a flavor as barley malt. But, again according to McGee, even when it starts with malted barley, “malt syrup has a relatively mild malt aroma because the malted barley is a small fraction of the grain mixture.” In short, despite what some recipes say, you shouldn’t let your lack of malt extract stop you from making homemade bagels.

Nonetheless, it’s still a mystery why bagel recipes would be more insistent about malt extract than recipes for any of the other breads descended from European varieties developed before sugar colonialism. Why are people so willing to substitute sugar in everything from soft, buttery brioche to pretzels, bagels’ closest cousins, but fanatical about the importance of using this particular Old World sugar to certify the authenticity of the bagel?

A Fetish for the Old World

My theory is that it has to do with the bagel’s iconicity and association with Jewishness. One story about the origin of the bagel that seems plausible (though Balinska lumps it with the story about stirrups—explained in the etymology note at the end—as speculative at best and possibly fictitious) is that it’s another version of the ubiquitous roll-with-a-hole, developed by Jewish bakers in Krakow after a decree limiting baking or trade in flour to the bakers’ guild was lifted. Even in Poland, which from its founding was more tolerant of Jews than most countries in Europe, Christian trade and craft guilds in many cities excluded Jewish merchants and artisans, who sometimes formed their own guilds. The travel guide’s description of Obwarzanek claims that King Jan Sobieski lifted the ban in 1496, but he didn’t rule until the 17th C. Other claims that Jan Sobieski lifted the ban in the late 17th C. are problematic because the Yiddish word “beygel” was already in widespread currency in Krakow by 1610. There was a different King Jan in 1496—Jan I Olbracht, or John I Albert—whose reign was also notable primarily for wars against the Turks. Perhaps he lifted the ban, and Jan Sobieski’s greater fame and friendliness to Poland’s Jews sort of absorbed the earlier Jan’s bagel-inspiring or -enabling acts?

Regardless of precisely when or why Jewish bakers in Krakow started making their own version of the obwarzanek, it’s probably the strictness of Jewish dietary laws that made it so popular and caused it to spread to different Jewish communities, whereas the obwarzanek has remained basically a Krakow specialty. It’s leavened, so it’s not kosher for Passover, but it doesn’t contain any dairy, so it is parve. Additionally, the thick, solid crust keeps the interior soft and moist better than a split or craggy crust would. So while bagels, like most breads, are tastier when enhanced with fatty spreads or toppings, they’re not bad plain. I suspect that’s also why the Jewish bagel is traditionally shaped into a smooth round, whereas obwarzanki are usually twisted and supposedly do stale quickly:

On leaving the oven the baked goods have a sell-by date of about three hours. As such, finding a hot one is essential. Enjoyed by people of all ages, obwarzanki also feed Kraków’s entire pigeon population when in the evenings the city’s 170-180 obwarzanki carts essentially become bird-food vendors.

Of course, soft pretzels also have a smooth crust that protects the soft interior and makes them tasty with or without added fats, but the pretzel was never associated with Jewishness. The popularity of bagels in America and the canonization of the “New York bagel” have everything to do with Jewishness. According to We Are What We Eat: Ethnic Food and the Making of Americans by University of Minnesota historian Donna Gabaccia:

It is true that in the 1890s in the United States only Jews from Eastern Europe ate bagels. In thousands of nondescript bakeries—including the one founded in New Haven around 1926 by Harry Lender from Lublin, Poland—Jewish bakers sold bagels to Jewish consumers. The bagel was not a central culinary icon for Jewish immigrants; even before Polish and Russian Jews left their ethnic enclaves or ghettoes, their memories exalted gefilte fish and chicken soup prepared by their mothers, but not the humble, hard rolls purchased from the immigrant baker. As eaters, Jewish immigrants were initially far more concerned with the purity of their kosher meat, their challah, and their matzos, and with the satisfactions of their sabbath and holiday meals, than with their morning hard roll….

[Bagels] became firmly identified as “Jewish” only as Jewish bakers began selling them to their multi-ethnic urban neighbors. When bagels emerged from ghetto stores as a Jewish novelty, bagels with cream cheese [which, as she elsewhere notes, was initially developed by English Quakers in the Delaware Valley and Philadelphia in the eighteenth century] quickly became a staple of the multi-ethnic mix that in this century became known as “New York deli,” and was marketed and mass-produced throughout the country under this new regional identity.

As she also notes, in Israel bagels are considered “American,” not Jewish. However, their widespread association with Jewishness in the U.S.—both as a marketing tool and as the basis of legit cultural practice and memory—puts greater demands on bagels and bagel bakers to legitimate their authenticity and historicity. Whether it was a continuation of the pre-18th C. practice of using malt extract in many breads to speed up yeast action or a re-introduction of the ingredient from some centuries-old bagel recipes, using malt extract has become one way for people to differentiate their bagels and lay claim to greater “authenticity.”

Making Your Own

Authentic or not, this recipe is delicious and fairly easy. Like most yeast breads, it takes time, but not a lot of active time. You can use any combination of flours you want, but if you want a really chewy crust and crumb, you will need a high proportion of protein. Some recipes suggest “high-gluten” flour, which has an even higher protein content than regular bread flour. King Arthur claims their “Sir Lancelot” flour is the highest-protein flour currently available for retail sale at 14.2% protein. I just used regular bread flour (10-12% protein), whole wheat bread flour (up to 14% protein, although the additional fiber seems to limit gluten action, which is also why I didn’t make them with 100% whole wheat flour), and added approximately 1 T. vital wheat gluten per cup of flour (including the flour in the starter). Even if you just used all-purpose flour, they would probably still be good, just less chewy.
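If you're curious how much the vital wheat gluten boosts the protein level, a weighted average gives a ballpark figure. This is just a sketch using the flour weights from the recipe below; the ~75% protein content I've assumed for vital wheat gluten is a typical label value, not something from this post:

```python
# Estimate the protein percentage of a flour blend as a weighted average.
# Masses are the recipe's gram weights; protein fractions are approximate
# (the 0.75 for vital wheat gluten is a typical label value, an assumption).
flours = [
    (250, 0.14),  # whole wheat bread flour: grams, protein fraction
    (250, 0.11),  # bread flour
    (50, 0.75),   # vital wheat gluten
]
total_mass = sum(mass for mass, _ in flours)
total_protein = sum(mass * protein for mass, protein in flours)
print(f"Blend: {100 * total_protein / total_mass:.1f}% protein")
# → Blend: 18.2% protein
```

By that rough math, the blend lands well above even "Sir Lancelot's" 14.2%, which is why a tablespoon of gluten per cup goes such a long way toward chewiness.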

You can add more of any kind of sugar or whatever else you might want in them—dried fruits, nuts, chocolate chips, chopped spinach, grated cheeses, etc. And you can top them however you like—I used kosher salt for some, sesame seeds for some, and a combination of bits of fried garlic, fried shallot, black sesame seeds, and kosher salt, kind of like one version of an “everything” bagel. I think they’re best fresh out of the oven, slathered with butter, but true to form, they’re also good plain (and easy to stow in a bag for a convenient snack) and on days 2 and 3, they’re great toasted.

 and all the delicious bits that fall off can be pressed into the soft side 

Recipe: Sourdough-Risen Bagels (to substitute instant yeast see this entry)

  • 2 cups refreshed starter (450g)
  • 3 1/4 cups 12-14% protein flour (550g); I used:
    • 4 T. vital wheat gluten (50g)
    • 1 1/2 cups whole wheat bread flour (250g)
    • 1 1/2 cups bread flour (250g)
  • 4 t malt extract (20g)—optional
  • 2 t kosher salt (15g)
  • 3/4 cup water (170g)
  • 3 T oil (35g)
  • 1 T maple syrup (20g)
  • toppings—sesame or poppy seeds, salt, fried garlic or shallots, finely grated hard cheeses, etc.
  • 1 tsp. baking soda (for poaching water, not for dough)

1. Whisk together flours, gluten, and malt extract if using. Add starter, water, salt, oil, and maple syrup.

ingredients combined; enough of a dough to begin kneading

2. Mix until they begin to form a dough. Turn onto a lightly floured surface and knead for about 15 minutes. If you have a mixer or processor with a dough hook, you can use it for this step. Gluten development is pretty important if you want chewy bagels, so it’s worth checking for the baker’s windowpane.

after 12 minutes of kneading, close but not quite; after 15 minutes, it's smoother and stretchier

3. Cover and let rise 3-4 hours, or until doubled. You can let it rise longer and nothing bad will happen, although the sour flavor will become more pronounced over time, and positively sourdough-like after 12-15 hours. You can significantly retard the rise by refrigerating the dough.


4. Divide the dough into 8-12 equal pieces. If you want to be especially particular, use a scale. Eight will be ~155g each, ten will be ~125g each (the size I made), twelve will be ~105g each. Shape them either by poking a hole in the middle of a round and stretching it out or rolling the dough into a rope 9-12 inches long, and pinching the ends together. In my experience, the latter makes for a slightly more consistent thickness.

the poking method; the rope method; the two lumpier ones on the top right were shaped by stretching a hole, the others were all made with the rope method
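The per-piece weights in step 4 fall straight out of the total ingredient mass; here's a quick sketch using the gram measurements from the recipe list:

```python
# Sum the recipe's gram weights, then divide by the number of bagels.
ingredients = {
    "starter": 450, "flour (incl. gluten)": 550, "malt extract": 20,
    "salt": 15, "water": 170, "oil": 35, "maple syrup": 20,
}
total = sum(ingredients.values())  # 1260 g of dough
for pieces in (8, 10, 12):
    print(f"{pieces} bagels -> {total / pieces:.1f} g each")
# → 8 bagels -> 157.5 g each
# → 10 bagels -> 126.0 g each
# → 12 bagels -> 105.0 g each
```

Those come out close to the approximations in step 4 (eight pieces is really ~157 g, so the ~155 g figure rounds down a little).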

5. Let rise another 3 hours (30-45 min. if using instant yeast) or cover and refrigerate overnight or up to a week, and remove 1 hour before you’re ready to bake to let them come back to room temperature (so if you want fresh bagels in the morning, you need to make the dough by the afternoon before).

6. When ready to bake, preheat the oven to 400F, boil a large pot of water with 1 tsp. baking soda dissolved in it, and put a couple tablespoons of any toppings you want into bowls.

clockwise from the top left: kosher salt, an "everything" mix of fried garlic, fried shallot, black sesame seeds, and salt, and plain roasted sesame seeds

7. When the water is boiling, carefully lower in 2-3 bagels at a time (more if the pot is large enough that they can float without touching) and poach them for 1 minute on each side. Remove them to a colander and then, while they’re still wet, place them in one of the bowls of toppings.

poaching side 1 poaching side 2 collecting toppings

ready to bake 

8. Bake for 20-25 minutes, or until browned. Rotate pans half-way through if your oven is uneven.

Fun With Etymology

Leo Rosten’s Joys of Yiddish (1968) says the origin is “beugel,” the German word for “a round loaf of bread,” although it’s a little perplexing why that would have been used to describe a bread that, unlike the ubiquitous round loaf, is ring-shaped. And also, Wiktionary is all “beugel? I don’t know no beugel.”

Several sources, including a 1993 New York Times article and a 2006 book titled Bakery Products: Science and Technology, refer to a popular myth that bagels were invented by Jewish bakers in Vienna as a tribute to King John (Jan) III Sobieski of Poland after he saved the city from Turkish invaders in 1683. The King’s favorite hobby was horse riding, so they shaped the rolls like stirrups, the German word for which is “bugel” (the Austrian word is “beugel,” which may be the origin of the first faux-etymology). However, a letter to the editor demanded that “that piece of fakelore be laid to rest,” noting that the Yiddish word “beygl” appears in the communal rules promulgated by the leaders of the Cracow Jewish community in 1610: “The rules stipulate that bagels are among the gifts which may be given to women in childbirth and to midwives.” Furthermore, the word appears in the rules without any definition or explanation, suggesting that it was already well-established by the early 17th C.

Two that seem more likely: the Oxford Companion to Food (1999) says the word comes from “bugel”—not the German word for stirrup, but the Middle High German word for “ring or bracelet.” And in Jewish Cooking in America (1994), Joan Nathan claims that the word derives from “biegen,” the verb meaning “to bend.” Both “bugel” and “biegen” are derived from the Old High German “biogan,” meaning to bow, bend, or curve, and the related root “boug-,” which in turn is descended from the Proto-Germanic “beugan” (which, incidentally, also gives us the Old English root “beag” or “beah,” which also refers to a ring—“usually meant for the arm or neck; but in one case at least used of a finger ring” OED). So that Germanic root for all things bendy and ring-like is likely the origin of the Yiddish word that was in wide use in Poland by 1610.

Restaurants of New York: Stop Serving Assemblyman Felix Ortiz Food Prepared With Salt In Any Form

My sincere apologies to any lookalikes. Perhaps you could go moustache-less for a while? No salt for you!

Just over a week ago, a New York state assemblyman from Brooklyn named Felix Ortiz proposed a bill that would prohibit “the use of salt in any form in the preparation of any food for consumption” with penalties of “not more than one thousand dollars for each violation.” Presumably that wouldn’t prevent restaurants from providing salt for customers to add at their own discretion, but the bill offers no further details about what would and wouldn’t be considered a “violation” of the law or what is and isn’t included in the definition of “salt in any form”: see the full text here (hat tip: Reason).

Surely table salt (NaCl) would count, but what about any of the other edible ionic compounds that are chemically considered to be salts, like MSG (a sodium salt with the molecular formula C5H8NNaO4) or cream of tartar (a potassium acid salt with the formula KC4H5O6)? What about salty condiments like soy sauce, fish sauce, and ketchup? Would a restaurant that serves a ketchup-topped meatloaf have to forego the salt in the loaf mixture but still be able to slather ketchup on top (and if so, why wouldn’t they just start adding ketchup to the mix as well, and find ways to incorporate condensed soups and bouillon into dozens of other things that don’t already have them)? Or would they have to find or make their own salt-free ketchup—obviously a much larger burden on some kinds of restaurants? What about all the other prepared foods that already include salt and get used as ingredients in the preparation of other foods? Would Momofuku Milk Bar be banned from serving its famous compost cookies, which call for the addition of two “snack foods” like potato chips and salted pretzels?

House-baked, cured, and brined things would clearly suffer most from a law like this. It’s one thing to have to salt a soup or curry or burger at the table, but everything from deli pickles and salami to homemade cinnamon rolls and pie crusts would become completely unpalatable, if not impossible, without salt. When questioned by the Albany Times Union about salt-cured meats and pickles:

Ortiz didn’t have answers, saying repeatedly, "This all needs to be debated."

Of course, it’s probably not worth worrying about the ramifications of a bill that I can’t imagine has any chance of passing. Even the NYTimes has backed down from their initial, crazypants coverage of the recent NEJM study that claimed a small reduction in sodium consumption would save 44,000 lives a year—which is exactly the sort of statistic that gives legs to hysterical nutritional crusades (hysterical both in the funny-ha-ha sense and in the wandering-uterus-induced-insanity sense). The best example of that phenomenon is probably the equally batshit claim that obesity causes 300,000 deaths per year, but even anti-obesity crusaders have struggled to get far less aggressive measures passed, like the mandatory inclusion of calorie counts on fast food menus (which, incidentally, do not seem to reliably reduce how many calories people purchase).

Ortiz’s bill is actually so preposterous and so much more aggressive than the other recent proposals for reducing salt consumption, like the New York City Department of Health and Mental Hygiene’s campaign to persuade food manufacturers to reduce the salt content of processed food by 40 percent over the next 10 years, that I initially thought it might be a sort of “straw man” bill designed by restaurateurs and/or salt-reform-skeptics to win people over by making salt reform seem even crazier than it actually is. But according to Ortiz, it was actually inspired by his father’s death:

He said he was prompted to introduce the bill because his father used salt excessively for many years, developed high blood pressure and had a heart attack (Albany Times Union).

Pity his father’s heart attack couldn’t be attributed to excessive exposure to creepy moustaches.

I've been trying to come up with equivalents and most of them end up being alcoholic: "As much as gin loves olives," "As much as tequila loves lime." There are so few other set-in-stone pairings. "As much as manchego loves quince"? "As much as rich gravies and stews love just a little bit of acid"?

Ortiz’s salt-banning tribute to his dad is sort of like an inversion of the stories about filial love and salt that show up in the traditional folklore of many different cultures, from England to Central Europe to the Himalayan foothills. Many of them begin with a Lear-like scenario where a King or a nobleman in the unfortunate situation of having three daughters in a patriarchal society demands professions of love from each of them to help him decide how to divide his kingdom or estate between them (or, more accurately, their husbands). The elder daughters supply all the hyperbolic declarations of love you’d expect from adult children trying to protect their inheritance, although we’re meant to understand that they’re duplicitous opportunists who love their father’s money and power more than they love him. The youngest, who really loves him, says that she loves her father either as much as she loves salt or “as much as meat loves salt.”

The King balks at being equated with a lowly condiment and banishes her for her seemingly insufficient devotion. Then, one of two things usually happens: either her departure magically causes salt to stop coming into the kingdom, their supplies begin to dwindle and people begin to sicken and die until the daughter returns and feeds her ailing father a nourishing, salty broth or bit of bread spread with butter and sprinkled with salt and he realizes that she was the one who loved him best of all OR someone arranges to have a feast prepared without salt, and as course after course comes out of the kitchen completely inedible, the King realizes his error and welcomes his daughter back. In Ortiz’s case, it’s the father who loves salt too much and the son who doesn’t realize its value.

The crux of the trope is that it’s only after people are deprived of salt that they realize how important it is to their happiness, and everyone gets to live happily ever after. In the English version, called “Cap o’Rushes,” after the Lear bit, the story proceeds basically like the Grimm brothers’ “Allerleirauh,” or “All-Kinds-of-Fur.” After being banished from her father’s house, the daughter disguises herself in a cloak of rushes and becomes a servant in another nobleman’s home. He happens to have a son of marrying age, so there is a series of wife-seeking balls, Cinderella-style, and she’s the mysterious girl who steals his heart and disappears—though Cap o’Rushes manages to hang onto her shoes. Instead, the prince-figure gives her a ring, and when he falls into a deep depression because he doesn’t know how to find her, she prepares a stew or some gruel for him and slips the ring into it. Her identity is revealed and he proposes—and the interesting part is that the story doesn’t end there the way it normally would, not just in fairytales but in most English bildungsroman involving female protagonists until the 20th C. Boys become men and get jobs; girls become women and get married, The End. But in “Cap o’Rushes,” the resolution is about the salt as much as the marriage. The girl’s father is invited to the wedding, and she instructs the cooks to prepare her wedding feast without a grain of salt. By the last course, the man bursts into tears, finally realizing how much the daughter he sent away really loved him. The bride comes to his side, he recognizes her, she forgives him, and that’s what makes people happy ever after.

So here’s my proposal: if Felix Ortiz really wants restaurants to stop serving food prepared with salt “in any form,” I think that’s precisely what they should give him, but only him. I suppose, like the bill, what “in any form” means should be left up to the restaurants themselves, but I would encourage them to take a broad interpretation in case that’s how the court would choose to interpret it. So, no soy sauce or MSG, although I suppose we can let non-sodium salts like cream of tartar slide. But definitely no ham, bacon, salami, pepperoni, mortadella, corned beef, pickles, or kippered herring. No meats that have been brined, rubbed with salt, or dipped in a salted batter before cooking—let him taste what fried chicken and blackened fish are like without salt, what pulled pork is like without salt in the dry rub, and what roast chicken is like without any salt rubbed under the skin. No Chinese-style tofu (silken tofu, which is often made from soy milk coagulated with acid instead of salt, could theoretically be okay, but be sure to check the label). No ketchup, mustard, mayonnaise, salad dressing, or cheese unless they’re house-made and can be made without salt. The same goes for pasta, bread, pastries, and puddings. No salt in the patty of any burger or in eggs cooked any style. No packaged potato or corn chips, pretzels, crackers, or cookies. No soups made with bouillon, no canned tomatoes. He can have them at home, but not at any establishment that would be covered by the ban. And catering counts as restaurant-prepared food, too.

If he goes to a noodle bar for ramen, he should be served a bowl of unsalted noodles in a salt-less broth with unsalted toppings. If he orders a BLT, he can have salt-free bread with lettuce, tomato, and salt-less mayonnaise—if there’s no salt-free bread or mayonnaise available, just the lettuce and tomato. Let him try salting cheesecake, ice cream, caramels, cookies, and croissants to taste at the table with a salt shaker. I don’t expect him to burst into sobs in the manner of Cap o’Rushes’ father, but we’ll just see how long it takes before he reconsiders the wisdom of banishing salt from the kitchens of New York.

Don’t Drink the Agave-Sweetened Kool-Aid Part I: “Natural” my foot

UGH the subtitle. I really want Ms. Catalano to show me exactly where in "nature" she gets her agave nectar. Also, I find the use of "ultimate" to mean "exemplary" or "best" instead of "final" or "last" grating, but that's a petty battle against usage change that "Ultimate Frisbee" has clearly already won. Still, I like to think of it as "Frisbee for the End Days"

Just as "wholesome" as any other hydrolyzed, refined sweetener. If you've been snarky about the Corn Refiners' Assn's recent "Sweet Surprise" marketing campaign, but have a bottle that looks like this in your cupboard, I have some delicious all-natural snake oil to sell you, good sir or madam.

This entry was nearly titled “Things That Might Not Kill You In Moderation But Certainly Won’t Make You Any Healthier Vol. I,” or “Hydrolyzed, Refined Sweeteners Masquerading as ‘Natural,’ Whole Foods,” but those seemed a little unwieldy. They do, however, capture the essence of the argument: agave is nutritionally no better than most other refined sweeteners, including high-fructose corn syrup (HFCS). If anything, it’s probably worse because it contains more fructose than table sugar or HFCS. It’s also no more or less “natural” than HFCS—it’s actually produced in a remarkably similar process that was first used on the fibrous pulp of the agave in the 1990s. While, as its proponents claim, the higher proportion of fructose has enabled people to call it a “low glycemic index sweetener,” sometimes alleged to be safer for diabetics and recommended by weight-loss programs like Weight Watchers, recent research suggests that large amounts of fructose aren’t healthy for anyone, diabetic or otherwise.

I mentioned agave nectar in passing in the HFCS post, but there’s enough conflicting information about it to merit its own post(s). A lot of the misinformation comes from agavevangelists, who can sometimes get a little sanctimonious about their avoidance of the demon HFCS and preference for “natural” sweeteners. Even this Vegfamily article that concludes “the physiological effects of all [caloric] sweeteners are similar” nonetheless claims:

Given the choice between sugar, HFCS, and agave nectar, I’ll stick with organically-grown, unbleached cane sugar (evaporated cane juice) and organic raw agave nectar that are free of pesticides, herbicides, and chemical bleaching agents; not genetically engineered; and still retains some nutrients, as well as being vegan. Since HFCS is not available in organic form and is highly processed, I would never use it.

But agave nectar is just as processed as HFCS.

HFCS and Agave Nectar: One of These Things is Not Almost Exactly Like The Other

1910 magazine advertisement

Like most starches, corn starch consists of large glucose polymers—70-80% the branched, non-water-soluble amylopectin and 20-30% linear, soluble amylose. Normal or non-HFCS corn syrup, like Karo, is produced by breaking those polymers down into their constituent glucose molecules using acids, enzymes, and/or heat. For the history buffs: the acid hydrolysis of starch was first discovered because of the 1806 British blockade of the French West Indies. Napoleon I offered a cash reward for anyone who could come up with a replacement for cane sugar, and a Russian chemist named Konstantin Kirchhof found he could produce a sweet syrup from potato starch by adding sulfuric acid. The same process was first applied to corn in the mid-1860s, and gained popularity in the U.S. during the sugar shortages of WWI (source: The Oxford Encyclopedia of Food and Drink in America).

HFCS is produced by converting some of the glucose into fructose using an enzyme technology developed in Japan in the 1960s (detailed here). The resulting syrup, which contains up to 90% fructose, is then typically mixed with corn-based glucose syrup to produce HFCS-55 (the kind used in soft drinks, which is 55% fructose/45% glucose) or HFCS-42 (the kind used in baked goods, which is 42% fructose/58% glucose). Some people, like Cynthia commenting on Daily Candor, have suggested that the fructose and glucose in HFCS are absorbed into the bloodstream faster because they’re “free” instead of bound the way they are in the disaccharide sucrose, which has to be broken into glucose and fructose by the enzyme sucrase. Theoretically plausible, but apparently not true:

Sucrose is hydrolysed by brush-border sucrase into glucose and fructose.
The rate of absorption is identical, regardless of whether the sugar is presented to the mucosa as the disaccharide or the component monosaccharides (Gray & Ingelfinger, 1966, cited by H. B. McMichael in “Intestinal absorption of carbohydrates in man”).
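Backing up a step: the HFCS-55 blending described above implies a simple mixing calculation. Here's a sketch; the 90%-fructose enriched syrup and the all-glucose corn syrup come from the description above, while the function name and structure are my own illustration:

```python
def blend_fraction(target_fructose, rich=0.90, glucose_syrup=0.0):
    """Fraction of the 90%-fructose enriched syrup needed so the final
    blend hits the target fructose share.

    Solves x*rich + (1 - x)*glucose_syrup = target for x.
    """
    return (target_fructose - glucose_syrup) / (rich - glucose_syrup)

# HFCS-55 (55% fructose) requires roughly 61% enriched syrup in the blend
x = blend_fraction(0.55)
print(round(x, 3))  # 0.611
```

So a bit under two-thirds of the final HFCS-55 syrup is the enzymatically enriched fraction, with the remainder being plain glucose syrup.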

[Image: I'm going to start referring to packaging like this as "granola-washing"]

Just like HFCS, agave nectar is produced by breaking down a plant-based polymer into its constituent sugars. In the case of agave, the relevant molecule is inulin, a fiber composed mostly of fructose units with a terminal glucose. Just like with corn and potato starch, there are different methods of hydrolyzing the sugars in inulin. Blue Agave Nectar uses a thermic process; Madhava uses an enzymatic process, just like HFCS.

Agavevangelists like to claim that agave nectar is a traditional sweetener used by native peoples, which appeals to the popular notion that the foodways of the past were generally healthier (e.g. Michael Pollan’s advice not to eat anything your great-grandmother wouldn’t recognize as food). Some, like Lynn Stephens of Shake Off the Sugar, merely note that the agave plant itself “has long been cultivated in hilly, semi-arid soils of Mexico.” That’s true, although it’s about as relevant as the long history of corn cultivation. Others claim that agave nectar itself has an ancient history. Flickr user Health Guy says of agave nectar: “It is 1-1/4 times sweeter than sugar, so you need less, and it has been consumed by ancient civilizations for over 5,000 years.”

Wrong. According to the website for Madhava Honey:

Agave nectar is a newly created sweetener, having been developed during the 1990’s. Originally, the blue agave variety was used. This is the same plant used in the manufacture of tequila. During the late 90’s, a shortage of blue agave resulted in huge increases in cost and a sweetener based on this plant became uneconomical. Further research was done and a method using wild agave was developed. Overcoming the language barrier between the Indians able to supply the nectar from the wild agave on their land and the Spanish speaking local manufacturer was the key that finally unlocked a supply of raw material and has led to our bringing this wonderful new product to market.

Still doing some native-washing (wild agave harvested by Indians who don’t speak Spanish—can’t you just feel the virtue?), but here’s what happens to the agave sap after harvesting, as described in the abstract of the 1998 patent issued for the production of fructose syrup from the agave plant:

A pulp of milled agave plant heads are liquified during centrifugation and a polyfructose solution is removed and then concentrated to produce a polyfructose concentrate. Small particulates are removed by centrifugation and/or filtration and colloids are removed using termic coagulation techniques to produce a partially purified polyfructose extract substantially free of suspended solids. The polyfructose extract is treated with activated charcoal and cationic and anionic resins to produce a demineralized, partially hydrolyzed polyfructose extract. This partially hydrolyzed polyfructose extract is then hydrolyzed with inulin enzymes to produce a hydrolyzed fructose extract. Concentration of the fructose extract yields a fructose syrup. (via Patentstorm)

[Image: Probably the healthiest sweetener pictured here, and the one most shoppers in the market for a "natural sweetener" would be least likely to purchase]

It’s true that the corn used in HFCS is less likely than agave to be organically grown, but you can get organic-certified corn syrup from the same manufacturer as the blue agave nectar pictured above. Nutritionally, the main difference between that, the HFCS used in most processed foods, and agave nectar is the ratio of glucose to fructose: regular corn syrup is 100% glucose, HFCS is usually 42-55% fructose, and agave nectar is 56-90% fructose, depending on the plant and the process.

I’ve already talked a little about fructose vs. glucose here and here, but more coming soon in Agave-rant Part II concerning:

1) whether the fructose in agave is somehow better than, or indeed, different in any way from the fructose in HFCS

2) whether the fact that it’s sweeter than sugar makes it a lower-calorie alternative to sugar

3) whether its “low glycemic index” rating makes it less likely to produce insulin resistance than table sugar and

4) whether it’s safer for diabetics

All of which people have claimed. I won’t keep you in suspense, especially given how long it may take me to put all of that together. The short answers are:

1) not in any nutritionally meaningful way

2) perhaps very slightly, but a <10 calorie/serving difference likely doesn’t make up for the increased risk of fatty liver disease and insulin resistance

3) no, it’s actually more likely to produce insulin resistance and

4) in minuscule amounts, perhaps, but recent trials involving diabetics and agave nectar were halted because of severe side effects.
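On point 2, the arithmetic is easy to sketch. The figures below are rough assumptions for illustration (about 48 calories per tablespoon of sugar, about 60 for the denser agave nectar, and a relative sweetness somewhere between the 1.25× quoted above and the 1.5× some vendors claim), not measured values:

```python
SUGAR_CAL_PER_TBSP = 48   # assumed: ~16 cal per teaspoon of table sugar
AGAVE_CAL_PER_TBSP = 60   # assumed: agave nectar is denser, ~20 cal per teaspoon

def agave_calories_at_equal_sweetness(relative_sweetness):
    """Calories of agave needed to match the sweetness of 1 tbsp of sugar."""
    return AGAVE_CAL_PER_TBSP / relative_sweetness

for sweetness in (1.25, 1.4, 1.5):
    saved = SUGAR_CAL_PER_TBSP - agave_calories_at_equal_sweetness(sweetness)
    print(f"{sweetness}x sweeter: saves {saved:.1f} cal per sugar-equivalent tbsp")
```

Even under the most generous sweetness assumption, the saving is only about 8 calories per tablespoon-equivalent, consistent with the "<10 calorie/serving" answer above.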

Things That Won’t Kill You Volume 4: Saturated Fat Part II: Cholesterol Myths

[Image from the article quoted below]

In retrospect, this probably could have been an entirely separate article in the "things that won’t kill you" series, as many people still believe that dietary cholesterol (i.e., cholesterol in food) is a bad thing. For example, the article that image was taken from claims:

If you get too much dietary cholesterol (over 300mg a day) the extra cholesterol will accumulate in the walls of the blood vessels, making your LDL (bad) blood cholesterol levels rise. Over time, your arteries will become narrower, which can cut off the blood supply to your heart (causing a heart attack), or your brain (causing a stroke).

However, that’s pretty easily dismissed—even Ancel Keys, "Monsieur Cholesterol" himself, never argued that dietary cholesterol was related to serum cholesterol or heart disease. In a 1952 article in Circulation, the journal of the American Heart Association, Keys noted that although rabbits and chickens that eat high-cholesterol diets will develop high cholesterol and atherosclerosis, or hardening of the arteries:

No animal species close to man in metabolic habitus has been shown to be susceptible to the induction of atherosclerosis by cholesterol feeding…. Moreover, even in the favorite species for such  experimentation, the herbivorous rabbit, the necessary concentration of cholesterol in the diet is fantastically high in comparison with actual human diets. Moreover, there is reason to believe that man has a greater power of cholesterol regulation than does the rabbit or the chicken. From the animal experiments alone the most reasonable conclusion would be that the cholesterol content of human diets is unimportant in human atherosclerosis.

Two "moreovers" in one paragraph, people! “Most reasonable conclusion”! Moreover, five decades of subsequent research haven’t given anyone any reason to think differently. In 1997, Keys was even more direct:

There’s no connection whatsoever between cholesterol in food and cholesterol in blood. And we’ve known that all along. Cholesterol in the diet doesn’t matter unless you happen to be a chicken or a rabbit.

Research done in the interim on the relationship between diet and heart disease in humans, like the Framingham and Tecumseh studies, showed no relationship between cholesterol consumption and blood cholesterol or heart disease. I’m not even going to modify this with "probably" or "as far as we know": there is no reason to believe that how much cholesterol you eat has any effect on your health.

But that doesn’t stop the AHA from recommending that “most people…limit cholesterol intake to less than 300 mg per day” and claiming that “an egg can fit within heart-healthy guidelines for those people only if cholesterol from other sources — such as meats, poultry and dairy products — is limited,” despite repeated studies showing that egg consumption is not associated with higher serum cholesterol, myocardial infarction, cardiovascular disease, or all-cause mortality.

Backing up for a second: Ancel Keys, wherefore art thou Monsieur Cholesterol?

The reason Ancel Keys was called "Monsieur Cholesterol" wasn’t because his theory had anything to do with cholesterol in food; it was because his theory depended on the idea that saturated fat consumption causes blood cholesterol levels to increase, presumably putting people at risk of heart disease.

If you read Part I of this article, you may remember that I said there were three things that convinced me that saturated fat wasn’t a cause of heart disease. I explained the first two in that entry. (To recap, they were: 1) the fact that people in places like France and the Pacific Islands eat way more saturated fat than Americans but have much lower rates of heart disease and 2) the fact that the study that first led people to believe saturated fat was the cause of heart disease was bad science that has since been discredited–not that that’s stopped people who think saturated fat is bad from citing it all the time anyway).

The third reason is that there’s no evidence supporting the proposed mechanism—meaning the idea that saturated fat causes heart disease by raising serum cholesterol.

You’d never know that from the mainstream media reporting on the research. Take, for example, this 1998 US News and World Report cover story, which describes the Framingham Study and claims:

Thanks to Framingham, Americans have come to understand that how they live often determines when they’ll die. After 50 years, 1,000 research papers, and $43 million, the Framingham Heart Study has shown that smoking is bad for the heart, that high blood pressure is not a normal consequence of aging, and that high cholesterol leads to heart disease. They know that women are at risk for cardiovascular disease, though later in life than men. They know that diabetes is a risk factor (a term coined by the study), that weight affects blood pressure, and that eating too much saturated fat affects cholesterol.

Compare that to what William Castelli, the director of the Framingham Study, wrote in a 1992 article in the Archives of Internal Medicine (quoted here):

In Framingham, Mass., the more saturated fat one ate, the more cholesterol one ate, the more calories one ate, the lower the person’s serum cholesterol.

The people who ate more saturated fat and calories were also more active, which might explain the results, but certainly doesn’t explain the US News and World Report article claiming the opposite.

The Tecumseh Study, which compared dietary habits with serum cholesterol and triglyceride levels, found no significant relationship between saturated fat consumption and cholesterol levels (see the chart on page 3, 1386 in the original).

And Just to Complicate Things Further…

The proposed mechanism relies on two causal relationships: 1) saturated fat consumption—> increased serum cholesterol and 2) increased serum cholesterol—> cardiovascular disease. I’ve just explained why the evidence for the former is, at best, conflicting, and that alone would undermine the lipid-heart hypothesis. But it turns out the evidence for the second part of the mechanism is also complicated.

Castelli, again:

Cholesterol levels by themselves reveal little about a patient’s coronary artery disease risk. Most infarctions occur in patients who have normal total cholesterol levels. (From the American Journal of Cardiology)

The popular theory about cholesterol basically imagines that people’s arteries are like pipes and cholesterol and fat are like grease that can gradually build up and narrow those pipes. Eventually, the arteries get clogged, and pieces of the plaque that break off, or blood clots, can get caught in those greased-up pipes and cause heart attacks and strokes.

It’s true that heart disease is generally caused by the buildup of a fatty plaque in the arteries, but cholesterol and fat don’t necessarily stick to and harden or clog arteries—not even so-called “bad cholesterol,” or LDL. Oxidized LDL is what accumulates in white blood cells and becomes what are called “foam cells,” which make up atherosclerotic plaque. Oxidized LDL also causes inflammation, which has been a major focus of recent research on cholesterol and heart disease. There is a much more complicated explanation, complete with citations from the relevant research, here.

So the key to figuring out what causes heart disease is figuring out what causes (or prevents) the oxidation of LDL, not what causes increased levels of LDL qua LDL. Perhaps the most worrying finding is that one thing that seems to cause the oxidation of LDL is linoleic acid, a polyunsaturated fatty acid found primarily in vegetable oils. Saturated fat, on the other hand, actually seems to have a protective effect.

The literature is pretty complex, and I won’t pretend to have taken the time to parse out everything about atherosclerosis and cholesterol and essential fatty acids and endothelial cells. Nonetheless, I’ve been sufficiently moved by everything I’ve read to start using more butter and lard when I cook and seriously reconsider using vegetable oils anytime I’m preparing a meal that also includes substantial saturated fat. Because, again, there is no indication that the total volume of saturated fat or cholesterol one consumes increases the risk of heart disease or mortality, but a particular fatty acid found in vegetable oils oxidizes cholesterol, which does contribute to heart disease. Saturated fats and HDL or “good cholesterol” actually prevent oxidation and atherosclerosis.

That makes some sense with population studies too—populations that traditionally consumed large quantities of saturated fats and dietary cholesterol (Pacific Islanders, the Masai in Africa, the French) generally did not rely heavily on vegetable oils; populations that consumed large quantities of vegetable oils and fish oils (Mediterranean populations, the Japanese) generally consumed relatively little saturated fat and cholesterol.

That also means that eating lots of red meat, milk, and butter or other sources of saturated fat and cholesterol while also eating lots of olive oil, canola oil, and other sources of linoleic acid would be the worst combination possible. It would be a supreme irony if it turned out that one of the primary causes of atherosclerosis and the heart disease associated with it was the olive oil and vegetable oil that public health authorities have been urging a red-meat-eating people to substitute for animal fats for the last sixty years. 

Next up in this series…trans-fats and why they might actually kill you.

Salt Headlines That Make The Vein In My Forehead Throb

Salt has been all over the news this week because of a study just published in The New England Journal of Medicine claiming that if everyone in the U.S. reduced their sodium consumption by 3 grams/day, there would be 32,000 fewer strokes, 54,000 fewer heart attacks, and 44,000 fewer deaths every year. The story that got my attention was:

Remaining Arctic Ice Seen Melting Away Completely! (...on a computer screen)

That’s surprising, I thought. Everything I’ve read suggests that the relationship between salt consumption and cardiovascular disease is weak, inconsistent, and probably only valid for 20-30% of the population. So I expected the article to refer to some new research where, you know, “big benefits” were “seen.” As in observed. Like, in the world. And, given the claim about the magnitude, probably also measured.

To their credit, the authors of the study claim no such thing. The numbers are projections based on the application of several assumed effects of salt reduction, adjusted for different demographics and then applied to a model of the entire U.S. population. Thus, the title of the study: “Projected Effect of Dietary Salt Reductions on Future Cardiovascular Disease.”

The article seems to grasp the essentially speculative nature of the findings. The very first sentence uses the conditional tense:

…scientists writing in The New England Journal of Medicine conclude that lowering the amount of salt people eat by even a small amount could reduce cases of heart disease, stroke and heart attacks as much as reductions in smoking, obesity, and cholesterol levels.

The headline, on the other hand, seems to have confused the “scientists” with clairvoyants. Never mind doing any checking into the validity of their assumptions.

And the claim about how the benefits compare to smoking and obesity reduction led to a few headlines like this:

[Image: WebMD headline about salt]

This crazypants idea initially sounds a lot like what the study’s lead author claims:

"The cardiovascular benefits of reduced salt intake are on par with the benefits of population-wide reductions in tobacco use, obesity, and cholesterol levels."

But the logic behind the claim is that a small improvement in the health of every single American, added up across the whole population, would be as significant as a large improvement in the health of every single smoker:

Dr. Bibbins-Domingo said that for many people the decrease in blood pressure would be modest, which is why, she said, “many physicians have thrown up their hands and said, ‘I’m not going to advise my patients to reduce salt because it’s too hard for patients and the benefits for any individual are small.’

“But small incremental changes in salt, such as lowering salt in tomato sauce or breads and cereals by a small amount, would achieve small changes in blood pressure that would have a measurable effect across the whole population,” she said. “That’s the reason why this intervention works better than just targeting smokers.”

For any given individual, there is no question: cutting salt is not even close to “as good” as quitting smoking. The evidence for the link between smoking and lung cancer and death is strong, reliable, consistent, and has a clear causal mechanism (carcinogens). The link between salt and cardiovascular disease and death is weak, inconsistent, and still poorly understood.
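Dr. Bibbins-Domingo's population-level logic can be illustrated with a back-of-the-envelope calculation. The risk-reduction figures here are hypothetical, chosen only to show how a tiny effect applied to everyone can rival a large effect applied to a subgroup; they are not from the study:

```python
# Hypothetical illustration -- none of these risk numbers come from the NEJM study.
US_POPULATION = 300_000_000
SMOKERS = 45_000_000

SALT_RISK_REDUCTION = 0.0002      # assumed: 0.02% fewer events per person per year
SMOKING_RISK_REDUCTION = 0.0013   # assumed: 0.13% fewer events per smoker per year

salt_events_averted = US_POPULATION * SALT_RISK_REDUCTION
smoking_events_averted = SMOKERS * SMOKING_RISK_REDUCTION

print(int(salt_events_averted), int(smoking_events_averted))  # 60000 58500
```

Population-wide, the two interventions come out comparable, even though the per-person benefit of the salt reduction is less than a sixth of the per-smoker benefit of quitting. That is the whole argument, and also its weakness: it depends entirely on the assumed per-person effect being real.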

That latter point starts to get at the problems with the study itself, and not just the headlines it inspired. A number of the assumptions the projection was based on are either demonstrably false or simply unsubstantiated. More on this some other time; for now, a few quotes and links to the essays they come from in Esquire and the medical journal Hypertension:

In a more recent statement, the founder of the American Society of Hypertension, Dr. John Laragh, goes further: "Is there any proven reason for us to grossly modify our salt intake or systematically avoid table salt? Generally speaking the answer is either a resounding no, or at that, at best, there is not any positive direct evidence to support such recommendations."

Studies show that 30 percent of the Americans who have high blood pressure would greatly benefit from a low-sodium diet. But that’s about 10 percent of the overall population — the rest of us are fine with sodium. And drastically cutting out sodium may actually hurt some people. ( "Go Ahead, Salt Your Food")


The available data suggest that the association of sodium intake to health outcomes reflected in morbidity and mortality rates is modest and inconsistent. Therefore, on the basis of the existing evidence, it seems highly unlikely that any single dietary sodium intake will be appropriate or desirable for each member of an entire population…. The decision to adopt a low sodium diet should be made with awareness that there is no evidence that this approach to blood pressure reduction is either safe, in terms of ultimate health impact, or that it is as effective in producing cardioprotection as has been proven for some drug therapies. (Salt, Blood Pressure, and Human Health)

Things That Won’t Kill You Volume 4: Saturated Fat, Part I

[Image: I know this is misleading because lard is mostly unsaturated, but it's been one of the major icons of "bad" fat, and also, how great is this image?]

This is probably going to be an even harder sell than MSG, but I swear I’m not just trying to be contrary. It’s true that all the major sources of public health and nutrition advice, including the Harvard School of Public Health, Mayo Clinic, CDC, and American Heart Association continue to refer to saturated fats as "bad fats" and suggest that people avoid them as much as possible, limit them to <7-8% of their total caloric intake, and replace them with "good," i.e. unsaturated fats, whenever possible. It’s also true that there are a few studies that suggest that increased saturated fat consumption is correlated (albeit weakly) with cardiovascular disease (CVD).

However, many other studies have found no increase in CVD associated with saturated fat consumption. And several recent review articles have concluded that the evidence for a connection between saturated fat consumption and CVD is inconsistent, insufficient or nonexistent.

There are basically three things that have convinced me that saturated fat isn’t independently responsible for heart disease or death:

1) groups of people who eat vastly more saturated fat than most Americans frequently have lower rates of CVD—or no apparent CVD at all

2) the study that first inspired many people to think saturated fat was a bad thing had a lot of obvious flaws and has been thoroughly discredited

3) there’s no good evidence that the proposed mechanism actually works—briefly: saturated fat supposedly causes heart disease by raising serum cholesterol levels because cholesterol is what clogs arteries and causes heart attacks and strokes, but serum cholesterol turns out to be a really poor predictor of CVD 

1) The French (and Polynesian and Melanesian and Masai and Fulani and Sri Lankan) Paradox 

[Photo by Arun Ganesh]

The most confounding phenomenon for the theory that saturated fat consumption causes heart disease (sometimes called the lipid hypothesis, lipid-heart hypothesis, or diet-heart hypothesis) is the virtual non-existence of CVD in multiple populations that eat way more saturated fat than most Americans. This has primarily been documented in the Pacific islands, where coconuts, which are very high in saturated fat, are a staple food. For example, before the 1970s, the inhabitants of Tokelau got an estimated 55% of their calories from saturated fat, but heart disease was virtually unknown (according to Gary Taubes and Stephen Guyenet; notably, since their diet has shifted to include less saturated fat but more sugars and refined carbohydrates, many health indicators have worsened).

Many proponents of "natural" foods have already embraced the coconut as a source of "healthy" fat, but the confounding phenomenon isn’t unique to coconut-eaters. The Masai of Kenya and Tanzania and Fulani of Nigeria, whose traditional diet is composed primarily of cow’s milk, meat, and blood and estimated to be 33% saturated fat, also had virtually no incidence of CVD—or didn’t until they started eating refined carbohydrates. Red meat and dairy sources of saturated fat are about as classically "unhealthy" as you can get, and yet don’t necessarily cause heart attacks and death.

The differences can’t be attributed, at least entirely, to genetics, and not just because the phenomenon is consistent among so many far-flung populations. The Masai are known to be highly genetically diverse due to conflict/intermarriage, and all of the populations have experienced changes in their health as their diets have changed.

Nor are the changes likely due to higher activity levels, cyclical feast and famine, average lifespan, or the absence of other risk factors like cigarette use in these populations. A 1993 study of diet and health on Kitava, one of the Trobriand Islands of Papua New Guinea, found that Kitavans get an average of 17% of their calories from saturated fat (compared to about 10% for Americans, or the <7% recommended by the AHA), tend to be about as active as moderately active Swedes, and have virtually no experience of food scarcity or shortage. Furthermore, 75% of the population smokes. Nonetheless, stroke and heart attack basically don’t exist there, even among the elderly, and Kitavans’ fasting insulin levels are significantly lower than Swedes’ in every age group.

Another potentially confounding fact that some low-carb proponents point to is the alleged decline in the average American’s consumption of saturated fat. It’s true that the consumption of some primary sources of saturated fat like butter and lard declined for most of the 20th Century; however, per capita red meat consumption grew until the 1970s, and according to the USDA, the proportion of saturated fat in the U.S. food supply has remained pretty steady for the last century. What is clear, at least, is that increasing rates of cardiovascular disease were not caused by increasing per capita consumption of saturated fats.

So why the hell have the institutions we trust to tell us how our diets affect our health spent the last half-century trying to convince us that saturated fat clogs your arteries, causes heart attacks, and might kill you?

2) It All Began With Some Falsified Data

[January 13, 1961 cover of Time magazine]

I know, I sound like a climate change skeptic, but this is way more blatant than some sketchy e-mails.

The research that first convinced many Americans and public health institutions like the AHA that saturated fats cause heart disease was the Seven Countries Study led by Ancel Keys. Keys and his colleagues carried out surveys of men (only men because the rate of CVD in women was considered too low to merit attention) in eighteen rural regions in Italy, the Greek Islands, Yugoslavia, the Netherlands, Finland, Japan, and the U.S. The regions were supposedly chosen because they were areas with stable and widely-contrasting diets. The researchers asked the men about their eating habits, performed chemical analysis on meals prepared by randomly-selected families, and compared their findings to the rate of disease and death in the different populations, with special attention given to CVD and "risk factors" like serum cholesterol levels. They concluded that higher levels of dietary saturated fat cause elevated serum cholesterol and heart disease. The especially low rates of CVD in Crete, despite relatively high proportions of dietary fat, were a large part of the impetus behind the "Mediterranean Diet" and the idea that olive oil is nutritionally sacrosanct.

However, as many people later pointed out, it seems more than coincidental that Keys only included regions where both saturated fat consumption and heart disease were high or where they were both low. Apparently, data was available for a number of other countries (anywhere between 15 and 27, according to various critics), perhaps most notably France, which would have been just as easy to study as Finland and Japan. Keys seemingly ignored, or systematically excluded, all confounding data from his analysis.  

Even for the countries that were included, according to statistician Russell H. Smith:

The dietary assessment methodology was highly inconsistent across cohorts and thoroughly suspect. In addition, careful examination of the death rates and associations between diet and death rates reveal a massive set of inconsistencies and contradictions…. It is almost inconceivable that the Seven Countries study was performed with such scientific abandon. It is also dumbfounding how the NHLBI/AHA alliance ignored such sloppiness in their many "rave reviews" of the study…. In summary, the diet-CHD relationship reported for the Seven Countries study cannot be taken seriously by the objective and critical scientist.

The problem seems to be that Keys had already decided that saturated fat was the problem. According to the Time magazine cover article (click the picture to read it), Keys had treated a heart disease patient with large knobs of what turned out to be cholesterol under his skin and very high serum cholesterol. When he put that patient on a low-fat diet, his cholesterol went down. Since cholesterol is what clogs arteries and causes heart attacks, and foods high in saturated fat are often also high in cholesterol, Keys concluded that the high rates of heart disease in American men were caused by the saturated fat and cholesterol content of their diets. In the Seven Countries study, he set out to prove that theory (as opposed to testing it).

Fifty years of subsequent research have not found a consistent or reliable correlation between saturated fat consumption and heart disease or death. This entry got a little out of hand, so I split it into a couple of parts. More coming soon on dietary cholesterol and the dreaded LARD, neither of which, it turns out, are likely to kill you or even probably hurt you a little bit. Also, an epilogue of sorts on trans-fats, which might actually kill you.

Things That Won’t Kill You Volume 3: MSG

[Image from Flickr user "The Other Dan," taken in Corktown, Toronto]

Unlike juice, which has sort of a mixed reputation even among contemporary nutritionists and doctors, MSG has been consistently demonized. Most people can’t tell you why, they just know that it’s bad. If pressed, they might tell you that it’s "unnatural," that food manufacturers put it in processed foods to con people into eating "junk," that it’s basically salt (which I’ll address in a future post in this series), or that it gives some people headaches. Or they might just gesture to the fact that it’s common knowledge that MSG is basically some kind of poison—after all, why would Chinese restaurants be so eager to reassure you that they don’t use it if it were completely benign?

A recent commercial for Campbell’s New Select Harvest Light (which is the sort of self-satirizing product name I’d expect to find in David Foster Wallace’s fiction) suggests that even if people don’t know what MSG stands for, they know that it’s bad—potentially bad enough to deter people from buying a particular brand. Reading from a Progresso Light can, blonde #1 gets through "monosodium" but stumbles on "glutamate"—fortunately, the rainbow coalition includes an Asian woman who can translate that jargon into something we all understand: "That’s MSG."

Although people may still associate it primarily with Chinese restaurant cooking, the Campbell’s ad hints at its broader prevalence—MSG and other forms of glutamic acid are omnipresent in processed foods. They’re especially likely to be found in foods designed to taste like things that have a lot of naturally-occurring glutamate (or similar molecules like inosinate or guanylate). Stock, broth, and bouillon often contain MSG, as does anything cheese-flavored or ranch-flavored, like Doritos, which actually contain five different forms of glutamate. I taste it the most in instant ramen and Chex Mix, but even though I know what it tastes like on its own, I can’t always tell whether something contains it. When used sparingly, it may not even be possible to detect, because whether the glutamate in a dish comes from a mushroom or a salt, once it’s dissolved in liquid or on your tongue, it’s the exact same molecule:

The glutamate molecule. Image credit: Wikipedia

So even people who think it’s "bad" and expect to feel bad after eating it probably eat MSG, at least from time to time, without even knowing it, and without suffering any negative effects.

The Chinese Restaurant Syndrome Myth

The first person to suspect that MSG might be unhealthy was a Chinese-American doctor named Ho Man Kwok, who complained in a letter to the New England Journal of Medicine in 1968 that he experienced numbness radiating from the back of his neck down his arms, weakness, and heart palpitations after eating at Chinese restaurants. He had never experienced those symptoms after eating at restaurants in China, and hypothesized that they might be due to the alcohol, the sodium, or the MSG in American Chinese cooking. The MSG explanation caught on, with one of the response letters estimating that as many as 30% of Americans regularly suffered bad reactions to MSG. The NEJM ran the letters with the title "Chinese Restaurant Syndrome," and by the next year, articles in Science and The New York Times were referring to the syndrome and its MSG etiology as verified facts:

"monosodium glutamate, which has been pinpointed as cause of ‘Chinese restaurant syndrome’ " (NYT May 10, 1969 Page 33, Column 1)

Last year, the New York Times ran an article that attempted to set the record straight. They quoted the daughter of Chinese restaurant owners in New York City in the 1970s, who remembered the publicity around "Chinese Restaurant Syndrome" as a "nightmare":

“Not because we used that much MSG — although of course we used some — but because it meant that Americans came into the restaurant with these suspicious, hostile feelings.”

Chinese restaurants were among the first in the U.S. to use MSG, which was mass-produced in Japan beginning in the early 20th Century after a scientist named Kikunae Ikeda isolated glutamate from seaweed-based soup stocks. By the 1940s, it had become increasingly common in processed foods and cooking styles around the world, including in the U.S. American soldiers who’d tasted Japanese army rations generally agreed that they tasted better than their own, and the difference was widely attributed to MSG. As the war industries were refitted for peacetime manufacturing, including the greatly-expanded industrial food system, there was a greater need for flavor enhancers that could make food taste good even if it was canned or wrapped in plastic and transported long distances. MSG was great at that. It was also sold to home cooks under the brand name Accent, which is still available in the spice aisle of many grocery stores, and as a major component of Maggi sauce, a Swiss brand, and Goya Sazon seasoning blends, popular in the U.S. primarily with Latino/a and Caribbean immigrants.

It’s not entirely clear why Chinese restaurants were singled out, aside from the random chance of Kwok having weird feelings after eating at them. MSG was then, and still is, everywhere in American food. I suspect it has something to do with a latent or repressed xenophobia. However, the sheer ubiquity of Chinese restaurants and the fact that MSG doesn’t really cause any physical symptoms were probably just as important—Cuban restaurants, where pork shoulder is often rubbed with a spice mixture that includes MSG, were never nearly as common as Chinese restaurants. And a "chicken stock, Doritos, bologna, and Stove Top stuffing syndrome" would have been far more difficult to accept for all the people who ate those things regularly without experiencing strange numbness and heart palpitations.

Which, of course, they generally don’t.

That’s Exactly What Forty Years of Research Has Found

No study has ever been able to find statistically significant correlations between the consumption of MSG and any of the symptoms associated with what was eventually re-named "MSG symptom complex" in 1995. According to a review article published in Clinical and Experimental Allergy in April 2009:

Descriptions of MSG-induced asthma, urticaria, angio-oedema, and rhinitis have prompted some to suggest that MSG should be an aetiologic consideration in patients presenting with these conditions…. Despite concerns raised by early reports, decades of research have failed to demonstrate a clear and consistent relationship between MSG ingestion and the development of these conditions.

Even studies involving self-identified "MSG-sensitive" subjects have failed to find a significant increase in the frequency of MSG-attributed symptoms. In one study, only 2 of 130 self-identified "MSG-sensitive" subjects reacted to MSG in all four of four challenges. Additionally, no one has ever found a plausible mechanism by which MSG, which is just the sodium salt of a naturally-occurring amino acid, would cause numbness or heart palpitations.

The Fat Rat Caveat

Peanut & Missy, from Flickr user "a soft world"

A decade before Kwok’s letter on "Chinese Restaurant Syndrome" was published, some scientists had begun doing research on the effects of MSG on mouse brains. In 1968, a neuroscientist named John Olney, also known for his work on aspartame, attempted to replicate earlier studies in which mice were fed massive amounts of MSG via feeding tube. The most dramatic result wasn’t in their brains, where he was looking, but in their bodies: the mice fed MSG became "obese" (which had a different medical definition in 1968 than it does now, but still referred to unusual fatness). Given that glutamate registers as "deliciousness," one might assume that the MSG-fed mice just liked their food a lot more and ate past satiety, but because the MSG was administered by feeding tube, taste shouldn’t have had anything to do with it. Based on his work, manufacturers voluntarily agreed to stop using MSG in baby food.

Subsequent studies have repeated the finding: mice and rats fed large amounts of MSG gain weight, and it’s not entirely clear why. As far as I can tell, the amount of food they consume is generally controlled, although if they have free access to water, perhaps they’re drinking like crazy to compensate for amounts of MSG as high as 10 g per day, out of 100 g of food total. However, the rodents in most of the studies are fed amounts of MSG that far exceed what even a human surviving on instant ramen and Doritos alone would consume. There’s no evidence that the amounts typically consumed as a flavoring do any damage to people, no matter how young. People all over the world eat MSG all the time, both in processed foods and home-cooked foods, seemingly without suffering any negative effects. The growing consensus among people who’ve looked at the research is that

"toxicologists have concluded that MSG is a harmless ingredient for most people, even in large amounts" (Harold McGee, On Food and Cooking, 2004).

But it does seem like vast amounts of MSG can cause weight gain, sluggishness, and lesions in the retina and hypothalamus. I’d advise against getting 10% of your daily food intake from MSG.
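To put that rodent dose in perspective, here’s a rough back-of-the-envelope calculation. The 10 g of MSG per 100 g of food comes from the studies discussed above; the daily human food intake (1.5 kg) and the weight of a teaspoon of MSG (about 5 g) are my own ballpark assumptions, not figures from any study:

```python
# Back-of-the-envelope: scale the rat studies' MSG dose to a human diet.
# The 10 g MSG per 100 g food ratio is from the rodent studies; the
# human figures below are rough assumptions for illustration only.

RAT_MSG_G = 10        # g of MSG in the rodent studies...
RAT_FOOD_G = 100      # ...per 100 g of total food
HUMAN_FOOD_G = 1500   # assumed total daily food intake for a person, in grams
TEASPOON_MSG_G = 5    # assumed weight of one teaspoon of MSG, in grams

msg_fraction = RAT_MSG_G / RAT_FOOD_G               # 10% of the diet
human_equivalent_g = msg_fraction * HUMAN_FOOD_G    # same ratio, human-sized
teaspoons_per_day = human_equivalent_g / TEASPOON_MSG_G

print(f"Rat-study equivalent for a human: {human_equivalent_g:.0f} g of MSG per day")
print(f"That's roughly {teaspoons_per_day:.0f} teaspoons of MSG, every day")
```

Under those assumptions, matching the rodent ratio would mean eating something like 150 g of MSG a day—dozens of teaspoons—versus the single teaspoon or so that might season an entire batch of snack mix.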

A Nutritional Yeast Connection?

A random suspicion I haven’t been able to confirm is that MSG might be similar in many ways to nutritional yeast, the worst-named ingredient in the world. Nutritional yeast, also known as "nootch," is primarily used by vegans and some vegetarians as a flavoring agent that adds a slightly cheesy, deeply savory flavor to things ranging from popcorn to sauces to seitan. It also makes a tasty breading for tofu.

According to Wikipedia, "Modern commercial MSG is produced by [bacterial] fermentation of starch, sugar beets, sugar cane, or molasses." Nutritional yeast, on the other hand, is "produced by culturing yeast with a mixture of sugarcane and beet molasses, then harvesting, washing, drying it." Obviously whatever bacteria they use to ferment MSG results in a different product, but I wonder if they aren’t just different iterations of the same process. Ferment some sugar and molasses; in one case, extract the salt composed of sodium cations and glutamate anions and ditch the bacteria that do the fermenting; in the other, keep the yeast. Perhaps? If anyone knows more about the similarities or differences between the two, let me know.

Image credit: Flickr user "Fenchurch!"

It definitely seems like MSG doesn’t have any of the nutritional benefits of nutritional yeast, which is full of vitamins, minerals, and protein, but it would still be a delightful irony to discover that the maligned substance behind a million Chinese restaurant disclaimers is related or comparable in any way to a crunchy, natural-food bulk bin staple.

I don’t use MSG often, largely because I prefer the yeasty flavor and nutritional benefits of nootch, but I don’t think homemade Chex Mix is nearly as good without a teaspoon or so of MSG, and a little bit can perk up lackluster soups and sauces. Most grocery stores still sell Accent, and increasingly carry Maggi sauce and Goya Sazon as well. You can also buy giant bags of pure MSG at Asian markets. If you use too much, it will make food excessively salty and overpower subtler flavors, so use a light hand and taste as you go.

More tips on how to use MSG and recipes in future entries.