
Fatness Is Strongly Influenced By Genetics, But How?

An image from twin studies featured in the talk by Jeffrey Friedman embedded below

A reader e-mailed me a good question about weight loss that’s outside my area of academic expertise but within the realm of stuff I’ve read enough about that I can offer some speculations and references. I am continually amazed at how complicated nutrition is and how much disagreement there is even among people who study it for a living. The only thing I can say with complete confidence is that anyone who tells you weight loss is simple and anyone can do it is (A) lying, (B) misinformed, (C) trying to sell you something, or (D) all of the above.

Here’s the e-mail I got:

If you have the time to answer a question…

I recently came across these articles claiming, more or less, that metabolism does not account for why some people are fatter or thinner than others. 

Article 1 [BBC Health]

Article 2 [Mayo Clinic]

I remember your posts on Sour Salty Bitter Sweet about dieting not being an effective solution to weight loss [here], and it sounded like you thought someone’s weight had more to do with genetic factors than lifestyle factors. Do I have that right? Would you disagree with the articles? Or point to non-metabolic genetic factors?

-Anna Macdonald

simple! right? that's why everyone is precisely the weight they want to be! Image credit: CDC.gov

Both articles essentially argue that it’s a myth that fatter people have “slow metabolisms” and burn fewer calories than thinner people. Basal metabolic rates vary based on age, gender, and body composition (or maybe just body composition, but that tends to vary based on age and gender), but as far as researchers can tell, fat people have at least roughly the same metabolic rate as thin people. They just eat more.

I think they neglect or dismiss a few complications too easily—in both diet studies and over-feeding studies, subjects lose/gain less weight than they should based on caloric arithmetic, usually by a significant margin. That’s usually attributed to shifts in thermogenesis, or how much heat you generate, and unconscious motions like fidgeting. If you eat more than you’re used to, your body may respond by getting slightly warmer and engaging in more restless activity. Eat less, and your body may respond by getting cooler and engaging in less activity. There’s some evidence that even for the very rare individuals who lose weight and keep it off long-term, basal metabolic rate remains depressed compared to people with the same relevant characteristics (weight, age, gender, and body composition) who were not previously fat (see NYTimes “The Fat Trap”). Incidentally, there’s research from as early as 1980 suggesting that people who maintain weight loss long-term are frequently monomaniacal about food and exercise, engaging in behaviors that might be seen as evidence of an eating disorder in thinner people.

But in general, the articles seem pretty accurate up to the point where they claim that people can lose weight if they eat fewer calories and exercise. The BBC article even claims that “people not only manage to lose weight but are able to keep control of it in the long term,” which is technically true—a small percentage of the people who lose weight by dieting do—but certainly isn’t the norm. Both articles make an unsubstantiated leap from the idea that basal metabolic rate is at least relatively stable and consistent to the idea that therefore, anyone can be thin if they only eat as many calories as a thin person burns. The key question they fail to address is why fat people eat more calories than thin people in the first place.

I suspect that’s because most people think they know the answer: they assume fat people have less willpower, knowledge, or motivation than thin people and therefore make bad choices about what and how much they eat. There’s a widespread assumption that if fat people knew better or tried harder, they could be thin. Many people, whatever their weight, believe that they themselves would probably be thinner if they ate better and exercised more and would be fatter if they ate whatever they wanted all the time and exercised less (which is actually probably true, but only within a small range). A lot of people even have personal experiences with weight loss or gain that they may be able to attribute to conscious choices or lifestyle changes. However, for most people, those changes prove to be temporary, and I think they overestimate how much control they actually have.

Fat people are not fat because they’re weak or lazy or unmotivated or unaware of the supposedly-dire medical consequences and actually-dire social consequences of being fat. Body size is strongly genetically determined and biologically-regulated. It may be sensitive to some environmental conditions, but that doesn’t mean it’s within individuals’ conscious control. If the tendency towards weight homeostasis doesn’t work by regulating how many calories people tend to burn, which I agree that it probably doesn’t, it must work by influencing how much people eat.

How Heritable Is Fatness? 

Also from the Friedman lecture below.

Very. Perhaps less than eye color, but more than other conditions widely seen as having a significant genetic component, like schizophrenia or alcoholism. Based on twin studies, one of the classic ways of evaluating the genetic component of all kinds of conditions, weight consistently appears to be approximately as heritable as height—most studies conclude that just under 80% of variation in weight and height is attributable to genetics. Furthermore, genetic influence consistently trumps environmental effects by a wide margin. In adoption studies, another way of evaluating genetic influence, children’s weights are strongly correlated with their biological parents’ and not at all with their adoptive parents’.

In Stunkard et al 1986, which compared approximately 4000 sets of male twins, the “concordance rates for different degrees of overweight were twice as high for monozygotic twins as for dizygotic twins.” In other words, the “identical” twins who share nearly 100% of their genetic material were twice as likely to have similar body types as “fraternal” twins who share only 50% of the same genes. At age 20, comparisons of height, weight, and BMI for both sets of twins yielded heritability estimates of .80, .78, and .77, respectively (1.0 would be perfectly heritable, .00 would be not heritable at all). At a 25-year follow-up, the heritability estimates for the same traits were .80, .81, and .84.

In another Stunkard et al 1986 study, which divided a sample of 540 adult Danish adoptees into four weight classes (thin, median, overweight, and obese), there were strong correlations between the weight class of the adoptees and that of their biological parents (p<.0001 for mothers, p<.02 for fathers). There was no correlation between the weight class of the adoptees and that of their adoptive parents.

In Stunkard et al 1990, the researchers used a Swedish database of twins born between 1886 and 1958, comparing those separated early in life with those reared together. They ended up with 93 pairs of identical twins reared apart, 154 pairs of identical twins reared together, 218 pairs of fraternal twins reared apart, and 208 pairs of fraternal twins reared together. The mean age at comparison was 58.6 years. The heritability estimates were similar to those in the 1986 study. Notably, twins reared together were no more similar than twins reared apart.

"The fourth estimate of heritability, the intrapair correlation for the monozygotic twins reared apart, was the most direct and perhaps the best estimate of the heritability of the body-mass index. It was 0.70 for men and 0.66 for women."

A review study done in 1997 by Maes et al looked at the data from 25,000 twin pairs and 50,000 biological and adoptive family members, finding BMI correlations of .74 for monozygotic twins, .32 for dizygotic twins, .25 for siblings, .19 for parent-offspring pairs, .06 for adoptive relatives, and .12 for spouses.
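If you’re curious where heritability numbers like these come from, the classic back-of-the-envelope method is Falconer’s formula: heritability is roughly twice the gap between the identical-twin and fraternal-twin correlations. Here’s a minimal sketch in Python using the Maes et al figures (the actual studies fit fancier statistical models, so treat this as an approximation, not their method):

    # Falconer's formula: a rough estimate of heritability from twin data.
    # MZ (identical) twins share ~100% of their genes, DZ (fraternal) twins
    # share ~50%, so doubling the gap between their correlations estimates
    # the genetic share of the variance.
    def falconer_h2(r_mz, r_dz):
        return 2 * (r_mz - r_dz)

    # BMI correlations from the Maes et al 1997 review quoted above:
    r_mz, r_dz = 0.74, 0.32
    h2 = falconer_h2(r_mz, r_dz)       # 2 * (0.74 - 0.32) = 0.84
    c2 = 2 * r_dz - r_mz               # shared-environment estimate: -0.10

    print(f"estimated heritability of BMI: {h2:.2f}")   # ~0.84
    # 0.84 is consistent with the .77-.84 estimates from the twin studies
    # above, and the shared-environment estimate is effectively zero, which
    # fits the finding that twins reared together were no more similar than
    # twins reared apart.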

Researchers have also been curious to see if the “obesity epidemic” has changed anything. Have environmental changes in the last few decades trumped genetic factors? Not really. Wardle et al 2008 evaluated 5092 sets of twins between the ages of 8 and 11 whose body measurements were taken in 2005. The heritability estimate for BMI was .77. In comparison, the shared-environment effect in the same study was estimated at .10.

As a 2008 review study of research on the heritability of fatness by Stephen O’Rahilly and I. Sadaf Farooqi concluded:

Hereditary influences on adiposity [fatness] are profound and continuing…. There is little serious doubt that the single most powerful determinant of inter-individual differences in adiposity is heredity.

Okay, But How Does It Work? Part I: Epigenetics

Genetics isn’t the whole story. We have known for a long time that children without access to adequate nutrition may have their growth "stunted," meaning they may never reach the adult height or weight they would have if they had been able to eat more as children. Dietary composition also seems to have an effect: populations with access to more protein (or calcium?) may grow taller or fatter than genetically-similar populations who consume less protein. The availability of highly-palatable, calorie-dense, high-sugar and high-fat food in countries like the U.S. may create the conditions for some people (though clearly not all) to become fatter than they would in another environment. However, how fat they get in that environment is still determined largely by genetics, just like how tall people get in the presence of ample protein is still largely determined by genetics.

There’s a lot of research being done right now on what are sometimes called "epigenetic" effects, which are factors that influence whether or not (and how) genes get expressed, without any changes happening in the genome itself. This is the idea that genes can get turned “on” or “off.” Some epigenetic effects are trans-generational, meaning something that affects a particular individual or population may only show up in their offspring. So, for example, a population that experienced a famine may have offspring who are more inclined to store fat when it’s available than a genetically similar population that didn’t live through a famine.

See: This Nature article or Herrera et al 2011

Okay, But How Does It Work? Part II: Leptin

Leptin-deficient child before and after leptin therapy

The expression of genes that affect body size probably involves changes in the endocrine system, and particularly the release or suppression of the hormones leptin and ghrelin, which control appetite and satiety. Leptin in particular seems to be crucial to the regulation of body fat. It was only discovered in the mid-1990s, so scientists are still trying to understand how it works and what the implications are.

Some extremely fat children, like the kid pictured on the right, have been found to be deficient in leptin. They have seemingly insatiable appetites—when presented with meals in excess of 2,000 calories, they’ll eat the whole thing and still be hungry. After receiving leptin injections, they eat age-appropriate portion sizes and lose weight rapidly without dieting or engaging in any formal exercise program.

One of the major differences between people who’ve lost weight through dieting and people who weigh the same without having dieted is in their leptin levels. Most of what I know about leptin (and most of the images in this post) comes from the following talk by the biologist who discovered the hormone, Dr. Jeffrey Friedman:

In the very beginning of the talk, he makes some causal claims about high BMI/adiposity and mortality that I disagree with, because I’m not sure the correlations are actually caused by fatness rather than social stigma, racism, poverty, lack of health insurance, etc. (all of which are also correlated with BMI). If fatness caused mortality, why would “overweight” people live longer on average than “normal” weight and “underweight” people, and why would “obese” people who are active live longer than “normal” people who are sedentary? He also says that even modest weight loss is associated with significant health improvements, and I wonder whether that claim is based on studies where participants begin eating more vegetables and exercising, lose something like 5 pounds, and get healthier overall, leading researchers to conclude that weight loss improves health when the weight loss itself may be totally meaningless. But once you get past that bit, he makes a pretty strong case for the genetic basis of body size and the role of leptin in the regulation of body fat.

Okay, But How Does it Work? Part III: Endocrine Disruptors

Just to complicate things even further, it turns out that a lot of the chemicals we’re exposed to can affect the endocrine system. Bisphenol A, the now-vilified chemical used primarily in plastics and also in the lining of aluminum cans, turns out to be an endocrine disruptor. Fluoride is also an endocrine disruptor. So are brominated fire retardants and many pesticides (even organic-certified ones, like copper sulfate).

Some of these may only affect people if they’re exposed at a particular point in their development—in utero, pre-adolescence, etc.—or at a particular dosage. So if your mom ate a lot of highly-acidic canned foods while she was pregnant with you, that might affect your thyroid function from birth. Or if you spent a lot of time on a rug treated with flame retardant chemicals as an infant, that might affect you, but maybe if you’d been 5 or 6 years old, it wouldn’t. Those are just hypothetical scenarios; the actual effects and doses of endocrine disruptors are not yet well understood or documented. So I’m not saying you should stop getting fluoride treatments. I suspect (and hope) that in another decade or so, we’ll have a better sense of how chemical exposure affects weight gain.

For more on endocrine disruptors and obesity, see Julie Guthman, Weighing In especially Chapter 5.

When Metabolism Matters: The Evidence From Overfeeding

If body weight is genetic, it should probably be nearly as difficult to gain weight as it is to lose it. Although it does seem to be possible to gain weight deliberately—some athletes and actors do this—it takes a lot of work. The results of overfeeding studies suggest that people who deliberately eat more than they normally would have to actively suppress their desire to stop eating, and they lose any weight they gain very easily as soon as they stop “overeating.”

BBC made an hour-long documentary about one of these studies, called “Why Are Thin People Not Fat,” which I first saw posted on Tom Naughton’s blog FatHead:

If you don’t want to watch it (spoiler if you do): ten thin people were told to eat twice as much as they normally do (the target caloric intake for men was approximately 5000 kcal/day) and refrain from exercise for four weeks. There was a lot of variation in the results—some gained more weight than others, some gained more fat than others. One of the participants gained muscle. None of them gained as much weight as they “should” have based on caloric arithmetic, meaning there must have been changes in their metabolism. Additionally, the subjects reported feeling pretty miserable: the amount of food they had to eat made them feel sick. At least one of them mentioned throwing up some of what he ate. They all got tired of milkshakes and chocolate and pork pies. And a month after the experiment was over, the participants had all lost most or all of the weight they gained during the experiment without engaging in any deliberate weight-loss strategies.
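To make the “caloric arithmetic” concrete: the usual rule of thumb is that a pound of body fat stores roughly 3,500 kcal, so a sustained surplus “should” produce a predictable gain. A minimal sketch (the 2,500 kcal/day baseline is my assumption for illustration; the documentary only specifies the ~5,000 kcal/day target):

    # Naive caloric arithmetic for the BBC overfeeding experiment.
    KCAL_PER_LB_FAT = 3500        # standard rule-of-thumb conversion

    baseline = 2500               # kcal/day -- assumed typical male intake
    target = 5000                 # kcal/day -- the documentary's target
    days = 28                     # four weeks

    surplus = (target - baseline) * days          # 70,000 kcal total
    predicted_gain = surplus / KCAL_PER_LB_FAT    # = 20 lb
    print(f"predicted gain: {predicted_gain:.0f} lb")
    # Nobody gained anywhere near 20 lb in four weeks, which is why the
    # missing calories get attributed to thermogenesis and unconscious
    # activity.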

The documentary mentions another overfeeding study known as the Vermont Prison Experiment. Researchers at the University of Vermont led by Ethan Sims initially tried to use students as subjects. They were told to eat 2-3x their normal caloric intake, but even after 5 months, most had increased their weight by only 10-12%. Sims’ goal was 25%, so he turned to inmates at the Vermont State Prison, who he describes as “equally dedicated volunteers.” After 200 days of eating up to 9-10,000 kcal/day, some of the participants were still not able to gain 25% of their starting weight. For the few who were able to gain 25% or more, maintaining the goal weight for any length of time required continuing to eat, on average, ten times the number of calories that should have been necessary based on simple caloric arithmetic. This is also explained by metabolic changes—whether through thermogenesis or unconscious activity, the men were burning vastly more calories than before despite being prevented from exercising. Again, after the study was over, the prisoners easily lost most or all of the weight they had gained.

In Conclusion

The genetic influence on weight seems to work primarily by affecting how much people eat, not how many calories they burn. Fat people burn more calories than thin people, but they also eat more than thin people. That doesn’t mean that fat people “overeat.” Most people, fat or thin, maintain a relatively stable weight over long periods of time. If fat people were eating more calories than they typically burn, presumably they would be constantly gaining weight. Appetite and satiety are governed by biology, not willpower. Most people seem to be capable of consciously and deliberately reducing or increasing their caloric intake temporarily, but that’s difficult and unpleasant and virtually impossible to maintain long-term.

Diet Soda Follow-up: Are Diet Sodas Better For You Than Regular Soda?

Artificial sweeteners definitely pre-dated the "obesity epidemic." Saccharin was being used commercially in the early 20th C., and diet sodas were widely available by the 1960s. For more on the history of artificial sweeteners, see Carolyn de la Pena's brilliant book _Empty Pleasures_.

Soda cans from the 1970s from Found in Mom’s Basement

In response to the recent entry about the association between diet soda and fatness, Jim asked:

Has anyone proved that drinking Diet Soda is better for you than drinking Regular Soda? Does Diet Soda have the same impact on the body as drinking say a glass of water? I haven’t done any research on it and I don’t know if any is out there. I’d really like to see a study of what happens to obese people who stop drinking diet soda and switch to regular.

There’s a ton of research on artificial sweeteners, but I can’t find any studies in which obese people who habitually consume artificially-sweetened drinks were made to switch to sugar-sweetened drinks. That might be partially due to ethical/IRB concerns—it’s possible that asking people to consume more sugar than they were previously would be considered a significant health risk. On the other hand, there are studies in which subjects are randomly assigned to consume either artificial or caloric sweeteners, so maybe consuming regular soda falls into the realm of acceptable risk with informed consent.

In those kinds of studies, both “overweight” and “healthy”* individuals who consume regular sweeteners (usually sucrose or high-fructose corn syrup, which are nutritionally equivalent as far as we know) end up eating more calories overall than people who consume “diet,” artificially sweetened foods and drinks. The sugar/hfcs groups also gain weight and fat mass and have negative health indicators like increased blood pressure. I don’t think fatness is bad or that being thin is better, but based on the current available evidence, regular soda appears to be both more likely to make you fat and also worse for your health than diet soda.

*Stupid current labels for BMI categories that don’t correspond at all to actual health outcomes.(1)

A Closer Look at the Studies

This is apparently what Diet Coke looks like in Denmark. Or did in 2009. Pretty!

In a 2002 study from Denmark, 41 “overweight” men and women between 21 and 50 years old were assigned to two groups, matched for sex, age, weight, height, BMI, fat mass, fat-free mass and usual amount of physical activity. One group was given sucrose-sweetened dietary supplements (2 g/kg of body weight daily; 70% from drinks and 30% from solid foods) and the other was given artificially-sweetened dietary supplements (an equivalent amount of food and drink by weight sweetened with a combination of aspartame, acesulfame, cyclamate, and saccharin, collectively and individually far below intake levels generally regarded as safe). All the supplements were commercially-available foods and included soft drinks, flavored fruit juices, yogurt, marmalade, and stewed fruits. The researchers note that “great efforts were made before the intervention to find the most palatable artificially sweetened food products on the market for which a matching sucrose-containing product existed.” As some of the artificially-sweetened foods were also fat-reduced, subjects in the sweetener group were given additional butter or corn oil every week.

The study lasted 10 weeks. In addition to the supplements, subjects were free to consume whatever they wanted and as much as they wanted. The subjects visited the lab weekly to pick up the supplements and have urine samples taken (which were used to validate their dietary records). Their height, weight, and fat mass were measured every two weeks. They also kept food diaries that included ratings of their hunger, fullness, the palatability of the food they ate, and their sense of well-being over the course of each day in the week before the study began, the fifth week, and the tenth week.

Results: The sucrose group ate more calories overall than the sweetener group and got more of their calories from carbohydrates (58% compared to 44%). Both groups decreased how many carbohydrates they were eating in addition to the supplements, but the sugar in the supplements more than made up for the decrease in the sucrose group. The sucrose group gained an average of 3.5 pounds—which was, interestingly, only about half the weight gain that would have been predicted based on how many more calories they were eating. Their activity levels didn’t increase, so the most likely explanation is thermogenesis—i.e. their metabolism changed in response to the increased caloric intake. The group eating artificial sweeteners lost an average of two pounds. In the sucrose group, systolic and diastolic blood pressure increased; in the sweetener group, they decreased. There were no differences in appetite sensations, hunger, or satiety.
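Running the same naive caloric arithmetic backwards shows how big that gap is. A quick sketch (the 80 kg body weight is my assumption for illustration; sucrose provides roughly 4 kcal/g):

    # How much daily surplus does the observed gain actually imply,
    # compared to the sucrose the supplements supplied?
    KCAL_PER_LB_FAT = 3500

    observed_gain_lb = 3.5
    days = 70                                    # 10 weeks
    implied_surplus = observed_gain_lb * KCAL_PER_LB_FAT / days
    print(f"implied surplus: {implied_surplus:.0f} kcal/day")    # ~175

    body_weight_kg = 80                          # assumed, for illustration
    sucrose_kcal = 2 * body_weight_kg * 4        # 2 g/kg/day at ~4 kcal/g
    print(f"sucrose in supplements: {sucrose_kcal} kcal/day")    # 640
    # Even after accounting for the other carbohydrates subjects cut back
    # on, the observed gain was only about half of what the measured extra
    # intake predicted.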

Similarly, in a 1990 study done at the Monell Chemical Senses Center, a group of 30 subjects gained weight during a three-week period when they were given regular soda (sweetened with HFCS) and lost or maintained their weight during the two three-week periods when they were given diet soda (aspartame-sweetened) or no soda. In the regular and diet soda periods of the experiment, they were given 40 oz. of soda to drink every day. In the no soda period, they were told they could consume any beverages as they normally would. They also kept detailed dietary records for the duration of the experiment. The order of the 3-week periods was counterbalanced so some of them got regular soda first, some of them got the artificial stuff first, some of them had no soda for the first three weeks, etc. Here’s what the aggregate changes in their body weight looked like:

Tordoff and Alleva 1990 in the American Journal of Clinical Nutrition 51: 963-9 (graph appears on 965)

During both the regular and diet soda weeks, they decreased their dietary sugar consumption by an average of 33% (i.e. aside from the sugar in the soda).

Studies like these also point to what I suspect is the more likely explanation why there’s never been a study like the one Jim describes: there’s just not much debate about whether consuming calorically-sweetened drinks leads to weight gain and possible health risks (which shouldn’t be conflated—weight gain is primarily an aesthetic issue, and high levels of sugar consumption may lead to negative health outcomes whether or not they make you fat). What is up for debate is whether artificial sweeteners are a good substitute and likely to promote weight loss or also bad and contributing somehow to weight gain. And if they’re contributing to weight gain, how and how much?

There appear to be three types of theories about why artificial sweeteners might cause weight gain and/or other undesirable outcomes.

Theory #1: Artificial sweeteners might have direct metabolic effects

I don't know what this has to do with anything, I just thought the entry needed more pictures

It’s possible that although they have no caloric value, artificial sweeteners could affect blood sugar or insulin in ways that cause the body to store fat. This is the theory being tested in the study described in the previous entry in which mice consuming aspartame in amounts comparable to an average-sized woman drinking 20 oz of aspartame-sweetened soda per day had higher fasting glucose levels than mice on the same diet minus the aspartame. The effect could be chemical, but seems more likely to be an effect of the sweet taste—i.e. the perception of sweetness might affect the hormones that govern appetite and metabolic rate.

Evidence for this is still extremely scant. Not only is it unclear whether or not the effect is reliable, biologically significant, or occurs in humans; it’s also unclear if it’s specific to aspartame or an effect of all artificial sweeteners, if it scales such that a small amount of aspartame causes a smaller increase in fasting glucose or only occurs at a certain critical level of aspartame consumption, if it only occurs after regular daily consumption over a long period of time or after a single dose, if it affects all people in the same way or only “overweight” people, if it interacts with other dietary conditions (i.e. does it only happen in conjunction with diets high in corn oil, like the ones the mice in the study were fed?), etc.

There are studies involving rats that suggest some kind of metabolic effect of artificial sweeteners might promote weight gain. Rats fed artificially-sweetened yogurt consume more calories than those fed sugar-sweetened yogurt.

However, it seems like it might not work the same in people—or at least that the effect might be smaller. Note that in both of the studies described above, subjects given artificial sweeteners decreased both their overall carbohydrate and dietary sugar intake. Additionally, in a 2001 study done at the Pennington Biomedical Research Center in Baton Rouge, 31 subjects (19 lean, 12 obese) were given sucrose (493 kcal), aspartame (290 kcal), or stevia-sweetened (290 kcal) "preloads" before lunch and dinner on three separate days. Their food intake, satiety (how full they felt), and postprandial (after-meal) glucose and insulin levels were measured. When they had the lower-calorie, artificially-sweetened preloads, they did not compensate by eating more at either of the subsequent meals, and they reported similar levels of satiety as on the day they consumed the higher-calorie sucrose preload.
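That result is worth translating into numbers, because if people really don’t compensate, swapping the preloads creates a fixed daily deficit. A minimal sketch using the study’s figures:

    # Daily calorie difference if subjects don't compensate for the
    # lower-calorie preloads (one before lunch, one before dinner).
    sucrose_preload = 493       # kcal, per the Pennington study
    diet_preload = 290          # kcal (aspartame and stevia versions)

    daily_difference = 2 * (sucrose_preload - diet_preload)
    print(f"uncompensated deficit: {daily_difference} kcal/day")   # 406
    # An uncompensated ~400 kcal/day difference is exactly the kind of gap
    # the "sweeteners make you eat more" theories would have to erase, and
    # in this study it wasn't erased.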

Theory #2: Artificial sweeteners might have a psychological effect.

Another possibility is that drinking “diet” soda might make people believe that they can afford to eat more or nutritionally worse foods. This is similar to the “health halo” research being done by Brian Wansink and others, which has shown that people are more likely to underestimate the caloric content of foods they perceive as “healthy,” like a turkey sandwich from Subway, than they are with foods they perceive as unhealthy, like a Big Mac. They’re also more likely to order sides with the “healthy” choice that ultimately push the calorie content of the meals higher. Organic and “trans-fat free” labels or even just having calorie counts posted on a menu can have similar effects—triggering people’s dietary conscientiousness seems to cause many people to “treat” themselves to something extra.

However, it’s not clear that the “halo” effect actually influences total or long-term consumption. Thinking they’re getting the “healthier” sandwich may make people more likely to eat a bag of chips at that meal than they would have if they’d eaten a burger, but if that means they’re less likely to have an afternoon snack or they eat less at dinner, it might not affect their weight. I can’t find any studies that measure that.

Theory #3: Artificial sweeteners might change people’s palates

Artificial sweeteners might make people more accustomed to sweetness, which might cause them to eat more sweet things or sweeter things than they would otherwise. Since sweet things and the taste for them are seen as a kind of indulgence and not liking or eating sweet things is often constructed as proof of maturity, masculinity, or self-control, this is often described in morally judgmental terms like “infantilizing our taste sense” or “corrupting the palate.” But it’s not a theory entirely confined to blowhards. In an opinion piece in JAMA published in 2009, David S. Ludwig writes:

Individuals who habitually consume artificial sweeteners may find more satiating but less intensely sweet foods (eg, fruit) less appealing and unsweet foods (eg, vegetables, legumes) less palatable, reducing overall diet quality in ways that might contribute to excessive weight gain.

However, he admits that there’s no research showing this to be true. On the contrary, at least one study has found that people who consume artificial sweeteners regularly are more likely to eat foods generally considered to be healthy and less likely to consume foods generally considered to be fattening. According to a 2006 study done by the American Cancer Society as part of a larger project involving 1-2 million men and women who weigh 40% or more above average for their age and height, those who reported using artificial sweeteners also ate chicken, fish and vegetables significantly more often than non-users and consumed beef, butter, white bread, potatoes, ice cream and chocolate significantly less often. That study also found that artificial sweeteners were associated with weight gain. Given that their diets were apparently “healthier,” the authors conclude: “our weight change results are not explicable by differences in food consumption patterns,” perhaps implying that artificial sweetener might indeed be the culprit.

I think their data suggest something entirely different: people who drink diet soda are more likely to be dieters. They’re eating more of the stuff everyone tells them they ought to be eating to lose weight, and less of the stuff they’re supposed to avoid. It’s not working, and they’re getting fatter anyway, but that doesn’t mean diet soda makes you fat; it could simply mean that dieting doesn’t work.

Not Implausible, Just Not Supported By the Evidence

My suspicion is that if diet soda has any effect on weight, it’s a small one. I think it might be possible that in large amounts (probably 16 oz or more of diet soda per day), some artificial sweeteners might affect the metabolism slightly and lead to people being slightly fatter than they would be if they consumed less or no artificial sweeteners at all. However, I don’t think you’d see the results you see in studies like the ones from Denmark or the Monell Chemical Senses Center if artificial sweeteners really had a dramatic, immediate effect on weight gain or fat storage.

Of course, that doesn’t mean artificial sweeteners are healthy, just that they probably don’t make you fat. Jury’s still out on the relationships between aspartame and cancer, sucralose (Splenda) and intestinal bacteria, saccharin and neurological function (especially in children), and stevia & its derivatives and DNA mutation. But for what it’s worth, most of the review articles I came across and Ludwig’s JAMA article claimed that concerns about cancer have basically been put to rest.

Of course, there’s still the problem of how they all taste.

(1) Broken record footnote: Weight is a poor indicator of health. People in the BMI categories labeled “overweight” and “obese” are often as healthy or healthier than people in the “healthy” or “normal” BMI category. People in the “overweight” category live longer on average than people in the “normal” or “healthy” category. People who are “overweight” or “obese” who engage in regular physical activity are healthier on basically every measure than sedentary “normal” or “healthy” weight people. The people who are really (statistically) screwed are the “underweight.”

Diet Soda…Probably Not the Cause of the “Obesity Epidemic”

IN SHOCKING REVERSAL, NATION’S SCIENTISTS DECLARE THAT CORRELATION DOES, IN FACT, PROVE CAUSATION!

A couple of studies on artificial sweeteners presented at the American Diabetes Association’s Scientific Sessions in San Diego last week are being hailed as new evidence that diet soda can make you fat. For example, under the headline “2 New Studies: Diet Soda Leads to Weight Gain,” the blog Fooducate declares:

Not only will diet soda NOT help you lose weight, it may actually cause weight gain and diabetes!


Study #1 tracked the waist circumference and diet soda consumption of 474 people between the ages of 65 and 74 over an average of 3.5 years. In general, everyone got fatter between their baseline and follow-up appointments, but the waists of diet soft drink “users” grew 70% more than those of “non-users.” Frequent users (those who consumed more than 2 diet sodas per day) fared even worse: their waists grew, on average, 500 percent more than non-users’.

It appears from this chart that only the difference between the heavy users and non-users was significant at the p<.001 level. The study hasn't been published, so I have no idea how big each of the groups is or whether the other differences are significant at the p<.05 level.

What’s that? A correlation, you say? Why, the only possible explanation is that the variable randomly assigned to the x axis must have caused the differences in the variable plotted on the y-axis! It’s SCIENCE!

CBS News:

Sorry, soda lovers – even diet drinks can make you fat. That’s the word from authors of two new studies, presented Sunday at a meeting of the American Diabetes Association in San Diego.

Business Insider:

Bad News, Your "Diet" Soda Is Making You Fat Too

Time Magazine:

More bad news, diet soda drinkers: data presented recently at the American Diabetes Association’s (ADA) Scientific Sessions suggest that diet drinks may actually contribute to weight gain and that the artificial sweeteners in them could potentially contribute to Type 2 diabetes.

Because there’s no chance there’s some confounding factor, or that the causal arrow points in the other direction. After all, people who are getting fatter wouldn’t have any reason to be more likely to drink diet soda, would they?

The study’s authors are somewhat more modest about what their research shows:

“These results suggest that, amidst the national drive to reduce consumption of sugar-sweetened drinks, policies that would promote the consumption of diet soft drinks may have unintended deleterious effects.”

However, it still seems irresponsible to me that they claim their research shows that diet soft drinks have “effects,” deleterious or otherwise. Correlations are not effects. All they’ve shown is that, in general, people over 65 are more likely to consume “diet” drinks if they are also gaining more weight. Which is not especially surprising, if you think about it.

I wish people who write headlines and story leads like the ones quoted above would have “Correlation =/= causation” tattooed across their foreheads, backwards, so they’d be reminded of it every time they look in the mirror.

Study #2 and more incredulous owls below the jump:

20 MICE WHO ATE ASPARTAME SHOWED SOME POTENTIAL EARLY SIGNS OF DIABETES (MAYBE). ALSO, SIGNS OF DEATH.

Study #2 involved 40 mice, half of whom were fed chow + corn oil and half of whom were fed chow + corn oil + aspartame (6 mg/kg/day, which seems to be approximately equivalent to a 132 lb person drinking 20 oz of aspartame-sweetened soda per day). After three months on the diets, the mice on the aspartame diet had fasting glucose levels 37% higher than the mice only getting chow + oil. The fasting insulin levels in the aspartame-fed mice were also 27% lower, but that wasn’t statistically significant.
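The dose conversion is easy to sanity-check. A quick sketch (the ~180 mg of aspartame per 12 oz is my assumption, based on commonly cited figures for diet colas; the study itself only reports the mg/kg dose):

    # Scale the mouse dose (6 mg/kg/day) to a 132 lb human and translate
    # it into diet soda.
    dose_mg_per_kg = 6
    weight_kg = 132 / 2.2046             # ~60 kg

    daily_mg = dose_mg_per_kg * weight_kg        # ~360 mg/day
    mg_per_12oz = 180                    # assumed aspartame content of a
                                         # 12 oz diet cola (commonly cited)
    equivalent_oz = daily_mg / mg_per_12oz * 12  # ~24 oz
    print(f"{daily_mg:.0f} mg/day ~ {equivalent_oz:.0f} oz of diet soda")
    # Roughly in line with the "20 oz per day" equivalence the study cites.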

I’m not sure how biologically significant 37% higher average fasting glucose is, or what the range for each group was, or whether the aspartame-fed mice went on to develop diabetes. The latter is especially hard to answer because apparently, by 6 months after starting the diet, at 18 months of age, only 50% of the aspartame-fed mice and 65% of the control group mice were still alive—which, the researchers note, was not a statistically significant difference and is apparently about par for the course with mice, whose average lifespan seems to be between 1-2 years.

This study is intriguing, and does offer one possible mechanism by which aspartame could independently cause weight gain—if aspartame consumed in sufficient quantities has a biologically meaningful effect on blood sugar levels, then diet sodas could indeed be causing people to store more fat than they would if they consumed another calorie-free beverage. But this is far from a smoking gun. It’s not clear if all artificial sweeteners have the same effect. Or if it would also occur in mice not eating a high-fat diet. Or if the blood sugar effects only happen above a certain level of aspartame consumption. Or if it works the same in humans. Or if it does lead to diabetes or negative health outcomes or just produces some biological markers of pre-diabetes. And what this study also showed was that mice who eat aspartame are not significantly more likely to die early than mice who don’t eat aspartame.

BUT WAIT I HAVE MORE CORRELATIONS FOR YOU

Fooducate has one more piece of evidence to submit—the clincher, it seems, if you’re still not convinced by those two studies that diet sodas make you fat:

Still sipping away at your Diet Sprite?

Need more evidence that drinking diet soft drinks is bad for you?

Consider this – ever since diet soft drinks were introduced into the market, obesity and diabetes rates in this country have skyrocketed.


Consider this—ever since aerobics became a nationwide trend, obesity and diabetes rates in this country have skyrocketed! Consider this—ever since sushi became popular in America, obesity and diabetes rates have skyrocketed! STOP DOING AEROBICS AND EATING SUSHI. FOR THE LOVE OF GOD, THINK OF THE CHILDREN.

You’re All Good Eggs: New research shows that specialty eggs aren’t any better for the environment or more delicious

Next year, I will decorate Easter eggs and they will have faces. See 39 other pictures of egg face dioramas at The Design Inspiration.

Two articles about eggs published last week have rocked my commitment to paying the specialty egg surcharge. I’m still tentatively on the organic, cage-free, local egg bandwagon for animal welfare and health concerns, but I have to admit that even those reasons may be a little flimsy. The four main reasons given for the superiority of specialty eggs are:

1. They’re better for the environment
2. They taste better
3. They’re produced in a more humane way
4. They’re healthier

There may also be an argument for supporting local producers who might employ less exploitative or abusive labor practices, although that’s not guaranteed. In order to help offset the increased labor requirements of non-conventional practices, small and local farms often rely on unpaid interns and family members, including children. Not that I think it’s a major ethical abuse to have your kids gather eggs, but I often feel at least a little pang of sympathy for the kids—often Amish, sometimes very young-looking—manning farmer’s market booths alone. So I’m deliberately tabling the labor issue because 1) I suspect that the issue of labor conditions at small, local farms vs. big, industrial ones is, like so many things related to the food industry, complicated and 2) it’s nowhere near the top of the list of most consumers’ concerns about eggs.

1. Green Eggs vs. Ham

On June 1, Slate’s Green Lantern reported that specialty eggs (cage-free, free range, and organic) have a greater environmental impact than conventional eggs based on land use, greenhouse gas emissions, and feed efficiency (measured by kg eggs laid/kg feed). The article also noted that a recent review of life-cycle analyses by two Dutch researchers found no consistent or conclusive difference between the environmental impact of pork, chicken, milk, and eggs. Beef requires more land, water, and feed, but pound for pound (or kilogram for kilogram—most life-cycle analyses are European), the review “did not show consistent differences in environmental impact per kg protein in milk, pork, chicken and eggs.”

The Lantern didn’t evaluate the transportation costs “since the majority of the impacts associated with chicken-rearing comes from producing their feed.” For local eggs, the reduced transportation costs might help balance out the increased feed requirement, but that’s just speculation. For cage-free, free-range, organic, or vegetarian eggs, transportation costs probably further increase the relative impact: not only do they travel just as far or farther than conventional eggs to get to the market, but there are probably also costs associated with transporting the additional feed they require.

I don't remember where I first heard the story about the egg yolk-inspired label, but it's documented in multiple places, including Red, White, and Drunk All Over and the biography of The Widow Clicquot by Tilar Mazzeo

My initial response was basically:

Well, that’s too bad, but efficiency be damned, if it takes more feed and produces higher ammonia emissions to treat chickens humanely and produce healthy eggs with yolks the vibrant orange-yellow of a Veuve Clicquot label, so be it. I know specialty eggs are better; I can see and taste the difference.

2. Golden Eggs

Not so much, apparently. The very next day, The Washington Post published the results of a blind taste test of “ordinary supermarket-brand eggs, organic supermarket eggs, high-end organic Country Hen brand eggs and [eggs from the author’s own backyard chickens].” Blindfolded and spoon-fed, the tasters—two food professionals and six “avocationally culinary” folks with “highly critical palates”—struggled to find differences between the eggs, which were soft cooked to ensure firm whites and runny yolks.

And apparently, this isn’t a new finding. It replicates the results of years of research by food scientists:

Had Pat Curtis, a poultry scientist at Auburn University, been at the tasting, she wouldn’t have been at all surprised. "People’s perception of egg flavor is mostly psychological," she told me in a phone interview. "If you ask them what tastes best, they’ll choose whatever they grew up with, whatever they buy at the market. When you have them actually taste, there’s not enough difference to tell."

The egg industry has been conducting blind tastings for years. The only difference is that they don’t use dish-towel blindfolds; they have special lights that mask the color of the yolks. "If people can see the difference in the eggs, they also find flavor differences," Curtis says. "But if they have no visual cues, they don’t."

Freshness can affect the moisture content, and thus the performance of eggs for some applications, especially recipes that rely heavily on beaten egg whites like meringues or angel food cake. But probably not enough for most people to notice. The author also tested a simple spice cake with super-fresh eggs from her backyard versus regular supermarket eggs. The batters looked different, but once the cakes were baked and cooled, they were indistinguishable.

3. Do They Suffer?

Given how self-evidently cruel battery cage poultry production seems, I’m not entirely sure that “free-range” is as meaningless as people like Jonathan Safran Foer have argued. Sure, “cage free” chickens might never see daylight, and the range available to “free range” chickens might be a dubious privilege at best—a crowded concrete lot exposed to some minimal sunlight would fulfill the USDA requirements. But I don’t think it’s entirely marketing gimmickry, either. For one thing, if there were really no difference, the specialty eggs wouldn’t have a larger carbon footprint.

The animal welfare argument relies on the assumption that either chickens have a right not to experience pain or discomfort or that humans have a moral obligation not to cause them pain, or at least wanton, unnecessary or excessive pain. The debate about animal rights/humans’ moral obligations to animals is too big and complicated for me to cover in any real depth here, but I tend to believe that we ought to try to minimize the pain and discomfort of anything that seems capable of suffering. I used to draw the line at the limbic system—i.e. fish and invertebrates might respond to pain but don’t process it in a way that rises to the level of suffering, whereas birds and mammals can suffer and it’s often pretty apparent when they do. However, as it turns out, the boundaries of the limbic system are “grounded more in tradition than in facts,” and there are unsettled questions in my mind about what constitutes suffering and how to evaluate it. 

Even renowned animal rights theorist Peter Singer has gone back and forth about oysters over the years. I suspect that David Foster Wallace was right when he concluded that what guides our behavior in these matters has more to do with historically and culturally-variable forms of moral intuition than any objective criterion for “suffering”:

“The scientific and philosophical arguments on either side of the animal-suffering issue are involved, abstruse, technical, often informed by self-interest or ideology, and in the end so totally inconclusive that as a practical matter, in the kitchen or restaurant, it all still seems to come down to individual conscience, going with (no pun) your gut” (“Consider the Lobster,” footnote 19).

I hate relying on “I know it when I see it” standards, because I suspect we’re all inclined to see what we want to, but I don’t have a better answer. My gut says that chickens can suffer and that being able to flap around a concrete lot is better than never getting to move at all. However, my gut also says that chickens are pretty stupid creatures, and it might be an entirely reasonable thing to care more about the environmental impact of egg production than the happiness and well-being of the chickens.

4. Eggs Good For You This Week

Health is the issue that matters most to most consumers (see: The Jungle), and unfortunately, the available research on conventional vs. specialty eggs is frustratingly inconclusive. The most common assertion re: the health of specialty eggs concerns omega-3 fatty acids. I’ve mentioned this in passing and will try to devote some more time to it soon, but for now, I’m tentatively convinced that omega-3s are healthful and low ratios of omega-6:omega-3 are optimal.

Some studies have suggested that chickens raised on pasture—i.e. who get at least some of their nutrients from plants, especially clover or alfalfa—produce eggs with more omega-3 fatty acids and vitamins A and E (and less cholesterol and saturated fat, not that that probably matters). However, specialty labels like “cage free,” “free range,” and “organic” don’t mean pastured and the results of the nutritional analysis of eggs bearing those labels don’t provide very clear guidelines about what to purchase.

A 2002 comparison between five different kinds of specialty eggs and conventional eggs found differences between them, but none that led to a simple characterization of specialty eggs as healthier:

From Cherian et al in Poultry Science 81: 30-33 (2002)

The "animal fat free and high in omega-3” eggs (SP1) had the highest percentage of omega-3 fatty acids and lowest ratio of omega 6: omega 3, and the cage-free, unmedicated brown eggs were also significantly better by that measure. However, the Organic-certified free-range (SP2) and cage-free all-vegetarian-feed eggs (SP4) had similar omega-3 content to the regular eggs. While some of the differences might be due to the feed, the authors note that the age, size, and breed of the hen can also affect the composition of fats and nutrients.

The study also showed that the shells of some of the specialty eggs were weaker, which supports other research showing more breakage and leaking in specialty eggs than conventional and my anecdotal experience of typically having to set aside the first few cartons I pick up because they contain cracked eggs.

Additionally, a 2010 USDA survey of traditional, cage-free, free-range, pasteurized, nutritionally enhanced (omega-3), and fertile eggs also concluded that:

Although significant differences were found between white and brown shell eggs and production methods, average values for quality attributes varied without one egg type consistently maintaining the highest or lowest values. (Abstract here, no free full text available)

In sum, if you can get pastured eggs (either from your own backyard or a farmer whose practices you can interrogate or even observe), they might be a little better for you than conventional. But after reading all this, I still found myself thinking: But what about the color difference? Doesn’t a darker yellow yolk mean the egg itself is healthier? Apparently not:

Yolk colour varies. It is almost completely dependent upon the feed the hen eats. Birds that have access to green plants or have yellow corn or alfalfa in their feed tend to produce dark yolks, due to the higher concentration of yellow pigments (mainly carotenoids) in their diet. Since commercial laying hens are confined, lighter and more uniformly coloured yolks are being produced. Yolk colour does not affect nutritive value or cooking characteristics. Egg yolks are a rich source of vitamin A regardless of colour. (from Wageningen University)

The record on other health concerns like salmonella and dioxin and PCB content is mixed:

4A: Can you eat raw cookie dough if it’s organic?

The salmonella thing is reminiscent of the E. coli in grass-fed beef thing: some people actually claim organic chickens have no risk of salmonella. One UK study allegedly found salmonella levels over five times higher in conventional caged hens than in birds raised according to Soil Association organic standards (which are comparable to USDA Organic certification): 23.4% of farms with caged hens tested positive for salmonella, compared to 4.4% of farms with organic flocks and 6.5% with free-range flocks. The explanation proffered is that the spread of the disease is inversely related to flock size and density. No link or citation for the study itself.

A 2007 UK study that tested 74 flocks (59 caged and 15 free range) from 8 farms, all of which had been vaccinated against salmonella, found a smaller but still significant difference: 19.4% of cage chicken house samples and 10.2% of free-range chicken house samples taken over a 12-month period tested positive for salmonella. However, they also noted a high degree of variation between flocks, and that the longest continuously-occupied houses were typically the most heavily contaminated. It’s possible that some of the results of other studies can be attributed to the fact that free-range or organic hen operations are likely to be newer and differences between them and conventional may diminish as time goes on.
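Putting the two UK studies side by side, the prevalence ratios are easy to compute (a trivial sketch using the percentages quoted above):

    # Relative salmonella prevalence, caged vs. alternative systems,
    # in the two UK studies described above.
    studies = {
        "Soil Association study": (23.4, 4.4),   # caged vs. organic farms
        "2007 vaccinated flocks": (19.4, 10.2),  # caged vs. free-range
    }
    for name, (caged, alt) in studies.items():
        print(f"{name}: {caged / alt:.1f}x higher in caged flocks")
    # ~5.3x in the first study vs. ~1.9x in the second -- "a smaller but
    # still significant difference," as noted above.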

On this side of the Atlantic, the results seem to show the opposite. A 2005 USDA study that tested free-range, all-natural antibiotic-free, and organic chicken meat (and contamination in chickens themselves has been linked to salmonella in eggs) found salmonella in all three groups at higher rates than in past years’ surveys of commercial chicken meat:

A total of 135 processed free-range chickens from four different commercial free-range chicken producers were sampled in 14 different lots for the presence of Salmonella. Overall, 9 (64%) of 14 lots and 42 (31%) of 135 of the carcasses were positive for Salmonella. No Salmonella were detected in 5 of the 14 lots, and in one lot 100% of the chickens were positive for Salmonella. An additional 53 all-natural (no meat or poultry meal or antibiotics in the feed) processed chickens from eight lots were tested; 25% of the individual chickens from 37% of these lots tested positive for Salmonella. Three lots of chickens from a single organic free-range producer were tested, and all three of the lots and 60% of the individual chickens were positive for Salmonella. The U.S. Department of Agriculture Food Safety and Inspection Service reported that commercial chickens processed from 2000 to 2003 had a Salmonella prevalence rate of 9.1 to 12.8%. Consumers should not assume that free-range or organic conditions will have anything to do with the Salmonella status of the chicken.

Additionally, a 2007 analysis of fresh, whole broiler chickens by Consumer Reports found that 83% tested positive for campylobacter or salmonella, and that chickens labeled organic or raised without antibiotics were more likely to harbor salmonella than conventionally-produced broilers:

We tested 525 fresh, whole broilers bought at supermarkets, mass merchandisers, gourmet shops, and natural-food stores in 23 states last spring. Represented in our tests were four leading brands (Foster Farms, Perdue, Pilgrim’s Pride, and Tyson) and 10 organic and 12 nonorganic no-antibiotics brands, including three that are “air chilled” in a newer slaughterhouse process designed to reduce contamination. Among our findings:

  • Campylobacter was present in 81 percent of the chickens, salmonella in 15 percent; both bacteria in 13 percent. Only 17 percent had neither pathogen. That’s the lowest percentage of clean birds in all four of our tests since 1998, and far less than the 51 percent of clean birds we found for our 2003 report.
  • No major brand fared better than others overall. Foster Farms, Pilgrim’s Pride, and Tyson chickens were lower in salmonella incidence than Perdue, but they were higher in campylobacter.

Ultimately, salmonella is always a risk when dealing with chicken or eggs, and it’s not clear that specialty eggs are any better than conventional. If you’re concerned about salmonella, cook your food to 165°F or stick to vegan options. You know, like peanut butter.

4B: What’s in the grass?

One final concern: a 2006 Dutch study found that free-range eggs in Europe have increased levels of dioxins and PCBs (which fall under the category of dioxin-like compounds), apparently because they are present in the soil in both residential and agricultural areas. “Dioxins” refer to a wide variety of compounds and they vary in toxicity; the term is basically just shorthand for environmental pollutants. On the one hand, they’re everywhere and we probably can’t avoid them so who cares? On the other, many are fat soluble so eggs are of greater concern than, say, apples.

There’s not really enough research on this to draw any conclusions. Which just pains me to type for what feels like the umpteenth time, because, seriously, is there ever conclusive research? Can we ever really know anything about anything? I like to think we can, but I’ll be damned if I don’t feel like every time I try to find more information about any kind of nutritional claim, the answer turns out to be “well, that’s complicated” or “well, the research on that isn’t conclusive.” Sometimes I really just want to see a chart that says YES! THIS IS THE RIGHT ANSWER! IT IS RELIABLE AND ACCURATE AND CONTROLLED FOR ALL POSSIBLE VARIABLES.

So just in case you might be wondering if I’m trying to be deliberately indecisive or vague in service of whatever ideological position that would even promote: I’m not. When I find conclusive results, I will share them with you in very excited caps lock. 

So Here’s The Deal

If you care more about climate change and efficient resource allocation than chicken welfare, buy conventional eggs; if you care more about chicken welfare, buy cage-free, free-range, Organic, or perhaps ideally, local. Taste and health-wise, there’s no clear difference, although I know that won’t prevent some of you from believing there is (remember the chocolate yogurt with “good strawberry flavor”?). Perhaps the biggest lesson is that, once again, the foods some people think are objectively superior for all kinds of reasons may not be, and attempting to eat “better” is way more complicated than simply choosing the “green” alternative.

Who Says Robots Can’t Taste?: On Cooking Robots and Electronic Noses

The color of the stuff in the bowl for some reason made me realize, for the first time, the coincidental similarity of Freud's "unheimlich" and the Heimlich maneuver. Image from: http://www.fanpop.com/spots/bender/links/2942473 

Kantos Kan led me to one of these gorgeous eating places where we were served entirely by mechanical apparatus. No hand touched the food from the time it entered the building in its raw state until it emerged hot and delicious upon the tables before the guests, in response to the touching of tiny buttons to indicate their desires.—Edgar Rice Burroughs, “A Princess of Mars” (1912)

Chef Motoman griddling up okonomiyaki, from http://www.rutgersprep.org/kendall/7thgrade/cycleD_2008_09/mk/burgerflippingrobot.jpg

By now, robots who can cook are nothing new. Most of them are basically one-trick ponies (at least culinarily): a Swiss robot that was taught to make omelets to demonstrate its abilities, Japanese robots that can grill okonomiyaki or make octopus balls from scratch. There’s even a restaurant called Famen in Nagoya staffed by two robots who act out a comic routine and spar with knives in between preparing bowls of ramen. However, the cooking robot recently introduced by two Chinese universities that’s making the rounds online this month comes closer to the fantasy in the Burroughs story of something that can produce a huge variety of foods on demand, almost like the replicators on Star Trek. This new cooking robot can make 300 different dishes based on the offerings of four top chefs in Jiangsu Province and may soon be able to produce up to 600.

is this really nightmare-inducingly realistic? from http://www.nextnature.net/2009/06/robot-hand-meets-sushi/

What strikes me about the media coverage of cooking robots is the paradox that, on the one hand, the fact that they can do something so essentially human is a substantial part of the delight they inspire. Their food-related activities are often designed to soften people’s resistance to robots—for example, researchers at Carnegie Mellon developed the Snackbot that they introduced to a reporter for the New York Times last month to “gather information on how robots interact with people (and how to improve homo-robo relations).” But on the other hand, the essential humanness of cooking can also make the robots especially unnerving. In fact, the more human they are, the more they seem to bother people. The Engadget article on the sushi-grabbing hand, “Chef Robot makes its video debut, nightmares forthcoming,” seems mostly disturbed by how “realistic” the hand looks:

In case you missed it, the robot itself is actually just a standard issue FANUC M-430iA robot arm with a way too realistic hand attached to it, which apparently not only helps it prepare sushi, but some tasty desserts as well. Head on past the break for the must-see video, you’ve nothing to lose but your ability to unsee it.

Most other articles I’ve seen about cooking robots end with a similar, if usually less dramatic, joke or disclaimer, which usually reflects anxieties about the threat that cooking robots pose to the boundary between human and machine.

If this thing ever gets imported to the U.S., it would need to make fortune cookies too. But what would a robot fortune say?—CNet (on the 300-dish Chinese cook)

More than 200 diners have enjoyed the machine’s cuisine thus far, and reportedly taste testers have found the food to be on par with a traditional restaurant kitchen, flavor-wise. (No mention has been made of the robot’s plating abilities.)—CNet (on a prototype developed by a retired professor using an induction burner and robotic arm)

While it lacks the personal touch and the ability to hold some small banter with regular guests, at least you can be sure the fingers have not gone around digging noses or scratching butts.—Ubergizmo (on the sushi hand)

“No matter how skilled Motoman is, I doubt real chefs like Anthony Bourdain or Mario Batali would be caught dead cooking next to him.”—Robot Living (referring to Chef Motoman, who was designed to work alongside humans in a restaurant environment)

There’s a seemingly irrepressible impulse to name something robots can’t infringe on, like speculating about the future or making the kind of aesthetic and creative decisions that go into plating, or to find some other way to distinguish them from human chefs: the ability to banter or pick their nose or smoke and hate on vegans or compete in elaborate cooking competitions. Even the NYTimes article, which focuses mostly on how food “humanizes” robots, ends by erecting a wall based on the ability to taste:

The real obstacle to a world full of mechanized sous-chefs and simulated rage-filled robo-Gordon Ramsays may be something much harder to fake: none of these robots can taste.

Keizo Shimamoto, who writes a blog on ramen noodles and has eaten at Famen, the two-robot Japanese restaurant, said that the establishment was “kind of dead” when he ate there last year. Though the owner said that people do taste the food, according to Mr. Shimamoto, “It was a little disappointing.” It’s one thing to get people to stop by to see the robots. “But to keep the customers coming back,” he said, “you need better soup.”

And while it’s true that none of the robots mentioned in the article can taste, that doesn’t mean there aren’t other robots that can.

What If Chef Motoman Had a Nose?

Researchers have developed mass spectrometers that can determine the ripeness of tomatoes and melons and describe the nuances in different samples of espresso—which ones are more or less floral, citrusy, which have hints of buttery toffee or a woody undertone, etc. Some electronic noses, as the e-sensing systems are often called, are so sensitive they can pinpoint not only the grape varietal and region where a wine was produced, but what barrel it was fermented in. As I’ve discussed before, tastes are largely produced by how substances react with our ~40 taste receptors and ~400 olfactory receptors. Every unique flavor/odor combination is like a “fingerprint,” and e-sensing systems are far, far better at identifying and classifying those fingerprints than humans.
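If you’re wondering what “identifying a fingerprint” actually involves computationally, here’s a minimal sketch of the nearest-neighbor matching idea. Every number and label below is invented for illustration; a real e-nose has dozens of sensor channels and much fancier statistics:

```python
import numpy as np

# Hypothetical "fingerprints": each row is one sample's normalized response
# across a four-channel sensor array (real instruments use many more channels).
known = np.array([
    [0.9, 0.1, 0.3, 0.7],  # wine from barrel A
    [0.2, 0.8, 0.5, 0.1],  # wine from barrel B
    [0.4, 0.4, 0.9, 0.2],  # generic table wine
])
labels = ["barrel A", "barrel B", "table wine"]

def identify(sample):
    # Match the new sample to the closest known fingerprint (Euclidean distance).
    distances = np.linalg.norm(known - sample, axis=1)
    return labels[int(np.argmin(distances))]

print(identify(np.array([0.85, 0.15, 0.35, 0.65])))  # -> barrel A
```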

In Mindless Eating: Why We Eat More Than We Think, Brian Wansink mentions a 2004 study performed at the Cornell University Food and Brand Lab where 32 participants were invited to taste what they were told was strawberry yogurt in the dark. They were actually given chocolate yogurt, but nineteen of them still rated it as having “good strawberry flavor.” The yogurt-tasters weren’t food critics or trained chefs, but even “experts” are dramatically influenced by contextual cues. Frederic Brochet has run multiple experiments with wine experts in the Bordeaux region of France, where many of the world’s most expensive wines are produced. In one experiment, he had 54 experts taste white wines that had been dyed red with a flavorless additive and in another he served 57 experts the same red wine in two different bottles alternately identifying it as a high-prestige wine and lowly table wine. In the first experiment, none of the experts detected the white wine flavor, and many of them praised it for qualities typically associated with red wines like “jamminess” or “red fruit.” In the second, 40 of the experts rated the wine good when they thought it was an expensive Grand Cru and only 12 did when they thought it was a cheap blend (read more: “The Subjectivity of Wine” by Jonah Lehrer).

It’s true that robots can’t make independent subjective judgments about tastes and odors. The ramen robots might be able to customize your ramen based on variables like the proportion of noodles to broth and different kinds of toppings, but they can’t simply make a “better” soup. However, if outfitted with an electronic nose and programmed to recognize and replicate the fingerprint of a really fantastic tonkotsu broth, there’s no reason to believe they wouldn’t be able to make the best ramen you’ve ever tasted—and likely with greater consistency than a human chef (assuming they had access to the necessary ingredients). There are other factors, too, like rolling and cooking the noodles just so to give them a toothsome bite, but again, noodle recipes and some way of evaluating their texture could be programmed into a robot chef’s computer brain. In other words, the only reason robots can’t taste is because we haven’t designed them to yet.

they just need a better spectrometer!

Even though robots can’t invent their own criteria, they can be trained to make subjective judgments—Science just reported yesterday that researchers in Israel have trained an electronic nose to predict whether novel smells are good or bad. They exposed it to 76 odors that were rated by human volunteers (both Israeli and Ethiopian, to account for cultural differences, which unsurprisingly turn out to be pretty minor) on a scale from “the best odor you have ever smelled” to “the worst odor you have ever smelled.” Then they exposed the nose to 22 new odors and compared its ratings to those of a new group of volunteers. The electronic nose agreed with the humans on the relative pleasantness of the odors 80% of the time. In another trial using only extreme odors—ones that had been rated most pleasant or unpleasant—it agreed with the humans 90% of the time. (Here’s the original study)
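Here’s a rough sketch of that train-then-test protocol, mirroring the 76 training odors and 22 novel ones. All of the data below is synthetic, and scoring agreement by pairwise ranking is just my guess at how “agreed on relative pleasantness” might be operationalized; the paper’s actual features and metric may differ:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
w = rng.normal(size=10)  # a pretend "true" link between fingerprint and pleasantness

# Stand-ins for the study's data: 76 training odors and 22 novel test odors,
# each a 10-channel fingerprint plus a human pleasantness rating (all synthetic).
X_train = rng.normal(size=(76, 10))
y_train = X_train @ w + rng.normal(scale=0.5, size=76)
X_test = rng.normal(size=(22, 10))
y_test = X_test @ w + rng.normal(scale=0.5, size=22)

model = Ridge().fit(X_train, y_train)
pred = model.predict(X_test)

# Does the e-nose rank each pair of novel odors the same way the panel did?
pairs = [(i, j) for i in range(22) for j in range(i + 1, 22)]
agree = sum((pred[i] > pred[j]) == (y_test[i] > y_test[j]) for i, j in pairs)
print(f"pairwise agreement: {agree / len(pairs):.0%}")
```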

So, theoretically, it might be possible not only to program a robot to make foods that match a “fingerprint” that’s widely rated “delicious” but also to predict what kinds of foods are likely to taste good or bad. And perhaps the next step would be to ask it to innovate combinations that are likely to taste especially delicious.

However, given that the way we taste often has more to do with expectations and presentation than the chemical properties of the food, the discomfort inspired by cooking robots may be more of a barrier than the technology itself. If sushi that has merely been transferred from tray to plate by a robot hand is nightmare-inducing, we’re probably a long way—culturally, if not technologically—from Robot Cuisine.

HFCS Follow-up: What the Rats at Princeton Can and Can’t Tell Us

Ed called my attention to last week’s press release about the study at Princeton currently getting some mass media attention. The press release claims:

Rats with access to high-fructose corn syrup gained significantly more weight than those with access to table sugar, even when their overall caloric intake was the same. 

i know it's a squirrel, not a rat. apparently no one's gotten a rat to do this and then circulated it with the right keywords to match my google search. this image likely not original to: http://ybfat101.com/notyourfault.shtml

That’s pretty surprising, given that other studies have suggested that there is no difference between HFCS and sucrose. The Princeton study doesn’t offer a definitive explanation for the difference they found, but they suggest that it may have something to do with the slightly greater proportion of fructose in the HFCS.

As I noted in the first post on high-fructose corn syrup, HFCS-55, which is the kind used in soft drinks and the Princeton study, has roughly the same proportions of fructose and glucose as table sugar. Table sugar, or sucrose, is composed of fructose bonded to glucose, so it’s a perfect 50-50 split. HFCS-55 contains 55% fructose, 42% glucose, and 3% larger sugar molecules. There’s a lot of evidence that fructose is metabolized differently than glucose, and may promote the accumulation of fat, especially in the liver and abdomen. Indeed, that’s why I believe that agave nectar is probably nutritionally worse than table sugar. Still, I’d be pretty shocked if a 5-percentage-point increase in fructose could produce a statistically significant difference in weight gain, unless the rats were eating nothing but sugar-water. And they weren’t—in both of the experiments reported in the original study, the rats had access to unlimited “standard rat chow.”

Experiment 1: Rats Who Binge?

In the first experiment, 40 male rats were divided into four groups of ten. All of them had 24-hour access to rat chow and water. Group 1 was the control, so they just had chow and water. Group 2 had 24-hr access to an 8% solution of HFCS (0.24 kcal/mL), which the press release claims is “half as concentrated as most sodas”. Group 3 had 12-hr access to the same HFCS solution. And Group 4 had 12-hr access to a 10% solution of sugar dissolved in water (0.4 kcal/mL), which the press release claims is “the same as is found in some commercial soft drinks.” The two things of note so far are that none of the rats had 24-hr access to sucrose-water, and that the sucrose solution was roughly 1.7x as calorie-dense as the HFCS solution.*
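Those reported energy densities check out, assuming the HFCS syrup runs about 3 kcal/g (as sold, HFCS-55 is roughly a quarter water by weight) while dry sucrose is 4 kcal/g; both are standard figures, but they’re my assumptions, not the study’s:

```python
# Back-of-envelope check on the two sugar waters' energy densities.
# Assumptions: HFCS-55 syrup ~3 kcal/g (it's ~25% water), dry sucrose 4 kcal/g.
hfcs = 0.08 * 3.0     # 8 g of HFCS per 100 mL -> kcal per mL
sucrose = 0.10 * 4.0  # 10 g of sucrose per 100 mL -> kcal per mL

print(f"HFCS solution:    {hfcs:.2f} kcal/mL")    # 0.24, as reported
print(f"sucrose solution: {sucrose:.2f} kcal/mL") # 0.40, as reported
print(f"ratio: {sucrose / hfcs:.2f}x")            # ~1.67x
```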

Why the 24 hr vs 12 hr groups? According to the study:

We selected these schedules to allow comparison of intermittent and continuous access, as our previous publications show limited (12 h) access to sucrose precipitates binge-eating behavior (Avena et al., 2006).

In other words, they fed the sucrose group on a schedule that they already knew would cause binging. And they didn’t include a 24-hr sucrose group to control for that.

That helps to explain the results: the rats that had 24-hr access to HFCS-water gained less weight than either the rats who had 12-hr access to sucrose-water or the rats that had 12-hr access to HFCS-water. So according to the experiment, it’s better to consume some HFCS than it is to binge on sugar (not, obviously, how they chose to frame it in either the formal write-up or the press release).

[Table from the Princeton study showing weight gain by group]

The only difference between the four groups in the first experiment that was statistically significant at p < 0.05 was between the rats who got chow only and the rats who got 12-hr HFCS. There was no statistically significant difference between the rats who had 12-hr access to sucrose-water and the rats who had 12-hr access to HFCS-water. There wasn’t even a significant difference between the rats who had 24-hr access to HFCS-water and the chow-only rats. So the only basis for the claim in the press release that HFCS is worse than sucrose is the fact that the rats with 12-hr HFCS got a “significant” amount fatter while the 12-hr sucrose rats didn’t. Even though the 24-hr HFCS rats didn’t either.

I am not the only one who’s picked up on this—both Marion Nestle (a vocal critic of the food industry) and Karen Kaplan (not, as far as I can tell, a shill for the Corn Refiners Association) also dispute the claim that this research demonstrates anything conclusive about HFCS vs. sucrose. The lead researcher replied to Nestle’s post, and rather than addressing the discrepancy between the 12-hr and 24-hr HFCS groups, he merely corrects her assumption that the 24-hr rats should be fatter:

There have been several studies showing that when rats are offered a palatable food on a limited basis, they consume as much or more of it than rats offered the same diet ad libitum, and in some cases this can produce an increase in body weight. So, it is incorrect to expect that just because the rats have a food available ad libitum, they should gain more weight than rats with food available on a limited basis. –Bart Hoebel

Which just makes it all the more baffling why they didn’t include a 24-hr sucrose group. Additionally, according to their results, binging or “consuming more” doesn’t explain the results, because:

There was no overall difference in total caloric intake (sugar plus chow) among the sucrose group and two HFCS groups. Further, no difference was found in HFCS intake and total overall caloric intake in the groups given 12-h access versus 24-h access. Both groups consumed the same amount of HFCS on average (21.3±2.0 kcal HFCS in 12-h versus 20.1±1.6 kcal HFCS in 24 h), even though only the 12-h group showed a significant difference in body weight when compared with the control groups.

The only explanation they offer for these results is the slight difference in the amount of fructose the rats in the HFCS and sucrose groups consumed. But even that relies on the idea that the HFCS rats did not feel as satisfied by their sugar water and compensated by eating more:

…fructose intake might not result in the degree of satiety that would normally ensue with a meal of glucose or sucrose, and this could contribute to increased body weight.

Unless satisfaction itself makes rats thinner.

Experiment 2 (Males): Wait, Where’s the Sucrose?

In the first part of the second experiment, 24 male rats were divided into three groups of eight. Again, all three had unlimited chow and water. Group 1 had 24-hr access to the HFCS-solution, Group 2 had 12-hr access to the HFCS-solution, and Group 3 was the chow-only control. Sucrose, you’ll note, drops out entirely. According to the study:

Since we did not see effects of sucrose on body weight in Experiment 1 with males, we did not include sucrose groups in this long-term analysis in males.

But there were no effects of HFCS on body weight on the 24-hr schedule! The omission of sucrose from this experiment makes as much sense as the omission of a 24-hr sucrose group in the first one. The lead researcher’s reply to Marion Nestle’s criticisms offered no further clarification about this choice. 

We explain in the article that we purposefully did not compare HFCS to sucrose in Experiment 2 in males, because we did not see an effect of sucrose on body weight in males in Experiment 1.

This study went on for 6 months instead of 2 months and, as the table above shows, the groups with both 24-hr and 12-hr access to HFCS-water gained a significantly greater amount of weight than the chow-only rats. This time, the 24-hr HFCS rats gained more weight than the 12-hr HFCS rats.

Experiment 2 (Females): Sucrose is back (still only 12-hr)! But chow is limited.

In order to “determine if the findings applied to both sexes,” they also ran a slightly different version of the second experiment on some female rats (n unknown). The control group, as usual, got unlimited chow and water. Group 1 got 24-hr access to HFCS-water. The remaining two groups got 12-hr access to chow (“to determine if limited access to chow, in the presence of HFCS or sucrose, could affect body weight”) and either 12-hr access to HFCS-water or 12-hr access to sucrose-water. Yeesh. How about testing one thing at a time, guys?**

So this time, only the rats with 24-hr access to HFCS gained a significantly greater amount of weight than the chow-only rats, which flies in the face of the claim that rats with limited access to a palatable food eat more. And the 12-hr sucrose rats actually gained slightly more weight (though not a statistically significant amount) than the 12-hr HFCS rats.

In other words, the findings in the three experiments were completely inconsistent. For male rats in the short term, 12-hr access to HFCS induces significant weight gain but 24-hr access to HFCS does not. For male rats in the long term, both 12-hr and 24-hr access to HFCS induce significant weight gain, but they didn’t test sucrose. For female rats in the long term, only 24-hr access to HFCS with unlimited chow induces significant weight gain; limited chow with either HFCS or sucrose does not. And yet, based on this, they claim:

In Experiment 2 (long-term study, 6–7 months), HFCS caused an increase in body weight greater than that of sucrose in both male and female rats. This increase in body weight was accompanied by an increase in fat accrual and circulating levels of TG, [which] shows that this increase in body weight is reflective of obesity.

Despite the fact that Experiment 2 didn’t even test the long-term effects of sucrose consumption on male rats, and 12-hr HFCS (albeit with limited chow) didn’t cause significant weight gain in female rats.

As Usual: Needs More Research

Based on the results of all three experiments, they conclude:

Rats maintained on a diet rich in HFCS for 6 or 7 months show abnormal weight gain, increased circulating TG and augmented fat deposition. All of these factors indicate obesity. Thus, over-consumption of HFCS could very well be a major factor in the “obesity epidemic,” which correlates with the upsurge in the use of HFCS.

Despite the fact that obesity has also increased in many countries where HFCS is virtually never used, like Australia. According to a 2008 USDA paper:

Australia and the United States have a high and rising prevalence of obesity. They have opposite sugar policies: virtually no distortions affect Australia’s use of sugar, whereas sugar policy in the United States taxes sugar use. Sugar consumption per capita in Australia has been flat from 1980 to 2001, after which it increased by 10%-15%. Sugar is the major sweetener consumed in Australia.

The fact that the experiment doesn’t seem to show that HFCS is necessarily worse than sucrose doesn’t mean the findings aren’t intriguing. I really do want to know, for example, why rats with 12-hr access to HFCS gain more weight in the short term than rats with 24-hr access to HFCS, but the 24-hr HFCS rats gain more in the long term. And if, as they claim, the rats in all the groups consumed the same number of calories—which Nestle doubts because “measuring the caloric intake of lab rats is notoriously difficult to do (they are messy)”—why were there any differences at all at the end of the trials? If none of the rats are eating more (and indeed, it seems that in some cases the HFCS rats were eating slightly less), what is the mechanism causing them to gain more weight, at least on some feeding schedules?

Does the concentration of the sugar have anything to do with it? In his reply to Nestle, Hoebel says:

Eating sucrose does not necessarily increase body weight in rats, although it has been shown to do so in some studies, usually employing high concentrations of sucrose, such as 32%. Our previously published work, has found no effect of 10% sucrose on mean body weight. At this concentration, rats seem to compensate for the sucrose calories by eating less chow.

I want to know if that’s true for HFCS as well. And did the difference in the concentrations of the HFCS and sucrose drinks have anything to do with the difference in the rats’ weight in this study?

Or does it maybe have something to do with sucrase, the enzyme that splits the fructose and glucose in table sugar? From what I’ve read, sucrase is present in the human digestive tract in sufficient amounts that it doesn’t rate-limit the absorption of those sugars in sucrose compared to the consumption of free fructose and glucose. But is it somehow involved in metabolism or appetite-regulation?

So rather than answering any questions about HFCS vs. table sugar, this really just raises a lot of new ones.

*It’s also not clear why they gave them different concentrations of sweetener. You’d think they would make them both soda-strength, or at least calorically equivalent.

**The failure to control for multiple variables does, in fact, complicate their ability to make any conclusions about gender difference:

In the present study, male rats maintained on 12-h access to HFCS also gained significantly more weight than chow-fed controls, while female rats maintained on 12-h access did not. It is possible that this can be accounted for by the fact that these males had ad libitum chow, while the females had 12-h access to chow. It is possible that the lack of chow for 12 h daily suppressed weight gain and TG levels that might have otherwise been elevated in the female 12-h HFCS access group. This would indicate an effect of diet rather than a gender difference.

Don’t Drink the Agave-Sweetened Kool-Aid Part II: What’s Wrong With Any High-Fructose Syrup

Who knew agaves grew in so many different flavors?

In the first post on agave nectar, I focused primarily on why it’s no more “natural” than high-fructose corn syrup, which is a delicious irony given how both sweeteners tend to be portrayed. But that isn’t necessarily a reason to avoid agave nectar. “Natural” is at best an imperfect heuristic for healthiness or environmental friendliness, and has no inherent relationship with deliciousness. But, as I also suggested in the first post, agave nectar is certainly no better health-wise than other sources of sugar, and the fact that it’s much higher in fructose than most sweeteners (70-90% vs. ~50%) gives me reason to believe it may actually be worse for your health than sucrose or HFCS-55.

So Don’t Drink the Agave-Sweetened Ketchup Either. Because That Would Be Gross.

GRANOLA-WASHING

Perhaps the most baffling thing is how many people seem to think agave nectar doesn’t count as sugar. For example, the rave review of Wholemato Organic Agave Ketchup in Men’s Health contrasts it with the “liquid candy” that is HFCS, and then implies that the even-higher-fructose agave-sweetened condiment is healthier than “fatty” butter (it’s like someone at Men’s Health was specifically trying to give me apoplectic fits):

This ketchup forgoes the high-fructose corn syrup and uses agave nectar, preserving sweetness without clobbering your fries or hot dog with liquid candy…. Slather it on your sweet potatoes as an alternative to a fatty slab o’ butter.

Note: The review is only available on the Wholemato site because the “read more” link is broken, but I’m not inclined to think it’s a fabrication as the other links on their “buzz” page are legit and you can find nearly-identical, equally-apoplexy-inducing claims about Wholemato Ketchup at The Kitch’n, Girlawhirl, i like granola, and Well Fed Man, among others.

There are also people who claim to have given up sugar, but who still eat agave nectar. Some excerpts from the comment thread on Nicole MacDonald’s resolution to give up sugar in 2010:

Jennifer: I went sugar-free at 16 to help my psoriasis & still don’t have it, 8 years later .
I don’t miss it at all. If I want to make a cake or anything I will use agave nectar … you realise there are so many interesting & alive foods out there you can enjoy without compromising your health!! xx

Nicole: I have to admit that in the first few weeks I baked a lot using ingredients like honey, agave and brown rice syrup. Cookies are my favorite to make, and I have a long list of recipes on my blog to the right. I also drank a lot of flavored tea with honey added and that seemed to cure some of my cravings.

Beth: I stopped eating sugar last year and its worked out pretty well. As long as I can have natural sugars which are found in fruits, then I’m totally satisfied.

Not All Things That Occur Naturally In Fruit Should Be Consumed In Quantity. Like Cyanide.

Beth is certainly not alone in thinking that “sugars which are found in fruits” are healthier than other sugars. People are frequently resistant to the idea that fructose might be unhealthy because, as the name so conveniently reminds them, it’s found in fruit. Or, if they’ve been sold on the idea that HFCS is poison and fructose has something to do with that, they sometimes suggest that there must be different kinds of fructose. Take, for example, the comment by Dave on this post by ThursdaysGirl, which expressed some reservations about agave nectar:

[. . .] you say Agave is 70% fructose, ok, so that means that means a teaspoon of Agave (about 4 grams) has about 2.8 grams of fructose… Hmmm, a small tomato has about 2.6 grams of fructose in it, the same as a carrot!… so, by your ridiculous logic, you should run away from tomatoes and carrots as fast and as far as you can! OMG, never eat another tomato! And don’t even get me started on Apples!

Remember, HFCS, regardless of what the lying chemists say, is not a natural source of Fructose. It is a man made molecule. It is illegal to call High Fructose Corn Syrup “All Natural”. I wonder why… Agave can be found both All Natural and Organic!!! Small amounts of Fructose actually help metabolize Glucose better, plus its low glycemic, has natural inulin fiber which is amazingly beneficial [. . . .]

I would trust the Mayo Clinics recommendations as regards to High Fructose Corn Syrup… it is poison. But really, Apples, Carrots, Tomatoes etc all bad for you? Stop it.

It’s actually not illegal to call HFCS “natural.” The FDA has been notoriously unwilling to define “natural” aside from the essentially meaningless distinction between “artificial” and “natural” colors and flavors—which Eric Schlosser talks about extensively in Fast Food Nation (pp. 121-131). As of July 2008, HFCS is “natural” for the purposes of food labeling. You can read all about the ongoing legal debates here. However, that hasn’t stopped people from trying to differentiate “natural” fructose, like the stuff in fruit, from “chemically-produced” fructose, like the stuff in HFCS. The problem is that they can’t seem to agree which side the fructose in agave nectar is on.

As you might expect, the agavevangelists are on the side of “natural.” According to Kalyn’s Kitchen Picks:

It’s been a long time since I discovered a new product that rocked my world in the way agave nectar has done…. The sweet taste in agave nectar comes from natural fructose, the same sweetener found in fruit (not to be confused with high fructose corn syrup which has chemically produced fructose.)

On the other side, there’s Rami Nagel, whose Natural News article has been widely circulated and cited by people on both sides of the agave nectar debate. According to Nagel, agave nectar is composed of bad, man-made “fructose,” which he claims actually has a completely different chemical formula from the sugar in fruit, which he calls “levulose”:

We all know that the chemical formula for water is H2O: two hydrogens and one oxygen. The opposite would be O2H, which is nothing close to water. Likewise, man-made fructose would have to have the chemical formula changed for it to be levulose, so it is not levulose. Saying fructose is levulose is like saying that margarine is the same as butter. Refined fructose lacks amino acids, vitamins, minerals, pectin, and fiber. As a result, the body doesn’t recognize refined fructose. Levulose, on the other hand, is naturally occurring in fruits, and is not isolated but bound to other naturally occurring sugars. Unlike man-made fructose, levulose contains enzymes, vitamins, minerals, fiber, and fruit pectin. (Similar claims here and here)

However, levulose is just an alternate name for fructose. A search for “levulose” on Wikipedia automatically redirects to their fructose entry. ChemBlink claims they’re synonyms with the exact same molecular formula and structure:

levulose/fructose...not quite as catchy as tomayto/tomahto 

And even the earliest examples from the OED reveal that the terms are completely interchangeable: 

1897 Allbutt’s Syst. Med. III. 386 Cane sugar is partly left unchanged, partly converted into glucose and lævulose. 1902 Encycl. Brit. XXII. 721/1 Glucose and fructose (lævulose)—the two isomeric hexoses of the formula C6H12O6 which are formed on hydrolysing cane sugar.

With “levulose” eventually giving way to “fructose” by the 1970s:

1974 Nature 10 May 194/3 Although it is true that some bacteriologists are extremely conservative in the names they use for carbohydrates, surely nobody now uses ‘levulose’…in preference to ‘fructose’ these days.

A PubMed search for “levulose” also turned up 30,398 articles about (surprise!) fructose. The twenty articles that actually had “levulose” in the title were almost all translations, mostly from German.

So no, there is no difference between “naturally occurring” and “chemically-produced” fructose (and if the fructose in HFCS is the latter, so is the fructose in agave). Nonetheless, Dave and Rami Nagel are both at least partially correct. Fructose/levulose may not contain enzymes, vitamins, minerals, fiber, and fruit pectin, but the fruits that contain levulose/fructose certainly do. And there’s no reason to believe that eating a small amount of agave nectar, say a teaspoon, along with amounts of fiber, protein, and other nutrients similar to those found in a tomato or carrot would have a different or worse effect on the body than the vegetables themselves.

Fructose and Your Liver

Just because fructose isn’t necessarily bad for you in the amounts present in most fruits and vegetables, that doesn’t mean it’s a healthier substitute for other sugars. The evidence from studies on humans is still pretty scant. However, in a 2008 study where 23 subjects got 25% of their caloric intake from either fructose-sweetened or glucose-sweetened beverages for 10 weeks, the subjects who drank the fructose-sweetened drinks showed signs of decreased insulin sensitivity (a precursor of type 2 diabetes) and increased fat in their abdominal regions, especially around their heart and liver, which is associated with cardiovascular disease (here’s the study itself or a translation from WebMD).
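To put that dose in perspective, a back-of-envelope calculation, assuming a 2,000 kcal/day diet, 4 kcal per gram of sugar, and the ~2.8 g of fructose per teaspoon of agave quoted earlier:

```python
# How much fructose is 25% of daily calories?
# Assumptions: 2,000 kcal/day, 4 kcal/g of sugar, ~2.8 g fructose per tsp of agave.
daily_kcal = 2000
fructose_g = daily_kcal * 0.25 / 4  # grams of fructose per day
tsp_equiv = fructose_g / 2.8        # expressed as teaspoons of agave nectar

print(f"{fructose_g:.0f} g of fructose a day")        # 125 g
print(f"~{tsp_equiv:.0f} teaspoons of agave nectar")  # ~45 tsp
```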

We’ve known for over 50 years that fructose is metabolized differently than glucose. Once it enters the body, it’s taken up by the liver, so it doesn’t raise blood sugar levels as much as glucose. It bypasses the step that insulin regulates, so diabetics can digest fructose about as well as non-diabetics. Which initially sounds good, especially for diabetics; however, more recently, fructose has been shown to have the same effects as alcohol on the liver:

I won't pretend to understand everything that's going on here, except that it illustrates the various processes and feedback mechanisms that cause fatty liver disease. From “Fructose Takes a Toll” in the August 2009 Hepatology (login required)

As a recent article in Physiological Reviews notes:

Fructose was initially thought to be advisable for patients with diabetes due to its low glycemic index. However, chronically high consumption of fructose in rodents leads to hepatic and extrahepatic insulin resistance, obesity, type 2 diabetes mellitus, and high blood pressure. The evidence is less compelling in humans, but high fructose intake has indeed been shown to cause dyslipidemia and to impair hepatic insulin sensitivity.

So while probably harmless in small amounts, it’s certainly not a “healthy” sugar or a free pass to eat sweet things without potential/likely health consequences.

I’ll do a follow-up eventually about the claim that you can use less of it because it’s sweeter. There are lots of conflicting claims about how much sweeter it is and how much less of it you use that I’m still trying to sort out. So far, I’m not at all convinced that the small caloric benefit is a reasonable trade-off for the risks of increased fructose consumption. I’ll also address one final defense: at least in some applications and to some palates, agave nectar may taste better. I admit to being a little skeptical, but a friend has promised to arrange a blind taste-test of mint juleps made with agave nectar, simple syrup, and a 50-50 agave nectar/brown rice syrup blend. That won’t happen until the national day of mint julep drinking, which falls on May 1, 2010 this year. So, until then, I’m going to take a little break from reading and writing about agave nectar.

Don’t Drink the Agave-Sweetened Kool-Aid Part I: “Natural” my foot

UGH the subtitle. I really want Ms. Catalano to show me exactly where in "nature" she gets her agave nectar. Also, I find the use of "ultimate" to mean "exemplary" or "best" instead of "final" or "last" grating, but that's a petty battle against usage change that "Ultimate Frisbee" has clearly already won. Still, I like to think of it as "Frisbee for the End Days."

Just as "wholesome" as any other hydrolyzed, refined sweetener. If you've been snarky about the Corn Refiners' Assn's recent "Sweet Surprise" marketing campaign, but have a bottle that looks like this in your cupboard, I have some delicious all-natural snake oil to sell you, good sir or madam.

This entry was nearly titled “Things That Might Not Kill You In Moderation But Certainly Won’t Make You Any Healthier Vol. I,” or “Hydrolyzed, Refined Sweeteners Masquerading as ‘Natural,’ Whole Foods,” but those seemed a little unwieldy. They do, however, capture the essence of the argument: agave is nutritionally no better than most other refined sweeteners, including high-fructose corn syrup (HFCS). If anything, it’s probably worse because it contains more fructose than table sugar or HFCS. It’s also no more or less “natural” than HFCS—it’s actually produced in a remarkably similar process that was first used on the fibrous pulp of the agave in the 1990s. While, as its proponents claim, the higher proportion of fructose has enabled people to call it a “low glycemic index sweetener,” sometimes alleged to be safer for diabetics and recommended by weight-loss programs like Weight Watchers, recent research suggests that large amounts of fructose aren’t healthy for anyone, diabetic or otherwise.

I mentioned agave nectar in passing in the HFCS post, but there’s enough conflicting information about it to merit its own post(s). A lot of the misinformation comes from agavevangelists, who can sometimes get a little sanctimonious about their avoidance of the demon HFCS and preference for “natural” sweeteners. Even this Vegfamily article that concludes “the physiological effects of all [caloric] sweeteners are similar” nonetheless claims:

Given the choice between sugar, HFCS, and agave nectar, I’ll stick with organically-grown, unbleached cane sugar (evaporated cane juice) and organic raw agave nectar that are free of pesticides, herbicides, and chemical bleaching agents; not genetically engineered; and still retains some nutrients, as well as being vegan. Since HFCS is not available in organic form and is highly processed, I would never use it.

But agave nectar is just as processed as HFCS.

HFCS and Agave Nectar: One of These Things is Not Almost Exactly Like The Other

1910 magazine advertisement from http://goldcountrygirls.blogspot.com/2009/10/then-and-now-49-karo-syrup.html

Like most starches, corn starch consists of large glucose polymers—70-80% the branched, non-water-soluble amylopectin and 20-30% linear, soluble amylose. Normal or non-HFCS corn syrup, like Karo, is produced by breaking those polymers down into their constituent glucose molecules using acids, enzymes, and/or heat. For the history buffs: the acid hydrolysis of starch was first discovered because of the 1806 British blockade of the French West Indies. Napoleon I offered a cash reward for anyone who could come up with a replacement for cane sugar, and a Russian chemist named Konstantin Kirchhof found he could produce a sweet syrup from potato starch by adding sulfuric acid. The same process was first applied to corn in the mid-1860s, and gained popularity in the U.S. during the sugar shortages of WWI (source: The Oxford Encyclopedia of Food and Drink in America).

HFCS is produced by converting the glucose into fructose using an enzyme technology developed in Japan in the 1960s (detailed here). The resulting syrup, which contains up to 90% fructose, is then typically mixed with corn-based glucose syrup to produce HFCS-55 (the kind used in soft drinks, which has 55% fructose/45% glucose) or HFCS-45 (the kind used in baked goods, which has 45% fructose/55% glucose). Some people, like Cynthia commenting on Daily Candor, have suggested that the fructose and glucose in HFCS are absorbed into the bloodstream faster because they’re “free” instead of bound the way they are in the disaccharide sucrose, which is broken into glucose and fructose by the enzyme sucrase. Theoretically plausible, but apparently not true:

Sucrose is hydrolysed by brush-border sucrase into glucose and fructose. The rate of absorption is identical, regardless of whether the sugar is presented to the mucosa as the disaccharide or the component monosaccharides (Gray & Ingelfinger, 1966, cited by H. B. McMichael in “Intestinal absorption of carbohydrates in man”).

I'm going to start referring to packaging like this as granola-washing

Just like HFCS, agave nectar is produced by breaking down a plant-based polymer into its constituent sugars. In the case of agave, the relevant molecule is inulin, a fiber composed mostly of fructose units with a terminal glucose. Just like with corn and potato starch, there are different methods of hydrolyzing the sugars in inulin. Blue Agave Nectar uses a thermic process. Madhava uses an enzyme process, just like HFCS.

Agavevangelists like to claim that agave nectar is a traditional sweetener used by native peoples, which appeals to the popular notion that the foodways of the past were generally healthier (e.g. Michael Pollan’s advice not to eat anything your great-grandmother wouldn’t recognize as food). Some, like Lynn Stephens of Shake Off the Sugar, merely note that the agave plant itself “has long been cultivated in hilly, semi-arid soils of Mexico.” That’s true, although it’s about as relevant as the long history of corn cultivation. Others claim that agave nectar itself has an ancient history. Flickr user Health Guy says of agave nectar: “It is 1-1/4 times sweeter than sugar, so you need less, and it has been consumed by ancient civilizations for over 5,000 years.”

Wrong. According to the website for Madhava Honey:

Agave nectar is a newly created sweetener, having been developed during the 1990’s. Originally, the blue agave variety was used. This is the same plant used in the manufacture of tequila. During the late 90’s, a shortage of blue agave resulted in huge increases in cost and a sweetener based on this plant became uneconomical. Further research was done and a method using wild agave was developed. Overcoming the language barrier between the Indians able to supply the nectar from the wild agave on their land and the Spanish speaking local manufacturer was the key that finally unlocked a supply of raw material and has led to our bringing this wonderful new product to market.

Still doing some native-washing (wild agave harvested by Indians who don’t speak Spanish—can’t you just feel the virtue?), but here’s what happens to the agave sap after harvesting, as described in the abstract of the 1998 patent issued for the production of fructose syrup from the agave plant:

A pulp of milled agave plant heads are liquified during centrifugation and a polyfructose solution is removed and then concentrated to produce a polyfructose concentrate. Small particulates are removed by centrifugation and/or filtration and colloids are removed using termic coagulation techniques to produce a partially purified polyfructose extract substantially free of suspended solids. The polyfructose extract is treated with activated charcoal and cationic and anionic resins to produce a demineralized, partially hydrolyzed polyfructose extract. This partially hydrolyzed polyfructose extract is then hydrolyzed with inulin enzymes to produce a hydrolyzed fructose extract. Concentration of the fructose extract yields a fructose syrup. (via Patentstorm)

Probably the healthiest sweetener pictured here and the one most shoppers in the market for a "natural sweetener" would be least likely to purchase

It’s true that the corn used in HFCS is less likely than agave to be organically-grown, but you can get organic-certified corn syrup from the same manufacturer as the blue agave nectar pictured above, and nutritionally, the main difference between that, the HFCS used in most processed foods, and agave nectar is the ratio of glucose to fructose. The regular corn syrup is 100% glucose, HFCS-55 is 55/45 fructose/glucose, and agave nectar is 56-90% fructose, depending on the plant and the process.
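To make those ratios concrete, here’s the fructose load per 10 g of total sugars, using the percentages above. Treat these as ballpark figures, since agave varies by plant and process:

```python
# Grams of fructose per 10 g of total sugars, using the ratios discussed above.
sweeteners = {
    "corn syrup (Karo)": 0.00,
    "sucrose": 0.50,
    "HFCS-55": 0.55,
    "agave nectar": (0.56, 0.90),  # varies by plant and process
}

for name, fraction in sweeteners.items():
    if isinstance(fraction, tuple):
        lo, hi = (10 * f for f in fraction)
        print(f"{name:18} {lo:.1f}-{hi:.1f} g fructose")
    else:
        print(f"{name:18} {10 * fraction:.1f} g fructose")
```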

I’ve already talked a little about fructose vs. glucose here and here, but more coming soon in Agave-rant Part II concerning:

1) whether the fructose in agave is somehow better than, or indeed, different in any way from the fructose in HFCS

2) whether the fact that it’s sweeter than sugar makes it a lower-calorie alternative to sugar

3) whether its “low glycemic index” rating makes it less likely to produce insulin resistance than table sugar and

4) whether it’s safer for diabetics

All of which people have claimed. I won’t keep you in suspense, especially given how long it may take me to put all of that together. The short answers are:

1) not in any nutritionally meaningful way

2) perhaps very slightly, but a <10 calorie/serving difference (see the rough arithmetic after this list) likely doesn’t make up for the increased risk of fatty liver disease and insulin resistance

3) no, it’s actually more likely to produce insulin resistance and

4) in minuscule amounts, perhaps, but recent trials involving diabetics and agave nectar were halted because of severe side effects.
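The rough arithmetic behind answer 2, using the figures agave sellers themselves tend to cite (roughly 1.4x the sweetness of sucrose at about 20 kcal per teaspoon, versus 16 kcal for a teaspoon of sugar); both numbers are contested, which is exactly what the follow-up will have to sort out:

```python
# Rough check on the "use less because it's sweeter" claim.
# Contested assumptions: agave ~20 kcal/tsp and ~1.4x as sweet as sucrose;
# table sugar ~16 kcal/tsp.
sugar_kcal_per_tsp = 16
agave_kcal_per_tsp = 20
sweetness_factor = 1.4

agave_tsp = 1 / sweetness_factor  # agave needed to match 1 tsp of sugar
agave_kcal = agave_tsp * agave_kcal_per_tsp

print(f"{agave_kcal:.1f} kcal of agave vs. {sugar_kcal_per_tsp} kcal of sugar")
# ~14.3 vs. 16 kcal: a saving of under 2 kcal per teaspoon of sweetness
```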

The Sweet Science of Artichokes

i wanted a picture of artichokes boxing, but this'll have to do. image from http://miscellainey.blogspot.com/2007_08_01_archive.html 

At least you’ll never be a vegetable—even artichokes have hearts. –Amelie

I suspect that one of the reasons artichokes show up in appetizers so often, especially in the sugar-loving U.S., is that they make everything you eat or drink for a little while afterwards, including water, taste slightly sweet. It’s not quite the simple, straightforward sweetness of sucrose, which I’m not sure would be an especially desirable effect no matter how much you like sweet things. Instead, it’s more of a sweet-savory enhancement, perhaps even a little bit umami.

I cropped the chart description for length, but will happily send it to anyone who's really interested

According to a 1972 article in Science, the first written account of artichokes’ capacity for taste perversion followed a dinner for biologists at the 1934 AAAS conference. The salad course consisted of globe artichokes, and someone must have taken a survey—of the nearly 250 biologists in attendance, 60% reported that after eating the artichoke, water tasted different, a difference most of them described as “sweet” but a small number said was “bitter.”

The Science article reports on the results of an experiment that showed that artichoke extract modifies the taste of water by temporarily affecting the tongue rather than the food or drink (which makes it different from saccharin, which can make water taste sort of sweet and/or bitter as residue on the tongue is re-diluted). They also isolated two molecules found in artichokes—chlorogenic acid and cynarin—and found that both, independently, had an effect on the perceived sweetness of water similar to that of adding 2 tsp. sugar to 6 oz. water.
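For scale, that’s a fairly dilute sugar water. Assuming ~4.2 g of sugar per teaspoon and ~29.6 mL per fluid ounce (my conversions, not the article’s):

```python
# How sweet is "2 tsp sugar in 6 oz water"?
# Assumptions: ~4.2 g of sucrose per tsp, ~29.6 mL per fluid ounce.
sugar_g = 2 * 4.2
water_ml = 6 * 29.6

concentration = sugar_g / water_ml
print(f"{concentration:.1%} w/v sugar")  # ~4.7%, about half soda-strength
```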

However, a less formal acknowledgment of the strange effects of the artichoke exists in the ancient folk wisdom that artichokes are “impossible” to pair with wine. An article in Wine News Magazine claims to “dispel” the “antiquated myth” of impossible pairings, but many of the suggestions purport to work by minimizing the presence or effect of the cynarin, either by boiling the artichoke in "ample water” or serving it with acids like lemon and/or mayonnaise. Leaving aside for the moment the question of whether either technique actually does anything to the cynarin and/or chlorogenic acid, I’m not sure that eliminating the chemical basis for the unique taste of the artichoke passes muster as a successful “pairing.” Essentially what they’ve done there is pair the wine with a less-artichokey version of the artichoke.

The Science article notes that the effects of cynarin and chlorogenic acid last longer than the sweet taste of sugar or saccharin, but are weaker and shorter-lived than that of miraculin, the protein in “miracle fruit.” Miraculin works by adhering to sweet receptors on the tongue and acids in food, which makes the acids activate the sweet receptors. I tried that with a bunch of friends shortly after The New York Times reported on it, and it really is trippy—lemons taste like candy, goat cheese tastes like cheesecake, and we all got stomachaches from eating so much acidic food in such a short period of time.

However, the protein miraculin seems to affect a much larger percentage of the population than the acids in artichoke. Just like at the AAAS dinner, a large number of the 1972 experiment’s participants didn’t experience a sweet taste after consuming artichoke extract. And again, a very small number actually said that the artichokes made water taste bitter. So it seems like cynarin/chlorogenic acid must have a different kind of mechanism, one that works for a majority of the population but exempts a substantial minority. Sadly, I can’t for the life of me figure out what it is. Does it inhibit bitter receptors? Attach temporarily to a certain kind of sweet receptor not everyone has? It seems to make white wines taste more sour, so perhaps it inhibits the tongue from registering the sugars in the wine? I don’t know, and I have searched. If you know, please share.

Anyhow, back to the question of what might alter or inhibit the cynarin and/or cholorogenic acid. In a post on "Transcription and Translation" also largely based on that 1972 Science article, biochemist Alex Palazzo claims that “pickled artichoke hearts don’t have this property.” I’m not entirely convinced, although this might be an issue of semantics. I won’t dispute that the sweetish aftertaste of canned or jarred artichokes seems muted in comparison with fresh artichokes, but I swear that even in that ubiquitous creamy, spinach-filled dip, or as a pizza topping, or in salads, or when added to paella, artichokes preserved in brine do contribute a subtly-sweet taste that affects the entire dish and any accompanying beverages. However, again based on my own subjective tastes and personal experience, marinated artichokes have little or no sweet aftertaste.

The difference seems to be that marinades, by definition, contain acid whereas brines typically do not—brines are just salty solutions. Now, pickling can imply either. Traditional pickling methods involve fermenting foods in brine, with no added acid. Their sourness is a product of the acids produced during fermentation. The more common form of pickling today begins with a solution that has added acids, usually vinegar. If Palazzo was referring only to the latter method—which would be artichokes labeled “marinated”—I agree with him. That also makes sense with the chefs’ suggestions to add acids in order to make artichokes play nice with wine; added acids must interfere with the cynarin and/or chlorogenic acid in the artichoke. But salt doesn’t seem to. Artichokes sold canned or jarred in brine (also technically “pickled”) still make food taste sweet.

Tomorrow, as this is apparently becoming artichoke week, I’ll post a super-easy recipe you can try to test the effects of artichokes in brine for yourself.

[Edit: Comments closed due to spam, but I welcome feedback. Feel free to e-mail me (see “contact” tab).]