Kantos Kan led me to one of these gorgeous eating places where we were served entirely by mechanical apparatus. No hand touched the food from the time it entered the building in its raw state until it emerged hot and delicious upon the tables before the guests, in response to the touching of tiny buttons to indicate their desires.—Edgar Rice Burroughs, “A Princess of Mars” (1912)
By now, robots who can cook are nothing new. Most of them are basically one-trick ponies (at least culinarily): a Swiss robot that was taught to make omelets to demonstrate its abilities, Japanese robots that can grill okonomiyaki or make octopus balls from scratch. There’s even a restaurant called Famen in Nagoya staffed by two robots who act out a comic routine and spar with knives in between preparing bowls of ramen. However, the cooking robot recently introduced by two Chinese universities that’s making the rounds online this month comes closer to the fantasy in the Burroughs story of something that can produce a huge variety of foods on demand, almost like the replicators on Star Trek. This new cooking robot can make 300 different dishes based on the offerings of four top chefs in Jiangsu Province and may soon be able to produce up to 600.
What strikes me about the media coverage of cooking robots is the paradox that, on the one hand, the fact that they can do something so essentially human is a substantial part of the delight they inspire. Their food-related activities are often designed to soften people’s resistance to robots—for example, researchers at Carnegie Mellon developed the Snackbot that they introduced to a reporter for the New York Times last month to “gather information on how robots interact with people (and how to improve homo-robo relations).” But on the other hand, the essential humanness of cooking can also make the robots especially unnerving. In fact, the more human they seem, the more they seem to bother people. The Engadget article on the sushi-grabbing hand, “Chef Robot makes its video debut, nightmares forthcoming,” seems mostly disturbed by how “realistic” the hand looks:
In case you missed it, the robot itself is actually just a standard issue FANUC M-430iA robot arm with a way too realistic hand attached to it, which apparently not only helps it prepare sushi, but some tasty desserts as well. Head on past the break for the must-see video, you’ve nothing to lose but your ability to unsee it.
Though usually slightly less dramatic, most other articles I’ve seen about cooking robots end with some sort of joke or disclaimer, usually one that reflects anxieties about the threat that cooking robots pose to the boundary between human and machine.
If this thing ever gets imported to the U.S., it would need to make fortune cookies too. But what would a robot fortune say?—CNet (on the 300-dish Chinese cook)
More than 200 diners have enjoyed the machine’s cuisine thus far, and reportedly taste testers have found the food to be on par with a traditional restaurant kitchen, flavor-wise. (No mention has been made of the robot’s plating abilities.)—CNet (on a prototype developed by a retired professor using an induction burner and robotic arm)
While it lacks the personal touch and the ability to hold some small banter with regular guests, at least you can be sure the fingers have not gone around digging noses or scratching butts.—Ubergizmo (on the sushi hand)
No matter how skilled Motoman is, I doubt real chefs like Anthony Bourdain or Mario Batali would be caught dead cooking next to him.—Robot Living (referring to Chef Motoman, who was designed to work alongside humans in a restaurant environment)
All of them betray a seemingly irrepressible impulse to name something robots can’t infringe on, like speculating about the future or making the kind of aesthetic and creative decisions that go into plating, or to find some other way to distinguish them from human chefs: the ability to banter or pick their nose or smoke and hate on vegans or compete in elaborate cooking competitions. Even the NYTimes article, which focuses mostly on how food “humanizes” robots, ends by erecting a wall based on the ability to taste:
The real obstacle to a world full of mechanized sous-chefs and simulated rage-filled robo-Gordon Ramsays may be something much harder to fake: none of these robots can taste.
Keizo Shimamoto, who writes a blog on ramen noodles and has eaten at Famen, the two-robot Japanese restaurant, said that the establishment was “kind of dead” when he ate there last year. Though the owner said that people do taste the food, according to Mr. Shimamoto, “It was a little disappointing.” It’s one thing to get people to stop by to see the robots. “But to keep the customers coming back,” he said, “you need better soup.”
And while it’s true that none of the robots mentioned in the article can taste, that doesn’t mean there aren’t other robots that can.
What If Chef Motoman Had a Nose?
Researchers have developed mass spectrometers that can determine the ripeness of tomatoes and melons and describe the nuances in different samples of espresso—which ones are more or less floral, citrusy, which have hints of buttery toffee or a woody undertone, etc. Some electronic noses, as the e-sensing systems are often called, are so sensitive they can pinpoint not only the grape varietal and region where a wine was produced, but what barrel it was fermented in. As I’ve discussed before, tastes are largely produced by how substances react with our ~40 taste receptors and ~400 olfactory receptors. Every unique flavor/odor combination is like a “fingerprint,” and e-sensing systems are far, far better at identifying and classifying those fingerprints than humans.
In Mindless Eating: Why We Eat More Than We Think, Brian Wansink mentions a 2004 study performed at the Cornell University Food and Brand Lab where 32 participants were invited to taste what they were told was strawberry yogurt in the dark. They were actually given chocolate yogurt, but nineteen of them still rated it as having “good strawberry flavor.” The yogurt-tasters weren’t food critics or trained chefs, but even “experts” are dramatically influenced by contextual cues. Frederic Brochet has run multiple experiments with wine experts in the Bordeaux region of France, where many of the world’s most expensive wines are produced. In one experiment, he had 54 experts taste white wines that had been dyed red with a flavorless additive and in another he served 57 experts the same red wine in two different bottles alternately identifying it as a high-prestige wine and lowly table wine. In the first experiment, none of the experts detected the white wine flavor, and many of them praised it for qualities typically associated with red wines like “jamminess” or “red fruit.” In the second, 40 of the experts rated the wine good when they thought it was an expensive Grand Cru and only 12 did when they thought it was a cheap blend (read more: “The Subjectivity of Wine” by Jonah Lehrer).
It’s true that robots can’t make independent subjective judgments about tastes and odors. The ramen robots might be able to customize your ramen based on variables like the proportion of noodles to broth and different kinds of toppings, but they can’t simply make a “better” soup. However, if outfitted with an electronic nose and programmed to recognize and replicate the fingerprint of a really fantastic tonkotsu broth, there’s no reason to believe they wouldn’t be able to make the best ramen you’ve ever tasted—and likely with greater consistency than a human chef (assuming they had access to the necessary ingredients). There are other factors, too, like rolling and cooking the noodles just so to give them a toothsome bite, but again, noodle recipes and some way of evaluating their texture could be programmed into a robot chef’s computer brain. In other words, the only reason robots can’t taste is because we haven’t designed them to yet.
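The “fingerprint” idea can be sketched in a few lines: treat each e-nose reading as a vector of sensor responses and score a batch of broth by how closely it matches a stored target profile. (This is only an illustration, not how any actual e-nose is programmed; the sensor values, target profile, and similarity threshold below are all invented.)

```python
import math

# Hypothetical e-nose readings: each "fingerprint" is a vector of
# normalized sensor responses, one value per chemical sensor.
TARGET_TONKOTSU = [0.82, 0.11, 0.45, 0.67, 0.05]  # profile of a fantastic broth

def cosine_similarity(a, b):
    """Angle-based match between two fingerprints; 1.0 means identical profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def matches_target(sample, target=TARGET_TONKOTSU, threshold=0.98):
    """Decide whether this batch is close enough to the stored profile."""
    return cosine_similarity(sample, target) >= threshold

batch = [0.80, 0.12, 0.44, 0.66, 0.06]
print(matches_target(batch))  # a near-identical batch passes the check
```

A real robot chef would close the loop: if a batch falls short of the profile, adjust the variables it controls (simmer time, ingredient ratios) and re-measure.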
Even though robots can’t innovate criteria, they can be trained to make subjective judgments—Science just reported yesterday that researchers in Israel have trained an electronic nose to predict whether novel smells are good or bad. They exposed it to 76 odors that were rated by human volunteers (both Israeli and Ethiopian to account for cultural differences, which unsurprisingly turn out to be pretty minor) on a scale from “the best odor you have ever smelled” to “the worst odor you have ever smelled.” Then they exposed the nose to 22 new odors and compared its predictions to the ratings of a new group of volunteers. The electronic nose agreed with the humans on the relative pleasantness of the odor 80% of the time. In another trial using only extreme odors—ones that had been rated most pleasant or unpleasant—it agreed with the humans 90% of the time. (Here’s the original study)
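The study’s basic design—learn from human-rated odors, then predict ratings for odors the system has never smelled—can be mimicked with a toy nearest-neighbor model. (The fingerprints and ratings below are invented for illustration; the actual study used a commercial e-nose and a more sophisticated model.)

```python
# Toy version of the e-nose experiment: predict the pleasantness of a
# novel odor from its similarity to odors humans have already rated.
# All fingerprints and ratings here are invented for illustration.

TRAINING_ODORS = [
    # (sensor fingerprint, mean human rating: 0 = worst .. 10 = best)
    ([0.9, 0.1, 0.2], 8.5),  # e.g., something floral
    ([0.8, 0.2, 0.3], 7.9),
    ([0.1, 0.9, 0.8], 1.2),  # e.g., something sulfurous
    ([0.2, 0.8, 0.9], 2.0),
]

def distance(a, b):
    """Euclidean distance between two sensor fingerprints."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict_pleasantness(novel, k=2):
    """Average the human ratings of the k most similar training odors."""
    nearest = sorted(TRAINING_ODORS, key=lambda pair: distance(pair[0], novel))[:k]
    return sum(rating for _, rating in nearest) / k

print(predict_pleasantness([0.85, 0.15, 0.25]))  # near the floral cluster: high
```

The point of the sketch is just that “pleasantness” becomes a learnable function of the fingerprint, which is exactly what makes the 80–90% agreement in the study plausible.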
So, theoretically, it might be possible not only to program a robot to make foods that match a “fingerprint” that’s widely rated “delicious” but also to predict what kinds of foods are likely to taste good or bad. And perhaps the next step would be to ask it to innovate combinations that are likely to taste especially delicious.
However, given that the way we taste often has more to do with expectations and presentation than the chemical properties of the food, the discomfort inspired by cooking robots may be more of a barrier than the technology itself. If sushi that has merely been transferred from tray to plate by a robot hand is nightmare-inducing, we’re probably a long way—culturally, if not technologically—from Robot Cuisine.