The Omnivore's Dilemma

A Natural History of Four Meals

"Outstanding . . . a wide-ranging invitation to think through the moral ramifications of our eating habits." —The New Yorker

One of the New York Times Book Review's Ten Best Books of the Year and Winner of the James Beard Award

Author of This Is Your Mind on Plants, How to Change Your Mind, and the #1 New York Times bestsellers In Defense of Food and Food Rules


What should we have for dinner? Ten years ago, Michael Pollan confronted us with this seemingly simple question and, with The Omnivore’s Dilemma, his brilliant and eye-opening exploration of our food choices, demonstrated that how we answer it today may determine not only our health but our survival as a species. In the years since, Pollan’s revolutionary examination has changed the way Americans think about food. Bringing wide attention to the little-known but vitally important dimensions of food and agriculture in America, Pollan launched a national conversation about what we eat and the profound consequences that even the simplest everyday food choices have on both ourselves and the natural world. Ten years later, The Omnivore’s Dilemma continues to transform the way Americans think about the politics, perils, and pleasures of eating.
Introduction: Our National Eating Disorder

I. Industrial: Corn

One. The Plant: Corn's Conquest
Two. The Farm
Three. The Elevator
Four. The Feedlot: Making Meat
Five. The Processing Plant: Making Complex Foods
Six. The Consumer: A Republic of Fat
Seven. The Meal: Fast Food

II. Pastoral: Grass

Eight. All Flesh Is Grass
Nine. Big Organic
Ten. Grass: Thirteen Ways of Looking at a Pasture
Eleven. The Animals: Practicing Complexity
Twelve. Slaughter: In a Glass Abattoir
Thirteen. The Market: "Greetings from the Non-Barcode People"
Fourteen. The Meal: Grass Fed

III. Personal: The Forest

Fifteen. The Forager
Sixteen. The Omnivore's Dilemma
Seventeen. The Ethics of Eating Animals
Eighteen. Hunting: The Meat
Nineteen. Gathering: The Fungi
Twenty. The Perfect Meal

Acknowledgments
Sources
Index

INTRODUCTION

Our National Eating Disorder 

What should we have for dinner?

This book is a long and fairly involved answer to this seemingly simple question. Along the way, it also tries to figure out how such a simple question could ever have gotten so complicated. As a culture we seem to have arrived at a place where whatever native wisdom we may once have possessed about eating has been replaced by confusion and anxiety. Somehow this most elemental of activities—figuring out what to eat—has come to require a remarkable amount of expert help. How did we ever get to a point where we need investigative journalists to tell us where our food comes from and nutritionists to determine the dinner menu?

For me the absurdity of the situation became inescapable in the fall of 2002, when one of the most ancient and venerable staples of human life abruptly disappeared from the American dinner table. I’m talking of course about bread. Virtually overnight, Americans changed the way they eat. A collective spasm of what can only be described as carbophobia seized the country, supplanting an era of national lipophobia dating to the Carter administration. That was when, in 1977, a Senate committee had issued a set of “dietary goals” warning beef-loving Americans to lay off the red meat. And so we dutifully had done, until now.

What set off the sea change? It appears to have been a perfect media storm of diet books, scientific studies, and one timely magazine article. The new diet books, many of them inspired by the formerly discredited Dr. Robert C. Atkins, brought Americans the welcome news that they could eat more meat and lose weight just so long as they laid off the bread and pasta. These high-protein, low-carb diets found support in a handful of new epidemiological studies suggesting that the nutritional orthodoxy that had held sway in America since the 1970s might be wrong. It was not, as official opinion claimed, fat that made us fat, but the carbohydrates we’d been eating precisely in order to stay slim. So conditions were ripe for a swing of the dietary pendulum when, in the summer of 2002, the New York Times Magazine published a cover story on the new research entitled “What if Fat Doesn’t Make You Fat?” Within months, supermarket shelves were restocked and restaurant menus rewritten to reflect the new nutritional wisdom. The blamelessness of steak restored, two of the most wholesome and uncontroversial foods known to man—bread and pasta—acquired a moral stain that promptly bankrupted dozens of bakeries and noodle firms and ruined an untold number of perfectly good meals.

So violent a change in a culture’s eating habits is surely the sign of a national eating disorder. Certainly it would never have happened in a culture in possession of deeply rooted traditions surrounding food and eating. But then, such a culture would not feel the need for its most august legislative body to ever deliberate the nation’s “dietary goals”—or, for that matter, to wage political battle every few years over the precise design of an official government graphic called the “food pyramid.” A country with a stable culture of food would not shell out millions for the quackery (or common sense) of a new diet book every January. It would not be susceptible to the pendulum swings of food scares or fads, to the apotheosis every few years of one newly discovered nutrient and the demonization of another. It would not be apt to confuse protein bars and food supplements with meals or breakfast cereals with medicines. It probably would not eat a fifth of its meals in cars or feed fully a third of its children at a fast-food outlet every day. And it surely would not be nearly so fat.

Nor would such a culture be shocked to discover that there are other countries, such as Italy and France, that decide their dinner questions on the basis of such quaint and unscientific criteria as pleasure and tradition, eat all manner of “unhealthy” foods, and, lo and behold, wind up actually healthier and happier in their eating than we are. We show our surprise at this by speaking of something called the “French paradox,” for how could a people who eat such demonstrably toxic substances as foie gras and triple crème cheese actually be slimmer and healthier than we are? Yet I wonder if it doesn’t make more sense to speak in terms of an American paradox—that is, a notably unhealthy people obsessed by the idea of eating healthily.

 

TO ONE DEGREE or another, the question of what to have for dinner assails every omnivore, and always has. When you can eat just about anything nature has to offer, deciding what you should eat will inevitably stir anxiety, especially when some of the potential foods on offer are liable to sicken or kill you. This is the omnivore’s dilemma, noted long ago by writers like Rousseau and Brillat-Savarin and first given that name thirty years ago by a University of Pennsylvania research psychologist named Paul Rozin. I’ve borrowed his phrase for the title of this book because the omnivore’s dilemma turns out to be a particularly sharp tool for understanding our present predicaments surrounding food.

In a 1976 paper called “The Selection of Foods by Rats, Humans, and Other Animals” Rozin contrasted the omnivore’s existential situation with that of the specialized eater, for whom the dinner question could not be simpler. The koala doesn’t worry about what to eat: If it looks and smells and tastes like a eucalyptus leaf, it must be dinner. The koala’s culinary preferences are hardwired in its genes. But for omnivores like us (and the rat) a vast amount of brain space and time must be devoted to figuring out which of all the many potential dishes nature lays on are safe to eat. We rely on our prodigious powers of recognition and memory to guide us away from poisons (Isn’t that the mushroom that made me sick last week?) and toward nutritious plants (The red berries are the juicier, sweeter ones). Our taste buds help too, predisposing us toward sweetness, which signals carbohydrate energy in nature, and away from bitterness, which is how many of the toxic alkaloids produced by plants taste. Our inborn sense of disgust keeps us from ingesting things that might infect us, such as rotten meat. Many anthropologists believe that the reason we evolved such big and intricate brains was precisely to help us deal with the omnivore’s dilemma.

Being a generalist is of course a great boon as well as a challenge; it is what allows humans to successfully inhabit virtually every terrestrial environment on the planet. Omnivory offers the pleasures of variety, too. But the surfeit of choice brings with it a lot of stress and leads to a kind of Manichaean view of food, a division of nature into The Good Things to Eat, and The Bad.

The rat must make this all-important distinction more or less on its own, each individual figuring out for itself—and then remembering—which things will nourish and which will poison. The human omnivore has, in addition to his senses and memory, the incalculable advantage of a culture, which stores the experience and accumulated wisdom of countless human tasters before him. I don’t need to experiment with the mushroom now called, rather helpfully, the “death cap,” and it is common knowledge that that first intrepid lobster eater was on to something very good. Our culture codifies the rules of wise eating in an elaborate structure of taboos, rituals, recipes, manners, and culinary traditions that keep us from having to reenact the omnivore’s dilemma at every meal.

One way to think about America’s national eating disorder is as the return, with an almost atavistic vengeance, of the omnivore’s dilemma. The cornucopia of the American supermarket has thrown us back on a bewildering food landscape where we once again have to worry that some of those tasty-looking morsels might kill us. (Perhaps not as quickly as a poisonous mushroom, but just as surely.) Certainly the extraordinary abundance of food in America complicates the whole problem of choice. At the same time, many of the tools with which people historically managed the omnivore’s dilemma have lost their sharpness here—or simply failed. As a relatively new nation drawn from many different immigrant populations, each with its own culture of food, Americans have never had a single, strong, stable culinary tradition to guide us.

The lack of a steadying culture of food leaves us especially vulnerable to the blandishments of the food scientist and the marketer, for whom the omnivore’s dilemma is not so much a dilemma as an opportunity. It is very much in the interest of the food industry to exacerbate our anxieties about what to eat, the better to then assuage them with new products. Our bewilderment in the supermarket is no accident; the return of the omnivore’s dilemma has deep roots in the modern food industry, roots that, I found, reach all the way back to fields of corn growing in places like Iowa.

And so we find ourselves where we do, confronting in the supermarket or at the dinner table the dilemmas of omnivorousness, some of them ancient and others never before imagined. The organic apple or the conventional? And if the organic, the local one or the imported? The wild fish or the farmed? The trans fats or the butter or the “not butter”? Shall I be a carnivore or a vegetarian? And if a vegetarian, a lacto-vegetarian or a vegan? Like the hunter-gatherer picking a novel mushroom off the forest floor and consulting his sense memory to determine its edibility, we pick up the package in the supermarket and, no longer so confident of our senses, scrutinize the label, scratching our heads over the meaning of phrases like “heart healthy,” “no trans fats,” “cage-free,” or “range-fed.” What is “natural grill flavor” or TBHQ or xanthan gum? What is all this stuff, anyway, and where in the world did it come from?

 

MY WAGER in writing The Omnivore’s Dilemma was that the best way to answer the questions we face about what to eat was to go back to the very beginning, to follow the food chains that sustain us, all the way from the earth to the plate—to a small number of actual meals. I wanted to look at the getting and eating of food at its most fundamental, which is to say, as a transaction between species in nature, eaters and eaten. (“The whole of nature,” wrote the English author William Ralph Inge, “is a conjugation of the verb to eat, in the active and passive.”) What I try to do in this book is approach the dinner question as a naturalist might, using the long lenses of ecology and anthropology, as well as the shorter, more intimate lens of personal experience.

My premise is that like every other creature on earth, humans take part in a food chain, and our place in that food chain, or web, determines to a considerable extent what kind of creature we are. The fact of our omnivorousness has done much to shape our nature, both body (we possess the omnicompetent teeth and jaws of the omnivore, equally well suited to tearing meat and grinding seeds) and soul. Our prodigious powers of observation and memory, as well as our curious and experimental stance toward the natural world, owe much to the biological fact of omnivorousness. So do the various adaptations we’ve evolved to defeat the defenses of other creatures so that we might eat them, including our skills at hunting and cooking with fire. Some philosophers have argued that the very open-endedness of human appetite is responsible for both our savagery and civility, since a creature that could conceive of eating anything (including, notably, other humans) stands in particular need of ethical rules, manners, and rituals. We are not only what we eat, but how we eat, too.

Yet we are also different from most of nature’s other eaters—markedly so. For one thing, we’ve acquired the ability to substantially modify the food chains we depend on, by means of such revolutionary technologies as cooking with fire, hunting with tools, farming, and food preservation. Cooking opened up whole new vistas of edibility by rendering various plants and animals more digestible, and overcoming many of the chemical defenses other species deploy against being eaten. Agriculture allowed us to vastly multiply the populations of a few favored food species, and therefore in turn our own. And, most recently, industry has allowed us to reinvent the human food chain, from the synthetic fertility of the soil to the microwaveable can of soup designed to fit into a car’s cup holder. The implications of this last revolution, for our health and the health of the natural world, we are still struggling to grasp.

The Omnivore’s Dilemma is about the three principal food chains that sustain us today: the industrial, the organic, and the hunter-gatherer. Different as they are, all three food chains are systems for doing more or less the same thing: linking us, through what we eat, to the fertility of the earth and the energy of the sun. It might be hard to see how, but even a Twinkie does this—constitutes an engagement with the natural world. As ecology teaches, and this book tries to show, it’s all connected, even the Twinkie.

Ecology also teaches that all life on earth can be viewed as a competition among species for the solar energy captured by green plants and stored in the form of complex carbon molecules. A food chain is a system for passing those calories on to species that lack the plant’s unique ability to synthesize them from sunlight. One of the themes of this book is that the industrial revolution of the food chain, dating to the close of World War II, has actually changed the fundamental rules of this game. Industrial agriculture has supplanted a complete reliance on the sun for our calories with something new under the sun: a food chain that draws much of its energy from fossil fuels instead. (Of course, even that energy originally came from the sun, but unlike sunlight it is finite and irreplaceable.) The result of this innovation has been a vast increase in the amount of food energy available to our species; this has been a boon to humanity (allowing us to multiply our numbers), but not an unalloyed one. We’ve discovered that an abundance of food does not render the omnivore’s dilemma obsolete. To the contrary, abundance seems only to deepen it, giving us all sorts of new problems and things to worry about.

Each of this book’s three parts follows one of the principal human food chains from beginning to end: from a plant, or group of plants, photosynthesizing calories in the sun, all the way to a meal at the dinner end of that food chain. Reversing the chronological order, I start with the industrial food chain, since that is the one that today involves and concerns us the most. It is also by far the biggest and longest. Since monoculture is the hallmark of the industrial food chain, this section focuses on a single plant: Zea mays, the giant tropical grass we call corn, which has become the keystone species of the industrial food chain, and so in turn of the modern diet. This section follows a bushel of commodity corn from the field in Iowa where it grew on its long, strange journey to its ultimate destination in a fast-food meal, eaten in a moving car on a highway in Marin County, California.

The book’s second part follows what I call—to distinguish it from the industrial—the pastoral food chain. This section explores some of the alternatives to industrial food and farming that have sprung up in recent years (variously called “organic,” “local,” “biological,” and “beyond organic”), food chains that might appear to be preindustrial but in surprising ways turn out in fact to be postindustrial. I set out thinking I could follow one such food chain, from a radically innovative farm in Virginia that I worked on one recent summer to an extremely local meal prepared from animals raised on its pastures. But I promptly discovered that no single farm or meal could do justice to the complex, branching story of alternative agriculture right now, and that I needed also to reckon with the food chain I call, oxymoronically, the “industrial organic.” So the book’s pastoral section serves up the natural history of two very different “organic” meals: one whose ingredients came from my local Whole Foods supermarket (gathered there from as far away as Argentina), and the other tracing its origins to a single polyculture of grasses growing at Polyface Farm in Swoope, Virginia.

The last section, titled Personal, follows a kind of neo-Paleolithic food chain from the forests of Northern California to a meal I prepared (almost) exclusively from ingredients I hunted, gathered, and grew myself. Though we twenty-first-century eaters still eat a handful of hunted and gathered food (notably fish and wild mushrooms), my interest in this food chain was less practical than philosophical: I hoped to shed fresh light on the way we eat now by immersing myself in the way we ate then. In order to make this meal I had to learn how to do some unfamiliar things, including hunting game and foraging for wild mushrooms and urban tree fruit. In doing so I was forced to confront some of the most elemental questions—and dilemmas—faced by the human omnivore: What are the moral and psychological implications of killing, preparing, and eating a wild animal? How does one distinguish between the delicious and the deadly when foraging in the woods? How do the alchemies of the kitchen transform the raw stuffs of nature into some of the great delights of human culture?

The end result of this adventure was what I came to think of as the Perfect Meal, not because it turned out so well (though in my humble opinion it did), but because this labor- and thought-intensive dinner, enjoyed in the company of fellow foragers, gave me the opportunity, so rare in modern life, to eat in full consciousness of everything involved in feeding myself: For once, I was able to pay the full karmic price of a meal.

Yet as different as these three journeys (and four meals) turned out to be, a few themes kept cropping up. One is that there exists a fundamental tension between the logic of nature and the logic of human industry, at least as it is presently organized. Our ingenuity in feeding ourselves is prodigious, but at various points our technologies come into conflict with nature’s ways of doing things, as when we seek to maximize efficiency by planting crops or raising animals in vast monocultures. This is something nature never does, always and for good reasons practicing diversity instead. A great many of the health and environmental problems created by our food system owe to our attempts to oversimplify nature’s complexities, at both the growing and the eating ends of our food chain. At either end of any food chain you find a biological system—a patch of soil, a human body—and the health of one is connected—literally—to the health of the other. Many of the problems of health and nutrition we face today trace back to things that happen on the farm, and behind those things stand specific government policies few of us know anything about.

I don’t mean to suggest that human food chains have only recently come into conflict with the logic of biology; early agriculture and, long before that, human hunting proved enormously destructive. Indeed, we might never have needed agriculture had earlier generations of hunters not eliminated the species they depended upon. Folly in the getting of our food is nothing new. And yet the new follies we are perpetrating in our industrial food chain today are of a different order. By replacing solar energy with fossil fuel, by raising millions of food animals in close confinement, by feeding those animals foods they never evolved to eat, and by feeding ourselves foods far more novel than we even realize, we are taking risks with our health and the health of the natural world that are unprecedented.

Another theme, or premise really, is that the way we eat represents our most profound engagement with the natural world. Daily, our eating turns nature into culture, transforming the body of the world into our bodies and minds. Agriculture has done more to reshape the natural world than anything else we humans do, both its landscapes and the composition of its flora and fauna. Our eating also constitutes a relationship with dozens of other species—plants, animals, and fungi—with which we have coevolved to the point where our fates are deeply entwined. Many of these species have evolved expressly to gratify our desires, in the intricate dance of domestication that has allowed us and them to prosper together as we could never have prospered apart. But our relationships with the wild species we eat—from the mushrooms we pick in the forest to the yeasts that leaven our bread—are no less compelling, and far more mysterious. Eating puts us in touch with all that we share with the other animals, and all that sets us apart. It defines us.

What is perhaps most troubling, and sad, about industrial eating is how thoroughly it obscures all these relationships and connections. To go from the chicken (Gallus gallus) to the Chicken McNugget is to leave this world in a journey of forgetting that could hardly be more costly, not only in terms of the animal’s pain but in our pleasure, too. But forgetting, or not knowing in the first place, is what the industrial food chain is all about, the principal reason it is so opaque, for if we could see what lies on the far side of the increasingly high walls of our industrial agriculture, we would surely change the way we eat.

“Eating is an agricultural act,” as Wendell Berry famously said. It is also an ecological act, and a political act, too. Though much has been done to obscure this simple fact, how and what we eat determines to a great extent the use we make of the world—and what is to become of it. To eat with a fuller consciousness of all that is at stake might sound like a burden, but in practice few things in life can afford quite as much satisfaction. By comparison, the pleasures of eating industrially, which is to say eating in ignorance, are fleeting. Many people today seem perfectly content eating at the end of an industrial food chain, without a thought in the world; this book is probably not for them. There are things in it that will ruin their appetites. But in the end this is a book about the pleasures of eating, the kinds of pleasure that are only deepened by knowing.

 

 

 

 

I

INDUSTRIAL

CORN

 

 

 

 

ONE

THE PLANT

Corn’s Conquest

 

 

1. A NATURALIST IN THE SUPERMARKET

Air-conditioned, odorless, illuminated by buzzing fluorescent tubes, the American supermarket doesn’t present itself as having very much to do with Nature. And yet what is this place if not a landscape (man-made, it’s true) teeming with plants and animals?

I’m not just talking about the produce section or the meat counter, either—the supermarket’s flora and fauna. Ecologically speaking, these are this landscape’s most legible zones, the places where it doesn’t take a field guide to identify the resident species. Over there’s your eggplant, onion, potato, and leek; here your apple, banana, and orange. Spritzed with morning dew every few minutes, Produce is the only corner of the supermarket where we’re apt to think “Ah, yes, the bounty of Nature!” Which probably explains why such a garden of fruits and vegetables (sometimes flowers, too) is what usually greets the shopper coming through the automatic doors.

Keep rolling, back to the mirrored rear wall behind which the butchers toil, and you encounter a set of species only slightly harder to identify—there’s chicken and turkey, lamb and cow and pig. Though in Meat the creaturely character of the species on display does seem to be fading, as the cows and pigs increasingly come subdivided into boneless and bloodless geometrical cuts. In recent years some of this supermarket euphemism has seeped into Produce, where you’ll now find formerly soil-encrusted potatoes cubed pristine white, and “baby” carrots machine-lathed into neatly tapered torpedoes. But in general here in flora and fauna you don’t need to be a naturalist, much less a food scientist, to know what species you’re tossing into your cart.

Venture farther, though, and you come to regions of the supermarket where the very notion of species seems increasingly obscure: the canyons of breakfast cereals and condiments; the freezer cases stacked with “home meal replacements” and bagged platonic peas; the broad expanses of soft drinks and towering cliffs of snacks; the unclassifiable Pop-Tarts and Lunchables; the frankly synthetic coffee whiteners and the Linnaeus-defying Twinkie. Plants? Animals?! Though it might not always seem that way, even the deathless Twinkie is constructed out of…well, precisely what I don’t know offhand, but ultimately some sort of formerly living creature, i.e., a species. We haven’t yet begun to synthesize our foods from petroleum, at least not directly.

If you do manage to regard the supermarket through the eyes of a naturalist, your first impression is apt to be of its astounding biodiversity. Look how many different plants and animals (and fungi) are represented on this single acre of land! What forest or prairie could hope to match it? There must be a hundred different species in the produce section alone, a handful more in the meat counter. And this diversity appears only to be increasing: When I was a kid, you never saw radicchio in the produce section, or a half dozen different kinds of mushrooms, or kiwis and passion fruit and durians and mangoes. Indeed, in the last few years a whole catalog of exotic species from the tropics has colonized, and considerably enlivened, the produce department. Over in fauna, on a good day you’re apt to find—beyond beef—ostrich and quail and even bison, while in Fish you can catch not just salmon and shrimp but catfish and tilapia, too. Naturalists regard biodiversity as a measure of a landscape’s health, and the modern supermarket’s devotion to variety and choice would seem to reflect, perhaps even promote, precisely that sort of ecological vigor.

Except for the salt and a handful of synthetic food additives, every edible item in the supermarket is a link in a food chain that begins with a particular plant growing in a specific patch of soil (or, more seldom, stretch of sea) somewhere on earth. Sometimes, as in the produce section, that chain is fairly short and easy to follow: As the netted bag says, this potato was grown in Idaho, that onion came from a farm in Texas. Move over to Meat, though, and the chain grows longer and less comprehensible: The label doesn’t mention that that rib-eye steak came from a steer born in South Dakota and fattened in a Kansas feedlot on grain grown in Iowa. Once you get into the processed foods you have to be a fairly determined ecological detective to follow the intricate and increasingly obscure lines of connection linking the Twinkie, or the nondairy creamer, to a plant growing in the earth someplace, but it can be done.

So what exactly would an ecological detective set loose in an American supermarket discover, were he to trace the items in his shopping cart all the way back to the soil? The notion began to occupy me a few years ago, after I realized that the straightforward question “What should I eat?” could no longer be answered without first addressing two other even more straightforward questions: “What am I eating? And where in the world did it come from?” Not very long ago an eater didn’t need a journalist to answer these questions. The fact that today one so often does suggests a pretty good start on a working definition of industrial food: Any food whose provenance is so complex or obscure that it requires expert help to ascertain.

When I started trying to follow the industrial food chain—the one that now feeds most of us most of the time and typically culminates either in a supermarket or fast-food meal—I expected that my investigations would lead me to a wide variety of places. And though my journeys did take me to a great many states, and covered a great many miles, at the very end of these food chains (which is to say, at the very beginning), I invariably found myself in almost exactly the same place: a farm field in the American Corn Belt. The great edifice of variety and choice that is an American supermarket turns out to rest on a remarkably narrow biological foundation comprised of a tiny group of plants that is dominated by a single species: Zea mays, the giant tropical grass most Americans know as corn.

Corn is what feeds the steer that becomes the steak. Corn feeds the chicken and the pig, the turkey and the lamb, the catfish and the tilapia and, increasingly, even the salmon, a carnivore by nature that the fish farmers are reengineering to tolerate corn. The eggs are made of corn. The milk and cheese and yogurt, which once came from dairy cows that grazed on grass, now typically come from Holsteins that spend their working lives indoors tethered to machines, eating corn.

Head over to the processed foods and you find ever more intricate manifestations of corn. A chicken nugget, for example, piles corn upon corn: what chicken it contains consists of corn, of course, but so do most of a nugget’s other constituents, including the modified corn starch that glues the thing together, the corn flour in the batter that coats it, and the corn oil in which it gets fried. Much less obviously, the leavenings and lecithin, the mono-, di-, and triglycerides, the attractive golden coloring, and even the citric acid that keeps the nugget “fresh” can all be derived from corn.

To wash down your chicken nuggets with virtually any soft drink in the supermarket is to have some corn with your corn. Since the 1980s virtually all the sodas and most of the fruit drinks sold in the supermarket have been sweetened with high-fructose corn syrup (HFCS)—after water, corn sweetener is their principal ingredient. Grab a beer for your beverage instead and you’d still be drinking corn, in the form of alcohol fermented from glucose refined from corn. Read the ingredients on the label of any processed food and, provided you know the chemical names it travels under, corn is what you will find. For modified or unmodified starch, for glucose syrup and maltodextrin, for crystalline fructose and ascorbic acid, for lecithin and dextrose, lactic acid and lysine, for maltose and HFCS, for MSG and polyols, for the caramel color and xanthan gum, read: corn. Corn is in the coffee whitener and Cheez Whiz, the frozen yogurt and TV dinner, the canned fruit and ketchup and candies, the soups and snacks and cake mixes, the frosting and gravy and frozen waffles, the syrups and hot sauces, the mayonnaise and mustard, the hot dogs and the bologna, the margarine and shortening, the salad dressings and the relishes and even the vitamins. (Yes, it’s in the Twinkie, too.) There are some forty-five thousand items in the average American supermarket and more than a quarter of them now contain corn. This goes for the nonfood items as well—everything from the toothpaste and cosmetics to the disposable diapers, trash bags, cleansers, charcoal briquettes, matches, and batteries, right down to the shine on the cover of the magazine that catches your eye by the checkout: corn. Even in Produce on a day when there’s ostensibly no corn for sale you’ll nevertheless find plenty of corn: in the vegetable wax that gives the cucumbers their sheen, in the pesticide responsible for the produce’s perfection, even in the coating on the cardboard it was shipped in. Indeed, the supermarket itself—the wallboard and joint compound, the linoleum and fiberglass and adhesives out of which the building itself has been built—is in no small measure a manifestation of corn.

And us?

2. CORN WALKING

Descendants of the Maya living in Mexico still sometimes refer to themselves as “the corn people.” The phrase is not intended as metaphor. Rather, it’s meant to acknowledge their abiding dependence on this miraculous grass, the staple of their diet for almost nine thousand years. Forty percent of the calories a Mexican eats in a day come directly from corn, most of it in the form of tortillas. So when a Mexican says “I am maize” or “corn walking,” it is simply a statement of fact: The very substance of the Mexican’s body is to a considerable extent a manifestation of this plant.

For an American like me, growing up linked to a very different food chain, yet one that is also rooted in a field of corn, not to think of himself as a corn person suggests either a failure of imagination or a triumph of capitalism. Or perhaps a little of both. It does take some imagination to recognize the ear of corn in the Coke bottle or the Big Mac. At the same time, the food industry has done a good job of persuading us that the forty-five thousand different items or SKUs (stock keeping units) in the supermarket—seventeen thousand new ones every year—represent genuine variety rather than so many clever rearrangements of molecules extracted from the same plant.

You are what you eat, it’s often said, and if this is true, then what we mostly are is corn—or, more precisely, processed corn. This proposition is susceptible to scientific proof: The same scientists who glean the composition of ancient diets from mummified human remains can do the same for you or me, using a snip of hair or fingernail. The science works by identifying stable isotopes of carbon in human tissue that bear the signatures, in effect, of the different types of plants that originally took them from the air and introduced them into the food chain. The intricacies of this process are worth following, since they go some distance toward explaining how corn could have conquered our diet and, in turn, more of the earth’s surface than virtually any other domesticated species, our own included.

After water, carbon is the most common element in our bodies—indeed, in all living things on earth. We earthlings are, as they say, a carbon life form. (As one scientist put it, carbon supplies life’s quantity, since it is the main structural element in living matter, while much scarcer nitrogen supplies its quality—but more on that later.) Originally, the atoms of carbon from which we’re made were floating in the air, part of a carbon dioxide molecule. The only way to recruit these carbon atoms for the molecules necessary to support life—the carbohydrates, amino acids, proteins, and lipids—is by means of photosynthesis. Using sunlight as a catalyst the green cells of plants combine carbon atoms taken from the air with water and elements drawn from the soil to form the simple organic compounds that stand at the base of every food chain. It is more than a figure of speech to say that plants create life out of thin air.

But corn goes about this procedure a little differently than most other plants, a difference that not only renders the plant more efficient than most, but happens also to preserve the identity of the carbon atoms it recruits, even after they’ve been transformed into things like Gatorade and Ring Dings and hamburgers, not to mention the human bodies nourished on those things. Where most plants during photosynthesis create compounds that have three carbon atoms, corn (along with a small handful of other species) makes compounds that have four: hence “C-4,” the botanical nickname for this gifted group of plants, which wasn’t identified until the 1970s.

The C-4 trick represents an important economy for a plant, giving it an advantage, especially in areas where water is scarce and temperatures high. In order to gather carbon atoms from the air, a plant has to open its stomata, the microscopic orifices in the leaves through which plants both take in and exhaust gases. Every time a stoma opens to admit carbon dioxide, precious molecules of water escape. It’s as though every time you opened your mouth to eat you lost a quantity of blood. Ideally, you would open your mouth as seldom as possible, ingesting as much food as you could with every bite. This is essentially what a C-4 plant does. By recruiting extra atoms of carbon during each instance of photosynthesis, the corn plant is able to limit its loss of water and “fix”—that is, take from the atmosphere and link in a useful molecule—significantly more carbon than other plants.

At its most basic, the story of life on earth is the competition among species to capture and store as much energy as possible—either directly from the sun, in the case of plants, or, in the case of animals, by eating plants and plant eaters. The energy is stored in the form of carbon molecules and measured in calories. The calories we eat, whether in an ear of corn or a steak, represent packets of energy once captured by a plant. The C-4 trick helps explain the corn plant’s success in this competition: Few plants can manufacture quite as much organic matter (and calories) from the same quantities of sunlight and water and basic elements as corn. (Ninety-seven percent of what a corn plant is comes from the air, three percent from the ground.)

The trick doesn’t yet, however, explain how a scientist could tell that a given carbon atom in a human bone owes its presence there to a photosynthetic event that occurred in the leaf of one kind of plant and not another—in corn, say, instead of lettuce or wheat. The scientist can do this because all carbon is not created equal. Some carbon atoms carry more than the usual complement of six neutrons, and these heavier isotopes have a slightly different atomic weight. C-13, for example, has six protons and seven neutrons. (Hence “C-13.”) For whatever reason, when a C-4 plant goes scavenging for its four-packs of carbon, it takes in more carbon 13 than ordinary—C-3—plants, which exhibit a marked preference for the more common carbon 12. Greedy for carbon, C-4 plants can’t afford to discriminate among isotopes, and so end up with relatively more carbon 13. The higher the ratio of carbon 13 to carbon 12 in a person’s flesh, the more corn has been in his or her diet—or in the diet of the animals he or she ate. (As far as we’re concerned, it makes little difference whether we consume relatively more or less carbon 13.)

One would expect to find a comparatively high proportion of carbon 13 in the flesh of people whose staple food of choice is corn—Mexicans, most famously. Americans eat much more wheat than corn—114 pounds of wheat flour per person per year, compared to 11 pounds of corn flour. The Europeans who colonized America regarded themselves as wheat people, in contrast to the native corn people they encountered; wheat in the West has always been considered the most refined, or civilized, grain. If asked to choose, most of us would probably still consider ourselves wheat people (except perhaps the proud corn-fed Midwesterners, and they don’t know the half of it), though by now the whole idea of identifying with a plant at all strikes us as a little old-fashioned. Beef people sounds more like it, though nowadays chicken people, which sounds not nearly so good, is probably closer to the truth of the matter. But carbon 13 doesn’t lie, and researchers who have compared the isotopes in the flesh or hair of North Americans to those in the same tissues of Mexicans report that it is now we in the North who are the true people of corn. “When you look at the isotope ratios,” Todd Dawson, a Berkeley biologist who’s done this sort of research, told me, “we North Americans look like corn chips with legs.” Compared to us, Mexicans today consume a far more varied carbon diet: the animals they eat still eat grass (until recently, Mexicans regarded feeding corn to livestock as a sacrilege); much of their protein comes from legumes; and they still sweeten their beverages with cane sugar.

So that’s us: processed corn, walking.

3. THE RISE OF ZEA MAYS

How this peculiar grass, native to Central America and unknown to the Old World before 1492, came to colonize so much of our land and bodies is one of the plant world’s greatest success stories. I say the plant world’s success story because it is no longer clear that corn’s triumph is such a boon to the rest of the world, and because we should give credit where credit is due. Corn is the hero of its own story, and though we humans played a crucial supporting role in its rise to world domination, it would be wrong to suggest we have been calling the shots, or acting always in our own best interests. Indeed, there is every reason to believe that corn has succeeded in domesticating us.

To some extent this holds true for all of the plants and animals that take part in the grand coevolutionary bargain with humans we call agriculture. Though we insist on speaking of the “invention” of agriculture as if it were our idea, like double-entry bookkeeping or the light-bulb, in fact it makes just as much sense to regard agriculture as a brilliant (if unconscious) evolutionary strategy on the part of the plants and animals involved to get us to advance their interests. By evolving certain traits we happen to regard as desirable, these species got themselves noticed by the one mammal in a position not only to spread their genes around the world, but to remake vast swaths of that world in the image of the plants’ preferred habitat. No other group of species gained more from its association with humans than the edible grasses, and no grass has reaped more from agriculture than Zea mays, today the world’s most important cereal crop.

Corn’s success might seem fated in retrospect, but it was not something anyone would have predicted on that day in May 1493 when Columbus first described the botanical oddity he had encountered in the New World to Isabella’s court. He told of a towering grass with an ear as thick as a man’s arm, to which grains were “affixed by nature in a wondrous manner and in form and size like garden peas, white when young.” Wondrous, perhaps, yet this was, after all, the staple food of a people that would shortly be vanquished and all but exterminated.

By all rights, maize should have shared the fate of that other native species, the bison, which was despised and targeted for elimination precisely because it was “the Indians’ commissary,” in the words of General Philip Sheridan, commander of the armies of the West. Exterminate the species, Sheridan advised, and “[t]hen your prairies can be covered with speckled cattle and the festive cowboy.” In outline Sheridan’s plan was the plan for the whole continent: The white man brought his own “associate species” with him to the New World—cattle and apples, pigs and wheat, not to mention his accustomed weeds and microbes—and wherever possible helped them to displace the native plants and animals allied with the Indian. More even than the rifle, it was this biotic army that did the most to defeat the Indians.

But corn enjoyed certain botanical advantages that would allow it to thrive even as the Native Americans with whom it had coevolved were being eliminated. Indeed, maize, the one plant without which the American colonists probably would never have survived, let alone prospered, wound up abetting the destruction of the very people who had helped develop it. In the plant world at least, opportunism trumps gratitude. Yet in time, the plant of the vanquished would conquer even the conquerors.

Squanto taught the Pilgrims how to plant maize in the spring of 1621, and the colonists immediately recognized its value: No other plant could produce quite as much food quite as fast on a given patch of New World ground as this Indian corn. (Originally “corn” was a generic English word for any kind of grain, even a grain of salt—hence “corned beef”; it didn’t take long for Zea mays to appropriate the word for itself, at least in America.) The fact that the plant was so well adapted to the climate and soils of North America gave it an edge over European grains, even if it did make a disappointingly earthbound bread. Centuries before the Pilgrims arrived the plant had already spread north from central Mexico, where it is thought to have originated, all the way to New England, where Indians were probably cultivating it by the year 1000. Along the way, the plant—whose prodigious genetic variability allows it to adapt rapidly to new conditions—made itself at home in virtually every microclimate in North America; hot or cold, dry or wet, sandy soil or heavy, short day or long, corn, with the help of its Native American allies, evolved whatever traits it needed to survive and flourish.

Lacking any such local experience, wheat struggled to adapt to the continent’s harsh climate, and yields were often so poor that settlements relying on the Old World staple sometimes perished. Planted, a single corn seed yielded more than 150 fat kernels, often as many as 300, while the return on a seed of wheat, when all went well, was something less than 50:1. (At a time when land was abundant and labor scarce, agricultural yields were calculated on a per-seed-sown basis.)

Corn won over the wheat people because of its versatility, prized especially in new settlements far from civilization. This one plant supplied settlers with a ready-to-eat vegetable and a storable grain, a source of fiber and animal feed, a heating fuel and an intoxicant. Corn could be eaten fresh off the cob (“green”) within months after planting, or dried on the stalk in fall, stored indefinitely, and ground into flour as needed. Mashed and fermented, corn could be brewed into beer or distilled into whiskey; for a time it was the only source of alcohol on the frontier. (Whiskey and pork were both regarded as “concentrated corn,” the latter a concentrate of its protein, the former of its calories; both had the virtue of reducing corn’s bulk and raising its price.) No part of the big grass went to waste: The husks could be woven into rugs and twine; the leaves and stalks made good silage for livestock; the shelled cobs were burned for heat and stacked by the privy as a rough substitute for toilet paper. (Hence the American slang term “corn hole.”)

“Corn was the means that permitted successive waves of pioneers to settle new territories,” writes Arturo Warman, a Mexican historian. “Once the settlers had fully grasped the secrets and potential of corn, they no longer needed the Native Americans.” Squanto had handed the white man precisely the tool he needed to dispossess the Indian. Without the “fruitfulness” of Indian corn, the nineteenth-century English writer William Cobbett declared, the colonists would never have been able to build “a powerful nation.” Maize, he wrote, was “the greatest blessing God ever gave to man.”

Valuable as corn is as a means of subsistence, the kernel’s qualities make it an excellent means of accumulation as well. After the crop has supplied its farmer’s needs, he can go to market with any surplus, dried corn being the perfect commodity: easy to transport and virtually indestructible. Corn’s dual identity, as food and commodity, has allowed many of the peasant communities that have embraced it to make the leap from a subsistence to a market economy. The dual identity also made corn indispensable to the slave trade: Corn was both the currency traders used to pay for slaves in Africa and the food upon which slaves subsisted during their passage to America. Corn is the protocapitalist plant.

4. MARRIED TO MAN

But while both the new and the native Americans were substantially dependent on corn, the plant’s dependence on the Americans had become total. Had maize failed to find favor among the conquerors, it would have risked extinction, because without humans to plant it every spring, corn would have disappeared from the earth in a matter of a few years. The novel cob-and-husk arrangement that makes corn such a convenient grain for us renders the plant utterly dependent for its survival on an animal in possession of the opposable thumb needed to remove the husk, separate the seeds, and plant them.

Plant a whole corncob and watch what happens: If any of the kernels manage to germinate, and then work their way free of the smothering husk, they will invariably crowd themselves to death before their second set of leaves has emerged. More than most domesticated plants (a few of whose offspring will usually find a way to grow unassisted), corn completely threw its lot in with humanity when it evolved its peculiar husked ear. Several human societies have seen fit to worship corn, but perhaps it should be the other way around: For corn, we humans are the contingent beings. So far, this reckless-seeming act of evolutionary faith in us has been richly rewarded.

It is tempting to think of maize as a human artifact, since the plant is so closely linked to us and so strikingly different from any wild species. There are in fact no wild maize plants, and teosinte, the weedy grass from which corn is believed to have descended (the word is Nahuatl for “mother of corn”), has no ear, bears its handful of tiny naked seeds on a terminal rachis like most other grasses, and generally looks nothing whatsoever like maize. The current thinking among botanists is that several thousand years ago teosinte underwent an abrupt series of mutations that turned it into corn; geneticists calculate that changes on as few as four chromosomes could account for the main traits that distinguish teosinte from maize. Taken together, these mutations amounted to (in the words of botanist Hugh Iltis) a “catastrophic sexual transmutation”: the transfer of the plant’s female organs from the top of the grass to a monstrous sheathed ear in the middle of the stalk. The male organs stayed put, remaining in the tassel.

It is, for a grass, a bizarre arrangement with crucial implications: The ear’s central location halfway down the stalk allows it to capture far more nutrients than it would up top, so suddenly producing hundreds of gigantic seeds becomes metabolically feasible. Yet because those seeds are now trapped in a tough husk, the plant has lost its ability to reproduce itself—hence the catastrophe in teosinte’s sex change. A mutation this freakish and maladaptive would have swiftly brought the plant to an evolutionary dead end had one of these freaks not happened to catch the eye of a human somewhere in Central America who, looking for something to eat, peeled open the husk to free the seeds. What would have been an unheralded botanical catastrophe in a world without humans became an incalculable evolutionary boon. If you look hard enough, you can still find teosinte growing in certain Central American highlands; you can find maize, its mutant offspring, anywhere you find people.

5. CORN SEX

Maize is self-fertilized and wind-pollinated, botanical terms that don’t begin to describe the beauty and wonder of corn sex. The tassel at the top of the plant houses the male organs, hundreds of pendant anthers that over the course of a few summer days release a superabundance of powdery yellow pollen: 14 million to 18 million grains per plant, 20,000 for every potential kernel. (“Better safe than sorry” or “more is more” being nature’s general rule for male genes.) A meter or so below await the female organs, hundreds of minuscule flowers arranged in tidy rows along a tiny, sheathed cob that juts upward from the stalk at the crotch of a leaf midway between tassel and earth. That the male anthers resemble flowers and the female cob a phallus is not the only oddity in the sex life of corn.

Each of the four hundred to eight hundred flowers on a cob has the potential to develop into a kernel—but only if a grain of pollen can find its way to its ovary, a task complicated by the distance the pollen has to travel and the intervening husk in which the cob is tightly wrapped. To surmount this last problem, each flower sends out through the tip of the husk a single, sticky strand of silk (technically its “style”) to snag its own grain of pollen. The silks emerge from the husk on the very day the tassel is set to shower its yellow dust.

What happens next is very strange. After a grain of pollen has fallen through the air and alighted on the moistened tip of silk, its nucleus divides in two, creating a pair of twins, each with the same set of genes but a completely different role to perform in the creation of the kernel. The first twin’s job is to tunnel a microscopic tube down through the center of the silk thread. That accomplished, its clone slides down through the tunnel, past the husk, and into the waiting flower, a journey of between six and eight inches that takes several hours to complete. Upon arrival in the flower the second twin fuses with the egg to form the embryo—the germ of the future kernel. Then the first twin follows, entering the now fertilized flower, where it sets about forming the endosperm—the big, starchy part of the kernel. Every kernel of corn is the product of this intricate ménage à trois; the tiny, stunted kernels you often see at the narrow end of a cob are flowers whose silk no pollen grain ever penetrated. Within a day of conception, the now superfluous silk dries up, eventually turning reddish brown; fifty or so days later, the kernels are mature.

The mechanics of corn sex, and in particular the great distance over open space corn pollen must travel to complete its mission, go a long way toward accounting for the success of maize’s alliance with humankind. It’s a simple matter for a human to get between a corn plant’s pollen and its flower, and only a short step from there to deliberately crossing one corn plant with another with an eye to encouraging specific traits in the offspring. Long before scientists understood hybridization, Native Americans had discovered that by taking the pollen from the tassel of one corn plant and dusting it on the silks of another, they could create new plants that combined the traits of both parents. American Indians were the world’s first plant breeders, developing literally thousands of distinct cultivars for every conceivable environment and use.

Looked at another way, corn was the first plant to involve humans so intimately in its sex life. For a species whose survival depends on how well it can gratify the ever shifting desires of its only sponsor, this has proved to be an excellent evolutionary strategy. More even than other domesticated species, many of which can withstand a period of human neglect, it pays for corn to be obliging—and to be so quick about it. The usual way a domesticated species figures out what traits its human ally will reward is through the slow and wasteful process of Darwinian trial and error. Hybridization represents a far swifter and more efficient means of communication, or feedback loop, between plant and human; by allowing humans to arrange its marriages, corn can discover in a single generation precisely what qualities it needs to prosper.

It is by being so obliging that corn has won itself as much human attention and habitat as it has. The plant’s unusual sexual arrangements, so amenable to human intervention, have allowed it to adapt to the very different worlds of Native Americans (and to their very different environments, from southern Mexico to New England), of colonists and settlers and slaves, and of all the other corn-eating societies that have come and gone since the first human chanced upon that first teosinte freak.

But of all the human environments to which corn has successfully adapted since then, the adaptation to our own—the world of industrial consumer capitalism; the world, that is, of the supermarket and fast-food franchise—surely represents the plant’s most extraordinary evolutionary achievement to date. For to prosper in the industrial food chain to the extent it has, corn had to acquire several improbable new tricks. It had to adapt itself not just to humans but to their machines, which it did by learning to grow as upright, stiff-stalked, and uniform as soldiers. It had to multiply its yield by an order of magnitude, which it did by learning to grow shoulder to shoulder with other corn plants, as many as thirty thousand to the acre. It had to develop an appetite for fossil fuel (in the form of petrochemical fertilizer) and a tolerance for various synthetic chemicals. But even before it could master these tricks and make a place for itself in the bright sunshine of capitalism, corn first had to turn itself into something never before seen in the plant world: a form of intellectual property.

The free corn sex I’ve described allowed people to do virtually anything they wanted with the genetics of corn except own them—a big problem for a would-be capitalist plant. If I crossed two corn plants to create a variety with an especially desirable trait, I could sell you my special seeds, but only once, since the corn you grew from my special seeds would produce lots more special seeds, for free and forever, putting me out of business in short order. It’s difficult to control the means of production when the product you’re selling can reproduce itself endlessly. This is one of the ways in which the imperatives of biology are difficult to mesh with the imperatives of business.

Difficult, but not impossible. Early in the twentieth century American corn breeders figured out how to bring corn reproduction under firm control and to protect the seed from copiers. The breeders discovered that when they crossed two corn plants that had come from inbred lines—from ancestors that had themselves been exclusively self-pollinated for several generations—the hybrid offspring displayed some highly unusual characteristics. First, all the seeds in that first generation (F-1, in the plant breeder’s vocabulary) produced genetically identical plants—a trait that, among other things, facilitates mechanization. Second, those plants exhibited heterosis, or hybrid vigor—better yields than either of their parents. But most important of all, they found that the seeds produced by these seeds did not “come true”—the plants in the second (F-2) generation bore little resemblance to the plants in the first. Specifically, their yields plummeted by as much as a third, making their seeds virtually worthless.

Hybrid corn now offered its breeders what no other plant at that time could: the biological equivalent of a patent. Farmers now had to buy new seeds every spring; instead of depending upon their plants to reproduce themselves, they now depended on a corporation. The corporation, assured for the first time of a return on its investment in breeding, showered corn with attention—R&D, promotion, advertising—and the plant responded, multiplying its fruitfulness year after year. With the advent of the F-1 hybrid, a technology with the power to remake nature in the image of capitalism, Zea mays entered the industrial age and, in time, it brought the whole American food chain with it.

 

 

 

 

TWO

THE FARM

 

 

1. ONE FARMER, 129 EATERS

To take the wheel of a clattering 1975 International Harvester tractor, pulling a spidery eight-row planter through an Iowa cornfield during the first week of May, is like trying to steer a boat through a softly rolling sea of dark chocolate. The hard part is keeping the thing on a straight line, that and hearing the shouted instructions of the farmer sitting next to you when you both have wads of Kleenex jammed into your ears to muffle the diesel roar. Driving a boat, you try to follow the compass heading or aim for a landmark on shore; planting corn, you try to follow the groove in the soil laid down on the previous pass by a rolling disk at the end of a steel arm attached to the planter behind us. Deviate from the line and your corn rows will wobble, overlapping or drifting away from one another. Either way, it’ll earn you a measure of neighborly derision and hurt your yield. And yield, measured in bushels per acre, is the measure of all things here in corn country.

The tractor I was driving belonged to George Naylor, who bought it new back in the midseventies, when, as a twenty-seven-year-old, he returned to Greene County, Iowa, to farm his family’s 320 acres. (He subsequently bought another 150 acres.) Naylor is a big man with a moon face and a scraggly gray beard. On the phone his gravelly voice and incontrovertible pronouncements (“That is just the biggest bunch of bullshit! Only the New York Times would be dumb enough to believe the Farm Bureau still speaks for American farmers!”) led me to expect someone considerably more ornery than the shy fellow who climbed down from his tractor cab to greet me in the middle of a field in the middle of a slate-gray day threatening rain. Naylor had on the farmer’s standard-issue baseball cap, a yellow chamois shirt, and overalls—the stripy blue kind favored by railroad workers, about as unintimidating an article of clothing as has ever been donned by a man. My first impression was more shambling Gentle Ben than fiery prairie populist, but I would discover that Naylor can be either fellow, the mere mention of “Cargill” or “Earl Butz” supplying the transformational trigger.

This part of Iowa has some of the richest soil in the world, a layer of cakey alluvial loam nearly two feet thick. The initial deposit was made by the retreat of the Wisconsin glacier ten thousand years ago, and then compounded at the rate of another inch or two every decade by prairie grasses—big bluestem, foxtail, needlegrass, and switchgrass. Tall-grass prairie is what this land was until the middle of the nineteenth century, when the sod was first broken by the settler’s plow. George’s grandfather moved his family to Iowa from Derbyshire, England, in the 1880s, a coal miner hoping to improve his lot in life. The sight of such soil, pushing up and then curling back down behind the blade of his plow like a thick black wake behind a ship, must have stoked his confidence, and justifiably so: It’s gorgeous stuff, black gold as deep as you can dig, as far as you can see. What you can’t see is all the soil that’s no longer here, having been blown or washed away since the sod was broken; the two-foot crust of topsoil here probably started out closer to four.

The story of the Naylor farm since 1919, when George’s grandfather bought it, closely tracks the twentieth-century story of American agriculture, its achievements as well as its disasters. It begins with a farmer supporting a family on a dozen different species of plants and animals. There would have been a fair amount of corn then too, but also fruits and other vegetables, as well as oats, hay, and alfalfa to feed the pigs, cattle, chickens, and horses—horses being the tractors of that time. One of every four Americans lived on a farm when Naylor’s grandfather arrived here in Churdan; his land and labor supplied enough food to feed his family and twelve other Americans besides. Less than a century later, fewer than 2 million Americans still farm—and they grow enough to feed the rest of us. What that means is that his grandson George, raising nothing but corn and soybeans on a fairly typical Iowa farm, is so astoundingly productive that he is, in effect, feeding some 129 Americans. Measured in terms of output per worker, American farmers like Naylor are the most productive humans who have ever lived.

  • WINNER: James Beard Cookbook Award - Writings on Food
  • WINNER: National Book Critics Circle Awards
Michael Pollan is the author of nine books, including This Is Your Mind on Plants, How to Change Your Mind, Cooked, Food Rules, In Defense of Food, The Omnivore’s Dilemma, and The Botany of Desire, all of which were New York Times bestsellers. He is also the author of the audiobook Caffeine: How Coffee and Tea Made the Modern World. A longtime contributor to The New York Times Magazine, Pollan teaches writing at Harvard University and the University of California, Berkeley. In 2010, Time magazine named him one of the one hundred most influential people in the world. 

www.michaelpollan.com
Educator Guide for The Omnivore's Dilemma

Classroom-based guides appropriate for schools and colleges provide pre-reading and classroom activities, discussion questions connected to the curriculum, further reading, and resources.

(Please note: the guide displayed here is the most recently uploaded version; any page citation discrepancies between the guide and the book are most likely due to pagination differences between the book’s different formats.)


Excerpt


What set off the sea change? It appears to have been a perfect media storm of diet books, scientific studies, and one timely magazine article. The new diet books, many of them inspired by the formerly discredited Dr. Robert C. Atkins, brought Americans the welcome news that they could eat more meat and lose weight just so long as they laid off the bread and pasta. These high-protein, low-carb diets found support in a handful of new epidemiological studies suggesting that the nutritional orthodoxy that had held sway in America since the 1970s might be wrong. It was not, as official opinion claimed, fat that made us fat, but the carbohydrates we’d been eating precisely in order to stay slim. So conditions were ripe for a swing of the dietary pendulum when, in the summer of 2002, the New York Times Magazine published a cover story on the new research entitled “What if Fat Doesn’t Make You Fat?” Within months, supermarket shelves were restocked and restaurant menus rewritten to reflect the new nutritional wisdom. The blamelessness of steak restored, two of the most wholesome and uncontroversial foods known to man—bread and pasta—acquired a moral stain that promptly bankrupted dozens of bakeries and noodle firms and ruined an untold number of perfectly good meals.

So violent a change in a culture’s eating habits is surely the sign of a national eating disorder. Certainly it would never have happened in a culture in possession of deeply rooted traditions surrounding food and eating. But then, such a culture would not feel the need for its most august legislative body to ever deliberate the nation’s “dietary goals”—or, for that matter, to wage political battle every few years over the precise design of an official government graphic called the “food pyramid.” A country with a stable culture of food would not shell out millions for the quackery (or common sense) of a new diet book every January. It would not be susceptible to the pendulum swings of food scares or fads, to the apotheosis every few years of one newly discovered nutrient and the demonization of another. It would not be apt to confuse protein bars and food supplements with meals or breakfast cereals with medicines. It probably would not eat a fifth of its meals in cars or feed fully a third of its children at a fast-food outlet every day. And it surely would not be nearly so fat.

Nor would such a culture be shocked to discover that there are other countries, such as Italy and France, that decide their dinner questions on the basis of such quaint and unscientific criteria as pleasure and tradition, eat all manner of “unhealthy” foods, and, lo and behold, wind up actually healthier and happier in their eating than we are. We show our surprise at this by speaking of something called the “French paradox,” for how could a people who eat such demonstrably toxic substances as foie gras and triple crème cheese actually be slimmer and healthier than we are? Yet I wonder if it doesn’t make more sense to speak in terms of an American paradox—that is, a notably unhealthy people obsessed by the idea of eating healthily.

 

TO ONE DEGREE or another, the question of what to have for dinner assails every omnivore, and always has. When you can eat just about anything nature has to offer, deciding what you should eat will inevitably stir anxiety, especially when some of the potential foods on offer are liable to sicken or kill you. This is the omnivore’s dilemma, noted long ago by writers like Rousseau and Brillat-Savarin and first given that name thirty years ago by a University of Pennsylvania research psychologist named Paul Rozin. I’ve borrowed his phrase for the title of this book because the omnivore’s dilemma turns out to be a particularly sharp tool for understanding our present predicaments surrounding food.

In a 1976 paper called “The Selection of Foods by Rats, Humans, and Other Animals” Rozin contrasted the omnivore’s existential situation with that of the specialized eater, for whom the dinner question could not be simpler. The koala doesn’t worry about what to eat: If it looks and smells and tastes like a eucalyptus leaf, it must be dinner. The koala’s culinary preferences are hardwired in its genes. But for omnivores like us (and the rat) a vast amount of brain space and time must be devoted to figuring out which of all the many potential dishes nature lays on are safe to eat. We rely on our prodigious powers of recognition and memory to guide us away from poisons (Isn’t that the mushroom that made me sick last week?) and toward nutritious plants (The red berries are the juicier, sweeter ones). Our taste buds help too, predisposing us toward sweetness, which signals carbohydrate energy in nature, and away from bitterness, which is how many of the toxic alkaloids produced by plants taste. Our inborn sense of disgust keeps us from ingesting things that might infect us, such as rotten meat. Many anthropologists believe that the reason we evolved such big and intricate brains was precisely to help us deal with the omnivore’s dilemma.

Being a generalist is of course a great boon as well as a challenge; it is what allows humans to successfully inhabit virtually every terrestrial environment on the planet. Omnivory offers the pleasures of variety, too. But the surfeit of choice brings with it a lot of stress and leads to a kind of Manichaean view of food, a division of nature into The Good Things to Eat, and The Bad.

The rat must make this all-important distinction more or less on its own, each individual figuring out for itself—and then remembering—which things will nourish and which will poison. The human omnivore has, in addition to his senses and memory, the incalculable advantage of a culture, which stores the experience and accumulated wisdom of countless human tasters before him. I don’t need to experiment with the mushroom now called, rather helpfully, the “death cap,” and it is common knowledge that that first intrepid lobster eater was on to something very good. Our culture codifies the rules of wise eating in an elaborate structure of taboos, rituals, recipes, manners, and culinary traditions that keep us from having to reenact the omnivore’s dilemma at every meal.

One way to think about America’s national eating disorder is as the return, with an almost atavistic vengeance, of the omnivore’s dilemma. The cornucopia of the American supermarket has thrown us back on a bewildering food landscape where we once again have to worry that some of those tasty-looking morsels might kill us. (Perhaps not as quickly as a poisonous mushroom, but just as surely.) Certainly the extraordinary abundance of food in America complicates the whole problem of choice. At the same time, many of the tools with which people historically managed the omnivore’s dilemma have lost their sharpness here—or simply failed. As a relatively new nation drawn from many different immigrant populations, each with its own culture of food, Americans have never had a single, strong, stable culinary tradition to guide us.

The lack of a steadying culture of food leaves us especially vulnerable to the blandishments of the food scientist and the marketer, for whom the omnivore’s dilemma is not so much a dilemma as an opportunity. It is very much in the interest of the food industry to exacerbate our anxieties about what to eat, the better to then assuage them with new products. Our bewilderment in the supermarket is no accident; the return of the omnivore’s dilemma has deep roots in the modern food industry, roots that, I found, reach all the way back to fields of corn growing in places like Iowa.

And so we find ourselves where we do, confronting in the supermarket or at the dinner table the dilemmas of omnivorousness, some of them ancient and others never before imagined. The organic apple or the conventional? And if the organic, the local one or the imported? The wild fish or the farmed? The trans fats or the butter or the “not butter”? Shall I be a carnivore or a vegetarian? And if a vegetarian, a lacto-vegetarian or a vegan? Like the hunter-gatherer picking a novel mushroom off the forest floor and consulting his sense memory to determine its edibility, we pick up the package in the supermarket and, no longer so confident of our senses, scrutinize the label, scratching our heads over the meaning of phrases like “heart healthy,” “no trans fats,” “cage-free,” or “range-fed.” What is “natural grill flavor” or TBHQ or xanthan gum? What is all this stuff, anyway, and where in the world did it come from?

 

MY WAGER in writing The Omnivore’s Dilemma was that the best way to answer the questions we face about what to eat was to go back to the very beginning, to follow the food chains that sustain us, all the way from the earth to the plate—to a small number of actual meals. I wanted to look at the getting and eating of food at its most fundamental, which is to say, as a transaction between species in nature, eaters and eaten. (“The whole of nature,” wrote the English author William Ralph Inge, “is a conjugation of the verb to eat, in the active and passive.”) What I try to do in this book is approach the dinner question as a naturalist might, using the long lenses of ecology and anthropology, as well as the shorter, more intimate lens of personal experience.

My premise is that like every other creature on earth, humans take part in a food chain, and our place in that food chain, or web, determines to a considerable extent what kind of creature we are. The fact of our omnivorousness has done much to shape our nature, both body (we possess the omnicompetent teeth and jaws of the omnivore, equally well suited to tearing meat and grinding seeds) and soul. Our prodigious powers of observation and memory, as well as our curious and experimental stance toward the natural world, owe much to the biological fact of omnivorousness. So do the various adaptations we’ve evolved to defeat the defenses of other creatures so that we might eat them, including our skills at hunting and cooking with fire. Some philosophers have argued that the very open-endedness of human appetite is responsible for both our savagery and civility, since a creature that could conceive of eating anything (including, notably, other humans) stands in particular need of ethical rules, manners, and rituals. We are not only what we eat, but how we eat, too.

Yet we are also different from most of nature’s other eaters—markedly so. For one thing, we’ve acquired the ability to substantially modify the food chains we depend on, by means of such revolutionary technologies as cooking with fire, hunting with tools, farming, and food preservation. Cooking opened up whole new vistas of edibility by rendering various plants and animals more digestible, and overcoming many of the chemical defenses other species deploy against being eaten. Agriculture allowed us to vastly multiply the populations of a few favored food species, and therefore in turn our own. And, most recently, industry has allowed us to reinvent the human food chain, from the synthetic fertility of the soil to the microwaveable can of soup designed to fit into a car’s cup holder. The implications of this last revolution, for our health and the health of the natural world, we are still struggling to grasp.

The Omnivore’s Dilemma is about the three principal food chains that sustain us today: the industrial, the organic, and the hunter-gatherer. Different as they are, all three food chains are systems for doing more or less the same thing: linking us, through what we eat, to the fertility of the earth and the energy of the sun. It might be hard to see how, but even a Twinkie does this—constitutes an engagement with the natural world. As ecology teaches, and this book tries to show, it’s all connected, even the Twinkie.

Ecology also teaches that all life on earth can be viewed as a competition among species for the solar energy captured by green plants and stored in the form of complex carbon molecules. A food chain is a system for passing those calories on to species that lack the plant’s unique ability to synthesize them from sunlight. One of the themes of this book is that the industrial revolution of the food chain, dating to the close of World War II, has actually changed the fundamental rules of this game. Industrial agriculture has supplanted a complete reliance on the sun for our calories with something new under the sun: a food chain that draws much of its energy from fossil fuels instead. (Of course, even that energy originally came from the sun, but unlike sunlight it is finite and irreplaceable.) The result of this innovation has been a vast increase in the amount of food energy available to our species; this has been a boon to humanity (allowing us to multiply our numbers), but not an unalloyed one. We’ve discovered that an abundance of food does not render the omnivore’s dilemma obsolete. To the contrary, abundance seems only to deepen it, giving us all sorts of new problems and things to worry about.

Each of this book’s three parts follows one of the principal human food chains from beginning to end: from a plant, or group of plants, photosynthesizing calories in the sun, all the way to a meal at the dinner end of that food chain. Reversing the chronological order, I start with the industrial food chain, since that is the one that today involves and concerns us the most. It is also by far the biggest and longest. Since monoculture is the hallmark of the industrial food chain, this section focuses on a single plant: Zea mays, the giant tropical grass we call corn, which has become the keystone species of the industrial food chain, and so in turn of the modern diet. This section follows a bushel of commodity corn from the field in Iowa where it grew on its long, strange journey to its ultimate destination in a fast-food meal, eaten in a moving car on a highway in Marin County, California.

The book’s second part follows what I call—to distinguish it from the industrial—the pastoral food chain. This section explores some of the alternatives to industrial food and farming that have sprung up in recent years (variously called “organic,” “local,” “biological,” and “beyond organic”), food chains that might appear to be preindustrial but in surprising ways turn out in fact to be postindustrial. I set out thinking I could follow one such food chain, from a radically innovative farm in Virginia that I worked on one recent summer to an extremely local meal prepared from animals raised on its pastures. But I promptly discovered that no single farm or meal could do justice to the complex, branching story of alternative agriculture right now, and that I needed also to reckon with the food chain I call, oxymoronically, the “industrial organic.” So the book’s pastoral section serves up the natural history of two very different “organic” meals: one whose ingredients came from my local Whole Foods supermarket (gathered there from as far away as Argentina), and the other tracing its origins to a single polyculture of grasses growing at Polyface Farm in Swoope, Virginia.

The last section, titled Personal, follows a kind of neo-Paleolithic food chain from the forests of Northern California to a meal I prepared (almost) exclusively from ingredients I hunted, gathered, and grew myself. Though we twenty-first-century eaters still eat a handful of hunted and gathered food (notably fish and wild mushrooms), my interest in this food chain was less practical than philosophical: I hoped to shed fresh light on the way we eat now by immersing myself in the way we ate then. In order to make this meal I had to learn how to do some unfamiliar things, including hunting game and foraging for wild mushrooms and urban tree fruit. In doing so I was forced to confront some of the most elemental questions—and dilemmas—faced by the human omnivore: What are the moral and psychological implications of killing, preparing, and eating a wild animal? How does one distinguish between the delicious and the deadly when foraging in the woods? How do the alchemies of the kitchen transform the raw stuffs of nature into some of the great delights of human culture?

The end result of this adventure was what I came to think of as the Perfect Meal, not because it turned out so well (though in my humble opinion it did), but because this labor- and thought-intensive dinner, enjoyed in the company of fellow foragers, gave me the opportunity, so rare in modern life, to eat in full consciousness of everything involved in feeding myself: For once, I was able to pay the full karmic price of a meal.

Yet as different as these three journeys (and four meals) turned out to be, a few themes kept cropping up. One is that there exists a fundamental tension between the logic of nature and the logic of human industry, at least as it is presently organized. Our ingenuity in feeding ourselves is prodigious, but at various points our technologies come into conflict with nature’s ways of doing things, as when we seek to maximize efficiency by planting crops or raising animals in vast monocultures. This is something nature never does, always and for good reasons practicing diversity instead. A great many of the health and environmental problems created by our food system owe to our attempts to oversimplify nature’s complexities, at both the growing and the eating ends of our food chain. At either end of any food chain you find a biological system—a patch of soil, a human body—and the health of one is connected—literally—to the health of the other. Many of the problems of health and nutrition we face today trace back to things that happen on the farm, and behind those things stand specific government policies few of us know anything about.

I don’t mean to suggest that human food chains have only recently come into conflict with the logic of biology; early agriculture and, long before that, human hunting proved enormously destructive. Indeed, we might never have needed agriculture had earlier generations of hunters not eliminated the species they depended upon. Folly in the getting of our food is nothing new. And yet the new follies we are perpetrating in our industrial food chain today are of a different order. By replacing solar energy with fossil fuel, by raising millions of food animals in close confinement, by feeding those animals foods they never evolved to eat, and by feeding ourselves foods far more novel than we even realize, we are taking risks with our health and the health of the natural world that are unprecedented.

Another theme, or premise really, is that the way we eat represents our most profound engagement with the natural world. Daily, our eating turns nature into culture, transforming the body of the world into our bodies and minds. Agriculture has done more to reshape the natural world than anything else we humans do, both its landscapes and the composition of its flora and fauna. Our eating also constitutes a relationship with dozens of other species—plants, animals, and fungi—with which we have coevolved to the point where our fates are deeply entwined. Many of these species have evolved expressly to gratify our desires, in the intricate dance of domestication that has allowed us and them to prosper together as we could never have prospered apart. But our relationships with the wild species we eat—from the mushrooms we pick in the forest to the yeasts that leaven our bread—are no less compelling, and far more mysterious. Eating puts us in touch with all that we share with the other animals, and all that sets us apart. It defines us.

What is perhaps most troubling, and sad, about industrial eating is how thoroughly it obscures all these relationships and connections. To go from the chicken (Gallus gallus) to the Chicken McNugget is to leave this world in a journey of forgetting that could hardly be more costly, not only in terms of the animal’s pain but in our pleasure, too. But forgetting, or not knowing in the first place, is what the industrial food chain is all about, the principal reason it is so opaque, for if we could see what lies on the far side of the increasingly high walls of our industrial agriculture, we would surely change the way we eat.

“Eating is an agricultural act,” as Wendell Berry famously said. It is also an ecological act, and a political act, too. Though much has been done to obscure this simple fact, how and what we eat determines to a great extent the use we make of the world—and what is to become of it. To eat with a fuller consciousness of all that is at stake might sound like a burden, but in practice few things in life can afford quite as much satisfaction. By comparison, the pleasures of eating industrially, which is to say eating in ignorance, are fleeting. Many people today seem perfectly content eating at the end of an industrial food chain, without a thought in the world; this book is probably not for them. There are things in it that will ruin their appetites. But in the end this is a book about the pleasures of eating, the kinds of pleasure that are only deepened by knowing.

 

 

 

 

I

INDUSTRIAL

CORN

 

 

 

 

ONE

THE PLANT

Corn’s Conquest

 

 

1. A NATURALIST IN THE SUPERMARKET

Air-conditioned, odorless, illuminated by buzzing fluorescent tubes, the American supermarket doesn’t present itself as having very much to do with Nature. And yet what is this place if not a landscape (man-made, it’s true) teeming with plants and animals?

I’m not just talking about the produce section or the meat counter, either—the supermarket’s flora and fauna. Ecologically speaking, these are this landscape’s most legible zones, the places where it doesn’t take a field guide to identify the resident species. Over there’s your eggplant, onion, potato, and leek; here your apple, banana, and orange. Spritzed with morning dew every few minutes, Produce is the only corner of the supermarket where we’re apt to think “Ah, yes, the bounty of Nature!” Which probably explains why such a garden of fruits and vegetables (sometimes flowers, too) is what usually greets the shopper coming through the automatic doors.

Keep rolling, back to the mirrored rear wall behind which the butchers toil, and you encounter a set of species only slightly harder to identify—there’s chicken and turkey, lamb and cow and pig. Though in Meat the creaturely character of the species on display does seem to be fading, as the cows and pigs increasingly come subdivided into boneless and bloodless geometrical cuts. In recent years some of this supermarket euphemism has seeped into Produce, where you’ll now find formerly soil-encrusted potatoes cubed pristine white, and “baby” carrots machine-lathed into neatly tapered torpedoes. But in general here in flora and fauna you don’t need to be a naturalist, much less a food scientist, to know what species you’re tossing into your cart.

Venture farther, though, and you come to regions of the supermarket where the very notion of species seems increasingly obscure: the canyons of breakfast cereals and condiments; the freezer cases stacked with “home meal replacements” and bagged platonic peas; the broad expanses of soft drinks and towering cliffs of snacks; the unclassifiable Pop-Tarts and Lunchables; the frankly synthetic coffee whiteners and the Linnaeus-defying Twinkie. Plants? Animals?! Though it might not always seem that way, even the deathless Twinkie is constructed out of…well, precisely what I don’t know offhand, but ultimately some sort of formerly living creature, i.e., a species. We haven’t yet begun to synthesize our foods from petroleum, at least not directly.

If you do manage to regard the supermarket through the eyes of a naturalist, your first impression is apt to be of its astounding biodiversity. Look how many different plants and animals (and fungi) are represented on this single acre of land! What forest or prairie could hope to match it? There must be a hundred different species in the produce section alone, a handful more in the meat counter. And this diversity appears only to be increasing: When I was a kid, you never saw radicchio in the produce section, or a half dozen different kinds of mushrooms, or kiwis and passion fruit and durians and mangoes. Indeed, in the last few years a whole catalog of exotic species from the tropics has colonized, and considerably enlivened, the produce department. Over in fauna, on a good day you’re apt to find—beyond beef—ostrich and quail and even bison, while in Fish you can catch not just salmon and shrimp but catfish and tilapia, too. Naturalists regard biodiversity as a measure of a landscape’s health, and the modern supermarket’s devotion to variety and choice would seem to reflect, perhaps even promote, precisely that sort of ecological vigor.

Except for the salt and a handful of synthetic food additives, every edible item in the supermarket is a link in a food chain that begins with a particular plant growing in a specific patch of soil (or, more seldom, stretch of sea) somewhere on earth. Sometimes, as in the produce section, that chain is fairly short and easy to follow: As the netted bag says, this potato was grown in Idaho, that onion came from a farm in Texas. Move over to Meat, though, and the chain grows longer and less comprehensible: The label doesn’t mention that that rib-eye steak came from a steer born in South Dakota and fattened in a Kansas feedlot on grain grown in Iowa. Once you get into the processed foods you have to be a fairly determined ecological detective to follow the intricate and increasingly obscure lines of connection linking the Twinkie, or the nondairy creamer, to a plant growing in the earth someplace, but it can be done.

So what exactly would an ecological detective set loose in an American supermarket discover, were he to trace the items in his shopping cart all the way back to the soil? The notion began to occupy me a few years ago, after I realized that the straightforward question “What should I eat?” could no longer be answered without first addressing two other even more straightforward questions: “What am I eating? And where in the world did it come from?” Not very long ago an eater didn’t need a journalist to answer these questions. The fact that today one so often does suggests a pretty good start on a working definition of industrial food: Any food whose provenance is so complex or obscure that it requires expert help to ascertain.

When I started trying to follow the industrial food chain—the one that now feeds most of us most of the time and typically culminates either in a supermarket or fast-food meal—I expected that my investigations would lead me to a wide variety of places. And though my journeys did take me to a great many states, and covered a great many miles, at the very end of these food chains (which is to say, at the very beginning), I invariably found myself in almost exactly the same place: a farm field in the American Corn Belt. The great edifice of variety and choice that is an American supermarket turns out to rest on a remarkably narrow biological foundation comprised of a tiny group of plants that is dominated by a single species: Zea mays, the giant tropical grass most Americans know as corn.

Corn is what feeds the steer that becomes the steak. Corn feeds the chicken and the pig, the turkey and the lamb, the catfish and the tilapia and, increasingly, even the salmon, a carnivore by nature that the fish farmers are reengineering to tolerate corn. The eggs are made of corn. The milk and cheese and yogurt, which once came from dairy cows that grazed on grass, now typically come from Holsteins that spend their working lives indoors tethered to machines, eating corn.

Head over to the processed foods and you find ever more intricate manifestations of corn. A chicken nugget, for example, piles corn upon corn: what chicken it contains consists of corn, of course, but so do most of a nugget’s other constituents, including the modified corn starch that glues the thing together, the corn flour in the batter that coats it, and the corn oil in which it gets fried. Much less obviously, the leavenings and lecithin, the mono-, di-, and triglycerides, the attractive golden coloring, and even the citric acid that keeps the nugget “fresh” can all be derived from corn.

To wash down your chicken nuggets with virtually any soft drink in the supermarket is to have some corn with your corn. Since the 1980s virtually all the sodas and most of the fruit drinks sold in the supermarket have been sweetened with high-fructose corn syrup (HFCS)—after water, corn sweetener is their principal ingredient. Grab a beer for your beverage instead and you’d still be drinking corn, in the form of alcohol fermented from glucose refined from corn. Read the ingredients on the label of any processed food and, provided you know the chemical names it travels under, corn is what you will find. For modified or unmodified starch, for glucose syrup and maltodextrin, for crystalline fructose and ascorbic acid, for lecithin and dextrose, lactic acid and lysine, for maltose and HFCS, for MSG and polyols, for the caramel color and xanthan gum, read: corn. Corn is in the coffee whitener and Cheez Whiz, the frozen yogurt and TV dinner, the canned fruit and ketchup and candies, the soups and snacks and cake mixes, the frosting and gravy and frozen waffles, the syrups and hot sauces, the mayonnaise and mustard, the hot dogs and the bologna, the margarine and shortening, the salad dressings and the relishes and even the vitamins. (Yes, it’s in the Twinkie, too.) There are some forty-five thousand items in the average American supermarket and more than a quarter of them now contain corn. This goes for the nonfood items as well—everything from the toothpaste and cosmetics to the disposable diapers, trash bags, cleansers, charcoal briquettes, matches, and batteries, right down to the shine on the cover of the magazine that catches your eye by the checkout: corn. Even in Produce on a day when there’s ostensibly no corn for sale you’ll nevertheless find plenty of corn: in the vegetable wax that gives the cucumbers their sheen, in the pesticide responsible for the produce’s perfection, even in the coating on the cardboard it was shipped in. Indeed, the supermarket itself—the wallboard and joint compound, the linoleum and fiberglass and adhesives out of which the building itself has been built—is in no small measure a manifestation of corn.

And us?

2. CORN WALKING

Descendants of the Maya living in Mexico still sometimes refer to themselves as “the corn people.” The phrase is not intended as metaphor. Rather, it’s meant to acknowledge their abiding dependence on this miraculous grass, the staple of their diet for almost nine thousand years. Forty percent of the calories a Mexican eats in a day come directly from corn, most of it in the form of tortillas. So when a Mexican says “I am maize” or “corn walking,” it is simply a statement of fact: The very substance of the Mexican’s body is to a considerable extent a manifestation of this plant.

For an American like me, growing up linked to a very different food chain, yet one that is also rooted in a field of corn, not to think of himself as a corn person suggests either a failure of imagination or a triumph of capitalism. Or perhaps a little of both. It does take some imagination to recognize the ear of corn in the Coke bottle or the Big Mac. At the same time, the food industry has done a good job of persuading us that the forty-five thousand different items or SKUs (stock keeping units) in the supermarket—seventeen thousand new ones every year—represent genuine variety rather than so many clever rearrangements of molecules extracted from the same plant.

You are what you eat, it’s often said, and if this is true, then what we mostly are is corn—or, more precisely, processed corn. This proposition is susceptible to scientific proof: The same scientists who glean the composition of ancient diets from mummified human remains can do the same for you or me, using a snip of hair or fingernail. The science works by identifying stable isotopes of carbon in human tissue that bear the signatures, in effect, of the different types of plants that originally took them from the air and introduced them into the food chain. The intricacies of this process are worth following, since they go some distance toward explaining how corn could have conquered our diet and, in turn, more of the earth’s surface than virtually any other domesticated species, our own included.

After water, carbon is the most common element in our bodies—indeed, in all living things on earth. We earthlings are, as they say, a carbon life form. (As one scientist put it, carbon supplies life’s quantity, since it is the main structural element in living matter, while much scarcer nitrogen supplies its quality—but more on that later.) Originally, the atoms of carbon from which we’re made were floating in the air, part of a carbon dioxide molecule. The only way to recruit these carbon atoms for the molecules necessary to support life—the carbohydrates, amino acids, proteins, and lipids—is by means of photosynthesis. Using sunlight as a catalyst the green cells of plants combine carbon atoms taken from the air with water and elements drawn from the soil to form the simple organic compounds that stand at the base of every food chain. It is more than a figure of speech to say that plants create life out of thin air.

But corn goes about this procedure a little differently than most other plants, a difference that not only renders the plant more efficient than most, but happens also to preserve the identity of the carbon atoms it recruits, even after they’ve been transformed into things like Gatorade and Ring Dings and hamburgers, not to mention the human bodies nourished on those things. Where most plants during photosynthesis create compounds that have three carbon atoms, corn (along with a small handful of other species) makes compounds that have four: hence “C-4,” the botanical nickname for this gifted group of plants, which wasn’t identified until the 1970s.

The C-4 trick represents an important economy for a plant, giving it an advantage, especially in areas where water is scarce and temperatures high. In order to gather carbon atoms from the air, a plant has to open its stomata, the microscopic orifices in the leaves through which plants both take in and exhaust gases. Every time a stoma opens to admit carbon dioxide precious molecules of water escape. It’s as though every time you opened your mouth to eat you lost a quantity of blood. Ideally, you would open your mouth as seldom as possible, ingesting as much food as you could with every bite. This is essentially what a C-4 plant does. By recruiting extra atoms of carbon during each instance of photosynthesis, the corn plant is able to limit its loss of water and “fix”—that is, take from the atmosphere and link in a useful molecule—significantly more carbon than other plants.

At its most basic, the story of life on earth is the competition among species to capture and store as much energy as possible—either directly from the sun, in the case of plants, or, in the case of animals, by eating plants and plant eaters. The energy is stored in the form of carbon molecules and measured in calories. The calories we eat, whether in an ear of corn or a steak, represent packets of energy once captured by a plant. The C-4 trick helps explain the corn plant’s success in this competition: Few plants can manufacture quite as much organic matter (and calories) from the same quantities of sunlight and water and basic elements as corn. (Ninety-seven percent of what a corn plant is comes from the air, three percent from the ground.)

The trick doesn’t yet, however, explain how a scientist could tell that a given carbon atom in a human bone owes its presence there to a photosynthetic event that occurred in the leaf of one kind of plant and not another—in corn, say, instead of lettuce or wheat. The scientist can do this because all carbon is not created equal. Some carbon atoms, the heavier isotopes, carry more than the usual complement of six neutrons alongside the standard six protons, giving them a slightly different atomic weight. C-13, for example, has six protons and seven neutrons. (Hence “C-13.”) For whatever reason, when a C-4 plant goes scavenging for its four-packs of carbon, it takes in more carbon 13 than ordinary—C-3—plants, which exhibit a marked preference for the more common carbon 12. Greedy for carbon, C-4 plants can’t afford to discriminate among isotopes, and so end up with relatively more carbon 13. The higher the ratio of carbon 13 to carbon 12 in a person’s flesh, the more corn has been in his diet—or in the diet of the animals he or she ate. (As far as we’re concerned, it makes little difference whether we consume relatively more or less carbon 13.)

One would expect to find a comparatively high proportion of carbon 13 in the flesh of people whose staple food of choice is corn—Mexicans, most famously. Americans eat much more wheat than corn—114 pounds of wheat flour per person per year, compared to 11 pounds of corn flour. The Europeans who colonized America regarded themselves as wheat people, in contrast to the native corn people they encountered; wheat in the West has always been considered the most refined, or civilized, grain. If asked to choose, most of us would probably still consider ourselves wheat people (except perhaps the proud corn-fed Midwesterners, and they don’t know the half of it), though by now the whole idea of identifying with a plant at all strikes us as a little old-fashioned. Beef people sounds more like it, though nowadays chicken people, which sounds not nearly so good, is probably closer to the truth of the matter. But carbon 13 doesn’t lie, and researchers who have compared the isotopes in the flesh or hair of North Americans to those in the same tissues of Mexicans report that it is now we in the North who are the true people of corn. “When you look at the isotope ratios,” Todd Dawson, a Berkeley biologist who’s done this sort of research, told me, “we North Americans look like corn chips with legs.” Compared to us, Mexicans today consume a far more varied carbon diet: the animals they eat still eat grass (until recently, Mexicans regarded feeding corn to livestock as a sacrilege); much of their protein comes from legumes; and they still sweeten their beverages with cane sugar.

So that’s us: processed corn, walking.

3. THE RISE OF ZEA MAYS

How this peculiar grass, native to Central America and unknown to the Old World before 1492, came to colonize so much of our land and bodies is one of the plant world’s greatest success stories. I say the plant world’s success story because it is no longer clear that corn’s triumph is such a boon to the rest of the world, and because we should give credit where credit is due. Corn is the hero of its own story, and though we humans played a crucial supporting role in its rise to world domination, it would be wrong to suggest we have been calling the shots, or acting always in our own best interests. Indeed, there is every reason to believe that corn has succeeded in domesticating us.

To some extent this holds true for all of the plants and animals that take part in the grand coevolutionary bargain with humans we call agriculture. Though we insist on speaking of the “invention” of agriculture as if it were our idea, like double-entry bookkeeping or the light-bulb, in fact it makes just as much sense to regard agriculture as a brilliant (if unconscious) evolutionary strategy on the part of the plants and animals involved to get us to advance their interests. By evolving certain traits we happen to regard as desirable, these species got themselves noticed by the one mammal in a position not only to spread their genes around the world, but to remake vast swaths of that world in the image of the plants’ preferred habitat. No other group of species gained more from its association with humans than the edible grasses, and no grass has reaped more from agriculture than Zea mays, today the world’s most important cereal crop.

Corn’s success might seem fated in retrospect, but it was not something anyone would have predicted on that day in May 1493 when Columbus first described the botanical oddity he had encountered in the New World to Isabella’s court. He told of a towering grass with an ear as thick as a man’s arm, to which grains were “affixed by nature in a wondrous manner and in form and size like garden peas, white when young.” Wondrous, perhaps, yet this was, after all, the staple food of a people that would shortly be vanquished and all but exterminated.

By all rights, maize should have shared the fate of that other native species, the bison, which was despised and targeted for elimination precisely because it was “the Indians’ commissary,” in the words of General Philip Sheridan, commander of the armies of the West. Exterminate the species, Sheridan advised, and “[t]hen your prairies can be covered with speckled cattle and the festive cowboy.” In outline Sheridan’s plan was the plan for the whole continent: The white man brought his own “associate species” with him to the New World—cattle and apples, pigs and wheat, not to mention his accustomed weeds and microbes—and wherever possible helped them to displace the native plants and animals allied with the Indian. More even than the rifle, it was this biotic army that did the most to defeat the Indians.

But corn enjoyed certain botanical advantages that would allow it to thrive even as the Native Americans with whom it had coevolved were being eliminated. Indeed, maize, the one plant without which the American colonists probably would never have survived, let alone prospered, wound up abetting the destruction of the very people who had helped develop it. In the plant world at least, opportunism trumps gratitude. Yet in time, the plant of the vanquished would conquer even the conquerors.

Squanto taught the Pilgrims how to plant maize in the spring of 1621, and the colonists immediately recognized its value: No other plant could produce quite as much food quite as fast on a given patch of New World ground as this Indian corn. (Originally “corn” was a generic English word for any kind of grain, even a grain of salt—hence “corned beef”; it didn’t take long for Zea mays to appropriate the word for itself, at least in America.) The fact that the plant was so well adapted to the climate and soils of North America gave it an edge over European grains, even if it did make a disappointingly earthbound bread. Centuries before the Pilgrims arrived the plant had already spread north from central Mexico, where it is thought to have originated, all the way to New England, where Indians were probably cultivating it by the year 1000. Along the way, the plant—whose prodigious genetic variability allows it to adapt rapidly to new conditions—made itself at home in virtually every microclimate in North America; hot or cold, dry or wet, sandy soil or heavy, short day or long, corn, with the help of its Native American allies, evolved whatever traits it needed to survive and flourish.

Lacking any such local experience, wheat struggled to adapt to the continent’s harsh climate, and yields were often so poor that the settlements that stood by the Old World staple often perished. Planted, a single corn seed yielded more than 150 fat kernels, often as many as 300, while the return on a seed of wheat, when all went well, was something less than 50:1. (At a time when land was abundant and labor scarce, agricultural yields were calculated on a per-seed-sown basis.)

Corn won over the wheat people because of its versatility, prized especially in new settlements far from civilization. This one plant supplied settlers with a ready-to-eat vegetable and a storable grain, a source of fiber and animal feed, a heating fuel and an intoxicant. Corn could be eaten fresh off the cob (“green”) within months after planting, or dried on the stalk in fall, stored indefinitely, and ground into flour as needed. Mashed and fermented, corn could be brewed into beer or distilled into whiskey; for a time it was the only source of alcohol on the frontier. (Whiskey and pork were both regarded as “concentrated corn,” the latter a concentrate of its protein, the former of its calories; both had the virtue of reducing corn’s bulk and raising its price.) No part of the big grass went to waste: The husks could be woven into rugs and twine; the leaves and stalks made good silage for livestock; the shelled cobs were burned for heat and stacked by the privy as a rough substitute for toilet paper. (Hence the American slang term “corn hole.”)

“Corn was the means that permitted successive waves of pioneers to settle new territories,” writes Arturo Warman, a Mexican historian. “Once the settlers had fully grasped the secrets and potential of corn, they no longer needed the Native Americans.” Squanto had handed the white man precisely the tool he needed to dispossess the Indian. Without the “fruitfulness” of Indian corn, the nineteenth-century English writer William Cobbett declared, the colonists would never have been able to build “a powerful nation.” Maize, he wrote, was “the greatest blessing God ever gave to man.”

Valuable as corn is as a means of subsistence, the kernel’s qualities make it an excellent means of accumulation as well. After the crop has supplied its farmer’s needs, he can go to market with any surplus, dried corn being the perfect commodity: easy to transport and virtually indestructible. Corn’s dual identity, as food and commodity, has allowed many of the peasant communities that have embraced it to make the leap from a subsistence to a market economy. The dual identity also made corn indispensable to the slave trade: Corn was both the currency traders used to pay for slaves in Africa and the food upon which slaves subsisted during their passage to America. Corn is the protocapitalist plant.

4. MARRIED TO MAN

But while both the new and the native Americans were substantially dependent on corn, the plant’s dependence on the Americans had become total. Had maize failed to find favor among the conquerors, it would have risked extinction, because without humans to plant it every spring, corn would have disappeared from the earth in a matter of a few years. The novel cob-and-husk arrangement that makes corn such a convenient grain for us renders the plant utterly dependent for its survival on an animal in possession of the opposable thumb needed to remove the husk, separate the seeds, and plant them.

Plant a whole corncob and watch what happens: If any of the kernels manage to germinate, and then work their way free of the smothering husk, they will invariably crowd themselves to death before their second set of leaves has emerged. More than most domesticated plants (a few of whose offspring will usually find a way to grow unassisted), corn completely threw its lot in with humanity when it evolved its peculiar husked ear. Several human societies have seen fit to worship corn, but perhaps it should be the other way around: For corn, we humans are the contingent beings. So far, this reckless-seeming act of evolutionary faith in us has been richly rewarded.

It is tempting to think of maize as a human artifact, since the plant is so closely linked to us and so strikingly different from any wild species. There are in fact no wild maize plants, and teosinte, the weedy grass from which corn is believed to have descended (the word is Nahuatl for “mother of corn”), has no ear, bears its handful of tiny naked seeds on a terminal rachis like most other grasses, and generally looks nothing whatsoever like maize. The current thinking among botanists is that several thousand years ago teosinte underwent an abrupt series of mutations that turned it into corn; geneticists calculate that changes on as few as four chromosomes could account for the main traits that distinguish teosinte from maize. Taken together, these mutations amounted to (in the words of botanist Hugh Iltis) a “catastrophic sexual transmutation”: the transfer of the plant’s female organs from the top of the grass to a monstrous sheathed ear in the middle of the stalk. The male organs stayed put, remaining in the tassel.

It is, for a grass, a bizarre arrangement with crucial implications: The ear’s central location halfway down the stalk allows it to capture far more nutrients than it would up top, so suddenly producing hundreds of gigantic seeds becomes metabolically feasible. Yet because those seeds are now trapped in a tough husk, the plant has lost its ability to reproduce itself—hence the catastrophe in teosinte’s sex change. A mutation this freakish and maladaptive would have swiftly brought the plant to an evolutionary dead end had one of these freaks not happened to catch the eye of a human somewhere in Central America who, looking for something to eat, peeled open the husk to free the seeds. What would have been an unheralded botanical catastrophe in a world without humans became an incalculable evolutionary boon. If you look hard enough, you can still find teosinte growing in certain Central American highlands; you can find maize, its mutant offspring, anywhere you find people.

5. CORN SEX

Maize is self-fertilized and wind-pollinated, botanical terms that don’t begin to describe the beauty and wonder of corn sex. The tassel at the top of the plant houses the male organs, hundreds of pendant anthers that over the course of a few summer days release a superabundance of powdery yellow pollen: 14 million to 18 million grains per plant, 20,000 for every potential kernel. (“Better safe than sorry” or “more is more” being nature’s general rule for male genes.) A meter or so below await the female organs, hundreds of minuscule flowers arranged in tidy rows along a tiny, sheathed cob that juts upward from the stalk at the crotch of a leaf midway between tassel and earth. That the male anthers resemble flowers and the female cob a phallus is not the only oddity in the sex life of corn.

Each of the four hundred to eight hundred flowers on a cob has the potential to develop into a kernel—but only if a grain of pollen can find its way to its ovary, a task complicated by the distance the pollen has to travel and the intervening husk in which the cob is tightly wrapped. To surmount this last problem, each flower sends out through the tip of the husk a single, sticky strand of silk (technically its “style”) to snag its own grain of pollen. The silks emerge from the husk on the very day the tassel is set to shower its yellow dust.

What happens next is very strange. After a grain of pollen has fallen through the air and alighted on the moistened tip of silk, its nucleus divides in two, creating a pair of twins, each with the same set of genes but a completely different role to perform in the creation of the kernel. The first twin’s job is to tunnel a microscopic tube down through the center of the silk thread. That accomplished, its clone slides down through the tunnel, past the husk, and into the waiting flower, a journey of between six and eight inches that takes several hours to complete. Upon arrival in the flower the second twin fuses with the egg to form the embryo—the germ of the future kernel. Then the first twin follows, entering the now fertilized flower, where it sets about forming the endosperm—the big, starchy part of the kernel. Every kernel of corn is the product of this intricate ménage à trois; the tiny, stunted kernels you often see at the narrow end of a cob are flowers whose silk no pollen grain ever penetrated. Within a day of conception, the now superfluous silk dries up, eventually turning reddish brown; fifty or so days later, the kernels are mature.*

The mechanics of corn sex, and in particular the great distance over open space corn pollen must travel to complete its mission, go a long way toward accounting for the success of maize’s alliance with humankind. It’s a simple matter for a human to get between a corn plant’s pollen and its flower, and only a short step from there to deliberately crossing one corn plant with another with an eye to encouraging specific traits in the offspring. Long before scientists understood hybridization, Native Americans had discovered that by taking the pollen from the tassel of one corn plant and dusting it on the silks of another, they could create new plants that combined the traits of both parents. American Indians were the world’s first plant breeders, developing literally thousands of distinct cultivars for every conceivable environment and use.

Looked at another way, corn was the first plant to involve humans so intimately in its sex life. For a species whose survival depends on how well it can gratify the ever shifting desires of its only sponsor, this has proved to be an excellent evolutionary strategy. More even than other domesticated species, many of which can withstand a period of human neglect, it pays for corn to be obliging—and to be so quick about it. The usual way a domesticated species figures out what traits its human ally will reward is through the slow and wasteful process of Darwinian trial and error. Hybridization represents a far swifter and more efficient means of communication, or feedback loop, between plant and human; by allowing humans to arrange its marriages, corn can discover in a single generation precisely what qualities it needs to prosper.

It is by being so obliging that corn has won itself as much human attention and habitat as it has. The plant’s unusual sexual arrangements, so amenable to human intervention, have allowed it to adapt to the very different worlds of Native Americans (and to their very different worlds, from southern Mexico to New England), of colonists and settlers and slaves, and of all the other corn-eating societies that have come and gone since the first human chanced upon that first teosinte freak.

But of all the human environments to which corn has successfully adapted since then, the adaptation to our own—the world of industrial consumer capitalism; the world, that is, of the supermarket and fast-food franchise—surely represents the plant’s most extraordinary evolutionary achievement to date. For to prosper in the industrial food chain to the extent it has, corn had to acquire several improbable new tricks. It had to adapt itself not just to humans but to their machines, which it did by learning to grow as upright, stiff-stalked, and uniform as soldiers. It had to multiply its yield by an order of magnitude, which it did by learning to grow shoulder to shoulder with other corn plants, as many as thirty thousand to the acre. It had to develop an appetite for fossil fuel (in the form of petrochemical fertilizer) and a tolerance for various synthetic chemicals. But even before it could master these tricks and make a place for itself in the bright sunshine of capitalism, corn first had to turn itself into something never before seen in the plant world: a form of intellectual property.

The free corn sex I’ve described allowed people to do virtually anything they wanted with the genetics of corn except own them—a big problem for a would-be capitalist plant. If I crossed two corn plants to create a variety with an especially desirable trait, I could sell you my special seeds, but only once, since the corn you grew from my special seeds would produce lots more special seeds, for free and forever, putting me out of business in short order. It’s difficult to control the means of production when the product you’re selling can reproduce itself endlessly. This is one of the ways in which the imperatives of biology are difficult to mesh with the imperatives of business.

Difficult, but not impossible. Early in the twentieth century American corn breeders figured out how to bring corn reproduction under firm control and to protect the seed from copiers. The breeders discovered that when they crossed two corn plants that had come from inbred lines—from ancestors that had themselves been exclusively self-pollinated for several generations—the hybrid offspring displayed some highly unusual characteristics. First, all the seeds in that first generation (F-1, in the plant breeder’s vocabulary) produced genetically identical plants—a trait that, among other things, facilitates mechanization. Second, those plants exhibited heterosis, or hybrid vigor—better yields than either of their parents. But most important of all, they found that the seeds produced by these seeds did not “come true”—the plants in the second (F-2) generation bore little resemblance to the plants in the first. Specifically, their yields plummeted by as much as a third, making their seeds virtually worthless.

Hybrid corn now offered its breeders what no other plant at that time could: the biological equivalent of a patent. Farmers now had to buy new seeds every spring; instead of depending upon their plants to reproduce themselves, they now depended on a corporation. The corporation, assured for the first time of a return on its investment in breeding, showered corn with attention—R&D, promotion, advertising—and the plant responded, multiplying its fruitfulness year after year. With the advent of the F-1 hybrid, a technology with the power to remake nature in the image of capitalism, Zea mays entered the industrial age and, in time, it brought the whole American food chain with it.

TWO

THE FARM

1. ONE FARMER, 129 EATERS

To take the wheel of a clattering 1975 International Harvester tractor, pulling a spidery eight-row planter through an Iowa cornfield during the first week of May, is like trying to steer a boat through a softly rolling sea of dark chocolate. The hard part is keeping the thing on a straight line, that and hearing the shouted instructions of the farmer sitting next to you when you both have wads of Kleenex jammed into your ears to muffle the diesel roar. Driving a boat, you try to follow the compass heading or aim for a landmark on shore; planting corn, you try to follow the groove in the soil laid down on the previous pass by a rolling disk at the end of a steel arm attached to the planter behind us. Deviate from the line and your corn rows will wobble, overlapping or drifting away from one another. Either way, it’ll earn you a measure of neighborly derision and hurt your yield. And yield, measured in bushels per acre, is the measure of all things here in corn country.

The tractor I was driving belonged to George Naylor, who bought it new back in the midseventies, when, as a twenty-seven-year-old, he returned to Greene County, Iowa, to farm his family’s 320 acres. (He subsequently bought another 150 acres.) Naylor is a big man with a moon face and a scraggly gray beard. On the phone his gravelly voice and incontrovertible pronouncements (“That is just the biggest bunch of bullshit! Only the New York Times would be dumb enough to believe the Farm Bureau still speaks for American farmers!”) led me to expect someone considerably more ornery than the shy fellow who climbed down from his tractor cab to greet me in the middle of a field in the middle of a slate-gray day threatening rain. Naylor had on the farmer’s standard-issue baseball cap, a yellow chamois shirt, and overalls—the stripy blue kind favored by railroad workers, about as unintimidating an article of clothing as has ever been donned by a man. My first impression was more shambling Gentle Ben than fiery prairie populist, but I would discover that Naylor can be either fellow, the mere mention of “Cargill” or “Earl Butz” supplying the transformational trigger.

This part of Iowa has some of the richest soil in the world, a layer of cakey alluvial loam nearly two feet thick. The initial deposit was made by the retreat of the Wisconsin glacier ten thousand years ago, and then compounded at the rate of another inch or two every decade by prairie grasses—big bluestem, foxtail, needlegrass, and switchgrass. Tall-grass prairie is what this land was until the middle of the nineteenth century, when the sod was first broken by the settler’s plow. George’s grandfather moved his family to Iowa from Derbyshire, England, in the 1880s, a coal miner hoping to improve his lot in life. The sight of such soil, pushing up and then curling back down behind the blade of his plow like a thick black wake behind a ship, must have stoked his confidence, and justifiably so: It’s gorgeous stuff, black gold as deep as you can dig, as far as you can see. What you can’t see is all the soil that’s no longer here, having been blown or washed away since the sod was broken; the two-foot crust of topsoil here probably started out closer to four.

The story of the Naylor farm since 1919, when George’s grandfather bought it, closely tracks the twentieth-century story of American agriculture, its achievements as well as its disasters. It begins with a farmer supporting a family on a dozen different species of plants and animals. There would have been a fair amount of corn then too, but also fruits and other vegetables, as well as oats, hay, and alfalfa to feed the pigs, cattle, chickens, and horses—horses being the tractors of that time. One of every four Americans lived on a farm when Naylor’s grandfather arrived here in Churdan; his land and labor supplied enough food to feed his family and twelve other Americans besides. Less than a century after, fewer than 2 million Americans still farm—and they grow enough to feed the rest of us. What that means is that Naylor’s grandson, raising nothing but corn and soybeans on a fairly typical Iowa farm, is so astoundingly productive that he is, in effect, feeding some 129 Americans. Measured in terms of output per worker, American farmers like Naylor are the most productive humans who have ever lived.

Awards

  • WINNER: James Beard Cookbook Award - Writings on Food
  • WINNER: National Book Critics Circle Awards

Author

Author photo © Tabitha Soren
Michael Pollan is the author of nine books, including This Is Your Mind on Plants, How to Change Your Mind, Cooked, Food Rules, In Defense of Food, The Omnivore’s Dilemma, and The Botany of Desire, all of which were New York Times bestsellers. He is also the author of the audiobook Caffeine: How Coffee and Tea Made the Modern World. A longtime contributor to The New York Times Magazine, Pollan teaches writing at Harvard University and the University of California, Berkeley. In 2010, Time magazine named him one of the one hundred most influential people in the world. 

www.michaelpollan.com

Guides

Educator Guide for The Omnivore's Dilemma

Classroom-based guides appropriate for schools and colleges provide pre-reading and classroom activities, discussion questions connected to the curriculum, further reading, and resources.

(Please note: the guide displayed here is the most recently uploaded version; any page citation discrepancies between the guide and the book are likely due to pagination differences between a book’s different formats.)