Blog

What will the future of food look like?

Time.com recently asked a series of "experts" to opine about the future of food and predict how our plates will change.  The predictions are rather predictable, as is the choice of experts.  

The selection of experts included only one scientist - nutritionist Marion Nestle - and her future looks to me a lot like our past, as many of us: 

will enjoy home gardens and locally and sustainably produced food, at greater cost.

It is implicitly assumed that home gardens and "local" are the same as "sustainable".

Indeed, many of the answers fell prey to a kind of romantic traditionalism.

The list of experts mainly included chefs, journalists, and food activists.  Aside from Nestle, not one active food scientist was interviewed.  There was one restaurant consultant and one investor in "companies dedicated to solving food problems," but not one person currently engaged in farming for a living, no food microbiologists, no geneticists, no agronomists, no animal scientists, no food engineers, and no one working for today's largest food and agricultural companies.  In short, few of the kinds of people most likely to have a substantive impact on the way we eat and farm in the future were interviewed.  

It's like our thinking about the future of food has become stuck in some sort of retrogressive mindset.  

Incentives for Safer Food

Over at the US Food Policy blog, Parke Wilde writes about the terrible track record Foster Farms had with noncompliance leading up to its widely publicized Salmonella outbreak.

Parke advocates for better public access to food safety information (such as, I presume, the public release of noncompliance reports written by food safety inspectors) as one approach to partially deal with food safety issues.  

He also points out the main challenge with food safety: as consumers we often cannot directly observe whether a food is contaminated before purchase.  Parke writes:

Food safety problems are fundamentally about lack of public information. If consumers had magic sunglasses that displayed the presence of Salmonella on chicken in the grocery store, there would be no need for government regulation. Immediately, faced with market consequences for distributing chicken with Salmonella, the companies would clean up their product.

Well, they may not be magic sunglasses, but it appears entrepreneurs are working on handheld sensors, chopsticks, and iPhone apps that may one day let us quickly check for food contaminants.  

These innovations may, one day, prove to be a very powerful incentive for companies to provide safe food.  The nice thing - from the consumers' perspective - is that they let us take action before an illness happens.

Food Fads and Fears

I've been reading the book Fear of Food by Harvey Levenstein.  It is a fascinating read, chronicling the history of food fears and fads that hit Americans in the 19th and 20th centuries.  I have a few quibbles with some of the material in the chapter on "Bacteria and Beef", but overall, good stuff.

One passage shows how at least one version of the Paleo diet was being advanced in the early 1900s for many of the same reasons it is advocated today, almost 100 years later:

In 1920 Fleischmann’s urged eating its yeast cakes because ‘the process of manufacture or preparation’ removed from many foods the ‘life giving vitamine’ that provided the energy people needed. ‘Primitive man,’ it claimed, ‘secured an abundance of vitamines from his raw, uncooked foods and green, leafy vegetables. But the modern diet - constantly refined and modified - is too often badly deficient in vital elements.’

Levenstein also chronicles the emergence of food scientists and nutritionists who often had significant effects on dietary fads and public policies.  It is remarkable the hubris with which many of these men dispensed dietary advice and shaped public policy, particularly because we now know they were often quite wrong in their scientific knowledge.  Whether it was Metchnikoff and Kellogg and their views on autointoxication and the merits of yogurt, or Horace Fletcher's method of chewing to "Fletcherize" food, or Harvey Wiley and his war on benzoate of soda, or Elmer McCollum and his promotion of acidosis, or Russell Wilder's belief that thiamine deficiencies would cause the nation to lose its will to fight the Nazis - there seems to be a continual stream of people willing to use scant evidence to promote their favored cause in the name of public health.  Not just idly promote - but often with righteous indignation and certitude of belief.  I have no doubt many of these men passionately believed in the diets they promoted, but that didn't ultimately make them right.  

Levenstein writes, amid 1941 concerns about inadequate vitamin consumption, that

The New York Times said, ‘The discovery that tables may groan with food and that we nevertheless face a kind of starvation has driven home the fact that we have applied science and technology none too wisely in the preparation of food.’

Unfortunately, something similar could be said about how applied science and technology have often been used none too wisely to promote various public policies and best-selling books.   

It is true that science has progressed and we know more than we used to.  One of the things we've hopefully learned is that we often need to exercise a bit of humility.

A food producing machine

Imagine a biologist on an excursion in the Amazon looking for new plant species.  He comes across a new grass he's never seen, and brings it back home to his lab in the U.S.  He finds that the grass grows exceedingly well in greenhouses with the right fertilizer and soil, and he immediately moves to field trials.  He also notices that the grass produces a seed that is durable, storable, and extraordinarily calorie dense.  The scientist immediately recognizes the potential for the newly discovered plant to solve global hunger problems and to meet the dietary demands of a growing world population.

But, there is a problem.  Lab analysis reveals that the seeds are toxic to humans.  Despite the setback, the scientist doesn't give up.  He toils away year after year until he creates a machine that can convert the seeds into a food that is not only safe for humans to eat but incredibly delicious.  There are a few downsides.  For every five calories that go into the machine, only one comes out.  Plus, the machine uses water, runs on electricity, burns fossil fuels, and creates CO2 emissions.  

Should the scientist be condemned for his work?  Or, hailed as an ingenious hero for finding a plant that can inexpensively produce calories, and then creating a machine that can turn those calories into something people really want to eat?  

Maybe another way to think about it is to ask whether the scientist's new product can pass the market test: can his new food - despite its inefficiencies (which will make the price higher than it otherwise would be) - compete against other foods in the marketplace?  Recall that the new food must be priced in a way that covers the cost of all the resources it uses - from the fertilizer to grow the new seeds to the gasoline required to run the new machine.
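To make the thought experiment concrete, here's a little back-of-the-envelope sketch of that market test.  The five-to-one conversion ratio comes from the story above, but every cost figure is a made-up placeholder, so treat this as an illustration of the accounting rather than a real estimate:

```python
# A minimal sketch of the "market test" above.
# The 5:1 conversion ratio is from the thought experiment; all cost
# figures are made-up placeholders, not real data.

CONVERSION_RATIO = 5.0  # calories of seed in per calorie of food out

# Hypothetical input costs, in dollars per 1,000 seed calories fed to the machine
cost_per_1000_seed_calories = {
    "seed (fertilizer, land, labor)": 0.05,
    "water": 0.01,
    "electricity": 0.02,
    "fossil fuel": 0.02,
}

total_input_cost = sum(cost_per_1000_seed_calories.values())

# 1,000 seed calories yield only 1,000 / 5 = 200 food calories
food_calories_out = 1000 / CONVERSION_RATIO

# Break-even price the new food must fetch, per 1,000 food calories
break_even_price = total_input_cost * (1000 / food_calories_out)

print(f"Break-even price: ${break_even_price:.2f} per 1,000 food calories")
# The market test: the food "passes" only if consumers willingly pay at
# least this much, i.e., if its taste premium covers the conversion loss.
```

In other words, the conversion loss gets baked into the price, and consumers decide whether the tastier form is worth it.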

Now, let's call the new grass "corn" and the new machine "cow".  The analogy isn't perfect (e.g., the cow is a living, feeling being and not a lifeless machine), but the thought experiment is useful nonetheless.

It's particularly useful in thinking about the argument that corn is "wasted" in the process of feeding animals.  It's an argument that appears - in one form - in a recent paper in Science.  West et al. write:

Although crops used for animal feed ultimately produce human food in the form of meat and dairy products, they do so with a substantial loss of caloric efficiency. If current crop production used for animal feed and other nonfood uses (including biofuels) were targeted for direct consumption, ~70% more calories would become available, potentially providing enough calories to meet the basic needs of an additional 4 billion people (28). The human-edible crop calories that do not end up in the food system are referred to as the “diet gap.”

I'm not sure the logic of this sort of argument adds up.  

Unlike the seed in my hypothetical example, corn is not toxic to humans (although some of the grasses cows eat really are inedible to humans).  Nevertheless, few people really want to eat the calories that come directly from corn or other common animal feeds like soybeans.  

So, why do we grow so much corn and soy?  Because they are incredibly efficient producers of calories and protein.  Stated differently, these crops (or "grasses" if you will) allow us to produce an inexpensive, bountiful supply of calories in a form that is storable and easily transported.  

The assumption in the quote from the Science article seems to be either 1) that the "diet gap" will be solved by convincing people to eat the calories in corn and soy directly, or 2) that there are other tasty, edible crops that can be widely grown instead of corn and soy and that produce calories just as efficiently.  Aside from maybe rice or wheat (which also require some processing to become edible), the second assumption is almost certainly false.  I'm also skeptical about the first assumption - that large swaths of people will voluntarily consume substantial calories directly from corn or soy.

What we typically do is take our relatively un-tasty corn and soy and plug them into our machine (the cow or pig or chicken) to get a form of food we want to eat.  Yes, it seems inefficient on the surface of it, but the key is to realize that the original calories from corn and soy were not in a form most humans find desirable.  As far as the human palate is concerned, not all calories are created equal; we care a great deal about the form in which the calories are delivered to us.

The grass-machine analogy also helps make clear that it is probably a mistake to compare the calorie and CO2 footprint of corn directly with that of the cow.  I suspect only a very tiny fraction of the world's caloric consumption comes from directly consuming raw corn or soy seeds.  It takes energy to convert these seeds into an edible form - either through food processing or through animal feeding.  So, what we want to compare is beef with other processed foods.  Otherwise we're comparing apples and oranges (or in this case, corn and beef).
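Here's a tiny sketch of that apples-and-oranges point.  The emission numbers below are entirely hypothetical placeholders - the point is the accounting, namely that the feed crop's processing (or feeding) step has to be counted before it's compared with beef:

```python
# Illustrative only: all emission numbers are hypothetical placeholders,
# not real footprint data.

co2_raw_corn = 1.0    # kg CO2 per 1,000 calories of raw corn (assumed)
co2_processing = 1.5  # extra kg CO2 to turn that corn into an edible processed food (assumed)
co2_beef = 5.0        # kg CO2 per 1,000 calories of beef (assumed)

# Misleading comparison: raw feed crop vs. finished animal product
print(f"Beef vs. raw corn: {co2_beef / co2_raw_corn:.1f}x")

# More apples-to-apples: both sides measured at the edible, wanted-food stage
co2_corn_based_food = co2_raw_corn + co2_processing
print(f"Beef vs. corn-based processed food: {co2_beef / co2_corn_based_food:.1f}x")
```

With these made-up numbers the gap shrinks once processing is counted; whatever the true figures are, that is the comparison that matters.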

Why are cows more productive?

I've written frequently about the incredible productivity gains witnessed in agriculture, and I often mention such statistics in talks I give.  

But, what does the average person think when they hear this sort of evidence?

I decided to find out.  In the latest edition of the Food Demand Survey (FooDS), I provided respondents with some statistics on increased productivity in dairy production, based on data compiled by Bailey Norwood (I should note that Bailey mentioned he is revisiting these figures and thinks current water usage is likely higher than 2 gallons).  

Here's what was asked:

In 1945, it took about 10 gallons of water and 50 lbs of feed to produce a gallon of milk. Today, it only takes about 2 gallons of water and 10 lbs of feed to produce a gallon of milk. Each dairy cow today produces about 200% more milk compared to one in 1960. How do you think this change happened?

People were then prompted to provide an open-ended response.   They could type anything they wanted.

As you might suspect, answers were all over the board (a complete, unedited list of the more than 1,000 responses is here) from "Witchcraft" to "corruption" to "I have no idea".

A keyword search was conducted among the open-ended responses.  Some of the main keywords mentioned were: hormones (69), growth hormones (41), feed (28), technology (25), and selective breeding (20). 
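For those curious what such a keyword tally looks like in practice, here is a minimal sketch in Python.  The file name and keyword list are hypothetical stand-ins, not the actual FooDS data or code:

```python
# Count how many open-ended responses mention each keyword.
# Assumes responses are saved one per line in a plain text file
# (the file name and keyword list here are hypothetical).
from collections import Counter

keywords = ["hormone", "growth hormone", "feed", "technology",
            "selective breeding", "steroid", "drug"]

counts = Counter()
with open("foods_open_ended_responses.txt", encoding="utf-8") as f:
    for line in f:
        response = line.lower()
        for kw in keywords:
            # simple substring matching, so "hormone" also matches "growth hormones"
            if kw in response:
                counts[kw] += 1

for kw, n in counts.most_common():
    print(f"{kw}: {n}")
```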

Looking through the responses, it seems some variation on "hormones" ("hormones" or "growth hormones" or "steroids" or "drugs") was particularly common.  This is interesting because hormones aren't much used in milk production.  Some dairy cows are given the hormone rBST to boost production, but its adoption increased in the 1990s, peaked sometime in the early 2000s, and has fallen off since then.  According to one USDA source, only about 22% of cows in the US received rBST in 2002.  This paper reports that only 9.5% of dairy producers used rBST in 2010.  Thus, there appears to be something of a disconnect between how people think productivity gains occur and the reality on the ground.

Individual responses were placed into seven different categories (plus an "other" catch-all) related to:

  • hormones (134 responses)
  • feed choices (78 responses)
  • science (61 responses)
  • breeding and genetics (61 responses)
  • drugs and steroids (30 responses) 
  • farming techniques (27 responses)
  • economics (6 responses)
  • others

You can read more about the responses in the latest edition of FooDS.