At one particularly strange moment in my career, I found myself picking through giant conical piles of dung produced by emus—those goofy Australian kin to the ostrich. I was trying to figure out how often seeds pass all the way through the emu digestive system intact enough to germinate. My colleagues and I planted thousands of collected seeds and waited. Eventually, little jungles grew.
Clearly, the plants that emus eat have evolved seeds that can survive digestion relatively unscathed. Whereas the birds want to get as many calories from fruits as possible—including from the seeds—the plants are invested in protecting their progeny. Although it did not occur to me at the time, I later realized that humans, too, engage in a kind of tug-of-war with the food we eat, a battle in which we are measuring the spoils—calories—all wrong.
Food is energy for the body. Digestive enzymes in the mouth, stomach and intestines break up complex food molecules into simpler structures, such as sugars and amino acids, which travel through the bloodstream to all our tissues. Our cells use the energy stored in the chemical bonds of these simpler molecules to carry on business as usual. We calculate the available energy in all foods with a unit known as the food calorie, or kilocalorie—the amount of energy required to heat one kilogram of water by one degree Celsius. Fats provide approximately nine calories per gram, whereas carbohydrates and proteins deliver just four. Fiber offers a piddling two calories per gram because enzymes in the human digestive tract have great difficulty chopping it up into smaller molecules.
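The arithmetic behind every nutrition label follows directly from these per-gram values. As a minimal sketch, assuming the per-gram figures given above (the sample snack and its macronutrient breakdown are hypothetical):

```python
# Per-gram calorie values from the text (the Atwater-style figures).
KCAL_PER_GRAM = {
    "fat": 9.0,
    "protein": 4.0,
    "carbohydrate": 4.0,  # digestible carbohydrate, excluding fiber
    "fiber": 2.0,
}

def label_calories(grams_by_nutrient):
    """Sum calories the way a food label does: grams x kcal/g for each nutrient."""
    return sum(
        grams * KCAL_PER_GRAM[nutrient]
        for nutrient, grams in grams_by_nutrient.items()
    )

# Hypothetical snack: 10 g fat, 5 g protein, 20 g carbohydrate, 3 g fiber.
snack = {"fat": 10, "protein": 5, "carbohydrate": 20, "fiber": 3}
print(label_calories(snack))  # 10*9 + 5*4 + 20*4 + 3*2 = 196 kcal
```

The point of the sketch is what it leaves out: the same fixed multipliers are applied to every food, which is exactly the assumption the research described below calls into question.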
Every calorie count on every food label you have ever seen is based on these estimates or on modest derivations thereof. Yet these approximations assume that the 19th-century laboratory experiments on which they are based accurately reflect how much energy different people with different bodies derive from many different kinds of food. New research has revealed that this assumption is, at best, far too simplistic. To accurately calculate the total calories that someone gets out of a given food, you would have to take into account a dizzying array of factors, including whether that food has evolved to survive digestion; how boiling, baking, microwaving or flambéing a food changes its structure and chemistry; how much energy the body expends to break down different kinds of food; and the extent to which the billions of bacteria in the gut aid human digestion and, conversely, steal some calories for themselves.
Nutrition scientists are beginning to learn enough to improve calorie labels, at least in principle, but digestion turns out to be such a fantastically complex and messy affair that we will probably never derive a formula for an infallible calorie count.
A Hard Nut to Crack
The flaws in modern calorie counts originated in the 19th century, when American chemist Wilbur Olin Atwater developed a system, still used today, for calculating the average number of calories in one gram of fat, protein and carbohydrate. Atwater was doing his best, but no food is average. Every food is digested in its own way.
Consider how vegetables vary in their digestibility. We eat the stems, leaves and roots of hundreds of different plants. The walls of plant cells in the stems and leaves of some species are much tougher than those in other species. Even within a single plant, the durability of cell walls can differ: older leaves tend to have sturdier cell walls than young ones. Generally speaking, the weaker or more degraded the cell walls in the plant material we eat, the more calories we get from it. Cooking easily ruptures cells in, say, spinach and zucchini, but the cells of cassava (Manihot esculenta) and Chinese water chestnut (Eleocharis dulcis) are much more resistant. When cell walls hold strong, foods hoard their precious calories and pass through our bodies intact (think corn).