The Complex Origins of Food Safety Rules: Yes, You Are Overcooking Your Food

U.S. agencies recommend temperatures and times far beyond those supported by science
Modernist Cuisine, volume 1



The Cooking Lab

Editor's note: The following is an edited excerpt from a chapter in Modernist Cuisine: The Art and Science of Cooking (The Cooking Lab, 2011), a six-volume set consisting of 2,348 pages of text and photography.

Scientific research on foodborne pathogens provides the foundation for all food safety rules. Generally speaking, two kinds of research inform us about issues of food safety. The first is laboratory experimentation: for example, testing how much heat will kill a pathogen or render it harmless. Data from these experiments tell us the fundamental facts about pathogens of interest. The second kind of research is investigation of specific outbreaks of foodborne illness.

You might think that scientific evidence would constitute the “last word” when food safety rules are made, but in fact it’s only the beginning. Policy makers take many other factors into consideration, including tradition, cultural trends, political expediency, and pressure from industry. To some extent, it’s reasonable to apply these modifiers because public health, not scientific purity, is the ultimate goal of food safety regulations. But this approach sometimes imposes arbitrary and scientifically indefensible restrictions that limit food choices, confuse the public, and prevent cooks from preparing the highest-quality meals.

To complicate matters, some guesswork and compromise are inevitable in setting safety standards. Take, for example, the way in which health officials decide how much the pathogen count should be reduced when heating food. In the preceding chapter, we reviewed the terminology used to describe these reductions. Killing 90 percent of the pathogens within a specific food, for example, is called a 1D reduction (where D stands for “decimal,” or factor of 10). Killing 99 percent of the pathogens is referred to as a 2D reduction, killing 99.99 percent is termed a 4D reduction, and so forth.

Cooks achieve these reductions by maintaining food at a given temperature for a corresponding length of time. The practical impact of an elevated D level is a longer cooking time at a particular temperature. If a 1D reduction requires 18 minutes at 54.4 degrees C / 130 degrees F, then a 5D reduction would take five times as long, or 90 minutes, and a 6.5D reduction would take 6.5 times as long, or 117 minutes. Clearly, the D levels targeted for food can have a profound effect on the manner and quality of cooking.
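The arithmetic is easy to reproduce. Here is a minimal sketch in Python, using the 18-minute 1D figure at 54.4 degrees C / 130 degrees F quoted above; the per-D time is specific to a pathogen and a temperature, so treat the constant as an illustration rather than a rule:

    # Cooking time scales linearly with the target D level: each additional D
    # (a further 90 percent kill of whatever survives) costs one more "D-value" of time.
    MINUTES_PER_D_AT_54_4C = 18  # 1D time at 54.4 C / 130 F, taken from the text

    def holding_time(d_level, minutes_per_d=MINUTES_PER_D_AT_54_4C):
        """Total holding time, in minutes, to reach d_level decimal reductions."""
        return d_level * minutes_per_d

    for d in (1, 5, 6.5):
        print(f"{d}D reduction: {holding_time(d):g} minutes")  # 18, 90, 117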

What D level should regulators choose to ensure food safety? If the food contains no pathogens to begin with, then it’s not necessary to kill pathogens to any D level! Highly contaminated food, on the other hand, might need processing to a very high D level. Right away, you can see that decisions about pathogen-reduction levels are inherently arbitrary because they require guessing the initial level of contamination. That guess can be supported by the results of scientific studies measuring the number of foodborne pathogens present under the various conditions that cooks encounter. But it’s still a guess.

Many people don’t realize that authorities rely on guesswork to develop these standards. Chefs, cookbook authors, and public health officials often make dogmatic statements that food cooked to a standard is “safe,” while food cooked to less than that standard is “unsafe.” That can never be literally true. No matter what the standard is, if the food is highly contaminated, it might still be unsafe (especially owing to cross-contamination). And on the other hand, if the food is not contaminated, then eating it raw won’t hurt you.

All food safety standards deal in probabilities. Reaching a higher standard (i.e., cooking food longer or at a higher temperature) will make the food less likely to be unsafe, and targeting a lower standard will make it a bit more likely. But there are no guarantees and no absolutes.

To compensate for this inherent uncertainty, food safety officials often base their policies on the so-called worst-case scenario. They reason that if you assume the absolute worst contamination levels and act to address that threat, then the public will always be safe. Setting relatively high D levels to account for a worst-case scenario establishes such a formidable barrier for pathogens that even highly contaminated food will be rendered safe. High D levels also offer a measure of insurance against an imperfect thermometer, an unevenly heated oven, an inaccurate timer, or an impatient chef. Even if real-world conditions cause the cooking to fall somewhat short of the target, the slightly lower reduction actually achieved should still be more than enough.

Not surprisingly, some food safety experts challenge this conservative approach. The required pathogen reductions or “drops” explicitly cited in U.S. federal regulations, for example, range from a 4D drop for some extended-shelf-life refrigerated foods, such as cooked, uncured meat and poultry products, to a 12D drop for canned food, which must last for years on the shelf. General FDA cooking recommendations for fresh food are set to reach a reduction level of 6.5D, which corresponds to killing 99.99997 percent of the pathogens present. Many nongovernmental food safety experts believe this level is too conservative and instead consider a 5D to 6D pathogen reduction for fresh foods sufficient for real-world scenarios.
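The percentages follow directly from the D level: an nD reduction leaves a surviving fraction of one in 10^n. A quick check of the figures above, as a sketch in Python:

    # Fraction of pathogens killed for a given D level is 1 - 10**(-D).
    for d in (4, 5, 6, 6.5):
        killed_percent = (1 - 10 ** (-d)) * 100
        print(f"{d}D kills {killed_percent:.5f}% of pathogens")
    # 6.5D -> 99.99997%, matching the FDA figure cited above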

An expert advisory panel charged with reviewing the scientific basis of food safety regulations in the United States made just this point about standards developed by the U.S. Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS). In a 2003 report, the panel, assembled by the U.S. Institute of Medicine and National Research Council, questioned the FSIS Salmonella reduction standards for ready-to-eat poultry and beef products. In devising its standards, the FSIS had established a worst-case Salmonella population for the precooked meat of each animal species, then calculated the probability that the pathogen would survive in 100 grams / 3.5 ounces of the final ready-to-eat product.

In the case of poultry, for example, the FSIS calculated a worst-case scenario of 37,500 Salmonella bacteria per gram of raw meat. For the 143 grams / 5 ounces of starting product necessary to yield 100 grams / 3.5 ounces of the final, ready-to-eat product, that works out to nearly 5.4 million Salmonella bacteria before cooking. To protect consumers adequately, the FSIS recommended a 7D drop in bacterial levels, equivalent to a reduction from 10 million pathogens to one.
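The arithmetic behind the FSIS figures can be checked directly, as in this short Python sketch built from the worst-case numbers quoted above:

    # Reproduce the FSIS worst-case calculation for ready-to-eat poultry.
    worst_case_per_gram = 37_500   # assumed worst-case Salmonella count per gram of raw meat
    raw_grams = 143                # raw meat needed to yield 100 g of ready-to-eat product

    initial_load = worst_case_per_gram * raw_grams
    print(f"Initial load: {initial_load:,} bacteria")   # 5,362,500, i.e., nearly 5.4 million

    survivors = initial_load / 10 ** 7                  # a 7D drop divides the count by 10 million
    print(f"Expected survivors after a 7D drop: {survivors:.2f}")  # about 0.5 per 100 g serving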

The review committee, however, found fault with several FSIS estimates that, it said, resulted in an “excessively conservative performance standard.” Even “using the highly improbable FSIS worst-case figure,” the committee concluded that the ready-to-eat regulation should instead require only a 4.5D reduction.

The irony is that, although experts debate these matters, their rigorous analyses can be undermined by confounding factors such as cross-contamination. Imagine, for example, that a highly contaminated bunch of spinach really does require a 6.5D reduction in pathogens to be safe. Even if that spinach is properly cooked, it could have contaminated other food or utensils in the kitchen while it was still raw, rendering moot even an extreme 12D reduction during the cooking process. A chain is only as strong as the weakest link, and in food safety, cross-contamination is often the weakest link. One powerful criticism of food safety standards is that they protect against unlikely worst-case scenarios yet do not address the more likely event of cross-contamination.

Another conservative tactic used by health officials is to artificially raise the low end of a recommended temperature range. Most food pathogens can be killed at temperatures above 50 degrees C / 120 degrees F, yet food safety rules tend to require temperatures much higher than that. Experts may worry that relying on the low end of the range may be dangerous for the same reasons that moderate D levels cannot be trusted: vacillating oven temperatures, varying chef temperaments, and so on. Still, their solution belies the facts.

For Our Own Good?

The public health goal of maintaining food safety and minimizing harm poses an interesting dilemma: when does the end justify the means? More specifically, is it justifiable to promote unscientific food safety standards in the name of public safety? Regulators seem to act as if it is.

During a recent outbreak of Escherichia coli linked to contaminated fresh spinach in the United States, public health authorities initially told consumers, retailers, and restaurants to throw out all spinach, often directly stating in public announcements that it could not be made safe by cooking it. This assertion is scientifically incorrect: E. coli is very easy to kill with heat.

Evidently the officials decided that oversimplifying the public message was better than telling the truth. They may have feared that if people cooked contaminated spinach to make it safe to eat, but either didn’t cook it sufficiently or cross-contaminated other food or kitchen surfaces in the process, more fatalities would result. The authorities must have decided that the benefits of avoiding multiple accidental deaths far outweighed the costs of simply tossing out all spinach. In this case they probably were right to make that decision. The cost of some spinach is small compared to the misery and expense of hospitalization.

Oversimplifying for the sake of public safety is a very reasonable thing to do in the midst of an outbreak or other health crisis. It may well have saved lives to lie to the public and announce things that, strictly speaking, are false (for example, that you can’t kill E. coli with heat).

However, outside of a crisis situation, there is a pervasive danger that this philosophy leads to “dumbing down,” oversimplifying, or fabricating food safety information. It is very easy for public health officials to adopt the paternalistic attitude that they can make scientifically incorrect statements with impunity, even in situations in which the balance of risks is nothing like that which occurs during a crisis. Who pushes back against nonsensical rules? The reality is that the only groups that push back are those that have political clout.

Because of this approach, culinary professionals and casual cooks alike have been grossly misled about a wide range of food safety issues and are often subjected to distorted, incomplete, or contradictory rules. When a political interest group exists, it is that group’s opinion, rather than science, that shapes the rules. But when there is no political force to push back, the rules can be overstated and excessive.

Consider the overstated risk of exposure to Trichinella, which has led to ridiculously excessive recommendations for cooking pork. This overkill is just one of many such examples. Cooking standards for chicken, fish, and eggs, as well as rules about raw milk cheeses, all provide examples of inconsistent, excessive, or illogical standards. To a public health official, mandating that pork chops or chicken breasts be dry and overcooked makes sense if it keeps even one person from getting sick. In this calculus, one less case of foodborne illness is worth millions of ruined chops or breasts.

That attitude becomes harder to defend, however, if you accept that overcooking food comes at a cost. A chef’s livelihood may depend on producing the best taste and texture for customers. Home cooks who love food want it to taste the very best that it can. To a person who cares about the quality of food—or who makes a living based on it—excessive food safety standards don’t come cheap.

A balance must be struck between the risk of foodborne illness and the desire for palatable food. In cases such as those of pork and chicken, misleading the public about a rarely occurring scenario (while ignoring other, larger risks) arguably offers little protection and comes at the cost of millions of unnecessarily awful meals.

Culture Clash

The excessive restrictions on cooking pork didn’t come out of nowhere. In decades past, pork was intrinsically less safe than other meats because of muscle infiltration by Trichinella and surface contamination from fecal-borne pathogens like Salmonella and Clostridium perfringens. As a result, people learned to tolerate overcooked pork, and farms raised pigs with increasing amounts of fat, far more fat than is typical in the wild ancestors of pigs such as wild boar. The extra fat helped to keep the meat moist when it was overcooked.

Since then, research has sharpened our understanding of pork-associated pathogens, and producers have vastly reduced the risk of contamination through preventive practices on the farm and in meat-processing facilities. Eventually the FDA relaxed the cooking requirements for pork; they are now no different than those for other meats. The irony is that few people noticed, culinary professionals and cookbook authors included. Government information aimed at consumers from both the USDA and the FDA continued to promote excessive cooking standards for pork. Amazingly, even pork industry groups continued to do the same thing.

After decades of consuming overcooked pork by necessity, the American public has little appetite for rare pork; it isn’t considered traditional. With a lack of cultural pressure or agitation for change by industry groups, the new standards are largely ignored, and many new publications leave the old cooking recommendations intact.

Clearly, cultural and political factors impinge on decisions about food safety. If you doubt that, note the contrast between the standards applied to pork and those applied to beef. Many people love rare steak or raw beef served as carpaccio or steak tartare, and in the United States alone, millions of people safely eat beef products, whether raw, rare, or well-done. Beef is part of the national culture, and any attempt to outlaw rare or raw steak in the United States would face an immense cultural and political backlash from both the consumers and the producers of beef.

Millions of servings of rare beef steak or completely raw steak tartare or carpaccio are served every day, so if that meat were inherently dangerous, we’d certainly know by now. Scientific investigation has confirmed the practice is reasonably safe—almost invariably, muscle interiors are sterile and pathogen-free. That’s true for any meat, actually, but only beef is singled out by the FDA. The cultural significance of eating raw and rare beef, as much as the science, accounts for the FDA’s leniency in allowing beef steak to be served at any internal temperature.

Cultural and political factors also explain why cheese made from raw milk is considered safe in France yet viewed with great skepticism in the United States. Traditional cheese-making techniques, used correctly and with proper quality controls, eliminate pathogens without the need for milk pasteurization. Millions of people safely consume raw milk cheese in France, and any call to ban such a fundamental part of French culture would meet with enormous resistance there.

The United States, however, lacks a broadly recognized culture of making or eating raw milk cheeses. Not coincidentally, health officials have imposed inconsistent regulations on such cheeses. Raw milk cheese aged less than 60 days cannot be imported into the United States and cannot legally cross U.S. state lines. Yet in 24 of the 50 states, it is perfectly legal to make, sell, and consume raw milk cheeses within the state. In most of Canada raw milk cheese is banned, but in the province of Quebec it is legal.

How can these discrepancies among and even within countries persist? It comes down to politics. In areas without a substantial local population demanding unpasteurized milk cheeses (a few gourmets, foodies, and chefs don’t count for much politically), no backlash has ensued. So the seemingly conservative rule holds, banning anything that seems remotely suspicious.

Where artisanal cheese producers have more public support, the laws allow raw milk cheese. In the United States, however, raw milk cheese remains the product of small-scale artisans: as of this writing, no large, politically connected producers make these cheeses, so no movement has emerged to make the laws on raw milk cheese more consistent and reasonable.

Bureaucracy affects food safety rules in more subtle ways as well. Changing a regulation is always harder than keeping it intact, particularly if the change means sanctioning a new and strange food or liberalizing an old standard. No one will praise public health officials and organizations for moist pork chops, but plenty will heap blame should someone fall ill after regulators relax a safety standard.

Misconceptions About Chicken

The misconceptions surrounding chicken are in some ways similar to those that plague pork but are arguably even more confusing because of conflicting standards and widespread blurring between fact and fiction. First, the facts: chickens can indeed host asymptomatic Salmonella infections, and it is not uncommon for chicken feces to contain high levels of the pathogenic bacteria. Moreover, chickens are typically sold whole, which means that they may carry remnants of any fecal contamination of the skin or interior abdominal cavity that occurred during slaughter and processing. That’s why chicken and chicken-derived products are considered such common sources of foodborne Salmonella.

As with Trichinella and pork, however, the link between contaminant and food has been exaggerated. Many people believe, for example, that chicken is the predominant source of Salmonella . That’s not necessarily the case. In a 2009 analysis by the CDC, Salmonella was instead most closely associated with fruits and nuts, due in part to an outbreak linked to peanut butter in 2006. Indeed, the tally of outbreak-linked foodborne illnesses attributable to produce was nearly double the tally of such illnesses associated with poultry, and the foodborne pathogen most commonly linked with poultry was not Salmonella but the bacterium Clostridium perfringens.

For ready-to-eat food products, including rotisserie and fast-food chicken, the FSIS calls for a 7D reduction in Salmonella levels. In 2001, the FSIS developed a corresponding set of time-and-temperature tables for chicken and turkey products according to their fat content. The tables, based on the research of microbiologist Vijay K. Juneja, Ph.D., and colleagues at the USDA Agricultural Research Service, include fat contents as high as 12 percent and recommended temperatures as low as 58 degrees C / 136 degrees F. As we’ve previously discussed, that set of standards has been challenged as overly conservative by an advisory panel, which instead suggested a 4.5D reduction, allowing a 36 percent decrease in cooking times from the FSIS 7D standard.

In 2007 Juneja’s team published the results of a study directly examining Salmonella growth in ground chicken breast and thigh meat. The data show that cooking chicken meat at temperatures as low as 55 degrees C / 131 degrees F for much shorter times produces a 6.5D reduction. The researchers’ curve is quite similar to the FDA’s 6.5D reduction curve for whole-meat roasts, except for a sizeable divergence in time at the 60 degrees C / 140 degrees F temperature point.

So who’s right? Technically, destruction of Salmonella can take place at temperatures as low as 48 degrees C / 120 degrees F given enough time. There is no scientific reason to prefer any one point on the reduction curve, but the experts who formulated the FSIS ready-to-eat standards arbitrarily decided to go no lower than 58 degrees C / 136 degrees F. Likewise, officials preparing the FDA Food Code and other reports chose 74 degrees C / 165 degrees F as an arbitrary cut-off. The choice seems to have been based not on science but on politics, tradition, and subjective judgment.
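To make the idea of a continuous reduction curve concrete, here is a minimal Python sketch of the standard log-linear thermal-death-time model, in which the time needed for one decimal reduction shrinks tenfold for every fixed temperature increment (the z-value). The reference D-value and z-value below are illustrative placeholders, not figures from the FSIS, the FDA, or the studies cited above:

    # Illustrative log-linear model: the 1D time shrinks tenfold for every
    # Z_VALUE_C degrees C of added temperature. All constants are hypothetical.
    REF_TEMP_C = 60.0    # hypothetical reference temperature
    D_AT_REF_MIN = 5.0   # hypothetical 1D time at the reference temperature, in minutes
    Z_VALUE_C = 6.0      # hypothetical z-value

    def holding_time_minutes(temp_c, target_d):
        """Minutes at temp_c needed to reach target_d decimal reductions."""
        d_value = D_AT_REF_MIN * 10 ** ((REF_TEMP_C - temp_c) / Z_VALUE_C)
        return target_d * d_value

    for temp in (55, 58, 60, 65, 74):
        print(f"{temp} C: {holding_time_minutes(temp, 6.5):.1f} min for 6.5D")

Every temperature on such a curve, paired with the corresponding time, achieves the same reduction; the published tables are fit to measured survival data rather than to a simple formula like this one, but the underlying point stands: any cut-off temperature is a choice, not a law of nature.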

Health officials have admitted as much. In a January 2007 report published in the Journal of Food Protection, a panel called the National Advisory Committee on Microbiological Criteria for Foods conceded that, on the basis of preconceived notions of consumer taste, the FSIS recommended higher cooking temperatures to consumers than to makers of processed chicken products: “The temperatures recommended to consumers by the FSIS exceed those provided to food processors, because poultry pieces cooked to 160 °F are generally unpalatable to the consumer because of the pink appearance and rubbery texture.”

Elsewhere in the same report, the authors suggested that a final temperature of 77 degrees C / 170 degrees F for whole-muscle breast meat and 82 degrees C / 180 degrees F for whole-muscle thigh meat “may be needed for consumer acceptability and palatability.”

These are amazing admissions! In effect, the authors are saying that FSIS consumer regulations, which are ostensibly based on safety considerations, are in reality based on bureaucrats’ beliefs about consumer preference. That is hardly their charter! Shouldn’t chefs and consumers be the ones to decide what they would prefer to eat? Perhaps the most galling aspect of this stance is that the advisors are just wrong about the culinary facts. Chicken cooked at 58 degrees C / 136 degrees F and held there for the recommended time is neither rubbery nor pink. In our opinion, its texture and flavor are far superior to those of chicken cooked at the extremely high temperatures the experts recommend. Regulators’ misguided and patronizing attempts to cater to consumer preference have served only to perpetuate the tradition of overcooking chicken.

 