
100% Raw food Vs. Cooked food…. The truth

June 12, 2011

SirNatural

 

Controversy within raw-food circles. In light of the first two parts, it should be clear that some benefits (although not perfect health) can be expected from increasing the amount of raw food compared to the average Westerner’s diet. However, one of the main controversies in raw-food circles is whether eating 100% raw constitutes an improvement compared to a “predominantly raw” diet. While a general answer would be difficult to give because of the number of other parameters (what, specifically, would you cook? what would you eat raw?), we claim that in general, going from predominantly raw to 100% doesn’t constitute a clear improvement, and even that including some cooked food is often beneficial.

 

  • Why did humans begin cooking? The first section, mostly speculative, discusses the reasons why humans started cooking their food. 
  • What kind of cooking? Since the word “cooked” is very imprecise and can refer to steamed kale as well as potato chips or chocolate cake, we try to clarify what kind of mixed (raw/cooked) diet might be most prudent, and examine whether some forms of cooking should be avoided. 
  • How does cooking affect the overall diet? Then, we analyze in more depth how the balance of nutrients is affected by cooking with regard to minerals, etc. 
  • Cooking practices among hunter-gatherers and reputedly healthy traditional peoples. Observing that there are no raw-food cultures in the world, we give an overview of the cooking practices of the peoples who reportedly enjoy far better health than in industrialized countries. 
  • Assessing what Instinctive Eating and Natural Hygiene say about cooking. Obviously, a number of different raw-food trends exist. Instinctive Nutrition holds that the diet must be 100% raw for the food-selection instinct and satiation mechanism to work efficiently; and some Natural Hygienists hold that, since cooking is unnatural, we should eat 100% raw, according to the strictest interpretation of the principles of Natural Hygiene. We discuss these arguments in turn, and we claim that they have some serious weaknesses. 
  • Weight of anecdotal evidence in the raw-food community. Finally, we review some anecdotal evidence: we suggest that many of the improvements experienced by people starting a raw diet have numerous possible explanations unrelated to whether their food is heated or not. We also examine what happens when a long-time raw-foodist returns to a partially cooked diet. 


Why does Homo sapiens eat cooked food?



This section is primarily speculation, but the exercise is well worth it. One might wonder: if cooking is so bad, why did humans start cooking their food in the first place? Ask people in the street and the answers would be “to kill germs,” “to kill parasites,” “to tenderize and improve digestibility,” “to improve taste,” or, more naively, that cooking is simply part of what makes humans civilized and superior to animals.

A Natural Hygienist would object that germs are not dangerous in the context of a robust, healthy, and naturally fed body; that cooked food is “dead” and has lost most of its nutritive value; and that cooking is unnatural and unnecessary.

An Instincto would say that, once humans started cooking, say, a sweet potato, the quantity they ate was much higher than usual, because the instinctive “stop” mechanism works well only with the original, raw foods we were exposed to during evolution and is subverted by cooking. Their metabolisms thus became overloaded, raw sweet potatoes tasted less appealing the following morning, and the only way to get enough satisfaction from food was to keep cooking. In a sense, humans became prisoners of a vicious cycle of their own making.

What likely happened in reality? While more specific details about the earliest beginnings of fire use will probably always be speculative, there are still a few things that can be said. Goudsblom [1992, pp. 12-23] provides a grounded view of the plausible stages of increasing human familiarity with, and eventual control over, fire, recapitulated here in the following bullet points:

 

  • First natural wildfires. Evidence for naturally occurring forest fires goes back as far as evidence for forest vegetation itself–about 350 million years, with such fires (as with wildfires today) likely being ignited by lightning and/or volcanic eruptions. While fire in nature is often viewed by modern-day humans as mainly destructive, and this is certainly true of its initial effects, the longer-term effects of wildfires in a primitive natural setting can be positive. Some plants and trees, for example, are “pyrophytes” that depend on fire to propagate and spread successfully, and many of them provide food and/or shelter for animal species. 
  • Typical responses to wildfires by predators and other animals as a guide to the early human relationship with fire. Based on present-day observations of how animals react and respond to wildfires, which can serve as a behavioral baseline, one can infer that early humans would have exhibited responses at least as sophisticated. Typically, predators move in soon after a fire to forage for food among the charred or partially burnt remains. Ruminants later visit to lick at the ashes (for salt), and in general, mammals visiting the site appear to enjoy its warmth at night. Goudsblom refers to these types of behavior as “passive” use of fire. It may also have been at this stage that humans first began to appreciate not just the different and perhaps appealing taste of fired food, but more importantly its effect of preserving meat for later consumption when it would otherwise soon spoil–a survival advantage. 
  • The transition from “passive” to “active” use of fire. However, perhaps the most interesting and, of necessity, speculative question is how the transition occurred from the initial stage of such passive, opportunistic use to more “active” control and deliberate use of fire. Here, it can be supposed that at first, as with other animals, humans were wholly dependent on intermittent, fortuitous encounters with wildfires in gaining familiarity and experience. At some point, however, while poking around for food like other predators in the aftermath of a still-smoldering fire, but utilizing a stick for a tool in typical human fashion, they would have noticed that branches used in this way might reignite the fire. From there it is not hard to imagine the dawning recognition that fire could be kept burning and transported from place to place for safekeeping, prior to the stage of being able to create fire directly. (Side note: this basic scenario is obviously what provided the underlying plotline of the 1981 movie Quest for Fire.)

     

Where later controlled use of fire can be documented by science, the most reliable studies seem to indicate that humans did not begin using fire consistently until about 400,000-500,000 years ago (see the discussion in Fire and Cooking in Human Evolution), perhaps to drive predators away and keep warm, though they may not yet have used it to cook food on a regular basis. It’s commonly accepted in the paleoanthropological community that in the less equable environments of the temperate zones (and particularly in the harsher climates of higher latitudes or altitudes), fire would probably have been essential for warmth and for thawing meat that had frozen. From there it likely would not be much of a leap to the next step of beginning to cook food.

Speculative scene around an ancient campfire. In any event, it’s not difficult to imagine that, once fire use became widespread, one evening, as a family gathered for dinner around the fire–with some individuals tossing in various objects (like branches), or the occasional piece of passed-over food, to see what might happen–one of them got the idea of retrieving a piece of meat thrown into the fire earlier. Humans are very curious by nature, so it is hard to imagine that cooking occurred completely by accident; experiments like this must have been made many times.

Coming back to our prehistoric individual warming themselves by the fire: the eland steak, now cooked rare, began to give off a delightful aroma of roasted meat, much stronger than what they were used to with raw meat. Perhaps they were a bit fearful the food had been altered in a way that made it dangerous, but on the other hand, their senses told them that this piece of meat was decidedly attractive. With some hesitation, they tried a little of it, and waited anxiously for the next morning.

The family then went to sleep, and at dawn no ill effects were apparent. After a number of such hesitant experiments, by themselves and by others in the group, it was concluded that cooking improves taste and is not harmful, and they decided to continue from then on. Obviously, they gave no thought to germs: Pasteur hadn’t been born yet.

Was cooking an evolutionary “error”? This story may or may not be valid, but an important question remains: was cooking a “mistake”? Aside from any biochemical questions, this is the primary question that those who align themselves with “philosophical naturalism” (the position that what is natural is inherently good and to be emulated) must answer. Children, too, will put their fingers in the fire, but because they burn themselves, they learn from the mistake and don’t do it again. Cooking doesn’t produce ill effects, at least in the short term, so humans might not yet have become aware of their “mistake,” and it is tempting to conclude that, now that we have grown up as a species, we should realize it is time to stop playing with fire.

Enhanced survival is the test of adaptation. On the face of it this argument makes a certain amount of sense, but it overlooks the fact that cooking renders many otherwise inedible foods edible, and thus conferred a significant adaptive advantage in an uncertain, wild environment by extending the range of the diet. We will see below that part of the success of the !Kung San in their marginal environment can be attributed to the use of fire and other types of rudimentary food processing, without which life would be much more difficult, perhaps impossible.

Idealist views of nature contradicted by examples of foods obtainable by actual hunter-gatherers. It has been claimed by numerous raw-foodists that humans originally started cooking because, as they migrated out of the tropical African climate in which the species began, fruits became unavailable in winter, and the only way to eat a sufficient amount of food or meat was to cook it. Again, this argument might make some sense if one could point to at least a few tribes in tropical countries who subsist predominantly on raw fruit, but this is simply not the case. In actuality, there are no such tribes in the tropics, for the simple reason that fruit is not as abundant or as easily available to human foragers as raw/fruitarian advocates seem to suppose. See Hawkes et al. [1982] for a good discussion of how “optimal foraging theory” applies in the case of the tropical rainforest Ache hunter-gatherers of Paraguay, how it does a good job of predicting and explaining the composition of their diet, and why fruit constitutes only a modest part of it.

Hunter-gatherers, even in tropical environments (like the Ache and some of the Aborigine tribes), eat a fair amount of meat, which is usually cooked, and also cook tubers. Only traditional Inuit/Eskimos seem to make a point of regularly including some raw animal foods in their diet (though it constitutes only a part of their diet).

Cooking in light of optimal foraging theory. The most reasonable explanation for why hunter-gatherers–the best and most primitive examples we have of anyone living as close to nature as possible with only the most rudimentary kind of “technology”–cook some of their food can be formulated in terms of “optimal foraging theory.” This branch of study (which has been developed to explain and predict the feeding behaviors of animals in general) states that organisms tend to optimize energy expended in acquiring food vs. the energy and nutrients available in the food.

In other words, if a food is easily collected and edible raw, it will generally be eaten that way; but some foods that are inedible in their raw state can provide a high energy or nutrient yield once cooked, so that gathering, processing, and cooking them represents a significant saving of time and effort compared to eating only raw foods. This is true in particular for certain tubers and root vegetables, some of which are partially edible raw but become much more bioavailable after cooking, and are concentrated sources of energy. It’s also worth noting that smoking, drying, or cooking meats allows them to be preserved and more fully utilized without waste or spoilage, which is important in a wild environment where a regular supply of meat is not assured. On the other hand, foods that are difficult to collect and require too much time to prepare and cook would be unlikely to be eaten, period.
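The trade-off described above can be made concrete with a small back-of-the-envelope calculation. The sketch below is purely illustrative: all numbers are hypothetical, chosen only to show how, under optimal foraging theory, cooking can change a food's ranking by raising digestibility at the cost of extra processing time.

```python
# Illustrative sketch of optimal foraging theory's core calculation:
# a forager favors foods with the highest net energy return per unit
# of time spent acquiring and processing them. All figures below are
# hypothetical, for demonstration only.

def return_rate(energy_kcal, digestibility, acquire_h, process_h):
    """Net energy returned per hour of total foraging effort."""
    return (energy_kcal * digestibility) / (acquire_h + process_h)

# A tuber that is easy to gather but poorly digested raw:
raw_tuber = return_rate(energy_kcal=800, digestibility=0.4,
                        acquire_h=1.0, process_h=0.0)

# The same tuber cooked: extra processing time, but far more
# of its energy becomes bioavailable.
cooked_tuber = return_rate(energy_kcal=800, digestibility=0.9,
                           acquire_h=1.0, process_h=0.5)

print(f"raw:    {raw_tuber:.0f} kcal/h")     # raw:    320 kcal/h
print(f"cooked: {cooked_tuber:.0f} kcal/h")  # cooked: 480 kcal/h
```

On these assumed figures, half an hour spent cooking raises the tuber's return from 320 to 480 kcal per hour of effort, which is the sense in which cooking "saves time and effort" despite adding a processing step.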

Natural “Garden of Eden” simply a myth. The conclusion is that the cooking of some foods, by saving time and effort and extending the range of the diet, would have enhanced survival in a significant way, even in supposedly more “ideal” tropical environments, but especially in areas where edible raw foods are scarce. That such difficulty in obtaining food occurs only in temperate zones and higher latitudes is disproven by the examples of the Australian Aborigines [O’Dea 1992] and the Bushmen [Bicchieri 1972 and below], as well as many others [Bicchieri 1972], including the tropical rainforest Ache of Paraguay [Hawkes et al. 1982]. (Note that even in the case of the Ache, one of the few extensively studied primitive peoples subsisting in a dense rainforest habitat–the type of environment considered ideal by raw-fooders–significant amounts of their food were cooked; and fruits, a typical raw-foodist staple, were not as easily obtainable as other foods in their diet [Clastres 1972, see esp. p. 156; also Hawkes et al. 1982, and Hill et al. 1984].)

The Garden of Eden is a myth–or, if you like, doesn’t exist on Earth. The bottom line, then, is that–however cooking may have gotten started–its use conferred significant (evolutionary) survival advantages, or it would not have eventually become part of the regular repertoire of that most opportunistic of all animal species: human beings.

