
(Fire and Cooking in Human Evolution--continued, Part D)

Are cooking's effects black-and-white
or an evolutionary cost/benefit tradeoff?


What about the contention by raw-food advocates that cooking foods results in pyrolytic by-products that are carcinogenic or otherwise toxic to the body, and should be avoided for that reason?

It's true that cooking introduces some toxic by-products, but it also neutralizes others.[132] In addition, the number of such toxins created is dwarfed by the large background level of natural toxins (thousands)[133] already present in plant foods to begin with, including some that are similarly carcinogenic in high-enough doses. (Of the naturally occurring plant substances known as "nature's pesticides," only a few dozen have been tested so far,[134] but half of those tested have been shown to be carcinogenic in trials with rats and mice.[135]) Nature's pesticides appear to be present in all plants; and though only a few are found in any one plant, they make up 5-10% of a plant's total dry weight.[136]

[The reason "nature's pesticides" occur throughout the plant kingdom is that plants have had to evolve low-level chemical defenses against animals to deter overpredation. On one level, plants and animals are locked in a continual evolutionary "arms race" against each other. Fruiting plants, of course, have also evolved the separate ability to exploit the fact that certain animals are attracted to fruit, enabling the plants' seeds to be dispersed through the animals' feces.]

We have a liver and kidneys for a reason: natural foods have always contained toxins the body has had to deal with, and that is one reason these organs evolved. The body also has a number of other, more general defenses against toxins. Such defenses make evolutionary sense given the wide range of toxic compounds in foods the body has had to contend with over the eons. [Perhaps not clear enough in the original version of the interview is the point that a wide range of GENERAL defenses might therefore reasonably be expected to help neutralize or eject even toxins of a type the body hadn't necessarily encountered before, such as those that might be introduced by cooking practices.] Such mechanisms include the constant shedding of surface-layer cells of the digestive system, numerous defenses against oxygen free-radical damage, and DNA excision repair, among others.[137]

The belief that a natural diet is, or can be, totally toxin-free is basically an idealistic fantasy--an illusion of black-and-white thinking not supported by real-world investigations. The real question is not whether a diet is completely free of toxins, but whether we are adapted to process the substances in our foods that are not usable by the body--in reasonable or customary amounts such as were encountered during evolution. Again, the black-and-white nature of much Hygienic thinking obscures what are here questions of degree rather than absolutes.

Cooking may favorably impact digestibility. I know raw-foodists generally don't like to hear this, but there has long been evidence that cooking does in fact make certain types of foods more digestible. For example, trypsin inhibitors (themselves a type of protease inhibitor), which are widely distributed in the plant kingdom--particularly in rich sources of protein--inhibit the ability of digestive enzymes to break down protein. (Probably the best-known plants containing trypsin inhibitors are legumes and grains.) Research has shown that cooking reduces the effect of most such protease inhibitors on digestion.[138] It was this advantage--expanding the range of utilizable foods in an uncertain environment--that helped bring cooking about and enhanced survival.*

I want to make clear that I still believe the largest component of the diet should be raw (at least 50% if not considerably more), but there is provision in the evolutionary picture for reasonable amounts of cooked foods of certain types, such as at the very least, yams, probably some other root vegetables, the legumes, some meat, and so forth. (With meat, the likelihood is that it was eaten raw when freshly killed, but what could not be eaten would likely have been dried or cooked to preserve it for later consumption, rather than wasting it.) Whether or not some foods like these can be eaten raw if one has no choice or is determined enough to do so is not the real question. The question is what was more expedient or practical to survival and which prevailed over evolutionary time.

Cooking practices of Aborigines in light of survival needs. A brief look at the Australian Aborigines might be illustrative here.* The data available since Europeans first encountered the Aborigines show that inland Aborigines in the desert areas were subject to severe food shortages and prolonged droughts.[139] This of course made the most efficient use of whatever foods could be foraged paramount. Estimates based on studies of Aborigines in northern Australia are that they processed roughly half of their plant foods, but that no food was processed unnecessarily--any such preparation being done only to make a food edible, more digestible, or more palatable.[140] In general, food was eaten as it was collected, according to its seasonal availability--except during times of feasts--with wastage being rare, a pattern characteristic of feast-and-famine habitats. Some food, however, was processed for storage and later retrieval (usually by drying), including nuts and seeds, some of which may instead have been ground and baked into cakes, before being buried in the ground or stored in dry caches.[141]

Fresh foods such as fruits, bulbs, nectar, gums, flowers, etc., were eaten raw when collected. Examples of foods that were prepared before consumption include the cooking of starchy tubers or seeds, grinding and roasting of seeds, and cooking of meat.[142]

That these practices were necessary to expand the food supply, and were not merely induced by frivolous cultural habits as raw-foodists often theorize, can be seen in the fact that after European colonization, Aborigines were not above coming into missions during droughts to get food.[143]


The role of individual experimentation given
evolutionary uncertainties about diet

Fly in the ointment: dietary changes since advent of agriculture. But the more interesting and more pressing question, to my mind, is not whether we are adapted to the cooking of certain foods, which seems very likely,* but how much we have adapted to the dietary changes of the Neolithic agricultural transition, given the 10,000 years or less it has been underway. At present the answer is unclear, although in general we can probably say there simply hasn't been enough time for full adaptation yet--or if there has, it is only in people descended from certain ancestral groups with the longest involvement with agriculture.

My guess (and it is just a guess) would be that we are still mostly adapted to a Paleolithic diet, but that for any particular individual with a given ancestral background, certain Neolithic foods such as grains--and for fewer people still, perhaps even modest amounts of certain cultured milk products such as cheese or yogurt (which are more easily digested than straight milk)--might be not only tolerated but helpful. This may be especially true where people avoid flesh foods, our primary animal-food adaptation; in such cases these animal by-products may be helpful,* as Stanley Bass's work with mice and his mentor Dr. Gian-Cursio's work with Hygienic patients seem to show, and as Dr. Bass has discussed previously here in H&B (in the April and June 1994 issues).



How are we to determine an optimum diet for ourselves, then, given that some genetic changes may be more or less complete or incomplete in different population groups?

I think what all of this points to is the need to be careful in making absolute black-and-white pronouncements about invariant food rules that apply equally to all. It is not as simple as saying that if we aren't sure we are fully adapted to something, we should just eliminate it from the diet to be safe. Adaptation to a food does not mean mere tolerance of it; it means that if we are in fact adapted to the food, we would be expected to thrive better with some amount of it in our diet. Genetic adaptation cuts both ways.

This is why I believe it is important for people to experiment individually. Today, given the Neolithic transition and the rates at which genetic changes are now known to take place, it is apparent that humanity is a species in evolutionary transition. Because genes flow and disseminate unevenly through a population during such times, it is unlikely we will find [more] uniform adaptation across the population, as we probably would have in earlier times. This means that in this particular historical period, individuals are more likely to differ somewhat in their responses to diet. And as we saw above (with the two genes ACE and apolipoprotein-B), these genetic differences may even confound attempts to replicate epidemiological dietary studies from one population to another unless such factors are taken into account.*

Conflicting data from various modern lines of evidence means people must experiment and decide for themselves. So while it is important to look for convergences among different lines of evidence (evolutionary studies, biochemical nutritional studies, epidemiological studies and clinical trials, comparative anatomy from primate studies, and so forth), it is well to consider how often the epidemiological studies, perhaps even some of the biochemical studies, reverse themselves or come back with conflicting data. It usually takes many years--even decades--for their import to become clear based on the lengthy scientific process of peer review and replication of experiments for confirmation or refutation.

Openness means challenging any rigid assumptions we may have through experimentation. So my advice is: don't be afraid to experiment. Unless you have specific allergies or strong food intolerances, the body is flexible enough by evolution to handle short-term variations from whatever an optimal diet might be. If you start within the general parameters we've outlined here and allow yourself to experiment, you have a much better chance of finding the particular balance among these factors that will work best for you. If you already have something that works well for you, that's great. If, however, you are looking for improvements, then given the uncertainties we've talked about, it's important to look at any rigid assumptions you may have about the "ideal" diet, and be willing to challenge them through experimentation. In the long run, you stand only to benefit by doing so.


Conflicts between paleo/anthropological
vs. biochemical/epidemiological evidence


Despite the evolutionary picture you've presented here, there are still objections that people have about meat from a biochemical or epidemiological standpoint. What about T. Colin Campbell's China Study for example?

Good point. Campbell's famous study, to my mind, brings up one of the most unremarked-upon recent conflicts in epidemiological data. In his lecture at the 1991 ANHS annual conference, reported on in the national ANHS publication Health Science, Campbell claimed that the China Study data pointed not just to high fat intake, but to the protein in animal food, as increasing cholesterol levels. (High cholesterol levels in the blood are now widely thought to be the biggest single factor responsible for increased rates of atherosclerosis--clogged blood vessels--and coronary heart disease.) According to him, the lower the level of animal protein in the diet (not just the lower the level of fat), the lower the cholesterol level in the blood. He believes that animal food is itself the biggest culprit, above and beyond fat levels in food.[144]

Campbell's conclusions about cholesterol and animal protein are contradicted by evidence from studies of modern hunter-gatherers. Yet as rigorous as the study is proclaimed to be, I have to tell you that Campbell's claim that animal protein by itself is the biggest culprit in raising blood cholesterol is contradicted by studies of modern-day hunter-gatherers who eat considerable amounts of wild game yet have very low cholesterol levels, comparable to those in the China Study. One review of tribes studied showed low cholesterol levels of 110 mg/dl for the Hadza (eating 20% animal food), 120 for the San Bushmen (20-37% animal), 139 for Australian Aborigines (10-75% animal), and 106 for Pygmies--all considerably lower than the now-recommended safe level of below 150.[145] Clearly there are unaccounted-for factors at work here yet to be studied sufficiently.

Large and significant differences between domesticated meat vs. wild game. One of them might be the difference in fat levels between domesticated meat and wild game: on average five times as much in the former as in the latter. On top of that, the proportion of saturated fat in domesticated meat compared to wild game is also five times higher.[146]

Another difference between these two meat sources is that significant amounts of EPA (an omega-3 fatty acid thought perhaps to help prevent atherosclerosis) are found in wild game (approx. 4% of total fat), while domestic beef, for example, contains almost none.[147] This is important because the higher levels of EPA and other omega-3 fatty acids in wild game help give hunter-gatherers a low overall dietary ratio of omega-6 to omega-3 fatty acids--ranging from 1:1 to 4:1--compared to the high 11:1 ratio observed in Western nations. Since omega-6 fatty acids may have a cancer-promoting effect, some investigators are recommending lower ratios of omega-6 to omega-3 in the diet, which would, coincidentally, be much closer to the evolutionary norm.[148]

Differences like these may go some way toward explaining the similar blood cholesterol levels and low rates of disease in both the rural Chinese, eating a very-low-fat, low-animal-protein diet, and in hunter-gatherers, eating a low-fat, high-animal-protein diet. Rural Chinese eat a diet of only 15% fat and 10% protein, with the result that saturated fats contribute only a low 4% of total calories. On the other hand, those hunter-gatherer groups approximating the Paleolithic norm eat diets containing 20-25% fat and 30% protein, yet the contribution of saturated fat to total caloric intake is nevertheless a similarly low 6% of total calories.[149]
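The arithmetic behind these figures can be sketched as a quick back-of-envelope check: the share of total calories from saturated fat is just (fat's share of calories) x (the fraction of that fat which is saturated). The diet percentages below come from the text; the saturated-fraction values are illustrative assumptions chosen to reproduce the cited 4% and 6% results, not figures from the study.

```python
def saturated_pct_of_calories(fat_pct_of_calories, saturated_fraction_of_fat):
    """Share of total calories supplied by saturated fat."""
    return fat_pct_of_calories * saturated_fraction_of_fat

# Rural Chinese: ~15% of calories from fat. If roughly 27% of that fat
# were saturated (an assumed fraction), saturated fat would supply
# about 4% of total calories, matching the cited figure.
china = saturated_pct_of_calories(15, 0.27)    # approx. 4

# Hunter-gatherers: 20-25% of calories from fat (midpoint 22.5%). Wild
# game is much leaner and lower in saturated fat than domesticated meat,
# so even a similar assumed saturated fraction (~27%) yields only about
# 6% of total calories from saturated fat, again matching the text.
hunter_gatherer = saturated_pct_of_calories(22.5, 0.27)    # approx. 6
```

The point the sketch illustrates is that despite twice the total fat and far more animal protein, the hunter-gatherer diet's saturated-fat share of calories lands nearly as low as the rural Chinese diet's, because the fat consumed is largely from lean wild game.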



What about the contention that high-protein diets promote calcium loss in bone and therefore contribute to osteoporosis?

The picture here is complex, and modern studies have been contradictory. In experimental settings, purified, isolated protein extracts do significantly increase calcium excretion, but the effect of increased protein from natural foods such as meat is smaller or nonexistent.[150] Studies of Eskimos eating an almost all-meat diet[151] (less than 10% plant intake[152]) have shown high rates of osteoporosis, but theirs is a recent historical aberration, not typical of the evolutionary Paleolithic diet thought to have averaged 65% plant foods and 35% flesh.* Analyses of numerous skeletons of our Paleolithic ancestors have shown high peak bone mass and low rates of bone loss in elderly specimens, compared to their Neolithic agricultural successors, whose rates of bone loss increased considerably even though they ate much lower-protein diets.[153] Nobody knows for sure why, though it is thought that the phosphorus in meat reduces excretion of calcium. In addition, people in Paleolithic times ate large amounts of fruits and vegetables[154] with an extremely high calcium intake (perhaps 1,800 mg/day, compared to an average of 500-800 for Americans today[155]) and led extremely rigorous physical lives, all of which would have encouraged increased bone mass.[156]
