yet our underlying genetic inheritance remains essentially the same as it was before, having evolved only very slightly since then. Thus, many of the foods we now eat are discordant with our genetic inheritance. (This is not simply an idle or "just so" hypothesis. As we proceed, we will look at the considerable clinical evidence supporting this picture.) Such "evolutionary discordance" is a fundamental aspect of the evolutionary equation governing fitness and survival (in which health plays a key role), and it bears directly on the question of which diet humans are best adapted, genetically, to handle.
To begin with, we will examine evolutionary discordance from a general standpoint by looking at the mismatch between the characteristics of foods eaten since the "agricultural revolution" that began about 10,000 years ago and our genus' prior two-million-year history as hunter-gatherers. As the article progresses, however, we'll look at some of the actual genetics involved, so it can be seen that "evolutionary discordance" is not merely a theoretical concept but a very real issue with relevance to how diseases can be genetically expressed in response to dietary factors.
With this key concept in mind, let's now begin with a look at the history of grains and legumes in the human diet (quite recent in evolutionary time), after which we'll move on to some of the evolutionarily discordant effects of their consumption on human beings, as seen in modern clinical and genetic studies.
Evidence for the late evolutionary role of grains in the human diet |
- Timeframe for cereal grain domestication. There are 8 major cereal grains consumed by modern man (wheat, rye, barley, oats, corn, rice, sorghum, and millet) [Harlan 1992]. Each of these grains was derived from wild precursors whose original ranges were quite localized [Harlan 1992]. Wheat and barley were domesticated only ~10,000 years ago in the Near East; rice was domesticated approximately 7,000 years ago in China, India, and southeast Asia; corn was domesticated 7,000 years ago in Central and South America; millets were domesticated in Africa 5,000-6,000 years ago; sorghum was domesticated in East Africa 5,000-6,000 years ago; rye was domesticated ~5,000 years ago in southwest Asia; and oats were domesticated ~3,000 years ago in Europe.
Consequently, the present-day edible grass seeds simply would have been unavailable to most of mankind until after their domestication because of their limited geographic distribution. Also, the wild versions of these grains were much smaller than the domesticated versions and extremely difficult to harvest [Zohary 1969].
How recent is grain consumption in the human evolutionary experience, in terms of our total dietary experience? The first member of the human genus, Homo, was Homo habilis, which has now been dated to ~2.33 million years ago (MYA) [Kimbel et al. 1996]. Homo erectus, whose post-cranial body proportions (the rest of the body below the skull) were similar to those of modern humans, appeared in Africa by about 1.7 MYA and is thought to have left Africa and migrated to Asia by 1 MYA or perhaps even earlier [Larick and Ciochon 1996]. Archaic Homo sapiens (called by some Homo heidelbergensis) has been dated to 600,000 years ago in Africa and to about 400,000 years ago, or perhaps earlier, in Europe [De Castro et al. 1997].
Anatomically modern Homo sapiens appear in the fossil record in Africa and the Mideast by about 90,000-110,000 years ago, and behaviorally modern H. sapiens are known in the fossil record by ~50,000 years ago in Australia and by ~40,000 years ago in Europe.
The so-called "Agricultural Revolution" (primarily the domestication of animals, cereal grains, and legumes) occurred first in the Near East about 10,000 years ago and spread to northern Europe by about 5,000 years ago [Cavalli-Sforza et al. 1993]. The industrial revolution occurred roughly 200 years ago, and the technological revolution which brought us packaged, processed foods is primarily a development that has occurred in the past 100 years and has seen enormous growth in the last 50 years.
To gauge how little geologic or evolutionary time humans have been exposed to foods wrought by the agricultural revolution, let's do a little paper experiment. Take a stack of computer paper (the continuous-feed kind in which the pages are connected to one another) and count out 212 eleven-inch (28-cm) pages. Then unfold the stack of paper and lay it out end to end--it will form a continuous 194-foot (59-meter) strip. Now, let's assume that 1 inch (2.54 cm) equals 1,000 years along this 194-foot strip; thus, the beginning of the first page represents the emergence of our genus 2.33 MYA and the end of the last page represents the present day.
Now, take a slow walk down all 194 feet of the computer paper and carefully look at each of the individual eleven-inch sections. Only when you reach the very last eleven-inch section (the 212th) do you arrive at the beginning of agriculture in the Mideast 10,000 years ago; throughout the preceding 211 sheets, humanity's foods were derived from wild plants and animals. This little experiment allows you to fully grasp how recent in the human evolutionary experience cereal grains are (as well as dairy products, salt, and the fatty meats of domesticated animals).
Humans may indeed have eaten these foods for "millennia," but even 10 millennia represents only about 0.4% of the overall timeframe of human existence. Because the estimated amount of genetic change (0.005%) which has occurred in the human genome over this period is negligible, the genetic makeup of modern man has remained essentially unchanged from that of pre-agricultural man [Eaton et al. 1985]. Consequently, the human genome is most ideally adapted to those foods which were available to pre-agricultural man, namely lean muscle meats, limited fatty organ meats, and wild fruits and vegetables--but, significantly, not grains, legumes, dairy products, or the very high-fat carcasses of modern domesticated animals.
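To make the scaling behind the paper experiment and the 0.4% figure explicit, here is a short back-of-the-envelope calculation (an illustrative Python sketch added here; the variable names are my own, but the numbers are those quoted above).

```python
# Back-of-the-envelope check of the paper-strip analogy and the "0.4%" figure.
# The numbers come from the text above; the script itself is only illustrative.
GENUS_HOMO_YEARS = 2_330_000   # ~2.33 million years since the appearance of Homo
AGRICULTURE_YEARS = 10_000     # approximate start of the agricultural revolution
YEARS_PER_INCH = 1_000         # chosen scale: 1 inch of paper = 1,000 years
PAGE_INCHES = 11               # one sheet of continuous-feed computer paper

strip_inches = GENUS_HOMO_YEARS / YEARS_PER_INCH        # 2,330 inches
pages = strip_inches / PAGE_INCHES                      # ~212 pages
strip_feet = strip_inches / 12                          # ~194 feet
agric_inches = AGRICULTURE_YEARS / YEARS_PER_INCH       # 10 inches: less than one page
agric_fraction = AGRICULTURE_YEARS / GENUS_HOMO_YEARS   # share of the whole timeline

print(f"{pages:.0f} pages, about {strip_feet:.0f} feet of paper")
print(f"Agriculture occupies the last {agric_inches:.0f} inches "
      f"({agric_fraction:.1%} of the strip)")
# -> ~212 pages, ~194 feet; agriculture spans ~10 inches, about 0.4% of the whole.
```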
- Processing technology required. Clearly, grass seeds have a worldwide distribution and would have been found in most environments early man inhabited. However, because almost all of these seeds are quite small, difficult to harvest, and require substantial processing before consumption (threshing, winnowing, grinding, and cooking), it would have been virtually impossible for humans before the advent of behavioral modernity (circa 35,000-40,000 years ago) to exploit this food source.
To harvest and process grains on a large scale, sickles, winnowing trays (baskets), threshing sticks, grinding stones, and cooking apparatus are required. There is no reliable evidence to indicate that this combination of technology was ever utilized by hominids until the late Pleistocene. The advent of grinding stones in the Mideast approximately 15,000 years ago heralds the first large-scale evidence of regular cereal grain consumption by our species [Eaton 1992]. There is substantial evidence that certain modern-day hunter-gatherers, such as the Australian Aborigines and the American Great Basin Indians, utilized grass seeds [Harlan 1992]; however, these seeds were not a staple: they represented only a small percentage of total caloric intake and were eaten for only a few weeks out of the year. For virtually all of the rest of the studied hunter-gatherer populations, cereal grains were not consumed.
- Optimal foraging theory. In view of the substantial amount of energy required (as just outlined) to harvest, process, and eat cereal grains, optimal foraging theory suggests that they generally would not be eaten except under conditions of dietary duress [Hawkes et al. 1985]. It seems likely that during the Late Paleolithic and before, when large mammals abounded, our ancestors would almost never have consumed the seeds of grasses.
- Comparison with other foraging primates. Except for some species of baboons, no primate consumes gramineae (grass) seeds as part of its regular natural diet. Primates in general evolved in tropical rainforests, in which dicotyledons predominate--consequently, monocotyledons (gramineae) would not have been available to our primate ancestors.
- Primate digestive physiology. The primate gut is not equipped with the enzyme systems required to derive energy from the specific types of fiber which predominate in gramineae. Consequently, unless cereal grains are milled to break down the cell walls and cooked to gelatinize the starch granules (and hence make them more digestible), the proteins and carbohydrates are largely unavailable for absorption and assimilation. Thus, until the advent of regular fire use and control (as evidenced by hearths ~125,000 years ago), it would have been virtually impossible, energetically, for our species to consume cereal grains to supply the bulk of its daily caloric requirements.
- Repercussions of antinutrient load. As John Yudkin suggested almost 30 years ago, cereal grains are a relatively recent food for hominids, and our physiologies are still adjusting and adapting to their presence. Clearly, no human can live on a diet composed entirely of cereal grains (for one thing, they have no vitamin C). However, that is but one consideration, since eating raw cereal grains (as well as cooked cereal grains) wreaks havoc on the primate gut because of their high antinutrient content. When cereal grain calories reach 50% or more of the daily caloric intake, humans suffer severe health consequences. One has to look no further than the severe pellagra epidemics of the late 19th century in America and the beri-beri scourges of southeast Asia to confirm this.
Additionally, not only in human beings but in virtually every animal model studied (dog, rat, guinea pig, baboon, etc.), high cereal grain consumption promotes and induces rickets and osteomalacia [Robertson 1981; Ewer 1950; Sly 1984; Ford 1972, 1977; MacAuliffe 1976; Hidiroglou 1980; Dagnelie 1990]. Recent research has also implicated zinc deficiency, induced by excessive cereal grain consumption, in retarded skeletal growth [Reinhold 1971; Halsted 1972; Sandstrom 1987; Golub 1996], including cases of hypogonadal dwarfism seen in modern-day Iran.
The pathologies introduced by higher levels of cereal grain consumption discussed above are due primarily to the effects of phytates in grains, which bind to minerals, preventing their adequate uptake. To this point, we haven't even touched upon the other antinutrients which inflict damage on a wide variety of human physiological systems. These antinutrients include protease inhibitors, alkylresorcinols, alpha-amylase inhibitors, molecular-mimicking proteins, etc. We will look further at these additional problems below. Clearly, however, cereal grains cannot contribute substantial calories to the diet of primates unless they are cooked and processed.
Digestive considerations and technology required |
Question: Granted, grains would not have made up a large portion of the diet. Nevertheless, if people could in some way have comfortably eaten some amount of wild grains without technology, then given the opportunistic nature of human beings, there's not much reason to think they wouldn't have, is there?
Commentary: People can put many plant items as well as non-edible items (stones, bones, feathers, cartilage, etc.) into their gastrointestinal tracts by way of putting them into their mouths. The key here is the ability of the GI tract to extract the nutrients (calories, protein, carbohydrate, fat, vitamins, and minerals). Bi-gastric herbivores (those having second stomachs) have evolved an efficient second gut with bacteria that can ferment the fiber found in leaves, shrubs, grasses, and forbs (broad-leaved herbs other than grass) and thereby extract nutrients in an energetically efficient manner. (That is, there is more energy in the food than in the energy required to digest it.) Humans can clearly put grasses and grass seeds into our mouths; however, we do not have a GI tract which can efficiently extract the energy and nutrients.
The starch and hence carbohydrate and protein calories in cereal grains occur inside the cell walls of the grain. Because the cell walls of cereal grains are almost completely resistant to the mechanical and chemical action of the human GI tract, cereal grains have been shown to pass through the entire GI tract and appear intact in the feces [Stephen 1994]. In order to make the nutrients in cereal grains available for digestion, the cell walls must first be broken (by milling) to liberate their contents and then the resultant flour must be cooked. Cooking causes the starch granules in the flour to swell and be disrupted by a process called gelatinization which renders the starch much more accessible to digestion by pancreatic amylase [Stephen 1994]. It has been shown that the protein digestibility of raw rice is only 25% whereas cooking increases it to 65% [Bradbury 1984].
The main cereal grains that humans now eat (wheat, rice, corn, barley, rye, oats, millet, and sorghum) are quite different from the wild, ancestral counterparts from which they were all derived in the past 10,000 years. We have deliberately selected for grains that are large, have minimal chaff, and are easily harvested. The wild counterparts of these grains were smaller and difficult to harvest. Further, separation of the chaff from the grain was time-consuming and required fine baskets for the winnowing process. Once the chaff is separated from the grain, the grains have to be milled and the resultant flour cooked. This process is time-consuming and obviously could only have come about in very recent geologic times. Further, the 8 cereal grains now commonly eaten are endemic to very narrow geographic locations, and their geographic isolation consequently would have made them unavailable to all but a select few populations of hominids.
As touched upon previously, the issue of antinutrients in raw cereal grains is a very real one. There are components in raw cereal grains which wreak absolute havoc with human health and well-being. The primary storage form of phosphorus in cereal grains is phytate, and phytates bind virtually all divalent ions, i.e., minerals for our purposes. Excessive consumption of whole-grain unleavened breads (50-60% of total calories) commonly results in rickets [Robertson 1981; Ewer 1950; Sly 1984; Ford 1972, 1977; MacAuliffe 1976; Hidiroglou 1980; Dagnelie 1990], retarded skeletal growth [Reinhold 1971; Halsted 1972; Sandstrom 1987; Golub 1996] including hypogonadal dwarfism, and iron-deficiency anemia (I will provide the references upon request). The main lectin in wheat (wheat germ agglutinin) has catastrophic effects upon the gastrointestinal tract [Pusztai 1993a]. Additionally, the alkylresorcinols of cereals influence prostanoid tone and induce a more inflammatory profile [Hengtrakul 1991], as well as depressing growth [Sedlet 1984].
Given the barriers to grain consumption faced by primitive hominids, who did not possess the more sophisticated technology seen only since about 15,000 years ago, optimal foraging theory again strongly suggests that any consumption would have been at extremely minimal levels. Given also the human gut's lack of adaptation to prevent the negative effects of grain consumption, effects which such technology mitigates only partially, it is extremely unlikely that cereal grains were ever more than a very minute fraction of the human diet until very recent times.
Genetic changes to the human gut in evolutionary perspective |
Question: What evidence is there for the speed at which genetic changes governing the way the gastrointestinal tract functions can occur? Isn't there evidence showing that, for example, changes in the genes governing lactose tolerance can occur quite rapidly in evolutionary terms? What basis is there for believing that the human gut is really the same as that of our hominid ancestors during Paleolithic times?
Commentary: There are calculations which estimate how long it took for the gene for adult lactase persistence (ALP) to increase among northern Europeans from a pre-agricultural incidence rate of 5% to its present rate of approximately 70% [Aoki 1991]. (Note: The enzyme lactase is required to digest the sugar lactose in milk, and normally is not produced in significant quantity in human beings after weaning.) In order for the gene frequency to increase from 0.05 to 0.70 within the 250 generations which have elapsed since the advent of dairying, a selective advantage in excess of 5% may have been required [Aoki 1991].
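To give a feel for the arithmetic behind such estimates, here is a minimal deterministic sketch (in Python, added for illustration) of one-locus selection on a dominant allele. It is emphatically not Aoki's model, which also treats the cultural spread of dairying and random drift; the fitness scheme, the treatment of 5% and 70% directly as allele frequencies, and the function name are simplifying assumptions of mine. It only illustrates the qualitative point that a selective advantage of a few percent can move an allele from rarity to high frequency within a few hundred generations.

```python
# Minimal deterministic sketch of selection on a dominant allele. This is NOT
# Aoki's (1991) model, which also accounts for the cultural spread of dairying
# and for drift; it only shows how quickly allele frequencies can shift under
# modest selection. Assumed fitnesses: AA = Aa = 1 + s (persistence), aa = 1.

def generations_to_reach(p_start: float, p_target: float, s: float) -> int:
    """Generations for the allele frequency to climb from p_start to p_target."""
    p, generations = p_start, 0
    while p < p_target:
        q = 1.0 - p
        mean_fitness = (1 + s) * (p * p + 2 * p * q) + q * q
        p = (1 + s) * p / mean_fitness  # standard one-locus selection recursion
        generations += 1
    return generations

# Illustrative runs: start near 5% and ask when the allele frequency passes 70%.
for s in (0.02, 0.05, 0.10):
    print(f"s = {s:.2f}: ~{generations_to_reach(0.05, 0.70, s)} generations")
# In this simplified model, selection coefficients of a few percent carry the
# allele from 5% to 70% within a few hundred generations at most, comparable
# to the ~250 generations since dairying began.
```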
Therefore, some genetic changes can occur quite rapidly, particularly in polymorphic genes (those with more than one variant already in existence) with wide variability in their phenotypic expression. ("Phenotypic expression" means the physical characteristic(s) which a gene produces.) Because humans normally maintain lactase activity in their guts until weaning (approximately 4 years of age in modern-day hunter-gatherers), the type of genetic change (neoteny) required for adult lactase maintenance can occur quite rapidly if there is sufficient selective pressure. Maintenance of childlike characteristics into adulthood (neoteny) is also what occurred with the geologically rapid domestication of the dog during the late Pleistocene and Mesolithic [Budiansky 1992].
The complete rearrangement of gut morphology, or the evolution of new enzyme systems capable of handling novel food types, is quite unlikely to have occurred in humans in the short time since the advent of agriculture. Some populations have had 500 generations to adapt to the new staple foods of agriculture (cereals, legumes, and dairy), whereas others have had only 1-3 (e.g., Inuit, Amerindians). Because anatomical and physiological studies among and between various racial groups indicate few differences in the basic structure and function of the gut, it is reasonable to assume that 500 generations since the advent of agriculture is insufficient evolutionary experience to create large genetic differences among human populations in their ability to digest and assimilate various foods.
The population differences in gastrointestinal function which have been identified are generally associated with an increased ability to digest disaccharides (lactose and sucrose) via varying disaccharidase activity. Although insulin metabolism is not a direct component of the gastrointestinal tract, there is substantial evidence to indicate that recently acculturated populations are more prone to hyperinsulinemia and its various clinical manifestations, including non-insulin-dependent diabetes mellitus (NIDDM), obesity, hypertension, coronary heart disease, and hyperlipidemia [Brand-Miller and Colagiuri 1994].
It is thought that these abnormalities, collectively referred to as "syndrome X" [Reaven 1994], are the result of a so-called "thrifty gene" [Neel 1962] which some groups have suggested codes for glycogen synthase [Schalin-Jantti 1996]. Consequently, the ability to consume increasing levels of carbohydrate without developing symptoms of syndrome X is likely genetically based and a function of relative time exposure of populations to the higher carbohydrate contents of agriculture [Brand-Miller and Colagiuri 1994].
There are no generally recognized differences in the enzymes required to digest fats or proteins among human populations. Additionally, no human group, regardless of its genetic background, has been able to overcome the deleterious effects of phytates and other antinutrients in cereal grains and legumes. Iranian, Inuit, European, and Asian populations alike suffer from divalent ion (calcium, iron, zinc, etc.) sequestration with excessive (>50% of total calories) cereal or legume consumption. Nor has any racial group evolved gut characteristics which allow it to digest the food energy potentially available in the major type of fiber contained in cereal grains. Further, most of the antinutrients in cereal grains and legumes (alkylresorcinols, amylase inhibitors, lectins, protease inhibitors, etc.) wreak their havoc upon human physiologies irrespective of differing genetic backgrounds.
Thus, most of the available evidence supports the notion that, except for the evolution of certain disaccharidases and perhaps changes in some genes involving insulin sensitivity, the human gut remains relatively unchanged from Paleolithic times.
Celiac disease as evidence of genetic and evolutionary discordance |
Simoons' classic work on the incidence of celiac disease [Simoons 1981] shows that the distribution of the HLA B8 haplotype of the human major histocompatibility complex (MHC) closely follows the spread of farming from the Mideast to northern Europe. Because there is strong linkage disequilibrium between HLA B8 and the HLA genotypes associated with celiac disease, this indicates that the populations which have had the least evolutionary exposure to cereal grains (primarily wheat) have the highest incidence of celiac disease. This genetic argument is perhaps the strongest evidence to support Yudkin's observation that humans are incompletely adapted to the consumption of cereal grains.
Thus, the genetic evidence for human disease (here I have used celiac disease, though other models of autoimmune disease could have been used) is supported by the archeological evidence, which in turn supports the clinical evidence. The extrapolation of paleodiets has therefore provided important clues to human disease--clues which may have gone unnoticed without the convergence of data from many diverse fields (archaeology, nutrition, immunology, genetics, anthropology, and geography).
For a celiac, a healthy diet is definitely cereal-free--why is this so? Perhaps now the evolutionary data is finally helping to solve this conundrum.
Biotin deficiency and the case of Lindow Man |
Lindow Man, whose preserved body was found in a peat bog in Cheshire, England, in 1984, is one of the more extensively studied of the so-called "bog mummies" [Stead, Bourke, and Brothwell 1986]. The principal component of Lindow Man's last meal was likely a non-leavened whole-meal bread, probably made of emmer wheat, spelt wheat, and barley. Unleavened whole-grain breads such as this represented a dietary staple for most of the less-affluent classes of that time. Excessive consumption of unleavened cereal grains negatively impacts a wide variety of physiological functions which ultimately present themselves phenotypically (i.e., via changes in physical form or growth). The well-documented phytates of cereal grains sequester many divalent ions, including calcium, zinc, iron, and magnesium, which can impair bone growth and metabolism. Further, there are antinutrients in cereal grains which directly impair vitamin D metabolism [Batchelor 1983; Clement 1987], and rickets is routinely induced in animal models via consumption of high levels of cereal grains [Sly 1984].
Less well appreciated is the ability of whole grains to impair biotin metabolism. My colleague Bruce Watkins [Watkins 1990], as well as others [Blair 1989; Kopinski 1989], has shown that biotin deficiencies can be induced in animal models by feeding them high levels of wheat, sorghum, and other cereal grains. Biotin-dependent carboxylases are important in the metabolic pathways of fatty-acid synthesis, and deficiencies severely inhibit the chain elongation and desaturation of 18:2n6 (linoleate) to 20:4n6 (arachidonic acid). Human dietary supplementation trials with biotin have shown this vitamin to reduce the fingernail brittleness and ridging that are associated with deficiencies of this vitamin [Hochman 1993].
Careful examination of the photograph of Lindow Man's fingernail (still attached to a phalange of the right hand [Stead 1986, p. 66]) shows the characteristic "ridging" of biotin deficiency. It is likely that regular daily consumption of high levels (>50% of daily calories) of unleavened cereal-grain breads, which Lindow Man may have consumed, caused a biotin deficiency, which in turn caused the nail ridging.
Antinutritional properties of legumes |
Question: So far we have been discussing grains. What about legumes? Could they realistically have been eaten as a staple by primitive groups without cooking, and if they are natural to the human evolutionary experience, why do they cause gas, which indicates fermentation of indigestible products in the gut? If they are not natural for us, how do we account for the !Kung and other primitive groups who eat them?
Commentary: As with grain consumption, there are hunter-gatherers who have been documented eating legumes. However, in most cases the legumes are cooked, or the tender early sprouts are eaten raw rather than the mature pod. Some legumes in their raw state are less toxic than others; however, most legumes in their mature state are indigestible and/or toxic to most mammals when eaten in even moderate quantities. I refer interested readers to:
- Liener IE (1994) "Implications of antinutritional components in soybean foods." Crit Rev Food Sci Nutr., vol. 34, pp. 31-67.
- Gupta YP (1987) "Antinutritional and toxic factors in food legumes: a review." Plant Foods for Human Nutrition, vol. 37, pp. 201-228.
- Noah ND et al. (1980) "Food poisoning from raw red kidney beans." Brit Med J, vol. 2, pp. 236-237.
- Pusztai A et al. (1981) "The toxicity of Phaseolus vulgaris lectins: Nitrogen balance and immunochemical studies." J Sci Food Agric, vol. 32, pp. 1037-1046.
These references summarize the basics about legume indigestibility/toxicity; however, there are hundreds if not thousands of citations documenting the antinutritional properties of legumes. Legumes contain a wide variety of antinutrient compounds which influence multiple tissues and systems, and normal cooking procedures do not always eliminate these [Grant 1982]. There are a variety of compounds in beans which cause gas. Mainly, these are the non-digested carbohydrates raffinose, stachyose, and sometimes verbascose, which provide substrate for intestinal microflora to produce flatus [Calloway 1971].