By Alex Enescu (M.Sc. Candidate, Department of Psychiatry, Faculty of Medicine).
Dieting is ubiquitous. But this does not mean that everyone has willingly adapted, customized, or modified their way of eating. Rather, it means that different diets assert themselves, whether naturally or through the governing forces of tradition and habit, within different cultural domains. For example, what counts as a “normal” diet in India will not count as a normal diet in neighbouring Bangladesh. Similarly, the “Chinese diet”, if there is such a thing, is remarkably different from the equally equivocal “European diet”. Taoists strongly oppose the consumption of grains, whereas Mesopotamian cultures strongly encouraged it. While no single place has a singular way of eating, and eating habits tend to diffuse across loosely defined, diaphanous geo-cultural borders, there is nonetheless a particular way of eating that can, within a specific historical moment, be associated with any given group of people.
Diets are shaped by a form of “natural selection”: eating habits that encourage survival get passed down, while eating habits that discourage it do not. These dietary trends often have particular historical precedents. Take, for example, the emblematic Irish nightshade crop: the potato. Potatoes are not indigenous to Europe. In fact, they are a rather recent addition to the European menu, imported from South America roughly three hundred years ago. Despite this, this particular nightshade crop has come to dominate European culinary traditions in a way that few other foods have managed. Potatoes earned their place on the culinary scene by virtue of their flavour, their yield, and their ability to provide nourishment to many geo-cultural regions despite the harsh northern European climate.
The North American diet is no different. It is a mixture of European eating traditions, indigenous North American crops such as corn, and various adaptive necessities, such as learning how to grow European crops in a non-European environment. The Standard American Diet has changed dramatically since its original conception. Today it is composed of roughly 25% dairy products, 35% grain products (e.g., rice, wheat, rye, corn, and so on), 15% animal products (not counting dairy), 15% sugars, sweeteners and other processed sweets, and less than 10% vegetables and fruits. Abbreviated as SAD (perhaps a suitable acronym), the diet has a long and intricate history (for a brief introduction, see parts I & II of this series). While most people who live in North America have adopted a SAD way of eating, some have opted for a “healthier” alternative. For example, some North Americans have turned to veganism, others to pesco-vegetarianism. Still others have opted for an ovo-lacto-vegetarian way of eating, while many have chosen to self-fashion their eating habits even further, producing ever more complex combinatorial dietary taxonomies: the ovo-lacto-pescatarian-vegetarian diet; paleo; keto; low-carb; ovo-lacto-gluten-free-vegetarian; flexitarian-semi-vegetarian-ovo-lacto-pescatarianism; macrobiotic-raw-vegetarianism; and so on.
The question, which has undoubtedly plagued your mind just as it has long plagued mine, is this: why is there so much confusion about nutrition? Why are there so many different ways of eating within a single geo-cultural environment? The answer to these questions can be traced back to a national obsession with cardiovascular disease prevention (see previous posts, I & II, for more details).
By the end of the 1980s, the USDA (United States Department of Agriculture) had thoroughly reviewed the overwhelming majority of available scientific publications on nutrition, consulted the leading dieticians and nutrition scientists of the period, and counselled many branches of government, in an effort to issue the first government-approved national diet plan. In particular, the USDA was concerned, as were many doctors of that period, about the rising rates of cardiovascular disease, cancer, obesity and diabetes. It therefore embarked on a crusade to halt the progression of these disorders through a well-structured, scientifically based, consciously orchestrated and universalized diet plan. The result (see previous blog posts for more details) was a low-fat, quasi-vegetarian diet, a way of eating that became ingrained in the American imagination in 1992 with the advent of the USDA food pyramid. By 1992, then, the government had achieved its national dietary ambitions, and cardiovascular disease was expected to drop as a consequence of these policies. To the USDA’s horror, the exact opposite happened.
After the introduction of the USDA food pyramid, America got fatter and sicker, suffering from more cancer, cardiovascular disease, diabetes and neurological disorders than ever before. In fact, entirely new “obesity brackets” had to be introduced every five years just to keep up with the obesity epidemic! Categories of weight that did not exist in the early 1990s came to dominate the entire demographic spectrum of the United States within less than a decade of the pyramid’s introduction. But it wasn’t just obesity that was on the rise; diabetes, cancer and cardiovascular disease were also increasing dramatically. In time, it became apparent that this was not a plurality of disorders but a single disorder expressed through different facets: collective statistical data quickly showed a strong, albeit poorly understood, link between obesity, cancer, diabetes and cardiovascular disease.
Some of these disorders came to be grouped together in the medical literature. For example, the term “diabesity” was coined to refer to diabetes and obesity as a single disorder. Later, type II diabetes, obesity, cardiovascular disease, polycystic ovary syndrome, lipid disorders, hypertension, cancer, dementia and non-alcoholic fatty liver disease came to be collectively referred to as “metabolic syndrome”. Interestingly, while these disorders appear superficially to be radically different from one another, it is now increasingly accepted that they all derive from the same fundamental metabolic disorder.
How does any of this relate back to dieting? Well, since the 1950s, it has been observed that some culinary traditions appear to be protective, whereas others appear to increase the risk of developing metabolic disorders. This became particularly evident among the East Asian demographic of the United States. Ethnic Japanese people living in Japan, for example, have repeatedly shown significantly lower rates of cardiovascular disease, cancer, obesity and diabetes than their North American counterparts. Yet when Japanese people immigrate to North America, they gradually begin to exhibit the same rates of metabolic disorders as Americans (an effect most prominent in the second and later generations of immigrants). Scientists tend to agree that diet plays a significant contributory role in these statistics. The question, however, has been, and still remains: why do different eating habits either promote or discourage the development of metabolic disorders?
The USDA identified fat as the main dietary cause of diabesity and cardiovascular disease. But the low-fat diet, which was nationally adopted to varying degrees and was undoubtedly successful in its overall implementation throughout North America (as evidenced by the multiplicity of available low-fat products, lean-meat consumption, and an overarching quasi-vegetarian culture), did little to curb the disastrous spread of these disorders. Because of this, two major dietary schools have surfaced in recent years to challenge the legacy of the USDA food pyramid.
The first group argues that the USDA did not go far enough. In their view, fat is such a problematic nutrient that it can only be consumed in exceedingly small quantities before it becomes harmful, that is, in far smaller quantities than the USDA originally suggested. They therefore argue that, in order to curtail the exponential growth of metabolic disorders, consumers must adopt a vegetarian, or preferably vegan (i.e., no animal products), low-fat diet. Proponents of this approach include Dr. Dean Ornish, Dr. Michael Greger, Dr. John McDougall and Dr. Colin Campbell, among others. Collectively, this group of researchers advocates that, in order to avoid developing metabolic syndrome, adults should consume a high-carb, low-fat diet composed of roughly 80% carbohydrates, 10% protein and less than 10% fat.
The alternative group developed a radically different solution to the same problem. Researchers such as Dr. Stephen Phinney, Dr. Jeff Volek, Dr. Eric Westman and Dr. Thomas Seyfried suggest that the entire low-fat enterprise is problematic and built on shaky science. They argue that fat is an old nutrient, widely consumed throughout human history, and that blaming it for a modern health epidemic makes little sense. These researchers encourage a return to a “Palaeolithic”, pre-agricultural way of eating: a diet predominantly composed of animal products, leafy greens, nuts and seeds, which excludes all grains, root vegetables (e.g., potatoes), processed foods and sugars. They posit that carbohydrates, as a non-essential macronutrient, should only be consumed in moderation, if at all, preferably under 50 g per day. So, while proponents of a low-fat vegetarian/vegan diet advocate a predominantly carb-based diet, the paleo/ketogenic/low-carb camp asserts that a healthy diet means consuming 80% of calories as fat, 15% as protein and less than 5% as carbohydrates. The two schools could not be more at odds with each other.
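To get a concrete feel for how far apart these two prescriptions really are, it helps to convert the calorie percentages into grams of food per day. Here is a minimal sketch, assuming a 2,000 kcal daily intake and the standard Atwater factors (4 kcal per gram of carbohydrate or protein, 9 kcal per gram of fat); the function name and the 2,000 kcal figure are my own illustrative choices, not figures from either research group.

```python
# Standard Atwater energy densities (kcal per gram of macronutrient).
KCAL_PER_GRAM = {"carbs": 4, "protein": 4, "fat": 9}

def grams_per_day(percent_split, total_kcal=2000):
    """Convert a {macro: % of calories} split into grams per day."""
    return {
        macro: round(total_kcal * pct / 100 / KCAL_PER_GRAM[macro])
        for macro, pct in percent_split.items()
    }

# The low-fat school's roughly 80/10/10 split...
low_fat = grams_per_day({"carbs": 80, "protein": 10, "fat": 10})
# ...versus the low-carb school's roughly 5/15/80 split.
low_carb = grams_per_day({"carbs": 5, "protein": 15, "fat": 80})

print(low_fat)   # {'carbs': 400, 'protein': 50, 'fat': 22}
print(low_carb)  # {'carbs': 25, 'protein': 75, 'fat': 178}
```

At 2,000 kcal, the low-fat camp’s 80% carbohydrate target works out to about 400 g of carbohydrate per day, while the low-carb camp’s 5% works out to about 25 g, comfortably under the 50 g ceiling mentioned above. A sixteen-fold difference in a single macronutrient makes the gulf between the two schools hard to overstate.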
To be continued…
Banner photo by @Comfreak on Pixabay.