In a 2013 cross-sectional study of Texas Head Start teachers,[1] researchers found that “nutrition-related knowledge, attitudes, and behaviors” were severely lacking. Although the vast majority of the surveyed teachers believed nutrition was important, and although health is a major priority of the national Head Start program, not a single participant could correctly answer all five elementary questions about nutrition (e.g., which has more calories: protein, carbohydrate, or fat?), and most reported being confused about nutrition. Moreover, reflecting trends in the general public, most of the study’s participants were either overweight or obese.
This may be only one example, but it illustrates a much larger problem in our society: we suffer from an undeniable epidemic of nutrition confusion. In fact, this confusion has become a mammoth industry unto itself. The explosive market for lifestyle books, blogs, magazines, podcasts, and other media indicates an extraordinary demand for guidance, and it’s virtually impossible to avoid the near-constant stream of advertising that preys on our confusion. We are not a healthy, well-informed public.
A combination of international correlation studies, migration studies, experimental laboratory animal studies, and human intervention studies has linked poor or inadequate nutrition to many of the leading causes of death: heart disease, cancer, kidney disease, diabetes, and more. On the other hand, that means there is huge room for improvement. The number of premature deaths that could be prevented by good nutrition boggles the mind. Consider the number one killer in the US, heart disease, which claims the lives of about 650,000 Americans every year.[2] Estimates backed by research suggest that up to 90% of those deaths could be prevented by the informed use of nutrition.[3] That’s a potential 585,000 lives saved from only one disease!
There is also a huge financial incentive to get our act together. Medical expenses are the number one cause of bankruptcies in this country,[4] and health care spending currently accounts for nearly 20% of the nation’s GDP, a huge increase from previous decades.[5] Dealing better with these lifestyle-related diseases would free up a tremendous amount of resources.
More than 50 years have passed since the National Cancer Act of 1971 became law.[6] This legislation was hailed as the first strike in the “War on Cancer,” and it marked the beginning of many efforts: new cancer research centers were established, the National Cancer Institute was overhauled and began to take the shape it has today, and a more concerted, proactive effort began to discover how cancer works and how we might treat it. When this legislation was drafted, cancer was a leading cause of death, and alarmingly high incidence and mortality trends offered little encouragement for the future.
President Richard Nixon signing the National Cancer Act of 1971.
Credit: National Cancer Institute
A huge volume of research has accumulated over these past five decades, but the situation remains fundamentally unchanged. Though we understand the disease better than ever before, it remains a leading cause of death both in the US and internationally. Although the treatment and management of certain types of cancer have improved, overall incidence and mortality rates, especially when you include rising rates in developing countries, illustrate the need for a new approach. And yet, as was the case in 1971, nutrition remains consistently underutilized.
Nutrition can offer a profound benefit to people suffering from chronic diseases, and it is an empowering alternative to the current disease treatment system. That’s why it is especially important that the best nutrition research be made available to everyone, and not warped by the influence of industry. Likewise, any institution that hinders clarity or limits access to this information, whether it is a nonprofit organization, a professional society, or a university, undermines its own legitimacy and deserves to be questioned. However, institutions will always have a role to play:
“certain regulatory, legal, and financial objectives can only be achieved by the collective action that institutions facilitate… the question, then, is not how to obliterate these institutions, but how to flip the script of history and radically transform our systems so that they no longer hinder growth, but rather accelerate it.”
The status quo, which tends toward nutrition ignorance, will not spontaneously reverse. We must all confront this challenge, together and in our own lives. The history of nutrition is riddled with examples of poor judgment, inaction, unnecessary controversy, and corruption; if the future is going to be better, it will take a concerted effort from all of us.
Although there are many general recommendations—e.g., protect academic freedom; advocate for policies that increase access to healthy, affordable food; promote local and regenerative agriculture practices—the field of nutrition specifically is going to require a huge overhaul:
In The Future of Nutrition, T. Colin Campbell cuts through the noise with an in-depth analysis of our historical relationship to the food we eat, the source of our present information overload, and what our current path means for the future, both for individual health and for society as a whole. The book offers a fascinating deep dive behind the curtain of the field of nutrition, with implications both for our health and for the practice of science itself.