The Looming Brain Health Crisis
The Good News
Let’s start with some good news. We, as a human race, are living longer than ever before. Life expectancy has been steadily on the rise for over two hundred years. During the twentieth century in particular, there was nothing short of a boom in human longevity. This dramatic increase in life expectancy ranks as one of society’s greatest achievements. According to the Centers for Disease Control and Prevention, while most babies born in 1900 did not live past age fifty, life expectancy now averages just under eighty years in most industrialized countries.
It turns out that the secret behind our recently extended life span is not due to genetics or natural selection, but rather to the relentless improvements made to our overall standard of living. From a medical and public health perspective, these developments were nothing less than game changing. For example, smallpox has been eradicated by mass vaccination, while other major diseases such as polio and measles have been driven to the brink of elimination. At the same time, better living standards achieved through improvements in education, housing, nutrition, and sanitation systems have substantially reduced malnutrition and infections, preventing many unnecessary deaths among children. Furthermore, technologies designed to improve health have become available to the masses, whether via refrigeration to prevent spoilage or systemized garbage collection, which in and of itself eliminated many common sources of disease. These impressive shifts have not only dramatically affected the ways in which civilizations eat, but also determined how civilizations will live and die.
In the end, we are living longer and longer lives. In most industrialized nations old age is now a reasonable expectation, so much so that scientists are adamant: an older society is here to stay. That’s good news—news hard won over millennia of human history.
The Not-So-Good News
Now for the flip side. As it turns out, to some degree, we might be victims of our own success. Unfortunately, this increase in life span has not necessarily provided us with additional years of particularly high-quality health. Old age can come with wisdom, but it just as regularly arrives with some less illustrious additions. Hearing loss, bifocal glasses, slower reflexes, and common medical ailments such as arthritis, rheumatism, and respiratory problems are examples of those side effects we’d rather do without. What is of greater concern is that deterioration of the brain sneaks up on many of us as we age, making us vulnerable to memory deficits and loss of cognitive function.
Over the years, I’ve asked countless patients, “What concerns you most about your future health?” More often than not, it wasn’t the condition of their heart or even the risk of cancer that came to mind. Today, the greatest fear for most people is that they might end their days battling dementia.
The most common cause of dementia, and probably the most feared, is the memory-robbing Alzheimer’s disease. The idea of losing track of our own thoughts, or of being unable to remember our loved ones, is cause for great anxiety, fear, and stress. Equally daunting is our inevitable grief at seeing a relative or close friend suffer from this devastating disease.
This concern is understandable. Of all the challenges to aging in the twenty-first century, nothing compares to the unprecedented scale of Alzheimer’s. According to recent reports from the Alzheimer’s Association, the number of people living with Alzheimer’s in the United States alone is an estimated 5.3 million. As the baby boomer generation ages, the number of patients is predicted to reach a staggering 15 million cases by 2050. That is the population of Los Angeles, New York, and Chicago combined.
A similar trend is observed worldwide. Today more than 46 million people across the globe live with dementia. This number is estimated to increase to 132 million by the year 2050.
Further, while Alzheimer’s represents the most recognizable (and most common) form of dementia, there are many ways a healthy brain can go awry: other forms of dementia, Parkinson’s disease, stroke, depression, and so forth. As more countries reap the benefits of longer lives, the burden of all these disorders is reaching alarming proportions. If that weren’t enough, beyond specific disorders, general age-related cognitive impairment might affect three to four times as many people, with extraordinary psychological, social, and economic consequences.
As we take in the challenges of such an unprecedented brain health crisis, the year 2050 doesn’t seem so far away.
We need a cure, and we need it fast.
The Breaking News
Now for the news that provides us with hope. Recent medical breakthroughs have radically changed our understanding of aging and disease by showing that the brain changes leading to dementia unfold over decades before anyone ever forgets a name or loses their keys. These findings have revealed a much more complex picture than previously imagined.
Two technologies in particular have deeply changed the way we understand brain aging. On the one hand, we finally have access to “cheap genomics” (affordable DNA testing), which allows us to take an important peek into our genetic predispositions. While just five years ago we would have had to spend thousands of dollars to do a proper genetic screening on patients, today anyone can obtain such precious information for just a few hundred dollars.
In addition, we have lab tests such as brain imaging that allow us to view how the brain is functioning over time, in response to both our genetics and our lifestyle choices. Scientists now have access to sophisticated brain imaging techniques, such as magnetic resonance imaging (MRI) and positron emission tomography (PET), which allow a view of the human brain from the inside out. Brain imaging has given us a rare and opportune window from which we can catch a glimpse of the actual progression of many brain diseases years in advance of any noticeable clinical symptoms. Finally, we can track the development of diseases like Alzheimer’s as they unfold, and use that knowledge to identify people at risk many years, if not decades, before clinical symptoms emerge.
As you’ll notice, a large part of the discussion around nutrition for brain health references Alzheimer’s. This is primarily because Alzheimer’s is one of the few neurological diseases reaching epidemic proportions that scientists can agree is influenced by diet, and as such, it is the instigator for much of the funded research in the space. In order to figure out what people need to eat to improve or maintain optimal cognitive capacities, we need to compare people who age gracefully (from the brain’s perspective) to those who unfortunately do not. In this context, Alzheimer’s is in effect shorthand for the most extreme responses of the brain to the nutrients we provide. The lessons learned, and behaviors to follow, apply therefore to broader cognitive health as well as many, if not all, forms of cognitive decline associated with brain aging. In much the same way that following the guidelines to prevent heart disease is good for everyone—not just those at risk for cardiac events—the newly discovered dietary strategies to prevent Alzheimer’s are also those that optimize cognitive health overall, over the course of a lifetime and with benefits across the board. Research findings in Alzheimer’s can then be used as a framework that stands in for cognitive decline of the aging brain as a whole.
By using brain imaging, several teams across the world have been successful in mapping the development of Alzheimer’s over time, showing how it occurs gradually in the brain and progresses over a twenty-to-forty-year period before clinical symptoms emerge. In other words, cognitive impairment is not a mere consequence of old age, but rather represents the endgame of year after year of accumulated insults to the brain. What’s even more disconcerting is that the brain changes leading to dementia can begin as early as young adulthood, and in some cases, even from birth. As it turns out, Alzheimer’s is not a disease of the old, nor does it hit without warning.
Currently, our understanding is that many genetic, lifestyle, and environmental factors can potentially damage the brain while one is still young, triggering a cascade of pathological events that ultimately lead to cognitive deterioration. Whether we’re referring to the somewhat typical forgetfulness and mild memory issues that many people experience around age sixty, or to the full-blown dementia and loss of independent function in older age, there is a long period of time during which brain changes can be under way without the disease yet causing any noticeable symptoms.
If this sounds frightening, take heart.
The key message from these studies, including my own work, is that this lengthy gap leaves a precious window of time to finally and thoroughly explore the power of prevention. There is increasing evidence that implementing the lifestyle changes described in this book has the potential to prevent Alzheimer’s from developing and also to help slow down or even halt progression of the disease in those who are currently suffering from dementia.
If that weren’t enough, eating for your brain isn’t just a powerful preventative against disease—it actually helps you achieve peak performance in every part of your life. Beyond the specific fears over any particular disease and toward a more general hope for better brain health over a longer life, this is a call to action. Anyone who is old enough to consider how their brain will remain healthy into old age is old enough to start making vital changes to address that immediately.
Copyright © 2018 by Lisa Mosconi PhD. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.