Health & Healthcare – Brain Blogger: Health and Science Blog Covering Brain Topics (Wed, 30 May 2018)

Hunger Signaling: Can It Be Regulated to Treat Obesity? (Mon, 23 Apr 2018)

Recent research suggests that obesity can be controlled by regulating the satiety cascade, including by influencing the nerves that carry hunger signals.

Global obesity levels have almost doubled in the last 30 years. This is a worrying fact, especially when considering that obesity represents one of the major risk factors for many chronic diseases, cardiovascular and metabolic diseases in particular. Thus, it is not surprising that conditions like insulin resistance, pre-diabetes, and diabetes are becoming increasingly common worldwide.

Although it is clear that obesity develops when caloric energy intake exceeds energy expenditure, it is not always easy to combat excess body weight and fat accumulation. Multiple strategies for tackling obesity and food intake have been developed. These include behavioral (including dietary) changes, interventions with various supplements, and pharmacological and surgical treatments.

The latest research-supported developments aim to control obesity by regulating the satiety cascade, including the nerves that carry hunger signals. Hunger is a neural signal that initiates eating, and hunger signals originate in the stomach. Gut hormones then transfer information from the gastrointestinal tract to the centers of appetite regulation in the central nervous system. This communication between the gut and the brain is known as the gut-brain axis.

It is assumed that information from the gut can be transferred to the brain via both nerve signaling and the blood circulation. The hypothalamus has been identified as a key brain region controlling our eating behavior: it integrates peripheral signals carrying information about dietary intake as well as about energy expenditure.

The hypothalamus receives appetite signals and reacts by modulating the release of neuropeptides in two neuronal populations. While one population of neurons co-express neuropeptides that stimulate appetite and increase hunger (and thus promote eating and weight gain), the other population acts via neuropeptides that decrease appetite (and reduce eating and promote weight loss). Consequently, the balance between these two neuronal populations is essential for the maintenance of optimal body weight.

Considering the importance of the gut-brain axis, influencing the hormones and neurons that carry hunger signals could be a good strategy for obesity control. Drugs that act via hunger-regulating hormones are often prescribed in order to control obesity and body weight, even though they do not appear to be particularly effective in the long term. Meanwhile, the development of strategies that act on neurons and neural signaling is still in the early stages.

One very recent pilot study demonstrated that freezing the nerves that carry hunger signals could be an effective weight-loss approach in subjects with mild to moderate obesity. Ten subjects with a body mass index (BMI) between 30 and 37 kg/m2 underwent an innovative procedure: a radiologist inserted a needle through the patient’s back and used argon gas to freeze the nerve (the posterior vagal trunk) that transfers hunger signals from the gut to the brain. The procedure was guided by live CT images.

After the treatment, the patients were followed for three months. All subjects reported decreased appetite, accompanied by a lower BMI and significant weight loss. Specifically, one week after the procedure the average weight loss was 1%, and after three months it was 3.6%; at the three-month follow-up a 13% decline in BMI was also recorded. Importantly, no participant experienced adverse effects or complications. The aim of this pilot project was not to shut down the biological response to hunger, but to dampen the hunger signals and reduce their strength. Judging by these first results, the procedure could represent a useful strategy for reducing excess weight gain through the control of appetite and food intake.
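As a sanity check on numbers like these: BMI is simply weight in kilograms divided by the square of height in meters. A minimal sketch, using hypothetical patient numbers (not data from the study):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2


def percent_change(before: float, after: float) -> float:
    """Relative change, as a percentage of the starting value."""
    return (after - before) / before * 100


# Hypothetical patient: 100 kg at 1.70 m, within the study's
# 30-37 kg/m^2 BMI range. Apply the reported 3.6% three-month weight loss.
start_weight = 100.0
height = 1.70
end_weight = start_weight * (1 - 0.036)

print(round(bmi(start_weight, height), 1))                  # prints 34.6
print(round(percent_change(start_weight, end_weight), 1))   # prints -3.6
```

Note that with height fixed, the percent change in BMI tracks the percent change in weight one-to-one, so a reported BMI decline larger than the weight loss presumably refers to a different baseline, such as excess BMI above the obesity threshold.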

The role of the vagus nerve in the regulation of appetite has been studied previously. Electrical stimulation of the vagus nerve has been identified as clinically relevant for mood regulation, i.e., in the treatment of patients who do not respond to antidepressant drugs. Stimulation of the vagus nerve for the purpose of obesity treatment, on the other hand, has gained attention only recently. Studies in animal models have shown that long-term vagus nerve stimulation could prevent further weight gain by decreasing food consumption. However, the mechanisms involved remain unclear and further research is required.

Apart from stimulating the vagus nerve, studies have also examined the effects of vagal blockade (interrupting vagus nerve signaling, for example by cutting the nerve) on obesity control. The findings from these studies suggested more pronounced weight loss and prolonged satiety compared with the effects achieved by vagus nerve stimulation. However, clinical studies in humans are still ongoing. Hopefully, they will reveal how we could better control body weight through this nerve.

It is obvious that our central nervous system and nerve signaling play an important role in the regulation of appetite and food intake. In addition to cutting caloric intake and increasing energy expenditure, acting on hunger signaling could potentially be an efficient strategy for obesity control as well. Nonetheless, even though the data from recent findings are promising, they require further confirmation in larger clinical trials.


Amin, T., Mercer, J. G. (2016). Hunger and Satiety Mechanisms and Their Potential Exploitation in the Regulation of Food Intake. Current Obesity Reports. 5(1): 106-112. doi: 10.1007/s13679-015-0184-5

Buhmann, H., le Roux, C.W., Bueter, M. (2014). The gut-brain axis in obesity. Best Practice and Research. Clinical Gastroenterology. 28(4): 559-571. doi: 10.1016/j.bpg.2014.07.003

Prologo, J.D., Cole, S., Bergquist, S., Corn, D., Knight, J., Matta, H., Singh, A., Lin, E. (2018). Percutaneous CT-guided cryovagotomy for the management of mild-moderate obesity: a pilot trial. Society of Interventional Radiology Annual Scientific Meeting.

Browning, K.N., Verheijden, S., Boeckxstaens, G.E. (2017). The Vagus Nerve in Appetite Regulation, Mood, and Intestinal Inflammation. Gastroenterology. 152(4): 730-744. doi: 10.1053/j.gastro.2016.10.046

Image via silviarita/Pixabay.

Born Prematurely? Adult Mental Illness Risk Linked to Brain Injury and Dopamine (Thu, 25 Jan 2018)

Babies born prematurely who also sustain small brain injuries at the time of birth are more likely to have lower levels of dopamine as adults, according to a new study led by researchers at King’s College London.

Dopamine is a neurotransmitter associated with motivation, attention, concentration, and finding enjoyment in life. Low levels of this chemical may lead to serious mental health conditions such as depression and substance dependence.

Although one in 10 babies is born prematurely, most experience no major complications around the time of birth. However, 15-20 percent of babies born before 32 weeks of pregnancy experience bleeding in the brain’s ventricles (fluid-filled spaces). If this bleeding is significant, it can cause long-term problems.

While the exact link between birth complications and greater risk of mental health issues is still unclear, one theory states that the stress of a complicated birth could lead to increased levels of dopamine, which is also increased in people with schizophrenia.

To investigate this further, researchers from King’s, Imperial College London, and the Icahn School of Medicine at Mount Sinai in New York used a combination of positron emission tomography (PET) and magnetic resonance imaging (MRI) scans of the brain, along with a range of psychological tests. They wanted to determine the precise chemical and structural changes in the brain following early damage.

They compared three groups of people: adults born very preterm who sustained early brain damage, adults born very preterm who did not sustain brain damage, and controls born at term.

Dr. Sean Froudist-Walsh, the study’s first author commented:

People have hypothesized for over 100 years that certain mental illnesses could be related to problems in early brain development. Studies using animal models have shown us how early brain damage and mental illness could be linked, but these theories had not been tested in experiments with humans.

We found that dopamine, a chemical that’s important for learning and enjoyment, is affected in people who had early brain injury, but not in the way a lot of people would have thought — dopamine levels were actually lower in these individuals.

This could be important to how we think about treating people who suffered early brain damage and develop mental illness. I hope this will motivate scientists, doctors and policymakers to pay more attention to problems around birth, and how they can affect the brain in the long term.

Research has shown that mental health problems often arise from a complex mix of genetic vulnerability factors combined with negative or traumatic life experiences. Difficulties at birth may count as one of the most stressful life experiences.

Dr. Chiara Nosarti, the study’s joint senior author from King’s College London, notes on the overarching impact of the research:

The discovery of a potential mechanism linking early life risk factors to adult mental illness could one day lead to more targeted and effective treatments of psychiatric problems in people who experienced complications at birth.

This guest article originally appeared as “Dopamine Seen As Link to Mental Illness for Brain-Injured Preemies” by Traci Pedersen.


Froudist-Walsh, S., Bloomfield, M., Veronese, M., Kroll, J., Karolis, V., Jauhar, S., et al. (2017). The effect of perinatal brain injury on dopaminergic function and hippocampal volume in adult life. eLife, 6. doi:10.7554/eLife.29088

Image via esudroff/Pixabay.

Lamarckian Evolution is Making a Comeback (Mon, 08 Jan 2018)

When scientists see the term “Lamarckian evolution”, the usual reaction is that it references a long-debunked theory. But that might be changing.

Lamarck was an accomplished biologist living in the late 18th and early 19th centuries. He was an expert on the taxonomy of invertebrates and a respected botanist, and he also wrote about physics, chemistry, and meteorology.

He is best remembered for his publication of Philosophie Zoologique in 1809 in which he lays out his theory of evolution. He describes two laws of nature. The first is that animals develop or lose physical traits depending on usage of those traits. For example, giraffes got their long necks because they constantly stretched to reach high leaves in trees during their lifetime. The second law states that these acquired changes during a lifetime are passed on to offspring, i.e., inherited. These two laws explain how species evolve by continual adaptation to their environment and eventually branch off into new species once the changes become large enough—so-called Lamarckian evolution.

There were other interesting aspects of his theories. He believed that there was some natural force that drove organisms toward increased complexity that was set apart from the usage law. The wide variety of organisms found in nature was because different life forms appeared spontaneously at different times.  Thus they do not all evolve from a common ancestor.  When gaps seemed to appear in the fossil record in certain lineages, he attributed that to a failure in finding all the relevant fossils. His theory clearly assumed gradual and continual evolution, but that evolution was always driven toward greater complexity.

Lamarckian evolution was largely debunked when the work of Gregor Mendel and others later demonstrated that inheritance occurs according to discrete rules of dominant and recessive traits rather than through acquired characteristics. Discoveries in genetics during the 20th century further put the notion of inheritance through acquired characteristics to rest.

But Lamarck has gotten a bit of a reprieve in the 21st century. By 2003, we had completed the Human Genome Project, which told us a lot about our genome and genes, but little about the epigenome. Since then, we’ve learned a lot. The epigenome refers to the 98% of our genome that does not code for proteins (what we traditionally call genes). Instead, much of that huge portion of our genome has to do with the regulation of genes, largely through the coding of various types of RNA. We have between 20,000 and 25,000 protein-coding genes, about the same number as a mouse or even a worm. And many, if not most, of these genes do pretty much the same thing across a wide spectrum of animals. What makes us different from a mouse or a worm is largely controlled by the epigenome.

It turns out that the epigenome responds to various factors in our environment, like diet and toxins. These factors cause changes in the epigenome during one’s lifetime, which, in turn, change the expression of various genes; the DNA sequence of the genes themselves is never altered. The remarkable fact is that some of the epigenomic changes acquired during a lifetime are passed on to progeny through the sperm and egg! Although this is not through the usage of body parts as Lamarck proposed, it is evidence of inheritance of traits acquired during a lifetime. One could call that Lamarckian.

Another way that acquired traits could be passed on to progeny in the future will be through germline genetic engineering when and if that becomes acceptable. So perhaps Lamarck was more prescient than we give him credit for.

Lamarck was extremely accomplished and well ahead of his time. He lived long before we understood genetics, and his evolutionary theories preceded Darwin’s. To some extent, he has been given a bit of a bum rap. He got some things right and some things wrong; you can say that about a lot of our great scientists. He recognized that something changes in individuals across generations and that those changes interact with the environment. Darwin also theorized that individuals change from generation to generation; neither understood that these changes first require random genetic variation. Both knew that the environment played a large role in evolution, although Darwin’s natural selection, rather than usage of body parts, is what is generally accepted today as the driving environmental force. Lamarck was wrong about the multiple spontaneous emergences of different life forms at different times, but he was correct that apparent gaps in evolutionary lines reflect an incomplete fossil record.

Let’s give Jean-Baptiste Lamarck his due.


Carey, N. (2012). The Epigenetics Revolution: How Modern Biology Is Rewriting Our Understanding of Genetics, Disease, and Inheritance (1st ed.). Columbia University Press.

Image via Sponchia/Pixabay.

The Dangers of American Sexual Prudishness (Mon, 11 Dec 2017)

Is the US the world’s most uptight nation regarding sex? Maybe not the most, but certainly among them. For example, the US has more laws regulating sexual behavior than all European countries combined. US prudishness is so severe as to be deadly. To end sexual violence and harassment against women, something has to change.

Is America the World’s Most Uptight Nation When It Comes to Sex?

Less than half of girls and boys in the US have received the HPV vaccinations that can protect them from deadly cancers. Why? Because HPV is a sexually transmitted infection (STI), and discussing teen sexual activity is taboo. Many doctors refuse to recommend the vaccine because they are uncomfortable discussing STIs.

Related to this prudishness is the view that women’s bodies are purely sexual and therefore all female nudity is provocative and shameful. Even public breastfeeding makes most Americans uncomfortable because a woman’s breast is exposed.

This prudishness about women’s bodies claims to be “protecting” women. At its heart, however, it is about power rather than sex. The “protection” it provides is both seductive and insidious: seductive, because many women find it comforting to imagine that men, even strangers such as legislators, are protecting them from danger; insidious, because of what that “protection” implies for women’s autonomy.

Whom do we protect? Children and adults who are too young, inexperienced, weak, or incompetent to protect themselves. Putting a normal adult woman into this category disempowers her, ensuring that someone else can dictate the most intimate conditions of her life: how she dresses, where she can go alone, whether she has final authority over her own body.

Prudishness also justifies a perceived division between “good” and “bad” women. The former are modest, compliant, and “covered up”; the latter, bold, proud, and independent. That separation buttresses men’s sense that they can treat “bad” women badly. Because these women are “out there,” they can be objectified, attacked, harassed, and groped. The result is evident, as the tidal wave of sexual violence and harassment reports continues to grow.

Despite broad recognition of this public health epidemic and dedicated efforts to end sexual violence and harassment, few programs have been successful. The problem is that they are fighting an uphill battle against the prevailing social mores described above. If men are inherently more powerful than women and can define “good” and “bad” women, the only way to end sexual assault and harassment is to convince men they should not assault women. Otherwise, the only option is to mitigate the impact by convincing bystanders to intervene, or training women to defend themselves.

We need a completely new approach. Let’s consider societies with two striking cultural differences from the US. These cultures hold that women are equal to men and that women, from teenhood, should have complete control over their own bodies.

Consider the Kreung society of the lovely Ratanakiri (“Mountain of Jewels”) Province in Cambodia. The Kreung believe that healthy, loving marriages require women who are strong, self-assured, and confident about their sexuality. Parents help each teen daughter achieve this state by giving her a room of her own. She can invite a boy she likes to spend the night in her room. There, she makes all the rules and reigns supreme. Will they talk the night away? Sleep? Cuddle? Have sex? She alone decides. In this completely secure space, she is free to explore her own sexuality, to discover what pleases her. When she says, “No,” he obeys instantly, without argument or bad feelings. A boy who flouts this rule faces severe penalties from the entire community, as do his parents.

Take another interesting group, the Vanatinai, a small island society off New Guinea. There, women and men are equal in all major aspects of life: decision-making, ritual practices, spiritual power, property holdings, and sexual activity. By working hard to gain goods and giving them away through ritual generosity, anyone of any sex can become one of the authoritative and influential leaders known as “gia”. Everyone is free to engage in sex before marriage, to end a marriage, and to marry as often as, and with whomever, he or she wishes.

The result? Divorce is rare in these societies; sexual violence virtually unknown.

The Takeaway

Sexual violence and harassment are rooted in the very foundations of culture. It is not enough to tell men they should not indulge, or bystanders that they should intervene, or women that they should protect themselves. Ending sexual violence and harassment requires a fundamental shift in cultural attitudes and values, beginning with equality between women and men, and women’s complete control over their own bodies. This change includes ending the putative “protection” of women—including laws to restrict abortion, to regulate women’s attire in ways that are different from those for men, or other social and legal constraints that claim to “protect” but actually disempower and diminish women. Only such basic cultural and legal changes will make it possible to end sexual violence and harassment against women.

References

Sexual Violence: Prevention Strategies (2017). [Online resource].

Lepowsky, M. (1993). Fruit of the motherland. New York: Columbia University Press.

Mullin, E. (2017). The Cancer Vaccine That Too Many People Ignore. MIT Technology Review, 120(6), pp. 16-17.

Muong, V. (2014). ‘Love huts’ of Ratanakiri minorities: Is a tradition quietly slipping away?. The Phnom Penh Post.

Procida, R. and Simon, R. (2007). Global perspectives on social issues. Lanham, Md.: Lexington Books.

Image via frolicsomepl/Pixabay.

Re-Learning the Joy of Living with Journaling and Meditation (Mon, 20 Nov 2017)

Moving along the treadmill of life, many of us succumb to the ever-present pressures to be faster, stronger, more efficient, and smarter. We lose perspective on what is happening in our lives. We focus on failure and what we lack, rather than on the abundance and opportunities for growth that surround us.

We stop taking the time to appreciate the simple pleasures of our lives as we spiral our way into a depleted existence—physically, emotionally, and mentally. Now more than ever, there is a global need to circumvent this pattern of being. We need to learn to unconditionally love and appreciate ourselves just as we are, how we are.

Growing Epidemic

Traditionally, stress, anxiety, and depression were attributed to genetic disposition, personality traits, stressful life events, physical health problems, and substance abuse, as well as to imbalances of serotonin, dopamine, and epinephrine within the brain. Whilst this is largely still the case, the perception has broadened over the last decade or so.

In our fast-paced world, we judge ourselves as harshly as we judge others. We are encouraged and manipulated to compete with others with whom we continually compare ourselves. We try so hard to emulate or exceed the expectations placed upon us that we forget our personal needs in the process (no time for that!). Stress and anxiety often result as we try to prove our worth to the world, and depression looms when we judge ourselves as falling short of the benchmark set for us.

Prolonged periods of stress wreak havoc on the human mind and body. Chronic muscle tension leads to tension headaches and migraines. The cardiovascular, respiratory, and endocrine systems become over-taxed and the risk increases for the development of diseases like asthma, type 2 diabetes, and heart disease (just to name a few).

In recent times the media has reported stress, anxiety, and depression as reaching epidemic proportions, reportedly attributable to numerous causes including an increase in hours in front of computer screens, national and cultural competitiveness, the exposure to a broadening range of choices due to advances in technology, and the belief that worthiness is related to monetary success.  Additionally, there is a sense of “collective stress” in regards to issues such as climate change and terrorism.

Mindfulness & Self-Reflection

Whilst living at a faster pace, society has forgotten the art of living in the present moment. Yet there is much evidence that engaging in mindfulness and self-reflection enables sufferers to break the cycle of anxiety, stress, and depression, as it promotes a greater sense of well-being and perspective.

Those who engage in mindfulness and self-reflection can improve their ability to relate to the world around them in a more compassionate and empathetic manner. A sense of gratitude, joy, and abundance is often a pleasant side effect.

The Value of Meditation & Journaling

Studies have shown that journaling can positively impact a person’s mental health as it allows one to “capture” a thought for long enough to acquire a 360-degree perspective on what that thought is about, where it came from, and how acting on that thought might impact those around us.

Many forms of meditation, like mindfulness meditation, work particularly well with journaling as it takes the mind out of a conscious, judgmental state and into a reflective, sub-conscious state. Such meditative practices smooth the path for writing down thoughts and feelings by prompting less judgment of the thoughts being written down.

Meditative practice can be merely taking five minutes to go for a walk in the park or to focus on breathing patterns—anything that promotes being in the present moment. Likewise, there are many effective journaling techniques that may help people with self-reflection and mindfulness that work well with meditation, and they are not restricted just to writing.

For many, mindfulness can be achieved through writing, art, or photography: any means that allows a person to step into a reflective zone. It is for each individual to explore what takes them to that special place where they can set down their emotional and mental baggage, touch base with their soul, and nurture a sense of gratitude and appreciation for the simple, attainable pleasures of day-to-day life.


University of Michigan Depression Center. Depression Journaling.

Tams, L. (2013, May 1). Journaling to Reduce Stress. Michigan State University Extension.

Hidaka, B.H. (2013). Depression as a disease of modernity: explanations for increasing prevalence.

McCormack, A. (2016). Lovitude: Trying to Calm the Monkey Mind. Nelson, NZ: Peacock Dreaming Publications.

Image via GDJ/Pixabay.

MCI and Anosognosia: She Didn’t Know She Couldn’t Walk (Mon, 30 Oct 2017)

My mother’s first fall, at age ninety-two, seemed like an accident. Her rubber-soled sandals caught on the carpet in the activities room of her senior apartment complex and she plunged forward, catching her arm on the piano bench. After minor surgery to repair the wound, and a short hospital stay, she began a course of physical therapy; but the falls continued. Injuries were rare and minor. Still, the increasing frequency was distressing.

When I suggested a walker, Mom told me, “I don’t need one. I can walk perfectly well. I just lose my balance sometimes.” I puzzled over that response for days. Was it possible she didn’t remember all the falls? Did she really think she could walk safely without assistance?

It was not until after she died that I came upon the concept of anosognosia, from the Greek a- (without), nosos (disease), and gnosis (knowledge): a lack of knowledge of one’s own disease. People with anosognosia are unaware of their illness or deficits. It is most often associated with mental illness and is the primary reason for medication non-compliance in psychiatric patients.

It also affects seniors with age-related brain changes, including mild cognitive impairment. Unaware of her weakness and impaired balance, my mother had not understood that she couldn’t walk without falling. And because she did not believe she was impaired, she could not imagine why she should use a walker.

Anosognosia is often assessed with the Clinical Insight Rating (CIR) scale, which identifies lack of knowledge or insight in four domains: understanding the reason for the visit (to the physician); awareness of cognitive deficit; awareness of functional deficit; and perception of disease progression. Mom’s lack of insight pertained to her functional deficit. Although her anosognosia resulted from an aging brain and increasing cognitive impairment, she was sadly aware of her memory problems and disordered thoughts. People with more severe forms of dementia (including Alzheimer’s disease) are more likely to exhibit anosognosia of cognitive deficit, while those with mental illness, brain tumors, or stroke often exhibit lack of awareness in multiple domains.

At the time, I thought my mother’s refusal to accept help stemmed from her fierce independence. I knew she had trouble thinking clearly, but I still tried to reason with her, hoping that in time she would acknowledge her functional decline and her need for assistance. But neuropsychological testing shows that patients with anosognosia score lower on composite indices of both memory and executive function. Facts and evidence are not persuasive to anyone with a pathological lack of insight.

Looking back, I now understand that Mom’s falls probably signaled her progression to “mild cognitive impairment” or MCI. The Mayo Clinic defines MCI as an intermediate stage between the expected cognitive decline of normal aging and the more-serious decline of dementia; it is often assessed using the Mini Mental Status Exam (MMSE). Studies have shown that the rate of falls increases with each unit decrease in the MMSE, and that insight can be equally impaired in persons with MCI or more advanced dementia.

I wish my mother had received a diagnosis of MCI, but it is not surprising that she didn’t. Most people with MCI can and do live independently, as my mother did, and are able to fulfill basic social roles. They are typically capable of most activities of daily living, although on close observation they take more time to complete them than unimpaired individuals. My mother’s mental decline was almost imperceptible at first, and as it progressed, I did not know enough to ask whether her symptoms were evidence of a specific diagnosis.

Knowing she had MCI would not have helped my mother, but it would have helped me. I would have done my research and uncovered her anosognosia. I would have realized that her reluctance to use a walker was grounded in her understanding of her abilities, wrong as that understanding was. I would have felt more comfortable taking over some of her decision-making.

Eventually, a physical therapist convinced my mother to use the walker for one week, promising that she could stop if she didn’t like it. Within two days, she began referring to it as her Cadillac. After that, her rate of falling decreased substantially.


De Carolis A, Corigliano V, Comparelli A, Sepe-Monti M, Cipollini V, Orzi F, Ferracuti S, Giubilei F: Neuropsychological patterns underlying anosognosia in people with cognitive impairment. Dement Geriatr Cogn Disord. 2012;34:216–223. doi:10.1159/000343488

De Carolis A, Cipollini V, Corigliano V, Comparelli A, Sepe-Monti M, Orzi F, Ferracuti S, Giubilei F: Anosognosia in people with cognitive impairment: Association with cognitive deficits and behavioral disturbances. Dement Geriatr Cogn Disord. 2015;5:42-50. doi:10.1159/000367987

Vogel A, Hasselbalch SG, Gade A, Ziebell M, Waldemar G: Cognitive and functional neuroimaging correlate for anosognosia in mild cognitive impairment and Alzheimer’s disease. Int J Geriatr Psychiatry 2005;20:238–246. doi:10.1002/gps.1272

Lin F, Vance DE, Gleason CE, Heidrich, SM: Taking care of older adults with mild cognitive impairment: An update for nurses. J Gerontol Nurs. 2012 December; 38(12): 22–37. doi:10.3928/00989134-20121106-02

Gleason CE, Gangnon RE, Fischer BL, Mahoney JE: Increased risk for falling associated with subtle cognitive impairment: Secondary analysis of a randomized clinical trial. Dement Geriatr Cogn Disord 2009;27:557–563. doi:10.1159/000228257

Image via cocoparisienne/Pixabay.

The Most Important Thing We Can Do for Our Brain? Exercise! Tue, 24 Oct 2017 15:00:43 +0000 Had I been asked ten years ago what the best way to train the brain was, I would probably have guessed crossword puzzles, sudoku, or cognitive apps. I would have been wrong. The best way is physical exercise. Over the last decade, neuroscience has shown that physical exercise has extraordinary effects on our brain.

Most people know by now that exercise improves mood, but few know that it also boosts all of our cognitive abilities: memory, attention, creativity, and the way we cope with stress. It all gets better, in a way unparalleled by any drug, food supplement, or cognitive training method.

So what happens in our brain when we move? First, the brain gets more blood: blood flow increases by about 20% during a brisk walk compared with sitting. More blood means more oxygen and nutrients. But increased blood flow is only the beginning. Exercise also increases the rate of neurogenesis, the formation of new brain cells. These newly born cells form in the dentate gyrus, part of the hippocampus, the brain's "memory center", and the effect is substantial: the hippocampus actually grew by 2% in a group of sedentary individuals who walked regularly for a year. Typically, the hippocampus shrinks by up to 1% per year from our late twenties onwards, contributing to gradual memory loss as we get older. This exercise-based boost of hippocampal growth not only improves memory but also lifts mood. Exercise has been shown to be as effective as antidepressants for mild and moderate depression, useful information in an age when more than one in ten adults in the US is prescribed antidepressants.

How about kids? Exercise does wonders for children's cognitive abilities and their capacity to learn. Just 20 minutes of play increases math and reading test scores. And this is not limited to tests in the lab: several studies have shown that kids in good shape actually perform better in school. Physical activity even seems to affect IQ. When Swedish military conscription data from 1.2 million 18-year-old Swedish men were analyzed, a clear pattern emerged: those with good cardiovascular fitness had higher IQs. The pattern held even among identical twins. In a number of twin pairs, one brother was in good shape while the other was not, and the fitter brother had the higher IQ, even though the two share essentially identical genes (there can be small differences between identical twins) and grew up together.

The list goes further still. Exercise can make us more creative. A recent study showed that scores on a divergent-thinking ("brainstorming") creativity test increased by more than 50% when participants had walked for 45 minutes before the test. The creativity boost is temporary: we are more creative for one to two hours after exercise, probably due to increased blood flow, and then return to our normal level. The takeaway message: if you are stuck on a problem, go for a walk or a jog, return to the problem an hour later, and improve your chances of coming up with a solution.

But why is exercise so important for the brain? It is not at all obvious from our modern perspective, but it makes more sense when we look at our history. Our brains are basically the same today as they were 10,000 years ago. It was when our ancestors moved, while hunting, fleeing predators, and exploring new lands, that they really needed their cognitive abilities: to stay attentive and to remember new experiences. That is why evolution slowly tailored the brain to benefit from exercise, and why we still benefit from it today, since our brains have not grossly changed since our ancestors' days on the savanna.

While the human brain has remained fundamentally unchanged over the past 10,000 or even 20,000 years, our lifestyle has changed enormously. The modern sedentary lifestyle deprives many of us of sufficient physical activity, with vast consequences not only for obesity and type-2 diabetes but also for wellbeing and how we function mentally. Exercise is not about sports, and it is not about adopting a lifestyle. It is something we need to do for our brain and our cognitive abilities, because we evolved for it. Now neuroscience is helping us rediscover the brain medicine we forgot.


[1] Eriksson PS, et al (1998) Neurogenesis in the adult human hippocampus. Nature Medicine, 4:1313–1317. doi:10.1038/3305

[2] Álvarez-Bueno C, et al (2017) The Effect of Physical Activity Interventions on Children's Cognition and Metacognition: A Systematic Review and Meta-Analysis. J Am Acad Child Adolesc Psychiatry, 56(9):729–738. doi:10.1016/j.jaac.2017.06.012

[3] Åberg M, et al (2009) Cardiovascular fitness is associated with cognition in young adulthood. Proc Natl Acad Sci USA, 106(49):20906–20911. doi:10.1073/pnas.0905307106

[4] Oppezzo M, et al (2014) Give Your Ideas Some Legs: The Positive Effect of Walking on Creative Thinking. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(4):1142–1152. doi:10.1037/a0036577

Image via Tumisu/Pixabay.

Could Deadly Zika Virus Cure Brain Cancer? Mon, 09 Oct 2017 11:16:44 +0000 Not long ago, Zika virus was dominating headlines. An infection hardly heard of before was suddenly affecting hundreds of thousands of people in Latin America, causing disfiguration and microcephaly in newborn babies. Microcephaly is caused by severely delayed and abnormal development of the brain, resulting in intellectual disability, stunted growth, and poor motor and speech function. With no cure or even a preventive vaccine available, many women in the most affected regions were reportedly considering postponing planned pregnancies.

The virus was actually discovered back in 1947 in the Zika forest in Uganda, which is where its name comes from. The pathogen is related to the better-known viruses that cause dengue and yellow fever. The disease is spread predominantly by one type of mosquito and was a rare occurrence until the epidemic of 2015–2016, when well over 100,000 cases were reported in Brazil alone. The disease caused particular concern as it coincided with the run-up to the 2016 Olympic Games in Rio de Janeiro.

Apart from mosquitoes, the virus can be spread through sexual contact and from mother to child during pregnancy or at delivery. The latter route of transmission is of particular concern: while adults suffer only very mild symptoms (fever and rash), children infected during pregnancy suffer major brain damage, because the viral infection delays brain development.

Further research identified a more specific reason: Zika virus specifically targets neural progenitor cells, the cells responsible for producing other neurons. This is what makes the virus so dangerous for the developing fetus: neural progenitor cells are abundant in the developing fetal brain, but only a few remain in the brain of adults. In adults, with a completely formed brain, Zika virus infection causes only mild symptoms, if any (Zika fever). But the specificity with which the virus targets neural progenitor cells gave researchers an idea that might revolutionize the treatment of one of the deadliest types of brain cancer: glioblastoma.

Glioblastoma is one of the most difficult cancers to treat, with patients rarely surviving even one year after diagnosis. Unfortunately, it is also one of the most common types of brain cancer: approximately 12,000 people are diagnosed with glioblastoma each year in the US alone. The quick return of the disease, even after aggressive surgery, is caused by the survival of a few glioblastoma stem cells. Like many other cancers, glioblastoma grows from cancer stem cells that give rise to the other tumor cells. These stem cells remain almost unaffected by the radio- and chemotherapy regimens currently used to treat this malignancy, even though those therapies do kill the other cells in the tumor. They also successfully evade detection and elimination by the immune system, replenishing the cancer cells eliminated by therapy and allowing the tumor to regrow within a short period after surgery.

Researchers noted that glioblastoma stem cells are, in many ways, very similar to normal neural progenitor cells. Therefore, infecting a person with glioblastoma with Zika virus might help in treating the disease by eliminating the stem cells. This was a core idea that researchers initially tested on cancer cells from tumors obtained from surgeries. It turned out that the virus does indeed kill cancer stem cells, leaving other cancer cells almost unaffected.

To further make sure that the virus doesn’t affect the normal cells of the brain, scientists have performed experiments on brain tissues from patients with epilepsy. The tests did not detect any damage to these cells due to viral infection.

The findings suggest that combining traditional chemotherapy with treatment with Zika virus may help to eliminate stem and non-stem cancer cells. Such an outcome will most certainly be beneficial for the patients.

To test the idea further, scientists injected Zika virus directly into the brains of mice with brain tumors. In all infected animals, tumor growth slowed down significantly and the animals survived longer.

The researchers suggest that Zika virus can be injected into the brain of a glioblastoma patient at the time of surgery. The subsequent chemotherapy will remove any remaining cancer cells that survived surgery, and Zika virus will kill the residual glioblastoma stem cells. The published findings also suggest that the virus can be further engineered to be more easily eliminated from normal healthy brain cells using the patient’s immune system. Less harmful strains of the virus have already been developed to this end and have demonstrated some success in animal experiments.

It remains to be seen if a successful therapeutic approach to treat deadly glioblastoma can be developed using Zika virus. The path to future use of Zika-based treatment in hospitals will likely be long. The original results, however, are very encouraging. This new approach is another fascinating example of a growing number of new innovative tools that are currently being developed to treat a variety of cancers.


Bleeker FE, Molenaar RJ, Leenstra S (2012) Recent advances in the molecular understanding of glioblastoma. Journal of Neuro-Oncology, 108(1):11–27. PMC3337398.

Lathia J, Mack SC, Mulkearns-Hubert EE, Valentim CLL, Rich JN (2015). Cancer stem cells in glioblastoma. Genes & Development, 29(12), 1203–1217. doi:10.1101/gad.261982.115.

Rasmussen SA, Jamieson DJ, Honein MA, Petersen LR (2016) Zika Virus and Birth Defects — Reviewing the Evidence for Causality. New England Journal of Medicine, 374(20):1981–1987. doi:10.1056/NEJMsr1604338.

Zhu Z, Gorman MJ, McKenzie LD, Chai JN, Hubert CG, Prager BC, Fernandez E, Richner JM, Zhang R, Shan C, Wang X, Shi P, Diamond MS, Rich JN, Chheda MG (2017) Zika virus has oncolytic activity against glioblastoma stem cells. The Journal of Experimental Medicine, Sept. 5, 2017. doi:10.1084/jem.20171093.

Image via NeuPaddy/Pixabay.

Excessive Porn Consumption Can Cause Erectile Dysfunction – Myth or Truth? Fri, 06 Oct 2017 15:25:49 +0000 There’s a growing trend of healthy young men using medications like Viagra and Cialis, drugs intended for older men and those with health-related erectile dysfunction (ED).

Many of these young men (unknowingly?) use these drugs to treat a condition that is psychological rather than physiological: porn-induced erectile dysfunction (PIED).

Online social groups and websites such as Your Brain on Porn and Reddit's "NoFap" community were founded to help men with PIED.

At the same time, studies that checked for a connection between watching porn and erectile dysfunction found no evidence associating the two. If that’s the case, what explains the sharp rise in ED cases in young men in recent years?

In 2012, Swiss researchers used the International Index of Erectile Function (IIEF-5), finding an ED rate of 30% in a cross-section of Swiss men aged 18 to 24. A 2013 Italian study reported that one in four patients who sought help for new-onset ED were younger than 40, with rates of severe ED nearly 10% higher than in men over 40.

We asked Takeesha Roland-Jenkins (MS in psychology and MS in neurology), a professional consultant for the Between Us Clinic, to weigh in. With expertise in both psychology and neurology, she has unique insight into both the psyche and the brain.

In your opinion, can excessive porn consumption really cause a man to experience erectile dysfunction?

Yes, watching hardcore porn excessively, especially pornography featuring deviant and violent behavior, can cause mental changes that may result in erectile dysfunction.

What happens in a man’s brain when he is exposed to extreme sexual stimuli (such as hardcore pornography) and how does this relate to ED?

Hardcore pornography is often graphic and generally displays deviant, violent, and abnormally kinky behavior. This is not typical of the average sexual encounter, and it can create unrealistic mental perceptions of how a man should engage in sexual activity. A man may initially get a thrill from watching what he believes is an exotic encounter, but over time excessive porn watching desensitizes him to intense sexual stimuli, and even to the sexual violence that sometimes occurs in the porn being viewed, thereby lowering his ability to engage in true intimacy.

Pornography in general causes intense mental stimulation that changes the way the brain views sexual activity, and sexual violence in pornography exaggerates these alterations in the brain.

This phenomenon is similar to building tolerance to a drug after prolonged use: you eventually need higher and higher doses to experience the same feelings of euphoria. Repeatedly watching hardcore porn can have a similar effect on sexual performance. In other words, excessive porn watching changes the way the brain processes sexual arousal and activity, often leading to desensitization that lowers libido and causes psychological erectile dysfunction.

Some say that men who watch too much porn can develop performance anxiety. Why does anxiety affect the ability to get and maintain an erection?

Because a man's brain has become accustomed to being stimulated by intense pornographic images, an ordinary encounter makes him wonder whether he will be able to perform at a similar level (e.g., for extended periods) as what he has observed in pornographic videos. The performance anxiety is therefore still related to the changes that occur in the brain, and to wondering whether he can satisfy his partner in the manner the brain has become used to. In other words, the anxiety is a direct result of worrying about being able to re-enact the sexual scenes in porn; this unrealistic goal can lead to performance anxiety. A man may get an erection, but once he begins to worry about whether he can perform like the actors in porn, the erection may soften or subside altogether.

So, other than performance anxiety, is there another reason why porn could cause men to experience ED?

The changes in the brain's ability to initiate an erection contribute more to PIED than performance anxiety does. Over time, the brain needs increasing levels of stimulation from pornography in order to initiate an erection. Performance anxiety can, unfortunately, worsen the erectile dysfunction.

Does porn-induced erectile dysfunction cure itself if the man stops watching porn?

Discontinuing pornographic viewing does not automatically cure PIED. Furthermore, drugs such as Viagra or Cialis target the physical aspect of erectile dysfunction, not the psychological aspect. This means that a man will become completely dependent upon such drugs until the brain restores its ability to initiate an erection under ordinary sexual circumstances. A healthy relationship (e.g., marriage) with a patient partner can help a man overcome PIED over time.

What form of treatment would you recommend for a man who suffers from PIED?

Beneficial treatment takes the form of individualized therapy, whose duration may vary (e.g., weeks or months) depending on the individual and the degree of PIED. As PIED is often the result of an addiction to pornography, this treatment should be viewed as the first step of addiction recovery.

The purpose of therapy is to begin to desensitize the brain to the pornographic images and to address some of the reasons that the addiction to porn more than likely started. Men are also encouraged to reconnect intimately with their partners in order to help the brain restore its ability to initiate sexual arousal during ordinary sexual encounters. Overall, a man must be willing to give himself time to gradually overcome PIED.

*Originally published on


Prause N and Pfaus J. (2015), Viewing Sexual Stimuli Associated with Greater Sexual Responsiveness, Not Erectile Dysfunction. Sexual Medicine, 3: 90–98. doi:10.1002/sm2.58.

Landripet I and Štulhofer A. (2015), Is Pornography Use Associated with Sexual Difficulties and Dysfunctions among Younger Heterosexual Men?. The Journal of Sexual Medicine, 12: 1136–1139. doi:10.1111/jsm.12853.

Park BY, Wilson G, Berger J, Berger J, Christman M, Reina B, Bishop F, Klam WP, Doan AP. Is Internet Pornography Causing Sexual Dysfunctions? A Review with Clinical Reports. Lane SD, ed. Behavioral Sciences. 2016;6(3):17. doi:10.3390/bs6030017.

Image via geralt/Pixabay.

New Fibromyalgia Treatment: Emotional Awareness and Expression Therapy Mon, 02 Oct 2017 15:25:59 +0000 Psychotherapy that encourages patients to address emotional experiences related to trauma, conflict, and relationship problems has been found helpful for people with the chronic pain condition fibromyalgia. In the randomized clinical trial at Wayne State University in Detroit, 230 adults with fibromyalgia received one of three treatments, each delivered in eight weekly sessions to small groups of patients.

The new therapy, called Emotional Awareness and Expression Therapy (EAET), helps patients view their pain and other symptoms as stemming from changeable neural pathways in the brain that are strongly influenced by emotions, the researchers explain.

EAET helps patients process emotional experiences, such as disclosing important struggles, learning how to adaptively express important feelings—especially anger and sadness, but also gratitude, compassion, and forgiveness—and empowering people to be more honest and direct in relationships that have been conflicted or problematic, according to the researchers.

The EAET intervention was compared with both an educational intervention and the gold-standard psychological approach in the field, cognitive behavioral therapy. Six months after the treatments ended, patients were evaluated for the severity and extent of their pain and of the other problems that people with fibromyalgia often experience.

Patients who received EAET had better outcomes—reduced widespread pain, physical impairment, attention and concentration problems, anxiety, and depression, and more positive emotions and life satisfaction—than patients who received the education intervention, the researchers report.

More than twice as many people in EAET (34.8 percent) reported being "much better" or "very much better" than before treatment, compared with 15.4 percent of education patients.

An important additional finding was that the new emotion therapy also had greater benefits than cognitive behavior therapy in reducing widespread pain and in the number of patients who achieved at least 50 percent pain reduction, the researchers point out.

Mark A. Lumley, Ph.D., a professor of psychology, said:

Many people with fibromyalgia have experienced adversity in their lives, including victimization, family problems, and internal conflicts, all of which create important emotions that are often suppressed or avoided. Emerging neuroscience research suggests that this can contribute strongly to pain and other physical symptoms.

We developed and tested an approach that tries to help people overcome these emotional and relationship problems and reduce their symptoms, rather than just help people manage or accept their fibromyalgia. Although this treatment does not help all people with fibromyalgia, many patients found it to be very helpful, and some had dramatic improvements in their lives and their health.

The Wayne State researchers collaborated with a team of researchers from the University of Michigan Medical Center led by David A. Williams, Ph.D., a professor of anesthesiology.

The study was published in the journal PAIN.

This guest article originally appeared on New Therapy Technique Offers Hope to Those with Fibromyalgia by Janice Wood.


Lumley, M., Schubiner, H., Lockhart, N., Kidwell, K., Harte, S., Clauw, D., & Williams, D. (2017). Emotional awareness and expression therapy, cognitive behavioral therapy, and education for fibromyalgia. PAIN, 1. DOI: 10.1097/j.pain.0000000000001036.

Image via stevepb/Pixabay.

Do We Sense Each Other’s Sickness? Fri, 28 Jul 2017 15:00:08 +0000 Social behavior is important for our survival as a species. But social interaction also gives pathogens a chance to spread, and it thereby increases our exposure to infection. Our immune system is a complex defense system that has evolved to protect us from infection. Therefore, it makes sense to assume that our immune system must have developed ingenious strategies to protect us from new pathogens that social interaction has exposed us to.

Evidence of a link between the immune system and our social behavior has been accumulating in recent years. A direct connection between the brain and the immune system, through lymphatic vessels in the meninges, was recently revealed. It was also shown that the immune system can directly affect, and even control, social behavior and the desire for social interaction: impaired immunity induced deficits in social behavior. This sounds like a clever preventive self-defense mechanism designed to avoid contagion; in times of poor immunity, our brain gets the message to reduce social interaction and, consequently, exposure to pathogens.

This is a self-defense mechanism that is activated when our body signals a poor immunological status; it’s an internal chemical communication system. But is there an external threat signaling system? The ability to detect and avoid infected individuals would clearly be a great evolutionary asset in strengthening our protection mechanisms. Many animals can detect sickness via odors, leading to a restraint in social interaction, most likely intended to reduce exposure to disease. Do humans have a similar sensory sickness detection system, something that allows us to detect infectious threats in others?

To answer this question, a new study aimed at determining whether humans can detect sickness in others from visual and olfactory cues. Sickness was experimentally induced by the injection of lipopolysaccharide (LPS), a molecule found in the membrane of Gram-negative bacteria that provokes robust immune responses. The activation of immune responses leads to an increase in the production of pro-inflammatory molecules that activate sickness responses and behaviors. It is known that visual cues of sickness, such as redness of the skin, allow us to infer the health of others. But although LPS induces a strong sickness response, its observable effects are subtle, and odor cues are difficult to perceive.

Photos of the face and samples of body odors of both sick and healthy individuals were presented to a group of naive participants while their cerebral responses were recorded using fMRI. These participants were not aware that they would be seeing and smelling sick and healthy people. They were asked to focus on the faces while the odors were also presented and rate how much they liked the person. Faces were also rated on attractiveness, health, and desired social interaction, while odors were rated on intensity, pleasantness, and health. This allowed the assessment of the “liking behavior” towards the faces, an indication of the will to approach and interact with others.

The ratings showed that faces photographed during acute sickness were generally considered less attractive, less healthy, and less socially desirable than the faces of participants receiving the placebo treatment. When faces were presented together with an odor, sick faces were liked less than healthy ones regardless of the odor presented. Participants were not able to consciously perceive sickness in the odors, nor did they rate sick odors as more unpleasant or more intense than healthy ones. Remarkably, however, faces were liked less when paired with a sick body odor, regardless of whether the face itself was sick or healthy.

These results show that we can detect early and subtle signs of sickness in others from both facial and olfactory cues, even just a couple of hours after activation of their immune system. Moreover, fMRI data revealed that visual and olfactory sickness cues activated their respective visual face processing and olfactory sensory cortices, as well as multisensory convergence zones. And even though odors were often too weak to be consciously detected, these olfactory sickness cues still led to activation of the olfactory cortex.

The study also revealed that this perception of subtle cues of sickness leads to reduced liking and decreased will for social interaction. This response may represent a human behavioral defense system against disease. The integration of olfactory and visual sickness cues in the brain may be part of a mechanism designed to detect sickness, resulting in behavioral avoidance of sick individuals, and in avoidance of impending threats of infection.


Filiano AJ, et al (2016). Unexpected role of interferon-γ in regulating neuronal connectivity and social behavior. Nature, 535(7612):425-9. doi: 10.1038/nature18626

Kipnis J (2016). Multifaceted interactions between adaptive immunity and the central nervous system. Science, 353(6301):766-71. doi: 10.1126/science.aag2638

Louveau A, et al (2015). Structural and functional features of central nervous system lymphatic vessels. Nature, 523(7560):337-41. doi: 10.1038/nature14432

Regenbogen C, et al (2017). Behavioral and neural correlates to multisensory detection of sick humans. Proc Natl Acad Sci U S A, pii: 201617357. doi: 10.1073/pnas.1617357114. [Epub ahead of print]

Shattuck EC, Muehlenbein MP (2015). Human sickness behavior: Ultimate and proximate explanations. Am J Phys Anthropol, 157(1):1-18. doi: 10.1002/ajpa.22698.

Image via junko/Pixabay.

Nurturing the Brain – Part 11, Magnesium Wed, 26 Jul 2017 15:00:50 +0000 Magnesium is everywhere—it does not occur free in nature, but in combination with other elements it is the eighth-most abundant chemical element in the Earth’s crust and the third-most abundant element in seawater; it is even the ninth-most abundant in the Milky Way! In the human body, magnesium is the fourth most abundant ion and the eleventh most abundant element by mass, being stored in bones, muscles, and soft tissues.

Magnesium is fundamental for health: it is essential to all cells and to the function of hundreds of enzymes, including enzymes that synthesize DNA and RNA, and enzymes involved in cellular energy metabolism, many of which are vital. Magnesium is involved in virtually every major metabolic and biochemical process in our cells and it plays a critical role in the physiology of basically every single organ.

Low plasma levels of magnesium are common and are mostly due to poor dietary intake, which has declined significantly in recent decades. Magnesium is found in high quantities in foods containing dietary fiber, including green leafy vegetables, legumes, nuts, seeds, and whole grains. But although magnesium is widely distributed in plant and animal foods, some types of food processing can lower its content by up to 90%. Also, the soil used in conventional agriculture is becoming increasingly depleted of essential minerals; in the last 60 years, the magnesium content of fruits and vegetables has decreased by around 20–30%.

Symptomatic magnesium deficiency due to low dietary intake in healthy people is not very frequent, but a consistently poor dietary supply of magnesium has insidious effects. Magnesium deficiency alters biochemical pathways and increases the risk of a wide range of diseases over time, including hypertension and cardiovascular diseases, metabolic diseases, osteoporosis, and migraine headaches.

In the brain, magnesium is an important regulator of neurotransmitter signaling—particularly the main neurotransmitters glutamate and GABA—by modulating the activation of NMDA glutamate and GABAA receptors. It also contributes to the maintenance of adequate calcium levels in the cell by regulating the activity of calcium channels.

These physiological roles make magnesium an essential element in important neuronal processes. Magnesium participates in mechanisms of synaptic transmission, neuronal plasticity, and consequently, learning and memory. Accordingly, increased levels of magnesium in the brain have been shown to promote multiple mechanisms of synaptic plasticity that enhance different forms of learning and memory, as well as delay age-related cognitive decline. Increased levels of magnesium in the brain have also been linked to an increased proliferation of neural stem cells, indicating that it may promote the generation of new neurons (neurogenesis) in adulthood. This is an important feature because neurogenesis is a key mechanism in the brain’s structural and functional adaptability, in cognitive flexibility, and in mood regulation.

Magnesium supplementation has also been shown to modulate the neuroendocrine system and to improve sleep quality by promoting slow wave (deep) sleep, which, among many other functions, is also important for cognition and memory consolidation.

Furthermore, magnesium may enhance the beneficial effects of exercise in the brain, since it has been shown to increase the availability of glucose in the blood, muscle, and brain, and diminish the accumulation of lactate in the blood and muscles during exercise.

But just as increasing magnesium levels can be beneficial, magnesium deficiency can have serious harmful effects.

Magnesium has important roles in the regulation of oxidative stress, inflammatory processes and modulation of brain blood flow. In circumstances of magnesium deficiency, all of these functions can potentially be dysregulated, laying the ground for neurological disorders. Also, in a context of low magnesium availability in the brain, NMDA glutamate receptors, which are excitatory, may become excessively activated, and GABAA receptors, which are inhibitory, may become insufficiently activated; this can lead to neuronal hyperactivity and to a condition known as glutamate excitotoxicity. This causes an excessive accumulation of calcium in neurons, which in turn leads to the production of toxic reactive oxygen species and, ultimately, to neuronal cell death.

Magnesium deficiency has been associated with several neurological and psychiatric diseases, including migraines, epilepsy, depression, schizophrenia, bipolar disorder, stress, and neurodegenerative diseases. Magnesium supplementation has shown beneficial effects on many of these conditions, as well as in post-stroke, post-traumatic brain injury, and post-spinal cord injury therapies. This therapeutic action is likely due to the blocking of NMDA glutamate receptors and decreasing excitotoxicity, reducing oxidative stress and inflammation, and increasing blood flow to the brain, all of which are determinants of the outcome of these conditions.

There are multiple benefits to be obtained from magnesium, both from a health promotion and disease prevention/management perspective. The recommended daily intake of magnesium is 320 mg for females and 420 mg for males. Too much magnesium from food sources has no associated health risks in healthy individuals because the kidneys readily eliminate the excess. However, there is a recommended upper intake level for supplemental magnesium, since it can cause gastrointestinal side effects. So, keep it below 350 mg/day.
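As a back-of-the-envelope illustration, the guideline numbers above can be turned into a simple intake check. This is a sketch only – the function name and structure are invented here, the thresholds are the figures quoted above, and nothing in it constitutes medical advice:

```python
# Guideline values quoted in the text (NIH figures), in mg/day.
RDA_MG = {"female": 320, "male": 420}   # recommended daily intake
SUPPLEMENT_UL_MG = 350                  # upper limit for *supplemental* magnesium

def magnesium_check(sex: str, food_mg: float, supplement_mg: float) -> list[str]:
    """Compare a day's magnesium intake against the guideline values.

    Food-sourced magnesium has no upper limit for healthy individuals
    (the kidneys excrete the excess); only the supplemental dose is
    checked against the 350 mg/day ceiling.
    """
    notes = []
    total = food_mg + supplement_mg
    if total < RDA_MG[sex]:
        notes.append(f"below recommended intake ({total:.0f} < {RDA_MG[sex]} mg)")
    if supplement_mg > SUPPLEMENT_UL_MG:
        notes.append(f"supplemental dose exceeds {SUPPLEMENT_UL_MG} mg upper limit")
    return notes
```

For example, `magnesium_check("male", 300, 100)` flags a total intake below the 420 mg recommendation, while `magnesium_check("female", 100, 400)` flags only the supplemental dose, since the overall total is harmless but the 400 mg supplement exceeds the 350 mg ceiling.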


Chen HY, et al (2014). Magnesium enhances exercise performance via increasing glucose availability in the blood, muscle, and brain during exercise. PLoS One, 9(1):e85486. doi: 10.1371/journal.pone.0085486

de Baaij JH, et al (2015). Magnesium in man: implications for health and disease. Physiol Rev, 95(1):1-46. doi: 10.1152/physrev.00012.2014

Held K, et al (2002). Oral Mg(2+) supplementation reverses age-related neuroendocrine and sleep EEG changes in humans. Pharmacopsychiatry, 35(4):135-43. doi: 10.1055/s-2002-33195

Jia S, et al (2016). Elevation of Brain Magnesium Potentiates Neural Stem Cell Proliferation in the Hippocampus of Young and Aged Mice. J Cell Physiol, 231(9):1903-12. doi: 10.1002/jcp.25306

National Institutes of Health, Office of Dietary Supplements. Magnesium Fact Sheet for Health Professionals

Slutsky I, et al (2010). Enhancement of learning and memory by elevating brain magnesium. Neuron, 65(2):165-77. doi: 10.1016/j.neuron.2009.12.026

Image via Brett_Hondow/Pixabay.

Prevention Is the Best Medicine for Dementia Fri, 14 Jul 2017 15:00:29 +0000 Population aging is bringing about a substantial increase in the prevalence of neurocognitive disorders. Current projections estimate that, by 2050, more than 130 million people will be affected by dementia worldwide. As experts assemble to devise strategies to face this incoming challenge, one conclusion stands out: prevention is crucial.

The goal of prevention is obvious: to promote good health and take action before disease onset, thereby reducing the incidence of disease. And this is obviously better than having to manage a disease and its complications, and losing quality of life – “prevention is better than cure.”

But in order for preventive behaviors to be acquired, knowledge is essential – knowledge of modifiable risk factors and preemptive actions that can be adopted, and knowledge of how effective they really are. Studies addressing the benefit of lifestyle interventions for the prevention of dementia have identified numerous modifiable risk and protective factors and have shown that change can indeed be beneficial.

Modifiable risk factors for cognitive impairment include lifestyle factors such as smoking, high alcohol intake, poor diet (saturated fats, sugar, processed foods), and physical inactivity; these in turn give rise to further risk factors that are themselves consequences of inadequate lifestyle choices, namely vascular and metabolic diseases (cerebrovascular and cardiovascular disease, diabetes, hypertension, overweight and obesity, high cholesterol). Together, these factors accelerate cognitive decline.

Protective factors include the opposite lifestyle choices: quitting smoking, moderate alcohol intake, a healthier diet (Mediterranean diet, polyunsaturated fatty acids and fish-derived fats, vitamins B6 and B12, folate, antioxidant vitamins (A, C, and E), and vitamin D), physical activity, and mentally stimulating activity.

Still, cognitive disorders are complex, multifactorial conditions – even if you lead the healthiest of lives, dementia may still strike. But this should not be an excuse to let go, because research shows that preventive behaviors shift the odds in your favor.

An important aspect of behavioral change is that it should be integrative. Single-domain interventions provide some benefit: physical activity and cognitive training have been positively associated with cognitive performance in multiple studies, and a recent meta-analysis showed that increased consumption of fruit and vegetables reduces the risk of cognitive impairment and dementia. But multi-domain interventions, in which multiple risk factors are targeted simultaneously, are more likely to deliver better results.

For example, a 2015 Finnish study assessed the effect of a 2-year multimodal intervention in adults aged 60–77 years who were at risk of cognitive decline but had no pronounced cognitive impairment. Four intervention targets were included: diet, exercise, cognitive training, and vascular risk. The program included a diet high in fruit and vegetables, consumption of wholegrain cereal products and low-fat milk and meat products, low sucrose intake, use of vegetable margarine and rapeseed oil instead of butter, and fish consumption of at least two portions per week.

The physical exercise training program included progressive muscle strength training, aerobic exercise, and exercises to improve postural balance. Cognitive training consisted of computer-based training targeting executive processes, working memory, episodic memory, and mental speed. Metabolic and vascular risk factors were monitored throughout the study. Social activities were also stimulated through the numerous group meetings of all intervention components.

The study showed that simultaneous changes in multiple risk factors, even of small magnitude, had beneficial effects on the risk of cognitive decline; on overall cognition, complex memory tasks, executive functioning, and processing speed; and also on BMI, dietary habits, and physical activity.

But timely prevention seems fundamental – these lifestyle interventions may not be as effective once cognitive impairment is manifest. A recent study evaluated the impact of a 3-year omega-3 fatty acid supplementation, with or without multi-domain lifestyle interventions, on cognitive function in adults aged 70 years or older. These adults already had symptoms of cognitive impairment: either memory complaints, limitations in one instrumental daily living activity, or slow gait speed. The multi-domain intervention included cognitive training, physical activity, nutrition, and management of cardiovascular risk factors.

In this case, neither the omega-3 supplementation alone nor its combination with the lifestyle interventions was able to reduce cognitive decline. Adherence to the lifestyle interventions over time was also lower in this study than in studies of younger seniors without clinical manifestations of dementia onset. Still, those at increased risk of dementia were the ones who benefited the most.

Early prevention is probably the best strategy. Instead of trying to prevent dementia later in life, focusing on preventing earlier, milder, and more common forms of cognitive impairment may be a better strategy that may end up also preventing cardiovascular and metabolic diseases and, ultimately, dementia. Because they’re all fruits from the same tree.


Andrieu S, et al (2017). Effect of long-term omega 3 polyunsaturated fatty acid supplementation with or without multidomain intervention on cognitive function in elderly adults with memory complaints (MAPT): a randomised, placebo-controlled trial. Lancet Neurol, 16(5):377-389. doi: 10.1016/S1474-4422(17)30040-6

Jiang X, et al (2017). Increased Consumption of Fruit and Vegetables Is Related to a Reduced Risk of Cognitive Impairment and Dementia: Meta-Analysis. Front Aging Neurosci, 9:18. doi: 10.3389/fnagi.2017.00018

Kivipelto M, et al (2017). Can lifestyle changes prevent cognitive impairment? Lancet Neurol, 16(5):338-339. doi: 10.1016/S1474-4422(17)30080-7

Ngandu T, et al (2015). A 2 year multidomain intervention of diet, exercise, cognitive training, and vascular risk monitoring versus control to prevent cognitive decline in at-risk elderly people (FINGER): a randomised controlled trial. Lancet, 385(9984):2255-63. doi: 10.1016/S0140-6736(15)60461-5

Shah H, et al (2016). Research priorities to reduce the global burden of dementia by 2025. Lancet Neurol, 15(12):1285-1294. doi: 10.1016/S1474-4422(16)30235-6

Solomon A, et al (2014). Advances in the prevention of Alzheimer’s disease and dementia. J Intern Med, 275(3):229-50. doi: 10.1111/joim.12178

Image via Couleur/Pixabay.

Are Sleep Apps Effective Tools For Behavioral Change? Wed, 12 Jul 2017 15:00:04 +0000 Smartphones are technological Swiss Army knives – easy to carry and, thanks to apps, able to do almost anything. All you need is a smartphone and an internet connection to unfold a thousand tools.

Apps make communication, traveling, working, and entertainment easier. And they can also allow us to monitor and manage our health, fitness and lifestyle, or even improve them. There are thousands of health and lifestyle apps – for exercise, for nutrition, for weight loss, for meditation, for overall health, for sleep… Their use is on the rise and they have shown great potential in effectively promoting self-improvement.

Technology may actually revolutionize how we take care of ourselves. It can successfully influence behavior, and this is empowering in the sense that it offers an opportunity to self-manage our health routines. Apps can be a great aid for lifestyle interventions – they allow us to monitor our behavior and our progress, and they can motivate us, give positive reinforcement, and set goals for continued improvement. A well-designed app, built on a scientific background, may offer valuable help to behavioral health and lifestyle interventions.

But among this sea of apps, one wonders how many are really effective, and how many offer evidence-based content and behavioral theory-based interventions. Although it is not clear whether evidence- and theory-based interventions are indispensable to an app’s efficacy, they are known to be effective in changing behavior and are therefore a likely predictor of efficacy.

Traditional theory-based behavioral modification strategies state that behavioral change can be most successfully achieved when multiple strategic approaches and behavioral constructs are combined; these include informational strategies (creating knowledge); cognitive strategies, such as perceived benefits, barriers and risks; behavioral strategies, such as self-monitoring, realistic goal-setting, self-reward, relapse prevention; emotion-focused strategies, such as stress and negative affect management; and therapeutic interventions such as skill-building, for example. Apps that include such features have been proven more effective.

However, app developers are naturally focused on keeping users engaged. Therefore, many app features may tend to favor usability. Also, it is likely that app development may be preferentially aligned with more contemporary behavioral models. These postulate that technology can be designed to change user attitudes and behaviors through persuasion and social influence (Persuasive Technology Theory), and that when technology increases motivation and capacity to change, triggers to change behavior are more likely to work (Fogg Behavioral Model).

But a reliance on either traditional or contemporary behavioral models does not seem to be the case for most health, fitness and lifestyle apps: a 2011 review revealed that most had insufficient evidence-based content; a 2012 analysis showed a general lack of theory-based strategies; a 2013 study of exercise apps found that, overall, the apps contained few features based on behavioral change theory; another 2013 study reached the same conclusion for weight management apps; a 2015 study reported similar findings for alcohol reduction apps.

And what about sleep apps? They are one of the most popular types of lifestyle and health apps, which comes as no surprise – sleep disorders affect millions of people, creating a huge demand for interventional strategies. But sleep apps are distinctive in that their interventional constructs need to go well beyond motivation.

A new study therefore aimed to determine whether sleep apps follow evidence-based guidelines or are grounded in behavioral change or persuasive technology theories.

The study included the most downloaded and reviewed sleep apps for both iOS and Android. Of the 369 apps found using the search term “sleep” (in September 2015), 35 met the authors’ inclusion criteria. The authors scored each app on the presence of behavioral and persuasive technology constructs and correlated these scores with the app’s average user rating.

The average behavioral construct score was 34%, whereas the average persuasive technology score was 42%, which is not impressive. Realistic goal setting (86%), time management (77%), and self-monitoring (66%) were the behavioral constructs most commonly included in sleep apps; factors that contributed most to the apps’ persuasiveness were the user interface (94%), provision of positive feedback (54%), and social praise (40%).

Interestingly, the authors found a positive association between the presence of behavioral constructs and the apps’ popularity and ratings, suggesting that sound scientific design is a good indicator of an app’s probable success.

This also indicates that, since there is still a relatively poor inclusion of theory-based constructs, there is room to grow. Building strong evidence-based apps is likely to result in a real opportunity for effective behavioral intervention and be beneficial to the management of sleep disorders.

(An important side note: one obvious limitation of using smartphone apps for sleep management is the well-known negative impact of LED screens on the circadian rhythm and, consequently, on sleep. Something to keep in mind while designing a sleep app is the possibility of minimizing the user’s interaction with the phone before bedtime.)


Azar KM, et al (2013). Mobile applications for weight management: theory-based content analysis. Am J Prev Med, 45(5):583-9. doi: 10.1016/j.amepre.2013.07.005

Breton E, et al (2011). Weight loss—there is an app for that! But does it adhere to evidence-informed practices? Transl Behav Med, 1(4):523–9. doi: 10.1007/s13142-011-0076-5

Cowan LT, et al (2013). Apps of steel: are exercise apps providing consumers with realistic expectations?: a content analysis of exercise apps for presence of behavior change theory. Health Educ Behav, 40(2):133–9. doi: 10.1177/1090198112452126

Crane D, et al (2015). Behavior change techniques in popular alcohol reduction apps: content analysis. J Med Internet Res, 17(5):e118. doi: 10.2196/jmir.4060

Fogg BJ (2003). Persuasive technology: using computers to change what we think and do (interactive technologies). Morgan Kaufmann, San Francisco. ISBN: 978-1-55860-643-2

Glanz K, et al (2008). Theory, research, and practice in health behavior and health education. In Glanz K, Rimer B & Viswanath K, Health behavior and health education: Theory, research, and practice (4th ed., pp. 23-40). San Francisco, CA: Jossey-Bass. ISBN: 978-0-470-39629-2

Grigsby-Toussaint DS, et al (2017). Sleep apps and behavioral constructs: A content analysis. Prev Med Rep, 6:126–129. doi: 10.1016/j.pmedr.2017.02.018

Higgins JP (2016). Smartphone Applications for Patients’ Health and Fitness. Am J Med, 129(1):11-9. doi: 10.1016/j.amjmed.2015.05.038

Image via 1haboeri/Pixabay.

Mental Health, Drug Approval, and Biomedical Research in the 21st Century Wed, 05 Jul 2017 15:00:58 +0000 A few weeks before the end of 2016, which was also a few weeks before the end of a congressional session and the end of President Barack Obama’s time in office, the 21st Century Cures Act became law. It had been passed with overwhelming support in the House of Representatives and the Senate, and it was signed into law by President Obama in December. The law makes sweeping – and controversial – changes to many of the most important issues facing health care today, but we must wait to see how – and if – the law will be funded under a new President and future Congresses.

Originally introduced in 2015, the Act was proposed with the goal of promoting development and speeding approval of new drugs and medical devices, and, in its final form, it offers broad incentives and funding opportunities for many areas of health care research, development, and support.

The law directs an increase in the budget of the National Institutes of Health (NIH) and the creation of an NIH fund to promote innovation in research and support opportunities for young researchers. NIH-funded research plays a crucial role in supporting the approval of many new drugs, so this is a welcome addition for many drug developers and researchers. The bill also commits billions of dollars to research precision medicine, map the human brain, and cure cancer.

A large portion of the Act is devoted to accelerating the drug approval process. The U.S. Food and Drug Administration—the gatekeeper of all drugs and devices in the United States—is now instructed by the Act to consider nontraditional study designs and methods when evaluating approvals of new drugs and indications. While most people, especially those in the health care industry, view the approval of new safe and effective drugs as a laudable goal, the willingness to rely on shorter or smaller clinical trials, observational studies, or registries to evaluate safety and effectiveness may prove to be problematic. Such approaches are not as rigorous as randomized controlled trials, which have been, until now, the gold standard of drug approval data.

The Act also encourages the FDA to rely on biomarkers instead of clinical outcomes to assess effectiveness. However, biomarkers are not always accurate representations or predictors of disease risks and endpoints. The Act does not change FDA approval standards, but it allows the FDA more discretion and leniency in how it approves drugs. Critics argue that the less-than-one-year average approval time does not really need accelerating and the strict research requirements do not need adjusting.

As part of the Act, the FDA can also incentivize hospitals for administering new antimicrobial drugs that have not received confirmatory approval. The Act also incentivizes drugmakers by removing regulatory hurdles that lengthen their approval processes and make them more expensive.

In addition to drug approvals, the Act addresses mental health care in the United States: it establishes provisions for fighting the opioid epidemic, strengthens laws guaranteeing access to mental health care, and provides grants to increase the number of psychologists and psychiatrists. The Act pushes society ahead in the goal of preventing devastating consequences of mental illness such as homelessness, incarceration, and suicide.

The Act has, for the most part, good motives, and it includes a little bit for a lot of people. However, it aims to fix problems that may or may not really exist and offers incentives to groups that may or may not benefit from incentives.

Plus, the future of health care in this country is uncertain. The new administration could unwind many of the changes established by the Act and, while the Act provides a framework for authorizing these programs, future Congresses would actually need to vote on and approve the budgets that pay for the provisions in the Act. The benefits and drawbacks of the Act in the real world remain to be seen. Our lawmakers are probably looking in the right direction and trying hard to choose the right path, but just how many obstacles and speed traps we hit along the way will influence just how fast this Act can get us to safe, effective, and accessible health care.


21st Century Cures Act. U.S. House of Representatives Committee on Energy and Commerce. Accessed January 23, 2017.

Avorn J, Kesselheim AS. The 21st Century Cures Act—Will it take us back in time? N Engl J Med. 2015;372:2473-2475. PMID: 26039522

Kesselheim AS, Avorn J. The 21st Century Cures Act. N Engl J Med. 2015;373(17):1679-80. PMID: 26488710

Image via tpsdave/Pixabay.
