Neuroscience & Neurology – Brain Blogger Health and Science Blog Covering Brain Topics

Why Don’t We Remember Early Childhood? Wed, 23 May 2018 14:00:26 +0000

Although early experiences are important for personal development and later life, as adults we recall little or nothing of those early formative events, such as taking our first steps or learning our first words. In fact, when adults are asked about their first memories, they usually recall no events from before the age of 2–3, and only fragments of events that happened between the ages of 3 and 7. This phenomenon is often called childhood or infantile amnesia. It refers to the inability of both children and adults to recall episodic memories (i.e., memories of particular events or stimuli that occurred in a particular context) from infancy and early childhood, before the age of 2–4.

Sigmund Freud was the first researcher to develop a theory of infantile amnesia, having observed that his patients were rarely able to recall events from the first years of life. He believed that childhood memories were repressed and thus forgotten. Modern theories, however, focus on cognitive and social development as important predictors of childhood amnesia. One possible explanation of childhood amnesia is a lack of neurological development, i.e., of the development of the brain regions in charge of the storage and retrieval of episodic memories. For instance, some researchers believe that the development and functioning of the prefrontal cortex (the cortical area at the front of the brain) is crucial for the creation of contextualized memories. Moreover, the prefrontal cortex and hippocampus are assumed to be crucial for the development of autobiographical memories. Importantly, these two brain structures only mature around the age of 3 or 4.

This lack of neurological maturation, i.e., of the maturation of the brain structures required for the creation, storage, and recall of memories during infancy and early childhood, might explain the phenomenon of childhood amnesia. According to this explanation, childhood amnesia occurs not because memories are lost over time (the forgetting explanation), as Freud suggested, but because these memories are never stored in the first place, owing to brain immaturity.

Some evidence suggests that amnesia for events taking place in early childhood (before the age of 2) can be at least partly explained by the difficulty of verbally recalling memories that were encoded before language acquisition. Consistent with this, the bulk of the vocabulary is acquired between the ages of 2 years 6 months and 4 years 6 months, the same period from which the earliest memories can be recalled.

Childhood amnesia does not seem to be an exclusively human phenomenon. Indeed, researchers have observed something like infantile amnesia in animals (for instance, rodents). This discovery opened the possibility of investigating the underlying mechanisms of childhood amnesia, such as the neurological events involved, using animal models. Animal studies have addressed the importance of particular brain regions and their development in relation to childhood amnesia. For instance, they indicate that the high rate of neurogenesis in the hippocampus observed during infancy might explain the accelerated forgetting of contextual fear memories: the integration of new neurons into existing circuits might destabilize and weaken the memories already stored there.

Some researchers note that it remains unclear whether childhood amnesia reflects a failure of memory retrieval or a failure of memory storage. Forgetting might be described as a simple function of the time elapsed since an event. Since a long time span separates early events from their recall in adulthood, it might be assumed that early events are simply forgotten. Yet some researchers disagree, because subjects recall far fewer memories of events from between the ages of 6 and 7 than would be expected by simply extrapolating the forgetting curve. Thus, forgetting alone cannot explain the phenomenon of childhood amnesia. This is why a neurogenic hypothesis of childhood amnesia has been developed.
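The extrapolation argument can be made concrete with a toy calculation. In the sketch below, the exponential form of the retention function and every number are hypothetical, chosen only for illustration: even a forgetting curve tuned to ordinary adult forgetting still predicts that a noticeable fraction of early-childhood events should survive, whereas observed recall from before age 3 is close to zero.

```python
import math

def predicted_recall(years_ago, r0=1.0, decay=0.05):
    """Toy exponential forgetting curve: the fraction of events from
    `years_ago` years back that an adult should still recall.
    Both the functional form and the parameters are hypothetical."""
    return r0 * math.exp(-decay * years_ago)

# A 35-year-old recalling events from 5 years ago vs. from age 3:
recent = predicted_recall(5)   # ~0.78
early = predicted_recall(32)   # ~0.20: pure forgetting still predicts
                               # that a fifth of age-3 events survive
print(f"{recent:.2f} {early:.2f}")
```

Because adults actually report almost nothing from that age, forgetting alone undershoots the observed amnesia, which is the gap the neurogenic hypothesis tries to fill.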

The neurogenic hypothesis, as already mentioned above, explains childhood amnesia through the continuous addition of new neurons (neurogenesis) in the hippocampus. According to this hypothesis, the high level of postnatal hippocampal neurogenesis (which occurs in both humans and some animals) prevents the creation of long-lasting memories. The hypothesis has been tested experimentally in mouse and rat models. The findings suggest that high levels of neurogenesis jeopardize the formation of long-term memories, possibly by replacing synapses in pre-existing memory circuits, and that the decline in hippocampal neurogenesis corresponds with the emerging ability to form stable memories.

Thus, according to these animal studies, neurogenesis appears to be a plausible explanation for childhood amnesia.

Although the early theory of forgetting or repression might look like a good explanation of childhood amnesia, more recent findings demonstrate that something else happening in the brain contributes to this phenomenon. Whether this is the immaturity of certain brain regions, the continuous generation of new neurons, or both remains to be investigated. Childhood amnesia cannot be explained by simple forgetting.


Newcombe, N., Drummey, A., Fox, N., Lai, E., Ottinger-Alberts, W. (2000) Remembering Early Childhood: How Much, How, and Why (or Why Not). Current Directions in Psychological Science. 9 (2): 55–58.

Hayne, H., Jack, F. (2011) Childhood amnesia. Wiley Interdisciplinary Reviews. Cognitive Science. 2(2): 136-145. doi: 10.1002/wcs.107

Simcock, G., Hayne, H. (2003) Age-related changes in verbal and non-verbal memory during early childhood. Developmental Psychology. 39: 805–814. PMID: 12952395

Madsen, H.B., Kim, J.H. (2016) Ontogeny of memory: An update on 40 years of work on infantile amnesia. Behavioural Brain Research. 298(Pt A): 4-14. doi: 10.1016/j.bbr.2015.07.030

Wetzler, S.E., Sweeney, J.A. (1986) Childhood amnesia: An empirical demonstration. In Autobiographical memory (ed. DC Rubin), pp. 191–201. Cambridge University Press, New York, NY.

Josselyn, S.A., Frankland, P.W. (2012) Infantile amnesia: a neurogenic hypothesis. Learning and Memory. 19(9): 423-433. doi: 10.1101/lm.021311.110

Image via VABo2040/Pixabay.

Nerve Agents: What Are They and How Can They Hurt Us? Fri, 04 May 2018 13:00:58 +0000

Chemical weapons keep making headlines these days, be it the use of sarin in Syria or of Novichok in the UK. An interesting fact hardly ever covered by the media is that the chemical structure of these compounds is relatively simple: an average modern pharmaceutical drug tends to be much more complex and difficult to make. This is not particularly surprising, as most research into these agents was done 50 or more years ago, when the art of organic synthesis was not as advanced as it is now. Nonetheless, these compounds (and nerve agents in particular) are extremely efficient. It is quite interesting to analyze, from a neuroscience perspective, what exactly these compounds do to our body to cause such devastating effects.

Nerve agents are most commonly deployed in chemical warfare. However, they are more common than most of us understand. Compounds with similar structures are sometimes found in insecticides used in agriculture.

How do nerve agents work?

All nerve agents work in a similar fashion, though they vary significantly in chemical structure, and with this variation come differences in toxicity and other properties. Nerve agents belong primarily to the group of acetylcholinesterase inhibitors (anticholinesterases), and they act by paralyzing or disrupting the function of the nervous system.

To better understand how these agents work, let’s look at the underlying physiology of the nervous system. All functions of our body are controlled by nerve cells, including the movement of muscles and the work of internal organs and the cardio-respiratory apparatus. Our brain sends regular electrical signals to vital organs to keep them regulated. Once these signals reach the nerve endings at a target organ, the endings release chemicals called neurotransmitters. When the brain sends messages to regulate the heart and respiration, nerve endings release the neurotransmitter acetylcholine, which mediates the communication between the nervous system and organs and muscles.

Although many neurotransmitters help regulate the functioning of smooth and skeletal muscles and internal organs, acetylcholine is the most important of them all. Once released by neurons, acetylcholine forces muscles to contract. To ensure that the muscles can relax again, acetylcholine molecules are destroyed within milliseconds by a specific enzyme, acetylcholinesterase. This cycle of contraction and relaxation underlies the smooth functioning of skeletal muscles, body movements, respiration, heartbeat, and much more.

Thus, for proper functioning and contraction of any muscles in the body, the firing of acetylcholine (in synaptic space) and its quick destruction by acetylcholinesterase is essential.

Now imagine a situation in which the acetylcholine released from nerve endings is not destroyed, because acetylcholinesterase is unavailable. The muscles contract but cannot relax again, resulting in paralysis; muscles cannot remain contracted forever, and they become damaged. This is how all nerve agents work: they inhibit acetylcholinesterase, rendering it non-functional and thus causing muscular paralysis.
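The contrast between a working and an inhibited enzyme can be sketched with a toy first-order clearance model. All rate constants below are invented purely for illustration; real synaptic kinetics are far more complex.

```python
import math

def ach_remaining(t_ms, clearance_per_ms):
    """Fraction of released acetylcholine left after t_ms milliseconds,
    assuming simple first-order enzymatic clearance (toy model:
    both rate constants used below are hypothetical)."""
    return math.exp(-clearance_per_ms * t_ms)

NORMAL = 1.0       # functional acetylcholinesterase: fast breakdown
INHIBITED = 0.001  # enzyme blocked by a nerve agent: almost no breakdown

# 10 ms after release, a healthy synapse has cleared the signal...
healthy = ach_remaining(10, NORMAL)      # ~0.00005, so the muscle relaxes
# ...while an inhibited synapse is still almost fully stimulated.
poisoned = ach_remaining(10, INHIBITED)  # ~0.99: sustained contraction
```

The point of the sketch is the orders-of-magnitude difference: with the enzyme blocked, the "contract" signal effectively never switches off.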

Types of nerve agents

Nerve agents were developed primarily either for military purposes or for use as insecticides. These very different applications call for very different properties: agents need to be either volatile and non-persistent, or non-volatile and highly persistent.

The G-series nerve agents were developed in Germany around the time of the Second World War. Sarin, tabun, soman, and cyclosarin are representatives of this class. These compounds are non-persistent: they are less stable, cannot remain in the environment for long, and have a shorter washout period from the human body.

The V-series of nerve agents is another significant class of these compounds. They are highly persistent and have an oil-like consistency, meaning that they can remain stable in the environment for a long time and have exceptionally long washout periods. VE, VG, VX, VR, and VM are representatives of this class.

The Novichok series of agents was created by the Soviet Union between the 1960s and 1990s, designed to remain undetectable by adversaries.

Although these three classes are well known, it is evident that there are many more classes perhaps unknown to the general public due to the secrecy surrounding this technology. Importantly, all of these compounds have a similar mode of action, even though they differ in physical properties and toxicity.

Exposure consequences and antidotes

Once a person is exposed to a nerve agent, the chemical effectively paralyzes various muscles of the body, including the respiratory and cardiac muscles, so that the person ultimately dies of respiratory and cardiac failure. After initial exposure, many of the agents cause immediate irritation of mucous membranes, resulting in a runny nose and burning sensations, followed by blurring of vision, tightness in the chest, urination, defecation, stomach aches, vomiting, epileptic seizures, and finally death due to cardio-respiratory failure.

Acetylcholine is also a vital neurotransmitter for communication between brain cells. This means that nerve agents cause neural damage that is, in many cases, irreversible. Even if a person is revived after exposure, neural damage or psychiatric changes may persist for years; surviving victims of nerve agent poisoning continue to suffer from fatigue, cognitive deficits, and many other neurological symptoms.

The anticholinergic drug atropine remains the first line of help in most cases. This is a widely available drug and may help to counteract many nerve agent effects, especially those related to respiration, heart, and skeletal muscles. Atropine is a part of many emergency kits.

Biperiden is another drug used to treat nerve agent exposure. It is slow to act, but it can cross the blood-brain barrier, and thus helps to counter the central nervous toxicity of these agents.

Pralidoxime chloride is a reactivator of acetylcholinesterase and may also help to counter the toxic effects of nerve agents.

Apart from the above-mentioned three drugs, there is an array of supportive drugs and treatments that may help to counteract the effects of nerve agents, help to maintain the functioning of critical organs, and also play a role in faster washout of some of these agents from the body.


Newmark, J. (2009). CHAPTER 56 – Nerve Agents. In M. R. Dobbs (Ed.), Clinical Neurotoxicology (pp. 646–659). Philadelphia: W.B. Saunders. doi:10.1016/B978-032305260-3.50062-9

Organisation for the Prohibition of Chemical Weapons. (n.d.). Nerve Agents. Retrieved March 21, 2018.

Image via jessebridgewater/Pixabay.

“Non-gene” Mutations and Neurodevelopmental Disorders Wed, 02 May 2018 12:30:01 +0000

Every year, thousands of children are born with neurodevelopmental issues. This is not just a matter of lagging intellectual growth or autism: many psychiatric illnesses in later life have been blamed on neurodevelopmental problems. These conditions are more common than most people imagine; one estimate suggests that as many as 15% of people suffer from neurological and psychiatric issues rooted in genetics and neurodevelopment.

However, researchers are at present unable to explain the cause of many neurodevelopmental disorders on the basis of genetic mutations alone, which led them to suspect that they were missing something. Advances in genetic sequencing technology have revolutionized our understanding of inherited disorders, a change further supported by improvements in computing and big-data analysis. Researchers now have better tools and plentiful data with which to uncover the genetic landscape of various neurodevelopmental disorders.

A new understanding of neurodevelopmental disorders

For many years, researchers primarily focused their attention on the ~23,000 so-called protein-coding genes, as most neurodevelopmental disorders are directly or indirectly related to proteinopathies (i.e., defects in the proteins encoded by these genes). These genes make up just 2% of our genome. Until recently, researchers mostly dismissed the remaining 98% of the genome as non-coding “junk,” and its role is still poorly understood.

Scientists are now discovering that what they had been dismissing as junk has a significant influence on how the 2% of protein-coding genes works: the other 98% of the genome activates, deactivates, or changes the expression levels of those genes. Researchers have discovered many mutations in these neglected parts of the genome that influence the more active part. Thus, it seems that “junk” DNA plays a vital role in multiple genetic disorders, including neurodevelopmental diseases.

However, identifying the links between these non-coding genome regions and the coding genes is not a straightforward task; it requires enormous amounts of data and computing power. Detailed investigation of non-coding regions has become possible thanks to projects like the 1000 Genomes Project, which provide complete genome information, including the non-coding regions.

A new study confirms the link between non-coding genes and neurodevelopmental disorders

In one of the most extensive studies of its kind, published this year in the journal Nature, genetic data from 8,000 families were analyzed. The researchers were able to demonstrate a link between neurodevelopmental disorders and mutations outside of protein-coding genes. This is the first study able to provide information regarding neurodevelopmental disease risk in previously undiagnosed children.

Although thousands of children around the world develop neurodevelopmental illnesses linked to slow intellectual growth, epilepsy, and even heart defects, a large number of them remain undiagnosed until clinical symptoms develop. This delay or missed diagnosis means that valuable time to take prophylactic measures is lost.

The new findings are the result of the Deciphering Developmental Disorders (DDD) study initiated in 2010, aimed at early diagnosis of rare developmental disorders. So far, even in the developed European nations and the US, only a very small percentage of children with developmental disorders are diagnosed in a timely manner. It is expected that these new findings will revolutionize the way these conditions are diagnosed and possibly help in the early identification of the disorders and associated risk factors.

In about one-third of the 13,000 children diagnosed with neurodevelopmental disorders, researchers were able to identify previously known genetic mutations. However, they had no clue as to which mutations or genetic issues were responsible for the developmental disorders of the remaining 8,000 patients.

The researchers therefore turned their attention to the non-coding or “junk” parts of the genome, which control the inhibition or activation of genes and serve as gene regulators. To their surprise, they discovered that specific mutations in these non-coding areas were strongly related to specific neurodevelopmental disorders. There was good reason to pay attention to particular parts of these non-coding regions, since they have been highly conserved across species during evolutionary history, and conserved genetic material usually plays a critical role in the survival and fitness of living species.

What are the future implications of the findings?

At present, researchers agree that their knowledge of the link between mutations in non-coding genome regions and neurodevelopmental disorders is limited. Nonetheless, they think things will change for the better, as they are now on the right track. As more genome and health-related data become available from new projects like the NHS’s 100,000 Genomes Project, rare neurodevelopmental disorders will be predicted and diagnosed with greater precision.

Novel studies have finally opened the doors to unexplored territory, and eventually we will get answers to questions that were too difficult to address for many years. Once diagnosis can be done in a timely fashion, we will be better equipped to make the best decisions regarding treatment options.


Hu, W. F., Chahrour, M. H., & Walsh, C. A. (2014). The diverse genetic landscape of neurodevelopmental disorders. Annual Review of Genomics and Human Genetics, 15, 195–213. doi:10.1146/annurev-genom-090413-025600

Khurana, E., Fu, Y., Colonna, V., Mu, X. J., Kang, H. M., Lappalainen, T., … Gerstein, M. (2013). Integrative Annotation of Variants from 1092 Humans: Application to Cancer Genomics. Science, 342(6154), 1235587. doi:10.1126/science.1235587

Plummer, J. T., Gordon, A. J., & Levitt, P. (2016). The Genetic Intersection of Neurodevelopmental Disorders and Shared Medical Comorbidities – Relations that Translate from Bench to Bedside. Frontiers in Psychiatry, 7. doi:10.3389/fpsyt.2016.00142

Short, P. J., McRae, J. F., Gallone, G., Sifrim, A., Won, H., Geschwind, D. H., … Hurles, M. E. (2018). De novo mutations in regulatory elements in neurodevelopmental disorders. Nature, 555(7698), 611–616. doi:10.1038/nature25983

Image via kalhh/Pixabay.

Is Childhood Obesity Linked to Lower IQ? Thu, 26 Apr 2018 17:00:14 +0000

Obesity is a global health burden and a serious risk factor for the development of metabolic disorders, cardiovascular diseases, and many other conditions. But some researchers believe that, in addition to affecting physical health, obesity can damage the brain and compromise intelligence.

Brain imaging studies have documented multiple structural and functional abnormalities in the brains of obese individuals that are already evident in adolescence. Moreover, research findings indicate that even obesity in childhood is associated with lower intelligence scores. And this is not all: according to some investigations, the causality also runs in the opposite direction, with lower IQ in childhood resulting in an increased prevalence of obesity in adulthood.

Scientific studies have investigated the association between IQ and obesity in large cohorts. For instance, one group of researchers analyzed data from a prospective, longitudinal study to see whether becoming obese is associated with a decline in intelligence from childhood to later life. More than one thousand children were included and tracked into their fourth decade of life. Anthropometric measurements (body weight and height) were carried out at birth and on 12 occasions later in life, at the ages of 3, 5, 7, 9, 11, 13, 15, 18, 21, 26, 32, and 38. Intelligence quotient (IQ) scores were assessed at the ages of 7, 9, 11, and 38. The participants who became obese had lower IQ scores in adulthood than the participants whose body mass index (BMI) remained within the normal range. However, the obese participants did not experience a severe decline in IQ over their lifetime; rather, they already had lower IQ scores in childhood than normal-weight controls.

Another population-based study followed babies born in the same week of 1958 in the United Kingdom for more than half a century. More than 17 thousand babies were included; their intelligence was assessed at the ages of 7, 11, and 16, while obesity and BMI were evaluated at age 51. The results indicated a negative association between childhood intelligence and adult BMI and obesity. In addition, it turned out that more intelligent children had healthier dietary habits and exercised more frequently as adults.

Considering the negative association between childhood obesity and intelligence, one review questioned the direction of this causality. After careful examination of longitudinal population-based studies, the review suggested that the causality runs from low intelligence to weight gain and obesity, and that excess weight gain does not cause a decline in IQ. It found no strong evidence that obesity impairs cognitive function or leads to cognitive decline, while it did find evidence that poor intelligence in childhood leads to weight gain in adulthood.

Still, not all scientists agree with these conclusions. For instance, a group of researchers investigated the impact of obesity on cognitive function in children with sleep-disordered breathing. Three groups of children were included: children with obstructive sleep apnea, children with both obstructive sleep apnea and obesity, and children without either condition (normal controls). The aim was to assess total, verbal, and performance IQ scores. The total and performance IQ scores turned out to be significantly lower in the children with obstructive sleep apnea and obesity than in the other two groups. In addition, BMI negatively influenced the total IQ score in the obese children with obstructive sleep apnea. This study suggests that obesity can aggravate cognitive impairment.

Since childhood IQ and obesity are linked, others have investigated whether maternal pre-pregnancy obesity can affect a child’s neurological development. More than 30 thousand women were included; their pre-pregnancy BMI was calculated, and their children’s IQ scores were assessed at 7 years of age. The results indicated that women with a BMI of around 20 kg/m2 had children with the highest IQ scores. In contrast, maternal obesity (BMI of 30 kg/m2 or more) was associated with lower total and verbal IQ scores in the children. More importantly, excessive weight gain during pregnancy strengthened this association.
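For reference, the BMI figures quoted in these studies follow the standard definition: weight in kilograms divided by the square of height in metres. A quick sketch (the example weights and heights below are hypothetical, and the cut-offs follow the common WHO adult categories):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def who_category(b):
    """Common WHO adult BMI categories."""
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

# Hypothetical examples bracketing the figures quoted above:
print(round(bmi(58, 1.70), 1), who_category(bmi(58, 1.70)))  # 20.1 normal
print(round(bmi(87, 1.70), 1), who_category(bmi(87, 1.70)))  # 30.1 obese
```

So a BMI near 20 sits comfortably in the normal range, while 30 marks the conventional threshold for obesity.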

All of these findings confirm that there is a link between childhood intelligence and body weight later in life. But what is the mechanism underlying this phenomenon?

According to some studies, higher intelligence (IQ) in childhood predicts a better socio-economic status later in life (a higher educational level with a better income). In addition, higher educational attainment seems to reduce the risk of obesity, probably based on better dietary habits (more healthy food choices). This might partly explain how a lower IQ in childhood can lead to weight gain and obesity later in life. When it comes to the impact that excess weight gain has on intelligence, it seems that more research is needed to confirm this association and elucidate the underlying mechanisms. One of the possible explanations for this association is that hormones produced by fat cells may damage brain cells. Another possibility is that excess body weight may jeopardize cerebral blood vessels and, thus, impair brain functions.

Although the causal relationship between obesity and lower intelligence scores is not entirely clear, the link evidently exists. Since obesity is a rising global health concern, its negative effects should also be investigated in terms of their impact on cognitive function and intelligence. This is especially important considering that even maternal pre-pregnancy obesity is associated with lower IQ in children.


Belsky, D.W., Caspi, A., Goldman-Mellor, S., Meier, M.H., Ramrakha, S., Poulton, R., Moffitt, T.E. (2013). Is obesity associated with a decline in intelligence quotient during the first half of the life course? American Journal of Epidemiology. 178(9): 1461-1468. doi: 10.1093/aje/kwt135

Kanazawa, S. (2013). Childhood intelligence and adult obesity. Obesity (Silver Spring). 21(3): 434-440. doi: 10.1002/oby.20018

Kanazawa, S. (2014) Intelligence and obesity: which way does the causal direction go? Current Opinion in Endocrinology, Diabetes and Obesity. 21(5): 339-344. doi: 10.1097/MED.0000000000000091

Vitelli, O., Tabarrini, A., Miano, S., Rabasco, J., Pietropaoli, N., Forlani, M., Parisi, P., Villa, M.P. (2015). Impact of obesity on cognitive outcome in children with sleep-disordered breathing. Sleep Medicine. 16(5): 625-630. doi: 10.1016/j.sleep.2014.12.015

Huang, L., Yu, X., Keim, S., Li, L., Zhang, L., Zhang, J. (2014). Maternal prepregnancy obesity and child neurodevelopment in the Collaborative Perinatal Project. International Journal of Epidemiology. 43(3): 783-792. doi: 10.1093/ije/dyu030

Chandola, T., Deary, I.J., Blane, D., Batty, G.D. (2006). Childhood IQ in relation to obesity and weight gain in adult life: the National Child Development (1958) Study. International Journal of Obesity. 30(9): 1422-1432. doi: 10.1038/sj.ijo.0803279

Image via mojzagrebinfo/Pixabay.

Creatine and the Brain Wed, 14 Mar 2018 12:00:36 +0000

The human brain depends on a constant energy supply for proper functioning. Impairments of energy supply can jeopardize brain function and even contribute to the pathogenesis or progression of neurodegenerative diseases. Chronic disruption of energy supply degrades cellular structures and creates conditions that favor the development of Parkinson’s, Alzheimer’s, or Huntington’s disease. In addition, impaired brain energy metabolism is an important contributor to the pathogenesis of psychiatric disorders. Thus, interventions that can increase or regulate local energy stores in the brain might be neuroprotective and could represent a useful therapeutic tool for managing various neurological and neurodegenerative conditions.

One of the potential therapeutic agents for restoring brain energy is creatine. Creatine is particularly important since it replenishes ATP (a cellular unit of energy) without relying on oxygen.

Creatine is better known as one of the most popular bodybuilding supplements. As a naturally occurring compound it is generally well tolerated, and it is commonly used by gym-goers. Creatine is mostly stored in muscles, where it serves as a readily available source of energy. But according to scientific findings, creatine also concentrates in the brain. It is an important component of the creatine kinase/phosphocreatine system, which plays an important role in the metabolic networks of the brain and central nervous system and is involved in many of the brain’s functions. Experimental studies have indicated that creatine can protect against ischemic cell damage (caused by a lack of oxygen) by preventing ATP (energy) depletion and reducing structural damage to the affected brain cells.

Despite promising laboratory findings, investigations of creatine’s effects on the human brain have produced mixed results. Still, studies of oral creatine supplementation have demonstrated some benefits. For instance, one study in healthy young volunteers showed that oral supplementation with creatine monohydrate for 4 weeks led to a significant increase in total creatine concentration in the participants’ brains, with the most pronounced rise seen in the thalamus. The fact that creatine concentrates in the brain after consumption indicates that it can pass the blood-brain barrier, so benefits of supplementation for the brain can reasonably be expected.

Another study investigated the impact of creatine consumption on brain chemistry, including the brain’s high-energy phosphate metabolism. After two weeks of creatine supplementation, the brain’s creatine level significantly increased, as did the concentrations of phosphocreatine and inorganic phosphate. This study demonstrates the possibility of using creatine supplementation to modify high-energy phosphate metabolism in the brain, which is especially relevant for people with certain brain disorders, as alterations in brain phosphate metabolism have been reported in depression, schizophrenia, and cases of cocaine and opiate abuse.

Another human study demonstrated that creatine can improve cognitive performance during oxygen deprivation. Participants received creatine or placebo for seven days and were then exposed to a hypoxic gas mixture. In comparison with the placebo group, creatine supplementation helped to restore cognitive performance, especially the attention capacity affected by hypoxia, and helped to maintain an appropriate neuronal membrane potential in brain cells. This research demonstrates that creatine can be a valuable supplement when cellular energy provision is jeopardized, and it supports the idea that creatine is beneficial not only for recovering muscle strength but for restoring brain function, too.

Approximately half of the daily creatine requirement (around 3–4 grams) comes from dietary sources, while the other half is produced endogenously in the body. Creatine is a carninutrient, meaning that it is available only from animal foods (mostly meat). Since creatine is not present in plant-based foods, plasma and muscle levels of creatine are commonly lower in vegetarians and vegans than in omnivores. Thus, individuals whose diet is based on plant foods may benefit from creatine supplementation in terms of improved brain function. One study in young adult females investigated the impact of creatine supplementation on cognitive function in both vegetarians and omnivores: compared with placebo, 5 days of supplementation led to significant improvements in memory, and the improvement was more pronounced in vegetarians. Another study investigated the effects of 6 weeks of creatine supplementation in young vegetarians. In comparison with placebo, creatine induced significant improvements in intelligence and working memory, both of which depend on the speed of information processing. This study showed that brain performance depends on the level of energy available in the brain, which can be beneficially influenced by creatine supplementation.

Creatine supplementation seems to be beneficial not only for healthy people but also for individuals with psychiatric disorders. For instance, decreased creatine levels have been reported in the brains of patients with anxiety disorders. Post-traumatic stress disorder (PTSD) is a type of anxiety condition that develops in individuals who have experienced traumatic situations. Creatine supplementation was shown to be beneficial in treatment-resistant PTSD patients, providing relief from symptoms and improving sleep quality.

Furthermore, studies of creatine functions in the central nervous system underline creatine’s therapeutic potential in neurodegenerative diseases, since creatine supplementation can reduce the loss of neuronal cells. Animal model studies have also demonstrated that the size of creatine stores in the brain plays an important role in Alzheimer’s disease, and creatine supplementation was found to be beneficial in animal models of Parkinson’s disease as well, providing a rationale for using creatine in these conditions.

To sum up, it seems that creatine can be used as a supplement for replenishing the brain’s energy stores. This can further improve cognitive functions and brain performance, with the effects more pronounced in vegans and vegetarians. In addition, creatine has therapeutic potential in psychiatric disorders and neurodegenerative conditions.


Turner, C.E., Byblow, W.D., Gant, N. (2015) Creatine supplementation enhances corticomotor excitability and cognitive performance during oxygen deprivation. Journal of Neuroscience. 35(4): 1773-1780. doi:10.1523/JNEUROSCI.3113-14.2015

Dechent, P., Pouwels, P.J., Wilken, B., Hanefeld, F., Frahm, J. (1999) Increase of total creatine in human brain after oral supplementation of creatine-monohydrate. American Journal of Physiology. 277(3 Pt 2): R698-R704. PMID:10484486

Lyoo, I.K., Kong, S.W., Sung, S.M., Hirashima, F., Parow, A., Hennen, J., Cohen, B.M., Renshaw, P.F. (2003) Multinuclear magnetic resonance spectroscopy of high-energy phosphate metabolites in human brain following oral supplementation of creatine-monohydrate. Psychiatry Research. 123(2): 87-100. PMID:12850248

Brosnan, M.E., Brosnan, J.T. (2016) The role of dietary creatine. Amino Acids. 48(8): 1785-1791. doi:10.1007/s00726-016-2188-1

Benton, D., Donohoe, R. (2011) The influence of creatine supplementation on the cognitive functioning of vegetarians and omnivores. British Journal of Nutrition. 105(7):1100-1105. doi:10.1017/S0007114510004733

Rae, C., Digney, A.L., McEwan, S.R., Bates, T.C. (2003) Oral creatine monohydrate supplementation improves brain performance: a double-blind, placebo-controlled, cross-over trial. Proceedings. Biological Sciences. 270(1529): 2147-2150. doi:10.1098/rspb.2003.2492

Andres, R.H., Ducray, A.D., Schlattner, U., Wallimann, T., Widmer, H.R. (2008) Functions and effects of creatine in the central nervous system. Brain Research Bulletin. 76(4): 329-343. doi:10.1016/j.brainresbull.2008.02.035

Image via TheDigitalArtist/Pixabay.

Invertebrates: A Vastly Different Brain Structure Can Be Remarkably Efficient
Mon, 12 Mar 2018 12:00:20 +0000

Have you ever wondered how an alien’s brain might work? What kind of information-processing system might it have, and how might it differ from ours? There is no need to look any further: the answers to these questions can be found much closer to home, in insects and other invertebrates.

Insects and humans have evolved quite differently and thus have very different kinds of neural systems. Insects, jellyfish, octopuses, and many other invertebrates have very sophisticated nervous systems and display remarkably complex behavior, learning abilities, and levels of intelligence. Yet many of them lack the kind of brain we have, that is, a centralized decision-making system.

If we chart the evolution of the nervous system, we can see that earlier neural networks were more diffuse. Collections of neurons called ganglia make most decisions locally (invertebrates have several ganglia), while some central decisions, such as the direction of movement of the whole body, are made more democratically. In fact, it is still not fully known when the centralization of the nervous system started to take place.

It is well accepted that cnidarians have one of the most primitive nervous systems, the branch from which the neural capabilities of other living beings evolved. Organisms of this evolutionary branch, like jellyfish, are still common today, and it is now known that their nervous system is quite complex and more capable than we once thought.

A decentralized, or diffuse, nervous system has some fantastic capabilities. Some insects, like Drosophila flies, can stay alive for many days after being decapitated. They do not merely survive without a head; they can fly, walk, and even copulate. Cockroaches can remember things even after their brain has been removed.

Although diffuse nervous systems are more primitive, this does not imply a lack of intelligence. An example of a decentralized, large, and at the same time incredibly complex nervous system is found in the octopus. Octopuses have the majority of their neurons, grouped in ganglia, located in their arms. The arms of an octopus can do many things independently of each other: they can perform basic motions, and they can touch or taste without any involvement of the brain. Although octopuses have nothing in common with vertebrates when it comes to neuroanatomy, they can learn, recognize objects, and perform complex tasks.

We have erroneously come to see brain size as a measure of intelligence, since we know that most representatives of the animal kingdom, particularly non-mammals, have far smaller brains than ours. But the correlation between brain size and intelligence is not linear. Just think of whales, whose brains can weigh 9 kg and contain 200 billion neurons. A typical human brain, for comparison, weighs around 1.5 kg and has about 80 billion neurons. Clearly, not all kinds of intellectual activity depend on brain size or the number of neurons.

This lack of a direct correlation explains why some insects are more innovative than humans or other higher animals when it comes to socializing, forming colonies, or learning from each other. Colonies of bees and ants have very complex social structures in which various members have clearly divided tasks. They have a complicated system for communicating with each other. They even have a clear division of labor, which includes castes of slaves, farmers, and warriors. What is amazing is that such complex activities are achieved with the help of just a few million neurons.

Now we understand that the nervous system does not have to be centralized to function efficiently, and that different forms of nervous systems have their pros and cons. Further, we know that neither brain volume nor the number of neurons is a reliable indicator of intelligence. However, we have maintained the assumption that, at the very least, handling larger volumes of information requires a larger number of neurons. Even this view is now under revision.

Many complex behavioral reactions and responses rely on just a few neurons and are independent of the brain. Think of reflexes, such as the pain withdrawal reflex, which are handled by neural circuits outside the brain. We are also learning to appreciate the value of our gut feeling, as the gut contains an enormous number of neurons that play a much broader role in health.

Understanding the existence of entirely different kinds of cognitive systems, neural systems, or information processing systems has many implications for human health. It forces us to look at our bodies from a different angle. It is entirely possible that a certain degree of non-centralized intelligence, and maybe even non-neural information processing, exists in our body.

An excellent example of the diffuseness of body systems is our endocrine system. The concept that endocrine functions are limited to a few specific organs is becoming obsolete. We now talk about diffuse endocrinology, as every organ and tissue, be it the gut or fat, secretes some hormones with various effects. Similarly, many researchers now talk about diffuse neuroendocrinology, as the two systems are very closely connected.


Ameri, P., & Ferone, D. (2012). Diffuse endocrine system, neuroendocrine tumors and immunity: what’s new? Neuroendocrinology, 95(4), 267–276. doi:10.1159/000334612

Arendt, D., Denes, A. S., Jékely, G., & Tessmar-Raible, K. (2008). The evolution of nervous system centralization. Philosophical Transactions of the Royal Society B: Biological Sciences, 363(1496), 1523–1528. doi:10.1098/rstb.2007.2242

Gagliano, M., Vyazovskiy, V. V., Borbély, A. A., Grimonprez, M., & Depczynski, M. (2016). Learning by Association in Plants. Scientific Reports, 6, 38427. doi:10.1038/srep38427

Koizumi, O. (2016). Origin and Evolution of the Nervous System Considered from the Diffuse Nervous System of Cnidarians. In The Cnidaria, Past, Present and Future (pp. 73–91). Springer, Cham. doi:10.1007/978-3-319-31305-4_6

Kuehnle, I., & Goodell, M. A. (2002). The therapeutic potential of stem cells from adults. BMJ: British Medical Journal, 325(7360), 372–376.

Image via mineev_88/Pixabay.

Neurobotany: A Drastically Different Approach To Information Processing And Communication In Plants?
Fri, 09 Mar 2018 13:00:43 +0000

Can plants think? We would have confidently said “No!” just a few decades ago. But now we know that even plants have specific information-processing abilities, and this in the absence of any neurons at all. In fact, some scientific studies have demonstrated that plants are able to learn and memorize. Researchers have now started to use terms like learning, intelligence, and behavior when referring to plants.

Although plants do not have brains, they seem to have other structures and mechanisms through which they communicate with each other and receive signals from the outside world. Some scientists believe that plant cells share some common features with neural cells, including the production of substances called neurotransmitters that the brain utilizes for sending the signals. Glutamate is an example of a very well studied excitatory neurotransmitter found in our brain. Interestingly, glutamate cell receptors have been identified in plants, suggesting that glutamate participates in cell-to-cell communication not only in animals and humans but in plants as well. In addition, glutamate was identified as an exogenous signal for regulation of root growth and its morphogenesis (development and structure), suggesting that the neuronal-like activities of glutamate are most evident in root apices. Accordingly, root apices, i.e., root hairs, represent plant parts that are the most similar in appearance to neurons. In other words, root apices (peaks) create branched networks that resemble neuronal networks with extended axons (nerve fibers).

Apart from reacting to neurotransmitters, plants can also generate and transmit signals. As some findings indicate, plants can produce electrical signals (impulses) that control important physiological processes, including photosynthesis, respiration, and movement of lateral organs.

Also, it has been proposed that plants assemble structures similar to neuronal synapses. Synapses are structures that enable transmission of signals from one neuron to another in the brains of animals and humans. As findings further indicate, the activity of so-called plant synapses is regulated by light and gravity. The presence of these synapses could explain how the transportation of nutrients in the plant body goes along the root-shoot axis directed by gravity or light. Also, these synapses might be important determinants in defining plant integrity, i.e., the ability of plant cells to recognize “non-self” and detect “self”.  

Scientists believe that in addition to sensing light, gravity and moisture, plants (more precisely their roots) can also sense toxins and other chemical signals from their neighbors and other plants. Based on the collected information, roots somehow “decide” whether to change their course in order to escape dangerous substances and pathogens.

Plant communication can also be mediated by a complex network that plants form with underground fungi. Fungi that live in the soil form symbioses with plants: they make physical connections with roots, resulting in the formation of a huge underground network. This network is not dissimilar to the connections formed between brain cells, leading to speculation that it may function as a kind of “underground brain”. The fungi enhance the plants’ uptake of nutrients (minerals) from the soil, as well as the plants’ tolerance to pathogens. In return, plants supply fungi with the nutrients (such as carbohydrates) needed for the formation of so-called mycelial networks (i.e., networks of fungal filaments). Often, these networks are shared between the roots of different species, or of separate individuals of the same species, forming common mycelial networks.

Emerging evidence suggests that fungi-root networks mediate the transmission of signals, allowing chemical signaling between plants. Thus, not only in appearance but also in function do these structures resemble the brain. A group of researchers tested the hypothesis that underground fungal networks mediate allelopathy, a phenomenon in which plants limit the growth of neighboring plants by producing compounds called allelochemicals. The hypothesis was based on the idea that, along with the water and nutrients transported through common fungal networks, signals (chemicals) that induce plant defenses are transported as well.

Researchers have demonstrated that common mycelial networks facilitate allelopathic interactions by expanding the zone in which allelochemicals act and by increasing their amounts, i.e., the fungi contribute to their accumulation. Another study has shown that underground fungal networks facilitate inter-plant communication through which plants defend themselves from pathogens. More precisely, plants infected with pathogens can send chemical signals, through the fungal networks connecting their roots, to non-infected neighbors, inducing those plants’ defense mechanisms and thereby limiting the spread of disease. In other words, plants can sense sickness in their neighbors.

Other findings in support of the idea that plants have a degree of intelligence are based on claims that plants can sense music and react differently to different types of music (preferring classical music to rock and roll). Although this and similar ideas remain rather speculative, research on the topic has progressed and new findings have emerged. In certain circumstances, plants do express sophisticated behavior. They respond to competitors and pathogens and have evolved to adapt to different soils and growth conditions. Thus, it could be assumed that plants can process information, memorize it, learn, and make decisions accordingly. Indeed, some scientists believe that plants can store biological information and recall it in order to express an adequate response. A group of researchers showed that some plants can learn from experience. For instance, the mimosa plant can learn which stimuli are worth responding to and folds its leaves as a sign of defense.

As these research findings indicate, the presence of a brain and neurons is not a mandatory requirement for the learning process. Plants seem to express behavioral traits common to organisms of higher organization, although through less conventional pathways that need to be further studied.


Davenport, R. (2002). Glutamate receptors in plants. Annals of Botany. 90(5): 549-557. DOI: 10.1093/aob/mcf228

Baluska F. (2010). Recent surprising similarities between plant cells and neurons. Plant Signaling and Behavior. 5(2): 87-89. PMCID: PMC2884105

Baluska, F., Volkmann, D., Menzel, D. (2005). Plant synapses: actin-based domains for cell-to-cell communication. Trends in Plant Science. 10(3): 106-111. DOI: 10.1016/j.tplants.2005.01.002

Babikova, Z., Gilbert, L., Bruce, T.J., Birkett, M., Caulfield, J.C., Woodcock, C., Pickett, J.A., Johnson, D. (2013). Underground signals carried through common mycelial networks warn neighbouring plants of aphid attack. Ecology Letters. 16(7): 835-843. DOI: 10.1111/ele.12115

Barto, E.K., Hilker, M., Müller, F., Mohney, B.K., Weidenhamer, J.D., Rillig, M.C. (2011). The fungal fast lane: common mycorrhizal networks extend bioactive zones of allelochemicals in soils. PLoS One. 6(11): e27195. DOI: 10.1371/journal.pone.0027195

Song, Y.Y., Zeng, R.S., Xu, J.F., Li, J., Shen, X., Yihdego, W.G. (2010). Interplant communication of tomato plants through underground common mycorrhizal networks. PLoS One. 5(10): e13324. DOI: 10.1371/journal.pone.0013324

Gagliano, M., Renton, M., Depczynski, M., Mancuso, S. (2014). Experience teaches plants to learn faster and forget slower in environments where it matters. Oecologia. 175(1): 63-72. DOI: 10.1007/s00442-013-2873-7

Garzón, F. C. (2007). The Quest for Cognition in Plant Neurobiology. Plant Signaling & Behavior, 2(4), 208–211. PMCID: PMC2634130

Image via 27707/Pixabay.

Dietary Intake of Omega Fatty Acids and Brain Health
Mon, 05 Mar 2018 13:00:45 +0000

Omega fatty acids are well known to be important for the normal functioning of our body. These fatty acids are essential for the formation of cell membranes and play a critical role in brain health. In addition, they are crucial for fertility, visual acuity, and optimal cardiovascular health. Omega-3 fatty acids also have an anti-inflammatory effect.

Omega fatty acids belong to a group called polyunsaturated fatty acids (PUFA). There are several kinds of omega-3 fatty acids, but three of them are considered essential for humans: alpha-linolenic acid (ALA), eicosapentaenoic acid (EPA), and docosahexaenoic acid (DHA).

ALA is a short-chain fatty acid that cannot be made by our body and is thus regarded as essential. EPA and DHA are classified as long-chain fatty acids. Our body can produce them, but only in small quantities and in a very inefficient manner. This is why one’s diet should provide enough of the essential and semi-essential omega-3 fatty acids. ALA is the only omega-3 fatty acid present in plant-based foods.

Although EPA and DHA, both of which are particularly important for brain health, can be produced in the body from ALA, the conversion process is not very efficient. It is believed that only ~15% of dietary ALA can be converted to EPA or DHA. ALA is present in sufficient quantities in canola oil, flaxseed oil, and some other plant oils, whereas DHA and EPA are mainly present in seafood (various varieties of fish) and found in small amounts in other animal and poultry products.

Research has shown that although fish are rich in omega fatty acids, they are not able to produce DHA or EPA themselves. In fact, fish obtain their omega-3 fatty acids through the food chain: microalgae (phytoplankton) produce these fatty acids, which then accumulate in the organisms that feed on them and, ultimately, in fish. These findings draw attention to two important facts: the importance of fish and animal products in our diet for optimal cognitive health, and the importance of preserving the fragile environmental balance in which everything is interconnected.

Omega-3 fatty acids are essential for the formation of phospholipids that are in turn required for cellular membranes. DHA is particularly critical for brain health, as demonstrated by its high content in the brain. Furthermore, since DHA helps to reduce the inflammatory responses, it may have a neuroprotective action.

There are many studies regarding the intake of omega-3 and omega-6 fatty acids and their optimal ratio in one’s diet. However, the recommendations in this regard remain inconclusive. Rather than focusing on the total amount of omega fatty acids or the ratio of various fatty acids, one should focus on the overall dietary intake of EPA and DHA.

Although EPA and DHA are widely accepted as essential for wellbeing, healthy aging, and slowing down or preventing neurodegeneration, their levels are rarely assessed in clinical practice. There is no standardized, universally accepted range, but the most widely accepted normal level is ~3–4% of all plasma phospholipids (for EPA and DHA combined).

Plasma and serum fatty acid values are not very accurate, as they change with the content of recent meals. For this reason, many researchers recommend measuring the content of EPA and DHA in red blood cell membranes, as this gives an approximate average over the preceding 120 days (the lifespan of a red blood cell). At present, for European and American populations, 3–5% of EPA and DHA in erythrocyte membranes is considered the normal range. In the Japanese population, for example, where consumption of seafood is much higher, these numbers may be much higher too.
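The membrane measure described above is just a percentage: EPA plus DHA as a share of all fatty acids in the red blood cell membrane. A minimal sketch, with entirely hypothetical lab values (the function name and numbers are illustrative, not a clinical tool):

```python
def omega3_membrane_percent(epa_dha_mg, total_fatty_acids_mg):
    """EPA + DHA as a percentage of total erythrocyte membrane fatty acids."""
    return 100.0 * epa_dha_mg / total_fatty_acids_mg

# Hypothetical measurement: 4 mg of EPA + DHA out of 100 mg of total
# membrane fatty acids.
print(omega3_membrane_percent(4.0, 100.0))  # prints 4.0
```

A result of 4.0% would fall inside the 3–5% range described above for European and American populations.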

So what are the dietary recommendations for Omega-3s?

At present, for a healthy brain, a daily intake of 1.6 g of omega-3s is recommended for males and 1.1 g for females. As mentioned earlier, specific attention should be paid to the content of DHA and EPA in food items. Flaxseed oil, chia seeds, and walnuts are all rich in ALA, but DHA and EPA are mainly present in fish, seafood, and poultry. Considering that only a small proportion of ALA can be converted to DHA or EPA, vegans are at risk of developing a deficiency in these omega-3 acids, which may raise the risk of neurodegenerative disorders.
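The arithmetic behind this concern can be sketched in a few lines, using the two figures quoted in this article: a ~15% ALA-to-EPA/DHA conversion efficiency and the 1.6 g/day recommendation for males. The function name and the flaxseed-oil figure are illustrative assumptions, not dietary advice:

```python
ALA_CONVERSION = 0.15  # upper estimate of ALA -> EPA/DHA conversion cited above

def epa_dha_from_ala(ala_grams, conversion=ALA_CONVERSION):
    """Estimate the EPA/DHA equivalent obtained from dietary ALA."""
    return ala_grams * conversion

# One tablespoon of flaxseed oil provides roughly 7 g of ALA.
print(round(epa_dha_from_ala(7.0), 2))  # prints 1.05
```

Even a generous 7 g of ALA yields only about 1 g of EPA/DHA equivalent, which illustrates why a plant-only diet can fall short of the recommended intake despite a high total omega-3 consumption.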

To counter the deficit of DHA in plant-based products, many manufacturers have started to fortify soy beverages, juices, and milk products with DHA. Dietary supplements are another way of obtaining sufficient amounts of DHA. However, as most dietary supplements are based on fish oil or krill oil, if a person is strictly vegetarian, they can take algal oil-based supplements.

Data from the National Health and Nutrition Examination Survey (NHANES) 2011–2012 indicate that most of the US population obtains enough dietary omega-3 fatty acids. However, the majority of these fatty acids come from plant-based sources, meaning that the American diet is rich in ALA but deficient in EPA and DHA.

Diagnosing omega-3 deficiency is not an easy task, as there is no established lower cut-off value. Researchers do not currently know at which level omega fatty acid insufficiency starts causing problems, and things are further complicated by individual differences. Present scientific data are insufficient to tell at what level there is a risk of neural deficits, visual impairment, or altered immune responses. However, some people with omega-3 deficiency may develop specific dermatological signs, such as scaling of the skin or unexplained dermatitis.

Classical omega-3 deficiency is rare in the US. Still, considering the importance of DHA and EPA for healthy aging and cognitive capacity, one may suppose that, without supplementation, strict vegans, vegetarians, and others who do not eat fish are at higher risk than those who do.


Dyall, S. C. (2015). Long-chain omega-3 fatty acids and the brain: a review of the independent and shared effects of EPA, DPA and DHA. Frontiers in Aging Neuroscience, 7. doi:10.3389/fnagi.2015.00052

McNamara, R. K. (2010). DHA Deficiency and Prefrontal Cortex Neuropathology in Recurrent Affective Disorders. The Journal of Nutrition, 140(4), 864–868. doi:10.3945/jn.109.113233

Office of Dietary Supplements – Omega-3 Fatty Acids.

Weiser, M. J., Butt, C. M., & Mohajeri, M. H. (2016). Docosahexaenoic Acid and Cognition throughout the Lifespan. Nutrients, 8(2). doi:10.3390/nu8020099

Image via pixel2013/Pixabay.

Handedness: What Does It Say About Your Brain Structure?
Mon, 19 Feb 2018 13:30:47 +0000

Left-handedness, as a relatively uncommon phenomenon, never fails to fascinate people. There is a common perception that left-handed people are more talented and artistic. To what extent are these assumptions correct, and what can your preferred use of the right or left hand tell you about your brain structure?

Handedness is the better performance of, or preference for, one hand, i.e., the dominant hand. Right-handedness is the most common type, observed in 70–95% of the world population, followed by left-handedness and then the very rare types of mixed-handedness and ambidexterity. Although handedness is an important physiological feature in humans, its origins are still not well understood.

While many scientists assume that genetics is the main determinant of handedness, others believe that additional factors also play an important role and that variations in handedness are related to certain behavioral and anatomical measures. For instance, although just 10% of humans are left-handed, these individuals tend to be over-represented in artistic professions, have better mathematical abilities, and have a lower predisposition to diseases such as arthritis and ulcers. On the other hand, left-handers show an increased prevalence of some health issues, including cardiovascular disease, dyslexia, asthma, and multiple sclerosis.

In addition to strict (constant) handedness, there is also mixed, i.e., inconsistent, handedness. Some scientists believe that mixed-handed individuals have poorer mental and physical health, with lower cognitive scores and higher rates of dyslexia and attention deficit hyperactivity disorder (ADHD). Mixed-handedness (a change in hand preference depending on the task) has been associated with greater atrophy of the hippocampus and amygdala, brain structures strongly associated with dementia and cognitive aging. Also, non-right-handers (mixed- or left-handed individuals) are at higher risk of neurodevelopmental disorders, including autism, epilepsy, and schizophrenia.

Handedness reflects the structure of our brain, more specifically its asymmetry. The functional differences between the right and left brain hemispheres are believed to underlie the phenomenon of hand dominance. Handedness is probably the most obvious manifestation of the fact that our brain functions in an asymmetric manner: the left hemisphere controls the dominant right hand, while the right hemisphere controls the dominant left hand. The left hemisphere is also specialized for language and logic in most people, while the right hemisphere specializes in intuition and creativity. Brain asymmetry and handedness become detectable very early, even during fetal development. Ultrasound examinations have revealed that as early as the 10th week of gestation, most fetuses move their right arm more often than the left, and from the 15th week, the majority suck the right thumb. This is believed to be predictive of future handedness. In line with this is the leftward enlargement of brain structures (seen in the first trimester of pregnancy), which plays an important role in neurological development.

Studies have linked handedness with differences in language lateralization. More precisely, right-handed individuals are characterized by left-hemisphere control of language, while left-handers show right-hemisphere dominance for language or bilateral speech representation. One interesting study asked whether early childhood handedness can influence language development. The authors assessed the handedness of infants monthly from 6 to 18 months of age and then again when they were toddlers (from 18 to 24 months). They found that consistent use of the right hand during infancy was associated with superior (advanced) language skills at 24 months. On the other hand, children who were not lateralized in infancy and became right- or left-handed as toddlers had the average language scores expected for their age.

There are also differences in the lateralization of visual areas of the brain between right and left-handed individuals. In right-handers, there is much higher activation of the right fusiform face area (the area responsible for face visualization) and the extrastriate body area (responsible for body visualization). Meanwhile, in left-handers, these areas are equally activated across both brain hemispheres.

Some researchers believe that brain volume may correlate with handedness, although the data on this subject remain controversial. One group of researchers reported that left-handed individuals had larger brains, while another study found no difference in brain size between right- and left-handers. Some findings also indicate that left-handers are more prone than right-handers to nighttime awakenings caused by periodic limb movements during sleep.

Since handedness has been associated with prenatal hormonal exposure, it could influence the risk of carcinogenesis later in life. Scientists investigated the impact of handedness on brain tumors, both malignant and benign ones. One study examined the associations between glioma, meningioma, and acoustic neuroma with self-reported handedness. Left-handers or ambidextrous (with equal use of both hands) individuals were at reduced risk of glioma (the most common malignant brain tumor) when compared with the right-handers. This relationship was similar for both genders. However, another very recent study found no such association. This large case-control study (which included more than 1000 glioma cases and healthy controls) reported no association between handedness and glioma risk after adjustment for age, gender, and race.

Although the brains of left-handers and right-handers differ in structure, the available literature shows no noteworthy differences in intelligence as measured by IQ scores. Nevertheless, these structural differences seem to be reflected in more diverse and creative processing of language and emotions by left-handers than by right-handed individuals. This may explain why a greater proportion of left-handers are professional musicians, even when the musical instruments are designed for right-handers (for example, violins). Similarly, a gift for mathematics seems to be more common among left-handers.

It is obvious that right- and left-handers differ not only in hand preference but also in brain structure. These differences are further reflected in the ability to perform different tasks and to succeed in different professions. Although there is a clear link between non-right-handedness and developmental disorders, there appears to be no association between brain carcinogenesis and hand dominance. Handedness can be predicted in early childhood, even during fetal development, but further investigations are needed to elucidate the origins of our preference for one hand over the other.


Cherbuin, N., Sachdev, P.S., Anstey, K.J. (2011). Mixed-handedness is associated with greater age-related decline in volumes of the hippocampus and amygdala: the PATH through life study. Brain and Behavior. 1(2): 125-134. doi:10.1002/brb3.24

Corballis, M.C. (2014). Left brain, right brain: facts and fantasies. PLoS Biology. 12(1):e1001767. doi:10.1371/journal.pbio.1001767

Nelson, E.L., Campbell, J.M., Michel, G.F. (2014). Early handedness in infancy predicts language ability in toddlers. Developmental Psychology. 50(3): 809-814. doi:10.1037/a0033803

Li, M., Wang, J., Liu, F., Chen, H., Lu, F., Wu, G., Yu, C., Chen, H. (2015). Handedness- and brain size-related efficiency differences in small-world brain networks: a resting-state functional magnetic resonance imaging study. Brain Connectivity. 5(4): 259-265. doi:10.1089/brain.2014.0291

Josse, G., Hervé, P.Y., Crivello, F., Mazoyer, B., Tzourio-Mazoyer, N. (2006). Hemispheric specialization for language: Brain volume matters. Brain Research. 1068(1): 184-193. doi:10.1016/j.brainres.2005.11.037

Inskip, P.D., Tarone, R.E., Brenner, A.V., Fine, H.A., Black, P.M., Shapiro, W.R., et al. (2003). Handedness and risk of brain tumors in adults. Cancer Epidemiology, Biomarkers and Prevention. 12(3): 223-225. PMID:12646512

Miller, B., Peeri, N.C., Nabors, L.B., Creed, J.H., Thompson, Z.J., Rozmeski, C.M., et al. (2018). Handedness and the risk of glioma. Journal of Neuro-oncology. E-published: doi:10.1007/s11060-018-2759-y

Image via sasint/Pixabay.

Cerebral Congestion: Why Giving Your Brain a Break Improves Productivity Fri, 16 Feb 2018 13:30:00 +0000 The Eureka moment often strikes when a person is taking a break, relaxing, lying down, or walking. Nevertheless, the importance of breaks and downtime remains underappreciated. Does being busy truly mean being productive? And what about psychological congestion? Most people, sooner or later in life, have felt the effects of burnout on their work, especially when their job demands intense mental involvement.

Modern research seems to support the idea that our brain needs frequent breaks, much as our muscles do during intensive physical activity. Even short periods of downtime may improve our ability to comprehend, think, imagine, be creative, and come up with new ideas. It seems that modern science has started to understand why Archimedes had his Eureka moment while taking a bath.

For centuries, the idea that doing nothing has anything to do with creativity or better cognition was dismissed as absurd. Research in the area was boosted by the invention of electroencephalography (EEG) in the early 20th century. EEG studies revealed high activity in the brain even at rest. However, these signals were dismissed, considered at best to reflect basic life-support functions such as breathing and, at worst, mere random noise.

The introduction of functional magnetic resonance imaging (fMRI) in the mid-1990s was a revolutionary moment in neurophysiology. Investigators could now study how brain activity affects the flow of blood to specific brain areas. Researchers found that the brain continually consumes almost 20% of the body’s energy, and that this demand rises only slightly when a person is focusing on a specific task. Performing intellectual tasks was found to be associated with higher perfusion of blood into specific brain areas, while during downtime other parts of the brain become active. The various brain centers that become active during daydreaming were named the default mode network (DMN). Researchers have further concluded that the DMN is critical to the resting state of our brain.

Now, science has realized that being at rest does not necessarily mean being idle. When a person is resting, the brain wanders and engages in default mode (DM), something that is suppressed when working or focusing on a task. DM has an essential role in mental development: switching to DM helps a person recall memories, consolidate experiences, ponder the future, contemplate social behavior, and much more. Being too busy with various tasks or being continually distracted may deprive us of constructive internal reflection. During rest periods, the brain switches to DM and thus improves its creative skills and other abilities. These moments of downtime are the periods when we form a better understanding of self, engaging in self-dialogue. They give us an opportunity to dive deep into our experiences, draw conclusions, and plan for the future.

It seems that idle time is, in fact, a phase of memory consolidation, something that has been demonstrated in various experiments in both humans and animal models. For example, in one trial, food items were placed in a maze and specific electrical patterns were recorded from the mice’s brains while the animals were looking for the food. Similar signals were also recorded while the mice were resting, indicating that they were consolidating the memory of the maze. These electrical patterns during rest periods are known as sharp wave ripples. Furthermore, the researchers found that if the sharp wave ripples were disrupted by another electrical signal, the mice had problems remembering the items or recalling information.

A similar memory consolidation phenomenon during downtime has been recorded in humans. In one experiment on human subjects with electrodes implanted to control epilepsy, researchers observed sharp wave ripples during downtime after the participants had been shown various images. They found that the ability to recall information was directly proportional to the strength of the sharp wave ripples created in the brain during rest, thus confirming a role for mental downtime in thought and memory consolidation.

This phenomenon is very familiar to many people who are trying to learn new skills. Most people note sudden improvements in learning after a period of good rest. Thus, if a person is learning to play piano, he or she may notice swift progress after a rest period. In fact, it seems that the brain takes advantage of every small moment it gets to take a break by switching to DM.

These findings have several practical implications. They indicate the need to take regular breaks at work to maintain the optimal level of functionality. They tell us that working nine to five may not be the most productive way of achieving our professional goals. People would be more creative, productive, and energetic by taking more frequent breaks and vacations.

The human brain’s resources are continually depleted if it does not get enough rest. There are indications that even short naps during the day may have an excellent replenishing effect on our mental resources. Multiple studies have established the positive impact of daytime naps or “power naps”. However, one must also understand the importance of sleep inertia, meaning that longer naps require more extended periods of recovery before optimal performance is reached.

In a way, these studies further confirm the importance of practicing mindfulness. They explain how meditation can improve various mental processes. Multiple studies have proven the value of meditation for psychological revival. Studies have indicated that regular meditation may even increase the volume of different regions of the brain that are important for optimal mental abilities. Some experiments have shown that practicing mindfulness even for a week may enhance memory and concentration. One could say that meditation is a low-tech gym for the brain.

Working hard and being productive are not the same thing. It is essential that we understand the importance of downtime. These breaks when we do nothing are critical for our psychological revival and achieving those elusive Eureka moments. Take a break!


Axmacher, N., Elger, C. E., & Fell, J. (2008). Ripples in the medial temporal lobe are relevant for human memory consolidation. Brain, 131(7), 1806–1817. doi:10.1093/brain/awn103

Damoiseaux, J. S., Rombouts, S. A. R. B., Barkhof, F., Scheltens, P., Stam, C. J., Smith, S. M., & Beckmann, C. F. (2006). Consistent resting-state networks across healthy subjects. Proceedings of the National Academy of Sciences of the United States of America, 103(37), 13848–13853. doi:10.1073/pnas.0601417103

Girardeau, G., Benchenane, K., Wiener, S. I., Buzsáki, G., & Zugaro, M. B. (2009). Selective suppression of hippocampal ripples impairs spatial memory. Nature Neuroscience, 12(10), 1222. doi:10.1038/nn.2384

Immordino-Yang, M. H., Christodoulou, J. A., & Singh, V. (2012). Rest Is Not Idleness: Implications of the Brain’s Default Mode for Human Development and Education. Perspectives on Psychological Science, 7(4), 352–364. doi:10.1177/1745691612447308

Zeidan, F., Johnson, S. K., Diamond, B. J., David, Z., & Goolkasian, P. (2010). Mindfulness meditation improves cognition: Evidence of brief mental training. Consciousness and Cognition, 19(2), 597–605. doi:10.1016/j.concog.2010.03.014

Image via Pexels/Pixabay.

Why Do Children Learn Foreign Languages So Easily? Wed, 14 Feb 2018 13:30:34 +0000 Many researchers believe that learning a foreign language before puberty, and ideally even earlier, allows children to speak more fluently, almost like native speakers. In addition, learning more than one language at an early age improves the lifelong ability to communicate with others and contributes to cognitive development and cultural awareness.

Many studies suggest that the best time to introduce a foreign language is before the age of ten. At this early stage of life language is learned and acquired faster, retained better, and spoken with exceptional pronunciation. It is widely accepted that the younger the learners, the more successful they are at imitating new sounds. This is because our brain is more open to new sounds (words) before adolescence. On the other hand, it is extremely difficult for older learners to speak a new language without having a “foreign” accent.

Although some findings indicate that young children up to the age of 5 can process up to five languages, experts mostly agree that a bilingual approach is best for young children. Nowadays, many children grow up in bilingual families and environments and thus acquire two languages as their first. All around the world, children successfully learn two languages at the same time, starting from birth. Studies have demonstrated that bilingual infants can discriminate and separate their two languages even before they speak their first word. In addition, they start building sound representations for both languages during the first year of life. Some experts believe that infants are born with the capacity to distinguish speech-sound contrasts from all of the world’s languages, while the experience of listening to one language (in contrast to another) helps to maintain the distinctions between them.

Both scientific and popular literature often discuss the effects of speaking multiple languages, rather than one, on brain functioning and cognition. Available data imply that bilingual people have enhanced cognitive processing in comparison with monolinguals, owing to their constant switching from one language to another. This switching is assumed to be sustained by functional and anatomical changes in the brain, meaning that there are structural and functional neuronal differences between bilingual and monolingual individuals. More precisely, speakers of multiple languages seem to undergo plastic changes in certain brain networks that enable them to control multiple languages.

Some scientists believe that children learn language differently, but not necessarily more easily, than adults. As they point out, children acquire a language using the same parts of the brain that control unconscious actions. This is why it often seems that children pick up words and phrases without much effort. Adults, on the other hand, are more capable of complex and intellectual learning. Other researchers consider that our brain is set up to acquire language naturally during childhood and early adolescence. Apart from possible predispositions in the brain, children seem to be more motivated to learn languages than adults: they devote much more time to learning new words and phrases.

Language consists of four dimensions: the sound system (the phonology), the meaning system (the semantics), the word formation rules (the morphology), and the sentence formation rules (the syntax). The different subsystems involved in the acquisition of a language are assumed to differ in their developmental progression and in the optimal period for acquiring them.

For instance, babies begin life with a propensity for phonology acquisition. It is commonly considered that the phonetic segment, the smallest segment of a language, is processed differently by children and adults, which might explain why children learn language differently and most probably more easily. Neurological studies have indicated differences in auditory processing between children and adults due to differences in the cortical maturation of the brain hemispheres. One recent study examined the ability of Dutch-speaking adults and 9-year-old children to rapidly recite novel word sequences that conformed to the phonotactics of Dutch. Phonotactics refers to the phonological rules governing the sound sequences that can occur in a language. This study showed that children started learning the new phonotactics effectively on the first day of the experiment, while it took adults two days to achieve the same results.

Even early research suggested that language is learned differently before and after the onset of puberty. In the late 1960s, one scientist proposed that language can only be acquired during a critical period, defined as the span between birth and puberty. At this stage of life, maturational and experiential forces direct the left brain hemisphere toward gradual specialization for language. This process is assumed to be finalized before puberty, regardless of how complete the language acquisition is. This means that after puberty, language is learned not through the neural systems specialized for language learning, but through mechanisms intended for general learning.

Based on these findings, scientists asked whether this critical period for language learning extends to the acquisition of a second language as well. A group of researchers tested the English proficiency of native Chinese and Korean speakers who were 3 to 39 years of age at the time of their arrival in the United States and who had lived there for 3 to 26 years prior to testing. The test assessed how efficiently participants used various English grammatical structures. The results revealed a clear advantage for earlier arrivals: up until puberty, test performance declined linearly with age at arrival, while after puberty performance was unrelated to age at arrival and, more importantly, was quite low. This study therefore demonstrated that children are better learners of a second language as well as of their native language, reaching higher levels of proficiency.

To sum up, children may not be better at learning languages in terms of the effort and time they invest, but they are certainly better than adults at acquiring the correct grammatical and phonetic structure of a foreign language. Age-associated changes in brain structure and plasticity make learning a foreign language more difficult for older people, as their brains process information differently.


Ghasemi, B., Hashemi, M. (2011). Foreign Language Learning During Childhood. Procedia – Social and Behavioral Sciences. 28: 872-876. doi: 10.1016/j.sbspro.2011.11.160

Werker, J.F., Byers-Heinlein, K. (2008). Bilingualism in infancy: first steps in perception and comprehension. Trends in Cognitive Sciences. 12(4): 144-151. doi: 10.1016/j.tics.2008.01.008.

Van de Putte, E., De Baene, W., García-Pentón, L., Woumans, E., Dijkgraaf, A., Duyck, W. (2017). Anatomical and functional changes in the brain after simultaneous interpreting training: A longitudinal study. Cortex. 99: 243-257. doi: 10.1016/j.cortex.2017.11.024

Nora, A., Karvonen, L., Renvall, H., Parviainen, T., Kim, J.Y., Service, E, Salmelin, R. (2017). Children show right-lateralized effects of spoken word-form learning. PLoS One. 12(2): e0171034. doi: 10.1371/journal.pone.0171034

Smalle, E.H.M., Muylle, M., Szmalec, A., Duyck, W. (2017). The different time course of phonotactic constraint learning in children and adults: Evidence from speech errors. Journal of Experimental Psychology. Learning, Memory and Cognition. 43(11): 1821-1827. doi: 10.1037/xlm0000405

Johnson, J.S., Newport, E.L. (1989). Critical period effects in second language learning: the influence of maturational state on the acquisition of English as a second language. Cognitive Psychology. 21(1): 60-99. PMID: 2920538

Image via sasint/Pixabay.

Nicotine: A Powerful Nootropic? Mon, 12 Feb 2018 13:30:02 +0000 Smoking kills. It increases the risk of lung cancer, it leads to peripheral vascular diseases and peripheral neuropathy, it causes cancer of the mouth, pharynx, larynx, and so on.

But can it make you smarter?

It seems it can, although the credit goes not to smoking itself but to the main neurostimulant in tobacco: nicotine. This addictive alkaloid has various effects on our brain, and these effects are the reason many people love to smoke. There is no doubt that tobacco smoking is harmful, but it is also no secret that even the best of minds, like Churchill and Einstein, were heavy smokers. Most people know at least one person who claims that smoking helps him or her to overcome stress, think better, or be creative. Modern research seems to support the beneficial effects of nicotine on the brain.

Nicotine: the natural nootropic

Despite all the negative sentiment surrounding the word nicotine, more and more researchers are calling nicotine a powerful nootropic. Nicotine is abundant in the leaves of the tobacco plant. It has been found to have beneficial effects on memory, cognition, creativity, motivation, and executive functions in healthy subjects. In short, it is a substance that may help healthy individuals become smarter and boost their brain power and mental capabilities. Furthermore, clinical trials indicate that it may have a role to play in the treatment of various neurological conditions. Nicotine seems to work by modulating the release of neurotransmitters through its effect on presynaptic nicotinic acetylcholine receptors.

Nicotinic receptors are present in the brain and in other organs such as the adrenal glands; nicotine acts as an acetylcholine agonist by stimulating these receptors. Nicotine also affects the reward pathway through dopamine release, which explains its benefits in conditions like Parkinson’s disease. Nicotine acts on the brain in a rather sophisticated way: it behaves more like a moderator, stimulating the brain when a person is feeling tired while having a calming effect during periods of anxiety. Meanwhile, the effects of nicotine on peripheral receptors, such as those in the adrenal glands, explain its action on the blood vessels and heart.

Thus, nicotine can be called a cognition enhancer that:

  • Improves attention in healthy individuals, helping them to perform a variety of tasks in a better way;
  • Improves short and long-term memory in healthy individuals;
  • Improves attention and performance in attention deficit hyperactivity disorder (ADHD), schizophrenia, Alzheimer’s disease, and other neurodegenerative conditions.

Research indicates that nicotine itself is not very addictive, as is evident from the failure of nicotine patches to help people quit smoking without accompanying cognitive behavioral therapy. Addiction to tobacco smoking therefore cannot be explained by nicotine alone. New research indicates that addiction to smoking may involve other chemical components of tobacco that inhibit monoamine oxidase in the brain.

In fact, it seems that nicotine is quite safe for healthy people when given in the right dose and through the right route. Nicotine patches have been demonstrated to be a safe option.

Research from the National Institute on Drug Abuse (NIDA) using functional magnetic resonance imaging (fMRI) has provided some of the most robust evidence that nicotine improves attention. Scientists assigned a task demanding high attention to two groups: smokers and non-smokers. The fMRI scans demonstrated significantly higher activity in the task-related brain areas of those who smoked.

Many studies have shown the role of the hippocampus in memory. Stimulation of nicotinic receptors has a positive effect on memory, while blockade of these receptors results in poorer memory. Various studies support the idea that a low dose of nicotine can improve long- and short-term memory through its effect on nicotinic acetylcholine receptors.

Since nicotine improves memory and cognition, it has been extensively studied in diseases characterized by neurodegeneration and the worsening of mental functions. One such serious ailment is Alzheimer’s disease. Accumulation of amyloid plaques and resulting death of neurons is the prime reason for the progressive loss of memory and cognition in Alzheimer’s. Numerous studies have demonstrated that even a low dose of nicotine can have a beneficial effect on memory, cognition, and attention in those living with Alzheimer’s disease. Furthermore, some early studies have demonstrated that nicotine may also directly inhibit the formation of amyloid plaques in Alzheimer’s.

Apart from being beneficial to cognition, nicotine has also been found useful for Parkinson’s disease, a condition involving motor dysfunction of central origin. The primary cause of Parkinson’s is a deficit in the release of dopamine in certain regions of the brain. Nicotine seems to have a neuroprotective action, as it decreases the symptoms of rigidity and tremor associated with the disease. Thus, nicotine may be useful both for symptomatic relief and for addressing the underlying pathology of Parkinson’s disease.

These benefits show that there is a need to understand nicotine better, or even to develop safer novel nicotine analogs that retain its benefits without causing as much harm. There is also a need to study the use of nicotine in various other health conditions, considering that nicotine taken as a medication (in controlled and measured doses) may not be as harmful as smoking tobacco. Thus, it seems there is some truth to smokers’ claims that smoking helps them think better.


Berlin, I., & M. Anthenelli, R. (2001). Monoamine oxidases and tobacco smoking. International Journal of Neuropsychopharmacology, 4(1), 33–42. doi:10.1017/S1461145701002188

Gotti, C., & Clementi, F. (2004). Neuronal nicotinic receptors: from structure to pathology. Progress in Neurobiology, 74(6), 363–396. doi:10.1016/j.pneurobio.2004.09.006

Gray, R., Rajan, A. S., Radcliffe, K. A., Yakehiro, M., & Dani, J. A. (1996). Hippocampal synaptic transmission enhanced by low concentrations of nicotine. Nature, 383(6602), 713. doi:10.1038/383713a0

Jones, G. M. M., Sahakian, B. J., Levy, R., Warburton, D. M., & Gray, J. A. (1992). Effects of acute subcutaneous nicotine on attention, information processing and short-term memory in Alzheimer’s disease. Psychopharmacology, 108(4), 485–494. doi:10.1007/BF02247426

Levin, E. D. (2002). Nicotinic receptor subtypes and cognitive function. Journal of Neurobiology, 53(4), 633–640. doi:10.1002/neu.10151

Levin, E. D., Bradley, A., Addy, N., & Sigurani, N. (2002). Hippocampal α7 and α4β2 nicotinic receptors and working memory. Neuroscience, 109(4), 757–765. doi:10.1016/S0306-4522(01)00538-3

Powledge, T. M. (2004). Nicotine as Therapy. PLOS Biology, 2(11), e404. doi:10.1371/journal.pbio.0020404

Quik, M., Huang, L. Z., Parameswaran, N., Bordia, T., Campos, C., & Perez, X. A. (2009). Multiple roles for nicotine in Parkinson’s disease. Biochemical Pharmacology, 78(7), 677–685. doi:10.1016/j.bcp.2009.05.003

Sahakian, B., Jones, G., Levy, R., Gray, J., & Warburton, D. (1989). The effects of nicotine on attention, information processing, and short-term memory in patients with dementia of the Alzheimer type. The British Journal of Psychiatry, 154(6), 797–800. doi:10.1192/bjp.154.6.797

Salomon, A. R., Marcinowski, K. J., Friedland, R. P., & Zagorski, M. G. (1996). Nicotine Inhibits Amyloid Formation by the β-Peptide. Biochemistry, 35(42), 13568–13578. doi:10.1021/bi9617264

Suliman, N. A., Taib, M., Norma, C., Moklas, M., Aris, M., Adenan, M. I., … Basir, R. (2016). Establishing Natural Nootropics: Recent Molecular Enhancement Influenced by Natural Nootropic. Evidence-Based Complementary and Alternative Medicine, 2017, 12. doi:10.1155/2016/4391375

Villafane, G., Cesaro, P., Rialland, A., Baloul, S., Azimi, S., Bourdet, C., … Maison, P. (2007). Chronic high dose transdermal nicotine in Parkinson’s disease: an open trial. European Journal of Neurology, 14(12), 1313–1316. doi:10.1111/j.1468-1331.2007.01949.x

Warburton, D. M. (1992). Nicotine as a cognitive enhancer. Progress in Neuro-Psychopharmacology & Biological Psychiatry, 16(2), 181–191.

Image via RyanMcGuire/Pixabay.

Vitamin D and Brain Health Wed, 07 Feb 2018 13:00:02 +0000 Although the main functions of vitamin D include maintaining phosphorus and calcium homeostasis (and hence a positive effect on bone health), vitamin D plays other important roles in the body. Consequently, the effects of vitamin D deficiency can be felt throughout the body, including the brain.

Vitamin D is an important fat-soluble vitamin that can be synthesized endogenously or obtained from foods that naturally contain it or are fortified with it. Endogenous synthesis of vitamin D starts in human skin under the influence of ultraviolet B (UVB) radiation, which is why it is often called the sunshine vitamin. This is how vitamin D3 is formed; it undergoes further metabolism in the body, leading to the creation of its active form. Vitamin D3 can also be obtained from foods of animal origin, with fatty fish and egg yolk as major sources. In addition, it can be found in fortified foods such as milk or cereals. Vitamin D2, on the other hand, can be synthesized in mushrooms under the influence of sunlight and can be found in some plants. Generally, vitamin D3 is more potent than vitamin D2, meaning that it raises the levels of the active forms of vitamin D in the body more efficiently.

According to the scientific literature, low levels of vitamin D are implicated in conditions such as neurodegenerative diseases (for instance, Parkinson’s disease), ischemic stroke, multiple sclerosis, depression, autism, and cognitive decline. Thus, vitamin D appears to play an important role in the regulation of optimal brain functioning.

Vitamin D and depression

Depression is a common mood disorder, affecting more than 120 million people worldwide. As some researchers have identified, vitamin D status (more precisely, its deficiency) seems to be a potent biomarker of depression. Another group of researchers investigated the association between vitamin D levels and scores on tests used to evaluate depressive symptoms. They compared more than 200 participants with low levels of vitamin D to around 100 participants with high levels, and found that participants in the first group were more prone to depression.

In addition, one interesting study identified an inverse association between prenatal levels of vitamin D and postpartum depressive symptoms. Postpartum depression is a serious mental health issue that occurs after childbirth and is characterized by behavioral changes and disturbances in emotions. Furthermore, one study identified lower levels of vitamin D in postmenopausal women with major depressive disorder than in postmenopausal women with no record of depression.

Vitamin D and multiple sclerosis

Multiple sclerosis (MS) is a progressive neuroinflammatory condition affecting more than 2 million people worldwide, mostly young adults. It is a complex disease of the central nervous system, and its etiology is not completely understood. Still, the immune system and inflammation seem to be important determinants of this condition. It is believed that vitamin D, due to its immune-modulatory and anti-inflammatory properties, can attenuate the progression of multiple sclerosis. In addition, low levels of vitamin D have been reported in individuals with MS. An interesting feature of MS is that its prevalence increases significantly at geographic locations closer to the poles.

Also, there is an inverse association between sun exposure and the prevalence of MS. Since vitamin D is produced upon sun exposure, it might have a substantial effect on the incidence of multiple sclerosis. Some findings indicate that vitamin D levels during gestation could influence the risk of developing MS later in life. Still, other scientists believe that vitamin D intake throughout the whole lifespan is more effective in reducing the risk of MS. According to some estimates in Caucasians, every 50 nmol/L increase in blood levels of vitamin D leads to a more than 40% decrease in the risk of MS.

One group of researchers conducted an interesting trial investigating whether levels of vitamin D could predict disease progression in subjects experiencing a first event suggestive of MS. As these researchers reported, higher baseline vitamin D predicted reduced disease activity and slower progression, as evaluated at follow-ups of 6, 12, and 24 months. This indicates that low levels of vitamin D at the time of MS diagnosis might represent a strong risk factor and a predictor of further disease activity and progression.

Depression is a common comorbidity of multiple sclerosis. Thus, one study investigated the association between vitamin D levels and depressive symptoms in this condition. Two hundred subjects with MS were included, and depressive symptoms (assessed by a relevant scale), as well as blood levels of vitamin D, were measured. Interestingly, the prevalence of low vitamin D levels was over 45%. Moreover, vitamin D levels were inversely associated with depressive symptoms, suggesting a potential role for vitamin D supplementation in the amelioration of depression in MS.

Other conditions

Apart from depression and MS, vitamin D is assumed to play an important role in other conditions related to brain health. For instance, cognitive decline has been associated with lower levels of vitamin D. Vitamin D is also now considered an important protective factor against the development of autism and attention deficit hyperactivity disorder (ADHD). Importantly, a mother’s intake of vitamin D during pregnancy, as well as supplementation of the child, especially during the first months of life, seems to be crucial for proper brain development.

Evidently, the role of vitamin D in the human body is far more complex than its impact on bone health alone. Perhaps the most important of vitamin D’s health effects are its impact on proper brain development and functioning. An adequate intake of vitamin D is important not only in the early stages of life but throughout the whole lifespan, in order to prevent conditions such as depression and multiple sclerosis. Since vitamin D deficiency is a global health issue, supplementation (or a larger intake of fortified foods) is especially important in the winter months, when we are less exposed to the sun.


Abboud, M., Rybchyn, M.S., Rizk, R., Fraser, D.R., Mason, R.S. (2017). Sunlight exposure is just one of the factors which influence vitamin D status. Photochemical and Photobiological Sciences. 16(3): 302-313. doi:10.1039/c6pp00329j

Adibfar, A., Saleem, M., Lanctot, K.L., Herrmann, N. (2016). Potential Biomarkers for Depression Associated with Coronary Artery Disease: A Critical Review. Current Molecular Medicine. 16(2): 137-164. PMID:26812919

Kjærgaard, M., Waterloo, K., Wang, C.E., Almås, B., Figenschau, Y., et al. (2012). Effect of vitamin D supplement on depression scores in people with low levels of serum 25-hydroxyvitamin D: nested case-control study and randomised clinical trial. British Journal of Psychiatry. 201(5): 360-368. doi:10.1192/bjp.bp.111.104349

Accortt, E.E., Schetter, C.D., Peters, R.M., Cassidy-Bushrow, A.E. (2016). Lower prenatal vitamin D status and postpartum depressive symptomatology in African American women: Preliminary evidence for moderation by inflammatory cytokines. Archives of Women’s Mental Health. 19(2): 373-383. doi:10.1007/s00737-015-0585-1

Atteritano, M., Lasco, A., Mazzaferro, S., Macrì, I., Catalano, A. et al. (2013) Bone mineral density, quantitative ultrasound parameters and bone metabolism in postmenopausal women with depression. Internal and Emergency Medicine. 8(6):485-91. doi:10.1007/s11739-011-0628-1

Faridar, A., Eskandari, G., Sahraian, M.A., Minagar, A., Azimi, A. (2012). Vitamin D and multiple sclerosis: a critical review and recommendations on treatment. Acta Neurologica Belgica. 112(4): 327-333. doi:10.1007/s13760-012-0108-z

Ascherio, A., Munger, K.L., White, R., Köchert, K., Simon KC, et al. (2014). Vitamin D as an early predictor of multiple sclerosis activity and progression. JAMA Neurology. 71(3): 306-314. doi:10.1001/jamaneurol.2013.5993

Ashtari, F., Ajalli, M., Shaygannejad, V., Akbari, M., Hovsepian, S. (2013). The relation between Vitamin D status with fatigue and depressive symptoms of multiple sclerosis. Journal of Research in Medical Sciences. 18(3): 193-197. PMID:23930114

Image via silviarita/Pixabay.

Daytime Naps, Productivity, and Brain Functions Wed, 31 Jan 2018 16:30:17 +0000 It is not a secret that sleep is important for overall health, especially for proper brain functioning and the successful performance of daily tasks. Lack of sleep during the night forces many of us to take a daytime nap. Is this a good practice? Scientific research shows that a short daytime nap can be a good idea even if you don’t suffer from sleep deprivation at night.

The importance of sleep is evident from early childhood, as a baby’s sleep pattern influences his or her cognitive development. More importantly, how we sleep as babies has an impact on our productivity later in life. Short night sleep duration in toddlers has been linked to poor cognitive performance when they enter school. One study demonstrated a relationship between vocabulary (i.e., language development) and sleep patterns in infants and toddlers, with daytime naps playing a particularly important role. It turns out that daytime naps are just as important as night sleep, if not more so.

Apart from reducing sleepiness, mid-day naps offer benefits such as memory consolidation, better learning, better task performance, and improved emotional processing. Unfortunately, the habit of staying awake for most of the day has become a widespread phenomenon, especially in developed, industrialized countries where the constant drive for all-day work and profit is overwhelming. However, researchers are convinced that we are physiologically prone to snooze during the so-called nap zone between 2 and 4 pm, since our brain seems to prefer to toggle between wake and sleep more than once a day. So, during one of those days filled with multiple work tasks, we should probably take a break and rethink how a siesta can benefit us.

Studies have indicated that naps can sharpen our brain, i.e., improve concentration, in both fully rested and sleep-deprived individuals. This can lead to significantly better performance of various tasks, from driving to all kinds of work assignments. One interesting study retrospectively analyzed the incidence of highway car accidents among shift-working police drivers. The study confirmed that napping before a night shift effectively improved performance and reduced the number of car accidents by almost 50%.

Another trial investigated the effects of a 40-minute nap at 3 am on the cognitive and psychomotor performance of nurses and physicians working 12-hour night shifts in an emergency department. The researchers tested the attention and memory scores of these workers at three points in time: before, during, and after the night shift. The workers were split into two groups: one group had a nap, while the other worked continuously without napping. The result was quite revealing: workers who had napped scored better on attention tests after the night shift and performed work assignments (such as inserting a catheter) more successfully than the non-napping group.

Some researchers suggest that even brief naps can improve cognitive performance. One trial investigated the benefits of naps of different durations, taken at 3 pm after nocturnal sleep restricted to only 5 hours, compared to no nap at all. While 5-minute naps provided no benefit compared with no nap, 10-, 20-, and 30-minute naps improved the measured outcomes, including cognitive performance. Overall, 10-minute naps were identified as the most efficient, as they provided almost immediate improvements lasting for more than 2 hours.

So, what happens in the brain during those evidently beneficial naps? Regardless of how much time we spend sleeping during the night, sleepiness increases during the day and leads to a decline in cognitive abilities, including working memory. Mid-day naps (siesta) can reduce this decline by minimizing the homeostatic sleep pressure. Homeostatic sleep pressure results from the accumulation of adenosine, a by-product of cellular metabolism. Adenosine is recognized as an important sleep factor, and its accumulation increases with more time spent awake. Naps improve executive functions by reducing sleepiness (i.e., improving alertness) via this adenosine-based mechanism. In addition, a possible alternative mechanism involves the interaction of adenosine- and dopamine-regulated pathways in the brain regions crucial for executive functioning.
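The build-up and dissipation of homeostatic sleep pressure is often formalized as “Process S” in the classic two-process model of sleep regulation. The following Python sketch illustrates the idea; all parameter values (the time constants, starting pressure, and the 30-minute nap) are illustrative assumptions, not figures from the studies cited here:

```python
import math

# Toy model of homeostatic sleep pressure ("Process S" in the two-process
# model of sleep regulation). Parameter values are illustrative assumptions.
UPPER, LOWER = 1.0, 0.0   # upper and lower asymptotes of sleep pressure
TAU_WAKE = 18.2           # hours: pressure builds slowly while we are awake
TAU_SLEEP = 4.2           # hours: pressure dissipates quickly during sleep

def pressure_after_wake(s0: float, hours: float) -> float:
    """Sleep pressure rises toward UPPER while awake (adenosine accumulates)."""
    return UPPER + (s0 - UPPER) * math.exp(-hours / TAU_WAKE)

def pressure_after_sleep(s0: float, hours: float) -> float:
    """Sleep pressure decays toward LOWER during sleep (adenosine is cleared)."""
    return LOWER + (s0 - LOWER) * math.exp(-hours / TAU_SLEEP)

# Sleep pressure after 8 hours of being awake, and the relief from a short nap:
s_afternoon = pressure_after_wake(0.2, 8.0)           # pressure by mid-day
s_after_nap = pressure_after_sleep(s_afternoon, 0.5)  # after a 30-minute nap
print(round(s_afternoon, 3), round(s_after_nap, 3))
```

Because pressure falls off much faster during sleep than it builds during wakefulness, even a nap of half an hour noticeably lowers the accumulated pressure in this toy model, which is consistent with the disproportionate benefit of brief naps described above.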

Naps also enhance memory consolidation, i.e., the solidification of previously learned information and the improvement of subsequent learning. This is assumed to be mediated by the activation of brain cells in the hippocampus, a brain structure associated mainly with memory and learning. Hippocampal activation during learning is altered in the sleep-deprived brain, and it has been proposed that this alteration mediates the association between sleep loss and learning deficits.

Research findings also suggest that there is a bidirectional link between sleep and emotions, and poor sleep is often associated with poor mental health. Thus, by improving sleep patterns, naps could be an efficient tool for emotional regulation. Research on children has shown that napping promotes more appropriate and mature responses to different stimuli. Children who take naps respond more positively to positive stimuli, and less negatively to negative or neutral stimuli, compared with children deprived of naps.

To sum it up, there is a growing body of evidence that the daytime nap is not just a decadent luxury. It helps to improve brain performance in people of all ages, regardless of how well they slept at night: it improves memory, enhances alertness, and boosts performance and learning. These facts confirm the wisdom of having a siesta, as the benefits of such a practice compensate for the time lost during the busy working part of the day.


Horváth, K., Plunkett, K. (2016). Frequent daytime naps predict vocabulary growth in early childhood. Journal of Child Psychology and Psychiatry, and Allied Disciplines. 57(9): 1008-1017. DOI: 10.1111/jcpp.12583

Garbarino, S., Mascialino, B., Penco, M.A., Squarcia, S., De Carli, F., Nobili, L., et al. (2004). Professional shift-work drivers who adopt prophylactic naps can reduce the risk of car accidents during night work. Sleep. 27(7): 1295-1302. PMID: 15586782

Smith-Coggins, R., Howard, S.K., Mac, D.T., Wang, C., Kwan, S., Rosekind, M.R., et al. (2006). Improving alertness and performance in emergency department physicians and nurses: the use of planned naps. Annals of Emergency Medicine. 48(5): 596-604. DOI: 10.1016/j.annemergmed.2006.02.005

Brooks, A., Lack, L. (2006). A brief afternoon nap following nocturnal sleep restriction: which nap duration is most recuperative? Sleep. 29(6): 831-840. PMID: 16796222

Mantua, J., Spencer, R.M.C. (2017). Exploring the nap paradox: are mid-day sleep bouts a friend or foe? Sleep Medicine. 37: 88-97. DOI: 10.1016/j.sleep.2017.01.019

Berger, R.H., Miller, A.L., Seifer, R., Cares, S.R., LeBourgeois, M.K. (2012). Acute sleep restriction effects on emotion responses in 30- to 36-month-old children. Journal of Sleep Research. 21(3): 235-46. DOI: 10.1111/j.1365-2869.2011.00962.x

Image via Pexels/Pixabay.

Obesity and the Blood-Brain Barrier: What is the Connection? Fri, 19 Jan 2018 16:30:37 +0000 The blood-brain barrier (BBB) is an important physiological formation tasked with protecting the brain from the multitude of chemicals that circulate in our bloodstream. The BBB obstructs the exchange and movement of most molecules, cells, and proteins in and out of the central nervous system (CNS). This helps to keep the brain “cool,” unaffected by whatever we eat and whatever infections we encounter. The BBB is formed by the blood vessels of the CNS, which are lined by endothelial cells. It is a complex structure that maintains metabolic and immunoregulatory homeostasis in the CNS. In a healthy brain, the barrier prevents most cerebrospinal fluid (CSF) molecules from circulating in the periphery and most peripheral molecules from diffusing into the CSF.

However, even in healthy brains, the BBB is not completely impermeable. The arcuate nucleus of the hypothalamus has an incomplete BBB, allowing circulating hormones to act on the regulatory systems of this part of the brain. Several neurological disorders are characterized by a compromised BBB, including stroke, CNS infections, and neurodegenerative diseases.

Obesity and BBB

It has been shown that overconsumption of foods high in saturated fats and simple sugars degrades the integrity of the BBB and can lead to serious damage in vulnerable brain regions such as the hippocampus. Moreover, obesity causes pathological changes in the BBB that can worsen one’s general health and lead to additional pathological changes in the CNS, such as neuroinflammation and cognitive decline. Obesity also alters the numbers of the various cell types in the neurovascular unit, which compromises BBB integrity.

However, in most overweight people, excess body weight alone does not trigger serious pathological processes. Obesity is not, strictly speaking, a disorder that damages the brain, and the integrity of the BBB is not necessarily compromised by excess body weight. Nonetheless, there is a remarkable link between obesity and the BBB, and this link exists because of how the brain functions normally.

Brain systems regulating feeding behavior

The systems involved in the development of obesity are complex and still not fully understood. However, it is believed that disrupted energy homeostasis could be the root of the problem. Feeding behavior is regulated by metabolic, autonomic, endocrine, and environmental factors. Although feeding is influenced by numerous elements in each individual, energy homeostasis plays an important role by balancing energy intake against energy expenditure.

The hypothalamus is one of the brain areas recognized as a regulator of food intake, body weight, and energy and glucose homeostasis. It receives and processes metabolic signals from the periphery, as well as reward and sensory inputs from the cortex. In turn, it sends output signals to the parts of the central nervous system that regulate feeding behavior and body weight.

Feeding hormones and the role of leptin

The hypothalamus is involved in the regulation of appetite through one appetite-stimulating signaling pathway and one appetite-suppressing pathway. These pathways involve specific hormones and neuropeptides that can be found either in the blood or the CSF. In order for these hormones to function properly, they have to cross the vascular BBB via specialized transport systems. Many of the hormones regulating feeding behavior are disrupted in obesity, including insulin, leptin, adiponectin, and ghrelin.

One of the key hormones in the development of obesity is leptin. Leptin participates in energy homeostasis and the regulation of metabolism, responding to the satiety signals produced during food consumption. Leptin is secreted by adipose tissue, and its secretion is positively correlated with the amount of body fat: as body fat rises, blood leptin levels increase, signaling the brain to suppress appetite and increase thermogenesis in an attempt to decrease adiposity. During periods of fasting, when body weight drops, the secretion of leptin decreases. In general, blood leptin concentration has been shown to be significantly higher in obese people than in leaner individuals, and this elevated level decreases when obese people lose weight. It would be logical to assume that, with a high level of leptin in the bloodstream, obese individuals should feel satiated. However, this is not the case: the levels of leptin in the blood do not reflect the levels of leptin in the CSF. The ratio of CSF-to-serum leptin has been shown to be four times lower in obese individuals.

Leptin resistance

This nonlinear relationship between the concentrations of leptin in the blood and in the CSF could be due to so-called central resistance to leptin action. The theory of central leptin resistance suggests that obesity could be caused by the restricted access of leptin to the brain. This contrasts with older suggestions that obesity could be caused by inadequate production of leptin.

An important factor to consider is that the transportation of leptin from the blood circulation into the brain across the BBB is achieved via specialized transporter proteins that are prone to saturation. In other words, the number of transporters is limited, so beyond a certain blood concentration no additional leptin can be carried into the brain. This is a classic “traffic jam” situation: only a certain number of cars can get through a road bottleneck, regardless of how many cars are waiting in the queue. As a result, the brain doesn’t “feel” the real concentration of leptin in the blood. Due to the saturation of this transport across the BBB, circulating levels of leptin do not always correspond to CSF concentrations of leptin. It has been suggested that this transport system functions similarly in lean individuals with normal leptin concentrations, and that higher leptin levels have no additional biological effect once the system is already saturated. The brains of obese, hyperleptinemic subjects are never even exposed to their elevated blood levels of leptin.
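Saturable, bottlenecked transport of this kind behaves much like classic Michaelis-Menten enzyme kinetics. The following minimal Python sketch illustrates the idea; the rate constant `V_MAX`, the half-saturation constant `K_M`, and the “lean” and “obese” concentrations are hypothetical values chosen for illustration, not measured data:

```python
# Illustrative sketch of a saturable transport system, modeled with
# Michaelis-Menten kinetics. All parameter values are hypothetical; the point
# is only that once the transporters are busy, raising blood leptin further
# barely increases the amount reaching the brain.
V_MAX = 10.0  # maximum transport rate once every transporter is occupied
K_M = 5.0     # blood concentration giving half-maximal transport

def transport_rate(blood_leptin: float) -> float:
    """Leptin transport rate across the BBB at a given blood concentration."""
    return V_MAX * blood_leptin / (K_M + blood_leptin)

# Quadrupling blood leptin from a "lean" to an "obese" level:
lean, obese = 10.0, 40.0
print(transport_rate(lean))   # ~6.7: transporters still have spare capacity
print(transport_rate(obese))  # ~8.9: 4x the leptin, but only ~33% more transport
```

In this toy model, a fourfold rise in blood leptin yields only about a third more transport into the brain, which captures why the CSF leptin of obese individuals lags so far behind their serum levels.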

Can the BBB cause obesity?

The fact that the transport system is also saturated in lean individuals suggests that the BBB’s leptin transport system evolved to function properly only at lower levels of adiposity and body weight. Low levels of serum leptin inform the brain that adipose reserves are adequate to expend calories on functions other than feeding, such as reproduction and strengthening of the immune system. But when the level of fat in the body exceeds a certain threshold, leptin signaling simply stops working adequately, thus contributing to further weight gain.


Ballabh P, Braun A, Nedergaard M. (2004) The blood-brain barrier: an overview: structure, regulation, and clinical implications. Neurobiol Dis 16(1):1-13. doi: 10.1016/j.nbd.2003.12.016

Rhea EM et al. (2017) Blood-Brain Barriers in Obesity. AAPS J. 19, 921-930. doi:10.1208/s12248-017-0079-3.

Obermeier B, Daneman R, Ransohoff RM (2013) Development, maintenance and disruption of the blood-brain barrier. Nature Medicine 19(12): 1584-1596. doi: 10.1038/nm.3407

Burguera B et al. (2000) Obesity is associated with a decreased leptin transport across the Blood-brain barrier in rats. Diabetes 49: 1219-1223. PMID: 10909981

Hsu TM, Kanoski SE (2014) Blood-brain barrier disruption: mechanistic links between Western diet consumption and dementia. Frontiers in Aging Neuroscience 6:88. doi: 10.3389/fnagi.2014.00088

Banks W (2008) The Blood-Brain Barrier as a Cause of Obesity. Current Pharmaceutical Design 14: 1606-1614. PMID: 18673202

Image via strecosa/Pixabay.
