Neuroscience & Neurology – Brain Blogger Health and Science Blog Covering Brain Topics

The Brain of an Introvert
Fri, 24 Aug 2018

In the age of social media, networking, and never-ending global communication, introverts are often viewed as rather inefficient. They are seen as people who will not happily express their opinions in staff meetings or actively participate in brainstorming sessions. They are often considered poor multitaskers and not particularly charismatic. They are rarely the center of attention at a party, and they may ignore their smartphones for hours at a time. These days, when we believe that big tasks require the active participation of large groups of people working together, being an introvert can look like a disadvantage.

But don't discard introverts altogether: some of the most successful people in the world are introverts. Albert Einstein, Bill Gates, and even Facebook founder Mark Zuckerberg are all self-confessed introverts. So how do these people, who apparently lack some of the basic skills needed for a successful career, manage to achieve so much? What makes the brain of an introvert so different and so special?

Being a loner has its upsides and downsides when it comes to health and success

It is well known that personality traits are not just the result of social conditioning; they have more to do with genetics and brain structure. People inherit specific personality traits much as they inherit particular physical characteristics, and these traits bring their own advantages and disadvantages. Studies have also demonstrated anatomical differences between introvert and extrovert brains: imaging studies have shown differences in both grey matter and white matter volumes in various parts of the brain, confirming that personality traits are hard-wired into the brain.

Introverts do not like prolonged social interactions, and they feel uncomfortable in large social gatherings. They don't mind remaining isolated for extended periods; they love to think and dream. However, this self-imposed social isolation comes at a price. Reduced social interaction may negatively affect cognitive functioning, increase the risk of metabolic disorders, and weaken the immune system.

Extreme social isolation and its negative consequences are well studied. People living in orphanages or imprisoned for prolonged periods may go through periods of mental instability, and some may even experience hallucinations. However, being an introvert is different, and self-imposed social isolation does not necessarily indicate an idle brain or a lack of resilience to these health issues. The latest research shows that periods of being alone may have positive impacts on emotional and work life.

Focus on creativity

One of the benefits of being more focused on one's own thoughts is improved creativity. Introverts are more open to different ideas, and they may have a higher level of confidence and independence. They are less concerned about what others may think. Studies have shown that a prominent feature of both scientists and artists is a dislike of excessive social interaction: avoiding it leaves them with more time to focus on their ideas.

Introverts have more time to perfect their crafts than those who spend most of their time socializing. They have time to make sense of their thoughts and experiences. All this means that they have a higher chance of achieving a eureka moment.

However, it should be understood that not every kind of social withdrawal is the same. Some types of social withdrawal are indicators of psychological and physical health issues. Withdrawal may be due to shyness and anxiety, or it may be due to a dislike of socializing. Both can have a negative impact on health and do not necessarily enhance creativity. On the other hand, those who socialize less purely by choice (rather than out of anxiety or dislike) are more likely to be healthy and creative.

These findings are important because it was previously believed that unsociability is harmful. Researchers have now demonstrated that unsociability may even be beneficial. Healthy introverts prefer to spend more time alone, but this does not mean complete social withdrawal; they usually get just enough social interaction. Creative people prefer being alone, yet at the same time they spend enough time in the company of others.

The researchers also noticed that cultural differences may play an important role. For instance, unsociable children in China had more academic problems than their Western counterparts. However, this difference is becoming less visible due to globalization.

There is a general belief that certain professions demand a more sociable personality and that extroverts make better leaders. However, this is not always accurate: research shows that a lot depends on the collective character of the employees. Introverted bosses are more successful when their employees are proactive, while extroverted bosses lead better when their employees are less proactive.

Meditation, hermits, and health

Looking back at human history, we see that self-imposed isolation was commonly practiced by individual members of society; hermits would seek solitude to achieve nirvana. Daydreaming in the absence of social interaction activates the brain's so-called default mode network. Isolation thus helps, at least to a certain extent, in consolidating memories and emotions, and it helps a person reorganize their thoughts. Interestingly, when people come out of self-imposed isolation, they are likely to socialize better and more effectively.

Researchers also warn that the boundary between dangerous isolation and useful solitude is quite blurry. Extreme loneliness can be harmful or indicative of poor health. Practicing solitude to stay productive and creative does not mean being completely asocial. On the other hand, there is a real danger to the physical and mental health of those who are never alone. Furthermore, research indicates that introverts have fewer but stronger bonds with others, which leads to better life satisfaction and greater happiness.

If a person does not like to socialize much, there is nothing wrong with him or her. What matters is that the solitude is a person's own choice and not forced upon them: even classic introverts need a few good friends.

References

Bowker, J. C., Stotsky, M. T., & Etkin, R. G. (2017). How BIS/BAS and psycho-behavioral variables distinguish between social withdrawal subtypes during emerging adulthood. Personality and Individual Differences, 119, 283–288. doi: 10.1016/j.paid.2017.07.043

Forsman, L. J., de Manzano, Ö., Karabanov, A., Madison, G., & Ullén, F. (2012). Differences in regional brain volume related to the extraversion–introversion dimension—A voxel based morphometry study. Neuroscience Research, 72(1), 59–67. doi: 10.1016/j.neures.2011.10.001

Grant, A., Gino, F., & Hofmann, D. A. (2010). The Hidden Advantages of Quiet Bosses. Harvard Business Review, (DECEMBER 2010 ISSUE). Retrieved from https://hbr.org/2010/12/the-hidden-advantages-of-quiet-bosses

Hatemi, P. K., & McDermott, R. (2012). The genetics of politics: discovery, challenges, and progress. Trends in Genetics, 28(10), 525–533. doi: 10.1016/j.tig.2012.07.004

Image via Ataner007/Pixabay.

Human and Bat Echolocation in the Brain vs. Vision
Thu, 23 Aug 2018

In humans, vision is the major channel for receiving information about the world. The same applies to the majority of animals, even those that are nocturnal and have to rely more heavily on input from other senses. There are, however, exceptions: some mammals use echolocation to construct a picture of their surroundings, effectively using it instead of vision. How effective is this method of gathering information, and can it substitute for vision successfully?

Echolocation

Echolocation is used by several kinds of animals for navigation in various environments. Whales, dolphins, and bats emit calls (high-frequency sounds) and then listen to the echoes returned from the objects surrounding them. The distance to an object can be estimated from the time delay between the production of the call (click) and the detection of the echo. Since sound travels in air at about 340 m/s, a delay of 2 milliseconds, for instance, means that the target is about 34 cm away. Sound travels faster in water (about 1,500 m/s) than in air, and the clicking signals produced by whales are shorter in duration than those produced by bats. Some echolocation signals produced by dolphins and sperm whales are even audible to humans.
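The distance arithmetic above generalizes directly: the echo delay covers the round trip, so the one-way distance is half the product of sound speed and delay. A minimal sketch in Python (the function name is ours, for illustration):

```python
def echo_distance(delay_s: float, speed_m_s: float = 340.0) -> float:
    """One-way distance (meters) to a target, given the round-trip
    echo delay in seconds and the speed of sound in the medium."""
    # The sound travels out and back, hence the division by two.
    return speed_m_s * delay_s / 2.0

# A 2 ms delay in air corresponds to a target about 0.34 m (34 cm) away;
# in water (~1,500 m/s), the same delay means a target about 1.5 m away.
print(echo_distance(0.002))          # air
print(echo_distance(0.002, 1500.0))  # water
```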

Echolocation and its importance in the animal kingdom have been widely studied. Nature provides remarkable examples of how efficient echolocation can be. Bats easily detect tiny insects several meters away in complete darkness. Some species of bats in China and South America can fish in the dark using echolocation, detecting ripples on the water surface that indicate the presence of fish beneath it. Sperm whales use echolocation to find and catch prey, mostly giant squid, deep in the ocean. These whales can dive to depths of well over 2 kilometers and navigate through underwater canyons where their prey lives.

Although echolocation can be very efficient, it is not particularly common in the animal kingdom and was, in fact, developed independently by several groups of evolutionarily unrelated species. This is markedly different from vision, which is present in the majority of animals, with mechanisms perfected over millions of years of uninterrupted evolution.

Echolocation and Vision in Bats

If echolocation delivers essentially the same information as vision, can it rely on the same brain processes that handle visual information?

To answer this question, scientists have investigated the brain mechanisms underlying the processing of echo signals that allow animals to map objects in terms of distance and direction.

Recently, an interesting study was conducted in bats with the aim of revealing what happens in their brains while they fly through a room filled with obstacles (i.e., acoustically reflective plastic cylinders hanging from the ceiling). To determine the mechanisms underlying the bats' navigation around these obstacles, the researchers performed chronic neural recordings in the free-flying animals.

The results show that the bats adjusted their flight and sonar behavior in response to the echoes coming from the objects in the room. The objects' positions were changed across recording sessions, and the bats started each flight from a different starting point, to ensure that they did not rely on spatial memory from previous sessions and used only echo feedback to navigate.

The most important finding was the identification of the brain region that helped the animals to locate the objects in their environment. Echolocation signals were processed in the superior colliculus, a structure located in the midbrain. The superior colliculus consists of several layers that respond to different kinds of stimuli. Deeper layers of the superior colliculus are known to be involved in the processing of visual information. Thus, it seems that echolocation may indeed help animals to obtain a picture of their environment that is as authentic as the picture received through visual channels.

Echolocation and Vision in Humans

According to scientists, echolocation is not a phenomenon completely alien to humans. It seems that some blind individuals can be trained in echolocation.

Using this technique, they can locate objects by generating mouth clicks and listening to their echoes. The returned echoes can provide important information, such as the position, distance, and even the size or shape of objects.

Several studies have been conducted to determine the neural mechanisms underlying human echolocation. One study investigated two individuals skilled in echolocation, one early-blind and one late-blind. The authors measured brain activity in both participants while they listened to their own echo sounds, comparing the activity evoked by clicks that produced echoes with the activity evoked by control sounds that did not.

It turned out that the processing of echo sounds activates brain regions that are typically associated with vision rather than hearing. More specifically, echo signals were processed in the visual cortex, rather than in the superior colliculus as in bats and other echolocating animals. This fits with the fact that visual processing in humans is centered on the visual cortex rather than the superior colliculus, as the human visual cortex has expanded significantly compared with that of most animals.

Thus, in both animals and humans, the information received through echolocation is processed in those regions that are also predominantly responsible for the processing of visual information. The curious examples of human echolocation are a perfect illustration of the plasticity of our brain and its ability to adapt to changing circumstances (blindness in this case).

A recent publication on echolocation in humans reviewed the applications of this phenomenon as well as the processes occurring in the brains of echolocation experts (i.e., individuals who are skilled in echolocation). The authors reported that echolocation may enable blind people to sense small variations in the location, size, and shape of objects, or even to distinguish the different materials objects are made of, simply by listening to the echoes of their own mouth clicks.

It seems that echolocation can be perfected by blind individuals to facilitate daily tasks and achieve a higher degree of independence. Based on neuroimaging studies, the review confirmed that the processing of input signals from echoes activates the visual cortex, the brain region that would normally support vision in the sighted brain.

References

Watwood, S.L., Miller, P.J., Johnson, M., Madsen, P.T., Tyack, P.L. (2006). Deep-diving foraging behaviour of sperm whales (Physeter macrocephalus). Journal of Animal Ecology. 75(3): 814-825. DOI: 10.1111/j.1365-2656.2006.01101.x

Kothari, N.B., Wohlgemuth, M.J., Moss, C.F. (2018). Dynamic representation of 3D auditory space in the midbrain of the free-flying echolocating bat. Elife. pii: e29053 doi: 10.7554/eLife.29053.

Thaler, L., Arnott, S.R., Goodale, M.A. (2011). Neural correlates of natural human echolocation in early and late blind echolocation experts. PLoS One.  6(5): e20162. doi: 10.1371/journal.pone.0020162.

Thaler, L., Goodale, M.A. (2016). Echolocation in humans: an overview. Wiley interdisciplinary reviews. Cognitive science. 7(6): 382-393. doi: 10.1002/wcs.1408.

Image via Sweetaholic/Pixabay.

The Brain and Genetics in Addictive Behavior
Wed, 22 Aug 2018

Addictive behavior is a major global health concern. Addiction is commonly defined as the repetitive use of substances, or a repetitive pattern of behaviors, that is harmful. Addiction is believed to be a brain disorder, meaning that it is caused by the impact of drugs or other addictive substances/influences on the brain, and it can be modified by different environmental factors.

The presence of specific variants of some genes may increase or decrease the chances of developing an addiction. According to scientists, genetic factors may play an important role in determining both the vulnerability to addiction and the response to treatments aimed at curing it.

The Brain and Genetics in Addictive Behavior

One group of researchers demonstrated that polymorphisms in the genes encoding opioid receptors and opioid ligands, and more specifically in the mu-opioid receptor gene (OPRM1), are associated with drug addiction. One variant of this gene has been found to contribute to alcoholism and heroin addiction.

Another study has found that carriers of the same gene variant experience a more pronounced sensitivity to pain and decreased analgesic response to opioids, which means that they require higher doses of morphine in the management of pain, such as pain associated with cancer.

Heroin and Opioids

Some authors have underlined that addiction to MOPR agonists, including heroin, has become epidemic in the 21st century. According to the same authors, the endogenous opioid system interacts with other neurotransmitter systems in the brain. More precisely, opioid receptors regulate the release of dopamine and serotonin, neurotransmitters with important roles in mood.

Additionally, the opioid system seems to interact with noradrenergic, GABAergic, and glutamatergic pathways, as well as with neural growth factors. Furthermore, these studies indicate that chronic exposure to opiates alters gene expression in the brain, and this causes long-term changes in neuronal networks.

Activation of opioid receptors leads to changes in the expression of genes in the above-mentioned neurotransmitter pathways. Variations in the genes of those pathways may determine whether someone is more prone to the development of opiate addiction.

Cocaine

One very recent study published this year revealed that the use of cocaine and cocaine addiction are associated with certain variants of the glucocorticoid receptor gene, along with lower expression of this gene.

More specifically, the authors of the study compared the expression level of this gene in chronic cocaine users and healthy controls. Apart from significantly lower gene expression in the cocaine users, the carriers of some gene variants were at increased risk of cocaine addiction. In addition, they scored higher on depression scales.

Dopamine

It seems that the major challenge in understanding and treating addictive disorders is explaining why some individuals develop an addiction while others do not. According to research findings, whether or not a person becomes addicted depends largely on genetic factors, which contribute roughly 50% of the risk for addiction.

The processes in our brain are also important. It is well established that the rewarding effects of drugs and addictive substances in general are based on their ability to increase the level of dopamine in the brain. Accordingly, imaging studies have shown that individual variations in brain circuits modulated by dopamine, including circuits involved in reward, contribute to the inter-individual variability in vulnerability to addiction. Thus, the roles of genetic factors and brain pathways in addiction may be intertwined via dopamine.

Alcohol and tobacco

Apart from drug addiction, addiction to alcohol seems to depend on genetics and family background. A recent study investigated the effects of parental drinking on alcohol use in young adults. More than 3,500 adolescents and their parents were included in this prospective research. The results indicate that young adults whose parents are moderate or heavy drinkers are more prone to alcohol consumption than those whose parents don't consume alcohol or consume it in low quantities.

Alcohol and tobacco use represent leading global health risks, responsible for 3.3 million and 6 million premature deaths per year, respectively, according to the World Health Organization. Genetic factors are estimated to contribute 40–60% and 40–85% to the development of alcohol and tobacco addictions, respectively.

Finding out which genetic variants are associated with alcohol and tobacco addiction would be an important step toward understanding the underlying mechanisms and developing effective therapies. In recent years, genome-wide association studies (GWAS) have been performed to elucidate the role of certain genes and their variants in alcohol and tobacco use. These studies have shown that single nucleotide polymorphisms (SNPs, i.e., gene variants differing in a single nucleotide) are important for the development of these addictions.

For alcohol addiction, these SNPs include polymorphisms in the KLB gene, as well as in the alcohol dehydrogenase gene cluster, where different gene variants influence the metabolism of alcohol differently. For tobacco use, the most evident variations have been detected in the nicotinic acetylcholine receptor subunit gene cluster.

However, only a small number of common genetic variants have been studied, and they account for a modest proportion of alcohol and nicotine addiction heritability. Thus, further investigation into the role of low-frequency and rare genetic variants is required to fully understand the heritability of both alcohol and tobacco use.

In Sum

Considering all of these facts together, it seems that genetics does indeed play an important role in the development of addiction to substances such as opioids, alcohol, and tobacco. Some of these factors include differences in gene expression and gene variants in brain neurotransmitter pathways. Further research is needed in order to develop working strategies for treating heritable addictions.

References

Kreek, M.J., Levran, O., Reed, B., Schlussman, S.D., Zhou, Y., Butelman, E.R. (2012). Opiate addiction and cocaine addiction: underlying molecular neurobiology and genetics. Journal of Clinical Investigation. 122(10): 3387-3393. DOI: 10.1172/JCI60390

Shi, Q., Cleeland, C.S., Klepstad, P., Miaskowski, C., Pedersen, N.L. (2010). Biological pathways and genetic variables involved in pain. Quality of Life Research. 19(10): 1407–1417. doi: 10.1007/s11136-010-9738-x

Reed, B., Butelman, E.R., Yuferov, V., Randesi, M., Kreek, M.J. (2014). Genetics of opiate addiction. Current Psychiatry Reports. 16(11): 504. doi: 10.1007/s11920-014-0504-6

Schote, A.B., Jäger, K., Kroll, S.L., et al. (2018). Glucocorticoid receptor gene variants and lower expression of NR3C1 are associated with cocaine use. Addiction Biology. doi: 10.1111/adb.12632

Volkow, N.D., Wang, G.J., Fowler, J.S., Tomasi. D. (2012). Addiction circuitry in the human brain. Annual Review of Pharmacology and Toxicology. 52: 321-336. DOI: 10.1146/annurev-pharmtox-010611-134625

Mahedy, L., MacArthur, G.J., Hammerton, G., et al. (2018). The effect of parental drinking on alcohol use in young adults: the mediating role of parental monitoring and peer deviance. Addiction. doi: 10.1111/add.14280

Marees, A.T., Hammerschlag, A.R., Bastarache, L., et al. (2018). Exploring the role of low-frequency and rare exonic variants in alcohol and tobacco use. Drug and Alcohol Dependence. 188: 94-101. doi: 10.1016/j.drugalcdep.2018.03.026

Glioblastoma Treatment: New Promising Developments
Wed, 15 Aug 2018

Cancer is one of the leading causes of mortality globally. Its growing prevalence has to do with both our changing environment and our longer lifespans. It is said that if all humans lived for 200 years, most would develop cancer at some point in life. Thus, finding a cure for cancer is vital for longevity.

Cancer is the unregulated growth of one or another type of cell. In the case of glioblastoma, it is the unchecked growth of glial cells in the brain. Fortunately, brain cancers are less common than cancers in other parts of the body. However, when they occur, they are difficult to treat.

Glial cells are a group of cells with varied functions in the brain. In the past, they were thought to be merely glue cells that keep neurons in place. It is now understood that glial cells have many other functions and help maintain brain health: they handle cleaning, repair, and maintenance, and also play a role in local immunity.

Glioblastoma is a rare form of cancer, occurring in about 10 in 100,000 people. Among brain cancers, however, glioblastomas account for 15% of all cases, and it is one of the most aggressive types. Even with the best treatment, most patients barely survive one and a half years. A very small number of patients live longer than three years, and less than 5% live up to 5 years.

Despite progress in the treatment and management of other forms of cancer, only a few positive developments in glioblastoma treatment have been reported in recent decades. For more than half a century, the treatment of brain cancers has been based on surgery and chemotherapy: the surgeon removes the affected part of the brain, and the remaining cancer cells are then suppressed with highly toxic chemotherapeutic agents. This approach has not worked well for glioblastoma.

These days, thanks to advances in personalized medicine and a better understanding of cellular physiology and genetics, newer cancer treatments are being developed. Instead of mere surgical removal or the use of toxic drugs, researchers are learning to control various cellular mechanisms. Using these novel methods, it is now possible to force a brain tumor into remission or to train the brain's immune system to work against the cancer.

Exploiting what forces cancer into remission

Genes and various other factors control every part of cellular life. Cells grow in a particular fashion, behave in a specific way, and, when required, go through programmed cell death. One such form of programmed cell death, triggered when cells detach from their surrounding matrix, is called anoikis. Further, to keep the brain functioning, glial cells also engage in autophagy, a process that helps keep the brain clear of debris and unnecessary components.

Autophagy can be both protective and dangerous to brain cells. In cancerous glioma stem cells, protective autophagy helps the cells resist anoikis, and this protective autophagy is now known to be regulated by the MDA-9/Syntenin gene. Researchers have found that when MDA-9/Syntenin is blocked, glioma stem cells lose this protection and succumb to programmed death (i.e., anoikis).

Further, researchers found that MDA-9/Syntenin sustains epidermal growth factor receptor (EGFR) signaling in these cells. Excessive EGFR signaling has been associated with glioblastoma, and it is EGFR signaling that keeps autophagy at protective, non-lethal levels.

When the MDA-9/Syntenin gene is blocked, EGFR can no longer keep autophagy at protective levels, and more widespread death of cancer cells ensues. Researchers now want to use this feature in a controlled manner to kill the cancer cells.

At present, this approach is being tested in laboratory tumor-cell models and in mouse models, and the initial results seem encouraging: suppression of the MDA-9/Syntenin gene in mouse models led to higher survival rates.

Researchers are confident that in the near future they will be able to find safer ways to suppress the MDA-9/Syntenin gene in humans and then start trials on people diagnosed with glioblastoma.

Training the immune system to destroy cancer cells

This is another approach to the treatment of various types of cancer. Clinical research has already shown the efficacy of this approach in glioblastoma and many other cancers. It involves creating an individualized, personal vaccine for each patient with glioblastoma, and the initial human trials have already shown better survival rates.

When it comes to creating a vaccine against cancer, one size does not fit all. The reason is simple: each cancer patient differs genetically, because cancers develop through different mutations in different individuals. This means that all patients with glioblastoma have slightly different tumors, so a single vaccine would not work.

The answer to these patient-specific cancerous mutations is a personalized vaccine called DCVax-L. The vaccine is created by extracting cancer cells from the individual and then training the dendritic cells of the immune system to fight those specific mutations or cancer cells. Each patient who gets this vaccine first goes through the traditional surgical and chemotherapeutic treatments. Upon completing these treatments, the patient receives the vaccine created specifically for him or her, to prolong life.

The good news is that this method is already in the last stages of development, and it has shown excellent results in a randomized, blinded clinical trial in which the DCVax-L vaccine was given to 232 patients at various sites. The trial has not yet ended, but the initial results clearly show a higher survival rate. Around 30% of those who got the vaccine survived more than 30 months, and one-fourth survived more than 36 months. At present, 32.6% of those enrolled in the trial are still alive, with an expected survival of between 46.5 and 88.2 months. The vaccine is not yet a curative treatment, but it provides serious benefits compared with traditional methods.
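As a quick sanity check, the percentages above translate into approximate patient counts. These are back-of-the-envelope figures derived only from the numbers quoted here, not data taken from the trial report itself:

```python
# Rough patient counts implied by the reported DCVax-L trial percentages.
enrolled = 232

survived_30_months = round(0.30 * enrolled)   # "around 30%" survived > 30 months
survived_36_months = round(0.25 * enrolled)   # "one-fourth" survived > 36 months
still_alive = round(0.326 * enrolled)         # 32.6% still alive at publication

print(survived_30_months, survived_36_months, still_alive)
```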

It has taken more than two decades to perfect the vaccine, and researchers believe that things will only get better in the future. In the very near future, we can expect to see much higher 5-year survival rates for glioblastoma patients, and that would already mark a big success.

References

D’Agostino, P. M., Gottfried-Blackmore, A., Anandasabapathy, N., & Bulloch, K. (2012). Brain dendritic cells: biology and pathology. Acta Neuropathologica, 124(5), 599–614. https://doi.org/10.1007/s00401-012-1018-0

Hanif, F., Muzaffar, K., Perveen, K., Malhi, S. M., & Simjee, S. U. (2017). Glioblastoma Multiforme: A Review of its Epidemiology and Pathogenesis through Clinical Presentation and Treatment. Asian Pacific Journal of Cancer Prevention: APJCP, 18(1), 3–9. https://doi.org/10.22034/APJCP.2017.18.1.3

Jäkel, S., & Dimou, L. (2017). Glial Cells and Their Function in the Adult Brain: A Journey through the History of Their Ablation. Frontiers in Cellular Neuroscience, 11. https://doi.org/10.3389/fncel.2017.00024

Liau, L. M., Ashkan, K., Tran, D. D., Campian, J. L., Trusheim, J. E., Cobbs, C. S., … Bosch, M. L. (2018). First results on survival from a large Phase 3 clinical trial of an autologous dendritic cell vaccine in newly diagnosed glioblastoma. Journal of Translational Medicine, 16, 142. https://doi.org/10.1186/s12967-018-1507-6

Talukdar, S., Pradhan, A. K., Bhoopathi, P., Shen, X.-N., August, L. A., Windle, J. J., … Fisher, P. B. (2018). MDA-9/Syntenin regulates protective autophagy in anoikis-resistant glioma stem cells. Proceedings of the National Academy of Sciences, 115(22), 5768–5773. https://doi.org/10.1073/pnas.1721650115

Walid, M. S. (2008). Prognostic Factors for Long-Term Survival after Glioblastoma. The Permanente Journal, 12(4), 45–48. PMID: 21339920

Image via VSRao/Pixabay.

Virtual Reality for Reducing Pain /2018/08/09/virtual-reality-for-reducing-pain/ /2018/08/09/virtual-reality-for-reducing-pain/#respond Thu, 09 Aug 2018 15:30:41 +0000 /?p=23763 Falling Through the Cracks in Pain Management

Pain and Opioids

Chronic pain is debilitating, and it can cause patients to “fall through the cracks” of a health care system that struggles to create “nets” to catch them. Opioids are widely prescribed to treat chronic pain, yet they often fail to address the patient’s underlying medical condition, and over time patients tend to grow dissatisfied with the results.

Also, there is a good chance that some of these pain medications will be abused. In fact, opioid pain medicines are central to the United States opioid crisis. According to the Centers for Disease Control and Prevention (CDC), more than 115 people in the United States die every day after overdosing on opioids. (CDC/NCHS, 2017) These opioids include prescription drugs, heroin, and synthetic opioids such as fentanyl.

It is estimated that the total economic burden of prescription opioid misuse in the United States is $78.5 billion a year, which includes the costs of health care, lost productivity, addiction treatment, and criminal justice involvement. (Florence, Zhou, Luo, & Xu, 2016)

Pain and Mindfulness Meditation

Other forms of therapy have been introduced to manage chronic pain.

One such therapy is mindfulness meditation, defined as “the intentional self-regulation of attention from moment to moment”. (Goleman & Schwartz, 1976) The method has been in use for quite some time. In a study of mindfulness meditation, Dr. Kabat-Zinn reported that 65% of patients exhibited a reduction in pain of more than 33%, and about 50% of patients reported a reduction in pain of 50%, over a 10-week period of therapy. (Kabat-Zinn, 1982)

Some studies of mindfulness meditation report patients developing strong feelings of anger toward their pain condition, while others report some anxiety during mindfulness therapy. la Cour and Petersen point out that meditation therapy involves a learning curve before the patient can access the more important personal “inner space”. (la Cour & Petersen, 2015) For some patients this can be an exciting experience of discovery, while others may see it as a constant battle that is itself painful.

The next question, therefore, is which patients benefit the most from mindfulness meditation and which do not, and then what other therapies can be used for the latter group.

Enter Virtual Reality: Pain and Attention

Recall a recent injury. Ever wonder why, after a trauma or injury occurs, there seems to be a delay before actual pain is produced? Pain must first gain access to consciousness, and it demands central attentional resources by interrupting all other current brain processes, such as worry, fear, or desire. It does so easily because of its noxious nature. (Eccleston, 1995) Pain, therefore, can be considered a controlled task. To overcome pain, another task which demands higher controlled attention must be pitted against it.

The characteristics of pain such as intensity, quality, and/or pattern affect the probability of capturing attention. In chronic pain, for example, the characteristics of the pain and its intensity are important for pain processing. This may explain the reason why there are “good” days and “bad” days for patients with sciatica, multiple sclerosis, and other causes of chronic pain. Persistent pains with unpredictable sensory qualities that fluctuate in intensities are more likely to be processed. (Eccleston, 1995)

Finding the perfect distractor, one able to interrupt the processing of a persistent pain stimulus, is key to coping. Virtual reality (VR) systems offer computer-generated sensory inputs involving sight, sound, and touch. These inputs are difficult for the brain to ignore, especially if the VR program is immersive. Immersive VR gives the patient a convincing illusion of being present in the virtual world, and the strength of that illusion reflects the amount of attention drawn into the virtual environment. (Hoffman, Doctor, Patterson, Carrougher, & Furness III, 2000)

Virtual reality may not replace conventional pain management anytime soon: once the patient comes out of VR, the pain soon returns. Pharmacologic therapy remains the mainstay of pain management, but it also remains a challenge. Undermedication is one cause of pain management failure, while higher doses of opioids pose serious risks such as respiratory failure and encephalopathy. The best application of VR-based pain relief may therefore be procedural pain management, such as minor surgical procedures, wound cleaning and debridement, and escharotomy in burn victims.

References

CDC/NCHS. (2017). National Vital Statistics System, Mortality. (US Department of Health and Human Services, CDC) Retrieved May 21, 2018, from CDC Wonder, Atlanta GA: https://wonder.cdc.gov/

Eccleston, C. (1995). Chronic pain and distraction: an experimental investigation into the role of sustained and shifting attention in the processing of chronic persistent pain. Behav Res Ther, 33(4), 391-405. doi:10.1016/0005-7967(94)00057-Q

Florence, C., Zhou, C., Luo, F., & Xu, L. (2016). The economic burden of prescription opioid overdose, abuse, and dependence in the United States, 2013. Med Care, 54(10), 901-906. doi:10.1097/MLR.0000000000000625

Goleman, D., & Schwartz, G. (1976). Meditation as an intervention in stress reactivity. J Consult Clin Psychol, 44, 456-466. doi:10.1037/0022-006X.44.3.456

Hoffman, H. G., Doctor, J. N., Patterson, D. R., Carrougher, G. J., & Furness III, T. A. (2000, March 1). Virtual reality as an adjunctive pain control during burn wound care in adolescent patients. Pain, 85(1-2), 305-309. doi:10.1016/S0304-3959(99)00275-4

Kabat-Zinn, J. (1982). An outpatient program in behavioral medicine for chronic pain patients based on the practice of mindfulness meditation: theoretical considerations and preliminary results. Gen Hosp Psych, 4, 33-47. doi:10.1016/0163-8343(82)90026-3

la Cour, P., & Petersen, M. (2015, April 2). Effects of mindfulness meditation on chronic pain: a randomized control trial. Pain Medicine, 16(4), 641-652. doi:10.1111/pme.12605

Image via Pexels/Pixabay.

]]> /2018/08/09/virtual-reality-for-reducing-pain/feed/ 0 How the Brain Perceives Colors? /2018/07/23/how-the-brain-perceives-colors/ /2018/07/23/how-the-brain-perceives-colors/#comments Mon, 23 Jul 2018 15:30:02 +0000 /?p=23848 Color vision is the ability to distinguish different wavelengths of electromagnetic radiation. It relies on a brain perception mechanism that treats light of different wavelengths as different visual stimuli (i.e., colors). Color-insensitive photoreceptors (the rods in human eyes) react only to the presence or absence of light and do not distinguish between specific wavelengths.

We can argue that colors are not real—they are “synthesized” by our brain to distinguish light of different wavelengths. While rods give us the ability to detect the presence and intensity of light (and thus allow our brain to construct a picture of the world around us), specific detection of different wavelengths through independent channels gives our view of the world additional resolution. For instance, red and green look like near-identical shades of grey in black-and-white photos.

An animal with black-and-white vision alone cannot distinguish between, say, a green and a red apple, and won’t know from color alone which one tastes better before trying them both. Evolutionary biologists believe that human ancestors developed color vision to facilitate the identification of ripe fruits, which would obviously provide an advantage in the competitive natural world.

Why certain wavelengths are paired with certain colors remains a mystery. Technically, color is an illusion created by our brain. Therefore, it is not clear if other animals see colors the same way we see them. It is likely that, due to shared evolutionary history, other vertebrates see the world colored similarly to how we see it. But color vision is quite common across the vast animal kingdom: insects, arachnids, and cephalopods are able to distinguish colors.

What kind of colors do these animals see?

Human color vision relies on three types of photoreceptors that detect primary colors—red, green, and blue. However, some people lack red photoreceptors (making them “dichromats”), while others have an additional photoreceptor that detects light somewhere between red and green (“tetrachromats”). Notably, having only three types of photoreceptors does not limit us to seeing only three colors.

Each photoreceptor can absorb a rather broad range of wavelengths of light. To distinguish a specific color, the brain compares and quantitatively analyses the data from all three photoreceptors. And our brain does this remarkably successfully—some research indicates that we can distinguish colors that correspond to wavelength differences of just 1 nanometer.
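This comparison step can be illustrated with a toy model. The Gaussian sensitivity curves, peak wavelengths, and bandwidth below are illustrative assumptions, not measured physiological data; the point is only that two nearby wavelengths that each cone alone barely tells apart still produce distinct *patterns* across the three cone types:

```python
import math

# Toy model (illustrative assumptions, not physiological data): each cone
# type has a broad, bell-shaped sensitivity curve around a peak wavelength.
# Approximate peak sensitivities for human L, M, S cones, in nanometers.
CONE_PEAKS = {"L": 560.0, "M": 530.0, "S": 420.0}
BANDWIDTH = 50.0  # assumed width of each sensitivity curve, in nm

def cone_response(wavelength_nm, peak_nm, width=BANDWIDTH):
    """Relative response of one cone type to monochromatic light."""
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2 * width ** 2))

def encode(wavelength_nm):
    """The triplet of cone responses that the brain compares."""
    return {name: round(cone_response(wavelength_nm, peak), 4)
            for name, peak in CONE_PEAKS.items()}

# Each cone's broad curve responds similarly to 540 nm and 550 nm,
# but the pattern across all three cones differs, so the two
# wavelengths remain distinguishable when the signals are compared.
print(encode(540.0))
print(encode(550.0))
```

In this sketch, 540 nm light excites the M cone slightly more than 550 nm light does, while the L cone shows the opposite ordering; it is this cross-channel comparison, rather than any single receptor's output, that identifies the wavelength.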

This scheme works in largely the same way in most higher vertebrates that have color vision, although the ability to distinguish specific shades varies significantly between species, with humans having one of the best color-distinguishing abilities.

However, invertebrates that developed color vision (and vision in general) completely independently from us demonstrate remarkably different approaches to color detection and processing. These animals can have an exceptionally large number of color receptors. The mantis shrimp, for instance, has 12 different types of photoreceptors. The common bluebottle butterfly has even more: 15 receptors.

Does it mean that these animals can see additional colors unimaginable to us? Perhaps yes. Some of their photoreceptors operate in a rather narrow region of light spectrum. For instance, they can have 4-5 photoreceptors sensitive in the green region of the visual spectrum. This means that for these animals the different shades of green may appear as different as blue and red colors appear to our eyes! Again, the evolutionary advantages of such adaptations are obvious for an animal living among the trees and grasses where most objects, as we see them, are colored in various shades of green.

Researchers have tested whether a more complicated set of visual receptors provides any advantage when it comes to distinguishing between main colors. The findings show that this is not necessarily the case, at least not for the mantis shrimp. Despite an impressive array of receptors detecting light in a much broader part of the electromagnetic spectrum than humans can, the shrimp’s ability to distinguish between colors is not that great in comparison to ours. However, they determine colors quickly. This is probably more important for practical purposes, as mantis shrimps are predators. A large number of photoreceptors allows specific wavelengths of light to activate specific receptors quickly, communicating directly to the brain which wavelength was detected. In comparison, humans have to assess and quantify the signals from all three photoreceptors to decide on a specific color, which requires more time and energy.

Apart from employing a different number of photoreceptors to sense light of specific wavelengths, some animals can detect light that we humans are completely unable to see. For example, many birds and insects can see in the UV part of the spectrum. Bumblebees, for instance, have three photoreceptors absorbing in the UV, blue, and green regions of the spectrum. This makes them trichromates, like humans, but with the spectral sensitivity shifted to the blue end of the spectrum. The ability to detect UV light explains why some flowers have patterns visible only in this part of the spectrum. These patterns attract pollinating insects, which have an ability to see in this spectral region.

A number of animals can detect infrared light (long-wavelength radiation) emitted by heated objects and bodies. This ability significantly facilitates hunting for snakes, which usually look for small warm-blooded prey; sensing prey through IR-detecting receptors is thus a great tool for slow-moving reptiles. The photoreceptors sensitive to IR radiation in snakes are located not in their eyes but in “pit organs” between the eyes and nostrils. The result is still the same: snakes can “color” objects according to their surface temperature.

As this brief article shows, we humans can see and analyze only a small portion of the visual information available to other creatures. Next time you see a humble fly, think about how differently it perceives the same things you are both looking at!

References:

Skorupski P, Chittka L (2010) Photoreceptor Spectral Sensitivity in the Bumblebee, Bombus impatiens (Hymenoptera: Apidae). PLoS ONE 5(8): e12049. doi: 10.1371/journal.pone.0012049

Thoen HH, How MJ, Chiou TH, Marshall J. (2014) A different form of color vision in mantis shrimp. Science 343(6169):411-3. doi: 10.1126/science.1245824

Chen P-J, Awata H, Matsushita A, Yang E-C and Arikawa K (2016) Extreme Spectral Richness in the Eye of the Common Bluebottle Butterfly, Graphium sarpedon. Front. Ecol. Evol. 4:18. doi: 10.3389/fevo.2016.00018

Arikawa, K., Iwanaga, T., Wakakuwa, M., & Kinoshita, M. (2017) Unique Temporal Expression of Triplicated Long-Wavelength Opsins in Developing Butterfly Eyes. Frontiers in Neural Circuits, 11, 96. doi: 10.3389/fncir.2017.00096

Image: https://pixabay.com/en/butterfly-3d-blue-mushroom-forest-2049567/

]]>
/2018/07/23/how-the-brain-perceives-colors/feed/ 1
Asperger’s Syndrome: Hallmark of Genius, or Just Another Form of Autism? /2018/06/28/aspergers-syndrome-hallmark-of-genius-or-just-another-form-of-autism/ /2018/06/28/aspergers-syndrome-hallmark-of-genius-or-just-another-form-of-autism/#respond Thu, 28 Jun 2018 16:20:41 +0000 /?p=23664 Isaac Newton, Albert Einstein, Charles Darwin—what unites these three exceptional individuals? It is widely accepted that all three were geniuses, but there is something else. These days, neuroscientists believe that all three suffered from a specific neurological disorder called Asperger’s syndrome.

The whole definition of the term “neurological disorder” implies that something is going wrong in the brain. However, there is a growing recognition of the fact that when it comes to the processes in our brain, “going wrong” does not necessarily mean “going bad”. Our brain is too complicated a mechanism to be interpreted in simplistic terms. Some neurological disorders produce a peculiar state of mind often associated with high artistic and scientific achievements.

Asperger’s Syndrome (AS) is a developmental and neurological disorder that is often associated with symptoms of social withdrawal, motor clumsiness, and impaired communication skills. The Diagnostic and Statistical Manual of Mental Disorders (DSM-5) classifies AS in the same category as Autism Spectrum Disorder (ASD). It is often referred to as “high-functioning” autism (HFA), as individuals with AS are more intellectually capable and show less severe abnormalities than ASD subjects.

The story of AS and autism started in the 1940s, when two Austrian-born physicians, Leo Kanner and Hans Asperger, each described a syndrome observed in some children, with the unique characteristics of social isolation, impaired communication skills, and restrictive and obsessive interests. Both used the term ‘autistic’ in their reports. While Kanner’s report was published right away in 1943, Asperger’s was written in German and remained largely undiscovered until 1991, when it appeared in English in Uta Frith’s book Autism and Asperger Syndrome.

Research and publications on Asperger’s syndrome reached their peak during 2000-2012. Different research groups proposed sets of criteria for AS diagnosis. While several of these criteria overlap, the WHO’s International Classification of Diseases (ICD-10) set the following key characteristics as diagnostic for Asperger’s syndrome:

  • Qualitative social impairment, involving dysfunctional social adaptivity, impaired non-verbal communication, and a lack of social reciprocity.
  • A restrictive pattern of interests, motor clumsiness, repetitive behavior, and extreme obsessiveness about specific rituals.
  • Age-appropriate, normal cognitive and linguistic development.

Interestingly, the father of Asperger Syndrome, Hans Asperger, described AS patients to be characteristically distinct from ASD subjects. He characterized them as intellectually-able, abstract-loving, and even overachieving in some specific cognitive domains.

Decades after Asperger’s observations, recent studies have found that AS patients often demonstrate high verbal IQ and strong grammatical skills and often outperform others in fluid reasoning, although they are reported to show delayed reaction times and poorer performance IQ, specifically in symbol coding and processing speed.

Surprisingly, AS is more common than classical autism. Epidemiological surveys report that about 4 out of every 10,000 children are autistic, whereas about 25 out of 10,000 children are diagnosed with AS. AS is also more common in boys than in girls. No scientific explanation for this observation exists at present.

Like with many other syndromes, no single specific cause is responsible for AS. Rather, a milieu of factors is associated with its development.

Children diagnosed with AS show a genetic pattern, as in autism, where at least one parent (most often the father) is diagnosed with AS or at least has some hallmark AS characteristics. Relatives of AS children are also known to have anxiety- or depression-related disorders.

An important causal factor in the development of AS might be altered levels of neurotransmitters. In AS patients, higher N-acetylaspartate/choline ratios and increased dopamine levels have been reported, suggesting an overall altered dopaminergic neurotransmission in major areas of the brain. Intranasal administration of oxytocin, a neuropeptide, was shown to improve facial emotion recognition in AS patients.

Apart from alterations in neurotransmitter levels, neuroimaging studies show that there are structural changes in major areas of the brain that could be associated with the development of Asperger’s Syndrome. Altered grey and white matter volumes were observed in major brain regions, and an abnormal thickness of the hippocampus, amygdala, and anterior cingulate cortex was reported to be the major contributing factor for dysregulated cognitive functions in AS.

Some researchers also proposed that environmental factors can contribute indirectly to the development of AS. Viral or bacterial infection and smoking during pregnancy, in particular, increase the risk, although no concrete evidence supporting these views has been found.

Due to the substantial number of overlapping similarities between Asperger’s Syndrome and Autism, it is very easy to confuse one with the other.

Studies in recent decades have shown differences between AS and ASD on both quantitative and qualitative levels. AS subjects display age-appropriate or earlier verbal development, meticulous speech, a greater desire for social reciprocation, and richer imagination compared to ASD patients.

On a cognitive level, AS subjects are more perceptive and possess superior verbal performance and visual-spatial ability compared to ASD patients.

But the major limitation of these studies is the huge variability within the study groups and the contradictions in the data patterns: with age, the distinction between AS and ASD significantly diminishes. It is particularly difficult to distinguish AS from other disorders, as there are no known biomarkers specific to AS alone.

As there is insufficient evidence of distinguishing characteristics that could class the syndrome as ‘one biologically and clinically diagnosed entity’, the DSM-5 in 2013 revised and recategorized Asperger’s syndrome as another variant of ASD. Although this decision was criticized by parts of the scientific community, most researchers agreed on the need for more studies that could help distinguish AS from other ASDs.

The most common misconception about Asperger’s syndrome, and indeed about autism spectrum disorders in general, is that they develop because of poor parenting and a lack of bonding between parents and child. This idea was even given a name, the ‘refrigerator mother’, describing cold and distant parenting. The notion was challenged from the 1960s onward, as research on these neurological disorders grew and scientists found that it is not parenting but rather the genetic and neurological makeup of the child that is responsible for these syndromes. Even today, the belief that neurodevelopmental disorders are caused by a traumatic childhood is common. The reality, however, is more complicated than our guesses based on limited information.

References

Barahona-Correa, J. B. and C. N. Filipe (2015) A Concise History of Asperger Syndrome: The Short Reign of a Troublesome Diagnosis. Front Psychol 6: 2024. doi: 10.3389/fpsyg.2015.02024

Faridi, F. and R. Khosrowabadi (2017) Behavioral, Cognitive and Neural Markers of Asperger Syndrome. Basic Clin Neurosci 8(5): 349-359. doi: 10.18869/nirp.bcn.8.5.349

Weiss, E. M., B. Gschaidbauer, et al. (2017) Age-related differences in inhibitory control and memory updating in boys with Asperger syndrome. Eur Arch Psychiatry Clin Neurosci 267(7): 651-659. doi: 10.1007/s00406-016-0756-8

Image via BarbaraALane/Pixabay.

]]>
/2018/06/28/aspergers-syndrome-hallmark-of-genius-or-just-another-form-of-autism/feed/ 0
Are Dyslexics More Entrepreneurial? /2018/06/20/are-dyslexics-more-entrepreneurial/ /2018/06/20/are-dyslexics-more-entrepreneurial/#respond Wed, 20 Jun 2018 14:30:32 +0000 /?p=23742 Dyslexia is rather common: it is estimated that around 5-10% of individuals are dyslexic. Despite an apparent disability, some are famous, like Tom Cruise or Richard Branson. Obviously, they do not suffer from a lack of intelligence and are, in fact, quite successful in the business world. So what is going on in their brains? Are they developing some compensatory mechanisms that help them to do things better?

Epidemiological research studies indicate that dyslexics develop coping strategies to compensate for their weaknesses, which helps them in later life. The resilience that they acquire while in school often helps them to be more successful in developing a business, in being an entrepreneur.

Statistics show that there are twice as many dyslexics among entrepreneurs as in the general population. However, dyslexics are uncommon in higher management, and they tend to have a different business management style: they do better in startups and at handling particular types of businesses.

Dyslexia is usually first identified when a child goes to school and struggles with scrambled text. Dyslexic children have difficulty in reading texts, interpreting them, and explaining the meaning of the text to others, even though they can be very intelligent otherwise. Dyslexia often results in poor academic performance, undue pressure, and psychological trauma. Each dyslexic child needs to learn to cope with these challenges.

Although dyslexic children are as intelligent as their peers at school, they are often labeled as less capable. Children with dyslexia are often targets of bullying in school. Poor self-image at school often leads to worsening of self-esteem in many of these kids. As helping dyslexic children is not easy, they are often left to themselves.

What’s going on in the dyslexic brain? Neurological basis of dyslexia

As a common disorder, dyslexia is the subject of multiple studies. Researchers agree that those living with dyslexia may have brain differences relative to non-dyslexic individuals, and these differences are the subject of intense clinical research. The recent explosion in brain imaging technology is helping us gain a deeper understanding of the matter.

The neurological theory of dyslexia is one of the earliest. The theory was proposed about a century ago when British physicians Morgan and Hinshelwood described dyslexia as a “visual word blindness.”

The study of adults living with brain trauma in the left parietal region demonstrated that many of these people develop reading difficulties. They find it challenging to process the optical image of letters. Thus, the early theory was that those with dyslexia have developmental defects in the parietal region of the brain.

Left parietal involvement was also somewhat confirmed during pathological examination of the brains of those who died at an earlier age and were known to be dyslexic.

Another important theory focuses on delayed brain lateralization in dyslexia. It is thought that some people have weak or insufficient brain lateralization that hinders the understanding of languages. This theory was the subject of multiple studies in the second half of last century.

The latest research into the neurophysiology of those living with dyslexia seems to indicate that dyslexia is phonological in nature: dyslexics have difficulty manipulating the phoneme parts of speech. It is possible that developmental issues in the visual tract, or in other visual mechanisms in the brain, also contribute to the difficulty.

Apart from defects in a specific subsystem of the visual pathway, researchers think that other brain developmental issues are involved as well. It is entirely possible that people with dyslexia have a temporal processing impairment and therefore cannot process information fast enough. Thus, dyslexia is considered the result of multi-system deficits.

In conclusion

Dyslexia is probably the result of deficits in the brain at multiple levels. Impaired phoneme discrimination causes difficulty with spelling; visual perceptual impairment further worsens word recognition; and impaired phonological awareness causes speech disturbances. At the center of it all is delayed temporal processing. The end result is delayed speech development, difficulty in reading and comprehending texts, and poor academic performance.

What makes a dyslexic a successful person?

From Leonardo da Vinci to Einstein, people with learning disabilities prove that there is little link between disability and intelligence. Children with dyslexia are at least as intelligent as non-dyslexic children.

The higher success of individuals with dyslexia in certain professions is probably the result of resilience, or of compensatory mechanisms that they cultivate during their school days to overcome their difficulties.

Some of these kids may develop better skills for interacting with others. They may focus more on specific arts or sciences. Many of them may not concentrate on studies and instead start doing business at an early age. This means that they can be found in any profession, and in the long run they are equally successful.

The compensatory mechanisms developed at a young age may provide an edge over others in specific areas when the children grow up. Even though dyslexics may score poorly in school, they may outscore other children in practical life since they spend more time perfecting their verbal skills.

As entrepreneurs, dyslexics are known to be good at delegating tasks, excellent mentors, and often creative. These qualities usually make them more successful entrepreneurs, though they may be less suited to roles with little room for creativity.

Achieving success with dyslexia is perhaps about learning different skills, mastering different approaches to solving the tasks, and developing strategies to compensate for certain limitations.

References

Habib, M. (2000) The neurological basis of developmental dyslexia: An overview and working hypothesis. Brain, 123(12), 2373–2399. 10.1093/brain/123.12.2373

Locke, R., Scallan, S., Mann, R., & Alexander, G. (2015) Clinicians with dyslexia: a systematic review of effects and strategies. The Clinical Teacher, 12(6), 394–398. 10.1111/tct.12331

Logan, J. (2009) Dyslexic entrepreneurs: the incidence; their coping strategies and their business skills. Dyslexia, 15(4), 328–346. 10.1002/dys.388

Logan, J. (2018) Analysis of the incidence of dyslexia in entrepreneurs and its implications.

Toffalini, E., Pezzuti, L., & Cornoldi, C. (2017) Einstein and dyslexia: Is giftedness more frequent in children with a specific learning disorder than in typically developing children? Intelligence, 62, 175–179. 10.1016/j.intell.2017.04.006

Yu, X., Zuk, J., & Gaab, N. What Factors Facilitate Resilience in Developmental Dyslexia? Examining Protective and Compensatory Mechanisms Across the Neurodevelopmental Trajectory. Child Development Perspectives, 0(0). 10.1111/cdep.12293

Image via geralt/Pixabay.

]]>
/2018/06/20/are-dyslexics-more-entrepreneurial/feed/ 0
How Weather Influences the Brain? /2018/06/18/how-weather-influences-the-brain/ /2018/06/18/how-weather-influences-the-brain/#respond Mon, 18 Jun 2018 14:00:26 +0000 /?p=23740 We all know that the weather can strongly influence our mood and productivity. Many people feel better when the weather is nice and sunny. It is thus not surprising that people more often feel unhappy and depressed in winter. There is even a medical condition known as winter depression. Still, some researchers believe that our brain functions better during the cold days. In this article, I’ll briefly analyze what happens in our brain in relation to weather-related mood and mind changes.

Scientific studies indicate that weather conditions such as high temperature and humidity can impair mental performance by affecting brain neurochemistry. For instance, it is believed that thermal stress can cause cognitive impairment.

One recent study has investigated the impact of thermal stress on cognitive functions in soldiers spending at least one year in desert conditions. The evaluation of memory and cognitive functions indicated there is a decline in cognitive performance in hot climates when compared to normal weather. The decline was most pronounced for attention, concentration, verbal memory, and psychomotor performance.

Another recent study investigated the impact of sand and dust storms on children’s cognitive function. The authors used mathematical analysis and word-recognition test scores to evaluate how prenatal exposure to sand and dust storms affects children’s cognitive performance. They found a decline in both test scores, as well as a later onset of counting and of speaking in whole sentences, in children prenatally exposed to storms. The findings imply that this kind of weather jeopardizes the cognitive functions of the next generation.

However, results from scientific research on the effects of temperature on cognitive functions are quite mixed and contradictory.

One study investigated how temperature affects the cognitive performance of subjects with multiple sclerosis. Healthy subjects were included as controls. The researchers correlated cognitive status with temperature in both study groups. In patients with multiple sclerosis, unlike in healthy subjects, the higher temperatures were associated with worsening cognitive status. These findings confirmed that warmer outdoor temperatures lead to a higher incidence of clinical exacerbation and T2 lesion activity in subjects with this condition (T2 lesions represent the white spots observed by MRI that are used to diagnose and track the progress of multiple sclerosis).

With regard to cognitive functions in cold weather, studies have shown both impairments and improvements.

For instance, one study investigated the impact of cold exposure and subsequent rewarming on working memory and executive function in 10 young males. The results demonstrated a decline in test performance when the subjects were exposed to 10°C, and the impairment persisted for one hour into the rewarming period. Although the underlying mechanisms were not tested, the authors suggested that acute vascular changes in the brain could explain the observed changes. Another possible explanation, according to the authors, is a dysregulation of catecholamine levels, which are particularly important for complex attentional functions.

Other findings suggest that winter helps to wake up the mind and makes us think more clearly. It is well known that the brain uses glucose as its main energy source, so when glucose is depleted, brain function suffers. Glucose is also used to regulate body temperature, especially in extremely hot or cold conditions, and it appears that more energy is needed to cool the body down than to warm it up. Thus, warm temperatures are more likely to deplete glucose levels and thereby impair brain function and clarity of thinking.

It has been suggested that high temperatures increase the risk of mental disorders, especially in the elderly.

One recent study analyzed data on emergency admissions linked to mental diseases and daily temperatures over a period of more than 10 years in 6 different cities. The results indicated that high temperatures might jeopardize mental health and be responsible for the exacerbation of symptoms of mental diseases. For instance, according to the results, more than 30% of admissions for anxiety were attributable to hot temperatures. Exposure to hot temperatures triggers bodily reactions that may increase stress hormone levels and brain temperature. Additionally, extremely hot weather may dysregulate dopamine and serotonin levels (neurotransmitters important for the feeling of happiness).

According to widespread belief, weather can affect our mood. Although a lack of sunshine is commonly linked to seasonal depression, some researchers believe that not all individuals respond similarly to weather changes.

Research has linked individuals’ self-reported daily mood with the objective weather over a 30-day period. Large individual differences were found in how people react to the weather. Four distinct types of weather responders were identified: summer lovers (a better mood with warmer weather and more sun), summer haters (a worse mood with warmer weather and more sun), rain haters (a bad mood on rainy days), and the unaffected (no particular association between weather and mood). Interestingly, adolescents and their mothers were often the same type, suggestive of familial weather reactivity.

The analysis of both the scientific and the popular literature permits the conclusion that extreme weather conditions can affect our cognitive function and mood. Most likely, this is caused by a decline in the brain’s energy source (glucose), which is diverted to thermoregulation. It is also evident that extreme temperatures affect the levels of neurotransmitters such as dopamine and serotonin in the brain. Still, there seems to be individual variability in the brain’s response to weather, and it may run in families.

References

Saini, R., Srivastava, K., Agrawal, S., Das, R.C. (2017) Cognitive deficits due to thermal stress: An exploratory study on soldiers in deserts. Medical Journal Armed Forces India. 73(4): 370-374. doi: 10.1016/j.mjafi.2017.07.011

Li, Z., Chen, L., Li, M., Cohen, J. (2018) Prenatal exposure to sand and dust storms and children’s cognitive function in China: a quasi-experimental study. The Lancet. Planetary Health. 2(5): e214-e222. doi: 10.1016/S2542-5196(18)30068-8.

Leavitt, V.M., Sumowski, J.F., Chiaravalloti, N., Deluca, J. (2012) Warmer outdoor temperature is associated with worse cognitive status in multiple sclerosis. Neurology. 78(13): 964-968. doi: 10.1212/WNL.0b013e31824d5834.

Muller, M.D., Gunstad, J., Alosco, M.L., Miller, L.A., Updegraff, J., Spitznagel, M.B., Glickman, E,L. (2012) Acute cold exposure and cognitive function: evidence for sustained impairment. Ergonomics. 55(7): 792-798. doi: 10.1080/00140139.2012.665497.

Lee, S., Lee, H., Myung, W., Kim, E.J., Kim, H. (2018) Mental disease-related emergency admissions attributable to hot temperatures. Science of the Total Environment. 616-617: 688-694. doi: 10.1016/j.scitotenv.2017.10.260

Klimstra, T.A., Frijns, T., Keijsers, L., et al. (2011) Come rain or come shine: individual differences in how weather affects mood. Emotion. 11(6): 1495-1499. doi: 10.1037/a0024649.

Image via geralt/Pixabay.

Does Depression Accelerate Aging? /2018/06/11/does-depression-accelerate-aging/ /2018/06/11/does-depression-accelerate-aging/#respond Mon, 11 Jun 2018 13:00:25 +0000 /?p=23735 Research supports a clear association between depression (major depressive disorder in particular), oxidative stress, and accelerated aging.

Depression, major depressive disorder more specifically, is one of the most striking problems of modern society. Millions of people worldwide suffer from depression, with many patients not experiencing relief from symptoms. Depression is associated with increased mortality from age-related conditions, such as cardiovascular disease and cancer. Researchers have suggested that depression is associated with increased oxidative stress and a disturbed immune response, which may accelerate aging and increase susceptibility to age-related disorders.

One of the proven indicators of cellular aging is the length of telomeres. Telomeres are nucleoprotein complexes that cap the ends of chromosomal DNA and serve to protect chromosomal integrity. They become shorter with each round of replication and cell division, meaning that they normally become shorter with age. When telomeres reach a critically short length, the cells undergo apoptosis, i.e., programmed death. Leukocyte telomere length is typically used in clinical studies as a marker of cellular aging. Telomere shortening accelerates in cells subjected to oxidative stress.
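The replicative-shortening idea above can be sketched as a toy model. The numbers below (starting length, base pairs lost per division, critical length) are illustrative ballpark assumptions, not values from the studies cited, and oxidative stress is crudely modeled as a larger per-division loss:

```python
# Toy model of replicative telomere shortening. All numbers are
# illustrative assumptions: ~10 kb starting length, tens of base pairs
# lost per division, senescence/apoptosis near a ~4 kb critical length.
def divisions_until_senescence(start_bp=10_000, loss_per_division_bp=60,
                               critical_bp=4_000):
    """Count cell divisions before telomeres reach the critical length."""
    length, divisions = start_bp, 0
    while length - loss_per_division_bp >= critical_bp:
        length -= loss_per_division_bp
        divisions += 1
    return divisions

baseline = divisions_until_senescence(loss_per_division_bp=60)
# Oxidative stress is modeled here simply as a faster per-division loss.
stressed = divisions_until_senescence(loss_per_division_bp=90)
print(baseline, stressed)  # stressed cells reach senescence in fewer divisions
```

Under these assumptions, the "stressed" cell line exhausts its replicative capacity noticeably sooner, which is the qualitative pattern the telomere studies report in depressed subjects.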

Multiple studies, including several meta-analyses, have examined the association between leukocyte telomere length and major depressive disorder. For instance, one meta-analysis compared telomere length between depressed and healthy individuals and found significantly shorter telomeres in the groups with depression. A very recent prospective study, which included over 100 participants aged 18 to 70 with or without major depressive disorder, assessed telomere length at baseline and at a two-year follow-up. The authors concluded that individuals with major depressive disorder at baseline showed significantly greater telomere shortening over the two years, supporting the association between depression and accelerated aging.

Major depressive disorder is typically classified as a mental illness, but its pathology is evident in cells throughout the body. According to some researchers, several biological mediators that contribute to accelerated aging are dysregulated in this disorder. These changes affect genetic and epigenetic mediators (i.e., gene variants and their expression) and biochemical mediators such as glucocorticoids and neurosteroids. This can alter immune function, oxidative processes, and the levels of factors regulating glucose metabolism and insulin production.

It is evident that dysregulation of some of these biological mediators leads to oxidative stress, which is strongly correlated with the aging process. Oxidative damage occurs when the body cannot cope with psychological and physical stressors. In other words, oxidative stress refers to the excessive production of free radicals that cannot be completely neutralized by the body’s antioxidative mechanisms. Elevated markers of oxidative stress, along with decreased antioxidant capacity, have been reported in subjects with depression.

Oxidative damage is associated with the aging process, and markers of oxidative stress correlate with decreased activity of an enzyme called telomerase, which is responsible for extending the length of telomeres. When telomerase activity is low, telomeres shorten faster. Thus, the link between depression and accelerated aging can partly be explained by increased cellular oxidative stress.

Animal studies have also been conducted to elucidate the mechanisms underlying accelerated aging in major depressive disorder. For instance, in one study, researchers exposed rats to chronic mild stress in order to induce symptoms of major depressive disorder. The animals that developed these symptoms were found to have shorter telomeres and decreased telomerase activity, along with increased oxidative damage and decreased antioxidant enzyme activity. In addition, damaged mitochondria and reduced mitochondrial DNA content were reported in rats with depressive symptoms. This research provided clear cellular evidence of accelerated aging associated with major depressive disorder.

A group of researchers has proposed that early treatment (i.e., in the first half of life) of psychiatric disorders, including depression, could extend life expectancy and significantly reduce the burden of age-related disorders (such as cardiovascular disease, cerebrovascular disease, and cancer). They demonstrated that the persistence of a psychiatric disorder from the ages of 11 to 38 led to dose-dependent shortening of telomere length by the age of 38. Analyses of blood samples collected at the ages of 26 and 38 revealed accelerated erosion of telomeric ends in males diagnosed with a psychiatric disorder such as depression. Interestingly, there was no such association in females at the interim assessment at the age of 26. This research points to a link between psychiatric disorders and accelerated biological aging that may be particularly pronounced in men.

Recently, one study investigated the association between major depressive disorder and age-related changes in the basal ganglia. The basal ganglia are a set of subcortical structures involved in reward processing, which is often dysfunctional in subjects with major depressive disorder. Based on brain images from patients with depression and healthy controls, the authors assessed the grey matter volume of the different parts of the basal ganglia. They found a negative correlation between the size of the putamen (a region of the basal ganglia located at the base of the forebrain) and age. Importantly, this association was twice as strong in patients with major depressive disorder as in healthy subjects. The finding of a greater age-related volume decrease in depressed subjects suggests that major depressive disorder is associated with accelerated aging.

It seems that although various biochemical mediators contribute to the association between depression and accelerated aging, oxidative stress is the largest contributor to this phenomenon. Thus, cellular oxidative damage caused by various psychological and physical stressors most likely represents the underlying mechanism of depression-related accelerated aging.

References

Lin, P.Y., Huang, Y.C., Hung, C.F. (2016) Shortened telomere length in patients with depression: A meta-analytic study. Journal of Psychiatric Research. 76: 84-93. doi: 10.1016/j.jpsychires.2016.01.015

Vance, M.C., Bui, E., Hoeppner, S.S., et al. (2018) Prospective association between major depressive disorder and leukocyte telomere length over two years. Psychoneuroendocrinology. 90: 157-164. doi: 10.1016/j.psyneuen.2018.02.015

Wolkowitz, O.M., Reus, V.I., Mellon, S.H. (2011) Of sound mind and body: depression, disease, and accelerated aging. Dialogues in Clinical Neuroscience. 13(1): 25-39. PMID: 21485744

Xie, X., Chen, Y., Ma, L., Shen, Q., Huang, L., Zhao, B., Wu, T., Fu, Z. (2017) Major depressive disorder mediates accelerated aging in rats subjected to chronic mild stress. Behavioural Brain Research. 329: 96-103. doi: 10.1016/j.bbr.2017.04.022

Shalev, I., Moffitt, T.E., Braithwaite, A.W., et al. (2014) Internalizing disorders and leukocyte telomere erosion: a prospective study of depression, generalized anxiety disorder, and post-traumatic stress disorder. Molecular Psychiatry. 19(11): 1163-1170. doi: 10.1038/mp.2013.183

Sacchet, M.D., Camacho, M.C., Livermore, E.E., Thomas, E.A.C, Gotlib, I.H. (2017) Accelerated aging of the putamen in patients with major depressive disorder. Journal of Psychiatry and Neuroscience. 42(3): 164-171. PMID: 27749245

Image via Snap_it/Pixabay.

Acupuncture and Pain /2018/06/05/acupuncture-and-pain/ /2018/06/05/acupuncture-and-pain/#respond Tue, 05 Jun 2018 14:30:00 +0000 /?p=23733 Acupuncture is one of the oldest systems of traditional medicine. With roots in China, it is more than 2000 years old. It differs from many other medical systems in that it does not involve taking herbs or other substances; instead, it involves inserting small needles at specific predefined points. From the 20th century onwards, it has continued to gain prominence in the Western world, although many remain skeptical about its efficacy.

These days, researchers want to know the exact mechanism of action of any medical treatment, and they only trust a therapy if it has been proven effective in randomized placebo-controlled clinical trials. Acupuncture has been the subject of many such trials, and its usefulness has been established in certain pain-associated conditions, mood disorders, and even some diseases of internal organs. This has led to the recognition of acupuncture by many healthcare providers and medical organizations. In the US, acupuncture is recognized as complementary medicine, and the WHO recommends it for a number of specific medical conditions.

Clinical evidence of efficacy

Hundreds of clinical trials provide evidence of the effectiveness of acupuncture in various medical conditions. Among the most relevant are the so-called German mega-trials named ARC, ART, COMP, and GERAC, whose primary focus was pain relief in various musculoskeletal conditions. GERAC and ART were high-quality randomized clinical trials investigating low back pain. These trials compared real acupuncture with sham acupuncture and standard care: sham acupuncture involved needling without the use of conventional acupuncture points, while standard care included physiotherapy, exercise therapy, and NSAIDs (non-steroidal anti-inflammatory drugs). Outcomes were first measured after three months, and a response was defined as adequate if there was an improvement of more than 33% on the pain scale (the Von Korff Chronic Pain Grade Scale). After six months, the response rate was 47.6% with real acupuncture, 44.2% with sham acupuncture, and 27.4% with standard care. Although these trials clearly demonstrated the efficacy of acupuncture, they also showed the effectiveness of sham acupuncture, a phenomenon researchers attribute to the psychogenic factors involved in back pain.

Interestingly, similar results have been demonstrated in various trials in the US. Although these trials show the usefulness of acupuncture in painful conditions, they also illustrate the importance of psychogenic factors. It is therefore not surprising that acupuncture has also proven useful in treating mood and sleep disorders.

In Chinese medicine, the use of acupuncture is not limited to pain-related conditions, so there is a need for more extensive trials across a diverse range of diseases. The existing data do indicate that acupuncture may be an alternative approach for many difficult-to-treat chronic ailments.

Possible mechanisms

One of the areas of debate regarding acupuncture has been the difficulty in explaining its mechanism of action. According to the traditional explanation, acupuncture corrects the flow of life energy (Qi) through fixed paths in the body called meridians. Practitioners believe that in disease states these energy pathways become blocked, and that acupuncture restores the balance between the internal forces of Yin and Yang. The trouble, however, is that modern science is unable to demonstrate, or even operationalize, the concepts of Qi, Yin, Yang, and meridians.

In the last few decades, researchers have made various attempts to explain the mechanism of action of acupuncture and have come up with multiple theories.

  • Measurable effects of acupuncture: One explanation is that needling causes local changes and alterations in reflexes. It also has systemic effects, as it changes the workings of the autonomic nervous system. Thus, acupuncture affects blood pressure, heart rate, hormone levels, and neurotransmitter levels, resulting in pain relief and other beneficial effects.
  • Local mechanotransduction: This theory was promoted by a French physician in 1961, who wrote that acupuncture points have lower electrical resistance than the surrounding skin: while normal dry skin has a resistance of 200,000 to 2 million Ohms, acupuncture points have a resistance of just 50,000 Ohms. Proponents of this theory say that acupuncture causes minute traumas at the points of needle insertion and thus stimulates the body’s survival mechanisms: needling stimulates homeostasis, anti-inflammatory responses, tissue regeneration, and more.
  • Neurohumoral theory: It has been demonstrated that acupuncture is useful for pain relief and that naloxone blocks its analgesic effect. Researchers therefore propose that needling at specific points triggers the release of endogenous painkilling opiate-like substances such as enkephalins, endorphins, and dynorphins, as well as other neurotransmitters like serotonin and noradrenaline.
  • Gate-control theory: This theory states that needling stimulates large myelinated nerve fibers. These fibers carry tingling sensations and feelings of warmth, and their activity can inhibit the painful signals transmitted to the brain via spinal tracts by the much smaller C-fibers.
  • Postsynaptic inhibition: Another explanation states that acupuncture inhibits pain through a central mechanism, via disinhibition of RAF. This phenomenon typically operates in cases of extreme trauma, such as the loss of a limb, and serves to protect the body from extreme stress and pain.
  • Autonomic nervous system: Proponents of this theory suggest that acupuncture works by changing the balance between the sympathetic and parasympathetic nervous systems.
  • Morphogenetic singularity theory: This is one of the more complicated hypotheses, and it tries to explain the existence of meridians. Its advocates posit non-neural communication pathways in the body: during embryonic development, before neurons have formed, cells must communicate among themselves to direct development. Meridians, on this view, are the remains of these early communication paths, still present in the adult body.
  • Visualization: In recent years, fMRI methods have advanced considerably, and many clinical studies have demonstrated that needling various acupuncture points activates different brain centers, offering a window into the mechanism of action of acupuncture. It is believed that both elements of the morphogenetic singularity theory (meridians) and components of the neurohumoral response are involved in this brain stimulation.

Clearly, various theories have been offered to describe the mechanism of action of acupuncture, and most probably more than one mechanism is involved. Clinical trials do seem to indicate that acupuncture is beneficial in specific health conditions, but the way it works remains a mystery.

References

Dong, B., Chen, Z., Yin, X., Li, D., Ma, J., Yin, P., … Xu, S. (2017) The Efficacy of Acupuncture for Treating Depression-Related Insomnia Compared with a Control Group: A Systematic Review and Meta-Analysis. BioMed Research International, 2017, 9614810. doi: 10.1155/2017/9614810

Kawakita, K., & Okada, K. (2014) Acupuncture therapy: mechanism of action, efficacy, and safety: a potential intervention for psychogenic disorders? Biopsychosocial Medicine, 8, 4. doi: 10.1186/1751-0759-8-4

White, A., & Ernst, E. (2004) A brief history of acupuncture. Rheumatology, 43(5), 662–663. doi: 10.1093/rheumatology/keg005

Wong, M.C., & Shen, H.J. (2010) Science-based Mechanisms to Explain the Action of Acupuncture. Journal of the Association of Traditional Chinese Medicine (UK), 17(2), 5-10.

Image via AusAcu/Pixabay.

Why We Don’t Remember Early Childhood? /2018/05/23/why-we-dont-remember-early-childhood/ /2018/05/23/why-we-dont-remember-early-childhood/#respond Wed, 23 May 2018 14:00:26 +0000 /?p=23744 Although early experiences are important for personal development and later life, as adults we recall nothing or very little of early formative events such as taking our first steps or learning our first words. In fact, when adults are asked about their first memories, they usually cannot recall events from before the age of 2-3, and have only fragmented recollections of events that happened between the ages of 3 and 7. This phenomenon is often called childhood or infantile amnesia. It is the inability of both children and adults to recall episodic memories (i.e., memories of particular events or stimuli occurring in a particular context) from infancy and early childhood, before the age of 2-4.

Sigmund Freud was the first researcher to develop a theory of infantile amnesia, having observed that his patients had rarely been able to recall events from the first years of life. He believed that childhood memories are repressed and thus forgotten. Modern theories, however, focus on cognitive and social development as important predictors of childhood amnesia. One possible explanation is the lack of neurological development, i.e., of the brain regions in charge of the storage and retrieval of episodic memories. For instance, some researchers believe that the development and functioning of the prefrontal cortex (the cortical area at the front of the brain) is crucial for the creation of contextualized memories. Moreover, the prefrontal cortex and hippocampus are assumed to be crucial for the development of autobiographical memories. Importantly, these two brain structures mature around the age of 3 or 4.

The lack of neurological maturation, i.e., of the brain structures required for the creation, storage, and recall of memories during infancy and early childhood, might explain the phenomenon of childhood amnesia. On this account, childhood amnesia occurs not because memories are lost over time (the forgetting explanation), as Freud had suggested, but because these memories are never properly stored in the first place, owing to brain immaturity.

Some evidence suggests that amnesia for events taking place in early childhood (before the age of 2) could be at least partly explained by the difficulty of verbally recalling memories that were encoded before language acquisition. In line with this is the fact that the majority of the vocabulary is acquired between the ages of 2 years 6 months and 4 years 6 months, which is also the period from which the earliest memories can be recalled.

Childhood amnesia does not seem to be an exclusively human phenomenon. Indeed, researchers have observed something like infantile amnesia in animals, for instance in rodents. This discovery opened the possibility of investigating the underlying mechanisms of childhood amnesia, such as the neurological events involved, using animal models. Animal studies have addressed the importance of certain brain regions and their development in relation to childhood amnesia. For instance, they indicate that the high rate of neurogenesis in the hippocampus during infancy might explain the accelerated forgetting of contextual fear memories: integrating new neurons into existing circuits might destabilize and weaken the memories already stored there.

Some researchers hold that it remains unclear whether childhood amnesia is due to a failure of memory retrieval or a failure of memory storage. Forgetting is often described as a simple function of the time elapsed since an event, and since a long time passes between early events and their recall in adulthood, it might be assumed that early events are simply forgotten. Some researchers disagree, however, because subjects recall far fewer memories of events occurring between the ages of 6 and 7 than would be expected by simply extrapolating the forgetting curve. Forgetting alone therefore cannot explain the phenomenon of childhood amnesia, and this is why a neurogenic hypothesis of childhood amnesia was developed.
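The extrapolation argument can be made concrete with a toy retention model. The exponential form and the time constant below are assumptions chosen purely for illustration, not figures from the studies cited:

```python
import math

# Toy retention curve R(t) = exp(-t / tau), with tau an arbitrary
# illustrative constant. The argument in the text is that adults recall
# far fewer early-childhood events than such an extrapolation predicts.
def expected_retention(years_elapsed, tau=20.0):
    """Fraction of events an ordinary forgetting curve predicts we retain."""
    return math.exp(-years_elapsed / tau)

# A 30-year-old recalling events from age 7 vs. age 3:
r_age7 = expected_retention(30 - 7)   # ~0.32
r_age3 = expected_retention(30 - 3)   # ~0.26
# Ordinary forgetting predicts only a modest gap between these ages;
# the near-total absence of memories before age 3 therefore calls for
# a different explanation than the forgetting curve alone.
print(round(r_age7, 2), round(r_age3, 2))
```

Whatever curve shape one assumes, the point is the same: smooth extrapolation predicts a gradual decline in recall toward infancy, not the abrupt cliff actually observed.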

According to its proponents, the neurogenic hypothesis explains childhood amnesia through the continuous addition of new neurons (neurogenesis) in the hippocampus, as mentioned above. On this hypothesis, the high level of postnatal hippocampal neurogenesis (which occurs in both humans and some animals) prevents the creation of long-lasting memories. The hypothesis has been tested experimentally in animal models (mouse and rat). The findings from these models suggest that high levels of neurogenesis jeopardize the formation of long-term memories, possibly by replacing synapses in pre-existing memory circuits, and that the decline in hippocampal neurogenesis coincides with the emerging ability to form stable memories.

Thus, according to these animal studies, the neurogenic hypothesis appears to be a logical explanation for childhood amnesia.

Although the early theory of forgotten or repressed memories might look like a good explanation of childhood amnesia, more recent findings demonstrate that something else happening in the brain contributes to this phenomenon. Whether it is the immaturity of certain brain regions, the continuous generation of new neurons, or both, remains to be investigated. What is clear is that childhood amnesia cannot be explained by simple forgetting.

References

Newcombe, N., Drummey, A., Fox, N., Lai, E., Ottinger-Alberts, W. (2000) Remembering Early Childhood: How Much, How, and Why (or Why Not). Current Directions in Psychological Science. 9 (2): 55–58.

Hayne, H., Jack, F. (2011) Childhood amnesia. Wiley Interdisciplinary Reviews. Cognitive Science. 2(2): 136-145. doi: 10.1002/wcs.107

Simcock, G., Hayne, H. (2003) Age-related changes in verbal and non-verbal memory during early childhood. Developmental Psychology. 39: 805–814. PMID: 12952395

Madsen, H.B., Kim, J.H. (2016) Ontogeny of memory: An update on 40 years of work on infantile amnesia. Behavioural Brain Research. 298(Pt A): 4-14. doi: 10.1016/j.bbr.2015.07.030

Wetzler, S.E., Sweeney, J.A. (1986) Childhood amnesia: An empirical demonstration. In Autobiographical memory (ed. DC Rubin), pp. 191–201. Cambridge University Press, New York, NY.

Josselyn, S.A., Frankland, P.W. (2012) Infantile amnesia: a neurogenic hypothesis. Learning and Memory. 19(9): 423-433. doi: 10.1101/lm.021311.110

Image via VABo2040/Pixabay.

Nerve Agents: What Are They and How They Can Hurt Us? /2018/05/04/nerve-agents-what-are-they-and-how-they-can-hurt-us/ /2018/05/04/nerve-agents-what-are-they-and-how-they-can-hurt-us/#respond Fri, 04 May 2018 13:00:58 +0000 /?p=23666 Chemical weapons keep making headlines these days, be it the use of sarin in Syria or Novichok in the UK. An interesting fact hardly ever covered by the media is that the chemical structure of these compounds is relatively simple; the average modern pharmaceutical drug tends to be much more complex and difficult to make. This is not particularly surprising, as most research into these agents was done 50 or more years ago, when the art of organic synthesis was not as advanced as it is now. Nonetheless, these compounds (and nerve agents in particular) are extremely potent. It is quite interesting to analyze, from a neuroscience perspective, what exactly these compounds do to the body to cause such devastating effects.

Nerve agents are most commonly associated with chemical warfare, but they are more widespread than most of us realize: compounds with similar structures are found in some insecticides used in agriculture.

How do nerve agents work?

All nerve agents work in a similar fashion, though they vary significantly in chemical structure, and with that variance come differences in toxicity and other properties. Nerve agents belong primarily to a group of compounds called acetylcholinesterase inhibitors, and they act by paralyzing or disrupting the function of the nervous system.

To better understand how these agents work, let’s look at the underlying physiology of the nervous system. All functions of the body are controlled by nerve cells, including the movement of muscles and the work of the internal organs and the cardio-respiratory system. The brain sends regular electrical signals to vital organs to keep them regulated. Once these signals reach the target organ, the nerve endings release chemicals called neurotransmitters. When the brain sends messages to regulate the heart and respiration, nerve endings release a neurotransmitter called acetylcholine, which mediates the communication between the nervous system and the organs and muscles.

Although many neurotransmitters help regulate the functioning of smooth and skeletal muscles and internal organs, acetylcholine is the most important of them all. Once released by neurons, acetylcholine forces muscles to contract. Acetylcholine molecules are then destroyed within milliseconds by a specific enzyme called acetylcholinesterase, ensuring that the muscles can relax again. This cycle of contraction and relaxation underlies the smooth functioning of skeletal muscles, body movements, respiration, heartbeat, and much more.

Thus, for the proper functioning and contraction of any muscle in the body, both the release of acetylcholine into the synaptic space and its quick destruction by acetylcholinesterase are essential.

Now imagine a situation in which the acetylcholine released from nerve endings is not destroyed, because acetylcholinesterase is lacking. The muscles contract but cannot relax again, resulting in paralysis; muscles cannot remain contracted indefinitely without being damaged. This is how all nerve agents work: they inhibit acetylcholinesterase, rendering it non-functional and thus causing muscular paralysis.
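The difference that a working versus an inhibited enzyme makes to synaptic acetylcholine can be sketched with a toy first-order clearance model. The rate constants below are illustrative assumptions, not measured values:

```python
import math

# Toy model: acetylcholine (ACh) released into the synapse is cleared by
# acetylcholinesterase with first-order kinetics. Rate constants are
# illustrative assumptions chosen only to show the qualitative contrast.
def ach_fraction_remaining(t_ms, k_per_ms):
    """Fraction of released ACh left after t_ms milliseconds."""
    return math.exp(-k_per_ms * t_ms)

normal_k = 2.0       # active enzyme: clearance within a few milliseconds
inhibited_k = 0.002  # enzyme blocked by a nerve agent: ~1000x slower

# After 5 ms, a healthy synapse has essentially no ACh left, so the
# muscle can relax; a poisoned synapse is still nearly saturated.
print(ach_fraction_remaining(5, normal_k))     # ~0.00005
print(ach_fraction_remaining(5, inhibited_k))  # ~0.99
```

The qualitative point matches the text: with the enzyme blocked, acetylcholine lingers in the synapse, the muscle stays contracted, and paralysis follows.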

Types of nerve agents

Nerve agents were developed primarily either for military purposes or for use as insecticides. For these very different needs, nerve agents need to be either volatile and non-persistent, or non-volatile and highly persistent.

G-series nerve agents were developed by German scientists around the time of the Second World War. Sarin, tabun, soman, and cyclosarin are representatives of this class. These compounds are non-persistent, meaning they are less stable, do not remain in the environment for long, and have a shorter washout period from the human body.

The V-series of nerve agents is another significant class of these compounds. They are highly persistent and have an oil-like consistency, which means they can remain stable in the environment for a long time and have exceptionally long washout periods. VE, VG, VX, VR, and VM are representatives of this class.

The Novichok series of agents was created by the Soviet Union between the 1960s and 1990s. These agents were designed to remain undetectable by adversaries.

Although these three classes are well known, it is evident that there are many more classes perhaps unknown to the general public due to the secrecy surrounding this technology. Importantly, all of these compounds have a similar mode of action, even though they differ in physical properties and toxicity.

Exposure consequences and antidotes

Once a person is exposed to a nerve agent, the chemical effectively paralyzes various muscles of the body, including the respiratory and cardiac muscles; ultimately, the victim dies of respiratory and cardiac failure. After initial exposure, many of the agents cause immediate irritation of the mucous membranes, resulting in a runny nose and burning sensation, followed by blurred vision, tightness in the chest, involuntary urination and defecation, stomach aches, vomiting, epileptic seizures, and finally death due to cardio-respiratory failure.

Acetylcholine is also a vital neurotransmitter for communication between brain cells. This means that nerve agents cause neural damage that is, in many cases, irreversible. Even if a person is revived after exposure, neural damage or psychiatric changes may continue to persist for years. Surviving victims of nerve agent poisoning continue to suffer from fatigue, cognitive deficits, and many other neural symptoms.

The anticholinergic drug atropine remains the first line of treatment in most cases. It is widely available and may help to counteract many effects of nerve agents, particularly those on the heart, airways, and secretory glands. Atropine is a part of many emergency kits.

Biperiden is another drug used to treat nerve agent exposure. It acts slowly, but it can cross the blood-brain barrier and thus helps to counter the central nervous system toxicity of these agents.

Pralidoxime chloride is a reactivator of acetylcholinesterase and may also help to counter the toxic effects of nerve agents.

Apart from the above-mentioned three drugs, there is an array of supportive drugs and treatments that may help to counteract the effects of nerve agents, help to maintain the functioning of critical organs, and also play a role in faster washout of some of these agents from the body.

References

Newmark, J. (2009). CHAPTER 56 – Nerve Agents. In M. R. Dobbs (Ed.), Clinical Neurotoxicology (pp. 646–659). Philadelphia: W.B. Saunders. doi:10.1016/B978-032305260-3.50062-9

Organisation for the Prohibition of Chemical Weapons. (n.d.). Nerve Agents. Retrieved March 21, 2018, from here

Image via jessebridgewater/Pixabay.

“Non-gene” Mutations and Neurodevelopmental Disorders

Wed, 02 May 2018

Every year, thousands of children are born with neurodevelopmental issues. This is not just about lagging intellectual growth or autism: many psychiatric illnesses in later life have been blamed on neurodevelopmental problems. These conditions are more common than most people imagine. One estimate suggests that as many as 15% of people suffer from neurological and psychiatric issues rooted in genetics and neurodevelopmental disorders.

However, researchers are currently unable to explain the cause of many neurodevelopmental disorders on the basis of mutations in genes alone, which led them to suspect they were missing something. Advances in genetic sequencing technology have revolutionized our understanding of inherited disorders, further supported by improvements in computing and big data analysis. Nowadays, researchers have better tools and plentiful data with which to analyze and uncover the genetic landscape of various neurodevelopmental disorders.

A new understanding of neurodevelopmental disorders

For many years, researchers focused their attention primarily on the ~23,000 protein-coding genes, as most neurodevelopmental disorders are directly or indirectly related to proteinopathies (i.e., defects in the proteins encoded by these genes). These genes make up just 2% of our genome. Until recently, researchers mostly dismissed the remaining 98% of the genome as non-coding “junk” DNA whose role is poorly understood.

Now scientists are discovering that what they had dismissed as junk has a significant influence on how the 2% containing protein-coding genes works. This 98% of the genome activates, deactivates, or changes the expression level of the coding 2%. Researchers have found many mutations in these neglected parts of the genome that influence the protein-coding part. Thus, it seems that junk DNA plays a vital role in multiple genetic disorders, including neurodevelopmental diseases.

However, identifying the links between these non-coding genome regions and the coding genes is not a straightforward task: it requires enormous amounts of data and computing power. Detailed investigation of non-coding regions has become possible thanks to projects like the 1000 Genomes Project, which provide complete genome information, including the non-coding regions.

A new study confirms the link between non-coding genes and neurodevelopmental disorders

In one of the most extensive studies of its kind, published this year in the journal Nature, genetic data from 8,000 families were analyzed. Researchers were able to demonstrate a link between neurodevelopmental disorders and mutations outside of protein-coding genes. This is the first study able to provide information about neurodevelopmental disease risk in previously undiagnosed children.

Although thousands of children around the world develop neurodevelopmental illnesses linked to slow intellectual growth, epilepsy, and even heart defects, a large number of them remain undiagnosed until clinical symptoms develop. This delay or missed diagnosis means that valuable time to take prophylactic measures is lost.

The new findings are the result of the Deciphering Developmental Disorders (DDD) study initiated in 2010, aimed at early diagnosis of rare developmental disorders. So far, even in the developed European nations and the US, only a very small percentage of children with developmental disorders are diagnosed in a timely manner. It is expected that these new findings will revolutionize the way these conditions are diagnosed and possibly help in the early identification of the disorders and associated risk factors.

In about one-third of the 13,000 children diagnosed with neurodevelopmental disorders, researchers were able to identify previously known genetic mutations. However, they had no clue as to which mutations or genetic issues were responsible for the developmental disorders in the remaining 8,000 patients.

Therefore, researchers turned their attention to the non-coding or junk parts of the genome, which control the inhibition or activation of genes and serve as gene regulators. To their surprise, they discovered that specific mutations in these non-coding areas were strongly related to specific neurodevelopmental disorders. There was good reason to pay attention to particular parts of these non-coding regions, since they have been highly conserved across species throughout evolutionary history, and conserved genetic material usually plays a critical role in the survival and fitness of living species.

What are the future implications of the findings?

At present, researchers agree that they have limited knowledge of the link between mutations in non-coding genome regions and neurodevelopmental disorders. Nonetheless, they expect things to change for the better, as they are now on the right track. Data from new projects like the NHS’s 100,000 Genomes Project may provide truly rich material for a better understanding of the subject, and as more genome and health-related data become available, rare neurodevelopmental disorders will be predicted and diagnosed with greater precision.

Novel studies have finally opened the doors to unexplored territory, and eventually we will get answers to questions that were too difficult to tackle for many years. Once these disorders can be diagnosed in a timely fashion, we will be better equipped to make the best decisions regarding treatment options.

References

Hu, W. F., Chahrour, M. H., & Walsh, C. A. (2014). The diverse genetic landscape of neurodevelopmental disorders. Annual Review of Genomics and Human Genetics, 15, 195–213. doi:10.1146/annurev-genom-090413-025600

Khurana, E., Fu, Y., Colonna, V., Mu, X. J., Kang, H. M., Lappalainen, T., … Gerstein, M. (2013). Integrative Annotation of Variants from 1092 Humans: Application to Cancer Genomics. Science, 342(6154), 1235587. doi:10.1126/science.1235587

Plummer, J. T., Gordon, A. J., & Levitt, P. (2016). The Genetic Intersection of Neurodevelopmental Disorders and Shared Medical Comorbidities – Relations that Translate from Bench to Bedside. Frontiers in Psychiatry, 7. doi:10.3389/fpsyt.2016.00142

Short, P. J., McRae, J. F., Gallone, G., Sifrim, A., Won, H., Geschwind, D. H., … Hurles, M. E. (2018). De novo mutations in regulatory elements in neurodevelopmental disorders. Nature, 555(7698), 611–616. doi:10.1038/nature25983

Image via kalhh/Pixabay.

Is Childhood Obesity Linked to Lower IQ?

Thu, 26 Apr 2018

Obesity is a global health burden and a serious risk factor for the development of metabolic disorders, cardiovascular diseases, and many other conditions. But some researchers believe that, in addition to affecting physical health, obesity can damage the brain and compromise intelligence.

Brain imaging studies have documented multiple structural and functional abnormalities in the brains of obese individuals, abnormalities that are already evident in adolescence. Moreover, research findings indicate that even obesity in childhood is associated with lower intelligence scores. But this is not all: according to some investigations, the causality also runs in the opposite direction, with lower IQ in childhood resulting in an increased prevalence of obesity in adulthood.

Scientific studies have investigated the association between IQ and obesity in large cohorts. For instance, a group of researchers analyzed data from a prospective, longitudinal study to investigate whether becoming obese is associated with a decline in intelligence from childhood to later life. More than one thousand children were included and tracked until their fourth decade of life. Anthropometric measurements (i.e., body weight and height) were carried out at birth and on 12 occasions later in life, at the ages of 3, 5, 7, 9, 11, 13, 15, 18, 21, 26, 32, and 38. Intelligence quotient (IQ) scores were assessed at the ages of 7, 9, 11, and 38. The results demonstrated that the participants who became obese had lower IQ scores in adulthood compared with the participants whose body mass index (BMI) remained within the normal range. However, the obese participants did not experience a severe decline in IQ over their lifetime; rather, they had lower IQ scores even in childhood compared with normal-weight controls.
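The BMI figures these studies rely on all come from the standard formula, weight in kilograms divided by height in meters squared. A minimal sketch, with category cut-offs taken from the standard WHO adult classification:

```python
# BMI = weight (kg) / height (m) squared.
# Cut-offs below follow the standard WHO adult classification.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

print(bmi(70, 1.75))                 # about 22.9, within the normal range
print(bmi_category(bmi(95, 1.75)))   # about 31, classified as "obese"
```

This is why the studies can treat a maternal BMI of around 20 kg/m2 as comfortably normal and 30 kg/m2 as the threshold for obesity.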

Another population-based study followed babies born in the same week of 1958 in the United Kingdom for more than half a century. More than 17,000 babies were included; their intelligence was assessed at the ages of 7, 11, and 16, while their obesity level and BMI were evaluated at the age of 51. The results indicated a negative association between childhood intelligence and adult BMI and obesity level. In addition, it turned out that more intelligent children had healthier dietary habits and exercised more frequently as adults.

Considering the negative association between childhood obesity and intelligence, one review study questioned the direction of this causality. After careful examination of longitudinal population-based studies, the review suggested that causality runs from low childhood intelligence to subsequent weight gain and obesity, and that excess weight gain did not cause a decline in IQ. It found no strong evidence that obesity impairs cognitive function or leads to cognitive decline, while it did find evidence that poor intelligence in childhood leads to weight gain in adulthood.

Still, not all scientists agree with these conclusions. For instance, a group of researchers investigated the impact of obesity on cognitive function in children with sleep-disordered breathing. They included three groups of children in the study: children with obstructive sleep apnea, children with obstructive sleep apnea and obesity, and children without either condition (normal controls). The aim was to assess the total, verbal, and performance IQ scores in these children. The total and performance IQ scores turned out to be significantly lower in the children with both obstructive sleep apnea and obesity compared with the other two groups. In addition, BMI negatively influenced the total IQ score in the obese children with obstructive sleep apnea. This study suggests that obesity can aggravate cognitive impairment.

Since childhood IQ and obesity are linked, other researchers investigated whether maternal pre-pregnancy obesity can affect a child’s neurological development. More than 30,000 women were included; their pre-pregnancy BMI was calculated, and their children’s IQ scores were assessed at 7 years of age. The results indicated that women with a BMI of around 20 kg/m2 had children with the highest IQ scores. In contrast, maternal obesity (a BMI of 30 kg/m2 or more) was associated with lower total and verbal IQ scores. More importantly, excessive weight gain during pregnancy strengthened this association.

All of these findings confirm that there is a link between childhood intelligence and body weight later in life. But what is the mechanism underlying this phenomenon?

According to some studies, higher intelligence (IQ) in childhood predicts a better socio-economic status later in life (a higher educational level and a better income). Higher educational attainment, in turn, seems to reduce the risk of obesity, probably through better dietary habits and healthier food choices. This may partly explain how a lower IQ in childhood can lead to weight gain and obesity later in life. As for the impact of excess weight gain on intelligence, more research is needed to confirm this association and elucidate the underlying mechanisms. One possible explanation is that hormones produced by fat cells may damage brain cells; another is that excess body weight may compromise cerebral blood vessels and thus impair brain function.

Although the cause of the lower intelligence scores associated with obesity is not entirely clear, it is evident that the link exists. Since obesity is a rising global health concern, its negative effects should also be investigated in terms of its impact on cognitive function and intelligence. This is especially important when we consider that even maternal pre-pregnancy obesity is associated with lower IQ in children.

References

Belsky, D. W., Caspi, A., Goldman-Mellor, S., Meier, M. H., Ramrakha, S., Poulton, R., & Moffitt, T. E. (2013). Is obesity associated with a decline in intelligence quotient during the first half of the life course? American Journal of Epidemiology, 178(9), 1461–1468. doi:10.1093/aje/kwt135

Kanazawa, S. (2013). Childhood intelligence and adult obesity. Obesity (Silver Spring), 21(3), 434–440. doi:10.1002/oby.20018

Kanazawa, S. (2014). Intelligence and obesity: which way does the causal direction go? Current Opinion in Endocrinology, Diabetes and Obesity, 21(5), 339–344. doi:10.1097/MED.0000000000000091

Vitelli, O., Tabarrini, A., Miano, S., Rabasco, J., Pietropaoli, N., Forlani, M., Parisi, P., & Villa, M. P. (2015). Impact of obesity on cognitive outcome in children with sleep-disordered breathing. Sleep Medicine, 16(5), 625–630. doi:10.1016/j.sleep.2014.12.015

Huang, L., Yu, X., Keim, S., Li, L., Zhang, L., & Zhang, J. (2014). Maternal prepregnancy obesity and child neurodevelopment in the Collaborative Perinatal Project. International Journal of Epidemiology, 43(3), 783–792. doi:10.1093/ije/dyu030

Chandola, T., Deary, I. J., Blane, D., & Batty, G. D. (2006). Childhood IQ in relation to obesity and weight gain in adult life: the National Child Development (1958) Study. International Journal of Obesity, 30(9), 1422–1432. doi:10.1038/sj.ijo.0803279

Image via mojzagrebinfo/Pixabay.
