History of Medicine – Brain Blogger

Is Anorexia a Modern, Culture-Bound Disorder?
Wed, 03 Sep 2014

The universality and the origins of anorexia nervosa have been the subject of years of debate. Some argue that the eating disorder is a culture-bound syndrome specific to the Western, industrialized world, while others maintain that there is evidence that the disease is not confined to more recent times or one part of the world.

Even scholars who acknowledge “symptomatic continuities” extending back before the mid-1800s maintain that certain elements essential to the classification of anorexia nervosa were absent prior to the latter half of the 19th century.

Whether or not anorexia nervosa (AN) and bulimia nervosa (BN) existed before this time hinges on a crucial factor, argues Tilmann Habermas: “The clinical feature that renders both disorders distinctly modern and Western is the overvalued idea of being too big when at normal body weight, which serves as the primary conscious motivation for restricting food intake.”

Self-starvation is by no means modern; however, the components of distorted body image and fat-phobia as principal motivators may not have emerged until more recently. This is consistent with the DSM-IV criteria for AN, which include “refusal to maintain body weight at or above a minimally normal weight,” “intense fear of gaining weight or becoming fat even though underweight,” and “disturbance in the way in which one’s body weight or shape is experienced, undue influence of body weight or shape on self-evaluation, or denial of the seriousness of the current low body weight.”

We should not, therefore, include cases of self-starvation throughout the ages which stemmed from other motivations, such as depression or grief, digestive issues, obsessive disgust with food or eating that does not have a weight-loss motivation, or fear of poisoning.

Cases Prior to the 19th Century
Klump and Keel go as far back as the case of a 9th century Bavarian serf named Friderada, whose inability to eat was described as part of a monk’s attestation to a miracle performed by Walburgis. The fasting medieval saints have been part of the discussion surrounding whether AN emerged more recently or has been present throughout the centuries.

While the self-starvation criterion is met, fat-phobia is often missing in these cases. Habermas distinguished cases of inedia among fasting religious ascetics from AN as we know it in modern times: “Being underweight was not a necessary or even a typical condition for claiming inedia.”

He also points out that anorexics tend to hide their food avoidance, while the religious ascetics tended to exhibit their abstinence as miraculous. “The miraculous fasting girls from the 16th to the 19th century,” says Habermas, “remained in their families, were ill or handicapped, and claimed not to be eating anything, although most of them were not emaciated. Girls with AN, in contrast, try to make believe they are eating normally, and provide a variety of medical, somatic, or aesthetic reasons not to eat.”

Consequently, he concluded that neurotic asceticism accompanied by weight loss had similarities with AN, but was “probably still more similar to neurotic asceticism without weight loss.”

When AN Coincides with Religious Asceticism
Later studies have looked at anorexia and religiosity, finding that religiosity was associated with lower minimum body mass index achieved, and that religiosity tended to increase as AN progressed. Another study found that anorexics modified their religious practices after the onset of AN, with decreased participation in communion or feasts and increased religious fasting.

The element of self-disgust with one’s body is not recorded in many older and religious cases; however, one French Catholic named Renata, documented by Schnyder in 1912, did exhibit a loathing of her shape, with a religious bent. She felt that weight contributed to sensuality and did not want to be “an object of desire,” and expressed feeling “ashamed to feel that I was looked at.”

The time of onset, at about age 16, as well as Renata’s obsession with biking 40 kilometers, self-induced vomiting, and self-starvation, point to anorexia and bulimia framed in religious terms. Habermas therefore suggests that intense religiosity and AN need not be mutually exclusive; however, many of the religious ascetics did not meet the DSM-IV criteria for AN.

Change in Incidence, or Change in Documentation?
The late 19th century produced abundant medical literature on AN cases, leading to the question of whether the incidence of AN actually increased at that time, or whether the clinical documentation that accompanied the emergence of psychiatry in France led to increased notice of cases in conjunction with what was otherwise a stable incidence rate.

Habermas also points to “an interest in nutrition, which, combined with an interest in the psychology of their patients, made German physicians of internal medicine sensitive to motives for not eating.”

German doctors, he also noted, attributed cases of AN to Simmonds’ Disease between 1914 and 1945. Cases of AN motivated by fear of weight gain were recorded in Russia in the late 1800s, as well as in Italy. However, a change in primary sources around this time also occurred, with cases recorded by physicians in the context of medicine rather than by religious sources.

The medical literature of the latter half of the 19th century describes French schoolgirls drinking vinegar and limiting food intake for aesthetic purposes, as well as an emaciated Empress Elisabeth of Austria, obsessed with exercise and food restriction and motivated by a fear of growing fat. Habermas recounts that “in the second half of the 19th century, medicine began to label as pathological even moderate degrees of overweight, which were increasingly judged by reference to statistical tables such as those by Quetelet (1835) and Worthington (1875).” Interestingly, concern about excess weight first emerged among men, in contrast to today’s typical anorexic, the adolescent girl. In the early 1900s, some physicians associated corseting with AN.

Universality vs. the Globalization of Eating Disorders
Whether AN is a Western phenomenon or found throughout the world has also been debated. Isaac cites research in which Keel and Klump concluded that AN was not a culture-bound syndrome, but bulimia was, suggesting that non-Western bulimics had been exposed to Western culture.

Smink et al. performed a meta-analysis of 125 studies and point to “cultural transition and globalization” as mechanisms by which eating disorders which were historically characterized as culture-bound started appearing in non-Western countries and among minorities. They note a study on the Caribbean island Curacao in which no cases were found among the black population, while incidence rates among white and mixed-race residents were comparable to those found in the U.S. and the Netherlands. Other eating disorders, such as binge eating, have been associated with migration from Mexico to the U.S.

As Habermas stated, “whether AN and BN are universal or restricted to specific historical phases (and cultures) has been shown to depend on two main factors, namely, on the methodology of historical research and on the diagnostic criteria chosen.” In this context, some evidence suggests that although abnormal eating habits and fasting can be found throughout the ages, the motives and body image distortion that accompany what we currently know as AN may not have been present prior to the mid or late 1800s. Alternately, AN may have been present, but cultural context and the nature of sources, both of which shifted in the 1800s, may have simply obscured its presence.

Habermas, T. (1992). Further Evidence on Early Case Descriptions of Anorexia Nervosa and Bulimia Nervosa. International Journal of Eating Disorders, 11(4), 351-359.

Habermas T (2005). On the uses of history in psychiatry: diagnostic implications for anorexia nervosa. The International journal of eating disorders, 38 (2), 167-82 PMID: 16134113

Isaac D (2013). Culture-bound syndromes in mental health: a discussion paper. Journal of psychiatric and mental health nursing, 20 (4), 355-61 PMID: 23495975

Smink FR, van Hoeken D, & Hoek HW (2012). Epidemiology of eating disorders: incidence, prevalence and mortality rates. Current psychiatry reports, 14 (4), 406-14 PMID: 22644309

Image via Keith A Frith / Shutterstock.

Jury Still Out On After Effects of Concussion
Wed, 14 May 2014

Controversy surrounding Postconcussion Syndrome (PCS) dates back to the 1800s. A hundred and fifty years on, contention still surrounds the lingering symptoms of insomnia, dizziness, irritability, depression, cognitive impairment, and so on that affect between 30 and 80 percent of Americans following a mild traumatic brain injury (mTBI), or concussion.

The present-day billboards of personal injury attorneys seeking auto accident victims as clients hark back to the late 19th century, when railways became a popular means of travel. During that time, railway accidents, and the number of physicians reporting on conditions such as “Railway Spine”, increased dramatically. Today, PCS and whiplash present similar symptoms, with etiology that is still the subject of debate. With billions of dollars of claims at stake, courtroom adversaries can pick and choose from a range of conflicting studies and theories.


In his book Post-Traumatic Neurosis, physician Michael Trimble notes that:

“In the nineteenth century and before, legal cases involved with personal injury were mainly to do with material injuries, such as loss of a limb or an eye, where objective evidence was unmistakable and quantifiable. With the advent of ‘concussion of the spine’ the situation changed, and the concept that the injured were victims of at best ‘shock’ and at worst spinal anaemia or meningitis became prevalent.”

In the late 1800s, the dominant theory involved organic lesions of the spine and brain. London surgeon John Eric Erichsen gave famous lectures in 1866, later republished in book format in 1875 as On Concussion of the Spine: nervous shock and other obscure injuries of the nervous system in their clinical and medico-legal aspects, in which he opined:

“The primary effects of these concussions or commotions of the spinal cord are probably due to changes in its structure. The secondary are mostly of an inflammatory character, or are dependent on retrogressive organic changes, such as softening, etc., consequent on interference with its nutrition.”

This view was challenged in the 1880s by London and Northwest Railway surgeon Herbert Page, who asserted that one of Erichsen’s spinal concussion cases was potentially suffering the effects of syphilis instead, and pointed to a lack of post-mortem data in the majority of spinal concussion cases. Dr. Page proposed that fear and shock played a role, suggesting psychological rather than organic causes in the large number of people who had been in relatively minor accidents yet remained symptomatic afterwards.

Among the many train crash victims was Charles Dickens, famous author of A Tale of Two Cities, A Christmas Carol, and Great Expectations. Dickens’ carriage did not go over the bridge, but was left dangling, too close for comfort. He described “two or three hours work afterwards around the dead and dying surrounded by terrific sights.” Dickens suffered from symptoms including weakness, anxiety, and being “not quite right within,” which he attributed to “the railway shaking.”

Later, Oppenheim moved from the theory of “Railway Spine” to “Railway Brain”, like Erichsen attributing symptoms to an organic cause. Pioneering French neurologist Charcot suggested manifestations of hysteria instead. After World War II, as cars became popular, whiplash injuries multiplied, resulting in similar clusters of symptoms.

The Present Problem

According to the CDC, in 2010 there were 2.5 million emergency room visits, hospitalizations, or deaths associated with traumatic brain injury, the majority of which were concussions, or “mild TBI”. These numbers understate the true incidence, as they do not include head injuries that did not involve a trip to the hospital. While causes vary and include falls, auto accidents, assault, occupational accidents, and sports, some of these, such as auto or occupational accidents, result in litigation.

Injury claims in the U.S. cost billions of dollars each year. Veronique de Rugy, a senior research fellow at George Mason University’s Mercatus Center, indicated that despite medical advances which allow people to remain on the job, the number of Americans claiming disability has increased more than sixfold.

Her report points to the fact that changing standards put more weight on self-reported pain and discomfort. Auto accidents follow a similar trend: the cost of claims is rising while the overall severity of injuries is declining.

The Coalition Against Insurance Fraud estimates that fraudulent claims cost $80 billion per year in the U.S. In a litigious society with so much money at stake, plaintiffs are often portrayed negatively by the media. In particular, those who are injured in ways that are not obviously disfiguring frequently are subject to great scrutiny and accusations of fraud. Types of fraud can include malingering, falsely assigning real symptoms to a compensable cause, or misrepresentation of diminished capacity following injury.

Diagnosis of PCS depends largely on self-reporting. An observed loss of consciousness is not necessary, and common symptoms such as headache, irritability, loss of memory and the ability to concentrate, dizziness, and sensitivity to noise or alcohol do not lend themselves to objective, verifiable measurement. Some of these complaints are also highly prevalent in the general population.

Some explanations allow for the legitimacy of reported symptoms, while largely dismissing organic causes. Instead, other theories offered include psychogenic causes. Symptoms such as insomnia, dizziness, headache, and cognitive impairment can overlap with depression, anxiety, and post-traumatic stress disorder (PTSD). Studies have found premorbid depression common to almost half of those who develop PCS. Another study found that mild TBI was not a risk factor for PCS after adjusting for PTSD and depression. PTSD was found in other research to be the strongest predictor of PCS. (PTSD could also offer an explanation for what Dickens described following the train wreck).

Studies have found a link between persistent PCS and potential financial compensation. Evans reports that “on neuropsychological testing, there is a dose-response relationship between an increasing amount of potential compensation and an increasing rate of failure on malingering indicators, particularly in those who have suffered only mild TBI.”

Lithuania served as a testing ground, chosen because compensation for post-traumatic headache is unlikely there, and because the general population has fewer “expectations of persisting symptoms than in a Western society.” Among emergency room patients with mild head injuries and loss of consciousness not exceeding 15 minutes, headaches disappeared within a month for the overwhelming majority of Lithuanian subjects (96 percent of respondents). However, this does not necessarily prove that American plaintiffs with PCS are faking.

Litigation Response Syndrome

While prospective studies of emergency room patients in other countries on the surface appear to support non-organic etiology, it is worth noting that studies of neurasthenia have found that cultural factors can influence the manifestation of clinical symptoms. Additionally, a group of stress problems known as Litigation Response Syndrome have been described, the symptoms of which mirror PCS.

People can become “so terrified and emotionally traumatized by the litigation that they appear to have mental disorders,” explains expert Paul Lees-Haley. When plaintiffs are followed by insurance investigators, subjected to intrusive depositions and compulsory medical exams conducted by strangers, must hand over their private medical records to insurance company employees, and are constantly bombarded with medical bills and legal documents, symptoms such as stress, insomnia, anxiety, and dizziness may ensue, but not be attributable to the actual injury.

However, since the litigation, and therefore the symptoms, did not occur prior to the injury, patients could reasonably believe that the injury caused the symptoms. This is also a plausible explanation for why these symptoms are so persistent in the U.S., but not in places like Lithuania where litigation is less common.

Due to the subjective nature of the symptoms associated with PCS, and the substantial overlap with highly prevalent psychiatric and medical conditions such as headaches and depression, it is unlikely that the debate will be fully resolved in the near future. Research has increased as a result of the return of veterans with brain injuries and PTSD as well as increased awareness of sports related concussions, which should result in more refined studies of specific subsets of the population with PCS symptoms.


De Rugy, V. (August 7, 2013). Social Security Disability Costs are Exploding. The Washington Examiner.

Evans RW (2010). Persistent post-traumatic headache, postconcussion syndrome, and whiplash injuries: the evidence for a non-traumatic basis with an historical review. Headache, 50 (4), 716-24 PMID: 20456159

Lees-Haley, P. (1989). Litigation Response Syndrome: How Stress Confuses the Issues. Defense Counsel Journal, 110.

Rogers R, & Payne JW (2006). Damages and rewards: assessment of malingered disorders in compensation cases. Behavioral sciences & the law, 24 (5), 645-58 PMID: 17016811

Styrke, J. (2012). Traumatic Brain Injuries and Whiplash Injuries. Umea University Department of Surgical and Perioperative Sciences.

Trimble, M. R. (1981). Post Traumatic Neurosis. New York: John Wiley and Sons.

Image via Riccardo Piccinini / Shutterstock.

Dental Anxiety’s Past and Its Lucrative Future
Sat, 05 Apr 2014

Dental horrors abound in literature, artwork, and dental museum exhibits. Throughout most of the world’s history, dentistry, like childbirth, has been associated with intense pain. Yet in our modern era of preventive maintenance, restoration, and local anesthesia, the prevalence of dental anxiety remains persistently high. This widespread dental angst has created a ripe market for profitable sedation dentistry.

As the authors of a 2011 study point out, “consternation in dentistry” has been classified as dental fear, dental anxiety, and dental phobia. They propose that dental fear is the result of “a real, immediate, present and specific stimulus, such as needles or drilling,” whereas dental anxiety is characterized by a threat “ambiguous, unclear, and not immediately present.”

Other researchers have suggested that “dental phobia” is a misnomer, and that most dental anxiety is closer to a form of post-traumatic stress disorder, recommending that it be conceptualized as “Posttraumatic Dental-Care Anxiety” (PTDA) and classified as part of the PTSD spectrum. Bracha et al., (2006) state: “In our experience, most individuals with dental “phobia” do not recognize their symptoms as ‘excessive or unreasonable,’ and in that sense resemble individuals with PTSD. Our review of the dental-care literature suggests that true (innate) dental phobias (akin to unreasonable fear at the sight of blood or a syringe) probably account for a smaller percentage of cases, and that the vast majority of dental-care anxiety cases stem from aversive dental experiences.”

Tools commonly employed by researchers to identify dental anxiety include the Modified Dental Anxiety Scale and Corah’s Dental Anxiety Scale. Patients are asked questions such as “When you are waiting in the dentist’s office for your turn in the chair, how do you feel?” or “While you are waiting and the dentist is getting out the instruments which he or she will use to scrape your teeth around the gums, how do you feel?”

Dental anxiety is not merely the fear of pain. Nineteenth-century literature paints a picture of what modern researchers delving into dental anxiety have categorized as a feeling of “existential threat.” In the same year that Carl Koller first used cocaine as a local anesthetic for dental procedures, and about twenty years before German chemist Alfred Einhorn created what became widely known as Novocaine, French author Joris-Karl Huysmans described in his famous novel A Rebours an intrusive and anxiety-provoking memory of a molar extraction at the hands of “a mechanic who called himself a dentist of the people and who lived down near the quays.” Gatonax, The People’s Dentist, operated on the third floor, accessible from a “darkened stairway” stained with “large gobbets of blood-red spittle.”

Huysmans’ protagonist, Des Esseintes, recounts the procedure: “A cracking sound was heard, the molar was breaking as it came out; then it seemed to him that his head was being torn off, that his skull was being shattered; he lost all self control, had shouted at the top of his voice, furiously trying to defend himself against the man, who threw himself afresh on him, as if he wanted to force his arm into the very depths of Des Esseintes’ bowels, then he suddenly took a step back, and, lifting up the body that was still attached to the jaw, had brutally let it fall back down on his backside into the chair as, standing upright and filling the window frame, he panted, brandishing a blue tooth dripping with blood at the end of his forceps.”

This passage illustrates some of the qualitative aspects of dental anxiety found by Swedish researchers, such as feelings of powerlessness or the belief that the dentist has all the power, the presence of underlying neuroticism or general anxiety, and the dentist’s perceived lack of empathy. Perceived negative behavior by a dentist is common among sufferers of dental anxiety. One of the Swedish researchers’ subjects described a dentist from childhood who reeked of booze and had shaky hands. Studies have indicated that patients with anxiety often transfer negative sentiments about one dentist to all dentists they encounter subsequently.

Fear of suffocation as the mouth is filled with water, cotton, and dental instruments is another component of dental anxiety. Edgar Allan Poe’s short story, Berenice, published in 1835, invokes this fear as Berenice is buried prematurely and her thirty-two teeth, the object of her monomaniac cousin’s obsession, are forcibly ripped out.

A survey of 1,882 individuals in the U.S. found an adult prevalence rate of “high dental fear” between 11.2 and 12.3 percent. Another 17.5 to 18.2 percent of respondents indicated moderate dental fear, and 15.5 percent avoided dental care out of dread. Outside of the U.S., a 2009 study in the Netherlands found that “the prevalence of dental fear was 24.3%, which is lower than for fear of snakes (34.8%), heights (30.8%), and physical injuries (27.2%). Among phobias, dental phobia was the most common (3.7%), followed by height phobia (3.1%) and spider phobia (2.7%).” Scandinavian epidemiological studies found that 7 to 10 percent of the population was highly anxious with regard to receiving dental care. Most research points to a higher prevalence rate among women compared to men. As to whether age and income are associated with dental anxiety, studies have shown mixed results, with some not finding any correlation with age, and others showing a curve where anxiety peaks in young adulthood and diminishes somewhat with age.

All this dental angst has fueled sedation dentistry’s growth and entrance into the mainstream. A survey of 1,101 Canadians found that the proportion of the population preferring sedation or general anesthesia was “7.2% for cleaning, 18% for fillings or crowns, 54.7% for endodontics, 68.2% for periodontal surgery, and 46.5% for extraction,” and that a greater proportion of respondents expressed an interest in sedation for dentistry than the proportion who had been sedated before. The researchers concluded that there was significant need and untapped demand.

Incidentally, The People’s Dentist charged 2 francs for the extraction Huysmans so vividly described. Sedation dentistry is generally not covered by insurance, and prices can range from $50 for nitrous oxide to more than $1,000 for IV sedation. Fear is lucrative. The prospect of sedation has the ability to attract new dental patients to the market for general and cosmetic procedures: the patients who have avoided the dental market altogether as a result of their anxiety.

An analysis in Dental Economics showed that the highest profit in terms of dollars can be generated by IV sedation, but in terms of profit percentages, oral sedation such as Valium is the most lucrative. Also recognized was the collections advantage inherent in sedation dentistry: patients who plan to be sedated by oral or IV methods on the day of their procedure are usually required to pay up front.

Dental anxiety is a pervasive problem affecting many people around the world, despite modern advances which have made dentistry dramatically more comfortable than it was historically. The avoidance of dental care due to dental anxiety can lead to deteriorating oral and general health, but it also represents pent-up demand for sedation services and therefore incremental revenue for the 40 percent of U.S. dentists who are now offering some form of sedation dentistry.


Abrahamsson KH, Berggren U, Hallberg L, & Carlsson SG (2002). Dental phobic patients’ view of dental anxiety and experiences in dental care: a qualitative study. Scandinavian journal of caring sciences, 16 (2), 188-96 PMID: 12000673

Bracha HS, Vega EM, & Vega CB (2006). Posttraumatic dental-care anxiety (PTDA): Is “dental phobia” a misnomer? Hawaii dental journal, 37 (5), 17-9 PMID: 17152624

Chanpong B, Haas DA, & Locker D (2005). Need and demand for sedation or general anesthesia in dentistry: a national survey of the Canadian population. Anesthesia progress, 52 (1), 3-11 PMID: 15859442

Gatchel RJ, Ingersoll BD, Bowman L, Robertson MC, & Walker C (1983). The prevalence of dental fear and avoidance: a recent survey study. Journal of the American Dental Association (1939), 107 (4), 609-10 PMID: 6579095

Gatchel RJ (1989). The prevalence of dental fear and avoidance: expanded adult and recent adolescent surveys. Journal of the American Dental Association (1939), 118 (5), 591-3 PMID: 2785546

Oosterink FM, de Jongh A, & Hoogstraten J (2009). Prevalence of dental fear and phobia relative to other fear and phobia subtypes. European journal of oral sciences, 117 (2), 135-43 PMID: 19320722

Wong HM, Mak CM, & Xu YF (2011). A four-part setting on examining the anxiety-provoking capacity of the sound of dental equipment. Noise & health, 13 (55), 385-91 PMID: 22122954

Image via Zoya Kriminskaya / Shutterstock.

Remembering Henry Molaison
Mon, 03 Mar 2014

Henry Gustav Molaison (1926-2008) was perhaps the best-known and most studied patient in the history of neuroscience. Henry was the subject of a scientific article that became one of the most cited in the history of the medical literature.

At around the age of ten, Henry began having epileptic seizures, which became more severe over time, compromising his health, school performance, and social life. As an adult, Henry’s ability to work and function independently was severely impaired by his seizures in spite of taking high doses of anticonvulsant medications. In an effort to alleviate Henry’s seizures, neurosurgeon William Beecher Scoville performed an experimental brain operation on Henry, then 27, to remove portions of his medial temporal lobes, including a large portion of the hippocampus, which may have been the source of his seizures.

The article showed that, while Henry’s seizures were reduced in frequency, he acquired unexpected amnesia after his surgery. Henry was a normal, intelligent young man who had no memory impairments prior to his surgery. His acquired amnesia, therefore, was attributed to the surgical removal of his medial temporal lobes. This would prove to be a medical breakthrough, since the neurological substrate of memory in the brain was unknown at the time of Henry’s operation.

Thanks to Henry’s willingness to undergo extensive scientific testing, the modern era of memory research essentially began with evidence from his case. He was unexpectedly unable to retain new information, a condition known as anterograde amnesia, in which consolidation, the process that transfers short-term memories into long-term storage, is prevented. Reminiscent of the main character in the movie Memento, Henry was able to hold information in his mind through rehearsal, but could not store that information in long-term memory. Therefore, for all practical purposes, Henry lived only in the present tense. For example, Henry could not recognize members of the hospital staff who cared for him extensively during his stay. In fact, from recorded interviews with Henry, it is clear that he could not remember what he had eaten for lunch on any given day, or identify the current President of the United States.

Prior to his death from respiratory failure on December 2, 2008 in a Connecticut nursing home at the age of 82, he was popularly known as Patient H.M., Henry M., or, simply, H.M. Shortly before Henry’s death, his guardians authorized the release of audio recordings made of him in the early 1990s, which were made available online by National Public Radio. These audio recordings represented the first opportunity for the general public to hear his voice, and poignantly captured the essence of his existence, which included discernible optimism and good will, all in spite of his profound surgery-induced memory impairments.

After Henry’s death, his full name was made publicly available and the story of his life was published by neuroscientist Suzanne Corkin, Ph.D., Professor of Neuroscience, Emerita, at the Massachusetts Institute of Technology, in her book, Permanent Present Tense: The Unforgettable Life of the Amnesic Patient, H.M., which details Henry’s remarkable contributions to neuroscience.

Henry’s profound memory impairment had an enormously negative impact on his quality of life. Although Henry’s semantic memory (conscious recollections of facts and general knowledge about the world) for the years before his surgery was preserved, especially that for major world events, he was unable to recall any autobiographical memories from that time. In spite of this, Henry’s personality, language, attention, intellectual abilities, and short-term memory all remained unaffected. In fact, based on tests performed ten months after his surgery, Henry’s IQ was above average. Interestingly, further testing revealed that Henry had a number of preserved learning and memory functions (including normal performance in motor skill learning, perceptual learning, and visuoperceptual priming), even while claiming he could not recall these learning experiences and, therefore, lacked that declarative knowledge (“knowing that”). This revealed, for the first time, that non-declarative or procedural learning (“knowing how”), which was normal in Henry, relied on memory circuits separate from those in the medial temporal lobe and that it did not require conscious memory processes.

Since brain imaging technology was not available at the time of Henry’s surgery, the nature and extent of his brain lesions could not be visualized until much later. Computerized tomography (CT) scans of Henry’s brain were first published in 1984, although they did not clearly reveal the nature and extent of tissue damage in his medial temporal lobes. Later, in the 1990s, magnetic resonance imaging (MRI) indicated that his lesions included most of the amygdaloid complex and entorhinal cortex, as well as a large portion of the hippocampal formation. These MRI scans, however, did not reveal the exact anatomical boundaries of the lesions in Henry’s medial temporal lobes.

After Henry’s death, his brain was donated to science and sent to The Brain Observatory at the University of California at San Diego, where, as part of Project HM, it could be expertly cut and simultaneously recorded in a series of high resolution neuroanatomical images. From these images, it was possible to construct a detailed microscopic level mapping and to make 3D measurements from a digital model of Henry’s brain.

A research team, led by neuroanatomist Jacopo Annese, Ph.D., Founder and Director of The Brain Observatory, and Matthew Frosch, M.D., Ph.D., Director of Neuropathology Service at Massachusetts General Hospital, performed histological sectioning and digital 3D reconstruction of Henry’s brain. Project H.M., funded by the National Science Foundation, Dana Foundation, National Eye Institute, and National Institute of Mental Health, clearly delineated the exact anatomical locations of Henry’s brain lesions for the first time.

The findings, published online in Nature Communications, were based on 2,401 digital images of Henry’s brain, which was cut in an uninterrupted 53-hour procedure streamed live on the internet on the first anniversary of Henry’s death. “I didn’t sleep for three days,” said Dr. Annese. “It was probably the most engaging, most exciting thing I’ve ever done.”

The results show that there was complete removal of Henry’s anterior hippocampus and most of his entorhinal cortex, although a substantial portion of presumably non-functional hippocampal tissue remained. “These initial results confirm what we already knew about the size and shape of Henry’s brain lesions,” said Dr. Corkin. In addition, there was a near-complete removal of the amygdala, which may explain Henry’s dampened expression of emotions, poor motivation, lack of initiative, and inability to identify internal states such as pain, hunger, and thirst. Moreover, the results showed that Henry had a small lesion in the left frontal lobe of unknown origin that had never previously been identified. Whether this newly discovered frontal lobe lesion had any influence on Henry’s behavior is currently unknown.

Also noteworthy was the visualization of atrophy of the cerebellum, which had been previously identified via brain imaging and which Henry acquired as a side effect of long-term use of Dilantin, part of Henry’s seizure management both before and after his surgery. Lastly, the results also showed that Henry had diffuse damage to deep white matter (insulation for parts of brain cells) underlying the removed medial temporal lobes, which appeared to be a recent age-related phenomenon attributable to medical conditions, including hypertension.

Henry Molaison lived solely in the present tense for 55 years, during which time he taught us more about memory and the brain than we had ever thought possible. By all accounts, Henry enjoyed participating in the research being conducted on him and expressed happiness, saying: “What they find out about me helps them to help other people.”

Undoubtedly, Henry’s contributions to neuroscience will never be forgotten, but rather will last forever. “Henry’s disability, a tremendous cost to him and his family, became science’s gain,” said Dr. Corkin.


Annese J, Schenker-Ahmed NM, Bartsch H, Maechler P, Sheh C, Thomas N, Kayano J, Ghatan A, Bresler N, Frosch MP, Klaming R, & Corkin S (2014). Postmortem examination of patient H.M.’s brain based on histological sectioning and digital 3D reconstruction. Nature Communications, 5. PMID: 24473151

Buchen L (2009). Famous brain set to go under the knife. Nature, 462 (7272) PMID: 19940891

Corkin S. (2013). Permanent Present Tense: The Unforgettable Life of the Amnesic Patient, H.M. Jackson, TN: Basic Books.

Corkin S (2002). What’s new with the amnesic patient H.M.? Nature reviews. Neuroscience, 3 (2), 153-60 PMID: 11836523

Corkin S, Amaral DG, González RG, Johnson KA, & Hyman BT (1997). H. M.’s medial temporal lobe lesion: findings from magnetic resonance imaging. The Journal of neuroscience : the official journal of the Society for Neuroscience, 17 (10), 3964-79 PMID: 9133414


Hughes, V. (2014, January 28). After death, H.M.’s brain uploaded to the cloud. National Geographic. Retrieved February 2, 2014.

Newhouse, B. (2007, February 24). H.M.’s brain and the history of memory. National Public Radio. Retrieved February 2, 2014.

Scoville WB, & Milner B (1957). Loss of recent memory after bilateral hippocampal lesions. Journal of Neurology, Neurosurgery, and Psychiatry, 20 (1), 11-21. PMID: 13406589

Thomson, H. (2014, January 28). Neuroscience’s most famous brain is reconstructed. New Scientist. Retrieved February 2, 2014.

Image via Oliver Sved / Shutterstock.

The Curious Case of Robert Ley’s Brain
http://brainblogger.com/2013/10/09/the-curious-case-of-robert-leys-brain/
Wed, 09 Oct 2013 11:00:15 +0000

Six years ago, while researching the life of an American psychiatrist who studied the top Nazi leaders during their imprisonment and trial in Nuremberg, I came across a small box among the physician’s possessions. The box held a set of glass photographic transparencies, with each slide showing a cross-section of a brain. Labels on the slides identified the brain’s former owner as Robert Ley.

Ley’s name had often come up in my research. From 1933 until the end of World War II, he headed the German Labor Front, a Nazi governmental department that directed the working lives of the Third Reich’s citizens. How images of his brain had ended up mixed in with the personal and professional papers of the psychiatrist, Dr. Douglas M. Kelley, I could not imagine.

Over time I found out. And the story I pieced together on the fate of Robert Ley’s brain reveals much about the interpretation of the psychological testing that Kelley conducted on Ley and the other Nazis.

A U.S. Army major during the summer of 1945, Kelley had arrived in Nuremberg with orders to appraise the mental fitness of the Nazi leaders to stand trial for war crimes and crimes against humanity. Beyond this assignment, however, Kelley imposed on himself more challenging duties. Given unrestricted access to the men widely considered the worst criminals of the twentieth century, he hoped to find a common psychological thread running through the prisoners — a “Nazi personality” that could account for their heinous misdeeds. If Kelley could identify a psychiatric disorder or set of psychological traits that the Nazis shared, he might be able to isolate other people among us capable of committing horrific crimes.

To assess the Nazis, Kelley intensively interviewed the men in their prison cells, but he also used a battery of psychological tests that had recently gained prominence. He relied most heavily on the Rorschach inkblot test, an assessment he had championed in the U.S. starting in the 1930s. The Rorschach test presents subjects with abstract inkblot images. Projecting into the images their fantasies and needs, the subjects explain what they see. Kelley was one of the best trained and most talented of the American Rorschach interpreters.

Nearly all of the 22 top Nazis indicted to stand trial before the International Military Tribunal at Nuremberg — including Hermann Göring, long Hitler’s second in command and the Third Reich leader that Kelley found the most intriguing — took the Rorschach. The most startling results came from Robert Ley, who misnamed colors, offered confused descriptions, and gave responses that lacked context and sense. Kelley advanced a diagnosis of brain damage in Ley’s frontal lobe, even though the prisoner’s physical exams had revealed no sign of neurological problems.

To Kelley, Ley’s frequent angry outbursts in prison and his illogical speech offered further evidence of frontal lobe injury. Alone among the indicted Nazis, Ley might have received from Kelley a declaration of mental incompetence, but the doctor had no way to be sure of his diagnosis.

No way, that is, until Ley committed suicide in his cell on October 24, 1945. He asphyxiated himself using the hem of a towel, the zipper of his jacket, and the pipe of his toilet. Kelley responded by declaring that Ley “did me personally a particular favor, because his was the one brain that I suspected would have organic damage.” Kelley asked a colleague to remove the brain from Ley’s body and smuggled it out of Nuremberg and into the hands of a friend, Webb Haymaker, a renowned neuropathologist at the Army Institute of Pathology in Washington, D.C.

Kelley asked Haymaker to examine the brain for signs of the frontal lobe damage that the psychiatrist had diagnosed. Haymaker did so and found “a long-standing degenerative process of the frontal lobes” in the region that Kelley had predicted was injured. In the process, Haymaker shot the photographs that I found with Kelley’s possessions more than 60 years later.

Kelley rejoiced, a celebration that proved premature. Haymaker wasn’t yet quite finished with Ley’s brain. Two years later, seeking another point of view, he sent samples of the organ to pathologists at the Langley Porter Clinic in San Francisco. The examination there produced no clear evidence of damage to the frontal lobe. In a letter to Kelley, Haymaker broke the bad news that Ley’s brain abnormalities “were of a lesser scope than we had at first believed. Personally, I think maybe we had better let the whole thing lie buried, as the degree of change [in the brain] could be subject to a difference of opinion.”

By then, Kelley’s study of the German leaders had already caused the psychiatrist enough distress. The Nazis shared no significant psychological traits and were normal, Kelley had concluded. There was no Nazi personality. Struggling to understand this verdict, Kelley redirected his energies to criminology and fell into a downward spiral of alcoholism, workaholism, and eruptions of anger. He took his own life in 1958 by swallowing cyanide, just as his favorite subject Göring had done a dozen years earlier in Nuremberg.


Kelley, Douglas M. 22 Cells in Nuremberg; a Psychiatrist Examines the Nazi Criminals. New York: Greenberg, 1947.

Zillmer, Eric A., Molly Harrower, Barry A. Ritzler, and Robert P. Archer. The Quest for the Nazi Personality: A Psychological Investigation of Nazi War Criminals. Routledge, 1995.

Image via Wikimedia Commons.

Human Dissection, Part 2 – Murderers, Body Snatchers and Burkers
http://brainblogger.com/2011/09/17/human-dissection-part-2-murderers-body-snatchers-and-burkers/
Sat, 17 Sep 2011 14:25:28 +0000

The history of medical students using human cadavers for dissection is a long and choppy one (no pun intended). Before Christianity, the mutilation and use of human corpses was widespread; it is common knowledge that ancient Egyptians mummified their dead, dissecting and preserving specific organs. After Christianity became a widespread influence, however, dissecting human cadavers for study was considered taboo. The subsequent 1,700 years of European history with regard to human dissection ranged from the illegal snatching of corpses, to using only the cadavers of executed criminals as subjects of study, and finally, to making human dissection a regulated scientific practice.

Unlike France, which made it easier for medically minded men to obtain human bodies for dissection, Great Britain had a much more difficult time obtaining and normalizing the practice of using real human cadavers for study.

The Murder Act

In 1752, Great Britain passed “an Act for better preventing the horrid Crime of Murder.” Otherwise known as the Murder Act, it essentially legalized the dissection of human bodies for medical study. More specifically, it gave the bodies of executed murderers to medical universities so that they could be dissected for medical examination. The Murder Act served dual purposes: it was an extra deterrent to murder, as no one wanted to be dissected, and it also created a legal supply of fresh human cadavers for medical students to study.

The executions of murderers were usually held at eight o’clock on Monday mornings, after which the body of said murderer would be left hanging for an hour for good measure, ensuring that he (or she) was actually dead. The body would then be delivered to London’s Royal College of Surgeons, whose College Master, dressed in his full regalia, would receive it and take it into the college for dissection and study.

At this time, the Royal College of Surgeons held the only legal right to human bodies, and as the supply of murderers’ bodies was relatively scarce, there were few legal avenues for obtaining human cadavers. Because of this, the College often performed a “proper examination” consisting of little more than a cut over the sternum, and then donated the rest of the body to other London hospitals, private schools of anatomy, or well-connected surgeons and students. This was one way in which the Royal College remained in favor with London’s most powerful surgeons.

Body Snatchers and Burkers

Fortunately (or perhaps not so fortunately), very few executions yielded bodies eligible for dissection, averaging about a dozen a year. This meant that medical students and surgeons had to find other means of obtaining human remains to study: raiding the gravesites of the recently deceased, obtaining unclaimed bodies from almshouses or hospitals, or purchasing freshly dead bodies from body snatchers and murderers.

In the 1820s, two men made a killing (pun intended) by murdering people and selling their bodies to medical professionals. In 1828 it was discovered that William Burke and William Hare, two Irishmen living in Edinburgh, had murdered at least sixteen men and women and sold their bodies for dissection by doctors at the medical school of Dr. Robert Knox.

The two men had carefully crafted a method of murder that went completely undetected by the medically minded men to whom they sold their prey. They would douse their unsuspecting victim with a good amount of liquor and, once the victim had passed out drunk, would suffocate him or her. Their method was so clean that the doctor dissecting the body would have no way of knowing how the person had died. They would then pack the body in a tea-chest and carry it through the streets of Edinburgh to the school of Dr. Knox for sale.

By the time the two criminals were caught, it was found that Knox had personally bought the bodies of sixteen of Burke and Hare’s victims.

Burke and Hare were not the only ones to follow this model, however; further rings of murderers were later uncovered in London, each of which sold the bodies of its victims to doctors for medical study.

An Ironic Twist of Fate (or, A Fitting Punishment)

The most interesting (and fittingly ironic) part of the history of Burking, as the practice came to be called, was the fate of its founders. Once found out by the authorities, Hare sold out his partner in crime and escaped the noose by providing enough evidence to convict Burke of murder. Burke was not so lucky: he was found guilty of murder and sentenced to both death and dissection.

The hanging and subsequent dissection of Burke turned out to be a festive affair, as tens of thousands of spectators jostled for a peek. Small pieces of Burke’s body were handed out as souvenirs at the dissection, and a purse was made from his skin. This Burke-skin purse is still on display at the Police Museum in Edinburgh.

The Anatomy Act

As a direct result of the Burke and Hare scandal, and the ensuing episodes of Burking in London, the British Parliament passed an Act for Regulating Schools of Anatomy in 1832. In addition to rescinding the Murder Act, which had allowed the dissection of executed murderers, the new act provided additional legal ways for doctors to obtain human bodies for dissection. It also established a system of licensing physicians for “anatomical examinations” and specified where such examinations could take place.

After the implementation of the Anatomy Act, institutions in legal possession of human bodies, such as hospitals and workhouses, could make those bodies available for dissection. Essentially, the act made it legal to claim for dissection any body that had gone unclaimed for more than 48 hours.

After this legislation, the heretofore theatrical ritual of hanging a murderer, delivering his body to the Royal College of Surgeons, and publicly displaying its dissection gave way to a regulated affair. The British Anatomy Act of 1832 transformed dissection from a public punishment for crime into a regulated, scientific practice.

By the middle of the 19th century, the “necessary inhumanity” of human dissection for medical study had transitioned from a moral dilemma, to a punishment for vicious crimes, to a purely scientific practice. That said, there were (and still are) serious moral, ethical, and legal issues surrounding the use of human cadavers for study, and the use of human bodies for scientific research remains an important issue in contemporary medicine.


MacDonald, Helen. Human Remains: Dissection and Its Histories. London: Yale University Press, 2005.

Warner, John H., and Edmonson, James M. Dissection: Photographs of a Rite of Passage in American Medicine: 1880-1930. New York: Blast Books, 2009.

Human Dissection – From Galen to the Great Revelations of Andreas Vesalius
http://brainblogger.com/2011/08/20/human-dissection-from-galen-to-the-great-revelations-of-andreas-vesalius/
Sat, 20 Aug 2011 12:00:28 +0000

Humans have been cutting open cadavers and dissecting corpses almost since the beginning of recorded history. Ancient Egyptians went to great lengths to mummify their dead, including cutting open bodies, dissecting out organs, and preserving remains. Following closely in their footsteps, the ancient Greeks also pursued human dissection, though in a much more scientific vein. Rather than viewing it as an immoral desecration of the human body, the Greeks thought of dissection as an extension of the empirical nature of science.

Two early Greek physicians, Erasistratus and Herophilus, made the first systematic, scientific explorations of the human body; they are now considered the first physiologist and the founder of human anatomy, respectively. Together, these two doctors advanced the study of the interior of the human body, once a sacrosanct mystery, into a field of scientific inquiry. Herophilus dissected the entire human body and broke with the authority of the time, Aristotle, when he claimed that consciousness resided in the brain rather than in the heart. Erasistratus explained the workings of human organs in mechanical terms.

Unfortunately, the spark of empirical study of human anatomy that these two physicians should have set off never caught, as their two schools reverted to bickering over theoretical disputes. What remained of the flickering fire of human dissection was snuffed out completely by the burning of the library of Alexandria and the widespread introduction of Christianity, after which it became impossible to dissect human bodies anywhere in the Hellenistic world. This marked a great transition in the study of human anatomy: for hundreds of years, the European world valued the sanctity of the church more than scientific inquiry.

Galen’s Anatomical Influence

The first of the great anatomists was Galen of Pergamon (AD 130-200), who made vast advances in the understanding of the heart, the nervous system, and the mechanics of breathing. Because human dissection was forbidden, he performed many of his dissections on Barbary apes, which he considered similar enough to the human form. The system of anatomy he developed was so influential that it remained in use for the next 1,400 years. Galen was still the authority in the 16th century, when a young and rebellious physician began the practice of using real human bodies to study the inner workings of the human form.

Enter Andreas Vesalius

Vesalius, who came from a line of four prominent family physicians, began as a young and precocious anatomy student. As a child he would often catch and dissect small animals, and later, as a medical student, he went to great lengths to obtain human remains to study. At age 18 he entered the University of Paris, which strictly adhered to the antiquated works of Hippocrates and Galen; the medical professors there thought it beneath them to perform actual dissections. During demonstrations, the professor would lecture from on high while a barber-surgeon did the actual cutting on the dissection floor.

Unlike Britain, where only the bodies of executed murderers could be used for dissection by medical men, France made it much easier for medically minded men to obtain bodies to study. This did not mean, however, that lowly students such as Andreas Vesalius had direct access to any of these bodies.

Vesalius and other like-minded anatomy students would raid the gallows of Paris for half-decomposed bodies and skeletons to dissect. Sometimes they would find the courage to venture outside the walls of Paris, braving the feral dogs and the stench, to steal cadavers from the mound of Montfaucon, where the bodies of executed criminals were hung until they disintegrated.

Rather than considering dissection beneath his prestige as a doctor, Vesalius prided himself on being the only physician since the ancients to study human anatomy directly. During only his second anatomical lecture, Vesalius stepped onto the dissecting floor, took the knife away from the barber-surgeon, and began cutting into the cadaver himself, demonstrating his great skill with the knife.

Vesalius’ Rise

His professors quickly noticed his great knowledge and ability, and by the age of 22 he was giving his own anatomical lectures, all of which centered on a dissection. Some of his subjects were animals, but more often than not they were human cadavers. He also suspended a skeleton above the dissecting table during his lectures, and taught that the skeleton was the foundation of the body.

Similar to the influential works of Galen, Vesalius’ work on human anatomy revolutionized the scientific world. The publication of his book De humani corporis fabrica (On the Fabric of the Human Body) stands as a monument in the history of science and medicine. Whereas his contemporaries relied on the antiquated accounts of Galen, who dissected animals rather than humans, Vesalius relied on the actual human body to inform his theories.

Vesalius’ work provided the first accurate description of the internal structures and workings of the human body and, more importantly, revived the use of the scientific method for studying human anatomy. The rise of Christianity had supplanted hands-on, empirical study of the human body with a philosophical reliance on a Supreme Intellect: every human body part was taken to be a product of the Supreme Intellect’s design, whether or not this coincided with what actually lay on the dissecting table.

Vesalius, on the other hand, could not support the ancient writings of Galen, which relied on this idea of Supreme design. Although he revered Galen highly, Vesalius often found that his study of the human form did not fit the descriptions Galen provided, which frequently matched the anatomies of dogs, apes, or sheep instead. He eventually found over 200 such discrepancies and publicly announced his break from the Galenic tradition.

A Revolutionary Physician

De humani corporis fabrica, published in 1543, was a turning point in the history of modern medicine. For the first time, the understanding of medicine and the treatment of disease were rooted in an accurate representation of the human body. The book revolutionized the medical world: like the findings of Copernicus and Galileo, Vesalius’ work helped spur an empirically based, scientific study of the world around us.

Like his fellow revolutionary scientists, Vesalius saw his masterpiece met with harsh criticism. Many of the criticisms understandably came from the church, but the most strident came from Galenic anatomists. These critics insisted that Galen was in no way incorrect: if the human anatomy of which he wrote differed from that demonstrated by Vesalius, it was because the human body had changed in the intervening centuries.

In response to the harsh criticism of his work, Vesalius vowed never again to bring forth truth to an ungrateful world. In the same year that he published De humani, he burned his remaining unpublished works, further criticisms of Galen, and preparations for future studies. He left academic medicine, married, and lived out the rest of his life quietly as a court physician.

Even though Vesalius abandoned further study of human anatomy, before he died he recognized the great contributions he had made to the scientific world. He understood that his revelations represented an awakening of inquiry into the human body and a reliance on facts rather than adherence to an antiquated text.

The remainder of the history of human dissection is just as rocky. Although 16th-century France was open-minded about the use of human cadavers for scientific inquiry, the rest of Europe was not so revolutionary. Great Britain had its own tradition of illegal trade in dead bodies, and even the United States had a hard time accepting the idea that human bodies should be used for scientific study.

Continue to Part 2 – Murderers, Body Snatchers and Burkers.


Adler, Robert E. Medical Firsts: From Hippocrates to the Human Genome. Hoboken, New Jersey: John Wiley &Sons, Inc., 2004.

MacDonald, Helen. Human Remains: Dissection and Its Histories. London: Yale University Press, 2005.

Extracting the Stone of Madness – The Search for the Cure to Insanity
http://brainblogger.com/2011/05/27/extracting-the-stone-of-madness-the-search-for-the-cure-to-insanity/
Fri, 27 May 2011 12:00:24 +0000

Both psychiatry and psychology have their roots in ancient practices and belief systems for explaining and treating insanity and emotional disorders. The ancient Egyptians and Mesopotamians, more specifically, believed that all diseases, including mental ones, were the result of demonic influences on the soul.

Later, the Greeks and Romans tried to create a rational approach to mental disorders by labeling them mental illnesses and writing texts examining their many manifestations. Nevertheless, most Greeks and Romans still believed that mental problems were caused by evil spirits or by the wrath of the Furies, the Greek goddesses of vengeance.

Trepanning also has its roots in the ancient world. Scientists have found many prehistoric skulls indicating that trepanning, or cutting a hole into a person’s skull, has been practiced since around 10,000 BC. These skulls have been found all over the world, in isolated geographic locations, indicating that prehistoric populations practiced trepanning independently of one another. In other words, since nearly the beginning of human history, people have been cutting into the head and opening small holes in the skull.

But the question remains: why? Some early trepanned skulls show signs of trauma, so the procedure may have been a treatment for injury. It may also have been a healing method for headaches or epilepsy, intended to relieve pressure within the skull. Other theories hold that by cutting a hole in the top of one’s skull, early populations believed they could allow spirits to enter the soul and grant the recipient magical powers.

On the other side of the coin, however, ancient trepanning may have been performed as a way to cure insanity, as it was in the Middle Ages under the guise of cutting the “stones of madness” out of an individual’s skull.

Cutting out Stones

After the Greco-Roman world was conquered by Christianity, insanity and mental illnesses went practically untreated. Those with mental disorders were compassionately cared for physically, but their psychological needs and problems were not addressed. In the Early Middle Ages, this compassionate attitude changed: those with mental illnesses were no longer seen as poor souls needing care. Mental illness was once again, as it had been hundreds of years earlier, seen as a direct result of demonic possession of the soul and evil influences.

The prevailing theory that insanity was caused by evil demons made it necessary to treat, if not cure, the mentally ill. In the 15th, 16th, and even as late as the 17th centuries, a group of charlatans came to the forefront claiming to cure diseases of the mind. These quacks were often men and women untrained in the medical sciences: astrologers, chemists, monks, nuns, alchemists, jugglers, and street peddlers, all of whom claimed to have the cure for mental illness.

Mental illness, they said, was the result of a small stone inside the brain, and they could cure the disorder by trepanning the skull and letting out the stone. It made sense, up to a point, that one could produce mental changes through physical intervention in the brain. Thus developed the myth of a stone of madness, the removal of which would cure the insanity.

The great artist Hieronymus Bosch immortalized the scene of a physician trepanning a skull to remove stones in his painting The Cure of Folly, otherwise known as The Extraction of the Stone of Madness. The painting depicts the scene with dry wit and a sarcastic view of the removal of the stone of madness: the “doctor” in the scene wears a funnel hat, an early symbol of madness, indicating that he too is insane, as he trepans the skull of a patient to retrieve the stone within.

The rest of the painting is rife with symbols of folly, foolery, madness, and insanity. As a whole, it depicts not only the insanity of cutting the “stone of madness” out of a patient’s skull but also the reality of the situation. Bosch finished the painting around 1494, when charlatans and quacks were coming out in droves to cut into people’s heads, and it is a sharp critique of the ridiculousness of it all.

Link to More Invasive Procedures

Whether ridiculous or not, the practice of removing stones from the heads of the insane continued as late as the 20th century, with practitioners producing a small stone after the procedure and claiming to have removed it from within the brain.

If stones were still being removed as late as the 20th century, it is not hard to see how the practice of lobotomy could have developed. The theory that mental illness had a physical basis, rather than being the result of demonic possession or evil influences, was bound to produce a more invasive form of cure for the insane. Lobotomy, which sought to cure mental illness through direct intervention in, and damage of, brain tissue, was a logical extension of stone-cutting and trepanning.

Luckily, not all physicians of the mind went the route of the stone-cutters (or the lobotomizers, for that matter). Mental health care developed into the fields of psychology and psychiatry, each focused on the mental health of the individual, though each treating the patient in different ways.

But this does not mean that trepanning is a dead practice. It is commonly used in neuroscience research and in surgical procedures for injured patients, and some modern alternative-therapy groups advocate trepanning as a cure for a number of mental maladies. Its recurrent use continues a human tradition begun over 10,000 years ago with the earliest humans. Whether to relieve headache pressure, let demons out of the skull, or remove the stone of madness, trepanning is a fundamental part of our history of using medicine to better the human condition.


Franck, Irene M., and David M. Brownstone. Healers. New York: Facts on File, 1989.

Hollingham, Richard. Blood and Guts: A History of Surgery. New York: St. Martin’s Press, 2008.

Shorter, Edward. A History of Psychiatry: From the Era of the Asylum to the Age of Prozac. John Wiley & Sons, 1997.

The Strangling Angel of Children – Birth of Endotracheal Intubation (May 12, 2011)

Up until the beginning of the 1920s in the United States, and until much later in many parts of the world, diphtheria was a leading cause of death in children. Referred to as “the strangling angel of children,” it produced large outbreaks in Europe and America in the 18th century, and more recently in the 1990s in Russia and Eastern Europe. On the western frontier of the US in the 19th century, illness was common, and epidemics of cholera, smallpox, and typhoid fever came recurrently; diphtheria was second only to malaria in taking the lives of young children. The disease was awful, causing pain, swelling of the neck and lymph nodes, and eventually suffocation and death.

To allow a patient to breathe, tracheotomies were common. This technique, which was fatal in nearly a quarter of all cases, was gradually replaced with endotracheal intubation, or the act of inserting a breathing tube directly into the patient’s trachea through the mouth. This method was so effective that it is still practiced today, in generally the same form as almost 100 years ago.

Diphtheria and Its Victims

Early scientists discovered the bacterium that causes diphtheria in the 1880s, and an antitoxin for the disease had been developed as early as the 1890s. Although the development and administration of a vaccine against diphtheria significantly decreased the disease’s prevalence, it remains a stealthy killer of young children. In the United States today, diphtheria is rare, with no reported cases since 2003, but it remains endemic in many places around the world.

Diphtheria is an infectious disease, caused by a contagious bacterium, Corynebacterium diphtheriae, which primarily affects the mucous membranes of the respiratory tract, as well as the skin, and other sensitive areas of the body, including ears, eyes, and genital areas.

The symptoms of respiratory diphtheria include sore throat, fever, lethargy, difficulty swallowing, and difficulty breathing, and usually appear after a two to five day incubation period. As the disease worsens, a pseudomembrane often develops over the tonsils and nasopharynx. In other words, the bacterium collects and coagulates into a film, which covers the lining of the throat, causing swelling of the neck and difficulty breathing.

As the pseudomembrane gets larger, swelling worsens and lymph nodes become enlarged, giving the neck a “bull-neck” appearance. In untreated cases, this pseudomembrane will become large enough to obstruct the larynx and the trachea, slowly cutting off the airway, causing suffocation and death.

Unfortunately, diphtheria often affected the youngest of children, who were weaker and harder to treat. The pseudomembrane resulting from the infection would make it impossible for the patient to swallow, speak, and later, breathe. Early surgeons had made use of the tracheotomy, which was a small incision at the base of the throat. This allowed for a curved hollow tube to be inserted into the airway to allow the patient to breathe.

Tracheotomies bypassed the swelling of the pseudomembrane in the neck and opened an airway to allow the patient to breathe. They were, however, difficult, painful, and due to the risk of infection, often fatal.

From Tracheotomy to Intubation

Because diphtheria was a childhood disease, it is fitting that a pediatrician invented a better method of allowing young patients to breathe. In 1858, Eugene Bouchut developed the method of endotracheal intubation, which bypassed the diphtheria pseudomembrane and opened the airway. The method was less invasive than the tracheotomy and resulted in a higher survival rate. Bouchut’s first experiment with intubation consisted of inserting a small metal tube into the larynx and simply leaving it in the airway until the pseudomembrane had resolved.

Bouchut’s invention did wonders for the medical world. Although it was at first negatively received, its use in medical procedures other than treating childhood cases of diphtheria was soon obvious. In 1878, endotracheal intubation was used in a surgery to remove vocal cord polyps, and later in treating patients with glottic edema.

After the turn of the century, intubation was used not only in patients requiring an airway, but also as a way of administering general anesthesia. In 1919, Ivan Magill was the first to use endotracheal intubation, with a rubber tube rather than a metal pipe, to pass anesthetic gas directly into a patient’s lungs. Humorously enough, this was more than an easy way to administer anesthesia: before intubation, gas anesthesia pipes were prone to leak, and surgeons, leaning close to anesthetic gas for hours at a time, would often fall asleep on top of their patients.

Endotracheal intubation was also used to administer drugs directly into the patient’s lungs. Mercury, for example, was administered in the form of a vapor, and was used as a treatment for membranous croup and ulcers of the mouth and throat, as well as for diphtheria.

Sometimes good things come from bad, as in the case of diphtheria. An awful disease, which still causes hundreds, if not thousands, of deaths each year, led to the invention of a valuable piece of technology. Endotracheal intubation is now a common procedure in emergency rooms and operating rooms throughout the world; it is used in nearly all major surgeries and is a basic form of emergency care. Diphtheria, no longer the “strangling angel of children,” has receded to an afterthought in American life.


Bethard, Wayne. Lotions, Potions, and Deadly Elixirs: Frontier Medicine in America. Lanham, MD: Taylor Trade Publishing, 2004.

Dary, David. Frontier Medicine: From the Atlantic to the Pacific, 1492-1941. New York: Alfred A. Knopf, 2008.

Hollingham, Richard. Blood and Guts: A History of Surgery. New York: St. Martin’s Press, 2008.

Schwartz, Seymour I. Gifted Hands: America’s Most Significant Contributions to Surgery. New York: Prometheus Books, 2009.

From Haircuts to Hangnails – The Barber-Surgeon (May 6, 2011)

Imagine your monthly beauty routine. Perhaps you go to the salon for a manicure and pedicure, or to the hairstylist for a cut and dye. Every six months you see a dentist to have your teeth cleaned and examined, and once a year you visit the doctor for a physical exam. Three hundred years ago, your routine would have been much the same, except for one thing: it would all have been done at the barbershop.

Barbers in the modern period are known to do mainly one thing: cut hair. For much of the last hundred and fifty years, their red and white striped barber poles signified their ability to produce a good clean shave and a quick trim. This was not always the case, however.

Up until the 19th century, barbers were generally referred to as barber-surgeons, and they were called upon to perform a wide variety of tasks. They treated and extracted teeth, branded slaves, created ritual tattoos or scars, cut out gallstones and hangnails, set fractures, gave enemas, and lanced abscesses. Whereas physicians of the age examined urine or studied the stars to determine a patient’s diagnosis, barber-surgeons experienced their patients up close. Many patients would go to their local barber for semi-annual bloodletting, much as you take your car in for a periodic oil change.

Barbers through the Ages

From ancient Egypt through Roman times and into the Middle Ages, barbers performed far more than simple haircuts and services of vanity. They were called on to perform minor surgical operations, pull teeth, and embalm the dead. Their many duties made them the surgeons of the day.

The barbering occupation began in ancient Egypt, where both men and women shaved their heads and wore wigs, and higher-ranking officials often shaved their entire bodies. Egypt’s wealthy citizens and royalty were often tended to by personal slaves, who dressed their wigs, cleaned, and shaved their bodies. Gradually a working class of independent barbers developed, who would perform these duties for all members of society. Personal barbers would also perform additional duties, such as cleaning ears and examining teeth.

The Greeks, in their heyday, wore long hair and curled beards, which required much tending. Alexander the Great, fearing that enemies would use long hair as handles in battle, encouraged his men to cut their hair and shave their beards, which required a skilled set of haircutters. These expert Greek barbers spread along with the widening influence of the Greek state, eventually entering Roman territory, where they set up stalls in the city streets.

Many settled communities around the world also employed skilled barber-surgeons. Cortez encountered barbers upon entering Tenochtitlan; European colonists relied on the surgical abilities of the Native American populations they encountered; and Chinese traveling barbers wandered the streets, ringing a bell to announce their presence. Because barbers kept an array of sharp metal tools and were more affordable than the local physician, they were often called upon to perform a wide range of surgical tasks.

Barbers differed greatly from the medicine man or shaman, who used magic or religion to heal patients. Surgery was considered a “lesser art,” not to be performed by the magical priest-physicians who governed the mystical connection between soul and body. But this did not diminish the barbers’ presence or usefulness.

In the ancient Mayan civilization, barbers were called upon to create ritual tattoos and scars. The ancient Chinese used them to perform the castrations that produced eunuchs. They gelded animals, assisted midwives, and performed circumcisions. Their accessibility and skill with precise instruments often made them the obvious choice for surgical procedures.

From Barbers to Barber-Surgeons

After the fall of the Roman Empire, barbers were a staple of monastery life. Monks required barbers to shave their faces and their tonsures, the round area on the top of the head. At this time, physicians were forbidden to perform surgical procedures, as the body was considered holy and not to be violated by the hands of doctors. Monks, who also practiced as doctors, considered operations and surgical procedures dirty and beneath their dignity, and passed those responsibilities to barbers.

One of the keystones of the barber’s surgical duties was bloodletting. Bleeding was done for a number of reasons, but the basis of the idea was that by letting out the bad or morbid blood, the body would replace it with fresh, healthy blood. Patients were bled in many ways, including cupping and the use of leeches, but the most common was cutting a vein and letting the blood flow into a small basin.

As bleeding became one of the main responsibilities of the barbers, they came to signify their presence in the marketplace with a red and white striped pole, the colors reminiscent of the blood and rags used in bloodletting. This pole was usually capped with a small basin, used to symbolize the vessel with which they would collect the blood. Later, barbers placed bowls of blood in their shop windows, to indicate that they performed bloodletting services.

As they also often pulled teeth, barbers would string rows of teeth in front of their windows to alert potential customers to their services. In 1307, the people of London decried the bowls of blood sitting in barbers’ windowsills and passed a law requiring all fresh blood to be carried to the Thames.

In 1163, a papal decree forbade monks from shedding blood, and so all surgical tasks fell to the skilled barbers. In addition to bloodletting, barbers were called upon to perform almost all surgical and dental operations, as well as more unsavory work, such as embalming and autopsies.

During the 12th and 13th centuries, secular universities began to develop throughout Europe, and along with an increased study of medicine and anatomy came an increased study in surgery. This led to a split between academically trained surgeons and barber-surgeons, which was formalized in the 13th century. After this, academic surgeons signified their status by wearing long robes, and barber-surgeons by wearing short robes. Barber-surgeons were thus largely referred to as “surgeons of the short robe.”

Barbers of both the short and long robes coexisted precariously over the next few centuries. In France in 1361, barbers and surgeons combined to form a united guild, but more often than not they formed separate guilds. The Barbers’ Company, formed in 1462, merged with the surgeons’ guild in 1540. These guilds helped ensure quality of service by employing inspectors to verify barbers’ skills, and helped members compete with other craftsmen by negotiating contracts.

Certain barber-surgeons became very skilled at performing surgical procedures. Ambroise Paré, considered the father of modern surgery, was one such barber. Originally a wound-dresser at the Hôtel-Dieu in Paris, Paré made history with his unconventional handling of gunshot wounds and open injuries, and later rose to be surgeon to the royal family of France.

The Separation of Barber and Surgeon

Gradually, the split between barbers and surgeons became more severe, and in 1743 in France and 1745 in England, barber-surgeons who cut or shaved hair were no longer allowed to perform surgery. The Royal College of Surgeons was founded in England in 1800, and the last practicing barber-surgeon in England died in 1821.

Dentistry, which was another one of the many responsibilities of the barber-surgeon, was also gradually relegated to its own specialty. Surgeon-dentists were practicing as early as the 17th century.

Barbers, who had once performed a whole plethora of surgical procedures, were now primarily responsible for the care of a patron’s hair and nails. Increasingly in the 17th and 18th centuries, barbers became wigmakers for the European elite, some eventually splitting off into their own specialty as hairdressers.

Even so, the barber-surgeon’s skills remained in high demand as late as 1727, when John Gay penned his poem The Goat Without a Beard:

His pole, with pewter basins hung,
Black, rotten teeth in order strung,
Rang’d cups that in the window stood,
Lin’d with red rags, to look like blood,
Did well his threefold trade explain,
Who shav’d, drew teeth, and breath’d a vein.

It is hard to imagine going to the barbershop today to get a boil lanced or a tooth pulled, or for an occasional bloodletting, but for much of human history this was the case. As medicine and surgery advanced, so did the barbering profession. From haircuts to hangnails, they did it all.

The barbershop was the common ancestor of many different occupations today. Surgeons, dentists, tattooists, embalmers, doctors, hairdressers, wigmakers, manicurists, pedicurists, and more can all trace their lineage back to one common denominator: the barber-surgeon.


Dary, David. Frontier Medicine: From the Atlantic to the Pacific, 1492-1941. New York: Alfred A. Knopf, 2008.

Franck, Irene M., and David M. Brownstone. Healers. New York: Facts on File, 1989.

Hollingham, Richard. Blood and Guts: A History of Surgery. New York: St. Martin’s Press, 2008.

Medical Art Imitating Life (June 17, 2010)

The ideal human body shape has evolved, for better or worse, over the course of human history. Its depiction in art parallels society’s beliefs about what is and is not attractive and desirable. The representation of the human form has likewise undergone a metamorphosis in the oft-forgotten field of medical illustration. With the recent 150th anniversary of Henry Gray’s classic Anatomy, a look back at the growth and change of anatomical illustration showcases the paradigm shifts in beauty and human appeal.

The most primitive medical illustrations date back thousands of years, to crude drawings of hunters and prey with vital organs delineated with amazing accuracy. These drawings represented the most basic of needs: to kill animals with as much precision as possible. As the human race grew and learned, ancient cultures began to view medical illustration and anatomical drawing as a science, though one less exact and formal than medicine itself. Medical artists of ancient Greece and Rome valued prolific artistic production, and ultimately sought to represent the human body more as a pleasing form and less as a scientific tool for education and research. Given the conditions and circumstances of the day, Romans often set their medical illustrations in battle scenes; these attempted to emphasize the human form, but that focus was often lost in the abundance of detail.

Art of the Renaissance combined realism with idealism, and anatomical illustration echoed these conventions. Many medical artists of this period presented the human body in dramatic action, representing philosophical and theological ideas about human nature. Artists’ egos tended to get in the way of the objective portrayal of the human body, as the artists claimed to be the discoverers of knowledge regarding the human body.

During the Victorian era, society valued full-figured bodies, and medical illustrators followed suit. Voluptuous women exemplified health and fertility, and robust men signaled wealth and prosperity. As art and medicine continued to evolve, medical artists sought realism in their work, going so far as to produce large, cumbersome medical texts with life-size drawings of human body parts. Gray’s Anatomy, first published in 1858 and illustrated by Henry Vandyke Carter, sought to simplify the presentation of the human body while maintaining that realism. Gray aspired to avoid style, rejecting artistic and societal conventions so that the illustrations would represent the body in its most basic form for use in education. Ironically, this avoidance of style became a style all its own and gave rise to one of the most enduring medical texts, even though most of the original illustrations have been replaced in newer editions of Anatomy.

As the human body came to be viewed as a machine, a subject of science and medicine, slenderness and athleticism became more popular and appealing. Medical artists now depict the human form with mathematically calculated precision and proportions that reflect society’s ideals of youth and symmetry. But the field of medical illustration itself has evolved, from artists paid to draw dissected cadavers to scientists who combine an understanding and appreciation of science with an eye for detail and form, collaborating with physicians to teach and to learn.


Bonafini, B.A., & Pozzilli, P. (2010). Body weight and beauty: the changing face of the ideal female body weight. Obesity Reviews. PMID: 20492540

Kemp, M. (2010). Style and non-style in anatomical illustration: From Renaissance Humanism to Henry Gray. Journal of Anatomy, 216(2), 192-208. PMID: 20447244

Pearce, J.M. (2009). Henry Gray’s Anatomy. Clinical Anatomy, 22(3), 291-295. PMID: 19280653

A Special Thanks – Remembering a Man Who Remembered No One (December 7, 2008)

In a way it is ironic: researchers, psychology students, and cognitive neuroscientists worldwide will remember Mr. Henry G. Molaison, yet he could not remember a single person he met after his brain surgery in 1953.

Henry G. Molaison, known to the world simply as HM, was one of the most widely studied patients in cognitive neuropsychology for over fifty years. His participation in numerous studies contributed significantly to the understanding of brain function and memory. HM passed away on December 2, 2008, in Connecticut.

Molaison suffered from seizures for many years, attributed to a bicycle accident he had at the age of nine. In 1953, at twenty-seven years of age, Molaison underwent what was then cutting-edge surgery at the hands of Dr. William Scoville, who had localized HM’s epilepsy to both medial temporal lobes. During surgery, Scoville removed parts of the affected lobes, including the majority of HM’s hippocampus, parahippocampal gyrus, and amygdala. Post-surgery, HM’s seizures were largely controlled, but he suffered from anterograde amnesia, an inability to form new long-term memories. His working and procedural memory remained intact, however, as did many of his long-term memories from before the surgery.

As an undergraduate in an abnormal psychology course, I studied HM and his contributions to the field (along with the fascinating case of Phineas Gage). I never gave it much thought then, but consider: Molaison was studied for over fifty years! He may not have had the opportunity to live a normal life, but he did selflessly take part in experiments for the rest of his life. Maybe it wasn’t selfless if he couldn’t remember what he was doing there, but I can imagine the exchange between him and his caretakers:

HM, you’re here as a study patient. You can’t form new memories…

Did he have to consent to participation every day?

Thank you, HM. Your participation in various studies single-handedly revolutionized our knowledge of human memory.

George Huntington and the Disease Bearing His Name (September 13, 2008)

George Huntington was the son and grandson of medical practitioners. His description of hereditary chorea gave rise to great interest in the origins of the disease which now bears his name.

At the age of 22, the year after his graduation from medical school at Columbia, George Huntington (1850-1916) made his contribution to medical research, publishing his report on a hereditary form of chorea in the April 13, 1872 issue of The Medical and Surgical Reporter. His publication became one of the classic descriptions of neurological disease.

Huntington treated hereditary chorea as a reminiscence of his childhood spent on the eastern extremity of Long Island (New York), where, as the son and grandson of physicians, he recalled patients from his father’s practice. The hereditary chorea, as he called it, was a rare but terrible disease. Its essential features, tersely noted by Dr. Huntington in three short paragraphs, included a “hereditary nature,” a “tendency toward insanity,” and “its manifestation as a grave disease in adult life.” He also commented on the grotesque nature of the associated abnormal movements and on the lack of knowledge of both the cause and the cure of the disorder.

Huntington noted that the disorder was confined to “a few families, and has been transmitted to them, an heirloom from generations away back in the dim past.” He also noted that in unaffected offspring, “the thread is broken and the grandchildren and great-grandchildren of the original shakers may rest assured that they are free from the disease.” Huntington in his description states that the first symptoms usually occur at an adult age, and he delineates the development of the chorea:

The movements gradually increase when muscles hitherto unaffected take on the spasmodic action, until every muscle in the body becomes affected (excepting the involuntary ones)…

On mental symptoms he writes:

As the disease progresses the mind becomes more or less impaired, in many amounting to insanity, while in others mind and body gradually fail until death relieves them of their suffering.

Huntington’s disease is of tremendous interest to neuropsychiatry because it has a known cause and manifests in changes in behavior, cognition, and affect. It is caused by a mutation producing expanded trinucleotide CAG (polyglutamine) repeats in the huntingtin protein, which is encoded on the short arm of chromosome 4. Neuropsychiatric studies of Huntington’s disease may lead to breakthroughs in understanding the neuropathological correlates of psychiatric disorders.


Neylan, T.C. (2003). Neurodegenerative Disorders: George Huntington’s Description of Hereditary Chorea. Journal of Neuropsychiatry and Clinical Neurosciences, 15(1), 108. DOI: 10.1176/appi.neuropsych.15.1.108

Aloysius “Alois” Alzheimer (November 4, 2007)

On November 3, 1906, a German physician from the Royal Psychiatric Clinic at Munich University described a case of dementia and altered behavior in Frau Auguste Deter, who had died seven months earlier. Although dementia was a commonly diagnosed condition at the time, the report was unique because, for the first time, the clinical symptoms and the pathological features seen under the microscope (“neurofibrillary tangles” and “plaques”) were described together. Alois Alzheimer had previously worked at Frankfurt am Main as a colleague of Franz Nissl, who described a popular method of staining brain sections that made brain cells visible under the microscope, a method in use even today. This enabled Alzheimer to study in detail the features of Frau Auguste’s brain, which was sent to his lab in Munich, where he worked alongside Dr. Kraepelin, well known for his classification of psychiatric disorders. In the eighth edition of his textbook Psychiatrie, Kraepelin named “Alzheimer’s disease” as a distinct subcategory of senile dementia.

Alois, whose name marks one of the commonest eponymous conditions of our times (Alzheimer’s disease affects nearly 15 million people worldwide), was born on June 14, 1864, in Marktbreit, Bavaria, the son of a notary public in the family’s hometown. After qualifying from Würzburg University in 1887, he gradually developed an interest in neuropathology, the study of brain diseases under the microscope, while working as a psychiatrist. In 1901, while working at the Frankfurt Asylum, Dr. Alzheimer first met 51-year-old Auguste Deter, who presented with strange behavioral symptoms and remarkably poor short-term memory. Over the coming years he developed an obsession with her case, culminating in his detailed examination of her brain after her death; the microscopic slides were rediscovered and reported on in 1997.

Alzheimer was appointed Professor of Psychiatry at Breslau in 1912, but his tenure was short-lived: he fell ill on the train to Breslau and died in 1915, at the age of 51, from complications of a streptococcal throat infection that led to rheumatic fever, kidney failure, and heart failure.

Arnold Pick’s Disease (June 16, 2006)

Pick’s disease is a rare and fatal degenerative disease of the nervous system. Clinically, it overlaps substantially with Alzheimer’s presenile dementia.

Arnold Pick was born to German-Jewish parents in the village of Velke Mezirici (Gross-Meseritsch) in Moravia. He studied medicine in Vienna, where as a student he was assistant to the neurologist Theodor Hermann Meynert (1833-1892). He obtained his doctorate in 1875 and subsequently was assistant to Carl Friedrich Otto Westphal (1833-1890) in Berlin, at the same time as Karl Wernicke (1848-1905) worked in that unit. All three influenced Pick’s work on aphasia. In late 1875, Pick left Berlin for the position of second physician at the Grossherzogliche Oldenburgische Irrenheilanstalt in Wehnen. This institution later played a disreputable part in the German politics of euthanasia, which began in the 1920s and culminated in the mass murder and sterilisation of the “racially inferior” and those deemed to be living “unworthy lives.”

Pick undertook extensive pathological studies of patients with neuropsychiatric diseases, and his work on the cortical localization of speech disturbances and other functions of the brain won him international acclaim. In addition to more than 350 publications, many of them on apraxia and agrammatism, Pick wrote a textbook on the pathology of the nervous system.

Pick’s ability to record the history of a psychotic or even mute patient was legendary. His secretary was a manic-depressive and an inmate of the asylum in which he worked.

Pick collected an enormous library, which gave him great pleasure; at his home the books reached to the ceiling and were piled on the floor. When he left on vacation, some volumes of Johann Wolfgang von Goethe and Thomas Carlyle went into the large case of medical books. He was also a great music lover.

Arnold Pick died of septicaemia in 1924, at 73 years of age, following an operation for a bladder stone.

Article excerpted from whonamedit.com.
