Quacks & Hacks: Walter Freeman and the Lobotomobile

 


On 12 November 1935, a Portuguese neurologist named António Egas Moniz [below right] oversaw the first of the operations that would later be known as lobotomies. Moniz’s work built upon that of the 19th-century Swiss psychiatrist Gottlieb Burckhardt, who performed a series of operations in 1888 in which he removed sections of the cerebral cortex from six patients under his care at the Préfargier Asylum. Moniz’s early experiments involved drilling holes into patients’ skulls and injecting alcohol into the frontal lobes to destroy nerve fibres, and later coring out regions of the brain with hollow needles.

Moniz’s lobotomy quickly became a popular treatment for various mental conditions, putting an end to the therapeutic nihilism that had dominated the psychiatric profession in the Victorian era. Suddenly, doctors believed they could “cure” patients whom they had previously deemed beyond help. The procedure became so esteemed that in 1949 Moniz was awarded a Nobel Prize for his role in developing it.

During this time, Moniz’s procedure was adopted (and adapted) by the American neuropsychiatrist Walter Freeman, who performed the first lobotomy in the United States in 1936. Freeman won acclaim for his technique, and people all over the country began lining up for lobotomies, including Rosemary Kennedy [below], sister to the man who would later become President of the United States. Rosemary was described by members of her family as a rebellious child, prone to violent mood swings as she grew up. In November 1941, Rosemary’s father took her to see Freeman, who diagnosed the 23-year-old with “agitated depression” and suggested she undergo a lobotomy to correct her erratic behavior. [Interestingly, 80 percent of the lobotomies performed in the US in those early years were carried out on women.]

Freeman performed the operation right then and there on Rosemary, without her mother’s knowledge. Shortly afterwards, it became clear that something had gone terribly wrong. Rosemary could no longer speak, and her mental capacity was reduced to that of a toddler. Her father institutionalized her, telling people that his daughter was mentally retarded rather than admitting that her condition was the result of a failed brain operation. It was only after his death decades later that the truth behind her condition was revealed. Rosemary never recovered her ability to speak coherently, and remained in care until her death in 2005 at the age of 86. She was the first of her siblings to die of natural causes.

The incident did little to damage Freeman’s reputation, and he soon began looking for a more efficient way to perform the operation without drilling directly into the skull. The result was the transorbital lobotomy, in which a pick-like instrument was inserted into the eye socket above the eyeball and driven through the thin bone that separates the socket from the frontal lobes. This procedure—which later became known as the “ice-pick” lobotomy—could be performed in under ten minutes without anesthetic.


 

Freeman took to the roads with his ice-pick and hammer, touring hospitals and mental institutions around the country. He performed ice-pick lobotomies for all kinds of conditions, including headaches. Eventually, he began performing the operation in his van—which later became known as “the lobotomobile.” At one point, he undertook 25 lobotomies in a single day. He even performed them on children as young as 4 years old. Years later, one of them spoke of the frightful incident: “I’ve always felt different—wondered if something’s missing from my soul. I have no memory of the operation, and never had the courage to ask my family about it.”

Over the course of more than three decades, Freeman performed nearly 3,500 lobotomies despite the fact that he had no surgical training. Many of his patients had to relearn how to eat and use the bathroom. Some never recovered. And, of course, there were fatalities. In 1951, one of his patients died when Freeman suddenly stopped to pose for a photo during the procedure. The surgical instrument slipped and went too far into the patient’s brain. Many others fell victim to a similar fate at the good doctor’s hands.

The lobotomy eventually came under attack from the medical community, and by the 1970s several countries had banned the procedure altogether. Freeman retired the lobotomobile and opened a private practice in California. Contrary to popular belief, he never lost his license to practice medicine.

Today, surgical lobotomies are no longer performed; the rise of drugs like Thorazine made it easier to lobotomize patients chemically. In recent years, there have been calls for the Nobel Foundation to rescind the prize Moniz received for developing the lobotomy, a procedure often labeled one of the most barbaric mistakes of modern medicine.

 

Special thanks to Paul Koudounaris for bringing this fascinating subject to light for me when I was in Los Angeles this past April.

If you’re interested in the history of surgery, you can now pre-order my book, The Butchering Art. All pre-orders count towards first-week sales once the book is released, and therefore give me a greater chance of securing a place on bestseller lists in October. I would be hugely grateful for your support. If you’re in the US, click HERE. If you’re in the UK, click HERE. Info on further foreign editions to come.

 

 

Painful Operations: Removing Bladder Stones before Anesthesia


If you visit the Gordon Museum at Guy’s Hospital in London, you’ll see a small bladder stone—no bigger than 3 centimetres across. Apart from having been sliced open to reveal the concentric circles within, it is entirely unremarkable in appearance. Yet this tiny stone was the source of enormous pain for 53-year-old Stephen Pollard, who agreed to undergo surgery to remove it in 1828.

People frequently suffered from bladder stones in earlier periods due to poor diet, which often consisted of lots of meat and alcohol and very few vegetables. The oldest bladder stone on record was discovered in an Egyptian grave dating to around 4800 B.C. The problem was so common that itinerant healers traveled from village to village offering a vast array of services and potions that promised to cure the condition. Depending on their size, the stones could block the flow of urine from the kidneys into the bladder, or prevent urine from leaving the bladder through the urethra. Either situation was potentially lethal. In the first instance, the kidney is slowly destroyed by the pressure of backed-up urine; in the second, the bladder swells and eventually bursts, leading to infection and finally death.

Bladder stones were as unimaginably painful for sufferers in the past as they are today. The stones themselves were often enormous; some measured as large as a tennis ball. The afflicted often acted in desperation, going to great lengths to rid themselves of the agony. In the early 18th century, one man reportedly drove a nail through his penis and then used a blacksmith’s hammer to break the stone apart until the pieces were small enough to pass through his urethra. It’s no surprise, then, that many sufferers chose to submit to the surgeon’s knife despite a very real risk of dying during or immediately after the procedure from shock or infection. Although the operation itself lasted only a matter of minutes, lithotomy was incredibly painful and dangerous—not to mention humiliating.

The patient—naked from the waist down—was bound in such a way as to ensure an unobstructed view of his genitals and anus [see illustration below]. The surgeon then passed a curved metal tube up the patient’s penis and into the bladder. He slid a finger into the man’s rectum, feeling for the stone. Once he had located it, his assistant removed the metal tube and replaced it with a wooden staff. This staff acted as a guide so that the surgeon did not fatally rupture the patient’s rectum or intestines as he began cutting deeper into the bladder. Once the staff was in place, the surgeon cut diagonally through the fibrous muscle of the perineum until he reached the wooden staff. Next, he used a probe to widen the hole, ripping open the prostate gland in the process. At this point, the wooden staff was removed and the surgeon used forceps to extract the stone from the bladder. [1]

Illustration: Lithotomy scene

Unfortunately for Stephen Pollard, what should have lasted 5 minutes ended up lasting 55 under the gaze of 200 spectators at Guy’s Hospital in London. The surgeon Bransby Cooper fumbled and panicked, cursing the patient loudly for having “a very deep perineum,” while the patient, in turn, cried: “Oh! let it go; pray, let it keep in!” The surgeon reportedly used every tool at his disposal before he finally reached into the gaping wound with his bare fingers. During this time, several of the spectators walked out of the operating theater, unable to bear witness to the patient’s agony any longer. Eventually, Cooper extracted the stone with a pair of forceps. He held it up for his audience, who clapped unenthusiastically at the sight.

Sadly, Pollard survived the surgery only to die the next day. His autopsy revealed that it was the incompetence of his surgeon, and not his alleged “abnormal anatomy,” that had caused his death.

But the story didn’t end there. Word quickly got out about the botched operation. When Thomas Wakley [left]—the editor of The Lancet—heard of this medical disaster, he accused Cooper of incompetence and implied that the surgeon had only been appointed to Guy’s Hospital because he was nephew to one of the senior surgeons on staff. Outraged by the allegation, Cooper sued Wakley for libel and sought £2,000 in damages. Wakley used the trial to attack what he believed to be corruption within the hospitals, driven by rampant nepotism. The jury reluctantly sided with the surgeon, but awarded him only £100. Wakley had raised more than that through a defence fund campaign, and after the trial he gave the surplus to Pollard’s widow. [2]

Bransby Cooper’s reputation, like his patient, never did recover.

If you’re interested in the history of pre-anesthetic and pre-antiseptic surgery, you can pre-order my book The Butchering Art in the US (click here) and in the UK (click here). Information on foreign editions to come!


1. Druin Burch, Digging up the Dead: Uncovering the Life and Times of an Extraordinary Surgeon (2007), p. 26. I am greatly indebted to his work for bringing this story to my attention.
2. Thomas Wakley, A Report of the Trial of Cooper v. Wakley (1829), pp. 4-5.

The Wandering Womb: Female Hysteria through the Ages


The word “hysteria” conjures up an array of images, none of which probably include a nomadic uterus wandering aimlessly around the female body. Yet that is precisely what medical practitioners in the past believed was the cause behind this mysterious disorder. The very word “hysteria” comes from the Greek word hystera, meaning “womb,” and arises from medical misunderstandings of basic female anatomy.

Today, hysteria is regarded as a physical expression of a mental conflict and can affect anyone regardless of age or gender. [1] Centuries ago, however, it was attributed only to women, and believed to be physiological (not psychological) in nature.

For instance, Plato believed that the womb—especially one which was barren—could become vexed and begin wandering throughout the body, blocking respiratory channels and causing bizarre behavior. [2] This belief was ubiquitous in ancient Greece. The physician Aretaeus of Cappadocia went so far as to consider the womb “an animal within an animal,” an organ that “moved of itself hither and thither in the flanks.” [3] The uterus could move upwards, downwards, left or right. It could even collide with the liver or spleen. Depending on its direction, a wandering womb could cause all kinds of hell. One that traveled upwards might cause sluggishness, lack of strength, and vertigo in a patient, while a womb that moved downwards could cause a person to feel as if she were choking. So worrisome was the prospect of a wandering womb during this period that some women wore amulets to protect themselves against it. [4]

The womb continued to hold a mystical place in medical texts for centuries, and was often used to explain away an array of female complaints. The 17th-century physician William Harvey, famed for his work on the circulation of the blood, perpetuated the belief that women were slaves to their own biology. He described the uterus as “insatiable, ferocious, animal-like,” and drew parallels between “bitches in heat and hysterical women.” [5] When a woman named Mary Glover accused her neighbor Elizabeth Jackson of cursing her in 1602, the physician Edward Jorden argued that the erratic behavior that drove Mary to make such an accusation was actually caused by noxious vapors in her womb, which he believed were slowly suffocating her. (The court disagreed, and Elizabeth Jackson was convicted of witchcraft.)

So what could be done for hysteria in the past?

Physicians prescribed all kinds of treatments for a wayward womb. These included sweet-smelling vaginal suppositories and fumigations used to tempt the uterus back to its rightful place. The Greek physician Aretaeus wrote that the womb “delights…in fragrant smells and advances towards them; and it has an aversion to foetid smells, and flees from them.” Women were also advised to ingest disgusting substances—sometimes containing repulsive ingredients such as human or animal excrement—in order to coax the womb away from the lungs and heart. In some cases, physical force was used to correct the position of a wandering womb (see image, right). For the single woman suffering from hysteria, the cure was simple: marriage, followed by children. Lots and lots of children.

Today, wombs are no longer thought to wander; however, medicine still tends to pathologize the vagaries of the female reproductive system. [6] Over the course of several thousand years, the womb has become less a way to explain physical ailments and more a way to explain psychological dysfunction—often being cited as the reason behind irrationality and mood swings in women. Has the ever-elusive hysteria brought on by roving uteri simply been replaced by the equally intangible yet mysterious PMS? I’ll let you decide.


You can now pre-order my book THE BUTCHERING ART by clicking here. THE BUTCHERING ART follows the story of Joseph Lister as he attempts to revolutionize the brutal world of Victorian surgery through antisepsis. Pre-orders are incredibly helpful to new authors. Info on how to order foreign editions coming soon. Your support is greatly appreciated. 

 

 

1. Mark J Adair, “Plato’s View of the ‘Wandering Uterus,’” The Classical Journal 91:2 (1996), p. 153.
2. G. S. Rousseau, “‘A Strange Pathology:’ Hysteria in the Early Modern World, 1500-1800” in Hysteria Beyond Freud (1993), p.104. Originally qtd in Heather Meek, “Of Wandering Wombs and Wrongs of Women: Evolving Concepts of Hysteria in the Age of Reason,” English Studies in Canada 35:2-3 (June/September 2009), p.109.
3. Quoted in Matt Simon, “Fantastically Wrong: The Theory of the Wandering Wombs that Drove Women to Madness,” Wired (7 May 2014).
4. Robert K. Ritner, “A Uterine Amulet in the Oriental Institute Collection,” Journal of Near Eastern Studies 45:3 (Jul. 1984), pp.209-221. For more on the fascinating subject of magical amulets, see Tom Blaen, Medical Jewels, Magical Gems: Precious Stones in Early Modern Britain (2012).
5. Rousseau, “A Strange Pathology,” p. 132.
6. Mary Lefkowitz, “Medical Notes: The Wandering Womb,” The New Yorker (26 February 1996).

Houses of Death: Walking the Wards of a Victorian Hospital

The following blog post relates to my forthcoming book THE BUTCHERING ART, which you can pre-order here.

Today, we think of the hospital as an exemplar of sanitation. However, during the first half of the nineteenth century, hospitals were anything but hygienic. They were breeding grounds for infection and provided only the most primitive facilities for the sick and dying, many of whom were housed on wards with little ventilation or access to clean water. As a result of this squalor, hospitals became known as “Houses of Death.”

The best that can be said about Victorian hospitals is that they were a slight improvement over their Georgian predecessors. That’s hardly a ringing endorsement when one considers that a hospital’s “Chief Bug-Catcher”—whose job it was to rid the mattresses of lice—was paid more than its surgeons in the eighteenth century. In fact, bed bugs were so common that the “Bug Destroyer” Andrew Cooke [see image, left] claimed to have cleared upwards of 20,000 beds of insects during the course of his career.[1]

In spite of token efforts to make them cleaner, most hospitals remained overcrowded, grimy, and poorly managed. The assistant surgeon at St. Thomas’s Hospital in London was expected to examine over 200 patients in a single day. The sick often languished in filth for long periods before they received medical attention, because most hospitals were disastrously understaffed. In 1825, visitors to St. George’s Hospital discovered mushrooms and wriggling maggots thriving in the damp, soiled sheets of a patient with a compound fracture. The afflicted man, believing this to be the norm, had not complained about the conditions, nor had any of his fellow convalescents thought the squalor especially noteworthy.[2]

Worst of all was the fact that a sickening odor permeated every hospital ward. The air was thick with the stench of piss, shit, and vomit. The smell was so offensive that the staff sometimes walked around with handkerchiefs pressed to their noses. Doctors didn’t exactly smell like rose beds, either. Berkeley Moynihan—one of the first surgeons in England to use rubber gloves—recalled how he and his colleagues used to throw off their own jackets when entering the operating theater and don ancient frocks that were often stiff with dried blood and pus. They had belonged to retired members of staff and were worn as badges of honor by their proud successors, as were many items of surgical clothing.

The operating theaters within these hospitals were just as dirty as the surgeons working in them. In the early decades of the nineteenth century, it was safer to have surgery at home than it was in a hospital, where mortality rates were three to five times higher than they were in domestic settings. Those who went under the knife did so as a last resort, and so were usually mortally ill. Very few surgical patients recovered without incident. Many either died or fought their way back to only partial health. Those unlucky enough to find themselves hospitalized during this period would frequently fall prey to a host of infections, most of which were fatal in a pre-antibiotic era.

In addition to the foul smells, fear permeated the atmosphere of the Victorian hospital. The surgeon John Bell wrote that it was easy to imagine the mental anguish of the hospital patient awaiting surgery. He would hear regularly “the cries of those under operation which he is preparing to undergo,” and see his “fellow-sufferer conveyed to that scene of trial,” only to be “carried back in solemnity and silence to his bed.” Lastly, he was subjected to the sound of their dying groans as they suffered the final throes of what was almost certainly their end.[3]

As horrible as these hospitals were, it was not easy gaining entry to one. Throughout the nineteenth century, almost all the hospitals in London except the Royal Free controlled inpatient admission through a system of ticketing. One could obtain a ticket from one of the hospital’s “subscribers,” who had paid an annual fee in exchange for the right to recommend patients to the hospital and vote in elections of medical staff. Securing a ticket required tireless soliciting on the part of potential patients, who might spend days waiting and calling on the servants of subscribers and begging their way into the hospital. Some hospitals only admitted patients who brought with them money to cover their almost inevitable burial. Others, like St. Thomas’ in London, charged double if the person in question was deemed “foul” by the admissions officer.[4]


Before germs and antisepsis were fully understood, remedies for hospital squalor were hard to come by. The obstetrician James Y. Simpson suggested an almost fatalistic approach to the problem. If cross-contamination could not be controlled, he argued, then hospitals should be periodically destroyed and built anew. Another surgeon voiced a similar view. “Once a hospital has become incurably pyemia-stricken, it is as impossible to disinfect it by any known hygienic means, as it would be to disinfect an old cheese of the maggots which have been generated in it,” he wrote. There was only one solution: the wholesale “demolition of the infected fabric.”[5]

It wasn’t until a young surgeon named Joseph Lister developed the concept of antisepsis in the 1860s that hospitals became places of healing rather than places of death.

To read more about 19th-century hospitals and Joseph Lister’s antiseptic revolution, pre-order my book THE BUTCHERING ART by clicking here. Pre-orders are incredibly helpful to new authors. Info on how to order foreign editions coming soon. Your support is greatly appreciated.

 

1. Adrian Teal, The Gin Lane Gazette (London: Unbound, 2014).
2. F. B. Smith, The People’s Health 1830-1910 (London: Croom Helm, 1979), 262.
3. John Bell, The Principles of Surgery, Vol. III (1808), 293.
4. Elisabeth Bennion, Antique Medical Instruments (Berkeley: University of California Press, 1979), 13.
5. John Eric Erichsen, On Hospitalism and the Causes of Death after Operations (London: Longmans, Green, and Co., 1874), 98.

Syphilis: A Little Valentine’s Day Love Story


Photo Credit: The Royal College of Surgeons of England 

We don’t know much about her. We don’t even know her name. What we do know is that the woman who wore the above prosthetic in the mid-19th century was suffering from a severe case of syphilis.

Before the discovery of penicillin in 1928, syphilis was an incurable disease. Its symptoms were as terrifying as they were unrelenting. Those who suffered from it long enough could expect to develop unsightly skin ulcers, paralysis, gradual blindness, dementia and “saddle nose,” a grotesque deformity which occurs when the bridge of the nose caves into the face.

This deformity was so common amongst those suffering from the pox (as syphilis was sometimes called) that “no nose clubs” sprang up in London. On 18 February 1874, the Star reported: “Miss Sanborn tells us that an eccentric gentleman, having taken a fancy to see a large party of noseless persons, invited every one thus afflicted, whom he met in the streets, to dine on a certain day at a tavern, where he formed them into a brotherhood.”[1] The man, who assumed the name Mr. Crampton for these clandestine parties, entertained his “noseless” friends every month until he died a year later, at which time the group “unhappily dissolved.”[2]

The 19th century was particularly rife with syphilis. Because of its prevalence, both physicians and surgeons treated victims of the disease. Many treatments involved the use of mercury, giving rise to the saying: “One night with Venus, a lifetime with Mercury.” Mercury could be administered in the form of calomel (mercurous chloride), an ointment, a steam bath or a pill. Unfortunately, the side effects could be as painful and terrifying as the disease itself. Many patients who underwent mercury treatments suffered extensive tooth loss, ulcerations and neurological damage. In many cases, people died from mercury poisoning itself.

For those determined to avoid the pox altogether, condoms made from animal membrane and secured with a silk ribbon were available [below], but these were outlandishly expensive. Moreover, many men shunned them for being uncomfortable and cumbersome. In 1717, the surgeon Daniel Turner wrote:

The Condum being the best, if not only Preservative our Libertines have found out at present; and yet by reason of its blunting the Sensation, I have heard some of them acknowledge, that they had often chose to risk a Clap, rather than engage cum Hastis sic clypeatis [with spears thus sheathed].[3]

Everyone blamed each other for the burdensome condom. The French called it “la capote anglaise” (the English cape), while the English called it the “French letter.” Even more unpleasant was the fact that once one procured a condom, he was expected to use it repeatedly. Unsurprisingly, syphilis continued to rage despite the growing availability of condoms during the Victorian period.

Which brings me back to the owner of the prosthetic nose. Eventually, she lost her teeth and palate after prolonged exposure to mercury treatments. Her husband—who may have been the source of her suffering—finally died from the disease, leaving her a widow. But it wasn’t all doom and gloom for the poor, unfortunate Mrs X.

According to records at the Royal College of Surgeons in London, the woman found another suitor despite her deformities. After the wedding, she sought out the physician James Merryweather and sold the contraption to him for £3. The reason? Her new husband liked her just the way she was – no nose and all!

And that, kind readers, is a true Valentine’s Day love story…Ignore the part where she most certainly transmitted the disease to her new lover.

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. “Origin of the No Nose Club,” Star, Issue 1861 (18 February 1874), p. 3.
2. Ibid.
3. Daniel Turner, Syphilis: A Practical Treatise on the Venereal Disease (1717), p. 74.

The Medicalization of Death in History


When the Black Death swept through Europe in the 14th century, it claimed the lives of over 75 million people, many of whom were clergymen whose job it was to help usher the dying into the next world. In response to the shortage of priests, the Ars Moriendi (Art of Dying) first emerged in 1415. This was a manual that provided practical guidance to the dying and those who attended them in their final moments. It included prayers and prescribed rites to be performed at the deathbed, as well as illustrations and descriptions of the temptations one must overcome in order to achieve a “good death.”

From the medieval period onwards, the dying were expected to follow a set of “rules” when facing the final moments of their lives, which included repentance of sins, forgiving enemies, and accepting one’s fate stoically without complaint.  It was each person’s duty to die a righteous death.

In earlier periods, many people believed that pain was a necessary component of a good death. Indeed, the word patient comes from the Latin word patiens, meaning “long-suffering” or “one who suffers.” Evangelical Christians, in particular, feared losing lucidity as death approached, as this would prohibit the person from begging forgiveness for past sins and putting his or her worldly affairs in order before departing this life. Death was a public event, with those closest to the dying in attendance. Friends and relatives were not merely passive observers. They often assisted the dying person in his or her final hours, offering up prayers as the moment drew closer. The deathbed presented the dying with the final opportunity for eternal salvation.

For this reason, the physician rarely appeared at the bedside of a dying person because pain management wasn’t required. Moreover, the general consensus was that it was inappropriate for a person to profit from another’s death. Caricatures depicting the greedy physician running off with bags of money after his patient had succumbed to his fate were not uncommon in the 18th and early 19th centuries.

Over time, however, religious sentiments faded, and physicians began to appear more regularly in the homes of the dying. Part of this was driven by the fact that doctors became more effective at pain management. At the start of the 19th century, they typically administered laudanum drops orally to patients. This process was imprecise, and sometimes not effective at all. All this changed in 1853 when Charles Pravaz and Alexander Wood developed a medical hypodermic syringe with a needle fine enough to pierce the skin (see example below from c.1880). From that point onwards, physicians could inject morphine directly, which was far more effective (though also more dangerous). As new techniques emerged, people’s attitudes towards pain management in treating the dying began to change. Soon, a painless death was seen not only as acceptable, but as vital to achieving a “good death.”


The doctor’s place at the bedside of the dying became increasingly commonplace. Whereas the deathbed had formerly been governed by religious tradition, it now became the purview of the medical community. By the end of the 19th century, the physician’s emphasis had shifted from cure to care, making his place in the “death chamber” more acceptable. Now, the good physician was one who stood by his patient, even when nothing could be done to save his or her life. As the legal scholar Shai Lavi succinctly put it: “The ethics of the deathbed shifted from religion to medicine, and dying further emerged as a matter of regulating life: life was now understood in its biological, rather than biographical, sense.”

The medicalization of death had arrived, and for better or worse it continues to shape how we die today.

If you enjoy my blog, please consider supporting my content by clicking HERE.

Suggested Reading:

  1. Shai Lavi, “Euthanasia and the Changing Ethics of the Deathbed: A Study in Historical Jurisprudence,” Theoretical Inquiries in Law, 4.2 (2003): pp. 729 – 761.

“Limbs Not Yet Rigid” – A History of Dissecting the Living

Image: “The Dead Alive!” by H. Wigstead, 1784

Several years ago, the news reported a story that could have come straight from the script of a horror movie. In October 2009, Colleen S. Burns was admitted to St Joseph’s Hospital Center in New York for a drug overdose. A short time later, a team of doctors pronounced the 39-year-old woman dead. Her family was notified and Burns’s body was prepped for organ donation.

The only problem was: Burns wasn’t actually dead.

She was in a drug-induced coma. Fortunately for her, she woke minutes before the first incision was made. Happily, occurrences such as this are few and far between these days. Yet in the past, incidents of premature dissection were not uncommon.

In 1746, Jacques-Bénigne Winslow wrote: “Tho’ Death, at some Time or other, is the necessary and unavoidable Portion of Human Nature in its present Condition, yet it is not always certain, that Persons taken for dead are really and irretrievably deprived of Life.” Indeed, the Danish anatomist went on to claim that it was “evident from Experience” that those thought to be dead have proven otherwise “by rising from their Shrowds [sic], their Coffins, and even from their Graves.” [1]

Fears over premature burial were ubiquitous during this period, so much so that people created “life preserving coffins” fitted with bells and breathing tubes. But even worse than being buried alive was the thought of being dissected alive. The threat was real, and it happened often enough to be remarked upon in the literature of the day.

The 17th and 18th centuries were rife with stories about executed criminals who had “returned from the dead” just moments before being dissected. In 1651, Anne Green was hanged in Oxford for infanticide. For thirty minutes, she dangled at the end of the noose while her friends thumped her chest and put “their weight upon her leggs [sic]…lifting her up and then pulling her downe againe with a suddain jerke” in order to quicken her death. Afterwards, her body was cut down from the gallows and brought to Drs Thomas Willis and William Petty to be dissected. Just seconds before Willis plunged the knife into her sternum, Anne miraculously awoke. [2]

The 19th century had its fair share of incidents too. The physician and surgeon Sir Robert Christison complained that dissection at St Bartholomew’s Hospital in London was “apt to be performed with indecent, sometimes with dangerous haste” during this period. He remembered:

…an occasion when Cullen commenced the dissection of a man who died one hour before, and when fluid blood gushed in abundance from the first incision through the skin…Instantly I seized his wrist in great alarm, and arrested his progress; nor was I easily persuaded to let him go on, when I saw the blood coagulate on the table exactly like living blood.

He further remarked: “It was no uncommon occurrence that, when the operator proceeded with his work, the body was sensibly warm, the limbs not yet rigid, the blood in the great vessels fluid and coagulable [sic].” [3]

Image: An aged anatomist selecting his dissection instrument

The problem wasn’t confined to Britain alone. The French physician Pierre Charles Alexandre Louis reported the story of a patient who had been placed in his dissection room at the Pitié-Salpêtrière Hospital in Paris. The next morning, the doctor’s colleagues informed him that they had heard moans in the locked theater overnight. When Louis went to investigate, he found “to his horror that the supposed corpse had revived during the night, and had actually died in the struggle to disengage herself from the winding sheet in which she was enveloped.” [4]

It was largely because of reports like this that anatomists themselves worried about the precise moment of death when cutting open bodies. To avoid disaster, good old Winslow suggested that a person’s gums be rubbed with caustic substances, and that the body be “stimulate[d]…with Whips and Nettles” before being dissected. Furthermore, the anatomist should “irritate his Intestines by Means of Clysters and Injections of Air or Smoke” as well as “agitate… the Limbs by violent Extensions and Inflexions.” If possible, an attempt should also be made to “shock [the person’s] Ears by hideous Shrieks and excessive Noises.” [5]

To our modern sensibilities, these measures may seem extreme, even comical, but to Winslow, this was no laughing matter. In fact, he went even further, recommending that the palms of the hands and the soles of the feet be pricked with needles, and that the “Scapulae, Shoulders and Arms” be scarified using fire or sharp instruments so as to “lacerate and strip [them] of the epidermis.” Indeed, when reading Winslow’s work, one gets the distinct feeling that he took pleasure in imagining new ways to torture the dead.

Today, new debates have arisen over the very definition of death itself with the emergence of “beating heart cadavers.” Though considered dead in both a medical and legal capacity, these “cadavers” are kept on ventilators for organ and tissue transplantation. Their hearts beat; they expel waste; they have the ability to heal themselves of infection; they can even carry a fetus to term. Crucially, though, their brains are no longer functioning. It is in this way that the medical community has redefined death in the 21st century.

Yet some wonder whether these “beating heart cadavers” are really dead, or whether they are just straddling the great divide between life and death before the lights finally go out. Or, worse, whether they have been misdiagnosed, as in the case of Colleen Burns.

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. Jacques-Bénigne Winslow, The Uncertainty of the Signs of Death, and the Danger of Precipitate Interments and Dissections (1746), pp. 1-2.
2. Anon., A declaration from Oxford, of Anne Green a young woman that was lately, and unjustly hanged in the Castle-yard; but since recovered (London, 1651), p. 2.
3. R. Christison, The Life of Sir Robert Christison (1885-6), pp. 192-3. Originally quoted in Ruth Richardson, Death, Dissection and the Destitute (2000), p. 98.
4. Ibid.
5. Winslow, The Uncertainty of the Signs of Death, p. 2.