The Medicalization of Death in History


When the Black Death swept through Europe in the 14th century, it claimed tens of millions of lives, many of them clergymen whose job it was to help usher the dying into the next world. In response to the resulting shortage of priests, the Ars Moriendi (Art of Dying) first emerged around 1415. This manual provided practical guidance to the dying and to those who attended them in their final moments. It included prayers and prescribed rites to be performed at the deathbed, as well as illustrations and descriptions of the temptations one had to overcome in order to achieve a “good death.”

From the medieval period onwards, the dying were expected to follow a set of “rules” when facing the final moments of their lives: repenting of their sins, forgiving their enemies, and accepting their fate stoically, without complaint. It was each person’s duty to die a righteous death.

In earlier periods, many people believed that pain was a necessary component of a good death. Indeed, the word patient comes from the Latin patiens, meaning “long-suffering” or “one who suffers.” Evangelical Christians, in particular, feared losing lucidity as death approached, as this would prevent the dying person from begging forgiveness for past sins and putting his or her worldly affairs in order before departing this life. Death was a public event, with those closest to the dying in attendance. Friends and relatives were not merely passive observers; they often assisted the dying person in his or her final hours, offering up prayers as the moment drew closer. The deathbed presented the dying with a final opportunity for eternal salvation.

For this reason, the physician rarely appeared at the bedside of a dying person because pain management wasn’t required. Moreover, the general consensus was that it was inappropriate for a person to profit from another’s death. Caricatures depicting the greedy physician running off with bags of money after his patient had succumbed to his fate were not uncommon in the 18th and early 19th centuries.

Over time, however, religious sentiments faded, and physicians began to appear more regularly in the homes of the dying. Part of this was driven by the fact that doctors became more effective at pain management. At the start of the 19th century, they typically administered laudanum drops orally to patients, a process that was imprecise and sometimes not effective at all. All this changed in 1853, when Charles Pravaz and Alexander Wood developed a medical hypodermic syringe with a needle fine enough to pierce the skin (see the example below, c. 1880). From that point onwards, physicians began administering morphine by injection, which was far more effective (though also more dangerous). As new techniques emerged, attitudes towards pain management in treating the dying began to change. Soon, a painless death was seen not only as acceptable, but as vital to achieving a “good death.”


The doctor’s place at the bedside of the dying became increasingly commonplace. Whereas the deathbed had formerly been governed by religious tradition, it now became the purview of the medical community. By the end of the 19th century, the physician’s emphasis had shifted from cure to care, making his place in the “death chamber” more acceptable. Now, the good physician was one who stood by his patient even when nothing could be done to save his or her life. As the legal scholar Shai Lavi succinctly put it: “The ethics of the deathbed shifted from religion to medicine, and dying further emerged as a matter of regulating life: life was now understood in its biological, rather than biographical, sense.”

The medicalization of death had arrived, and it continues, for better or worse, to shape how we die today.


Suggested Reading:

  1. Shai Lavi, “Euthanasia and the Changing Ethics of the Deathbed: A Study in Historical Jurisprudence,” Theoretical Inquiries in Law 4.2 (2003): pp. 729–761.

The Cutter’s Art: A Brief History of Bloodletting


An ill man who is being bled by his doctor. Coloured etching.

When King Charles II suffered a sudden seizure on the morning of 2 February 1685, his personal physician had just the remedy. He quickly slashed open a vein in the king’s left arm and filled a basin with the royal blood. Over the next few days, the king was tortured by a swarm of physicians buzzing around his bedside. They gave enemas and urged him to drink various potions, including boiled spirits from a human skull. The monarch was bled a second time before he lapsed into a coma. He never awoke.

Even without his doctors’ ministrations, the king may well have succumbed to whatever ailed him, yet his final days were certainly not made any easier by the relentless bloodletting and purging. By the time of Charles II’s death, bloodletting was standard medical practice.

The theory behind bloodletting owes much to the Roman physician Galen, who lived in the 2nd century AD. Galen taught that blood was the product of food: after reaching the stomach, food was liquefied and then sent to the liver, where it was turned into blood. Occasionally, the body produced an excess of blood, which, according to Galenic practitioners, caused fevers, headaches, and even seizures. The only recourse was to rid the body of this superfluous fluid.

As vital as bloodletting was felt to be, many physicians believed the “cutter’s art” was beneath their station. Instead, they referred those in need of bleeding to barber-surgeons, who carried out this duty in addition to a diverse range of other personal services.

The traditional striped barber’s pole harks back to that era, when it served as an advertisement for the barber-surgeon’s proficiency as a bloodletter. The pole represents the rod that the patient gripped to make the veins bulge, and the brass ball at the top symbolizes the basin used to collect the blood. The red and white stripes represent the bloodied bandages, which, once washed and hung to dry on the rod outside the shop, would twist in the wind, forming the familiar spiral pattern adorning modern poles.

While bloodletting seems barbaric to modern eyes, it was long considered a standard part of medical treatment, demanded by many people when they felt ill, much as we might ask for antibiotics at the doctor’s office today. Take George Washington (below), who woke on the morning of 14 December 1799 complaining that he couldn’t breathe. Fearing his doctor would not arrive in time, Washington asked the overseer of his slaves to step in and bleed him. The cut was deep, and Washington lost nearly half a pint of blood before the wound was closed. Eventually, the physicians arrived and proceeded to bleed Washington four more times over the next eight hours. By evening, America’s first president was dead. One of his physicians, James Craik, later admitted that he thought the blood loss was partly responsible.


Bloodletting reached its apogee in the early 19th century. By then, people were not bled only when they were ill; it was also used preventatively, typically in the spring, which was seen as a time of rebirth and rejuvenation. During this period, leeching was the preferred method. A leech can suck several times its own body weight in blood, and the practice was far safer than cutting open a vein. Leeching became so popular that it sparked a “leech craze.” Throughout England, leech collectors (mostly women) would wade bare-legged into leech-infested ponds in order to attract the slimy bloodsuckers. Once the leeches had had their fill, they would fall off, leaving the collector to sell them to medical practitioners for profit.

Pewter box for transporting leeches, Europe, 1801-1900.

Unsurprisingly, leech collectors commonly suffered from headaches as a result of blood loss, and they sometimes contracted diseases from contact with the leeches.

But why did bloodletting remain so popular for so long? Despite advances in anatomy and diagnostics during the 18th and 19th centuries, therapeutics did not evolve quickly enough to match new understandings of the body. Many practitioners believed it was better to do something than to do nothing.

In the cases of Charles II and George Washington, they were most definitely wrong.




Mangling the Dead: Dissection, Past & Present


I never feel more alive than when I am standing among the rows and rows of anatomical specimens in medical museums around London. In one jar float the remains of an ulcerated stomach; in another, the hands of a suicide victim. Cabinets are filled with syphilitic skulls, arthritic joints, and cancerous bones. The unborn sit alongside the aged; murderers occupy the same space as the murdered.

As a medical historian, I have a professional interest in these collections as part of my ongoing research into the early history of surgery. Occasionally, however, I catch a glimpse of veins and arteries dangling from a severed wrist—or the bloated face of a child who died long ago—and I reflect on the actual surgeons and anatomists who cut up these dead bodies. How did they overcome the emotional and physical realities of dissection? And how do contemporary experiences in the dissection room compare with those from the past?

A few months ago, I sat down with my mother, a registered nurse, and we talked about her first dissection. She spoke with intense clarity as if it had happened only yesterday: “She was a woman in her 30s, younger than I was then, who had died from toxic shock syndrome. I felt sorry for her.” My mother paused as the memories flooded over her. “I wanted to cover her body with a blanket, not because she was naked…I don’t know. I just thought she’d be more comfortable with a blanket over her before we began poking, prodding, and pulling her to pieces.”

The idea that the anatomisation of the body is tantamount to “hacking,” “distorting,” or “disfiguring” what once was a living human being has troubled medical students for centuries. In 1663, the Danish physician, Thomas Bartholin, wrote that one must partake in “mangling the dead so that he may know the living.” Nearly a century later, the Master of Anatomy to the Incorporation of Surgeons of London remarked to those attending the public dissection of the criminal, Richard Lamb, “I think few who now look upon that miserable mangled object before us, can ever forget it.” Then, as now, confronting a cadaver could be a distressing event. But the unsanitary conditions of the past often made the experience even more traumatic.


Unlike the sterile laboratories of today, the “dead house” of previous centuries was a very different place. One medical student from the 19th century described the “swarms of sparrows fighting for scraps” and the “rats in the corner gnawing bleeding vertebrae”. The dissection theatre was bloody, smelly, and filled with all kinds of animals and insects trying to feast on the decomposing bodies, some of which had been plucked from the grave after a couple of days in the ground. In letters, diaries, and medical notes from Enlightenment Europe, I often come across descriptions of “decaying flesh,” “rancid corpses,” and “putrid stenches”—not the “slightly sweet, clinical smell” that some medical practitioners remember today. In a letter dated 8 October 1793, James Williams—a 16-year-old English surgical student—described his living quarters in John Hunter’s anatomy school in London as “a little perfumed.” The 18th-century German surgeon Lorenz Heister was not as delicate in his descriptions. He recommended that “Students in Surgery should not only be furnished with Strength of Body, but constancy of Mind” so that they remain “unmolested and unmoved by Stench, Blood, Pus and Nastiness that will naturally occur to them” during their practice.

There were plenty of young men who entered their anatomy lessons only to discover they did not have the “constancy of Mind” required to endure the realities of dissection. The composer Hector Berlioz, who attended medical school in Paris in 1821, “leapt out of the window, and fled as though Death and all his hideous crew were at my heels” the first time he entered what he described as a “human charnel-house.” Berlioz claimed that it was “twenty-four hours before I recovered from the shock” and he was able to return to the dissection theatre. Thomas Platter the Younger, a 16th-century Swiss physician, was haunted by the memories of his first dissection. During the first week of his lessons, he dreamt he had feasted upon human flesh. When he awoke in the middle of the night, he vomited.

Many, however, did learn to adapt over time. Bit by bit, piece by piece, they began to view the body not as a person but as an object. Some surgeons and physicians were even able to cut open the bodies of relatives. The French anatomist Guillaume Rondelet caused an uproar in 1538 when he publicly dissected the body of his infant son, whilst William Harvey undertook private dissections of both his father and his sister in the 17th century. These men, of course, were exceptional, but their examples illustrate the extent to which one could become detached from the dissected body.


This made me wonder about medical students today. How would their experiences compare with those from earlier periods? To find out, I interviewed several doctors about their earliest memories of the dissection room. From these conversations, I discovered that many medical students are just as apprehensive about their first encounters with a cadaver as their predecessors were. Erica Lilly, a general practitioner in Canada, remembers her unease as the cadaver’s head was unwrapped from its plastic covering during her first anatomy lesson. The face did not look human “as much as it looked like a mask,” she says, her voice laced with emotion. Similarly, Jennifer Kasten, Research Fellow in the Department of Surgery at the University of California, Los Angeles, recalls the “muffled crying” from some of her fellow students as the body bags were unzipped for the very first time. She describes the moment of the first cut as “one of quiet and awed intensity.” For her, it was an “initiation into the mysteries of medicine.”

The physical act of cutting open a dead body was only one of the challenges that interviewees mentioned during our conversations. The odour was another. Thomas Waite, Specialty Registrar in Public Health in Wales, remembers it vividly: “I’ll never forget [the smell]…At the end of the year I threw away the only set of clothes I wore under my dissection coat because no amount of washing could get rid of it.”


The sensory experiences of those working in earlier periods would have differed greatly from Waite’s. To better understand what medical students might once have felt when first confronted with the rotting flesh of unpreserved corpses, I turned to William MacLehose, a medical historian at University College London. Several years ago, he visited the “Body Farm,” the University of Tennessee’s Anthropology Research Facility in Knoxville, where human decomposition is studied. When I ask MacLehose to describe his reaction to what he saw on the Body Farm, he struggles to find words, pointing out that “words will always have some level of distance to them” and can never fully capture the “raw and horrific” experience of his first visit. He confesses that the “safe, stale, academic references” he had in his mind beforehand were no preparation for the reality he faced: “I remember wishing I hadn’t gone,” he admits. The realities that awaited the young surgical student of the 17th, 18th, and 19th centuries were similarly grim. These were not the bloodless bodies of today, with their preserved limbs and starched linens. Indeed, Kasten tells me that she found the “lack of particular smells” in the dissection room “surprising.” Even when slicing open the colon and “squeezing out the long toothpaste-like stream of feces,” she was not met with the familiar “human smells” one might expect.

Today, cadavers are cloaked in anonymity. Yet I was surprised by how frequently questions about a specimen’s former humanity came up during my interviews. Lilly remembers the first time she looked upon the feet of a cadaver. She wondered whether those feet “had walked on the beach,” whether those “toes had ever had sand between them.” Similarly, Waite often thinks back to an elderly man he dissected during anatomy lessons. Aside from some atherosclerosis, the man’s body belied his age, and Waite remembers being struck that one could reach such an age with so little evidence of disease visible after death. Twelve years later, he still has questions: had this man “walked with a frame or unaided?” Did he “maintain his independence or was he mentally more frail in life than his physical organs appeared in death?” I believe these questions speak less about the dead than they do about the living. Focusing on the humanity of the corpse sometimes serves as a distraction from one’s own sense of inhumanity as a dissector. It is a small comfort to those faced with the task of cutting open a dead body. “We worried there was something defective about us,” Kasten reflects, “that we were so easily able to go about cutting up a person into his constituent parts in a methodical, emotionless way.” After all, she admits, “our new normal really was very abnormal.”


Works Cited
1. D. Burch, Digging up the Dead: Uncovering the Life and Times of an Extraordinary Surgeon (Vintage: London, 2008).
2. A. Cunningham, The Anatomist Anatomis’d: An Experimental Discipline in Enlightenment Europe (Ashgate: Aldershot, 2010).
3. L. Payne, With Words and Knives: Learning Medical Dispassion in Early Modern England (Ashgate: Aldershot, 2007).
4. R. Richardson, Death, Dissection and the Destitute, 2nd edn. (University of Chicago Press: Chicago, 2000).

The Strange, the Morbid, the Bizarre – Now on Instagram!


After years of resisting, I’m finally on Instagram! Follow me for strange, morbid, and bizarre history facts each day by clicking HERE. The above photo (featured on my account) shows a radioactive chocolate bar from 1931. The German company that produced it claimed that it would make those who ate it look younger!

As always, you can also follow me on Twitter and Facebook to get your fill of the weird. Come say hello to me on social media.



“Limbs Not Yet Rigid” – A History of Dissecting the Living

The dead alive! H. Wigstead, 1784.

Several years ago, the news reported a story that could have come straight from the script of a horror movie. In October 2009, Colleen S. Burns was admitted to St Joseph’s Hospital Center in New York for a drug overdose. A short time later, a team of doctors pronounced the 39-year-old woman dead. Her family was notified and Burns’s body was prepped for organ donation.

The only problem was: Burns wasn’t actually dead.

She was in a drug-induced coma. Fortunately for her, she woke minutes before the first incision was made. Happily, occurrences such as this are few and far between these days. Yet in the past, incidents of premature dissection were not uncommon.

In 1746, Jacques-Bénigne Winslow wrote: “Tho’ Death, at some Time or other, is the necessary and unavoidable Portion of Human Nature in its present Condition, yet it is not always certain, that Persons taken for dead are really and irretrievably deprived of Life.” Indeed, the Danish anatomist went on to claim that it was “evident from Experience” that those thought to be dead have proven otherwise “by rising from their Shrowds [sic], their Coffins, and even from their Graves.” [1]

Fears over premature burial were ubiquitous during this period, so much so that people created “life preserving coffins” fitted with bells and breathing tubes. But even worse than being buried alive was the thought of being dissected alive. The threat was real, and it happened often enough to be remarked upon in contemporary literature with some frequency.

The 17th and 18th centuries were rife with stories about executed criminals who had “returned from the dead” just moments before being dissected. In 1651, Anne Green was hanged in Oxford for infanticide. For thirty minutes, she dangled at the end of the noose while her friends thumped her chest and put “their weight upon her leggs [sic]…lifting her up and then pulling her downe againe with a suddain jerke” in order to quicken her death. Afterwards, her body was cut down from the gallows and brought to Drs Thomas Willis and William Petty to be dissected. Just seconds before Willis plunged the knife into her sternum, Anne miraculously awoke. [2]

The 19th century had its fair share of incidents too. The physician and surgeon Sir Robert Christison complained that dissection at St Bartholomew’s Hospital in London was “apt to be performed with indecent, sometimes with dangerous haste” during this period. He remembered:

…an occasion when [William] Cullen commenced the dissection of a man who died an hour before, and when fluid blood gushed in abundance from the first incision through the skin…Instantly I seized his wrist in great alarm, and arrested his progress; nor was I easily persuaded to let him go on, when I saw the blood coagulate on the table exactly like living blood.

He further remarked: “It was no uncommon occurrence that, when the operator proceeded with his work, the body was sensibly warm, the limbs not yet rigid, the blood in the great vessels fluid and coagulable [sic].” [3]

An aged anatomist selecting his dissection instrument.

The problem was not confined to Britain. The French physician Pierre Charles Alexandre Louis reported the story of a patient who had been placed in his dissection room at the Pitié-Salpêtrière Hospital in Paris. The next morning, the doctor’s colleagues informed him that they had heard moans in the locked theatre overnight. When Louis went to investigate, he found “to his horror that the supposed corpse had revived during the night, and had actually died in the struggle to disengage herself from the winding sheet in which she was enveloped.” [4]

It was largely because of reports like this that anatomists themselves worried about the precise moment of death when cutting open bodies. To avoid disaster, good old Winslow suggested that a person’s gums be rubbed with caustic substances, and that the body be “stimulate[d]…with Whips and Nettles” before being dissected. Furthermore, the anatomist should “irritate his Intestines by Means of Clysters and Injections of Air or Smoke” as well as “agitate… the Limbs by violent Extensions and Inflexions.” If possible, an attempt should also be made to “shock [the person’s] Ears by hideous Shrieks and excessive Noises.” [5]

To our modern sensibilities, these measures may seem extreme, even comical, but to Winslow this was no laughing matter. In fact, he went even further, recommending that the palms of the hands and the soles of the feet be pricked with needles, and that the “Scapulae, Shoulders and Arms” be scarified using fire or sharp instruments so as to “lacerate and strip [them] of the epidermis.” Indeed, when reading Winslow’s work, one gets the distinct feeling that he took pleasure in imagining new ways to torture the dead.

Today, new debates have arisen over the very definition of death itself with the emergence of “beating heart cadavers.” Though considered dead in both a medical and legal capacity, these “cadavers” are kept on ventilators for organ and tissue transplantation. Their hearts beat; they expel waste; they have the ability to heal themselves of infection; they can even carry a fetus to term. Crucially, though, their brains are no longer functioning. It is in this way that the medical community has redefined death in the 21st century.

Yet some wonder whether these “beating heart cadavers” are really dead, or whether they are simply straddling the great divide between life and death before the lights finally go out. Or whether, worse still, they have been misdiagnosed, as in the case of Colleen Burns.


1. Jacques-Bénigne Winslow, The Uncertainty of the Signs of Death, and the Danger of Precipitate Interments and Dissections (1746), pp. 1-2.
2. Anon., A declaration from Oxford, of Anne Green a young woman that was lately, and unjustly hanged in the Castle-yard; but since recovered (London, 1651), p. 2.
3. R. Christison, The Life of Sir Robert Christison (1885-6), pp. 192-3. Originally quoted in Ruth Richardson, Death, Dissection and the Destitute (2000), p. 98.
4. Ibid.
5. Winslow, The Uncertainty of the Signs of Death, p. 2.

“Our Changing Attitudes Towards Death” – in THE GUARDIAN


My article on the history of our ever-changing attitudes towards death is out in The Guardian today, featuring fascinating photos by Dr. Paul Koudounaris of the Ma’nene Festival of Corpses in Indonesia. Big thanks to Caitlin Doughty and Dr. John Troyer for sharing their thoughts on the future of death with me for this article. Check it out by clicking HERE.