Mangling the Dead: Dissection, Past & Present

I never feel more alive than when I am standing among the rows and rows of anatomical specimens in medical museums around London. In one jar floats the remains of an ulcerated stomach; in another, the hands of a suicide victim. Cabinets are filled with syphilitic skulls, arthritic joints, and cancerous bones. The unborn sit alongside the aged; murderers occupy the same space as the murdered.

As a medical historian, I have a professional interest in these collections as part of my ongoing research into the early history of surgery. Occasionally, however, I catch a glimpse of veins and arteries dangling from a severed wrist—or the bloated face of a child who died long ago—and I reflect on the actual surgeons and anatomists who cut up these dead bodies. How did they overcome the emotional and physical realities of dissection? And how do contemporary experiences in the dissection room compare with those from the past?

A few months ago, I sat down with my mother, a registered nurse, and we talked about her first dissection. She spoke with intense clarity as if it had happened only yesterday: “She was a woman in her 30s, younger than I was then, who had died from toxic shock syndrome. I felt sorry for her.” My mother paused as the memories flooded over her. “I wanted to cover her body with a blanket, not because she was naked…I don’t know. I just thought she’d be more comfortable with a blanket over her before we began poking, prodding, and pulling her to pieces.”

The idea that the anatomisation of the body is tantamount to “hacking,” “distorting,” or “disfiguring” what once was a living human being has troubled medical students for centuries. In 1663, the Danish physician, Thomas Bartholin, wrote that one must partake in “mangling the dead so that he may know the living.” Nearly a century later, the Master of Anatomy to the Incorporation of Surgeons of London remarked to those attending the public dissection of the criminal, Richard Lamb, “I think few who now look upon that miserable mangled object before us, can ever forget it.” Then, as now, confronting a cadaver could be a distressing event. But the unsanitary conditions of the past often made the experience even more traumatic.

The “dead house” of previous centuries was a very different place from the sterile laboratories of today. One medical student from the 19th century described the “swarms of sparrows fighting for scraps” and the “rats in the corner gnawing bleeding vertebrae”. The dissection theatre was bloody, smelly, and filled with all kinds of animals and insects trying to feast on the decomposing bodies, some of which had been plucked from the grave after a couple of days in the ground. In letters, diaries, and medical notes from Enlightenment Europe, I often come across descriptions of “decaying flesh,” “rancid corpses,” and “putrid stenches”—not the “slightly sweet, clinical smell” that some medical practitioners remember today. In a letter dated Oct 8, 1793, James Williams—a 16-year-old English surgical student—described his living quarters in John Hunter’s anatomy school in London as “a little perfumed.” The 18th-century German surgeon, Lorenz Heister, was not as delicate in his descriptions. He recommended that “Students in Surgery should not only be furnished with Strength of Body, but constancy of Mind” so that they remain “unmolested and unmoved by Stench, Blood, Pus and Nastiness that will naturally occur to them” during their practice.

There were plenty of young men who entered their anatomy lessons only to discover they did not have the “constancy of Mind” required to endure the realities of dissection. The composer Hector Berlioz, who attended medical school in Paris in 1821, “leapt out of the window, and fled as though Death and all his hideous crew were at my heels” the first time he entered what he described as a “human charnel-house.” Berlioz claimed that it was “twenty-four hours before I recovered from the shock” and was able to return to the dissection theatre. Thomas Platter the Younger, a 16th-century Swiss physician, was haunted by the memories of his first dissection. During the first week of his lessons, he dreamt he had feasted upon human flesh. When he awoke in the middle of the night, he vomited.

Many, however, did learn to adapt over time. Bit by bit, piece by piece, they began to view the body not as a person but as an object. Some surgeons and physicians were even able to cut open the bodies of relatives. The French anatomist, Guillaume Rondelet, caused an uproar in 1538 when he publicly dissected the body of his infant son, whilst William Harvey undertook private dissections on both his father and his sister during the 17th century. These men, of course, were exceptional, but their cases illustrate the extent to which one could become detached from the dissected body.

This made me wonder about medical students today. How did their experiences compare to those from earlier periods? To find out, I interviewed several doctors about their earliest memories in the dissection room. From these conversations, I discovered that many medical students are just as apprehensive about their first encounters with a cadaver as their predecessors. Erica Lilly, a general practitioner in Canada, remembers her unease as the cadaver’s head was unwrapped from its plastic covering during her first anatomy lesson. The face did not look human “as much as it looked like a mask,” she says, her voice laced with emotion. Similarly, Jennifer Kasten, Research Fellow at the Department of Surgery of the University of California, Los Angeles, recalls the “muffled crying” from some of her fellow students as the body bags were unzipped for the very first time. She describes the first cut as “one of quiet and awed intensity.” For her, it was an “initiation into the mysteries of medicine.” The physical act of cutting open a dead body was only one of the challenges that interviewees mentioned during the course of our conversations. The odour was also an obstacle. Thomas Waite, Specialty Registrar in Public Health in Wales, remembers it vividly: “I’ll never forget [the smell]…At the end of the year I threw away the only set of clothes I wore under my dissection coat because no amount of washing could get rid of it.”

The sensory experiences of those working in the earlier periods would have differed greatly from those of Waite. To better understand what medical students in earlier periods might have felt when first confronted with the rotting flesh of unpreserved corpses, I turned to William MacLehose, a medical historian at University College London. Several years ago, he visited the “Body Farm,” the University of Tennessee’s Anthropology Research Facility in Knoxville, TN, USA, where human decomposition is studied. When I ask MacLehose to describe his reaction to what he saw on the Body Farm, he struggles to find words, pointing out that “words will always have some level of distance to them” that cannot fully capture the “raw and horrific” experience he had when he first visited the research facility. He confesses that the “safe, stale, academic references” he had in his mind before his visit were no preparation for the reality he faced: “I remember wishing I hadn’t gone,” he admits. The realities that awaited the young surgical student during the 17th, 18th, and 19th centuries were grim. These were not the bloodless bodies of today—with their preserved limbs and starched linens. Indeed, Kasten tells me that she found the “lack of particular smells” in the dissection room to be “surprising.” Even when slicing open the colon and “squeezing out the long toothpaste-like stream of feces,” she was not met with the familiar “human smells” one might expect.

Today, cadavers are cloaked in anonymity. Yet, I was surprised by the frequency with which questions about a specimen’s former humanity came up during my interviews. Lilly remembers the first time she looked upon the feet of a cadaver. She wondered if those feet “had walked on the beach;” if those “toes had ever had sand between them?” Similarly, Waite often thinks back to an elderly man he dissected during anatomy lessons. Aside from some atherosclerosis, the man’s body belied his age. Waite remembers being struck that one can achieve great age with so little evidence of disease after death. Twelve years later, he still has questions: had this man “walked with a frame or unaided?” Did he “maintain his independence or was he mentally more frail in life than his physical organs appeared in death?” I believe these questions speak less about the dead than they do about the living. Focusing on the humanity of the corpse sometimes serves as a distraction from one’s own sense of inhumanity as a dissector. It is a small comfort to those faced with the task of cutting open a dead body. “We worried there was something defective about us,” Kasten reflects, “that we were so easily able to go about cutting up a person into his constituent parts in a methodical, emotionless way.” After all, she admits, “our new normal really was very abnormal.”

If you enjoy my blog, please consider supporting my content by clicking HERE.

The Strange, the Morbid, the Bizarre – Now on Instagram!

After years of resisting, I’m finally on Instagram! Follow me for strange, morbid, and bizarre history facts each day by clicking HERE. The photo featured on my account shows a radioactive chocolate bar from 1931. The German company that produced it claimed that it would make those who ate it look younger!

As always, you can also follow me on Twitter and Facebook to get your fill of the weird. Come say hello to me on social media.


“Limbs Not Yet Rigid” – A History of Dissecting the Living

[Image: “The dead alive!” by H. Wigstead, 1784]

Several years ago, the news reported a story that could have come straight from the script of a horror movie. In October 2009, Colleen S. Burns was admitted to St Joseph’s Hospital Center in New York for a drug overdose. A short time later, a team of doctors pronounced the 39-year-old woman dead. Her family was notified and Burns’s body was prepped for organ donation.

The only problem was: Burns wasn’t actually dead.

She was in a drug-induced coma. Fortunately for her, she woke minutes before the first incision was made. Happily, occurrences such as this are few and far between these days. Yet in the past, incidents of premature dissection were not uncommon.

In 1746, Jacques-Bénigne Winslow wrote: “Tho’ Death, at some Time or other, is the necessary and unavoidable Portion of Human Nature in its present Condition, yet it is not always certain, that Persons taken for dead are really and irretrievably deprived of Life.” Indeed, the Danish anatomist went on to claim that it was “evident from Experience” that those thought to be dead have proven otherwise “by rising from their Shrowds [sic], their Coffins, and even from their Graves.” [1]

Fears over premature burial were ubiquitous during this period, so much so that people created “life preserving coffins” with bells and breathing tubes attached. But even worse than being buried alive was the thought of being dissected alive. The threat was real, and it happened often enough to be remarked upon in contemporary literature.

The 17th and 18th centuries were rife with stories about executed criminals who had “returned from the dead” just moments before being dissected. In 1651, Anne Green was hanged in Oxford for infanticide. For thirty minutes, she dangled at the end of the noose while her friends thumped her chest and put “their weight upon her leggs [sic]…lifting her up and then pulling her downe againe with a suddain jerke” in order to quicken her death. Afterwards, her body was cut down from the gallows and brought to Drs Thomas Willis and William Petty to be dissected. Just seconds before Willis plunged the knife into her sternum, Anne miraculously awoke. [2]

The 19th century had its fair share of incidents too. The physician and surgeon, Sir Robert Christison, complained that dissection in St Bartholomew’s Hospital in London was “apt to be performed with indecent, sometimes with dangerous haste” during this period. He remembered:

…an occasion when [William] Cullen commenced the dissection of a man who died one hour before, and when fluid blood gushed in abundance from the first incision through the skin…Instantly I seized his wrist in great alarm, and arrested his progress; nor was I easily persuaded to let him go on, when I saw the blood coagulate on the table exactly like living blood.

He further remarked: “It was no uncommon occurrence that, when the operator proceeded with his work, the body was sensibly warm, the limbs not yet rigid, the blood in the great vessels fluid and coagulable [sic].” [3]

[Image: an aged anatomist selecting his dissection instrument]

The problem wasn’t contained to Britain alone. The French physician — Pierre Charles Alexandre Louis — reported the story of a patient who had been placed in his dissection room at the Pitié-Salpêtrière Hospital in Paris. The next morning, the doctor’s colleagues informed him that they had heard moans in the locked theatre overnight. When Louis went to check it out, he found “to his horror that the supposed corpse had revived during the night, and had actually died in the struggle to disengage herself from the winding sheet in which she was enveloped.” [4]

It was largely because of reports like this that anatomists, themselves, worried about the precise moment of death when cutting open bodies. To avoid disaster, good old Winslow suggested that a person’s gums be rubbed with caustic substances, and that the body be “stimulate[d]…with Whips and Nettles” before being dissected. Furthermore, the anatomist should “irritate his Intestines by Means of Clysters and Injections of Air or Smoke” as well as “agitate… the Limbs by violent Extensions and Inflexions.” If possible, an attempt should also be made to “shock [the person’s] Ears by hideous Shrieks and excessive Noises.” [5]

To our modern sensibilities, these measures may seem extreme, even comical, but to Winslow, this was no laughing matter. In fact, he went even further, recommending that the palms of the hands and the soles of the feet be pricked with needles, and that the “Scapulae, Shoulders and Arms” be scarified using fire or sharp instruments so as to “lacerate and strip [them] of the epidermis.” Indeed, when reading Winslow’s work, one gets the distinct feeling that he took pleasure in imagining new ways to torture the dead.

Today, new debates have arisen over the very definition of death itself with the emergence of “beating heart cadavers.” Though considered dead in both a medical and legal capacity, these “cadavers” are kept on ventilators for organ and tissue transplantation. Their hearts beat; they expel waste; they have the ability to heal themselves of infection; they can even carry a fetus to term. Crucially, though, their brains are no longer functioning. It is in this way that the medical community has redefined death in the 21st century.

Yet, some wonder whether these “beating heart cadavers” are really dead, or whether they are just straddling the great divide between life and death before the lights finally go out. Or whether, worse still, they have been misdiagnosed, as in the case of Colleen Burns.

1. Jacques-Bénigne Winslow, The Uncertainty of the Signs of Death, and the Danger of Precipitate Interments and Dissections (1746), pp. 1-2.
2. Anon., A declaration from Oxford, of Anne Green a young woman that was lately, and unjustly hanged in the Castle-yard; but since recovered (London, 1651), p. 2.
3. R. Christison, The Life of Sir Robert Christison (1885-6), pp. 192-3. Originally quoted in Ruth Richardson, Death, Dissection and the Destitute (2000), p. 98.
4. Ibid.
5. Winslow, The Uncertainty of the Signs of Death, p. 2.

“Our Changing Attitudes Towards Death” – in THE GUARDIAN

My article on the history of our ever-changing attitudes towards death is out in The Guardian today, featuring fascinating photos by Dr. Paul Koudounaris of the Ma’nene Festival of Corpses in Indonesia. Big thanks to Caitlin Doughty and Dr. John Troyer for sharing their thoughts on the future of death with me for this article. Check it out by clicking HERE.

The Mad Dogs of London: A Tale of Rabies

[Image: a mad dog on the run in a London street, attacked by citizens]

There was panic on the streets of London in 1760, and the city’s newspapers weren’t helping the situation. Hundreds of column inches, for week upon week, were full of terrifying reports about an outbreak of attacks by rabid dogs. Armchair experts even wrote letters to newspaper editors offering advice and hypotheses on the causes and prevention of rabies (or “hydrophobia” as contemporaries called it).

Rumor fueled the journalistic fire, and the scare stories came in thick and fast. The London Chronicle was just one of many newspapers that reported the following representative sample of incidents.

A rabid dog bit a child on the hand in the Strand, and her parents had her arm immediately cut off to prevent the infection spreading, but the unfortunate girl expired soon after, in great agony.

One Sunday morning, a William Hambly of Deptford was getting into his coach, when he was bitten by a “mad dog,” which had come running down But Lane from London Road. The alarm was raised, and the dog was soon hunted down and shot.

A nine-year-old girl of Virginia Street, Wapping, was bitten by a puppy. Shortly afterwards she began to show signs of madness, and her parents were obliged to tie her down to her bed. It was reported that a few days later she raved and barked like a dog.

The son of a ticket porter of Thames Street was also savaged by a stray mutt, but as there appeared to be no subsequent signs of madness about him, his death a while after the attack was attributed to fright rather than to the wound itself. Another report stated that a mad dog had bitten three other dogs in Islington, and also two cows belonging to a Mr. Pullein. [1]

[Image: man being bitten by a mad dog]

Such stories unnerved the people of Georgian London, and not without good reason. Even today, rabies is the most deadly virus on earth—more so than Ebola—with nearly a 100 percent mortality rate in the unvaccinated. Since ancient times, it had been recognized that the virus was contracted via an animal bite. In the fourth century B.C., Aristotle wrote that “Rabies makes the animal mad… It is fatal to the dog itself, and to any animal it bites.” Pliny the Elder also recognized this mode of transmission, but added that dogs could become rabid by tasting the menstrual blood of a woman. [2] (Seems legit).

The first mention of rabies in Britain dates back to 1026 A.D., when a Welshman named Howel Dda reported that numerous dogs were suffering from the “madness.” As a result, a law was enacted in Wales that called for the killing of any dog suspected of having rabies. In addition to preventative measures, doctors offered a multitude of suggestions for how to cure the virus once a person contracted it. These included various herbal remedies, such as Scutellaria lateriflora, also known as Mad-dog Skullcap (a member of the mint family), as well as cauterizing the wound with a hot iron. [3]

It wasn’t until the 18th century, however, that large outbreaks of rabies began to occur in Britain. With public alarm at fever pitch in 1760, London’s Common Council decided it was time for a radical solution. On August 26th, the Council issued an order that a bounty of 2 shillings should be paid to public-spirited Londoners for any stray dog killed. As a consequence, boys, apprentices, and nefarious youths started going about the city carrying clubs and cudgels, with the intention of butchering numberless dogs. Their bloodied and battered carcasses were tossed in ditches in Moorfields. The Gentleman’s Magazine reported that “No less than the bodies of thirty dead dogs were told in one day…by a person of undoubted veracity, who was only casually passing by that way.” [4]

[Image: slaying a mad dog]

The Council’s actions were condemned by many animal lovers as licensed cruelty. The Gentleman’s Magazine added its concern: “not one in a thousand [of these dead dogs] will be mad…Those who make it a revenue to kill the dogs will carefully avoid meddling with any that have bad symptoms, from the dread of the consequences.” [5] One famous name was dragged into the debate about the rights and wrongs of the dog cull. Renowned pug-owner William Hogarth, doyen of satirical engravings, had his own art form turned against him by the caricaturists, who showed him in a print entitled The Dog killers of London & Westminster or licenc’d Cruelty 1760. Surrounded by men beating strays to death in the street, the distraught Hogarth laments, “Oh! My poor Pugg. Oh! My little Dog.”

The rabies outbreak lasted three years. Eventually, the infected dogs died out or were killed, and calm was restored to the city. It would take another 125 years before Louis Pasteur would create a vaccine against the deadly virus and test it on 9-year-old Joseph Meister. But that, dear reader, is a subject for another blog post.

1. I am hugely indebted to Adrian Teal and his fantastic book, The Gin Lane Gazette, for the 18th century stories cited in this article. Also, see The London Chronicle (August 1760) for similar tales.
2. Both Aristotle and Pliny were quoted in Arthur A. King, Historical Perspective of Rabies in Europe and the Mediterranean Basin (Paris, 2004). This book can be found online. It has a lot of detailed information about historical perceptions of rabies. Another great (digestible) article on the subject is by Lisa Smith, “The Problem of Mad Dogs in the Eighteenth Century,” The Sloane Letters Blog (27 January 2014).
3. Mike Rendell, “Rabies in the Eighteenth Century,” Georgian Gentleman (5 July 2012); and G. Fleming, Rabies and Hydrophobia. Their history, nature, causes, symptoms and prevention (London: Chapman and Hall, 1872), 405.
4. The Gentleman’s Magazine (August 1760).
5. Ibid.

“The Queen’s Big Belly:” The Phantom Pregnancy of Mary I

On 30 April 1555, the people of London took to the streets in celebration as bells ringing out around the city announced that Mary I, Queen of England, had been safely delivered of a healthy son. A preacher proclaimed to the gathered crowds that no one had ever seen such a beautiful prince. News spread quickly to the continent, and letters of congratulation to the royal family began pouring in from across Europe.

There was just one problem: Mary hadn’t given birth. In fact, there was no baby at all. What was initially hailed as a royal pregnancy ended in devastation and embarrassment for the Tudor Queen several months later.

Rumors began circulating about the pregnancy shortly after the Queen’s wedding to Philip II of Spain in September 1554. Mary, who was by then 37 years old, had reportedly stopped menstruating. Over the coming months, her belly expanded and her doctors attended to her morning sickness. The Queen—thoroughly convinced of the legitimacy of her pregnancy—ordered a royal nursery prepared in anticipation of the arrival of an heir that spring. Letters announcing the birth of the prince or princess were primed and ready to be sent out at a day’s notice. Only the dates and the sex of the child needed filling in.

The months ticked by. In June, the Queen issued a statement claiming that God would not allow her child to be born until all Protestant dissenters were punished. Mary—who had already burned countless heretics at the stake since coming to the throne the previous year—began another round of executions in a desperate attempt to induce labor. During this time, the court grew suspicious of the Queen’s condition. Giovanni Michieli, the Venetian ambassador, wrote that Mary’s pregnancy was more likely to “end in wind rather than anything else.”

By August it was clear that there would be no baby, and Mary finally emerged from her confinement, humiliated and defeated. Her belly was once again flat. Her body showed none of the signs that had led to the pregnancy being announced. Her political rivals rejoiced, believing this to be a sign of divine retribution.

[Image: foetus in the womb]

Conspiracy theories erupted immediately. Many people were convinced that Mary was ill. Others believed she had miscarried and simply couldn’t face the truth. Some even went so far as to claim that the barren Queen had been planning to smuggle a baby boy into the court but that the plan had fallen apart. A few wondered if Mary was even still alive. Whatever had happened, however, one thing was clear: Mary seemed to truly believe she had been pregnant.

Pseudocyesis, or phantom pregnancy, was a condition recognized by medical practitioners in the Tudor period. The physician William Harvey—best known for his discovery of the circulation of the blood—recorded several cases of phantom pregnancies which he had encountered in his practice during the 17th century. Most, he said, ended in “flatulency and fatness.” While many doctors like Harvey believed these phantom pregnancies were the product of trapped wind or the build-up of some kind of matter in the uterus, some thought they were the direct result of wishful thinking on the part of the expectant mother. Guillaume Mauquest de la Motte referred to aging women, like Mary, who “have such an aversion for old-age, that they had rather believe themselves with child, than to confess they are growing old.”

Although it may seem astonishing today that a woman could falsely believe herself to be pregnant for a full nine months, we must remember that Mary lived during a time when there were no certain ways of determining pregnancy. This wasn’t helped by the fact that throughout her adolescence, Mary had also suffered from extremely painful and unpredictable periods that often left her incapacitated for weeks on end. Wildly fluctuating hormones may have been the cause of her halted menstruation in 1554, which naturally the Queen and her doctors took to be a sign of pregnancy.

When it became clear there would be no child, Mary’s husband left England to prosecute a war against France. When he returned to his wife’s side two years later, he brought his mistress with him. It was at this time that the Queen suffered yet another phantom pregnancy—perhaps brought on by grief over her failing marriage and her inability to bear children. This second incident, however, led many to believe she had a tumor growing in her womb. What else could cause Mary’s belly to swell as if she were carrying a royal heir?

Sadly, Mary died childless shortly after this second phantom pregnancy. She had been Queen for only five years. Those who embalmed her body and prepared it for burial found no indication of a tumor, or any other explanation for her false pregnancies, which had been a source of such deep sadness for Mary in her lifetime.
