The Cutter’s Art: A Brief History of Bloodletting

An ill man being bled by his doctor. Coloured etching.

When King Charles II suffered a sudden seizure on the morning of 2 February 1685, his personal physician had just the remedy. He quickly slashed open a vein in the king’s left arm and filled a basin with the royal blood. Over the next few days, the king was tortured by a swarm of physicians buzzing around his bedside. They gave enemas and urged him to drink various potions, including boiled spirits from a human skull. The monarch was bled a second time before he lapsed into a coma. He never awoke.

Even without his doctors’ ministrations, the king may well have succumbed to whatever ailed him, yet his final days were certainly not made any easier by the relentless bloodletting and purging. By the time of Charles II’s death, bloodletting had long been standard medical practice.

Bloodletting owes much of its rationale to the Roman physician, Galen, who lived in the 2nd century AD. Galen taught that blood was the product of food. After reaching the stomach, food was liquefied and then sent to the liver, where it was turned into blood. Occasionally, the body produced an excess of blood, which, according to Galenic practitioners, caused fevers, headaches, and even seizures. The only recourse was to rid the body of this superfluous fluid.

As vital as bloodletting was felt to be, many physicians believed the “cutter’s art” was beneath their station. Instead, they referred those in need of bleeding to barber-surgeons, who carried out this duty in addition to a diverse range of other personal services.

The traditional striped barber’s pole harks back to that era, when it served as an advertisement for the barber-surgeon’s proficiency as a bloodletter. The pole represents the rod that the patient gripped to make their veins bulge, and the brass ball at the top symbolizes the basin used to collect the blood. The red and white stripes represent the bloodied bandages. Once washed and hung to dry on the rod outside the shop, they would twist in the wind, forming the familiar spiral pattern adorning modern poles.

While bloodletting seems barbaric to modern eyes, it was considered a standard part of medical treatment, demanded by many people when they felt ill, much as we might ask for antibiotics at the doctor’s office today. Take George Washington, who woke on the morning of 14 December 1799 complaining that he couldn’t breathe. Fearing his doctor would not arrive in time, Washington asked the overseer of his slaves to step in and bleed him. The cut was deep, and Washington lost nearly half a pint before the wound was closed. Eventually, the physicians arrived and proceeded to bleed Washington four more times in the next eight hours. By evening, America’s first president was dead. One of his physicians, James Craik, later admitted that he thought the blood loss was partly responsible.

Bloodletting reached its apogee in the early 19th century. By then, people were not just bled when they were ill; it was also used as a preventative measure, typically in the spring, which was seen as a time of rebirth and rejuvenation. In this era, leeching was the preferred method. The leech can suck several times its own body weight in blood, and applying one is far safer than cutting open a vein. Leeching became so popular that it sparked a “leech craze.” Throughout England, leech collectors (mostly women) would wade bare-legged into leech-infested ponds in order to attract the slimy bloodsuckers. Once the leeches had had their fill, they would fall off, leaving the collector to sell them to medical practitioners for a profit.

Pewter box for transporting leeches, Europe, 1801–1900.

Unsurprisingly, leech collectors commonly suffered from headaches as a result of blood loss, and sometimes contracted diseases from contact with the leeches.

But why did bloodletting remain so popular for so long? Despite advances in anatomy and diagnostics during the 18th and 19th centuries, therapeutics did not evolve quickly enough to match new understandings of the body. Many practitioners believed it was better to do something than to do nothing.

In the cases of Charles II and George Washington, that belief was most definitely wrong.

Mangling the Dead: Dissection, Past & Present

I never feel more alive than when I am standing among the rows and rows of anatomical specimens in medical museums around London. In one jar float the remains of an ulcerated stomach; in another, the hands of a suicide victim. Cabinets are filled with syphilitic skulls, arthritic joints, and cancerous bones. The unborn sit alongside the aged; murderers occupy the same space as the murdered.

As a medical historian, I have a professional interest in these collections as part of my ongoing research into the early history of surgery. Occasionally, however, I catch a glimpse of veins and arteries dangling from a severed wrist—or the bloated face of a child who died long ago—and I reflect on the actual surgeons and anatomists who cut up these dead bodies. How did they overcome the emotional and physical realities of dissection? And how do contemporary experiences in the dissection room compare with those from the past?

A few months ago, I sat down with my mother, a registered nurse, and we talked about her first dissection. She spoke with intense clarity, as if it had happened only yesterday: “She was a woman in her 30s, younger than I was then, who had died from toxic shock syndrome. I felt sorry for her.” My mother paused as the memories flooded over her. “I wanted to cover her body with a blanket, not because she was naked…I don’t know. I just thought she’d be more comfortable with a blanket over her before we began poking, prodding, and pulling her to pieces.”

The idea that the anatomisation of the body is tantamount to “hacking,” “distorting,” or “disfiguring” what once was a living human being has troubled medical students for centuries. In 1663, the Danish physician, Thomas Bartholin, wrote that one must partake in “mangling the dead so that he may know the living.” Nearly a century later, the Master of Anatomy to the Incorporation of Surgeons of London remarked to those attending the public dissection of the criminal, Richard Lamb, “I think few who now look upon that miserable mangled object before us, can ever forget it.” Then, as now, confronting a cadaver could be a distressing event. But the unsanitary conditions of the past often made the experience even more traumatic.

Unlike the sterile laboratories of today, the “dead house” of previous centuries was a very different place. One medical student from the 19th century described the “swarms of sparrows fighting for scraps” and the “rats in the corner gnawing bleeding vertebrae.” The dissection theatre was bloody, smelly, and filled with all kinds of animals and insects trying to feast on the decomposing bodies, some of which had been plucked from the grave after a couple of days in the ground. In letters, diaries, and medical notes from Enlightenment Europe, I often come across descriptions of “decaying flesh,” “rancid corpses,” and “putrid stenches”—not the “slightly sweet, clinical smell” that some medical practitioners remember today. In a letter dated 8 October 1793, James Williams—a 16-year-old English surgical student—described his living quarters in John Hunter’s anatomy school in London as “a little perfumed.” The 18th-century German surgeon, Lorenz Heister, was not as delicate in his descriptions. He recommended that “Students in Surgery should not only be furnished with Strength of Body, but constancy of Mind” so that they remain “unmolested and unmoved by Stench, Blood, Pus and Nastiness that will naturally occur to them” during their practice.

There were plenty of young men who entered their anatomy lessons only to discover they did not have the “constancy of Mind” required to endure the realities of dissection. The composer Hector Berlioz, who attended medical school in Paris in 1821, “leapt out of the window, and fled as though Death and all his hideous crew were at my heels” the first time he entered what he described as a “human charnel-house.” Berlioz claimed that it was “twenty-four hours before I recovered from the shock” and he was able to return to the dissection theatre. Thomas Platter the Younger, a 16th-century Swiss physician, was haunted by the memories of his first dissection. During the first week of his lessons, he dreamt he had feasted upon human flesh. When he awoke in the middle of the night, he vomited.

Many, however, did learn to adapt over time. Bit by bit, piece by piece, they began to view the body not as a person but as an object. Some surgeons and physicians were even able to cut open the bodies of relatives. The French anatomist, Guillaume Rondelet, caused an uproar in 1538 when he publicly dissected the body of his infant son, whilst William Harvey undertook private dissections of both his father and his sister in the 17th century. These men, of course, were exceptional, but their examples illustrate the extent to which one could become detached from the dissected body.

This made me wonder about medical students today. How do their experiences compare to those from earlier periods? To find out, I interviewed several doctors about their earliest memories of the dissection room. From these conversations, I discovered that many medical students are just as apprehensive about their first encounters with a cadaver as their predecessors were. Erica Lilly, a general practitioner in Canada, remembers her unease as the cadaver’s head was unwrapped from its plastic covering during her first anatomy lesson. The face did not look human “as much as it looked like a mask,” she says, her voice laced with emotion. Similarly, Jennifer Kasten, Research Fellow at the Department of Surgery of the University of California, Los Angeles, recalls the “muffled crying” from some of her fellow students as the body bags were unzipped for the very first time. She describes the mood during the first cut as “one of quiet and awed intensity.” For her, it was an “initiation into the mysteries of medicine.”

The physical act of cutting open a dead body was only one of the challenges that interviewees mentioned during the course of our conversations. The odour was also an obstacle. Thomas Waite, Specialty Registrar in Public Health in Wales, remembers it vividly: “I’ll never forget [the smell]…At the end of the year I threw away the only set of clothes I wore under my dissection coat because no amount of washing could get rid of it.”

The sensory experiences of those working in earlier periods would have differed greatly from those of Waite. To better understand what medical students might once have felt when first confronted with the rotting flesh of unpreserved corpses, I turned to William MacLehose, a medical historian at University College London. Several years ago, he visited the “Body Farm,” the University of Tennessee’s Anthropology Research Facility in Knoxville, where human decomposition is studied. When I ask MacLehose to describe his reaction to what he saw there, he struggles to find words, pointing out that “words will always have some level of distance to them” that cannot fully capture the “raw and horrific” experience of his first visit. He confesses that the “safe, stale, academic references” he had in his mind beforehand were no preparation for the reality he faced: “I remember wishing I hadn’t gone,” he admits. The realities that awaited the young surgical student of the 17th, 18th, and 19th centuries were grim. These were not the bloodless bodies of today—with their preserved limbs and starched linens. Indeed, Kasten tells me that she found the “lack of particular smells” in the dissection room to be “surprising.” Even when slicing open the colon and “squeezing out the long toothpaste-like stream of feces,” she was not met with the familiar “human smells” one might expect.

Today, cadavers are cloaked in anonymity. Yet I was surprised by how frequently questions about a specimen’s former humanity came up during my interviews. Lilly remembers the first time she looked upon the feet of a cadaver. She wondered if those feet “had walked on the beach,” if those “toes had ever had sand between them.” Similarly, Waite often thinks back to an elderly man he dissected during anatomy lessons. Aside from some atherosclerosis, the man’s body belied his age, and Waite remembers being struck that one could reach old age with so little evidence of disease after death. Twelve years later, he still has questions: had this man “walked with a frame or unaided?” Did he “maintain his independence or was he mentally more frail in life than his physical organs appeared in death?” I believe these questions speak less about the dead than they do about the living. Focusing on the humanity of the corpse sometimes serves as a distraction from one’s own sense of inhumanity as a dissector. It is a small comfort to those faced with the task of cutting open a dead body. “We worried there was something defective about us,” Kasten reflects, “that we were so easily able to go about cutting up a person into his constituent parts in a methodical, emotionless way.” After all, she admits, “our new normal really was very abnormal.”

Works Cited
1. D. Burch, Digging up the Dead: Uncovering the Life and Times of an Extraordinary Surgeon (Vintage: London, 2008).
2. A. Cunningham, The Anatomist Anatomis’d: An Experimental Discipline in Enlightenment Europe (Ashgate: Aldershot, 2010).
3. L. Payne, With Words and Knives: Learning Medical Dispassion in Early Modern England (Ashgate: Aldershot, 2007).
4. R. Richardson, Death, Dissection and the Destitute, 2nd edn. (University of Chicago Press: Chicago, 2000).

The Strange, the Morbid, the Bizarre – Now on Instagram!

After years of resisting, I’m finally on Instagram! Follow me for strange, morbid, and bizarre history facts each day by clicking HERE. One photo featured on my account shows a radioactive chocolate bar from 1931. The German company that produced it claimed that it would make those who ate it look younger!

As always, you can also follow me on Twitter and Facebook to get your fill of the weird. Come say hello to me on social media.

“Limbs Not Yet Rigid” – A History of Dissecting the Living

The dead alive! H. Wigstead, 1784.

Several years ago, the news reported a story that could have come straight from the script of a horror movie. In October 2009, Colleen S. Burns was admitted to St Joseph’s Hospital Center in New York for a drug overdose. A short time later, a team of doctors pronounced the 39-year-old woman dead. Her family was notified and Burns’s body was prepped for organ donation.

The only problem was: Burns wasn’t actually dead.

She was in a drug-induced coma. Fortunately for her, she woke minutes before the first incision was made. Happily, occurrences such as this are few and far between these days. Yet in the past, incidents of premature dissection were not uncommon.

In 1746, Jacques-Bénigne Winslow wrote: “Tho’ Death, at some Time or other, is the necessary and unavoidable Portion of Human Nature in its present Condition, yet it is not always certain, that Persons taken for dead are really and irretrievably deprived of Life.” Indeed, the Danish anatomist went on to claim that it was “evident from Experience” that those thought to be dead have proven otherwise “by rising from their Shrowds [sic], their Coffins, and even from their Graves.” [1]

Fears over premature burial were ubiquitous during this period, so much so that people created “life-preserving coffins” with bells and breathing tubes attached. But even worse than being buried alive was the thought of being dissected alive. The threat was real, and it happened often enough to be remarked upon in contemporary literature.

The 17th and 18th centuries were rife with stories about executed criminals who had “returned from the dead” just moments before being dissected. In 1651, Anne Green was hanged in Oxford for infanticide. For thirty minutes, she dangled at the end of the noose while her friends thumped her chest and put “their weight upon her leggs [sic]…lifting her up and then pulling her downe againe with a suddain jerke” in order to quicken her death. Afterwards, her body was cut down from the gallows and brought to Drs Thomas Willis and William Petty to be dissected. Just seconds before Willis plunged the knife into her sternum, Anne miraculously awoke. [2]

The 19th century had its fair share of incidents too. The physician and surgeon, Sir Robert Christison, complained that dissection at St Bartholomew’s Hospital in London was “apt to be performed with indecent, sometimes with dangerous haste” during this period. He remembered:

…an occasion when [William] Cullen commenced the dissection of a man who died one hour before, and when fluid blood gushed in abundance from the first incision through the skin…Instantly I seized his wrist in great alarm, and arrested his progress; nor was I easily persuaded to let him go on, when I saw the blood coagulate on the table exactly like living blood.

He further remarked: “It was no uncommon occurrence that, when the operator proceeded with his work, the body was sensibly warm, the limbs not yet rigid, the blood in the great vessels fluid and coagulable [sic].” [3]

An aged anatomist selecting his dissection instrument.

The problem wasn’t confined to Britain alone. The French physician, Pierre Charles Alexandre Louis, reported the story of a patient who had been placed in his dissection room at the Pitié-Salpêtrière Hospital in Paris. The next morning, the doctor’s colleagues informed him that they had heard moans in the locked theatre overnight. When Louis went to investigate, he found “to his horror that the supposed corpse had revived during the night, and had actually died in the struggle to disengage herself from the winding sheet in which she was enveloped.” [4]

It was largely because of reports like these that anatomists themselves worried about the precise moment of death when cutting open bodies. To avoid disaster, good old Winslow suggested that a person’s gums be rubbed with caustic substances, and that the body be “stimulate[d]…with Whips and Nettles” before being dissected. Furthermore, the anatomist should “irritate his Intestines by Means of Clysters and Injections of Air or Smoke” as well as “agitate…the Limbs by violent Extensions and Inflexions.” If possible, an attempt should also be made to “shock [the person’s] Ears by hideous Shrieks and excessive Noises.” [5]

To our modern sensibilities, these measures may seem extreme, even comical, but to Winslow, this was no laughing matter. In fact, he went even further, recommending that the palms of the hands and the soles of the feet be pricked with needles, and that the “Scapulae, Shoulders and Arms” be scarified using fire or sharp instruments so as to “lacerate and strip [them] of the epidermis.” Indeed, when reading Winslow’s work, one gets the distinct feeling that he took pleasure in imagining new ways to torture the dead.

Today, new debates have arisen over the very definition of death itself with the emergence of “beating heart cadavers.” Though considered dead in both a medical and legal capacity, these “cadavers” are kept on ventilators for organ and tissue transplantation. Their hearts beat; they expel waste; they have the ability to heal themselves of infection; they can even carry a fetus to term. Crucially, though, their brains are no longer functioning. It is in this way that the medical community has redefined death in the 21st century.

Yet some wonder whether these “beating heart cadavers” are really dead, or whether they are just straddling the great divide between life and death before the lights finally go out. Or, worse, whether they have been misdiagnosed, as in the case of Colleen Burns.

1. Jacques-Bénigne Winslow, The Uncertainty of the Signs of Death, and the Danger of Precipitate Interments and Dissections (1746), pp. 1-2.
2. Anon., A declaration from Oxford, of Anne Green a young woman that was lately, and unjustly hanged in the Castle-yard; but since recovered (London, 1651), p. 2.
3. R. Christison, The Life of Sir Robert Christison (1885-6), pp. 192-3. Originally quoted in Ruth Richardson, Death, Dissection and the Destitute (2000), p. 98.
4. Ibid.
5. Winslow, The Uncertainty of the Signs of Death, p. 2.

“Our Changing Attitudes Towards Death” – in THE GUARDIAN

My article on the history of our ever-changing attitudes towards death is out in The Guardian today, featuring fascinating photos by Dr. Paul Koudounaris of the Ma’nene Festival of Corpses in Indonesia. Big thanks to Caitlin Doughty and Dr. John Troyer for sharing their thoughts on the future of death with me for this article. Check it out by clicking HERE.

The Mad Dogs of London: A Tale of Rabies

A mad dog on the run in a London street: citizens attack it.

There was panic on the streets of London in 1760, and the city’s newspapers weren’t helping the situation. Hundreds of column inches, week upon week, were filled with terrifying reports about an outbreak of attacks by rabid dogs. Armchair experts even wrote letters to newspaper editors offering advice and hypotheses on the causes and prevention of rabies (or “hydrophobia,” as contemporaries called it).

Rumor fueled the journalistic fire, and the scare stories came in thick and fast. The London Chronicle was just one of many newspapers that reported the following representative sample of incidents.

A rabid dog bit a child on the hand in the Strand, and her parents had her arm immediately cut off to prevent the infection spreading, but the unfortunate girl expired soon after, in great agony.

One Sunday morning, a William Hambly of Deptford was getting into his coach, when he was bitten by a “mad dog,” which had come running down But Lane from London Road. The alarm was raised, and the dog was soon hunted down and shot.

A nine-year-old girl of Virginia Street, Wapping, was bitten by a puppy. Shortly afterwards she began to show signs of madness, and her parents were obliged to tie her down to her bed. It was reported that a few days later she raved and barked like a dog.

The son of a ticket porter of Thames Street was also savaged by a stray mutt, but as there appeared to be no subsequent signs of madness about him, his death a while after the attack was attributed to fright rather than to the wound itself. Another report stated that a mad dog had bitten three other dogs in Islington, and also two cows belonging to a Mr. Pullein. [1]

Man being bitten by a mad dog.

Such stories unnerved the people of Georgian London, and not without good reason. Even today, rabies is the deadliest virus on earth—more so than Ebola—with a mortality rate of nearly 100 percent once symptoms appear. Since ancient times, it has been recognized that the disease is contracted via an animal bite. In the 4th century B.C., Aristotle wrote that “Rabies makes the animal mad… It is fatal to the dog itself, and to any animal it bites.” Pliny the Elder also recognized this mode of transmission, but added that dogs could become rabid by tasting the menstrual blood of a woman. [2] (Seems legit.)

The first mention of rabies in Britain dates back to 1026 A.D. A Welshman named Howel Dda reported that numerous dogs were suffering from the “madness.” As a result, a law was enacted in Wales that called for the killing of any dog suspected of having rabies. In addition to preventative measures, doctors offered a multitude of suggestions for how to cure the virus once a person contracted it. These included various herbal remedies, such as Scutellaria lateriflora, also known as mad-dog skullcap (a member of the mint family), and cauterizing the wound with a hot iron. [3]

It wasn’t until the 18th century, however, that large outbreaks of rabies began to occur in Britain. With public alarm at fever pitch in 1760, London’s Common Council decided it was time for a radical solution. On August 26th, the Council issued an order that a bounty of 2 shillings should be paid to public-spirited Londoners for any stray dog killed. As a consequence, boys, apprentices, and nefarious youths started going about the city carrying clubs and cudgels, with the intention of butchering numberless dogs. Their bloodied and battered carcasses were tossed in ditches in Moorfields. The Gentleman’s Magazine reported that “No less than the bodies of thirty dead dogs were told in one day…by a person of undoubted veracity, who was only casually passing by that way.” [4]

Rabies: slaying a mad dog.

The Council’s actions were condemned by many animal lovers as licensed cruelty. The Gentleman’s Magazine added its concern: “not one in a thousand [of these dead dogs] will be mad…Those who make it a revenue to kill the dogs will carefully avoid meddling with any that have bad symptoms, from the dread of the consequences.” [5] One famous name was dragged into the debate about the rights and wrongs of the dog cull. Renowned pug-owner William Hogarth, doyen of satirical engravings, had his own art form turned against him by the caricaturists, who showed him in a print entitled The Dog killers of London & Westminster or licenc’d Cruelty 1760. Surrounded by men beating strays to death in the street, the distraught Hogarth laments, “Oh! My poor Pugg. Oh! My little Dog.”

The rabies outbreak lasted three years. Eventually, the infected dogs died out or were killed, and calm was restored to the city. It would take another 125 years before Louis Pasteur created a vaccine for the deadly virus and tested it on nine-year-old Joseph Meister. But that, dear reader, is a subject for another blog post.

1. I am hugely indebted to Adrian Teal and his fantastic book, The Gin Lane Gazette, for the 18th-century stories cited in this article. Also, see The London Chronicle (August 1760) for similar tales.
2. Both Aristotle and Pliny were quoted in Arthur A. King, Historical Perspective of Rabies in Europe and the Mediterranean Basin (Paris, 2004). This book can be found online. It has a lot of detailed information about historical perceptions of rabies. Another great (digestible) article on the subject is by Lisa Smith, “The Problem of Mad Dogs in the Eighteenth Century,” The Sloane Letters Blog (27 January 2014).
3. Mike Rendell, “Rabies in the Eighteenth Century,” Georgian Gentleman (5 July 2012); and G. Fleming, Rabies and Hydrophobia. Their history, nature, causes, symptoms and prevention (London: Chapman and Hall, 1872), 405.
4. The Gentleman’s Magazine (August 1760). A compilation of this magazine from this period can be found here.
5. Ibid.