Under The Knife – Reboot!

It’s been 18 months since I’ve filmed an episode of my YouTube series, Under The Knife. But that ends today! Check out the trailer to the series reboot, which may or may not involve my severed head. A NEW episode is coming next week. If you haven’t subscribed to the channel, please do. You’ll be automatically entered to win macabre little trinkets before the launch of our next video.

My team and I have a lot of fun, quirky things planned for the series in the coming months. Under The Knife combines traditional storytelling techniques with animation, special effects, and artwork to bring the medical past alive. I hope you enjoy watching the new series as much as I enjoy filming it for you.

“We Have Conquered Pain!” The Uses & Abuses of Ether in History

[Image: Ether Dome mural]

The surgical revolution began with an American dentist and a curiously sweet-smelling liquid known as ether.

Ether is said to have been discovered as early as 1275, but it wasn’t synthesized until 1540, when the German botanist and chemist Valerius Cordus created a revolutionary formula that involved adding sulfuric acid to ethyl alcohol. His contemporary Paracelsus experimented with ether on chickens, noting that when the birds drank the liquid, they would undergo prolonged sleep and awake unharmed. He concluded that the substance “quiets all suffering without any harm and relieves all pain, and quenches all fevers, and prevents complications in all disease.” [1] Yet inexplicably, it would be several hundred years before it was tested on humans.

That moment finally arrived in 1842, when Crawford Williamson Long became the first to use ether as a general anesthetic, removing a tumor from a patient’s neck in Jefferson, Georgia. Unfortunately, Long didn’t publish the results of his experiments until 1848. By that time, the Boston dentist William T. G. Morton had won fame by using ether to extract a tooth painlessly from a patient on September 30, 1846 [see Morton’s inhaler for administering ether, right]. An account of this successful procedure was published in a newspaper, prompting a notable surgeon to ask Morton to assist him in an operation to remove a large tumor from a patient’s lower jaw at Massachusetts General Hospital. After the demonstration, someone nicknamed the surgical amphitheater the “Ether Dome,” and it has been known by this name ever since.

It was an incredible breakthrough. Up until that point, surgery had been brutally painful. The patient, fully awake, would be restrained while the surgeon cut through skin, tissue, muscle, and bone. Surgeons were lauded for their brute strength and quick hands. A capable surgeon could remove a leg in under a minute. But with the discovery of ether, the need for speed in the operating theater had now vanished.

On November 18, 1846, Dr. Henry Jacob Bigelow wrote about this groundbreaking moment in The Boston Medical and Surgical Journal. He described how Morton had administered what he called “Letheon” to the patient before the operation commenced. The gas was named after the River Lethe of classical mythology, whose waters made the souls of the dead forget their lives on earth. Morton, who had patented the composition of the gas shortly after the operation, kept its ingredients secret, even from the surgeons. Bigelow revealed, however, that he could detect the sickly sweet smell of ether in it. News of the miraculous substance that could render patients unconscious during surgery spread quickly around the world as surgeons rushed to test its effects on their own patients.

The term “etherization” was coined, and the use of ether in surgery was celebrated in newspapers. “The history of Medicine has presented no parallel to the perfect success that has attended the use of ether,” a writer at the Exeter Flying Post proclaimed. [2] Another journalist declared: “Oh, what delight for every feeling heart… the announcement of this noble discovery of the power to still the sense of pain, and veil the eye and memory from all the horrors of an operation…WE HAVE CONQUERED PAIN!” [3]

A curious by-product of all this was the ether parties that sprang up all over the world. Thomas Lint, a medical student at St. Bartholomew’s Hospital in London, confessed: “We sit round a table and suck [on an inhaling apparatus], like many nabobs with their hookahs. It’s glorious, as you will see from this analysis of a quarter of an hour’s jolly good suck.” [4] He then went on to describe several “ethereal” experiences he and his fellow classmates had while under the influence of the newly discovered substance.

Ether wasn’t just inhaled. It was also drunk, like alcohol. In Ireland, the substance replaced whiskey for a while, due to its low cost (a penny a draught). After drinking a glass of water, “ethermaniacs” would take a drop of the drug on their tongues while pinching their noses and chasing it with another glass of water. Taken this way, ether hit the user hard and fast. Dr. Ernest Hart wrote that “the immediate effects of drinking ether are similar to those produced by alcohol, but everything takes place more rapidly.” [5] Recovery was just as swift. Those taken into custody for drunken disorderliness were often completely sober by the time they reached the police station, with the bonus that they also suffered no hangover. In this way, 19th-century revelers could take draughts of ether several times a day, with little consequence. [6]

Today, the “Ether Dome” at Massachusetts General Hospital has become a national historic landmark [pictured below], visited by thousands of members of the public each year. Although surgeons haven’t operated there for well over a hundred years, the room is still used for meetings and lectures at the hospital. The Ether Dome looks more or less like it did 165 years ago. Display cases at either end of the room contain surgical instruments from Morton’s day, their blades dull and rusted with age. At the front of the room an Egyptian mummy lords over the phantom audience. One can almost detect the sweet smell of ether in the air from so long ago.

[Image: The Ether Dome at Massachusetts General Hospital, Boston]

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. Quoted in Steve Parker, Kill or Cure: An Illustrated History of Medicine (London: DK, 2013), 174.
2. “Etherization in Surgery,” Exeter Flying Post, 24 June, 1847, 4.
3. London People’s Journal, 9 January, 1847.
4. Punch, or The London Charivari (December 1847), 259.
5. Quoted in David J. Linden, Pleasure: How Our Brains Make Junk Food, Exercise, Marijuana, Generosity & Gambling Feel So Good (Viking, 2011), 31.
6. Sterling Haynes, “Ethermaniacs,” BC Medical Journal, Vol. 56, No. 5 (June 2014), 254–255.

The Medicalization of Death in History


When the Black Death swept through Europe in the 14th century, it claimed the lives of over 75 million people, many of whom were clergymen whose job it was to help usher the dying into the next world. In response to the shortage of priests, the Ars Moriendi (Art of Dying) first emerged in 1415. This was a manual that provided practical guidance to the dying and to those who attended them in their final moments. It included prayers and prescribed rites to be performed at the deathbed, as well as illustrations and descriptions of the temptations one must overcome in order to achieve a “good death.”

From the medieval period onwards, the dying were expected to follow a set of “rules” in the final moments of their lives, which included repenting of their sins, forgiving their enemies, and accepting their fate stoically and without complaint. It was each person’s duty to die a righteous death.

In earlier periods, many people believed that pain was a necessary component of a good death. Indeed, the word patient comes from the Latin patiens, meaning “long-suffering” or “one who suffers.” Evangelical Christians, in particular, feared losing lucidity as death approached, as this would prevent the dying person from begging forgiveness for past sins and putting his or her worldly affairs in order before departing this life. Death was a public event, with those closest to the dying in attendance. Friends and relatives were not merely passive observers; they often assisted the dying person in his or her final hours, offering up prayers as the moment drew closer. The deathbed presented the dying with the final opportunity for eternal salvation.

For this reason, the physician rarely appeared at the bedside of a dying person because pain management wasn’t required. Moreover, the general consensus was that it was inappropriate for a person to profit from another’s death. Caricatures depicting the greedy physician running off with bags of money after his patient had succumbed to his fate were not uncommon in the 18th and early 19th centuries.

Over time, however, religious sentiments faded, and physicians began to appear more regularly in the homes of the dying. Part of this was driven by the fact that doctors were becoming more effective at pain management. At the start of the 19th century, they typically administered laudanum drops orally to patients, a process that was imprecise and sometimes not effective at all. All this changed in 1853, when Charles Pravaz and Alexander Wood developed a medical hypodermic syringe with a needle fine enough to pierce the skin (see example below from c. 1880). From that point onwards, physicians began administering morphine by injection, which was far more effective (though also more dangerous). As new techniques emerged, attitudes towards pain management in treating the dying began to change. Soon, a painless death was seen not only as acceptable, but as vital to achieving a “good death.”

[Image: hypodermic syringe, c. 1880]

The doctor’s place at the bedside of the dying became increasingly commonplace. Whereas the deathbed had formerly been governed by religious tradition, it now became the purview of the medical community. By the end of the 19th century, the physician’s emphasis had shifted from cure to care, making his place in the “death chamber” more acceptable. Now, the good physician was one who stood by his patient even when nothing could be done to save his or her life. As the legal scholar Shai Lavi succinctly put it: “The ethics of the deathbed shifted from religion to medicine, and dying further emerged as a matter of regulating life: life was now understood in its biological, rather than biographical, sense.”

The medicalization of death had arrived, and for better or worse it continues to shape how we die today.

If you enjoy my blog, please consider supporting my content by clicking HERE.

Suggested Reading:

1. Shai Lavi, “Euthanasia and the Changing Ethics of the Deathbed: A Study in Historical Jurisprudence,” Theoretical Inquiries in Law 4.2 (2003): 729–761.

The Cutter’s Art: A Brief History of Bloodletting

 

[Image: An ill man who is being bled by his doctor. Coloured etching]

When King Charles II suffered a sudden seizure on the morning of 2 February 1685, his personal physician had just the remedy. He quickly slashed open a vein in the king’s left arm and filled a basin with the royal blood. Over the next few days, the king was tortured by a swarm of physicians buzzing around his bedside. They gave enemas and urged him to drink various potions, including boiled spirits from a human skull. The monarch was bled a second time before he lapsed into a coma. He never awoke.

Even without his doctors’ ministrations, the king may well have succumbed to whatever ailed him, yet his final days were certainly not made any easier by the relentless bloodletting and purging. By the time of Charles II’s death, however, bloodletting was standard medical practice.

Bloodletting dates back to the Roman physician Galen, who lived in the 2nd century AD. Galen taught that blood was the product of food: after reaching the stomach, food was liquefied and then sent to the liver, where it was turned into blood. Occasionally, the body produced an excess of blood, which, according to Galenic practitioners, caused fevers, headaches, and even seizures. The only recourse was to rid the body of this superfluous fluid.

As vital as bloodletting was felt to be, many physicians believed the “cutter’s art” was beneath their station. Instead, they referred those in need of bleeding to barber-surgeons, who carried out this duty in addition to a diverse range of other personal services.

The traditional striped barber’s pole harks back to that era, when it served as an advertisement for the barber-surgeon’s proficiency as a bloodletter. The pole represents the rod the patient gripped to make the veins bulge, and the brass ball at the top symbolizes the basin used to collect the blood. The red and white stripes represent the bloodied bandages, which, once washed and hung to dry on the rod outside the shop, would twist in the wind, forming the familiar spiral pattern that adorns modern poles.

While bloodletting seems barbaric to modern eyes, it was considered a standard part of medical treatment, demanded by many people when they felt ill in much the same way we might ask for antibiotics at the doctor’s office today. Take George Washington (below), who woke on the morning of 14 December 1799 complaining that he couldn’t breathe. Fearing his doctor would not arrive in time, Washington asked the overseer of his slaves to step in and bleed him. The cut was deep, and Washington lost nearly half a pint of blood before the wound was closed. Eventually, his physicians arrived and proceeded to bleed him four more times over the next eight hours. By evening, America’s first president was dead. One of his physicians, James Craik, later admitted that he thought the blood loss was partly responsible.

[Image: George Washington]

Bloodletting reached its apogee in the early 19th century. By then, people were not bled only when they were ill; bloodletting was also used preventatively, typically in the spring, which was seen as a time of rebirth and rejuvenation. During this period, leeching was the preferred method. A leech can suck several times its own body weight in blood, and the practice was far safer than cutting open a vein. Leeching became so popular that it sparked a “leech craze.” Throughout England, leech collectors (mostly women) would wade bare-legged into leech-infested ponds in order to attract the slimy bloodsuckers. Once the leeches had had their fill, they would fall off, leaving the collector to sell them to medical practitioners for a profit.

[Image: Pewter box for transporting leeches, Europe, 1801–1900]

Unsurprisingly, leech collectors commonly suffered from headaches as a result of blood loss, and they sometimes contracted diseases through contact with the leeches.

But why did bloodletting remain so popular for so long? Despite advances in anatomy and diagnostics during the 18th and 19th centuries, therapeutics did not evolve quickly enough to match new understandings of the body. Many practitioners believed it was better to do something than to do nothing.

In the cases of Charles II and George Washington, they were most definitely wrong.

If you enjoy my blog, please consider supporting my content by clicking HERE.

 

 

Mangling the Dead: Dissection, Past & Present


I never feel more alive than when I am standing among the rows and rows of anatomical specimens in medical museums around London. In one jar floats the remains of an ulcerated stomach; in another, the hands of a suicide victim. Cabinets are filled with syphilitic skulls, arthritic joints, and cancerous bones. The unborn sit alongside the aged; murderers occupy the same space as the murdered.

As a medical historian, I have a professional interest in these collections as part of my ongoing research into the early history of surgery. Occasionally, however, I catch a glimpse of veins and arteries dangling from a severed wrist—or the bloated face of a child who died long ago—and I reflect on the actual surgeons and anatomists who cut up these dead bodies. How did they overcome the emotional and physical realities of dissection? And how do contemporary experiences in the dissection room compare with those of the past?

A few months ago, I sat down with my mother, a registered nurse, and we talked about her first dissection. She spoke with intense clarity as if it had happened only yesterday: “She was a woman in her 30s, younger than I was then, who had died from toxic shock syndrome. I felt sorry for her.” My mother paused as the memories flooded over her. “I wanted to cover her body with a blanket, not because she was naked…I don’t know. I just thought she’d be more comfortable with a blanket over her before we began poking, prodding, and pulling her to pieces.”

The idea that the anatomisation of the body is tantamount to “hacking,” “distorting,” or “disfiguring” what was once a living human being has troubled medical students for centuries. In 1663, the Danish physician Thomas Bartholin wrote that one must partake in “mangling the dead so that he may know the living.” Nearly a century later, the Master of Anatomy to the Incorporation of Surgeons of London remarked to those attending the public dissection of the criminal Richard Lamb: “I think few who now look upon that miserable mangled object before us, can ever forget it.” Then, as now, confronting a cadaver could be a distressing event. But the unsanitary conditions of the past often made the experience even more traumatic.


Unlike the sterile laboratories of today, the “dead house” of previous centuries was a very different place. One medical student from the 19th century described the “swarms of sparrows fighting for scraps” and the “rats in the corner gnawing bleeding vertebrae”. The dissection theatre was bloody, smelly, and filled with all kinds of animals and insects trying to feast on the decomposing bodies, some of which had been plucked from the grave after a couple of days in the ground. In letters, diaries, and medical notes from Europe during the Enlightenment, I often come across descriptions of “decaying flesh,” “rancid corpses,” and “putrid stenches”—not the “slightly sweet, clinical smell” that some medical practitioners remember today. In a letter dated 8 October 1793, James Williams—a 16-year-old English surgical student—described his living quarters in John Hunter’s anatomy school in London as “a little perfumed.” The 18th-century German surgeon Lorenz Heister was not as delicate in his descriptions. He recommended that “Students in Surgery should not only be furnished with Strength of Body, but constancy of Mind” so that they remain “unmolested and unmoved by Stench, Blood, Pus and Nastiness that will naturally occur to them” during their practice.

There were plenty of young men who entered their anatomy lessons only to discover they did not have the “constancy of Mind” required to endure the realities of dissection. The composer Hector Berlioz, who attended medical school in Paris in 1821, “leapt out of the window, and fled as though Death and all his hideous crew were at my heels” the first time he entered what he described as a “human charnel-house.” Berlioz claimed that it was “twenty-four hours before I recovered from the shock” and he was able to return to the dissection theatre. Thomas Platter the Younger, a 16th-century Swiss physician, was haunted by the memories of his first dissection. During the first week of his lessons, he dreamt he had feasted upon human flesh; when he awoke in the middle of the night, he vomited.

Many, however, did learn to adapt over time. Bit by bit, piece by piece, they began to view the body not as a person but as an object. Some surgeons and physicians were even able to cut open the bodies of relatives. The French anatomist Guillaume Rondelet caused an uproar in 1538 when he publicly dissected the body of his infant son, whilst William Harvey undertook private dissections of both his father and his sister in the 17th century. These men, of course, were exceptional, but their examples illustrate the extent to which one could become detached from the dissected body.


This made me wonder about medical students today. How did their experiences compare with those of earlier periods? To find out, I interviewed several doctors about their earliest memories of the dissection room. From these conversations, I discovered that many medical students are just as apprehensive about their first encounters with a cadaver as their predecessors were. Erica Lilly, a general practitioner in Canada, remembers her unease as the cadaver’s head was unwrapped from its plastic covering during her first anatomy lesson. The face did not look human “as much as it looked like a mask,” she says, her voice laced with emotion. Similarly, Jennifer Kasten, Research Fellow at the Department of Surgery of the University of California, Los Angeles, recalls the “muffled crying” from some of her fellow students as the body bags were unzipped for the very first time. She describes the moment of the first cut as “one of quiet and awed intensity.” For her, it was an “initiation into the mysteries of medicine.” The physical act of cutting open a dead body was only one of the challenges that interviewees mentioned during the course of our conversations. The odour was another obstacle. Thomas Waite, Specialty Registrar in Public Health in Wales, remembers it vividly: “I’ll never forget [the smell]…At the end of the year I threw away the only set of clothes I wore under my dissection coat because no amount of washing could get rid of it.”


The sensory experiences of those working in earlier periods would have differed greatly from those of Waite. To better understand what medical students might once have felt when first confronted with the rotting flesh of unpreserved corpses, I turned to William MacLehose, a medical historian at University College London. Several years ago, he visited the “Body Farm,” the University of Tennessee’s Anthropology Research Facility in Knoxville, TN, USA, where human decomposition is studied. When I ask MacLehose to describe his reaction to what he saw on the Body Farm, he struggles to find words, pointing out that “words will always have some level of distance to them” that cannot fully capture the “raw and horrific” experience he had when he first visited the research facility. He confesses that the “safe, stale, academic references” he had in his mind before his visit were no preparation for the reality he faced: “I remember wishing I hadn’t gone,” he admits. The realities that awaited the young surgical student during the 17th, 18th, and 19th centuries were grim. These were not the bloodless bodies of today—with their preserved limbs and starched linens. Indeed, Kasten tells me that she found the “lack of particular smells” in the dissection room to be “surprising.” Even when slicing open the colon and “squeezing out the long toothpaste-like stream of feces,” she was not met with the familiar “human smells” one might expect.

Today, cadavers are cloaked in anonymity. Yet I was surprised by how frequently questions about a specimen’s former humanity came up during my interviews. Lilly remembers the first time she looked upon the feet of a cadaver. She wondered if those feet “had walked on the beach,” if those “toes had ever had sand between them.” Similarly, Waite often thinks back to an elderly man he dissected during anatomy lessons. Aside from some atherosclerosis, the man’s body belied his age. Waite remembers being struck that a person could reach such a great age and leave so little evidence of disease behind. Twelve years later, he still had questions: had this man “walked with a frame or unaided?” Did he “maintain his independence or was he mentally more frail in life than his physical organs appeared in death?” I believe these questions speak less about the dead than they do about the living. Focusing on the humanity of the corpse sometimes serves as a distraction from one’s own sense of inhumanity as a dissector. It is a small comfort to those faced with the task of cutting open a dead body. “We worried there was something defective about us,” Kasten reflects, “that we were so easily able to go about cutting up a person into his constituent parts in a methodical, emotionless way.” After all, she admits, “our new normal really was very abnormal.”

If you enjoy my blog, please consider supporting my content by clicking HERE.

Works Cited
1. D. Burch, Digging Up the Dead: Uncovering the Life and Times of an Extraordinary Surgeon (Vintage: London, 2008).
2. A. Cunningham, The Anatomist Anatomis’d: An Experimental Discipline in Enlightenment Europe (Ashgate: Aldershot, 2010).
3. L. Payne, With Words and Knives: Learning Medical Dispassion in Early Modern England (Ashgate: Aldershot, 2007).
4. R. Richardson, Death, Dissection and the Destitute, 2nd edn. (University of Chicago Press: Chicago, 2000).

The Strange, the Morbid, the Bizarre – Now on Instagram!

[Image: radioactive chocolate bar, 1931]

After years of resisting, I’m finally on Instagram! Follow me for strange, morbid, and bizarre history facts each day by clicking HERE. The above photo (featured on my account) is a radioactive chocolate bar from 1931. The German company that produced it claimed that it would make those who ate it look younger!

As always, you can also follow me on Twitter and Facebook to get your fill of the weird. Come say hello to me on social media.