Under the Knife, Episode 10 – Al Capone’s Grave

In Episode 10 of Under the Knife, I hit the road to visit the grave of the infamous American gangster, Al Capone. Learn about Capone’s torturous descent into madness caused by advanced-stage syphilis, and about his eventual death and burial, which left his grave exposed to vandals.

If you enjoy the series, please consider becoming a patron of my project by clicking here. And don’t forget to subscribe to my YouTube Channel, and like/comment on the video!

The Surgeon who Operated on Himself


Leonid Ivanovich Rogozov knew he was in trouble when he began experiencing intense pain in the lower right quadrant of his abdomen. He had been feeling unwell for several days, but suddenly his temperature skyrocketed and he was overcome by waves of nausea. The 27-year-old surgeon knew it could only be one thing: appendicitis.

The year was 1961, and under normal circumstances, appendicitis was not life-threatening. But Rogozov was stranded in the middle of Antarctica, surrounded by nothing but thousands of square miles of snow and ice, far from civilization. He was one of thirteen researchers who had just embarked on the sixth Soviet Antarctic Expedition.

And he was the only doctor.

At first, Rogozov resigned himself to his fate. He wrote in his diary:

It seems that I have appendicitis. I am keeping quiet about it, even smiling. Why frighten my friends? Who could be of help? A polar explorer’s only encounter with medicine is likely to have been in a dentist’s chair.

He was right that there was no one who could help. Even if there had been another research station within a reasonable distance, the blizzard raging outside Rogozov’s own encampment would have prevented anyone from reaching him. An evacuation by air was out of the question in those treacherous conditions. As the situation grew worse, the young Soviet surgeon did the only thing he could think of: he prepared to operate on himself.

Rogozov was not the first to attempt a self-appendectomy. In 1921, the American surgeon Evan O’Neill Kane undertook an impromptu experiment after he too was diagnosed with a severe case of appendicitis. He wanted to know whether invasive surgery performed under local anesthetic could be painless. Kane had several patients whose medical conditions prevented them from undergoing general anesthesia. If he could remove his own appendix using just a local anesthetic, he reasoned, he could operate on others without having to administer ether, which he believed was dangerous and overused in surgery.

Lying in the operating theater at the Kane Summit Hospital, the 60-year-old surgeon announced his intentions to his staff. As he was Chief of Surgery, no one dared disagree with him. Kane proceeded by administering novocaine—a local anesthetic that had only recently replaced the far more dangerous drug, cocaine—as well as adrenalin into his abdominal wall. Propping himself up on pillows and using mirrors, he began cutting into his abdomen. At one point, Kane leaned too far forward and part of his intestines popped out. The seasoned surgeon calmly shoved his guts back into their rightful place before continuing with the operation. Within thirty minutes, he had located and removed the swollen appendix. Kane later said that he could have completed the operation more rapidly had it not been for the staff flitting around him nervously, unsure of what they were supposed to do.


Emboldened by his success, Kane decided to repair his own inguinal hernia under local anesthetic eleven years later, this time with the press in attendance. The operation was more dangerous than the appendectomy because of the risk of puncturing the femoral artery. The surgery proved tricky, taking well over an hour, and Kane never fully regained his strength. He eventually came down with pneumonia, and died three months later.

Back in Antarctica, Rogozov enlisted the help of his colleagues, who assisted with mirrors and retractors as the surgeon cut deep into his own abdomen. After forty-five minutes, Rogozov began experiencing weakness and vertigo, and had to take short breaks. Eventually he was able to remove the offending organ and sew up the incision. Miraculously, Rogozov returned to work within two weeks.

The incident captured the imagination of the Soviet public at the time. After he returned from the expedition, Rogozov was awarded the Order of the Red Banner of Labour. The incident also brought about a change in policy: thereafter, extensive health checks became mandatory for personnel before their departure for Antarctica was sanctioned.

If you enjoy my blog, please consider supporting my content by clicking HERE.

Under The Knife, Episode 9 – The Barber’s Pole

At last! A brand new episode of Under The Knife!

In Episode 9, I discuss the history of the barber’s pole, and how it relates to a bloody practice from our medical past. Learn how the barber’s pole got its red & white stripes.


Under The Knife – Reboot!

It’s been 18 months since I’ve filmed an episode of my YouTube series, Under The Knife. But that ends today! Check out the trailer to the series reboot, which may or may not involve my severed head. A NEW episode is coming next week. If you haven’t subscribed to the channel, please do. You’ll be automatically entered to win macabre little trinkets before the launch of our next video.

My team and I have a lot of fun, quirky things planned for the series in the coming months. Under The Knife combines traditional storytelling techniques with animation, special effects, and artwork to bring the medical past alive. I hope you enjoy watching the new series as much as I enjoy filming it for you.

“We Have Conquered Pain!” The Uses & Abuses of Ether in History


The surgical revolution began with an American dentist and a curiously sweet-smelling liquid known as ether.

Officially, ether was discovered in 1275, though it wasn’t synthesized until 1540, when the German botanist and chemist Valerius Cordus created a revolutionary formula that involved adding sulfuric acid to ethyl alcohol. His contemporary Paracelsus experimented with ether on chickens, noting that when the birds drank the liquid, they would fall into a prolonged sleep and awake unharmed. He concluded that the substance “quiets all suffering without any harm and relieves all pain, and quenches all fevers, and prevents complications in all disease.” [1] Yet inexplicably, it would be several hundred years before it was tested on humans.

That moment finally arrived in 1842, when Crawford Williamson Long became the first to use ether as a general anesthetic, removing a tumor from a patient’s neck in Jefferson, Georgia. Unfortunately, Long didn’t publish the results of his experiments until 1848. By that time, the Boston dentist William T. G. Morton had won fame by using ether to extract a tooth painlessly from a patient on September 30, 1846. An account of this successful procedure was published in a newspaper, prompting a notable surgeon to ask Morton to assist him in an operation removing a large tumor from a patient’s lower jaw at Massachusetts General Hospital. After the demonstration, someone nicknamed the surgical amphitheater the “Ether Dome,” and it has been known by that name ever since.

It was an incredible breakthrough. Up until that point, surgery had been brutally painful. The patient, fully awake, would be restrained while the surgeon cut through skin, tissue, muscle, and bone. Surgeons were lauded for their brute strength and quick hands. A capable surgeon could remove a leg in under a minute. But with the discovery of ether, the need for speed in the operating theater had now vanished.

On November 18, 1846, Dr. Henry Jacob Bigelow wrote about this groundbreaking moment in The Boston Medical and Surgical Journal. He described how Morton had administered what he called “Letheon” to the patient before the operation commenced. The gas was named after the River Lethe, whose waters in classical mythology made the souls of the dead forget their lives on earth. Morton, who had patented the composition of the gas shortly after the operation, kept its ingredients secret, even from the surgeons. Bigelow revealed, however, that he could detect the sickly sweet smell of ether in it. News of the miraculous substance that could render patients unconscious during surgery spread quickly around the world as surgeons rushed to test its effects on their own patients.

The term “etherization” was coined, and the use of ether in surgery was celebrated in newspapers. “The history of Medicine has presented no parallel to the perfect success that has attended the use of ether,” a writer at the Exeter Flying Post proclaimed. [2] Another journalist declared: “Oh, what delight for every feeling heart… the announcement of this noble discovery of the power to still the sense of pain, and veil the eye and memory from all the horrors of an operation…WE HAVE CONQUERED PAIN!” [3]

A curious by-product of all this was the ether parties that sprang up all over the world. Thomas Lint, a medical student at St. Bartholomew’s Hospital in London, confessed: “We sit round a table and suck [on an inhaling apparatus], like many nabobs with their hookahs. It’s glorious, as you will see from this analysis of a quarter of an hour’s jolly good suck.” [4] He then went on to describe several “ethereal” experiences he and his fellow classmates had while under the influence of the newly discovered substance.

Ether wasn’t just inhaled. It was also drunk, like alcohol. In Ireland, the substance replaced whiskey for a while, due to its low cost (a penny a draught). After drinking a glass of water, “ethermaniacs” would take a drop of the drug on their tongues while pinching their noses and chasing it with another glass of water. Taken this way, ether hit the user hard and fast. Dr. Ernest Hart wrote that “the immediate effects of drinking ether are similar to those produced by alcohol, but everything takes place more rapidly.” [5] Recovery was just as swift. Those taken into custody for drunken disorderliness were often completely sober by the time they reached the police station, with the bonus that they also suffered no hangover. In this way, 19th-century revelers could take draughts of ether several times a day, with little consequence. [6]

Today, the “Ether Dome” at Massachusetts General Hospital is a national historic landmark, visited by thousands of members of the public each year. Although surgeons haven’t operated there for well over a hundred years, the room is still used for meetings and lectures at the hospital. The Ether Dome looks more or less as it did 165 years ago. Display cases at either end of the room contain surgical instruments from Morton’s day, their blades dull and rusted with age. At the front of the room an Egyptian mummy lords over the phantom audience. One can almost detect the sweet smell of ether in the air from so long ago.



1. Quoted in Steve Parker, Kill or Cure: An Illustrated History of Medicine (London: DK, 2013), 174.
2. “Etherization in Surgery,” Exeter Flying Post, 24 June, 1847, 4.
3. London People’s Journal, 9 January, 1847.
4. Punch, or The London Charivari (December 1847), 259.
5. Quoted in David J. Linden, Pleasure: How Our Brains Make Junk Food, Exercise, Marijuana, Generosity & Gambling Feel So Good (Viking, 2011), 31.
6. Sterling Haynes, “Ethermaniacs,” BC Medical Journal, Vol. 56, No. 5 (June 2014), 254.

The Medicalization of Death in History


When the Black Death swept through Europe in the 14th century, it claimed the lives of over 75 million people, many of whom were clergymen whose job it was to help usher the dying into the next world. In response to the shortage of priests, the Ars Moriendi (Art of Dying) first emerged in 1415. This manual provided practical guidance to the dying and to those who attended them in their final moments, including prayers and prescribed rites to be performed at the deathbed, as well as illustrations and descriptions of the temptations one had to overcome in order to achieve a “good death.”

From the medieval period onwards, the dying were expected to follow a set of “rules” when facing the final moments of their lives, which included repenting of sins, forgiving enemies, and accepting one’s fate stoically, without complaint. It was each person’s duty to die a righteous death.

In earlier periods, many people believed that pain was a necessary component of a good death. Indeed, the word patient comes from the Latin patiens, meaning “long-suffering” or “one who suffers.” Evangelical Christians, in particular, feared losing lucidity as death approached, as this would prevent the person from begging forgiveness for past sins and putting his or her worldly affairs in order before departing this life. Death was a public event, with those closest to the dying in attendance. Friends and relatives were not merely passive observers. They often assisted the dying person in his or her final hours, offering up prayers as the moment drew closer. The deathbed presented the dying with the final opportunity for eternal salvation.

For this reason, the physician rarely appeared at the bedside of a dying person because pain management wasn’t required. Moreover, the general consensus was that it was inappropriate for a person to profit from another’s death. Caricatures depicting the greedy physician running off with bags of money after his patient had succumbed to his fate were not uncommon in the 18th and early 19th centuries.

Over time, however, religious sentiments faded, and physicians began to appear more regularly in the homes of the dying. Part of this was driven by the fact that doctors became more effective at pain management. At the start of the 19th century, they typically administered laudanum drops orally to patients. This process was imprecise, and sometimes not effective at all. All this changed in 1853, when Charles Pravaz and Alexander Wood developed a medical hypodermic syringe with a needle fine enough to pierce the skin. From that point onwards, physicians began injecting morphine, which was far more effective (though also more dangerous). As new techniques emerged, people’s attitudes towards pain management in treating the dying began to change. Soon, a painless death was seen not only as acceptable, but as vital to achieving a “good death.”


The doctor’s place at the bedside of the dying became increasingly commonplace. Whereas the deathbed was formerly governed by religious tradition, it now became the purview of the medical community. By the end of the 19th century, the emphasis for the doctor had shifted from cure to care, making his place in the “death chamber” more acceptable. Now, the good physician was one who stood by his patient, even when nothing could be done to save his or her life. As the legal scholar Shai Lavi succinctly put it: “The ethics of the deathbed shifted from religion to medicine, and dying further emerged as a matter of regulating life: life was now understood in its biological, rather than biographical, sense.”

The medicalization of death had arrived, and for better or worse, it continues to shape how we die today.


Suggested Reading:

  1. Shai Lavi, “Euthanasia and the Changing Ethics of the Deathbed: A Study in Historical Jurisprudence,” Theoretical Inquiries in Law, 4.2 (2003): 729-761.

The Cutter’s Art: A Brief History of Bloodletting


An ill man being bled by his doctor. Coloured etching.

When King Charles II suffered a sudden seizure on the morning of 2 February 1685, his personal physician had just the remedy. He quickly slashed open a vein in the king’s left arm and filled a basin with the royal blood. Over the next few days, the king was tortured by a swarm of physicians buzzing around his bedside. They gave enemas and urged him to drink various potions, including boiled spirits from a human skull. The monarch was bled a second time before he lapsed into a coma. He never awoke.

Even without his doctors’ ministrations, the king may well have succumbed to whatever ailed him, yet his final days were certainly not made any easier by the relentless bloodletting and purging. By the time of Charles II’s death, however, bloodletting was standard medical practice.

Bloodletting dates back to the physician Galen, who practiced in Rome in the 2nd century AD. Galen taught that blood was the product of food. After reaching the stomach, food was liquefied and then sent to the liver, where it was turned into blood. Occasionally, the body produced an excess of blood, which, according to Galenic practitioners, caused fevers, headaches, and even seizures. The only recourse was to rid the body of this superfluous fluid.

As vital as bloodletting was felt to be, many physicians believed the “cutter’s art” was beneath their station. Instead, they referred those in need of bleeding to barber-surgeons, who carried out this duty in addition to a diverse range of other personal services.

The traditional striped barber’s pole harks back to that era, when it served as an advertisement for the barber-surgeon’s proficiency as a bloodletter. The pole represents the rod that the patient gripped to make their veins bulge, and the brass ball at the top symbolizes the basin used to collect the blood. The red and white stripes represent the bloodied bandages. Once washed and hung to dry on the rod outside the shop, they would twist in the wind, forming the familiar spiral pattern adorning modern poles.

While bloodletting seems barbaric to modern eyes, it was considered a standard part of medical treatment, demanded by many people when they felt ill in the same way we might ask for antibiotics when visiting the doctor’s office today. Take George Washington, who woke on the morning of 14 December 1799 complaining that he couldn’t breathe. Fearing his doctor would not arrive in time, Washington asked the overseer of his slaves to step in and bleed him. The cut was deep, and Washington lost nearly half a pint before the wound was closed. Eventually, the physicians arrived and proceeded to bleed Washington four more times in the next eight hours. By evening, America’s first president was dead. One of his physicians, James Craik, later admitted that he thought the blood loss was partly responsible.


Bloodletting reached its apogee in the early 19th century. By then, people were not bled only when they were ill. Bleeding was also used preventatively, typically in the spring, which was seen as a time of rebirth and rejuvenation. During this period, leeching was the preferred method. The leech can suck several times its own body weight in blood, and leeching is far safer than cutting open a vein. It became so popular that it led to a “leech craze.” Throughout England, leech collectors (mostly women) would wade bare-legged into leech-infested ponds in order to attract the slimy bloodsuckers. Once the leeches had had their fill, they would fall off, leaving the collector to sell them to medical practitioners for a profit.

Pewter box for transporting leeches, Europe, 1801-1900.

Unsurprisingly, leech collectors commonly suffered from headaches as a result of blood loss, and sometimes contracted diseases from contact with the leeches.

But why did bloodletting remain so popular for so long? Despite advances in anatomy and diagnostics during the 18th and 19th centuries, therapeutics did not evolve quickly enough to match new understandings of the body. Many practitioners believed it was better to do something than to do nothing.

In the cases of Charles II and George Washington, they were most definitely wrong.
