Painful Operations: Removing Bladder Stones before Anesthesia


If you visit the Gordon Museum at Guy’s Hospital in London, you’ll see a small bladder stone—no bigger than 3 centimetres across. Besides the fact that it has been sliced open to reveal concentric circles within, it is entirely unremarkable in appearance. Yet, this tiny stone was the source of enormous pain for 53-year-old Stephen Pollard, who agreed to undergo surgery to remove it in 1828.

People frequently suffered from bladder stones in earlier periods due to poor diet, which often consisted of lots of meat and alcohol and very few vegetables. The oldest bladder stone on record was discovered in an Egyptian grave dating from 4,800 B.C. The problem was so common that itinerant healers traveled from village to village offering a vast array of services and potions that promised to cure those suffering from the condition. Depending on their size, these stones could block the flow of urine into the bladder from the kidneys, or prevent the flow of urine out of the bladder through the urethra. Either situation was potentially lethal. In the first instance, the kidney is slowly destroyed by pressure from the urine; in the second, the bladder swells and eventually bursts, leading to infection and finally death.

Like today, bladder stones were unimaginably painful for those who suffered from them in the past. The stones themselves were often enormous. Some measured as large as a tennis ball. The afflicted often acted in desperation, going to great lengths to rid themselves of the agony. In the early 18th century, one man reportedly drove a nail through his penis and then used a blacksmith’s hammer to break the stone apart until the pieces were small enough to pass through his urethra. It’s not a surprise, then, that many sufferers chose to submit to the surgeon’s knife despite a very real risk of dying during or immediately after the procedure from shock or infection. Although the operation itself lasted only a matter of minutes, lithotomic procedures were incredibly painful and dangerous—not to mention humiliating.

The patient—naked from the waist down—was bound in such a way as to ensure an unobstructed view of his genitals and anus [see illustration below]. Once he was secured, the surgeon passed a curved metal tube up the patient’s penis and into the bladder. He then slid a finger into the man’s rectum, feeling for the stone. Once he had located it, his assistant removed the metal tube and replaced it with a wooden staff. This staff acted as a guide so that the surgeon did not fatally rupture the patient’s rectum or intestines as he began cutting deeper into the bladder. With the staff in place, the surgeon cut diagonally through the fibrous muscle of the scrotum until he reached the wooden staff. Next, he used a probe to widen the hole, ripping open the prostate gland in the process. At this point, the wooden staff was removed and the surgeon used forceps to extract the stone from the bladder. [1]

[Illustration: lithotomy scene]

Unfortunately for Stephen Pollard, what should have lasted 5 minutes ended up lasting 55 minutes under the gaze of 200 spectators at Guy’s Hospital in London. The surgeon Bransby Cooper fumbled and panicked, cursing the patient loudly for having “a very deep perineum,” while the patient, in turn, cried: “Oh! let it go; —pray, let it keep in!” The surgeon reportedly used every tool at his disposal before he finally reached into the gaping wound with his bare fingers. During this time, several of the spectators walked out of the operating theater, unable to bear witness to the patient’s agony any longer. Eventually, Cooper located the stone with a pair of forceps. He held it up for his audience, who clapped unenthusiastically at the sight of the stone.

Sadly, Pollard survived the surgery only to die the next day. His autopsy revealed that it was the surgeon’s lack of skill, and not the patient’s alleged “abnormal anatomy,” that had caused his death.

But the story didn’t end there. Word quickly got out about the botched operation. When Thomas Wakley—the editor of The Lancet—heard of this medical disaster, he accused Cooper of incompetence and implied that the surgeon had only been appointed to Guy’s Hospital because he was nephew to one of the senior surgeons on staff. Outraged by the allegation, Cooper sued Wakley for libel, seeking £2,000 in damages. Wakley used the trial to attack what he believed to be rampant nepotism and corruption within the hospitals. The jury reluctantly sided with the surgeon, but awarded him only £100. Wakley had raised more than that through a defence fund and handed the surplus over to Pollard’s widow after the trial. [2]

Bransby Cooper’s reputation, like his patient, never did recover.

If you’re interested in the history of pre-anesthetic and pre-antiseptic surgery, you can pre-order my book The Butchering Art in the US (click here) and in the UK (click here). Information on foreign editions to come!


1. Druin Burch, Digging up the Dead: Uncovering the Life and Times of an Extraordinary Surgeon (2007), p. 26. I am greatly indebted to his work for bringing this story to my attention.
2. Thomas Wakley, A Report of the Trial of Cooper v. Wakley (1829), pp. 4-5.

The Wandering Womb: Female Hysteria through the Ages


The word “hysteria” conjures up an array of images, none of which probably include a nomadic uterus wandering aimlessly around the female body. Yet that is precisely what medical practitioners in the past believed was the cause behind this mysterious disorder. The very word “hysteria” comes from the Greek word hystera, meaning “womb,” and arises from medical misunderstandings of basic female anatomy.

Today, hysteria is regarded as a physical expression of a mental conflict and can affect anyone regardless of age or gender. [1] Centuries ago, however, it was attributed only to women, and believed to be physiological (not psychological) in nature.

For instance, Plato believed that the womb—especially one which was barren—could become vexed and begin wandering throughout the body, blocking respiratory channels and causing bizarre behavior. [2] This belief was ubiquitous in ancient Greece. The physician Aretaeus of Cappadocia went so far as to consider the womb “an animal within an animal,” an organ that “moved of itself hither and thither in the flanks.” [3] The uterus could move upwards, downwards, left or right. It could even collide with the liver or spleen. Depending on its direction, a wandering womb could cause all kinds of hell. One that traveled upwards might cause sluggishness, lack of strength, and vertigo in a patient; while a womb that moved downwards could cause a person to feel as if she were choking. So worrisome was the prospect of a wandering womb during this period, that some women wore amulets to protect themselves against it. [4]

The womb continued to hold a mystical place in medical texts for centuries, and was often used to explain away an array of female complaints. The 17th-century physician William Harvey, famed for his theory of the circulation of the blood, perpetuated the belief that women were slaves to their own biology. He described the uterus as “insatiable, ferocious, animal-like,” and drew parallels between “bitches in heat and hysterical women.” [5] When a woman named Mary Glover accused her neighbor Elizabeth Jackson of cursing her in 1602, the physician Edward Jorden argued that the erratic behavior that drove Mary to make such an accusation was actually caused by noxious vapors in her womb, which he believed were slowly suffocating her. (The courts disagreed, and Elizabeth Jackson was convicted of witchcraft shortly thereafter.)

So what could be done for hysteria in the past?

Physicians prescribed all kinds of treatments for a wayward womb. These included sweet-smelling vaginal suppositories and fumigations used to tempt the uterus back to its rightful place. The Greek physician Aretaeus wrote that the womb “delights…in fragrant smells and advances towards them; and it has an aversion to foetid smells, and flees from them.” Women were also advised to ingest foul substances—sometimes containing ingredients such as human or animal excrement—in order to coax the womb away from the lungs and heart. In some cases, physical force was used to correct the position of a wandering womb (see image, right). For the single woman suffering from hysteria, the cure was simple: marriage, followed by children. Lots and lots of children.

Today, wombs are no longer thought to wander; however, medicine still tends to pathologize the vagaries of the female reproductive system. [6] Over the course of several thousand years, the womb has become less of a way to explain physical ailments, and more of a way to explain psychological dysfunction—often being cited as the reason behind irrationality and mood swings in women. Has the ever-elusive hysteria brought on by roving uteri simply been replaced by the equally intangible yet mysterious PMS? I’ll let you decide.


You can now pre-order my book THE BUTCHERING ART by clicking here. THE BUTCHERING ART follows the story of Joseph Lister as he attempts to revolutionize the brutal world of Victorian surgery through antisepsis. Pre-orders are incredibly helpful to new authors. Info on how to order foreign editions coming soon. Your support is greatly appreciated. 


1. Mark J. Adair, “Plato’s View of the ‘Wandering Uterus,’” The Classical Journal 91:2 (1996), p. 153.
2. G. S. Rousseau, “‘A Strange Pathology:’ Hysteria in the Early Modern World, 1500-1800,” in Hysteria Beyond Freud (1993), p. 104. Originally qtd. in Heather Meek, “Of Wandering Wombs and Wrongs of Women: Evolving Concepts of Hysteria in the Age of Reason,” English Studies in Canada 35:2-3 (June/September 2009), p. 109.
3. Quoted in Matt Simon, “Fantastically Wrong: The Theory of the Wandering Wombs that Drove Women to Madness,” Wired (7 May 2014).
4. Robert K. Ritner, “A Uterine Amulet in the Oriental Institute Collection,” Journal of Near Eastern Studies 45:3 (Jul. 1984), pp. 209-221. For more on the fascinating subject of magical amulets, see Tom Blaen, Medical Jewels, Magical Gems: Precious Stones in Early Modern Britain (2012).
5. Rousseau, “A Strange Pathology,” p. 132.
6. Mary Lefkowitz, “Medical Notes: The Wandering Womb,” The New Yorker (26 February 1996).

Houses of Death: Walking the Wards of a Victorian Hospital

The following blog post relates to my forthcoming book THE BUTCHERING ART, which you can pre-order here.

Today, we think of the hospital as an exemplar of sanitation. However, during the first half of the nineteenth century, hospitals were anything but hygienic. They were breeding grounds for infection and provided only the most primitive facilities for the sick and dying, many of whom were housed on wards with little ventilation or access to clean water. As a result of this squalor, hospitals became known as “Houses of Death.”

[Image: trade card for the ‘Bug Destroyer’ Andrew Cooke, London]

The best that can be said about Victorian hospitals is that they were a slight improvement over their Georgian predecessors. That’s hardly a ringing endorsement when one considers that a hospital’s “Chief Bug-Catcher”—whose job it was to rid the mattresses of lice—was paid more than its surgeons in the eighteenth century. In fact, bed bugs were so common that the “Bug Destroyer” Andrew Cooke [see image, left] claimed to have cleared upwards of 20,000 beds of insects during the course of his career.[1]

In spite of token efforts to make them cleaner, most hospitals remained overcrowded, grimy, and poorly managed. The assistant surgeon at St. Thomas’s Hospital in London was expected to examine over 200 patients in a single day. The sick often languished in filth for long periods before they received medical attention, because most hospitals were disastrously understaffed. In 1825, visitors to St. George’s Hospital discovered mushrooms and wriggling maggots thriving in the damp, soiled sheets of a patient with a compound fracture. The afflicted man, believing this to be the norm, had not complained about the conditions, nor had any of his fellow convalescents thought the squalor especially noteworthy.[2]

Worst of all was the fact that a sickening odor permeated every hospital ward. The air was thick with the stench of piss, shit, and vomit. The smell was so offensive that the staff sometimes walked around with handkerchiefs pressed to their noses. Doctors didn’t exactly smell like rose beds, either. Berkeley Moynihan—one of the first surgeons in England to use rubber gloves—recalled how he and his colleagues used to throw off their own jackets when entering the operating theater and don ancient frocks that were often stiff with dried blood and pus. They had belonged to retired members of staff and were worn as badges of honor by their proud successors, as were many items of surgical clothing.

The operating theaters within these hospitals were just as dirty as the surgeons working in them. In the early decades of the nineteenth century, it was safer to have surgery at home than it was in a hospital, where mortality rates were three to five times higher than they were in domestic settings. Those who went under the knife did so as a last resort, and so were usually mortally ill. Very few surgical patients recovered without incident. Many either died or fought their way back to only partial health. Those unlucky enough to find themselves hospitalized during this period would frequently fall prey to a host of infections, most of which were fatal in a pre-antibiotic era.

In addition to the foul smells, fear permeated the atmosphere of the Victorian hospital. The surgeon John Bell wrote that it was easy to imagine the mental anguish of the hospital patient awaiting surgery. He would hear regularly “the cries of those under operation which he is preparing to undergo,” and see his “fellow-sufferer conveyed to that scene of trial,” only to be “carried back in solemnity and silence to his bed.” Lastly, he was subjected to the dying groans of those who had gone before him as they suffered the final throes of what was almost certainly their end.[3]

As horrible as these hospitals were, it was not easy gaining entry to one. Throughout the nineteenth century, almost all the hospitals in London except the Royal Free controlled inpatient admission through a system of ticketing. One could obtain a ticket from one of the hospital’s “subscribers,” who had paid an annual fee in exchange for the right to recommend patients to the hospital and vote in elections of medical staff. Securing a ticket required tireless soliciting on the part of potential patients, who might spend days waiting and calling on the servants of subscribers and begging their way into the hospital. Some hospitals only admitted patients who brought with them money to cover their almost inevitable burial. Others, like St. Thomas’s, charged double if the person in question was deemed “foul” by the admissions officer.[4]


Before germs and antisepsis were fully understood, remedies for hospital squalor were hard to come by. The obstetrician James Y. Simpson suggested an almost fatalistic approach to the problem. If cross-contamination could not be controlled, he argued, then hospitals should be periodically destroyed and built anew. Another surgeon voiced a similar view. “Once a hospital has become incurably pyemia-stricken, it is impossible to disinfect it by any known hygienic means, as it would be to disinfect an old cheese of the maggots which have been generated in it,” he wrote. There was only one solution: the wholesale “demolition of the infected fabric.”[5]

It wasn’t until a young surgeon named Joseph Lister developed the concept of antisepsis in the 1860s that hospitals became places of healing rather than places of death.

To read more about 19th-century hospitals and Joseph Lister’s antiseptic revolution, pre-order my book THE BUTCHERING ART by clicking here. Pre-orders are incredibly helpful to new authors. Info on how to order foreign editions coming soon. Your support is greatly appreciated.


1. Adrian Teal, The Gin Lane Gazette (London: Unbound, 2014).
2. F. B. Smith, The People’s Health 1830-1910 (London: Croom Helm, 1979), 262.
3. John Bell, The Principles of Surgery, Vol. III (1808), 293.
4. Elisabeth Bennion, Antique Medical Instruments (Berkeley: University of California Press, 1979), 13.
5. John Eric Erichsen, On Hospitalism and the Causes of Death after Operations (London: Longmans, Green, and Co., 1874), 98.

“We Have Conquered Pain!” The Uses & Abuses of Ether in History

[Image: Ether Dome mural]

The surgical revolution began with an American dentist and a curiously sweet-smelling liquid known as ether.

Ether is said to have been discovered as early as 1275, but it was not synthesized until 1540, when the German botanist and chemist Valerius Cordus created a revolutionary formula that involved adding sulfuric acid to ethyl alcohol. His contemporary Paracelsus experimented with ether on chickens, noting that when the birds drank the liquid, they would undergo prolonged sleep and awake unharmed. He concluded that the substance “quiets all suffering without any harm and relieves all pain, and quenches all fevers, and prevents complications in all disease.” [1] Yet, inexplicably, it would be several hundred years before it was tested on humans.

That moment finally arrived in 1842, when Crawford Williamson Long became the first to use ether as a general anesthetic, removing a tumor from a patient’s neck in Jefferson, Georgia. Unfortunately, Long didn’t publish the results of his experiments until 1848. By that time, Boston dentist William T. G. Morton had won fame by using ether to extract a tooth painlessly from a patient on September 30, 1846 [see Morton’s inhaler for administering ether, right]. An account of this successful procedure was published in a newspaper, prompting a notable surgeon to ask Morton to assist him in an operation to remove a large tumor from a patient’s lower jaw at Massachusetts General Hospital. After the demonstration, someone nicknamed the surgical amphitheater the “Ether Dome,” and it has been known by this name ever since.

It was an incredible breakthrough. Up until that point, surgery had been brutally painful. The patient, fully awake, would be restrained while the surgeon cut through skin, tissue, muscle, and bone. Surgeons were lauded for their brute strength and quick hands. A capable surgeon could remove a leg in under a minute. But with the discovery of ether, the need for speed in the operating theater had now vanished.

On November 18, 1846, Dr. Henry Jacob Bigelow wrote about this groundbreaking moment in The Boston Medical and Surgical Journal. He described how Morton had administered what he called “Letheon” to the patient before the operation commenced. The gas was named after the River Lethe of classical mythology, whose waters made the souls of the dead forget their lives on earth. Morton, who had patented the composition of the gas shortly after the operation, kept its ingredients secret, even from the surgeons. Bigelow revealed, however, that he could detect the sickly sweet smell of ether in it. News about the miraculous substance which could render patients unconscious during surgery spread quickly around the world as surgeons rushed to test the effects of ether on their own patients.

The term “etherization” was coined, and the use of ether in surgery was celebrated in newspapers. “The history of Medicine has presented no parallel to the perfect success that has attended the use of ether,” a writer at the Exeter Flying Post proclaimed. [2] Another journalist declared: “Oh, what delight for every feeling heart… the announcement of this noble discovery of the power to still the sense of pain, and veil the eye and memory from all the horrors of an operation…WE HAVE CONQUERED PAIN!” [3]

A curious by-product of all this was the ether parties that sprang up all over the world. Thomas Lint, a medical student at St. Bartholomew’s Hospital in London, confessed: “We sit round a table and suck [on an inhaling apparatus], like many nabobs with their hookahs. It’s glorious, as you will see from this analysis of a quarter of an hour’s jolly good suck.” [4] He then went on to describe several “ethereal” experiences he and his fellow classmates had while under the influence of the newly discovered substance.

Ether wasn’t just inhaled. It was also drunk, like alcohol. In Ireland, the substance replaced whiskey for a while, due to its low cost (a penny a draught). After drinking a glass of water, “ethermaniacs” would take a drop of the drug on their tongues while pinching their noses and chasing it with another glass of water. Taken this way, ether hit the user hard and fast. Dr. Ernest Hart wrote that “the immediate effects of drinking ether are similar to those produced by alcohol, but everything takes place more rapidly.” [5] Recovery was just as swift. Those taken into custody for drunken disorderliness were often completely sober by the time they reached the police station, with the bonus that they also suffered no hangover. In this way, 19th-century revelers could take draughts of ether several times a day, with little consequence. [6]

Today, the “Ether Dome” at Massachusetts General Hospital has become a national historic landmark [pictured below], visited by thousands of members of the public each year. Although surgeons haven’t operated there for well over a hundred years, the room is still used for meetings and lectures at the hospital. The Ether Dome looks more or less like it did 165 years ago. Display cases at either end of the room contain surgical instruments from Morton’s day, their blades dull and rusted with age. At the front of the room an Egyptian mummy lords over the phantom audience. One can almost detect the sweet smell of ether in the air from so long ago.

[Image: the Ether Dome at Massachusetts General Hospital, Boston]

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. Quoted in Steve Parker, Kill or Cure: An Illustrated History of Medicine (London: DK, 2013), 174.
2. “Etherization in Surgery,” Exeter Flying Post, 24 June, 1847, 4.
3. London People’s Journal, 9 January, 1847.
4. Punch, or The London Charivari (December 1847), 259.
5. Quoted in David J. Linden, Pleasure: How Our Brains Make Junk Food, Exercise, Marijuana, Generosity & Gambling Feel So Good (Viking, 2011), 31.
6. Sterling Haynes, “Ethermaniacs,” BC Medical Journal (June 2014), Vol. 56 (No.5), 254-3.

Ten Terrifying Knives from Medical History

I’m excited to announce that I’ve just finished filming the first episode of my new YouTube series, Under The Knife, and will be releasing it very soon (please subscribe to my channel for video updates). Unsurprisingly, that got me thinking about, well, knives. Here’s a list of some rather terrifying knives from our medical past.

  1. VALENTIN KNIFE, 1838. This knife was one of the few able to cut slices of organs and soft tissues for microscopic examination. The double-bladed knife worked best when the blades were wet – best of all when submerged in water. It is named after its inventor, Gabriel Valentin (1810-1883), the German-Swiss physiologist who devised it in 1838. This example, however, dates from 1890.

  2. BISTOURY CACHÉ, c.1850. Invented in the mid-19th century, bistoury caché literally translates from the French as ‘hidden knife’. The device was used to cut internal organs or to open cavities, particularly during the surgical removal of a bladder or kidney stone – a practice known as lithotomy.
  3. CIRCUMCISION KNIFE, c.1775. Circumcision – the removal of the foreskin of the penis – is practised across the world, often for cultural and religious reasons. In some countries it is also promoted for reasons of hygiene and health. This knife dates from the late 18th century.
  4. CATARACT KNIFE & NEEDLE, 1805. Georg Joseph Beer (1763-1821), an Austrian professor of ophthalmology, invented this cataract knife and needle around 1805. Cataracts cause blurred vision as the lens becomes cloudy and, if left untreated, can lead to blindness. These instruments allowed for the surgical removal of some of the cloudy mass and, if necessary, part or all of the lens itself. Prior to effective anaesthetics, this was an excruciatingly painful process. This particular example dates from 1820.
  5. ORTHOPEDIC KNIFE, 1855. William Adams (1820-1900), an English surgeon, invented this type of knife in 1855 for his new procedure, periosteotomy. This involved un-fusing the bones of the hip joint by cutting the neck of the femur (upper leg bone). He affectionately called it ‘my little thaw’, because the knife was used to cut through and ‘melt’ fused bones.
  6. LISTON KNIFE, c.1830. Robert Liston (1797-1847), a Scottish surgeon renowned for his speed and precision in surgery, invented this double-edged amputation knife in the 1830s. This particular example is made of steel with a nickel-plated handle. Nickel plating was introduced in the 1890s and meant that the knife could be boiled without it rusting and was therefore ideal for aseptic surgery. It was made by Down Bros, a leading surgical instrument maker, in the 1920s.
  7. SYRIAN SURGICAL KNIFE, c.900 AD. Most of the blade of this ancient surgical knife is rusty and part of it is broken. The steel blade is slotted into a brass handle. The loop at the end may have been used as a finger hole for gripping. This knife dates to a period when the Islamic world became a major centre for medical study and practice.
  8. PLAGUE LANCET, c.1600. Plague epidemics ravaged Marseilles in France throughout the 17th and 18th centuries. Lancets, such as the copy shown here, were used to open buboes in order to relieve pressure and also remove poisons from the body – an unsuccessful attempt to cure the patient. The lancet would have been stored in a brass case.
  9. DOUBLE BLADED LITHOTOME, 1812. This object was used to cut the bladder in order to remove stones – a practice known as lithotomy. Baron Guillaume Dupuytren (1777-1835), a French surgeon and pathologist, invented this double bladed lithotome for the bi-lateral lithotomy procedure he developed in 1812. This procedure became widely used from the 1850s onwards, and this example dates from 1825.
  10. FALCIFORM AMPUTATION KNIFE, c.1700. The curved shape of this amputation knife was common in the early 1700s. Amputation knives became straighter once the practice of leaving a flap of skin to cover the limb stump became the preferred amputation method. Ebony was a common material for handles as it is a hard-wearing wood. This knife was probably made by Eberle in Germany, as indicated by the inscription on the silver blade.


Disturbing Disorders: Sirenomelia (Mermaid Syndrome)

The sea king down there had been a widower for years, and his old mother kept house for him…she was an altogether praiseworthy person, particularly so because she was extremely fond of her granddaughters, the little sea princesses. They were six lovely girls, but the youngest was the most beautiful of them all. Her skin was as soft and tender as a rose petal, and her eyes were as blue as the deep sea, but like all the others she had no feet. Her body ended in a fish tail.

Hans Christian Andersen, The Little Mermaid, 1837.

Mermaids have teased our imagination for thousands of years. One of the earliest tales originated in ancient Assyria, where the goddess Atargatis transformed herself into a mermaid out of shame for accidentally killing her human lover. Homer called them sirens in the Odyssey, and described them as beautiful singing creatures who lure sailors to their deaths. Throughout history, these seductive beings have been associated with floods, storms, shipwrecks and drownings. They have been depicted in countless media: in Etruscan sculptures, in Greek jewelry, and in bas-relief on ancient Roman tombs. Christopher Columbus even reported seeing these mythical creatures on his voyage to the Caribbean in 1493.

But could our concept of what a mermaid looks like actually have originated from a real medical disorder?

Sirenomelia is a lethal condition characterised by rotation and fusion of the legs, resulting in what often looks like a fish tail (left). It occurs when the umbilical cord fails to form two arteries, thus preventing a sufficient blood supply from reaching the fetus. As a result, the single artery steals the blood and nutrition from the lower body and diverts it back up to the placenta. Due to malnutrition, the fetus fails to develop two separate limbs.

Sirenomelia, also known as ‘Mermaid Syndrome’, is extremely rare. It affects 1 in 100,000 babies and is 100 times more likely to occur in identical twins. Usually, those born with this condition die within days.

Over the course of my research, I’ve found very little about the disorder’s history. There are snippets here and there which claim that fetuses born with sirenomelia were sometimes preserved in jars and put on display in ‘freak shows’ during the 19th century—but these sources are frustratingly vague. There is a brief mention of the condition in a four-volume atlas published in 1891 titled Human Monstrosities, but nothing that hints at how medical practitioners understood sirenomelia in earlier periods.

Perhaps because the disorder is so rare, it’s also been hard for me to locate specimens in anatomical collections. My search in the Hunterian Museum at the Royal College of Surgeons in London came up empty. I did, however, find an early 20th-century example at the National Museum of Health & Medicine in Washington, D.C. There are also three fetuses in the Anatomical Museum of the Second University of Naples, which have undergone 3D bone reconstructions (two pictured below).

[Images: two of the sirenomelia specimens with 3D bone reconstructions, Anatomical Museum of the Second University of Naples]

By far the largest number of fetuses comes from the Vrolik Museum in Amsterdam, which consists of more than 5,000 specimens of human and animal anatomy, embryology, pathology and congenital anomalies. The collection was founded by Gerardus Vrolik (1775 – 1859) and his son, Willem Vrolik (1801 – 1863), both of whom wrote extensively on anatomical deformities in the 18th and 19th centuries. The Vrolik Museum has both wet preparations and skeletal remains, all of which are on display to the public today.

Unlike the first disorder I examined in this series—Harlequin Ichthyosis—sirenomelia is almost always fatal. There are no accounts of anyone with this condition surviving in the past; most died within days of being born due to kidney and bladder failure. Even today, the odds are against those with sirenomelia, though there are a handful of examples of children living past infancy.

In 1988, Tiffany Yorks underwent surgery to separate her legs before her first birthday. She continues to suffer from mobility issues due to her fragile leg bones, and compensates by using crutches or a wheelchair to move around. At the age of 26, she is the longest-surviving sirenomelia patient to date.

If you enjoy reading my articles, please consider becoming a patron of The Chirurgeon’s Apprentice. Support my content by clicking HERE.

The Battle of the Tooth Worm

I come across a lot of strange objects in my research: books bound in human skin, prosthetic noses made of silver, iron coffins with safety devices to prevent premature burial. But perhaps one of the strangest objects I’ve seen is the one pictured on the left.

This is a depiction of the infamous tooth worm believed by many people in the past to bore holes in human teeth and cause toothaches.  But before I tell you about this fascinating piece of art, let me give you a quick lesson in dental folklore.

Tooth worms have a long history, first appearing in a Sumerian text around 5,000 BC. References to tooth worms can be found in China, Egypt and India long before the belief finally took root (pun intended) in Western Europe in the 8th century. [1]

Treatment of tooth worms varied depending on the severity of the patient’s pain. Often, practitioners would try to ‘smoke’ the worm out by heating a mixture of beeswax and henbane seed on a piece of iron and directing the fumes into the cavity with a funnel. Afterwards, the hole was filled with powdered henbane seed and gum mastic. This may have provided temporary relief, given that henbane is a mild narcotic. Many times, though, the achy tooth had to be removed altogether. Some tooth-pullers mistook nerves for tooth worms, and extracted both the tooth and the nerve in what was certainly an extremely painful procedure in a period before anaesthetics. [2]

The tooth worm came under attack in the 18th century when Pierre Fauchard—known today as the father of modern dentistry—posited that tooth decay was linked to sugar consumption and not little creatures burrowing inside the tooth. In the 1890s, W.D. Miller took this idea a step further, and discovered through a series of experiments that bacteria living inside the mouth produced acids that dissolved tooth enamel when in the presence of fermentable carbohydrates.

Despite these discoveries, many people continued to believe in the existence of tooth worms even into the 20th century.

The piece of art at the top of the article is titled ‘The Tooth Worm as Hell’s Demon.’ It was created in the 18th century by an unknown artist, and is carved from ivory. It is an incredibly intricate piece when you consider that it stands only a little over 4 inches tall. The two halves open up to reveal a scene of the infernal torments of toothache, depicted as a battle with the tooth worm, complete with mini skulls, hellfire, and naked humans wielding clubs.


It is, without a doubt, one of the strangest objects I’ve come across in my research; and today, I pass this random bit of trivia on to you in the hopes that you may use it someday to revive a dying conversation at a cocktail party.

1. W. E. Gerabek, ‘The Tooth-Worm: Historical Aspects of a Popular Belief,’ Clinical Oral Investigations (April 1999): pp. 1-6.
2. Leo Kanner, Folklore of the Teeth (1928).