Pre-Order My Book! The Butchering Art


I’m thrilled to reveal the cover for the US edition of my forthcoming book, THE BUTCHERING ART, which will be published by FSG on October 17th.

The book delves into the grisly world of Victorian surgery and transports the reader to a period when a broken leg could result in amputation, when giving birth in a squalid hospital was extraordinarily dangerous, and when a minor injury could lead to a miserable death. Surgeons—lauded for their brute strength and quick knives—rarely washed their hands or their instruments, and carried with them a cadaverous smell of rotting flesh, which those in the profession cheerfully referred to as “good old hospital stink.” At a time when surgery couldn’t have been more dangerous, an unlikely figure stepped forward: Joseph Lister, a young, melancholic Quaker surgeon. By making the audacious claim that germs were the source of all infection—and could be treated with antiseptics—he changed the history of surgery forever.

Many of you have been devoted readers of my blog since its inception in 2010, and I can’t thank you enough for your continued interest in my work. Writing a book has long felt like the next logical step. The idea of telling this particular story arose during a very difficult period in my life, when my writing career was at risk. It is therefore with great pride (and some trepidation) that I am turning this book loose into the world, and I humbly ask you to consider pre-ordering it. All pre-orders count towards first-week sales once THE BUTCHERING ART is released, giving me a greater chance of securing a place on bestseller lists in October. I would be hugely grateful for your support.

Pre-order from any one of these vendors using the links below:

*Please note that THE BUTCHERING ART will also be published by Penguin in the United Kingdom, as well as by several other publishers around the world. I’ll be revealing covers for these foreign editions in the coming months, along with information on where to buy a copy.

Syphilis: A Little Valentine’s Day Love Story


Photo Credit: The Royal College of Surgeons of England 

We don’t know much about her. We don’t even know her name. What we do know is that the woman who wore the above prosthetic in the mid-19th century was suffering from a severe case of syphilis.

Before penicillin (discovered in 1928, and first used against the disease in the 1940s), syphilis was incurable. Its symptoms were as terrifying as they were unrelenting. Those who suffered from it long enough could expect to develop unsightly skin ulcers, paralysis, gradual blindness, dementia, and “saddle nose,” a grotesque deformity which occurs when the bridge of the nose caves into the face.

This deformity was so common amongst those suffering from the pox (as syphilis was sometimes called) that “no nose clubs” sprang up in London. On 18 February 1874, the Star reported: “Miss Sanborn tells us that an eccentric gentleman, having taken a fancy to see a large party of noseless persons, invited every one thus afflicted, whom he met in the streets, to dine on a certain day at a tavern, where he formed them into a brotherhood.”[1] The man, who assumed the name Mr. Crampton for these clandestine parties, entertained his “noseless” friends every month until he died a year later, at which time the group “unhappily dissolved.”[2]

The 19th century was particularly rife with syphilis. Because of its prevalence, both physicians and surgeons treated victims of the disease. Many treatments involved mercury, giving rise to the saying: “One night with Venus, a lifetime with Mercury.” Mercury could be administered in the form of calomel (mercurous chloride), an ointment, a steam bath, or a pill. Unfortunately, the side effects could be as painful and terrifying as the disease itself. Many patients who underwent mercury treatments suffered extensive tooth loss, ulcerations, and neurological damage; some died outright of mercury poisoning.

For those determined to avoid the pox altogether, condoms made from animal membrane and secured with a silk ribbon were available [below], but these were outlandishly expensive. Moreover, many men shunned them as uncomfortable and cumbersome. In 1717, the surgeon Daniel Turner wrote:

The Condum being the best, if not only Preservative our Libertines have found out at present; and yet by reason of its blunting the Sensation, I have heard some of them acknowledge, that they had often chose to risk a Clap, rather than engage cum Hastis sic clypeatis [with spears thus sheathed].[3]

Each nation blamed the other for the burdensome condom: the French called it “la capote anglaise” (the English cape), while the English called it the “French letter.” Even more unpleasant was the fact that once a man had procured a condom, he was expected to use it repeatedly. Unsurprisingly, syphilis continued to rage despite the growing availability of condoms during the Victorian period.

Which brings me back to the owner of the prosthetic nose. Eventually, she lost her teeth and palate after prolonged exposure to mercury treatments. Her husband—who may have been the source of her suffering—finally died from the disease, leaving her a widow. But it wasn’t all doom and gloom for the poor, unfortunate Mrs X.

According to records at the Royal College of Surgeons in London, the woman found another suitor despite her deformities. After the wedding, she sought out the physician James Merryweather and sold the contraption to him for £3. The reason? Her new husband liked her just the way she was – no nose and all!

And that, kind readers, is a true Valentine’s Day love story…Ignore the part where she almost certainly transmitted the disease to her new lover.


1. “Origin of the No Nose Club,” Star, Issue 1861 (18 February 1874), p. 3.
2. Ibid.
3. Daniel Turner, Syphilis: A Practical Treatise on the Venereal Disease (1717), p. 74.

The Surgeon who Operated on Himself


Leonid Ivanovich Rogozov (pictured above and below right) knew he was in trouble when he began experiencing intense pain in the lower right quadrant of his abdomen. He had been feeling unwell for several days, but suddenly his temperature skyrocketed and he was overcome by waves of nausea. The 27-year-old surgeon knew it could only be one thing: appendicitis.

The year was 1961, and under normal circumstances, appendicitis was not life-threatening. But Rogozov was stuck in the middle of Antarctica, surrounded by nothing but thousands of square miles of snow and ice, far from civilization. He was one of thirteen researchers who had just embarked on the sixth Soviet Antarctic Expedition.

And he was the only doctor.

At first, Rogozov resigned himself to his fate. He wrote in his diary:

It seems that I have appendicitis. I am keeping quiet about it, even smiling. Why frighten my friends? Who could be of help? A polar explorer’s only encounter with medicine is likely to have been in a dentist’s chair.

He was right that there was no one who could help. Even if there had been another research station within a reasonable distance, the blizzard raging outside Rogozov’s own encampment would have prevented anyone from reaching him. An evacuation by air was out of the question in those treacherous conditions. As the situation grew worse, the young Soviet surgeon did the only thing he could think of: he prepared to operate on himself.

Rogozov was not the first to attempt a self-appendectomy. In 1921, the American surgeon Evan O’Neill Kane undertook an impromptu experiment after he too was diagnosed with a severe case of appendicitis. He wanted to know whether invasive surgery could be performed painlessly under local anesthetic alone. Kane had several patients whose medical conditions prevented them from undergoing general anesthesia. If he could remove his own appendix using just a local anesthetic, Kane reasoned, he could operate on such patients without having to administer ether, which he believed was dangerous and overused in surgery.

Lying in the operating theater at the Kane Summit Hospital, the 60-year-old surgeon announced his intentions to his staff. As he was Chief of Surgery, no one dared disagree with him. Kane proceeded by administering novocaine—a local anesthetic that had only recently replaced the far more dangerous drug, cocaine—as well as adrenalin into his abdominal wall. Propping himself up on pillows and using mirrors, he began cutting into his abdomen. At one point, Kane leaned too far forward and part of his intestines popped out. The seasoned surgeon calmly shoved his guts back into their rightful place before continuing with the operation. Within thirty minutes, he had located and removed the swollen appendix. Kane later said that he could have completed the operation more rapidly had it not been for the staff flitting around him nervously, unsure of what they were supposed to do.


Emboldened by his success, Kane decided to repair his own inguinal hernia under local anesthetic eleven years later, this time with the press in attendance. The operation was more dangerous than the appendectomy because of the risk of puncturing the femoral artery, and it proved tricky, taking well over an hour. Kane never fully regained his strength. He eventually came down with pneumonia, and died three months later.

Back in Antarctica, Rogozov enlisted the help of his colleagues, who assisted with mirrors and retractors as the surgeon cut deep into his own abdomen. After forty-five minutes, Rogozov began experiencing weakness and vertigo, and had to take short breaks. Eventually he was able to remove the offending organ and sew up the incision (pictured below, recovering). Miraculously, Rogozov was able to return to work within two weeks.

The incident captured the imagination of the Soviet public at the time. After he returned from the expedition, Rogozov was awarded the Order of the Red Banner of Labour. The episode also brought about a change in policy: thereafter, extensive health checks became mandatory for personnel before their departure for Antarctica was sanctioned.


“We Have Conquered Pain!” The Uses & Abuses of Ether in History


The surgical revolution began with an American dentist and a curiously sweet-smelling liquid known as ether.

Officially, ether had been discovered in 1275, but it was not synthesized until 1540, when the German botanist and chemist Valerius Cordus created a revolutionary formula that involved adding sulfuric acid to ethyl alcohol. His contemporary Paracelsus experimented with ether on chickens, noting that when the birds drank the liquid, they would undergo prolonged sleep and awake unharmed. He concluded that the substance “quiets all suffering without any harm and relieves all pain, and quenches all fevers, and prevents complications in all disease.” [1] Yet inexplicably, it would be several hundred years before it was tested on humans.

That moment finally arrived in 1842, when Crawford Williamson Long became the first to use ether as a general anesthetic, removing a tumor from a patient’s neck in Jefferson, Georgia. Unfortunately, Long didn’t publish the results of his experiments until 1848. By that time, the Boston dentist William T. G. Morton had won fame by using ether while extracting a tooth painlessly from a patient on September 30, 1846 [see Morton’s inhaler for administering ether, right]. An account of this successful procedure was published in a newspaper, prompting a notable surgeon to ask Morton to assist him in an operation removing a large tumor from a patient’s lower jaw at Massachusetts General Hospital. After the demonstration, someone nicknamed the surgical amphitheater the “Ether Dome,” and it has been known by this name ever since.

It was an incredible breakthrough. Up until that point, surgery had been brutally painful. The patient, fully awake, would be restrained while the surgeon cut through skin, tissue, muscle, and bone. Surgeons were lauded for their brute strength and quick hands. A capable surgeon could remove a leg in under a minute. But with the discovery of ether, the need for speed in the operating theater had now vanished.

On November 18, 1846, Dr. Henry Jacob Bigelow wrote about this groundbreaking moment in The Boston Medical and Surgical Journal. He described how Morton had administered what he called “Letheon” to the patient before the operation commenced, a gas named after the River Lethe of classical mythology, whose waters made the souls of the dead forget their lives on earth. Morton, who had patented the composition of the gas shortly after the operation, kept its ingredients secret, even from the surgeons. Bigelow revealed, however, that he could detect the sickly sweet smell of ether in it. News of the miraculous substance which could render patients unconscious during surgery spread quickly around the world as surgeons rushed to test its effects on their own patients.

The term “etherization” was coined, and the use of ether in surgery was celebrated in newspapers. “The history of Medicine has presented no parallel to the perfect success that has attended the use of ether,” a writer at the Exeter Flying Post proclaimed. [2] Another journalist declared: “Oh, what delight for every feeling heart… the announcement of this noble discovery of the power to still the sense of pain, and veil the eye and memory from all the horrors of an operation…WE HAVE CONQUERED PAIN!” [3]

A curious by-product of all this was the ether parties that sprang up all over the world. Thomas Lint, a medical student at St. Bartholomew’s Hospital in London, confessed: “We sit round a table and suck [on an inhaling apparatus], like many nabobs with their hookahs. It’s glorious, as you will see from this analysis of a quarter of an hour’s jolly good suck.” [4] He then went on to describe several “ethereal” experiences he and his fellow classmates had while under the influence of the newly discovered substance.

Ether wasn’t just inhaled. It was also drunk, like alcohol. In Ireland, the substance replaced whiskey for a while, due to its low cost (a penny a draught). After drinking a glass of water, “ethermaniacs” would take a drop of the drug on their tongues while pinching their noses and chasing it with another glass of water. Taken this way, ether hit the user hard and fast. Dr. Ernest Hart wrote that “the immediate effects of drinking ether are similar to those produced by alcohol, but everything takes place more rapidly.” [5] Recovery was just as swift. Those taken into custody for drunken disorderliness were often completely sober by the time they reached the police station, with the bonus that they also suffered no hangover. In this way, 19th-century revelers could take draughts of ether several times a day, with little consequence. [6]

Today, the “Ether Dome” at Massachusetts General Hospital has become a national historic landmark [pictured below], visited by thousands of members of the public each year. Although surgeons haven’t operated there for well over a hundred years, the room is still used for meetings and lectures at the hospital. The Ether Dome looks more or less like it did 165 years ago. Display cases at either end of the room contain surgical instruments from Morton’s day, their blades dull and rusted with age. At the front of the room an Egyptian mummy lords over the phantom audience. One can almost detect the sweet smell of ether in the air from so long ago.



1. Quoted in Steve Parker, Kill or Cure: An Illustrated History of Medicine (London: DK, 2013), 174.
2. “Etherization in Surgery,” Exeter Flying Post, 24 June, 1847, 4.
3. London People’s Journal, 9 January, 1847.
4. Punch, or The London Charivari (December 1847), 259.
5. Quoted in David J. Linden, Pleasure: How Our Brains Make Junk Food, Exercise, Marijuana, Generosity & Gambling Feel So Good (Viking, 2011), 31.
6. Sterling Haynes, “Ethermaniacs,” BC Medical Journal 56.5 (June 2014), 254.

The Medicalization of Death in History


When the Black Death swept through Europe in the 14th century, it claimed the lives of an estimated 75 million people, many of whom were clergymen whose job it was to help usher the dying into the next world. In response to the shortage of priests, the Ars Moriendi (Art of Dying) first emerged in 1415. This manual provided practical guidance to the dying and those who attended them in their final moments, including prayers and prescribed rites to be performed at the deathbed, as well as illustrations and descriptions of the temptations one had to overcome in order to achieve a “good death.”

From the medieval period onwards, the dying were expected to follow a set of “rules” when facing the final moments of their lives: repenting of sins, forgiving enemies, and accepting one’s fate stoically, without complaint. It was each person’s duty to die a righteous death.

In earlier periods, many people believed that pain was a necessary component of a good death. Indeed, the word “patient” comes from the Latin patiens, meaning “long-suffering” or “one who suffers.” Evangelical Christians, in particular, feared losing lucidity as death approached, as this would prevent the person from begging forgiveness for past sins and putting his or her worldly affairs in order before departing this life. Death was a public event, with those closest to the dying in attendance. Friends and relatives were not merely passive observers; they often assisted the dying person in his or her final hours, offering up prayers as the moment drew closer. The deathbed presented the dying with the final opportunity for eternal salvation.

For this reason, the physician rarely appeared at the bedside of a dying person because pain management wasn’t required. Moreover, the general consensus was that it was inappropriate for a person to profit from another’s death. Caricatures depicting the greedy physician running off with bags of money after his patient had succumbed to his fate were not uncommon in the 18th and early 19th centuries.

Over time, however, religious sentiments faded, and physicians began to appear more regularly in the homes of the dying. Part of this was driven by the fact that doctors became more effective at pain management. At the start of the 19th century, they typically administered laudanum drops orally to patients, a process that was imprecise and sometimes not effective at all. All this changed in 1853, when Charles Pravaz and Alexander Wood developed a medical hypodermic syringe with a needle fine enough to pierce the skin (see example below from c. 1880). From that point onwards, physicians could administer morphine by injection, which was far more effective (though also more dangerous). As new techniques emerged, people’s attitudes towards pain management in treating the dying began to change. Soon, a painless death was seen not only as acceptable, but as vital to achieving a “good death.”


The doctor’s place at the bedside of the dying became increasingly commonplace. Whereas the deathbed had formerly been governed by religious tradition, it now became the purview of the medical community. By the end of the 19th century, the physician’s emphasis had shifted from cure to care, making his place in the “death chamber” more acceptable. Now, the good physician was one who stood by his patient, even when nothing could be done to save his or her life. As the legal scholar Shai Lavi succinctly put it: “The ethics of the deathbed shifted from religion to medicine, and dying further emerged as a matter of regulating life: life was now understood in its biological, rather than biographical, sense.”

The medicalization of death had arrived, and for better or worse, it continues to shape how we die today.


Suggested Reading:

  1. Shai Lavi, “Euthanasia and the Changing Ethics of the Deathbed: A Study in Historical Jurisprudence,” Theoretical Inquiries in Law 4.2 (2003): 729-761.

Hold The Butter! A Brief History of Gorging


’Tis the season for gorging! Mince pies, buttery rolls, homemade stuffing, turkey joints…all topped off with a dollop of cranberry sauce. In January, we’ll all heave a collective groan as we step onto the scales for the first time and face the consequences of our gluttony.

You may think that obesity is largely a symptom of the modern world, but the battle of the bulge has been raging for centuries.

One of the most famous corpulent characters from the past was a man named Daniel Lambert [left]. Born on 13 March 1770, Lambert was slim and athletic throughout most of his boyhood. Then, in 1791, he took over from his father as Keeper of Leicester’s House of Correction on Highcross Street. It was at this point that young Lambert’s waistline began expanding at an extraordinary rate. Within two years, he had ballooned to 483 pounds, and by 1804, he weighed a whopping 686 pounds. A year later, the House of Correction closed, and Lambert found himself out of a job and unemployable due to his extraordinary size.

Sensitive about his weight, Lambert withdrew and became reclusive. His meagre pension, however, could not sustain the needs of a man whose enormous suits cost £20 (or £1,440 in today’s money!). Not wanting to become a sideshow freak, but unable to feed or clothe himself, Lambert did the only thing he could. In March 1806, the Stamford Mercury reported: “Daniel Lambert is having a special carriage built to convey himself to London where he means to exhibit himself as a natural curiosity.” Once settled in the capital, Lambert began charging people for the privilege of viewing him. Over the next six months, he became a minor celebrity. All sorts of people visited him at his home, including King George III.

Eventually, Lambert returned to Leicester. He traveled the country periodically to raise money, although by then he was a very rich man. In 1809, while staying at the Wagon & Horses Inn in Stamford during what he intended to be his final tour, Lambert died while shaving. He was 39 years old. At the time of his death, his waist measured an incredible 9’4’’ in circumference [picture of his breeches below], and his calf measured 3’1’’. Lambert weighed 739 pounds—approximately a third of the weight of a modern-day Mini Cooper. The wall of the inn had to be dismantled to remove his corpse, and his coffin—constructed from 156 square feet of wood—had to be supported on wheels. It took 20 men to lower it down a ramp into the grave.


Medical interest in obesity has a long history. In the 17th century, several discoveries were made that helped doctors understand how the human body processed and stored food. In 1614, the Italian physician Santorio invented a movable platform attached to a steelyard scale [below] that allowed him to quantify changes in the bodyweight of his subjects, and thus to measure metabolic rates in humans for the very first time.

Further contributions were made by Theophile Bonet, the first anatomist to dissect obese cadavers, who documented his findings in 1679. His work was taken up in the following century by the Italian anatomist Giovanni Battista Morgagni, who recorded the first case of hardened arteries in the corpse of an obese male. By 1727, the first monograph on obesity and the treatment of the condition had appeared; and by the 19th century, there was a proliferation of literature on the subject, as people’s concern about obesity grew with their waistlines.

The Age of Dieting had begun.

Probably one of the most successful diets of the Victorian period (and beyond) was down to a man named William Banting, who self-published a booklet entitled Letter on Corpulence in 1863. In it, he proclaimed success with the first “low carb” diet, imposed upon him by his physician, a William Harvey of Soho Square. The regimen included eating four meals a day, each consisting of meat, greens, fruits, and dry wine. The emphasis was on avoiding sugar, starch, beer, milk and butter.

The pamphlet’s popularity was such that the question “Do you bant?” – alluding to Banting’s method – became commonplace, and eventually came to refer to dieting in general. Incredibly, Banting’s booklet remains in print today.

We may all need a copy come January!


Disturbing Disorders: Sirenomelia (Mermaid Syndrome)

The sea king down there had been a widower for years, and his old mother kept house for him…she was an altogether praiseworthy person, particularly so because she was extremely fond of her granddaughters, the little sea princesses. They were six lovely girls, but the youngest was the most beautiful of them all. Her skin was as soft and tender as a rose petal, and her eyes were as blue as the deep sea, but like all the others she had no feet. Her body ended in a fish tail.

Hans Christian Andersen, The Little Mermaid, 1837.

Mermaids have teased our imagination for thousands of years. One of the earliest tales originated in ancient Assyria, where the goddess Atargatis transformed herself into a mermaid out of shame for accidentally killing her human lover. Homer called them sirens in the Odyssey, describing them as beautiful singing creatures who lure sailors to their deaths. Throughout history, these seductive beings have been associated with floods, storms, shipwrecks, and drownings. They have been depicted in countless media: in Etruscan sculpture, in Greek jewelry, and in bas-relief on ancient Roman tombs. Christopher Columbus even reported seeing these mythical creatures on his voyage to the Caribbean in 1493.

But could our concept of what a mermaid looks like actually have originated from a real medical disorder?

Sirenomelia is a lethal condition characterised by rotation and fusion of the legs, resulting in what often looks like a fish tail (left). It occurs when the umbilical cord fails to form two arteries, preventing a sufficient blood supply from reaching the fetus. The single artery steals blood and nutrition from the lower body and diverts it back up to the placenta. Deprived of nourishment, the fetus fails to develop two separate limbs.

Sirenomelia, also known as ‘Mermaid Syndrome’, is extremely rare. It affects 1 in 100,000 babies and is 100 times more likely to occur in identical twins. Usually, those born with this condition die within days.

Over the course of my research, I’ve found very little about the disorder’s history. There are snippets here and there claiming that fetuses born with sirenomelia were sometimes preserved in jars and put on display in ‘freak shows’ during the 19th century—but these sources are frustratingly vague. There is brief mention of the condition in Human Monstrosities, a four-volume atlas published in 1891, but nothing that hints at how medical practitioners understood sirenomelia in earlier periods.

Perhaps because the disorder is so rare, it has also been hard for me to locate specimens in anatomical collections. My search in the Hunterian Museum at the Royal College of Surgeons in London came up empty. I did, however, find an early 20th-century example at the National Museum of Health & Medicine in Washington, D.C. There are also three fetuses in the Anatomical Museum of the Second University of Naples, which have undergone 3D bone reconstruction (two pictured below).



By far the largest number of fetuses comes from the Vrolik Museum in Amsterdam, which holds more than 5,000 specimens of human and animal anatomy, embryology, pathology, and congenital anomalies. The collection was founded by Gerardus Vrolik (1775 – 1859) and his son, Willem Vrolik (1801 – 1863), both of whom wrote extensively on anatomical deformities in the 18th and 19th centuries. The museum contains both wet preparations and skeletal remains, all of which are on display to the public today.

Unlike the first disorder I examined in this series—Harlequin Ichthyosis—sirenomelia is almost invariably fatal. There are no accounts of anyone with this condition surviving in the past; most died within days of being born due to kidney and bladder failure. Even today, the odds are against those born with sirenomelia, though there are a handful of examples of children living past infancy.

In 1988, Tiffany Yorks underwent surgery to separate her legs before her first birthday. She continues to suffer from mobility issues due to her fragile leg bones, and compensates by using crutches or a wheelchair to move around. At the age of 26, she is the longest-surviving sirenomelia patient to date.
