Houses of Death: Walking the Wards of a Victorian Hospital

The following blog post relates to my forthcoming book THE BUTCHERING ART, which you can pre-order here.

Today, we think of the hospital as an exemplar of sanitation. However, during the first half of the nineteenth century, hospitals were anything but hygienic. They were breeding grounds for infection and provided only the most primitive facilities for the sick and dying, many of whom were housed on wards with little ventilation or access to clean water. As a result of this squalor, hospitals became known as “Houses of Death.”

[Image: Trade card for a ‘Bug Destroyer’, Andrew Cooke, London]

The best that can be said about Victorian hospitals is that they were a slight improvement over their Georgian predecessors. That’s hardly a ringing endorsement when one considers that a hospital’s “Chief Bug-Catcher”—whose job it was to rid the mattresses of lice—was paid more than its surgeons in the eighteenth century. In fact, bed bugs were so common that the “Bug Destroyer” Andrew Cooke [see image, left] claimed to have cleared upwards of 20,000 beds of insects during the course of his career.[1]

In spite of token efforts to make them cleaner, most hospitals remained overcrowded, grimy, and poorly managed. The assistant surgeon at St. Thomas’s Hospital in London was expected to examine over 200 patients in a single day. The sick often languished in filth for long periods before they received medical attention, because most hospitals were disastrously understaffed. In 1825, visitors to St. George’s Hospital discovered mushrooms and wriggling maggots thriving in the damp, soiled sheets of a patient with a compound fracture. The afflicted man, believing this to be the norm, had not complained about the conditions, nor had any of his fellow convalescents thought the squalor especially noteworthy.[2]

Worst of all was the fact that a sickening odor permeated every hospital ward. The air was thick with the stench of piss, shit, and vomit. The smell was so offensive that the staff sometimes walked around with handkerchiefs pressed to their noses. Doctors didn’t exactly smell like rose beds, either. Berkeley Moynihan—one of the first surgeons in England to use rubber gloves—recalled how he and his colleagues used to throw off their own jackets when entering the operating theater and don ancient frocks that were often stiff with dried blood and pus. They had belonged to retired members of staff and were worn as badges of honor by their proud successors, as were many items of surgical clothing.

[Image: Llanion Military Hospital]

The operating theaters within these hospitals were just as dirty as the surgeons working in them. In the early decades of the nineteenth century, it was safer to have surgery at home than it was in a hospital, where mortality rates were three to five times higher than they were in domestic settings. Those who went under the knife did so as a last resort, and so were usually mortally ill. Very few surgical patients recovered without incident. Many either died or fought their way back to only partial health. Those unlucky enough to find themselves hospitalized during this period would frequently fall prey to a host of infections, most of which were fatal in a pre-antibiotic era.

In addition to the foul smells, fear permeated the atmosphere of the Victorian hospital. The surgeon John Bell wrote that it was easy to imagine the mental anguish of the hospital patient awaiting surgery. He would hear regularly “the cries of those under operation which he is preparing to undergo,” and see his “fellow-sufferer conveyed to that scene of trial,” only to be “carried back in solemnity and silence to his bed.” Lastly, he was subjected to the sound of their dying groans as they suffered the final throes of what was almost certainly their end.[3]

As horrible as these hospitals were, it was not easy gaining entry to one. Throughout the nineteenth century, almost all the hospitals in London except the Royal Free controlled inpatient admission through a system of ticketing. One could obtain a ticket from one of the hospital’s “subscribers,” who had paid an annual fee in exchange for the right to recommend patients to the hospital and vote in elections of medical staff. Securing a ticket required tireless soliciting on the part of potential patients, who might spend days waiting and calling on the servants of subscribers and begging their way into the hospital. Some hospitals only admitted patients who brought with them money to cover their almost inevitable burial. Others, like St. Thomas’s in London, charged double if the person in question was deemed “foul” by the admissions officer.[4]


Before germs and antisepsis were fully understood, remedies for hospital squalor were hard to come by. The obstetrician James Y. Simpson suggested an almost-fatalistic approach to the problem. If cross-contamination could not be controlled, he argued, then hospitals should be periodically destroyed and built anew. Another surgeon voiced a similar view. “Once a hospital has become incurably pyemia-stricken, it is as impossible to disinfect it by any known hygienic means, as it would be to disinfect an old cheese of the maggots which have been generated in it,” he wrote. There was only one solution: the wholesale “demolition of the infected fabric.”[5]

It wasn’t until a young surgeon named Joseph Lister developed the concept of antisepsis in the 1860s that hospitals became places of healing rather than places of death.

To read more about 19th-century hospitals and Joseph Lister’s antiseptic revolution, pre-order my book THE BUTCHERING ART by clicking here. Pre-orders are incredibly helpful to new authors. Info on how to order foreign editions is coming soon. Your support is greatly appreciated.

 

1. Adrian Teal, The Gin Lane Gazette (London: Unbound, 2014).
2. F. B. Smith, The People’s Health 1830-1910 (London: Croom Helm, 1979), 262.
3. John Bell, The Principles of Surgery, Vol. III (1808), 293.
4. Elisabeth Bennion, Antique Medical Instruments (Berkeley: University of California Press, 1979), 13.
5. John Eric Erichsen, On Hospitalism and the Causes of Death after Operations (London: Longmans, Green, and Co., 1874), 98.

Pre-Order My Book! The Butchering Art

[Image: Cover of the US edition of THE BUTCHERING ART]

I’m thrilled to reveal the cover for the US edition of my forthcoming book, THE BUTCHERING ART, which will be published by FSG on October 17th.

The book delves into the grisly world of Victorian surgery and transports the reader to a period when a broken leg could result in amputation, when giving birth in a squalid hospital was extraordinarily dangerous, and when a minor injury could lead to a miserable death. Surgeons—lauded for their brute strength and quick knives—rarely washed their hands or their instruments, and carried with them a cadaverous smell of rotting flesh, which those in the profession cheerfully referred to as “good old hospital stink.” At a time when surgery couldn’t have been more dangerous, an unlikely figure stepped forward: Joseph Lister, a young, melancholic Quaker surgeon. By making the audacious claim that germs were the source of all infection—and could be treated with antiseptics—he changed the history of surgery forever.

Many of you have been devoted readers of my blog since its inception in 2010, and I can’t thank you enough for your continued interest in my work. Writing a book has felt like the next logical step for a very long time. The idea of telling this particular story arose during a very difficult period in my life, when my writing career was at risk. It is therefore with great pride (and some trepidation) that I am turning this book loose into the world, and I humbly ask you to consider pre-ordering it. All pre-orders count towards first-week sales once THE BUTCHERING ART is released, and therefore give me a greater chance of securing a place on bestseller lists in October. I would be hugely grateful for your support.

Pre-order from any one of these vendors using the links below:

*Please note that THE BUTCHERING ART will also be published by Penguin in the United Kingdom, as well as several other publishers around the world. I’ll be revealing covers for these foreign editions in the coming months, along with information on where to buy a copy.

Syphilis: A Little Valentine’s Day Love Story


Photo Credit: The Royal College of Surgeons of England 

We don’t know much about her. We don’t even know her name. What we do know is that the woman who wore the above prosthetic in the mid-19th century was suffering from a severe case of syphilis.

Before the discovery of penicillin in 1928, syphilis was an incurable disease. Its symptoms were as terrifying as they were unrelenting. Those who suffered from it long enough could expect to develop unsightly skin ulcers, paralysis, gradual blindness, dementia and “saddle nose,” a grotesque deformity which occurs when the bridge of the nose caves into the face.

This deformity was so common amongst those suffering from the pox (as it was sometimes called) that “no nose clubs” sprang up in London. On 18 February 1874, the Star reported: “Miss Sanborn tells us that an eccentric gentleman, having taken a fancy to see a large party of noseless persons, invited every one thus afflicted, whom he met in the streets, to dine on a certain day at a tavern, where he formed them into a brotherhood.”[1] The man, who assumed the name Mr. Crampton for these clandestine parties, entertained his “noseless” friends every month until he died a year later, at which time the group “unhappily dissolved.”[2]

The 19th century was particularly rife with syphilis. Because of its prevalence, both physicians and surgeons treated victims of the disease. Many treatments involved the use of mercury, giving rise to the saying: “One night with Venus, a lifetime with Mercury.” Mercury could be administered in the form of calomel (mercury chloride), an ointment, a steam bath, or a pill. Unfortunately, the side effects could be as painful and terrifying as the disease itself. Patients who underwent mercury treatments often suffered extensive tooth loss, ulcerations, and neurological damage; many died of mercury poisoning.

For those determined to avoid the pox altogether, condoms made from animal membrane and secured with a silk ribbon were available [below], but these were outlandishly expensive. Moreover, many men shunned them for being uncomfortable and cumbersome. In 1717, the surgeon Daniel Turner wrote:

The Condum being the best, if not only Preservative our Libertines have found out at present; and yet by reason of its blunting the Sensation, I have heard some of them acknowledge, that they had often chose to risk a Clap, rather than engage cum Hastis sic clypeatis [with spears thus sheathed].[3]

Everyone blamed each other for the burdensome condom. The French called it “la capote anglaise” (the English cape), while the English called it the “French letter.” Even more unpleasant was the fact that once one procured a condom, he was expected to use it repeatedly. Unsurprisingly, syphilis continued to rage despite the growing availability of condoms during the Victorian period.

Which brings me back to the owner of the prosthetic nose. Eventually, she lost her teeth and palate after prolonged exposure to mercury treatments. Her husband—who may have been the source of her suffering—finally died from the disease, leaving her a widow. But it wasn’t all doom and gloom for the poor, unfortunate Mrs X.

According to records at the Royal College of Surgeons in London, the woman found another suitor despite her deformities. After the wedding, she sought out the physician James Merryweather and sold the contraption to him for £3. The reason? Her new husband liked her just the way she was – no nose and all!

And that, kind readers, is a true Valentine’s Day love story… Ignore the part where she most certainly transmitted the disease to her new lover.

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. “Origin of the No Nose Club,” Star, Issue 1861 (18 February 1874), p. 3.
2. Ibid.
3. Daniel Turner, Syphilis: A Practical Treatise on the Venereal Disease (1717), p. 74.

The Surgeon who Operated on Himself


Leonid Ivanovich Rogozov (pictured above and below right) knew he was in trouble when he began experiencing intense pain in the lower right quadrant of his abdomen. He had been feeling unwell for several days, but suddenly his temperature skyrocketed and he was overcome by waves of nausea. The 27-year-old surgeon knew it could only be one thing: appendicitis.

The year was 1961, and under normal circumstances, appendicitis was not life-threatening. But Rogozov was stuck in the middle of Antarctica, surrounded by nothing but thousands of square miles of snow and ice, far from civilization. He was one of thirteen researchers who had just embarked on the sixth Soviet Antarctic Expedition.

And he was the only doctor.

At first, Rogozov resigned himself to his fate. He wrote in his diary:

It seems that I have appendicitis. I am keeping quiet about it, even smiling. Why frighten my friends? Who could be of help? A polar explorer’s only encounter with medicine is likely to have been in a dentist’s chair.

He was right that there was no one who could help. Even if there had been another research station within a reasonable distance, the blizzard raging outside Rogozov’s own encampment would have prevented anyone from reaching him. An evacuation by air was out of the question in those treacherous conditions. As the situation grew worse, the young Soviet surgeon did the only thing he could think of: he prepared to operate on himself.

Rogozov was not the first to attempt a self-appendectomy. In 1921, the American surgeon Evan O’Neill Kane undertook an impromptu experiment after he too was diagnosed with a severe case of appendicitis. He wanted to know whether invasive surgery performed under local anesthetic could be painless. Kane had several patients whose medical conditions prevented them from undergoing general anesthesia. If he could remove his own appendix using just a local anesthetic, he reasoned, he could operate on others without having to administer ether, which he believed was dangerous and overused in surgery.

Lying in the operating theater at the Kane Summit Hospital, the 60-year-old surgeon announced his intentions to his staff. As he was Chief of Surgery, no one dared disagree with him. Kane proceeded by administering novocaine—a local anesthetic that had only recently replaced the far more dangerous drug, cocaine—as well as adrenalin into his abdominal wall. Propping himself up on pillows and using mirrors, he began cutting into his abdomen. At one point, Kane leaned too far forward and part of his intestines popped out. The seasoned surgeon calmly shoved his guts back into their rightful place before continuing with the operation. Within thirty minutes, he had located and removed the swollen appendix. Kane later said that he could have completed the operation more rapidly had it not been for the staff flitting around him nervously, unsure of what they were supposed to do.


Emboldened by his success, Kane decided to repair his own inguinal hernia under local anesthetic eleven years later. The operation was carried out with the press in attendance. It was more dangerous than the appendectomy because of the risk of puncturing the femoral artery, and it proved tricky, taking well over an hour. Kane never fully regained his strength. He eventually came down with pneumonia and died three months later.

Back in Antarctica, Rogozov enlisted the help of his colleagues, who assisted with mirrors and retractors as the surgeon cut deep into his own abdomen. After forty-five minutes, Rogozov began experiencing weakness and vertigo and had to take short breaks. Eventually, he was able to remove the offending organ and sew up the incision (pictured below, recovering). Miraculously, Rogozov was able to return to work within two weeks.

The incident captured the imagination of the Soviet public at the time. After he returned from the expedition, Rogozov was awarded the Order of the Red Banner of Labour. The incident also brought about a change in policy: thereafter, extensive health checks became mandatory for personnel before their departure for Antarctica was sanctioned.

If you enjoy my blog, please consider supporting my content by clicking HERE.

Under The Knife – Reboot!

It’s been 18 months since I’ve filmed an episode of my YouTube series, Under The Knife. But that ends today! Check out the trailer for the series reboot, which may or may not involve my severed head. A NEW episode is coming next week. If you haven’t subscribed to the channel, please do. You’ll be automatically entered to win macabre little trinkets before the launch of our next video.

My team and I have a lot of fun, quirky things planned for the series in the coming months. Under The Knife combines traditional storytelling techniques with animation, special effects, and artwork to bring the medical past alive. I hope you enjoy watching the new series as much as I enjoy filming it for you.

The Medicalization of Death in History


When the Black Death swept through Europe in the 14th century, it claimed the lives of over 75 million people, many of whom were clergymen whose job it was to help usher the dying into the next world. In response to the shortage of priests, the Ars Moriendi (Art of Dying) first emerged in 1415. This manual provided practical guidance to the dying and those who attended them in their final moments. It included prayers and prescribed rites to be performed at the deathbed, as well as illustrations and descriptions of the temptations one must overcome in order to achieve a “good death.”

From the medieval period onwards, the dying were expected to follow a set of “rules” when facing the final moments of their lives, which included repenting of sins, forgiving enemies, and accepting one’s fate stoically without complaint. It was each person’s duty to die a righteous death.

In earlier periods, many people believed that pain was a necessary component of a good death. Indeed, the word patient comes from the Latin patiens, meaning “long-suffering” or “one who suffers.” Evangelical Christians, in particular, feared losing lucidity as death approached, as this would prevent the dying person from begging forgiveness for past sins and putting his or her worldly affairs in order before departing this life. Death was a public event, with those closest to the dying in attendance. Friends and relatives were not merely passive observers. They often assisted the dying person in his or her final hours, offering up prayers as the moment drew closer. The deathbed presented the dying with the final opportunity for eternal salvation.

For this reason, the physician rarely appeared at the bedside of a dying person because pain management wasn’t required. Moreover, the general consensus was that it was inappropriate for a person to profit from another’s death. Caricatures depicting the greedy physician running off with bags of money after his patient had succumbed to his fate were not uncommon in the 18th and early 19th centuries.

Over time, however, religious sentiments faded, and physicians began to appear more regularly in the homes of the dying. Part of this was driven by the fact that doctors became more effective at pain management. At the start of the 19th century, they typically administered laudanum drops orally to patients, a process that was imprecise and sometimes not effective at all. All this changed in 1853, when Charles Pravaz and Alexander Wood developed a medical hypodermic syringe with a needle fine enough to pierce the skin (see example below from c. 1880). From that point onwards, physicians could administer morphine by injection, which was far more effective (though also more dangerous). As new techniques emerged, people’s attitudes towards pain management in treating the dying began to change. Soon, a painless death was seen not only as acceptable, but as vital to achieving a “good death.”


The doctor’s place at the bedside of the dying became increasingly commonplace. Whereas the deathbed had formerly been governed by religious tradition, it now became the purview of the medical community. By the end of the 19th century, the emphasis for the doctor had shifted from cure to care, making his place in the “death chamber” more acceptable. Now, the good physician was one who stood by his patient even when nothing could be done to save his or her life. As the legal scholar Shai Lavi succinctly put it: “The ethics of the deathbed shifted from religion to medicine, and dying further emerged as a matter of regulating life: life was now understood in its biological, rather than biographical, sense.”

The medicalization of death had arrived, and for better or worse it continues to shape how we die today.

If you enjoy my blog, please consider supporting my content by clicking HERE.

Suggested Reading:

1. Shai Lavi, “Euthanasia and the Changing Ethics of the Deathbed: A Study in Historical Jurisprudence,” Theoretical Inquiries in Law 4.2 (2003): pp. 729-761.

“Limbs Not Yet Rigid” – A History of Dissecting the Living

[Image: The Dead Alive!, H. Wigstead, 1784]

Several years ago, the news reported a story that could have come straight from the script of a horror movie. In October 2009, Colleen S. Burns was admitted to St Joseph’s Hospital Center in New York for a drug overdose. A short time later, a team of doctors pronounced the 39-year-old woman dead. Her family was notified and Burns’s body was prepped for organ donation.

The only problem was: Burns wasn’t actually dead.

She was in a drug-induced coma. Fortunately for her, she woke minutes before the first incision was made. Happily, occurrences such as this are few and far between these days. Yet in the past, incidents of premature dissection were not uncommon.

In 1746, Jacques-Bénigne Winslow wrote: “Tho’ Death, at some Time or other, is the necessary and unavoidable Portion of Human Nature in its present Condition, yet it is not always certain, that Persons taken for dead are really and irretrievably deprived of Life.” Indeed, the Danish anatomist went on to claim that it was “evident from Experience” that those thought to be dead have proven otherwise “by rising from their Shrowds [sic], their Coffins, and even from their Graves.” [1]

Fears over premature burial were ubiquitous during this period, so much so that people created “life preserving coffins” with bells and breathing tubes attached. But even worse than being buried alive was the thought of being dissected alive. The threat was real, and it happened often enough to be commented on in contemporary literature.

The 17th and 18th centuries were rife with stories about executed criminals who had “returned from the dead” just moments before being dissected. In 1651, Anne Green was hanged in Oxford for infanticide. For thirty minutes, she dangled at the end of the noose while her friends thumped her chest and put “their weight upon her leggs [sic]…lifting her up and then pulling her downe againe with a suddain jerke” in order to quicken her death. Afterwards, her body was cut down from the gallows and brought to Drs Thomas Willis and William Petty to be dissected. Just seconds before Willis plunged the knife into her sternum, Anne miraculously awoke. [2]

The 19th century had its fair share of incidents too. The physician and surgeon Sir Robert Christison complained that dissection at St Bartholomew’s Hospital in London was “apt to be performed with indecent, sometimes with dangerous haste” during this period. He remembered:

…an occasion when [William] Cullen commenced the dissection of a man who died an hour before, and when fluid blood gushed in abundance from the first incision through the skin…Instantly I seized his wrist in great alarm, and arrested his progress; nor was I easily persuaded to let him go on, when I saw the blood coagulate on the table exactly like living blood.

He further remarked: “It was no uncommon occurrence that, when the operator proceeded with his work, the body was sensibly warm, the limbs not yet rigid, the blood in the great vessels fluid and coagulable [sic].” [3]

[Image: An aged anatomist selecting his dissection instrument]

The problem wasn’t confined to Britain alone. The French physician Pierre Charles Alexandre Louis reported the story of a patient whose body had been placed in his dissection room at the Pitié-Salpêtrière Hospital in Paris. The next morning, the doctor’s colleagues informed him that they had heard moans in the locked theater overnight. When Louis went to investigate, he found “to his horror that the supposed corpse had revived during the night, and had actually died in the struggle to disengage herself from the winding sheet in which she was enveloped.” [4]

It was largely because of reports like this that anatomists themselves worried about the precise moment of death when cutting open bodies. To avoid disaster, good old Winslow suggested that a person’s gums be rubbed with caustic substances, and that the body be “stimulate[d]…with Whips and Nettles” before being dissected. Furthermore, the anatomist should “irritate his Intestines by Means of Clysters and Injections of Air or Smoke” as well as “agitate… the Limbs by violent Extensions and Inflexions.” If possible, an attempt should also be made to “shock [the person’s] Ears by hideous Shrieks and excessive Noises.” [5]

To our modern sensibilities, these measures may seem extreme, even comical, but to Winslow, this was no laughing matter. In fact, he went even further, recommending that the palms of the hands and the soles of the feet be pricked with needles, and that the “Scapulae, Shoulders and Arms” be scarified using fire or sharp instruments so as to “lacerate and strip [them] of the epidermis.” Indeed, when reading Winslow’s work, one gets the distinct feeling that he took pleasure in imagining new ways to torture the dead.

Today, new debates have arisen over the very definition of death itself with the emergence of “beating heart cadavers.” Though considered dead in both a medical and legal capacity, these “cadavers” are kept on ventilators for organ and tissue transplantation. Their hearts beat; they expel waste; they have the ability to heal themselves of infection; they can even carry a fetus to term. Crucially, though, their brains are no longer functioning. It is in this way that the medical community has redefined death in the 21st century.

Yet some wonder whether these “beating heart cadavers” are really dead, or whether they are just straddling the great divide between life and death before the lights finally go out. Or, worse, whether they have been misdiagnosed, as in the case of Colleen Burns.

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. Jacques-Bénigne Winslow, The Uncertainty of the Signs of Death, and the Danger of Precipitate Interments and Dissections (1746), pp. 1-2.
2. Anon., A declaration from Oxford, of Anne Green a young woman that was lately, and unjustly hanged in the Castle-yard; but since recovered (London, 1651), p. 2.
3. R. Christison, The Life of Sir Robert Christison (1885-6), pp. 192-3. Originally quoted in Ruth Richardson, Death, Dissection and the Destitute (2000), p. 98.
4. Ibid.
5. Winslow, The Uncertainty of the Signs of Death, p. 2.