Houses of Death: Walking the Wards of a Victorian Hospital

The following blog post relates to my forthcoming book THE BUTCHERING ART, which you can pre-order here

Today, we think of the hospital as an exemplar of sanitation. However, during the first half of the nineteenth century, hospitals were anything but hygienic. They were breeding grounds for infection and provided only the most primitive facilities for the sick and dying, many of whom were housed on wards with little ventilation or access to clean water. As a result of this squalor, hospitals became known as “Houses of Death.”

[Image: Trade card for a ‘Bug Destroyer’, Andrew Cooke, London]

The best that can be said about Victorian hospitals is that they were a slight improvement over their Georgian predecessors. That’s hardly a ringing endorsement when one considers that a hospital’s “Chief Bug-Catcher”—whose job it was to rid the mattresses of lice—was paid more than its surgeons in the eighteenth century. In fact, bed bugs were so common that the “Bug Destroyer” Andrew Cooke [see image, left] claimed to have cleared upwards of 20,000 beds of insects during the course of his career.[1]

In spite of token efforts to make them cleaner, most hospitals remained overcrowded, grimy, and poorly managed. The assistant surgeon at St. Thomas’s Hospital in London was expected to examine over 200 patients in a single day. The sick often languished in filth for long periods before they received medical attention, because most hospitals were disastrously understaffed. In 1825, visitors to St. George’s Hospital discovered mushrooms and wriggling maggots thriving in the damp, soiled sheets of a patient with a compound fracture. The afflicted man, believing this to be the norm, had not complained about the conditions, nor had any of his fellow convalescents thought the squalor especially noteworthy.[2]

Worst of all was the fact that a sickening odor permeated every hospital ward. The air was thick with the stench of piss, shit, and vomit. The smell was so offensive that the staff sometimes walked around with handkerchiefs pressed to their noses. Doctors didn’t exactly smell like rose beds, either. Berkeley Moynihan—one of the first surgeons in England to use rubber gloves—recalled how he and his colleagues used to throw off their own jackets when entering the operating theater and don ancient frocks that were often stiff with dried blood and pus. Like many items of surgical clothing, these frocks had belonged to retired members of staff and were worn as badges of honor by their proud successors.

The operating theaters within these hospitals were just as dirty as the surgeons working in them. In the early decades of the nineteenth century, it was safer to have surgery at home than it was in a hospital, where mortality rates were three to five times higher than they were in domestic settings. Those who went under the knife did so as a last resort, and so were usually mortally ill. Very few surgical patients recovered without incident. Many either died or fought their way back to only partial health. Those unlucky enough to find themselves hospitalized during this period would frequently fall prey to a host of infections, most of which were fatal in a pre-antibiotic era.

In addition to the foul smells, fear permeated the atmosphere of the Victorian hospital. The surgeon John Bell wrote that it was easy to imagine the mental anguish of the hospital patient awaiting surgery. He would hear regularly “the cries of those under operation which he is preparing to undergo,” and see his “fellow-sufferer conveyed to that scene of trial,” only to be “carried back in solemnity and silence to his bed.” Lastly, he was subjected to the sound of their dying groans as they suffered the final throes of what was almost certainly their end.[3]

As horrible as these hospitals were, it was not easy gaining entry to one. Throughout the nineteenth century, almost all the hospitals in London except the Royal Free controlled inpatient admission through a system of ticketing. One could obtain a ticket from one of the hospital’s “subscribers,” who had paid an annual fee in exchange for the right to recommend patients to the hospital and vote in elections of medical staff. Securing a ticket required tireless soliciting on the part of potential patients, who might spend days waiting and calling on the servants of subscribers and begging their way into the hospital. Some hospitals only admitted patients who brought with them money to cover their almost inevitable burial. Others, like St. Thomas’s in London, charged double if the person in question was deemed “foul” by the admissions officer.[4]

Before germs and antisepsis were fully understood, remedies for hospital squalor were hard to come by. The obstetrician James Y. Simpson suggested an almost-fatalistic approach to the problem. If cross-contamination could not be controlled, he argued, then hospitals should be periodically destroyed and built anew. Another surgeon voiced a similar view. “Once a hospital has become incurably pyemia-stricken, it is impossible to disinfect it by any known hygienic means, as it would be to disinfect an old cheese of the maggots which have been generated in it,” he wrote. There was only one solution: the wholesale “demolition of the infected fabric.”[5]

It wasn’t until a young surgeon named Joseph Lister developed the concept of antisepsis in the 1860s that hospitals became places of healing rather than places of death.

To read more about 19th-century hospitals and Joseph Lister’s antiseptic revolution, pre-order my book THE BUTCHERING ART by clicking here. Pre-orders are incredibly helpful to new authors. Info on how to order foreign editions coming soon. Your support is greatly appreciated.

 

1. Adrian Teal, The Gin Lane Gazette (London: Unbound, 2014).
2. F. B. Smith, The People’s Health 1830-1910 (London: Croom Helm, 1979), 262.
3. John Bell, The Principles of Surgery, Vol. III (1808), 293.
4. Elisabeth Bennion, Antique Medical Instruments (Berkeley: University of California Press, 1979), 13.
5. John Eric Erichsen, On Hospitalism and the Causes of Death after Operations (London: Longmans, Green, and Co., 1874), 98.

Syphilis: A Little Valentine’s Day Love Story

Photo Credit: The Royal College of Surgeons of England 

We don’t know much about her. We don’t even know her name. What we do know is that the woman who wore the above prosthetic in the mid-19th century was suffering from a severe case of syphilis.

Before the discovery of penicillin in 1928, syphilis was an incurable disease. Its symptoms were as terrifying as they were unrelenting. Those who suffered from it long enough could expect to develop unsightly skin ulcers, paralysis, gradual blindness, dementia and “saddle nose,” a grotesque deformity which occurs when the bridge of the nose caves into the face.

This deformity was so common amongst those suffering from the pox (as it was sometimes called) that “no nose clubs” sprang up in London. On 18 February 1874, the Star reported: “Miss Sanborn tells us that an eccentric gentleman, having taken a fancy to see a large party of noseless persons, invited every one thus afflicted, whom he met in the streets, to dine on a certain day at a tavern, where he formed them into a brotherhood.”[1] The man, who assumed the name Mr. Crampton for these clandestine parties, entertained his “noseless” friends every month until he died a year later, at which time the group “unhappily dissolved.”[2]

The 19th century was particularly rife with syphilis. Because of its prevalence, both physicians and surgeons treated victims of the disease. Many treatments involved the use of mercury, hence giving rise to the saying: “One night with Venus, a lifetime with Mercury.” Mercury could be administered in the form of calomel (mercury chloride), an ointment, a steam bath or pill. Unfortunately, the side effects could be as painful and terrifying as the disease itself. Many patients who underwent mercury treatments suffered extensive tooth loss, ulcerations and neurological damage. Some died of the mercury poisoning itself.

For those determined to avoid the pox altogether, condoms made from animal membrane and secured with a silk ribbon were available [below], but these were outlandishly expensive. Moreover, many men shunned them for being uncomfortable and cumbersome. In 1717, the surgeon Daniel Turner wrote:

The Condum being the best, if not only Preservative our Libertines have found out at present; and yet by reason of its blunting the Sensation, I have heard some of them acknowledge, that they had often chose to risk a Clap, rather than engage cum Hastis sic clypeatis [with spears thus sheathed].[3]

Everyone blamed each other for the burdensome condom. The French called it “la capote anglaise” (the English cape), while the English called it the “French letter.” Even more unpleasant was the fact that once one procured a condom, he was expected to use it repeatedly. Unsurprisingly, syphilis continued to rage despite the growing availability of condoms during the Victorian period.

Which brings me back to the owner of the prosthetic nose. Eventually, she lost her teeth and palate after prolonged exposure to mercury treatments. Her husband—who may have been the source of her suffering—finally died from the disease, leaving her a widow. But it wasn’t all doom and gloom for the poor, unfortunate Mrs X.

According to records at the Royal College of Surgeons in London, the woman found another suitor despite her deformities. After the wedding, she sought out the physician James Merryweather and sold the contraption to him for £3. The reason? Her new husband liked her just the way she was – no nose and all!

And that, kind readers, is a true Valentine’s Day love story… Ignore the part where she most certainly transmitted the disease to her new lover.

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. “Origin of the No Nose Club,” Star, Issue 1861 (18 February 1874), p. 3.
2. Ibid.
3. Daniel Turner, Syphilis: A Practical Treatise on the Venereal Disease (1717), p. 74.

The Medicalization of Death in History

When the Black Death swept through Europe in the 14th century, it claimed the lives of over 75 million people, many of whom were clergymen whose job it was to help usher the dying into the next world. In response to the shortage of priests, the Ars Moriendi (Art of Dying) first emerged in 1415. This was a manual that provided practical guidance to the dying and those who attended them in their final moments. It included prayers and prescribed rites to be performed at the deathbed, as well as illustrations and descriptions of the temptations one must overcome in order to achieve a “good death.”

From the medieval period onwards, the dying were expected to follow a set of “rules” when facing the final moments of their lives, which included repentance of sins, forgiving enemies, and accepting one’s fate stoically without complaint.  It was each person’s duty to die a righteous death.

In earlier periods, many people believed that pain was a necessary component of a good death. Indeed, the word patient comes from the Latin word patiens, meaning “long-suffering” or “one who suffers.” Evangelical Christians, in particular, feared losing lucidity as death approached, as this would prohibit the person from begging forgiveness for past sins and putting his or her worldly affairs in order before departing this life. Death was a public event, with those closest to the dying in attendance. Friends and relatives were not merely passive observers. They often assisted the dying person in his or her final hours, offering up prayers as the moment drew closer. The deathbed presented the dying with the final opportunity for eternal salvation.

For this reason, the physician rarely appeared at the bedside of a dying person because pain management wasn’t required. Moreover, the general consensus was that it was inappropriate for a person to profit from another’s death. Caricatures depicting the greedy physician running off with bags of money after his patient had succumbed to his fate were not uncommon in the 18th and early 19th centuries.

Over time, however, religious sentiments faded, and physicians began to appear more regularly in the homes of the dying. Part of this was driven by the fact that doctors became more effective at pain management. At the start of the 19th century, they typically administered laudanum drops orally to patients. This process was imprecise, and sometimes not effective at all. All this changed in 1853 when Charles Pravaz and Alexander Wood developed a medical hypodermic syringe with a needle fine enough to pierce the skin (see example below from c.1880). From that point onwards, physicians began administering morphine hypodermically, which was far more effective (though also more dangerous). As new techniques emerged, people’s attitudes towards pain management in treating the dying began to change. Soon, a painless death was seen not only as acceptable, but as vital to achieving a “good death.”

The doctor’s place at the bedside of the dying became increasingly commonplace. Whereas the deathbed was formerly governed by religious tradition, it now became the purview of the medical community. The emphasis for the doctor had shifted from cure to care by the end of the 19th century, thus making his place in the “death chamber” more acceptable. Now, the good physician was one who stood by his patient, even when nothing could be done to save his or her life. As the legal scholar Shai Lavi succinctly put it: “The ethics of the deathbed shifted from religion to medicine, and dying further emerged as a matter of regulating life: life was now understood in its biological, rather than biographical, sense.”

The medicalization of death had arrived, and for better or worse it continues to shape how we die today.

If you enjoy my blog, please consider supporting my content by clicking HERE.

Suggested Reading:

1. Shai Lavi, “Euthanasia and the Changing Ethics of the Deathbed: A Study in Historical Jurisprudence,” Theoretical Inquiries in Law, 4.2 (2003): pp. 729–761.

“Limbs Not Yet Rigid” – A History of Dissecting the Living

[Image: The Dead Alive!, H. Wigstead, 1784]

Several years ago, the news reported a story that could have come straight from the script of a horror movie. In October 2009, Colleen S. Burns was admitted to St Joseph’s Hospital Center in New York for a drug overdose. A short time later, a team of doctors pronounced the 39-year-old woman dead. Her family was notified and Burns’s body was prepped for organ donation.

The only problem was: Burns wasn’t actually dead.

She was in a drug-induced coma. Fortunately for her, she woke minutes before the first incision was made. Happily, occurrences such as this are few and far between these days. Yet in the past, incidents of premature dissection were not uncommon.

In 1746, Jacques-Bénigne Winslow wrote: “Tho’ Death, at some Time or other, is the necessary and unavoidable Portion of Human Nature in its present Condition, yet it is not always certain, that Persons taken for dead are really and irretrievably deprived of Life.” Indeed, the Danish anatomist went on to claim that it was “evident from Experience” that those thought to be dead have proven otherwise “by rising from their Shrowds [sic], their Coffins, and even from their Graves.” [1]

Fears over premature burial were ubiquitous during this period, so much so that people created “life preserving coffins” with bells and breathing tubes attached. But even worse than being buried alive was the thought of being dissected alive. The threat was real, and it happened often enough to be commented on in contemporary literature.

The 17th and 18th centuries were rife with stories about executed criminals who had “returned from the dead” just moments before being dissected. In 1651, Anne Green was hanged in Oxford for infanticide. For thirty minutes, she dangled at the end of the noose while her friends thumped her chest and put “their weight upon her leggs [sic]…lifting her up and then pulling her downe againe with a suddain jerke” in order to quicken her death. Afterwards, her body was cut down from the gallows and brought to Drs Thomas Willis and William Petty to be dissected. Just seconds before Willis plunged the knife into her sternum, Anne miraculously awoke. [2]

The 19th century had its fair share of incidents too. The physician and surgeon Sir Robert Christison complained that dissection in St Bartholomew’s Hospital in London was “apt to be performed with indecent, sometimes with dangerous haste” during this period. He remembered:

…an occasion when [William] Cullen commenced the dissection of a man who died one hour before, and when fluid blood gushed in abundance from the first incision through the skin…Instantly I seized his wrist in great alarm, and arrested his progress; nor was I easily persuaded to let him go on, when I saw the blood coagulate on the table exactly like living blood.

He further remarked: “It was no uncommon occurrence that, when the operator proceeded with his work, the body was sensibly warm, the limbs not yet rigid, the blood in the great vessels fluid and coagulable [sic].” [3]

[Image: An aged anatomist selecting his dissection instrument]

The problem wasn’t confined to Britain alone. The French physician Pierre Charles Alexandre Louis reported the story of a patient who had been placed in his dissection room at the Pitié-Salpêtrière Hospital in Paris. The next morning, the doctor’s colleagues informed him that they had heard moans in the locked theater overnight. When Louis went to check it out, he found “to his horror that the supposed corpse had revived during the night, and had actually died in the struggle to disengage herself from the winding sheet in which she was enveloped.” [4]

It was largely because of reports like this that anatomists themselves worried about the precise moment of death when cutting open bodies. To avoid disaster, good old Winslow suggested that a person’s gums be rubbed with caustic substances, and that the body be “stimulate[d]…with Whips and Nettles” before being dissected. Furthermore, the anatomist should “irritate his Intestines by Means of Clysters and Injections of Air or Smoke” as well as “agitate… the Limbs by violent Extensions and Inflexions.” If possible, an attempt should also be made to “shock [the person’s] Ears by hideous Shrieks and excessive Noises.” [5]

To our modern sensibilities, these measures may seem extreme, even comical, but to Winslow, this was no laughing matter. In fact, he went even further, recommending that the palms of the hands and the soles of the feet be pricked with needles, and that the “Scapulae, Shoulders and Arms” be scarified using fire or sharp instruments so as to “lacerate and strip [them] of the epidermis.” Indeed, when reading Winslow’s work, one gets the distinct feeling that he took pleasure in imagining new ways to torture the dead.

Today, new debates have arisen over the very definition of death itself with the emergence of “beating heart cadavers.” Though considered dead in both a medical and legal capacity, these “cadavers” are kept on ventilators for organ and tissue transplantation. Their hearts beat; they expel waste; they have the ability to heal themselves of infection; they can even carry a fetus to term. Crucially, though, their brains are no longer functioning. It is in this way that the medical community has redefined death in the 21st century.

Yet, some wonder whether these “beating heart cadavers” are really dead, or whether they are just straddling the great divide between life and death before the lights finally go out. Or, worse, whether they have been misdiagnosed, as in the case of Colleen Burns.

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. Jacques-Bénigne Winslow, The Uncertainty of the Signs of Death, and the Danger of Precipitate Interments and Dissections (1746), pp. 1-2.
2. Anon., A declaration from Oxford, of Anne Green a young woman that was lately, and unjustly hanged in the Castle-yard; but since recovered (London, 1651), p. 2.
3. R. Christison, The Life of Sir Robert Christison (1885-6), pp. 192-3. Originally quoted in Ruth Richardson, Death, Dissection and the Destitute (2000), p. 98.
4. Ibid.
5. Winslow, The Uncertainty of the Signs of Death, p. 2.

The Mad Dogs of London: A Tale of Rabies

 

[Image: A mad dog on the run in a London street: citizens attack it]

There was panic on the streets of London in 1760, and the city’s newspapers weren’t helping the situation. Hundreds of column inches, for week upon week, were full of terrifying reports about an outbreak of attacks by rabid dogs. Armchair experts even wrote letters to newspaper editors offering advice and hypotheses on the causes and prevention of rabies (or “hydrophobia” as contemporaries called it).

Rumor fueled the journalistic fire, and the scare stories came in thick and fast. The London Chronicle was just one of many newspapers that reported the following representative sample of incidents.

A rabid dog bit a child on the hand in the Strand, and her parents had her arm immediately cut off to prevent the infection spreading, but the unfortunate girl expired soon after, in great agony.

One Sunday morning, a William Hambly of Deptford was getting into his coach, when he was bitten by a “mad dog,” which had come running down But Lane from London Road. The alarm was raised, and the dog was soon hunted down and shot.

A nine-year-old girl of Virginia Street, Wapping, was bitten by a puppy. Shortly afterwards she began to show signs of madness, and her parents were obliged to tie her down to her bed. It was reported that a few days later she raved and barked like a dog.

The son of a ticket porter of Thames Street was also savaged by a stray mutt, but as there appeared to be no subsequent signs of madness about him, his death a while after the attack was attributed to fright rather than to the wound itself. Another report stated that a mad dog had bitten three other dogs in Islington, and also two cows belonging to a Mr. Pullein. [1]

[Image: Man being bitten by a mad dog]

Such stories unnerved the people of Georgian London, and not without good reason. Even today, rabies is the most deadly virus on earth—more so than Ebola—with nearly a 100 percent mortality rate in the unvaccinated. Since ancient times, it had been recognized that the virus was contracted via an animal bite. In the fourth century B.C., Aristotle wrote that “Rabies makes the animal mad… It is fatal to the dog itself, and to any animal it bites.” Pliny the Elder also recognized this mode of transmission, but added that dogs could become rabid by tasting the menstrual blood of a woman. [2] (Seems legit).

The first mention of rabies in Britain dates back to 1026 A.D. A Welshman named Howel Dda reported that numerous dogs were suffering from the “madness.” As a result, a law was enacted in Wales that called for the killing of any dog suspected of having rabies. In addition to preventative measures, doctors offered a multitude of suggestions for how to cure the virus once a person contracted it. These included various herbal remedies such as Scutellaria lateriflora, also known as Mad-dog Skullcap (a member of the mint family); and cauterizing the wound with a hot iron. [3]

It wasn’t until the 18th century, however, that large outbreaks of rabies began to occur in Britain. With public alarm at fever pitch in 1760, London’s Common Council decided it was time for a radical solution. On August 26th, the Council issued an order that a bounty of 2 shillings should be paid to public-spirited Londoners for any stray dog killed. As a consequence, boys, apprentices, and nefarious youths started going about the city carrying clubs and cudgels, with the intention of butchering numberless dogs. Their bloodied and battered carcasses were tossed in ditches in Moorfields. The Gentleman’s Magazine reported that “No less than the bodies of thirty dead dogs were told in one day…by a person of undoubted veracity, who was only casually passing by that way.” [4]

[Image: Rabies: slaying a mad dog]

The Council’s actions were condemned by many animal lovers as licensed cruelty. The Gentleman’s Magazine voiced its concern: “not one in a thousand [of these dead dogs] will be mad…Those who make it a revenue to kill the dogs will carefully avoid meddling with any that have bad symptoms, from the dread of the consequences.” [5] One famous name was dragged into the debate about the rights and wrongs of the dog cull. Renowned pug-owner William Hogarth, doyen of satirical engravings, had his own art-form turned against him by the caricaturists, who showed him in a print entitled The Dog killers of London & Westminster or licenc’d Cruelty 1760. Surrounded by men beating strays to death in the street, the distraught Hogarth laments, “Oh! My poor Pugg. Oh! My little Dog.”

The rabies outbreak lasted three years. Eventually, the infected dogs died out or were killed, and calm was restored to the city. It would take another 125 years before Louis Pasteur would create a vaccination for the deadly virus and test it on 9-year-old Joseph Meister. But that, dear reader, is a subject for another blog post.

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. I am hugely indebted to Adrian Teal and his fantastic book, The Gin Lane Gazette, for the 18th century stories cited in this article. Also, see The London Chronicle (August 1760) for similar tales.
2. Both Aristotle and Pliny were quoted in Arthur A. King, Historical Perspective of Rabies in Europe and the Mediterranean Basin (Paris, 2004). This book can be found online. It has a lot of detailed information about historical perceptions of rabies. Another great (digestible) article on the subject is by Lisa Smith, “The Problem of Mad Dogs in the Eighteenth Century,” The Sloane Letters Blog (27 January 2014).
3. Mike Rendell, “Rabies in the Eighteenth Century,” Georgian Gentleman (5 July 2012); and G. Fleming, Rabies and Hydrophobia. Their history, nature, causes, symptoms and prevention (London: Chapman and Hall, 1872), 405.
4. The Gentleman’s Magazine (August 1760). A compilation of this magazine from this period can be found here.
5. Ibid.

“The Queen’s Big Belly”: The Phantom Pregnancy of Mary I

On 30 April 1555, the people of London took to the streets in celebration as bells ringing out around the city announced that Mary I, Queen of England, had been safely delivered of a healthy son. A preacher proclaimed to the gathered crowds that no one had ever seen such a beautiful prince. News spread quickly to the continent, and letters of congratulation to the royal family began pouring in from Europe.

There was just one problem: Mary hadn’t given birth. In fact, there was no baby at all. What was initially hailed as a royal pregnancy ended in devastation and embarrassment for the Tudor Queen several months later.

Rumors began circulating about the pregnancy shortly after the Queen’s wedding to Philip II of Spain, in September 1554. Mary, who was by then 37 years old, had reportedly stopped menstruating. Over the coming months, her belly expanded and her doctors attended to her morning sickness. The Queen—thoroughly convinced of the legitimacy of her pregnancy—ordered a royal nursery prepared in anticipation of the arrival of an heir that spring. Letters that would announce the birth of the prince or princess were primed and ready to be sent out at a day’s notice [Elizabeth I’s birth announcement below]. Only the dates and sex of the child needed filling in.

The months ticked by. In June, the Queen issued a statement claiming that God would not allow her child to be born until all Protestant dissenters were punished. Mary—who had already burned countless heretics at the stake since coming to the throne the previous year—began another round of executions in a desperate attempt to induce labor. During this time, the court grew suspicious of the Queen’s condition. Giovanni Michieli, the Venetian ambassador, wrote that Mary’s pregnancy was more likely to “end in wind rather than anything else.”

By August it was clear that there would be no baby, and Mary finally emerged from her confinement, humiliated and defeated. Her belly was once again flat. Her body showed none of the signs that had led to the pregnancy being announced. Her political rivals rejoiced, believing this to be a sign of divine retribution.

[Image: Foetus in womb]

Conspiracy theories erupted immediately. Many people were convinced that Mary was ill. Others believed she had miscarried and simply couldn’t face the truth. Some even went so far as to claim that the barren Queen had been planning to smuggle a baby boy into the court but that the plan had fallen apart. A few wondered if Mary was even still alive. Whatever had happened, however, one thing was clear: Mary seemed to truly believe she had been pregnant.

Pseudocyesis, or phantom pregnancy, was a condition recognized by medical practitioners in the Tudor period. The physician William Harvey—best known for his discovery of the circulation of the blood—recorded several cases of phantom pregnancies which he had encountered in his practice during the 17th century. Most, he said, ended in “flatulency and fatness.” While many doctors like Harvey believed these phantom pregnancies were the product of trapped wind or the build-up of some kind of matter in the uterus, some thought they were the direct result of wishful thinking on the part of the expectant mother. Guillaume Mauquest de la Motte referred to aging women, like Mary, who “have such an aversion for old-age, that they had rather believe themselves with child, than to confess they are growing old.”

Although it may seem astonishing today that a woman could falsely believe herself to be pregnant for a full nine months, we must remember that Mary lived during a time when there were no certain ways of determining pregnancy. This wasn’t helped by the fact that throughout her adolescence, Mary had also suffered from extremely painful and unpredictable periods that often left her incapacitated for weeks on end. Wildly fluctuating hormones may have been the cause of her halted menstruation in 1554, which naturally the Queen and her doctors took to be a sign of pregnancy.

Upon hearing the news, Mary’s husband left England to prosecute a war against France. When he returned to his wife’s side two years later, he brought with him his mistress. It was at this time that the Queen suffered yet another phantom pregnancy—perhaps brought on by grief from her failing marriage and inability to bear children to date. This second incident, however, led many to believe she had a tumor growing in her womb. What else could cause Mary’s belly to grow as big as it would if she were carrying a royal heir?

Sadly, Mary died childless shortly after this second phantom pregnancy. She had been Queen for just over five years. Those who embalmed her body and prepared it for burial found no indication of a tumor, or any other explanation for her false pregnancies, which were a source of such deep sadness for Mary in her lifetime.

If you enjoy my blog, please consider supporting my content by clicking HERE.

Suggested Reading:

Levin, Carole. “Mary I’s Phantom Pregnancy.” History Extra (12 May 2015).

Levin, Carole, Barrett-Graves, D., Carney, J. (Eds.) High and Mighty Queens of Early Modern England: Realities and Representations (2003).

Rosenhek, Jackie. “An Heir-Raising Experience.” The Doctors Review (August 2013).

 

The Battle over Bodies: A History of Criminal Dissection

On 29 July 1831, John Amy Bird Bell was found guilty of murdering a young boy for the sake of a few coins. At his trial, Bell expressed no emotion when he was sentenced to death. He did, however, break down when he was informed that his body would be given over to the surgeons to be dissected.

Bell was only 14 years old when he was executed and anatomized. As he made his way to the gallows, he turned to the constable and asked: “He [the murdered child] is better off than I am now, do you not think he is, sir?” The constable agreed.

The Murder Act of 1752 decreed that the bodies of all murderers—young and old—be anatomized as an additional punishment for the heinous crime of taking another person’s life. Most of the criminal bodies harvested for dissection came from Tyburn in London, a place of execution since the 12th century.

Locals called the permanent scaffold there “the deadly nevergreen,” the tree which bore fruit all year long. It consisted of three posts—each ten to twelve feet high—held together by three wooden crossbars at the top. Between 1169 (when the first recorded execution took place) and 1783 (when hangings were moved to Newgate Prison), an estimated 40,000–60,000 people died at Tyburn. Amongst these were Perkin Warbeck (1499), pretender to the throne; Francis Dereham (1541), Queen Catherine Howard’s lover; and Jack Sheppard (1724), the notorious thief and escape artist.

The public’s desire for justice did not necessarily include a desire to see the criminal body dissected. Most believed the body was sacred and should remain intact after death. A sketch made in 1782 by the artist Thomas Rowlandson depicts the interior of William Hunter’s anatomical museum on the Last Day of Judgment as resurrected corpses bewilderingly search for missing body parts [see below]. As comical as this may seem, fears about what happened to one’s body after death were very real during this period. Many people believed that the execution itself was punishment enough and that the body of a criminal should not suffer the final indignity of dissection.

[Image: ‘Museum… in Windmill Street, on the last Day’]

After the passage of the Murder Act, Tyburn became a battleground between the surgeons who needed to procure corpses for dissection and the mob who fought ferociously to protect the dead. Samuel Richardson, writing in 1740, described such a scene:

As soon as the poor creatures were half-dead, I was much surprised before such a number of peace-officers, to see the populace fall to hauling and pulling the carcasses with so much earnestness, as to occasion several warm rencounters [sic], and broken heads. These were the friends of the persons executed…and some persons sent by private surgeons to obtain bodies for dissection. The contests between these were fierce and bloody, and frightful to look at. [1]

Before the day of reckoning, the condemned went to great lengths to protect their bodies from the dissection table. They appealed to family, friends, lovers and acquaintances. Martin Gray begged his uncle to come to his execution in 1721, “lest his Body should be cut, and torn, and mangled after Death.” [2] Sarah Wilmhurst, who was convicted of murdering her bastard child in 1743, was more concerned that her father and brother would fail to secure her body after the execution than with the prospect of death itself. [3] Most telling of all was a plea made by Vincent Davis, who was condemned to die after murdering his wife, Elizabeth, “by giving her with a Knife one mortal Wound in the Right Side of the Breast.” During his confinement, Davis

…sent many Letters to all his former Friends and Acquaintance to form a Company, and prevent the Surgeons in their Designs upon his Body…So great were these Apprehensions that he should be Anatomiz’d, that…he desired and wish’d he might be hang’d in Chains to prevent it, and with that view affronted the Court of Justice. [4]

The court did not acquiesce to his pleas. On the day of execution, however, Davis’s friends fought the surgeons for his body and won. He was later buried in Clerkenwell. [5]

These battles were not for the faint-hearted. Accounts from the Barber-Surgeons’ Company reveal how violent scenes around the gallows could become. An entry from 1739 records: “Paid the Beadles for their being beaten and wounded at the late execution £4.4.0.” Another entry from 1740 reads: “Paid for mending the windows broke upon bringing the last body from Tyburn. £0.6.0.” In one record we discover that the “dead man’s clothes…were lost in the scuffle.” The hangman who had procured the body thus required 15 pence compensation, as the clothes of the executed rightly belonged to him. [6]

Eventually, “the deadly nevergreen” was taken down after the last criminal—John Austin—was hanged there on 3 November 1783. From that point forward, hangings took place just outside the walls of Newgate Prison. Given the close proximity of Surgeons’ Hall to the site of execution, it was easier for surgeons to procure bodies for dissection away from the prying eyes of an angry crowd.

Nonetheless, surgeons continued to be the object of public loathing and ridicule well into the 19th century. On 19 April 1828, The London Medical Gazette reported:

The practice of dissection seems repugnant to the strongest prejudices of the people in this country; a repugnance which is by no means limited to the lower classes of the community, but which at present pervades nearly all, and which has unfortunately been increased, if not originally produced, by dissection having been made to constitute part of the punishment of the most aggravated felonies, and thus associated in the public mind with crime and degradation. [7]

It wasn’t until the Anatomy Act of 1832—when the bodies of the unclaimed poor were made available—that the links between dissection and punishment were formally severed. Unfortunately, in the minds of many, the executioner and surgeon would remain bound together for some time.

One executed the body, the other executed the law.

If you enjoy my blog, please consider supporting my content by clicking HERE.

1. Samuel Richardson, Familiar Letters on Important Occasions (1928), p. 219.
2. The Ordinary’s Account, 3 April 1721.
3. The Ordinary’s Account, 18 May 1743.
4. The Ordinary’s Account, 30 April 1725.
5. Peter Linebaugh, “The Tyburn Riot Against the Surgeons,” in Albion’s Fatal Tree: Crime and Society in Eighteenth-Century England (1975; repr. 1988), p. 81. I am hugely indebted to Linebaugh for information found in this blog post.
6. S. Young, Annals of the Barber-Surgeons of London (1890).
7. The London Medical Gazette (19 April 1828).