The Saddest Place in London: A Story of Self-Sacrifice

Tucked away in a quiet corner of the City of London is a peaceful place that goes by the unassuming name of Postman’s Park (left), so called because it once stood in the shadow of the city’s old General Post Office building. At first glance, you might mistake it for any other green space in the city, with its manicured lawn, leafy trees and decorative water fountain. But if you took the time to venture through the gates, you would stumble upon something far from ordinary.

On a stone wall, underneath a makeshift overhang, is a series of ceramic plaques, each one beautifully painted with the name of a person who died while trying to save the lives of others. And on and on they go, name after name.

The first time I stumbled upon this memorial was on a walking tour given by Tina Hodgkinson. I was instantly overwhelmed with sadness. So many of the people listed on these plaques were children, like John Clinton, aged 10, who drowned ‘trying to save a companion younger than himself’. Or Henry James Bristow, aged 8, who ‘saved his little sister’s life by tearing off her flaming clothes’ only to catch fire himself and die later of burns and shock. And then there was Solomon Galaman, aged 11, who saved his little brother from being run over in Commercial Street on 6 September 1901. His plaque reads: ‘Mother I saved him but I could not save myself’.

Unlike the many grand memorials dedicated to those in the armed forces, this one was created entirely in honour of the everyday hero. It is a testament to the incredible sacrifices we, as humans, can and do make on a daily basis.

But how did this memorial come into existence in the first place? And why did we stop creating plaques for it?

On 5 September 1887, the painter and sculptor, George Frederic Watts (pictured below), wrote to The Times, proposing a tribute of a different sort to mark Queen Victoria’s Golden Jubilee. Watts believed that art could act as a force for social change, and suggested a didactic monument celebrating ‘heroism in every-day life’. He wrote:

It must surely be a matter of regret when names worthy to be remembered and stories stimulating and instructive are allowed to be forgotten. The material prosperity of a nation is not an abiding possession; the deeds of its people are.

Watts referred to the case of Alice Ayres, a nursemaid who died on 26 April 1885 in a house fire after she saved the lives of her employer’s children by throwing a mattress out of the window and dropping them to safety. She, herself, was overcome by the fumes and stumbled out of the window to her death.

Watts proposed that a marble wall inscribed with the names of everyday heroes be built in Hyde Park. Sadly, his suggestion failed to garner enough support, leading him to quip that if he had proposed a racecourse instead, he would have had plenty of sympathizers. In the years that followed, Watts continued to lobby for the memorial. Both he and his wife redrafted their wills to leave the bulk of their estate to its construction, and even considered selling their home to finance the project.

Then, in 1898, Henry Gamble—Vicar of St Botolph’s Aldersgate and longtime friend of Watts—acquired the land which would later be called Postman’s Park, and Watts suggested that the memorial be built there. Although there was resistance to the idea of the park being used in this manner, construction began a year later after the necessary funds were secured (Watts himself donated the extraordinary sum of £700 to the cause).

On 30 July 1900, the 50-foot-long wall, with space for 120 ceramic plaques, was unveiled to the public. Watts, who was then 83 years old, was too ill to attend the ceremony. He died 4 years later.

Over the course of several decades, plaques were added to the wall, many of the names chosen from the collection of newspaper clippings about ‘everyday heroes’ that Watts had accumulated over the years. In 1931, the 52nd plaque, commemorating the life of Herbert Maconoghu—who died aged 13 while trying to rescue two drowning classmates—was placed. This would be the last name added to the wall in the 20th century.

After Watts’s wife, Mary—herself a lifelong advocate of the memorial—died in 1938, the wall fell out of fashion and it seemed that no names would ever be added to it again. Then, in 2007, a man named Leigh Pitt died while rescuing a 9-year-old boy from drowning in a canal in Thamesmead. His colleagues and fiancée, Hema Shah, approached the Diocese of London to suggest that Pitt be added to the wall. Despite opposition from the Watts Gallery to proposals that the memorial be completed, a new plaque commemorating Pitt’s heroic actions was added on 11 June 2009.

Today, Postman’s Park remains an obscure destination, attracting only a handful of visitors who are drawn, perhaps, to the strangeness of the Victorian deaths chronicled on Watts’s wall. After all, not many people are trampled under the hooves of runaway horses, or die tragically in theatre fires these days. In this way, the plaques are as much a historical testament to an era long gone as they are to the lives of the people whose names adorn them.

There are currently no plans to add further plaques to the memorial. I, for one, hope we don’t have to wait another 78 years before we see another ‘everyday hero’ commemorated in such a beautiful and thoughtful way.

If you enjoy reading my articles, please consider becoming a patron of The Chirurgeon’s Apprentice. Support my content by clicking HERE.

Become a ‘Patreon’ of The Chirurgeon’s Apprentice 

Since I launched a donate button on my website several weeks ago, I’ve been overwhelmed by your generosity and support. Then, a few days ago, a friend of mine turned me on to a site called Patreon (no, that’s not a typo!), which would allow me to give a little something extra back to each and every one of you who donates to The Chirurgeon’s Apprentice.

Unlike Kickstarter or IndieGoGo, which only allow you to fund one big project, Patreon allows people to support their favourite artists, writers, and musicians as they create free content online. The way it works: you donate a fixed amount of money (for example, 50 cents) each time I produce an article on The Chirurgeon’s Apprentice. In exchange, you get various perks. These can be anything from access to Morbid Minute—a video I’ll release each month revealing a weird or gruesome fact about our medical past—to the first chapter of a novel I’m currently writing called The Surgeon’s Tale.

So if you’d like to become a ‘patreon’ of The Chirurgeon’s Apprentice, please visit my page and watch my video by clicking HERE. I’m offering lots of fun perks which I hope you will like!

As always, I thank you greatly for your kindness.

*For those who’ve donated in the past several weeks, I’ll be in contact shortly to offer you a corresponding perk!

The Anaesthetized Queen & the Path to Painless Childbirth

[Image: clear glass shop round for chloroform, United Kingdom, c. 1850]

‘Did the epidural hurt?’ I ask Rebecca Rideal—editor of The History Vault—one morning as we sit outside the British Library.

‘Not really.’ She hesitates, clearly wanting to say more without divulging too much information. ‘I mean, it’s nothing compared to the labour pains. The hardest part was lying still while the anaesthesiologist administered the needle.’

Rebecca is one of many friends of mine who have now endured the pains of childbirth. All but one of them did so with the aid of anaesthetics and pain medication. Not one of them regretted it.

Of course, there was a time when women had no choice but to give birth naturally, and often did so while sitting up in a birthing chair. The experience was fraught with dangers, not least the risk of ‘childbed fever’, which claimed the lives of thousands of women, including Henry VIII’s wife, Jane Seymour.

But even if a woman escaped with her life, she couldn’t avoid the pain.

All this changed in November 1847, when Dr James Young Simpson—a Scottish obstetrician—began using chloroform as an anaesthetic. Earlier that year, Simpson started using ether to relieve the pains of childbirth, but he was dissatisfied with the smell, the large quantity needed, and the lung irritation it caused. Ether was also highly explosive, which made it dangerous to use in candlelit rooms heated by fireplaces. It was then that David Waldie, a chemist from Liverpool, recommended chloroform to Simpson.

On the evening of 4 November, Simpson and his two friends experimented with it. At first, they felt very cheerful and talkative. After a short time, they passed out. Impressed with the drug’s potency, Simpson began using chloroform as an anaesthetic, and indeed, the first baby born to a mother under the drug’s influence was named Anaesthesia.

[Image: Sir J. Y. Simpson and two friends, having tested chloroform]

It was soon after this that the Duchess of Sutherland sent a pamphlet on Simpson’s discovery to Queen Victoria, who was then in her sixth pregnancy. The Queen’s distaste for pregnancy was well known. She considered it ‘wretched’ and experienced ‘occasional lowness and a tendency to cry’ after the birth of her first two children. [1]

Unfortunately, it was also at this time that the first chloroform fatality occurred, when 15-year-old Hannah Greener died within 3 minutes of inhaling the chemical. The Queen was hesitant, and decided to forgo the new drug during her delivery of Princess Louise in March 1848. But the labour pains were severe, and so when Victoria became pregnant again a year later, she wrote to the Duchess of Sutherland, enquiring after her daughter, who had just given birth using chloroform. Further discussion followed amongst the Royal medical household, but the decision was made once more to abstain, despite assurances from the physician John Snow that chloroform was perfectly safe when administered correctly. And so on 1 May 1850, Victoria endured her seventh labour without the aid of anaesthetics.

[Image: J. Snow, ‘Chloroform and other anaesthetics’, title page]

By 1852—when Victoria became pregnant with Prince Leopold—attitudes towards the drug were beginning to change. Most importantly, the Queen’s husband, Prince Albert, had become an advocate of its usage. Albert, a long-time champion of the sciences and President of the Royal College of Chemistry, had had lengthy discussions with Dr Snow about the administration of chloroform and the distinctions between giving it to patients undergoing surgery (which required full unconsciousness) and to women in labour. Wishing to ease his wife’s pains, Albert urged Victoria to submit to the drug.

On 7 April 1853, Snow was summoned to Buckingham Palace. A lot was at stake. If the good doctor were successful in using chloroform to ease the Queen’s delivery, he would silence critics of childbirth anaesthesia and help pave the way to painless labour for women everywhere.

Lucky for Snow, the birth was simple and uncomplicated. Prince Leopold was born within 53 minutes of his administration of the drug, which Victoria described as ‘that blessed Chloroform… soothing, quieting and delightful beyond measure’. [2] Snow later wrote in his medical casebooks that the Queen was ‘very cheerful and well, expressing herself much gratified with the effect of the [drug]’. [3]

Not everyone was pleased with the outcome, however. Some protested on religious grounds; others for medical reasons. The Lancet questioned the veracity behind claims that the Queen had even used the drug in her last delivery.

A very extraordinary report has obtained general circulation [that]…Her Majesty during the last labour was placed under the influence of chloroform, an agent which has unquestionably caused instantaneous death in a considerable number of cases. Doubts on this subject cannot exist…In no case could it be justifiable to administer chloroform in perfectly ordinary labour…These facts being perfectly well known to the medical world, we could not imagine that anyone had incurred the awful responsibility of advising the administration of chloroform to her Majesty… [4]

These doubts aside, Queen Victoria’s use of the drug was overwhelmingly lauded, and led to a public fervour for painless childbirth. The editor of the Association Medical Journal called it ‘an event of unquestionable medical importance’, and hoped that this would remove ‘lingering professional and popular prejudice against the use of anaesthesia in midwifery’. [5] Women everywhere were requesting chloroform to ease their labour pains.

Dr Snow was discreet about the details of that fateful day in Buckingham Palace, though he was questioned often about the event. On one occasion, one of his patients refused to inhale the chloroform he was trying in vain to administer unless he told her ‘what the Queen said, word for word, when she was taking it’. Snow cleverly replied that ‘Her Majesty asked no questions until she had breathed very much longer than you have; and if you will only go on in loyal imitation, I will tell you everything’. [6]

Shortly after the lady gave birth, Snow slipped away, leaving his promise unfulfilled.

Queen Victoria was destined for one final pregnancy. In 1857, she gave birth to her ninth child, Princess Beatrice (pictured below with the entire family). Once again, Dr Snow successfully administered chloroform during the delivery, securing the path to painless childbirth for women everywhere.

 

1. Roger Fulford (ed.), Dearest Child: Letters between Queen Victoria and the Princess Royal, 1858–61 (1964), pp. 195, 162. Originally quoted in Stephanie Snow, Blessed Days of Anaesthesia (2008), p. 82.
2. Quoted in Matthew Dennison, The Last Princess: The Devoted Life of Queen Victoria’s Youngest Daughter (2007), p. 2.
3. Ibid.
4. Lancet I (1853), p. 453.
5. Association Medical Journal (1853), p. 318.
6. John Snow, On Chloroform and other Anaesthetics (1858), p. xxxi. Originally quoted in Snow, Blessed Days of Anaesthesia, p. 88.

*This article is dedicated to my dear friend, Marla Ginex, who any day will give birth to her second daughter. Good luck and lots of love.

Disturbing Disorders: A Brief History of Harlequin Ichthyosis

Last Saturday, I was lounging around on the couch watching 5 straight episodes of Forensic Detectives (don’t judge) when I heard my computer ping. Being the internet junkie that I am, I immediately checked my inbox and saw a message from my old school friend, Andy, who is currently studying medicine at Case Western. He had an idea for a blog post, he wrote, but worried it might be too disturbing for my audience. Naturally, my curiosity was piqued.

Turns out, Andy had reason to worry. In the next message, he attached a photo of a 19th-century fetus (left), which is now housed at Museum Vrolik in Amsterdam. The baby had died from a very rare genetic disorder known as Harlequin Ichthyosis, which causes the overproduction of keratin protein in the skin. As a result, those with the condition are born with huge, diamond-shaped scales all over their bodies, and usually die young due to infections from cracks in the skin.

I have to admit, I’ve seen and researched many terrible diseases, and yet I had a gut reaction to this particular specimen. For me, there is always a deep sadness attached to a child’s death—and even more so when one considers the pain and suffering that brought on such a premature demise. But it wasn’t necessarily the fact that I was gazing upon a life cut short that most disturbed me. It was the severity of the deformity that gave me pause. Was this a subject I should tackle on my blog?

Since you are reading this post, you know what decision I made in the end. Although Harlequin Ichthyosis is a horrible condition, it is still part of our medical past, present and future (since we have yet to find a cure). For that reason alone, it deserves contextualization here. Moreover, our own emotional reactions to the specimen above may help us understand why people in the past feared disfiguring diseases, like smallpox or leprosy, and why many people today continue to struggle when interacting with those who suffer from serious deformities and disabilities.

Harlequin Ichthyosis’s history begins on 5 April 1750, when Reverend Oliver Hart—a cleric from Charleston, South Carolina—became the first to document (but not necessarily observe) the condition. He wrote:

I went to see a most deplorable object of a child, born the night before of one Mary Evans in ‘Chas’town. It was surprising to all who beheld it, and I scarcely know how to describe it. The skin was dry and hard and seemed to be cracked in many places, somewhat resembling the scales of a fish. The mouth was large and round and open. It had no external nose, but two holes where the nose should have been. The eyes appeared to be lumps of coagulated blood, turned out, about the bigness of a plum, ghastly to behold. It had no external ears, but holes where the ears should be. The hands and feet appeared to be swollen, were cramped up and felt quite hard. The back part of the head was much open. It made a strange kind of noise, very low, which I cannot describe. [1]

Mrs Evans’s baby died 48 hours later.

Hart’s description was very accurate. Babies born with Harlequin Ichthyosis have poorly developed ears and nose (which are sometimes absent altogether). Their eyelids are turned inside out, leaving the eyes and area around them susceptible to trauma and infection. They often bleed when they are born, and their lips—pulled upwards by the dry skin—resemble a clown’s smile.

Those suffering from Harlequin Ichthyosis are also extremely susceptible to hypothermia, and they are frequently dehydrated, as their skin is not well suited to keeping water or heat in. They often have difficulties breathing due to their armour-like scales, which impede the chest wall from expanding and drawing in enough air. Sadly, this can lead to respiratory failure in many infants.

The disorder’s name alludes to the character Harlequin in the Italian Commedia dell’arte, which made its debut in the 1580s. The Harlequin is characterized by his chequered costume; the disease mimics a similar pattern on the skin of the afflicted.

In the past, babies born with Harlequin Ichthyosis had no hope of living more than a few days. As a result, there is very little mention of it in 18th- and 19th-century medical books; and I have only come across two preserved specimens in anatomical collections: one from Museum Vrolik in Amsterdam (mentioned above), and the other from Musée Dupuytren in Paris (pictured below).

Advances in medicine, however, have made it possible for people with this condition to live into young adulthood. Improvements in neonatal care, combined with the use of topical retinoids such as Isotrex, which enable the skin to shed cells faster than they are produced, are helping to make Harlequin Ichthyosis a chronic condition rather than a fatal disease.

Just last year, 20-year-old Stephanie Turner—who herself was born with the disorder—gave birth to a perfectly healthy baby boy. Hope springs eternal.

 

1. Quoted in J. I. Waring, ‘Early Mention of a Harlequin Fetus in America’, American Journal of Diseases of Children, Vol. 43, No. 2 (February 1932).

*This is the first article in a series called Disturbing Disorders. If you would like to learn more about Harlequin Ichthyosis, or donate to research, please click here.

Being a Medical History Blogger

The year was 2010. I had just completed 9 years of university education, which culminated in a PhD from the University of Oxford in the History of Science, Medicine & Technology, and I was about to start a 3-year postdoctoral research fellowship with the Wellcome Trust. I was on top of the world, academically speaking.

Yet, for me, there was something missing.

It may not come as a surprise to you that I was a strange child, and the signs were there from the beginning that I would become an even stranger adult. I used to make my grandmother take me around to old cemeteries in Chicago when I was younger. I suppose you could say I have always been fascinated with death. But more so, I have always been fascinated with the past.

I’ve also always been a passionate storyteller. I suppose that’s what first attracted me to history as a subject. I’m so often moved by the stories I come across in my research—stories about the people who died, about the loved ones they left behind, and about the surgeons who opened up their dead bodies for the sake of medical science.

But back to 2010. I was feeling uninspired, and not a little burnt-out. I was tired of philosophizing and theorizing about the past. I wanted to fall in love with history again, and get back to the stories that once stirred my imagination. Thus, The Chirurgeon’s Apprentice was born.

I’m not going to lie: not everyone agrees with what I’m doing. Some people think I am ‘bastardizing’ the discipline and wasting my credentials. Others think I’m being purposefully sensational. And while there is no doubt that I touch upon sensitive subjects here, I hope that people come away with a real understanding of our medical past. I like to say: ‘Come for the skin book, stay for the history!’

Today, I am no longer part of academia. I have no institutional affiliation, and no funding. But I love what I do. Not a day goes by that I am not grateful to you, my readers, for your continued support and enthusiasm for the subject. Since its launch in 2010, The Chirurgeon’s Apprentice has had nearly a million hits, and now has 45,000 fans from around the world. I am truly humbled.

Over the years, many of you have asked how you can support my work. I’ve always prided myself on providing free content for those who seek it. I believe the past doesn’t just belong to historians and scholars. It belongs to everyone. That said, I’ve recently launched a ‘Donate Page’ should you want to help defray the costs of running this website, or merely show your support for a freelance writer. You can also find a donate button below.

I absolutely do not expect it, but I do appreciate your generosity.

Thank you, Dr Lindsey (AKA The Chirurgeon’s Apprentice)

The Horrors of Pre-Anaesthetic Surgery

[Image: five surgeons participating in an amputation]

I often joke that The Chirurgeon’s Apprentice is all about ‘the horrors of pre-anaesthetic surgery’, and yet I’ve never written an article which focuses primarily on the patient’s experience before the widespread use of ether beginning in the 1840s. Suffice it to say, it was not a pleasant affair.

In 1750, the anatomist, John Hunter, colourfully described surgery as ‘a humiliating spectacle of the futility of science’ and the surgeon as ‘a savage armed with a knife’.[1] He was not far from the truth. Surgery was brutal and only to be undertaken in extreme circumstances. In 1811, Fanny Burney had a mastectomy after being diagnosed with breast cancer. She later recorded the incident vividly for posterity:

When the dreadful steel was plunged into the breast—cutting through veins—arteries—flesh—nerves—I needed no injunctions not to restrain my cries. I began a scream that lasted unintermittingly during the whole time of the incision—& I almost marvel that it rings not in my Ears still!

Fanny (pictured right) went on to depict her own terror as one that ‘surpasses all description’. The agony, she said, was ‘excruciating’. So terrible was the prospect of the operation that her surgeons decided to limit her anxiety by choosing a day at random and giving her only two hours’ notice before they began.[2]

Fanny was one of the lucky ones. Not only did she survive surgery, but she also went on to live for another 28 years. Others were not so fortunate. When Stephen Pollard underwent an operation to remove a bladder stone in 1828, he did so under the gaze of 200 spectators. What should have lasted 5 minutes ended up taking almost an hour. The surgeon, Bransby Cooper, fumbled and panicked, cursing the patient loudly for having ‘a very deep perineum,’ while the patient, in turn, cried: ‘Oh! let it go; —pray, let it keep in!’[3]

Pollard died the next day.

Pain was not just an unavoidable side effect of surgery. Most surgeons operating in a pre-anaesthetic era believed it was a vital stimulant necessary for keeping the patient alive. This is why opiates and alcohol were used sparingly, and typically administered shortly before (not during) a procedure, as the loss of consciousness was considered to be extremely dangerous.

Today, patients are laid flat on an operating table. Before the latter half of the 19th century, however, patients were often sat upright in an elevated chair. This prevented them from bracing when the surgeon’s knife began to dig into their flesh. Unsurprisingly, they were also restrained, sometimes with leather straps. The operating chair depicted on the left is not dissimilar to ones which would have been used during these earlier periods.

The patients weren’t the only ones who felt anxious before an operation. Surgeons, too, were apprehensive about cutting into living bodies. The Scottish surgeon, Charles Bell (1774–1842), was described by one colleague as having ‘the reluctance of one who has to face an unavoidable evil’.[4] John Abernethy (1764–1831), a surgeon at St Bartholomew’s Hospital, confessed to shedding tears and being physically ill before or after a particularly terrible operation. He likened the walk to the operating room to ‘going to a hanging’.[5] And William Cheselden (1688–1752) once remarked that ‘no one ever endured more anxiety and sickness before an operation…’[6]

Surgery was a last resort, and one which brought with it considerable risks. Patients could die from the sheer amount of blood lost during a procedure. In hospital operating theatres, a wooden box was placed under the patient to catch blood and pus during the surgery. Additionally, sawdust was placed under the floorboards to catch the overflow. Even if a patient survived the traumatic ordeal, he or she might die from post-surgical infection.

Surgeons, of course, were aware of these risks, and went to great lengths to avoid operating. The historian Stephanie Snow argues that because of these dangers, ‘an elaborate etiquette of medical consultations developed’ before a decision was made to operate.[7] Indeed, only a handful of surgeries took place each month at most of the major London hospitals in the first half of the 19th century. Robert Liston—known as ‘the fastest knife in the West End’—lost 1 in 10 patients in the operating theatre at University College Hospital during this period. By the standards of the day, his success rates were fairly good. Surgeons at nearby St Bartholomew’s lost approximately 1 in 4.[8]

This does not account for the number of patients who died later of complications.

Very few people who underwent surgery recorded their thoughts for posterity. They either did not have the resources to do so, or lacked the inclination to relive their painful experiences in writing. George Wilson—a Professor of Chemistry at Edinburgh University—underwent a foot amputation in 1842. He remembered ‘the fingering of the sawed bone; the sponge pressed on the flap; the tying of the blood-vessels; the stitching of the skin; and the bloody dismembered limb lying on the floor’. Later, he wrote that these memories were not ‘pleasant remembrances’ and were ‘never welcome’.[9]

When reading descriptions like these, it is easy to understand why so many patients’ voices are now lost to us. All too often, their memories, like the surgeries they suffered, were simply too painful to endure.

 

1. Quoted in P. H. Jacobson, ‘Dentistry’s answer to “the humiliating spectacle”’, Journal of the American Dental Association (1994), p. 1576.
2. The full description of this surgery can be found in Fanny Burney, Selected Letters and Journals, ed. Joyce Hemlow (1986), pp. 127-41.
3. Druin Burch, Digging up the Dead: Uncovering the Life and Times of an Extraordinary Surgeon (2007), p. 26. A fuller description of this incident can be found in an earlier article I wrote here.
4. J.M. Arnott, quoted in Gordon Gordon-Taylor & E.W. Walls, Sir Charles Bell: His Life and Times (1958), p. 82. Originally quoted in Peter Stanley, For Fear of Pain: British Surgery, 1790 – 1850 (2003), p. 205.
5. George Macilwain, Memoirs of John Abernethy, 2 vols (1854), Vol II, p. 203. Originally quoted in Stanley, For Fear of Pain, p. 204.
6. William Cheselden, The Anatomy of the Human Body (1741), p. 334. Originally quoted in Lynda Payne, With Words and Knives: Learning Medical Dispassion in Early Modern England (2007), p. 79.
7. Stephanie Snow, Blessed Days of Anaesthesia (2008), p. 4. I am hugely indebted to Snow for pointing me to some of the sources cited in this article.
8. Matt Soniak, ‘”Time Me, Gentlemen”: The Fastest Surgeon of the 19th Century’, The Atlantic (24 October 2012).
9. Jessie Aitken Wilson, Memoir of George Wilson (1860), pp. 296-7. Originally quoted in Stanley, For Fear of Pain, p. 276.

Public Health & Victorian Cemetery Reform

In 1843, the Scottish cemetery designer, John Claudius Loudon, explained that the purpose of a burial ground was to dispose of the dead ‘in such a manner as that their decomposition, and return to the earth from which they sprung, shall not prove injurious to the living.’ [1] A decade earlier, London’s burial grounds had reached breaking point. Death rates were rising within the city due to overcrowding and outbreaks of cholera, tuberculosis, diphtheria, smallpox and typhus. Burial grounds were bursting at the seams, causing one Reverend John Blackburn to remark:

I am sure the moral sensibilities of many delicate minds must sicken to witness the heaped soil, saturated and blackened with human remains and fragments of the dead… [2]

The rate at which burials were growing was mind-boggling. According to one report, many cemeteries around London were burying as many as 11,000 people per acre. To put this in perspective, most cemeteries today accommodate 750–1,000 burials per acre—less than a tenth of what was once considered acceptable. [3]

Bodies were literally crammed on top of one another. Most graveyards contained open pits with rows and rows of coffins exposed to sight and smell. Pit burial was so common in London that two men asphyxiated on the methane and other gases emanating from decomposing bodies after falling twenty feet to the bottom of one such pit in the early 19th century. [4]

For those living nearby, the smell was unbearable, especially during the summer months. The houses on Clement’s Lane, just off the Strand, backed onto the local churchyard, and ‘ran with stinking slime.’ The stench was so overpowering that occupants kept their windows shut all year long. Even the children attending Sunday school could not escape these unpleasant conditions. They learned their lessons as insects buzzed around them, no doubt originating from inside the church’s crypt, which was crammed with 12,000 decomposing bodies. Even after the chapel was closed in 1844, it continued to be used, this time for ‘Dances on the Dead’ (see illustration below), until the bodies were eventually moved to West Norwood Cemetery a few years later. [5]

[Image: illustration of a dance hall above a cemetery]

With this in mind, it’s hardly shocking that people in the 19th century wanted to reform cemeteries. Londoners were up to their noses in blackened corpses and stinking slime. But for the Victorians, this wasn’t just about the aesthetics of living in a city bubbling over with rotting corpses. It was about public health.

During this period, people associated bad odours with disease. It’s easy to understand why. Poor areas where people were jammed together in cramped living quarters would have smelled terrible, and it was the poor who were forced to live near graveyards and open burial pits. Not surprisingly, these areas were also hotbeds for disease. The English reformer, Edwin Chadwick, was particularly concerned with ‘putrid emanations’ from corpses, which he argued were ‘injurious to the health of the living’. [6] He believed that lead coffins were especially dangerous:

The retention of bodies in leaden coffins in vaults is objected to, as increasing the noxiousness of the gases, which sooner or later escape, and when in vaults beneath churches, create a miasma which is apt to escape through the floor, whenever the church is warmed. [7]

According to Chadwick, the Austrian Emperor had banned the use of coffins altogether for this very reason, insisting that ‘all people should be buried in sacks’ for sanitary purposes. The Turks also recognized the dangers of lead coffins, and made it mandatory that pine be used as an alternative as it ‘decays rapidly,’ thus allowing the corpse to return to the earth more naturally. [8]

Chadwick would not get his wish with respect to lead coffins. However, change did come about in the form of cemetery reform. In 1832, Parliament authorized the General Cemetery Company to build a large, park-like cemetery in Kensal Green, a suburb of London. Shortly afterwards, other ‘garden cemeteries’ sprang up outside the city centre: West Norwood (1837), Highgate (1839), Abney Park (1840), Brompton (1840), Nunhead (1840), and lastly, Tower Hamlets (1841). Collectively, these cemeteries are known today as the ‘Magnificent Seven’.

People continued to bury their dead within the city for two decades after the establishment of Kensal Green and the garden cemeteries. By 1852, burials within central London were finally outlawed, and the days of overcrowded graveyards died with their last occupants. In 1885, Britain’s first legal crematorium opened in Woking. It wasn’t until 1968, however, that cremations outnumbered burials. [9]

Today, nearly 73% of people who die in Britain are cremated.

1. John Claudius Loudon, On the Laying Out, Planting, and Managing of Cemeteries, and on the Improvement of Churchyards (1843), p. 1.
2. Edwin Chadwick, Report on the Sanitary Condition of the Labouring Population of Great Britain. A Supplementary Report on the Results of a Special Inquiry into the Practice of Interment in Towns (1843), p. 134.
3. Ibid, p. 135.
4. Ruth Richardson, Death, Dissection and the Destitute (1987), p. 60.
5. Sarah Wise, The Italian Boy: Murder and Grave-Robbery in 1830s London (2005), p. 52.
6. Chadwick, Report on the Sanitary Condition, p. 31.
7. Ibid, p. 135.
8. Ibid, p. 136.
9. I am greatly indebted to Ruth Levitt and her article, ‘A Grave Dilemma’, in BBC History (May 2014) for inspiration and information for this article.