From bloodletting to transfusions, as our understanding of the nature of blood has changed, so has the role it plays in medicine.
We know that if we break the skin we bleed, and if we lose too much blood we die. From the earliest times, people have understood the importance of blood for life. But our knowledge of what blood is and the role it plays in the body has changed over the centuries.
The humoral system of medicine, practised in Europe for hundreds of years, defined blood as one of four vital bodily fluids. In order to maintain good health and treat illness, it was believed that the four substances—or humours—needed to be kept in balance.
Developments in anatomy during the Renaissance began to challenge the humoral system. As physicians and researchers carried out further investigations and experiments on blood, their understanding of its nature and function began to change.
At the end of the 1800s, laboratory science introduced a new (microscopic) view of blood that led to our modern scientific understanding of the fluid and its function in the body.
What is the function of blood?
Modern medicine has identified many of the essential functions of blood that keep us alive and healthy:
- It supplies oxygen to the cells and tissues
- It transports essential nutrients and chemicals, such as glucose and hormones, around the body
- It removes waste materials such as carbon dioxide and lactic acid
- It helps regulate acidity levels and body temperature
- It defends the body against infections and foreign materials
- It creates clots and scabs to stop bleeding and protect wounds from infection
What is blood?
The modern understanding of blood is based on scientific analysis of its constituents. Thanks to the microscope, scientists are able to observe these constituents at the cellular level. The electron microscope has given us unique and beautiful images of the individual elements that make up our blood.
Bloodletting
Blood plays an essential role in our bodies. Losing too much of it can result in shock and, ultimately, death. So it's perhaps surprising that bloodletting was the most common procedure performed by surgeons and physicians for almost 2,000 years.
Bloodletting was a treatment developed as part of the humoral theory of medicine. According to this model, humours were essential liquids within the body, identified as blood, phlegm, black bile and yellow bile.
These were in turn associated with the fundamental elements of air, water, earth and fire. It was further proposed that each of the humours was associated with a particular season of the year, during which too much of the corresponding humour could exist in the body. Blood, for example, was associated with spring.
A good balance between the four humours was considered essential to retain a healthy body and mind, as imbalance was thought to result in disease. Such notions of internal balance have parallels in other medical traditions, notably Ayurveda, Unani Tibb and Traditional Chinese Medicine.
The treatments for disease within humoral theory were concerned with restoring balance, either by removing an excess of one humour or promoting the production of another. Some involved simple changes to diet and lifestyle. But more aggressive treatments included purging the body with substances to induce diarrhoea and vomiting, or cutting open a vein to let blood out—a process known as 'breathing a vein'.
The Roman physician Galen (129–216) was an enthusiastic advocate of bloodletting. He used it to treat fevers, apoplexy (stroke) and headaches. But over the centuries, the development of a complex holistic humoral system meant that bloodletting was used to treat a wide variety of symptoms and conditions.
Leeches
Leeches have a long association with bloodletting. These blood-sucking worms, when applied to the skin, can draw out several times their own body weight in blood. The use of leeches in Europe peaked between 1830 and 1850, then fell into decline.
During the 'leech craze' of the 1800s, leech collectors (who were usually women) would wade into ponds with their skirts raised to attract the leeches to their bare legs. Some used animals—such as horses that were too old to work—instead of their own bodies to catch the leeches.
While this work was not physically demanding, leech collectors suffered from the loss of blood and frequent infections they caught from the leeches.
Today, leeches are used in surgery to help heal skin grafts and restore blood circulation to wounds. Their usefulness is no longer attributed to the amount of blood they can withdraw, but to an anticoagulant in their saliva which prevents clotting. Leeches used in medicine today are raised in sterile conditions on specialised 'leech farms'.
Blood circulation
According to the humoral tradition, there were two separate blood systems in the body. One carried purple, 'nutritive' blood made in the liver from digested food. This purple blood travelled to the heart, where it was heated then pumped through the veins to the rest of the body.
The other blood system circulated scarlet, 'vivifying' (or 'vital') blood made in the lungs from the air we breathe and distributed it around the body via the arteries.
In the 1200s, the Syrian physician Ibn al-Nafis (1213–88) discovered the pulmonary circulation of the blood (between the heart and the lungs). But his discovery remained largely unknown in Europe until the 1600s, when the English physician William Harvey (1578–1657) devised experiments to prove the circulation of blood.
Harvey made a meticulous study of the anatomy of the chest and concluded that the purpose of the heart was not to heat the blood, but to pump it around the body via the arteries. He proposed that, rather than being burned up and replaced by the liver, blood returned to the heart through the veins.
Harvey’s radical suggestion placed the heart at the centre of a single system of arteries and veins that continuously pumped blood around the body. He carried out extensive experiments and animal dissections to prove his theory. He calculated the volume of blood flowing through the heart, demonstrating that the body was simply not capable of producing or consuming that amount so rapidly, and that the blood therefore had to be circulating.
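Harvey's own estimates were deliberately conservative, but a version of his arithmetic with illustrative modern round figures (ours, not his) shows why the conclusion was unavoidable:

\[
\underbrace{60\ \text{ml per beat}}_{\text{heart's output}} \times \underbrace{70\ \text{beats per minute}}_{\text{pulse}} \approx 4\ \text{litres per minute} \approx 6{,}000\ \text{litres per day}
\]

With only about five litres of blood in the body, the liver could not be making, nor the tissues consuming, anything like that amount; the same blood must be travelling round and round.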
Harvey published his findings in a 1628 work called Exercitatio Anatomica de Motu Cordis et Sanguinis in Animalibus (Anatomical Exercise on the Motion of the Heart and Blood in Animals); an English translation appeared in 1653.
He already knew, from his teacher Fabricius, that the veins contained one-way valves which helped the blood return to the heart. But he couldn't determine how the blood moved from the arteries to the veins to complete the circuit. He surmised that this happened via tiny blood vessels—so small that they couldn't be seen by the naked eye. The existence of these capillary blood vessels was not confirmed until later in the 1600s, after the microscope was invented.
Despite support from the College of Physicians for his theory, many of Harvey's colleagues found it hard to accept his findings because they challenged one of the key beliefs of humoral medicine—that the liver was the origin of venous blood.
But Harvey had spent years experimenting on and investigating the anatomical structures of the heart and blood vessels, and his evidence was irrefutable. By the time of his death in 1657, the circulation of the blood had become established anatomical fact.
Blood transfusion
A better understanding of how blood circulated around the body helped surgeons treat wounds, especially in battle. They developed tools and techniques—such as tourniquets and ligatures—to limit blood loss by restricting the flow of blood and sealing damaged blood vessels.
Surgeons, particularly military surgeons, were only too aware of the dangers of excessive blood loss. As far back as the 1400s, attempts had been made to transfuse blood from animal to human and from person to person. Many of these experiments resulted in death for the person receiving the blood.
It wasn’t until the early 1900s that an Austrian doctor called Karl Landsteiner (1868–1943) demonstrated in his laboratory that not all blood was the same.
He observed that, if mixed together in a test tube, human and animal blood formed clumps (agglutinated). When Landsteiner repeated the process with human blood from different donors, he discovered that some samples also agglutinated when mixed.
After experimenting with different combinations of blood, he was able to identify three distinct groups: A, B and C (now known as O). A fourth group, AB, was discovered a year later by his students Adriano Sturli and Alfred von Decastello.
In later experiments, Landsteiner discovered the 'Rhesus factor', which further classified blood as Rh(D) positive or negative. Combined with the ABO groups, this gives the eight main blood groups, only some of which can be safely mixed with others.
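The practical rule that falls out of these discoveries is simple: a donor's red cells are compatible only if they carry no ABO antigen the recipient lacks, and Rh(D)-positive cells can only go to Rh(D)-positive recipients. The minimal sketch below (the function name and data layout are our own illustration; real cross-matching involves far more than this single check) captures that logic:

```python
# A minimal sketch of ABO/Rh(D) red-cell compatibility.
# Illustrative only -- real transfusion practice involves antibody
# screening and cross-matching beyond this simple antigen rule.

ABO_ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def is_compatible(donor: str, recipient: str) -> bool:
    """True if red cells from `donor` (e.g. 'O-') can be given
    to `recipient` (e.g. 'AB+') under the ABO/Rh(D) rule."""
    d_abo, d_rh = donor[:-1], donor[-1]
    r_abo, r_rh = recipient[:-1], recipient[-1]
    # The donor's cells must carry no ABO antigen the recipient lacks,
    abo_ok = ABO_ANTIGENS[d_abo] <= ABO_ANTIGENS[r_abo]
    # and Rh(D)-positive cells may only go to Rh(D)-positive recipients.
    rh_ok = d_rh == "-" or r_rh == "+"
    return abo_ok and rh_ok

assert is_compatible("O-", "AB+")      # 'universal donor' red cells
assert not is_compatible("A+", "O-")   # would agglutinate
```

This is why group O negative blood is stocked for emergencies: carrying neither A, B nor Rh(D) antigens, it can be given to a recipient of any group.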
It was only in the wake of Landsteiner's discoveries that blood transfusion became a viable medical treatment. Tests were developed to identify blood groups, and by the First World War (1914–18) blood transfusions could be carried out in the field by identifying compatible donors and using portable transfusion equipment to directly transfer blood from one person to another.
Geoffrey Keynes (1887–1982), a British surgeon, worked to develop a portable machine for storing blood so that a compatible donor did not have to be present for a blood transfusion. In the 1930s, blood plasma and red blood cells were separated so they could be stored for longer periods.
By the Second World War, new techniques of refrigeration and plasma storage had led to the creation of blood banks, and emergency transfusions were available for any patient who needed them.
Blood analysis
We now take for granted that our doctor will order a battery of blood tests to help with diagnosis. Blood samples are tested for many reasons, from assessing organ function and screening for genetic conditions, to identifying the presence of toxins, infections or vitamin deficiencies.
Although blood analysis has been possible since the late 1800s, it was initially a lengthy and complex process that had to be carried out by hand, often producing unreliable results.
This changed in the 1950s, when automated machines revolutionised laboratory diagnosis. One of the first was the AutoAnalyzer, invented by the American biochemist Leonard Skeggs (1918–2002). His machine was able to perform one blood test per minute.
Skeggs built a prototype of the AutoAnalyzer in his home workshop in 1951. Only 50 of the machines were sold in 1957, but by the end of the 1960s automated machines were used in every laboratory that tested blood samples.
Suggestions for further research
Books
- Z Cope (ed.), History of the Second World War: Surgery, 1953
- M Duke, The Development of Medical Techniques and Treatments: From Leeches to Heart Surgery, 1991
- M Harrison, Medicine and Victory: British Military Medicine in the Second World War, 2004
- W H Marsh, Automation in Clinical Chemistry, 1963
- K Pelis, ‘Transfusion, with teeth’ in R Bud, B Finn and H Trischler (eds) Manifesting Medicine: Bodies and Machines, 1999 (pp 1–29)
- R Porter, The Greatest Benefit to Mankind, 1997
- S Kuriyama, 'Interpreting the History of Bloodletting', The Journal of the History of Medicine and Allied Sciences, 1995 (pp 11–46)
- S Kuriyama, 'Blood and life' in The Expressiveness of the Body and the Divergence of Greek and Chinese Medicine, 1999 (pp 195–231)
- D Starr, Blood: An Epic History of Medicine and Commerce, 1998