Understanding bodily functions

Published: 10 July 2019

Early healers relied on their senses to detect changes in the body—but over time medical practitioners developed tools and technology to identify signs of illness.

The Egyptian scholar and physician Ali Ibn Ridwan (c. 988 – c. 1061) said:

For your diagnosis ... you should always choose things that are extremely powerful and easy to recognise, and these are what can be perceived by sight, touch, hearing, smell, taste and by the intellect.

Healers throughout history and across cultures used different diagnostic methods. Early civilisations used magical practices such as divination, whereas medieval Europeans used astrology. Medical traditions worldwide including Ayurveda, Unani Tibb and Traditional Chinese Medicine interpret features of the patient’s body such as the pulse and the urine. As well as using their senses, physicians in ancient Greece and Rome and in medieval Europe used instruments such as probes and specula to access hidden parts of a patient's body. 

As science and technology developed, medical practitioners had an increasing array of instruments at their disposal. These not only enhanced their observational skills but also helped them record changes in symptoms and bodily functions over time.

Today, medical practitioners worldwide use their senses in combination with the patient's account of his or her symptoms and a variety of tools and tests to confirm their diagnosis. Measurements such as pulse, blood pressure and temperature are routinely recorded on a patient's hospital chart as indicators of the progress of illness or recovery.

Visible signs of illness

Some signs and symptoms of disease are easy to identify. Yellowing in the eyes can indicate jaundice, and buboes at the joints are an indication of plague.

But diagnosing a disease based on visible symptoms alone can be difficult and unreliable. Many diseases present with similar symptoms, and only an experienced physician who has encountered numerous cases of measles, for example, can confidently differentiate it from another infectious disease that produces a fever and skin rash.

In the past, doctors shared detailed descriptions of symptoms and visual 'atlases' of disease to help them distinguish between similar ailments. Once dissection became widespread at the beginning of the 1800s, pathological anatomy sometimes revealed visible 'lesions' inside the body which confirmed a diagnosis.

Photograph of a radiograph (an X-ray negative) taken by Röntgen of his wife's hand in 1895. Science Museum Group Collection

Body imaging dominated the way disease was diagnosed during the 1900s. X-rays were discovered by Wilhelm Röntgen (1845–1923) in 1895, and had an immediate impact. Other body imaging techniques such as magnetic resonance imaging (MRI), positron emission tomography (PET) and ultrasound were developed over the century, helping medical practitioners look inside the body without having to cut it open.

Some imaging technology allowed them to observe bodily functions—such as the heart’s electrical activity—in real time. Electrocardiograms (ECG) and other machines that visually recorded body function became central to hospital medicine.

Listening to the body

Auscultation is the medical technique for listening to the internal sounds that the body makes. It is an ancient diagnostic practice in which doctors most commonly listen to the sounds of the lungs, heart and bowels.

Doctors or medical students listening to their heartbeats using a multiple stethoscope, 1920s


Auscultation grew out of clinical practice. The term was introduced by René Laennec (1781–1826), the French physician who refined the method while working in hospitals. In 1816 he invented the stethoscope, after being consulted by a young female patient with heart problems.

Bound by the social conventions of the day, Laennec considered it improper to listen to the sounds of her heart by putting his head directly on her chest. Instead, he rolled up a piece of paper into a tube and placed one end on the patient's chest and the other at his own ear. The tube communicated the sounds of her heart more clearly and helped Laennec diagnose the patient's condition. He subsequently developed a robust wooden tube to replace the makeshift paper one and called it a stethoscope—from the Greek words for 'chest' and 'explore'.

Auscultation was a skill that required extensive experience. By the 1850s the stethoscope was in routine use, and learning to listen and diagnose the sounds from the chest and intestines became an important part of a doctor’s training. With amplified sounds, diseases of the lungs, heart and vascular systems could be diagnosed much more easily and reliably.

In the same decade, Laennec's hollow wooden tube design was replaced by the binaural stethoscope, which featured two earpieces connected to a 'bell' placed on the body.

Stethoscopes

Today, a stethoscope worn around the neck is one of the most enduring images of the medical profession. Auscultation is still an important part of physical examination. Doctors and nurses listen to heart rates and for unusual sounds such as heart murmurs, wheezing and crackles in the lungs. Stethoscopes are also used during pregnancy to listen to the baby's heartbeat.

Detecting illness through touch

Palpation is the medical technique of detecting bodily change through touch. Medical practitioners in many traditions feel the patient's body with their hands to detect abnormalities, locate pain or apply an instrument in the right place. In Ancient Greece, doctors recommended palpating the patient's abdomen to detect hardening or pain.

The method went out of fashion in medieval and early modern Europe, when medicine became an academic discipline taught at universities. Doctors in this era were expected to be engaged in intellectual rather than manual labour; they considered hands-on activities such as palpation beneath them. 

Today, palpation is a routine part of medical diagnosis. New technologies supplement hands-on 'haptic' palpation with 'virtual palpation' through computer-aided imaging.

The pulse

The most familiar form of palpation is taking a patient’s pulse. The pulse is a measure of the number of times the heart beats per minute—the heart rate. The easiest way to measure the heartbeat is by using the fingertips to apply light pressure on a major artery close to the surface of the skin. The variation in blood pressure as the heart expands and contracts to pump blood around the body can be felt in the major blood vessels.
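As a rough illustration of the arithmetic involved, and nothing more, the short sketch below scales a beat count taken over a timed interval up to beats per minute. The function name and figures are illustrative only.

```python
# Minimal sketch: count the beats felt at the wrist over a short,
# timed interval, then scale the count up to beats per minute.

def pulse_rate_bpm(beats_counted: int, interval_seconds: float) -> float:
    """Scale a beat count over a timed interval to beats per minute."""
    if interval_seconds <= 0:
        raise ValueError("interval must be positive")
    return beats_counted * 60.0 / interval_seconds

# Example: 18 beats felt over 15 seconds is a pulse of 72 beats per minute.
print(pulse_rate_bpm(18, 15))  # 72.0
```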

Ivory netsuke figure depicting a doctor feeling a patient's pulse, signed Chikaaki, Japan, late 1800s. Science Museum Group Collection

Taking the pulse is a diagnostic practice found in many medical traditions, from Galenic medicine to Traditional Chinese Medicine and Unani Tibb. In the Western tradition, physicians have attempted to quantify this observation since the early modern period.

The Italian physician Sanctorius (1561–1636) invented a ‘pulsilogium’ to count the pulse with the aid of a pendulum, and the British physician Sir John Floyer (1649–1734) introduced the second hand on watches to time the pulse more accurately.

By the 1800s, technological developments meant that the pulse could be measured mechanically, and doctors no longer had to rely on direct touch.  

In 1831 Julius Hérisson developed the 'sphygmomanometer' (or 'sphygmometer' for short), an instrument that displays the pulse beat visually. The German physiologist Karl Vierordt (1818–84) combined Hérisson's instrument with a device to record the movement of the pulse on paper. His 'sphygmograph' allowed doctors to see how the pulse changed over longer periods of time.

Sphygmomanometer apparatus made by Charles Thackray, a surgical and medical instrument maker in Leeds, 1920-1955.

The French physiologist Étienne-Jules Marey improved the device and made it portable. Further developments in the late 1800s and early 1900s led to the cuffs that we use to measure blood pressure today.

Electrical activity in the body

The body uses electricity to help it move, think and feel. The nervous system sends electrical signals throughout the body to individual cells in the muscles, brain and organs. 

In 1887, British physiologist Augustus Waller discovered it was possible to record the electrical activity of the heart from the skin’s surface. He used a photographic plate to record the electrical signals as the heart muscles expanded and contracted.

The electrocardiograph

Electrocardiograph machines (ECG) measure the heart’s electrical activity using electrodes placed on the skin. Information about heart activity is produced as wavy lines called ‘traces’. During the 1900s ECG was primarily used to diagnose heart conditions. Computerised ECG machines now enable continuous heart monitoring.
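As a purely illustrative sketch of how such a trace can be read quantitatively (this is not how any particular ECG machine works), the snippet below takes the times of successive R-peaks, the tall spikes in a trace, and derives an average heart rate from the intervals between them. The peak times are assumed to have been identified already.

```python
# Illustrative only: given the times (in seconds) of successive R-peaks
# read off an ECG trace, the average R-R interval gives the heart rate.

def heart_rate_from_r_peaks(r_peak_times: list[float]) -> float:
    """Average heart rate in beats per minute from R-peak timestamps."""
    if len(r_peak_times) < 2:
        raise ValueError("need at least two R-peaks")
    intervals = [t2 - t1 for t1, t2 in zip(r_peak_times, r_peak_times[1:])]
    mean_rr = sum(intervals) / len(intervals)
    return 60.0 / mean_rr

# Example: peaks roughly 0.8 seconds apart correspond to about 75 beats per minute.
print(round(heart_rate_from_r_peaks([0.0, 0.8, 1.6, 2.4, 3.2])))  # 75
```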

Dutch physiologist Willem Einthoven (1860–1927) was inspired by Waller's experiments. In 1902 he developed an instrument to record traces of the heart’s activity. His string galvanometer was critical to the manufacture of early electrocardiograph machines in 1908.

Early ECG machines were cumbersome and hard to use. Einthoven's first machine required five people to operate. The person being monitored had to place each limb in a bucket of saltwater, so it was impractical for patient use. Improvements such as electrodes attached to the skin’s surface meant machines became smaller, portable and more reliable.

During the 1920s, German psychiatrist Hans Berger (1873–1941) developed the electroencephalograph (EEG) to detect electrical activity in the brain from the surface of the skull. EEG machines can identify brain conditions such as epilepsy and monitor changes in brain activity during sleep or coma.

In the 1940s, scientists tried using EEGs to diagnose criminal tendencies and some mental health conditions. Some thought EEGs could be used in eugenics to screen those with such disorders and prevent them from reproducing.

EEGs were also used in 'lie detector' machines, but by the 1960s most psychiatrists and psychologists accepted EEG data could not reliably diagnose dishonesty or criminality. 

EEGs have also been used in sleep research. They help us understand the relationship between deep sleep, light sleep, dreaming and wakefulness. They also help legally define 'brain death'.

Body temperature

Body temperature is an essential measure of human health. Before the development of reliable thermometers, medical practitioners relied on touch to tell them if their patient had an elevated body temperature or fever. 

Sanctorius was the first person to put a scale on a thermometer, allowing him to measure patients' absolute temperatures. Hermann Boerhaave (1668–1738) was perhaps the first physician to use a thermometer at his patients' bedside.

But early thermometers were very inaccurate, because the expansion of liquids was not well understood and glass-makers could not produce thin glass tubes of uniform bore. Gabriel Fahrenheit (1686–1736) was the first person to make a thermometer filled with mercury. The more predictable expansion of mercury, combined with better glass-working techniques, led to a much more accurate thermometer.

In 1742 the Swedish scientist Anders Celsius (1701–44) developed the scale we still use today, with 0 degrees as the freezing point of water and 100 degrees as its boiling point. Yet the instrument did not become part of everyday medical practice until the 1800s. 
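Because temperatures in this story are quoted on both the Fahrenheit and Celsius scales, the small sketch below shows the standard conversion between them. It is included only as a reader's aid.

```python
# Standard conversions between the Celsius and Fahrenheit scales.

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9.0 / 5.0 + 32.0

def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32.0) * 5.0 / 9.0

# Example: 37 degrees Celsius, close to normal body temperature, is 98.6 degrees Fahrenheit.
print(round(celsius_to_fahrenheit(37.0), 1))  # 98.6
```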

Portrait of Carl Wunderlich. Wellcome Collection (CC BY)

In 1868 the German physician Carl Wunderlich (1815–1877) published the results of thermometric measurements on more than 25,000 patients. He had recorded the temperatures of patients at Leipzig University Hospital as numbers and curves and established the range of 36.3 to 37.5 °C as normal human body temperature. 

Wunderlich also observed that specific diseases had their own characteristic fever curves. His work gave medical practitioners a new way to diagnose disease, and hospital patients' temperatures began to be recorded at regular intervals and displayed as temperature curves on a chart by their bed. 

But thermometers, however accurate, only give a localised spot temperature. Infrared thermography is a technique that uses infrared cameras to create heat images showing the distribution of temperature over the whole body. Thermal imaging can be used to screen for certain conditions such as inflammatory diseases and cancer.

 

Urine analysis

A physician studying the urine of a patient, 1763

Since ancient times, medical practitioners have observed the colour, smell and even taste of urine as a means of diagnosis.

The Hindu physician Susruta (c.500 BCE) observed that black ants were attracted to the sugar in some people’s urine, a characteristic of the disease now known as diabetes mellitus. Greek physician Hippocrates (460–355 BCE) noted that sediment in the urine increased as a fever worsened.

His Roman counterpart Galen (130–210) ascribed frequent urination, which he called 'diarrhoea of the urine', to kidney disease. The Islamic physician Ibn Sina (980–1037) provided a comprehensive list of symptoms for diabetes. Among them he noted that, when evaporated, the urine left a sweet residue like honey.

 

Gilles de Corbeil (1165–1213), royal physician to King Philippe-Auguste of France, introduced the matula or jorden—a glass vessel in which a physician could assess the colour, consistency, and clarity of urine. The matula became an iconic symbol for physicians in the same way that the white coat and stethoscope have in more recent times.

Colour wheel for uroscopy from 'Fasciculus Medicinae'. Biblioteca Europea di Informazione e Cultura (Public domain)

Published in 1491, Fasciculus Medicinae, attributed to Johannes de Ketham, was a bestselling medieval compilation of medical treatises. The book included a colour wheel of 20 shades of urine, with descriptions of what each indicated about the patient’s health. 

By the 1600s, the popularity of uroscopy (urine analysis) meant that it was taken up by quacks, charlatans and lay healers as well as physicians.

Practitioners claimed they could tell the age and sex of a patient from their urine alone, as well as predict the course of a disease. Witch hunters mixed urine with nail clippings to distinguish witches from non-witches. Some even claimed to predict the future in a related practice called uromancy.

In 1637, the English physician Thomas Brian published a small book railing against the exaggerated claims made for uroscopy and ridiculing those who practised it as 'pisse prophets'.

Uroscopy declined after the 1600s, although urine analysis remained a valuable diagnostic tool for doctors.

Pregnancy tests

Urine analysis has formed the basis of pregnancy testing for centuries. Ancient Egyptian texts describe an early urine-based test in which the woman urinated over a mixture of wheat and barley seeds. If barley grew, a boy was expected; if wheat grew, a girl. No growth meant no pregnancy.

From the Middle Ages onwards, people tried to detect ‘something different’ in the urine that could confirm pregnancy—or the lack of it. They looked for changes in its appearance, smell and taste, the way it reacted with other substances, and the colour with which urine-soaked cloth burned. 

In the 1930s, South African researchers Hillel Abbe Shapiro and Harry Zwarenstein discovered that they could successfully test for pregnancy using the Xenopus frog. They injected the frog with a woman’s urine and put it in a jar with a little water. If the frog laid eggs within a day, it indicated that the woman was pregnant. After ten years of trials, Shapiro and Zwarenstein reported the diagnosis was correct in over 98% of cases. The Hogben test (named after Shapiro’s advisor) became the standard method of testing pregnancy worldwide until the 1960s. 

Early hormone-based pregnancy tests were only carried out in hospitals and doctors' surgeries, where there was deemed to be a medical need. The first home testing kit was marketed in 1978. It looked for the hormone human chorionic gonadotrophin (hCG) and more closely resembled a chemistry set than today's pregnancy test strips. The results took around two hours.

Pregnancy tests are now relatively inexpensive and widely available over the counter. They measure the presence of hCG in the urine to indicate within minutes whether a woman is pregnant.

The Prepurex pregnancy test kit (1980) was used for early confirmation of pregnancy. It contains pipettes, sampling tubes, a mixing plate and instructions.

 
