From a powerful source of health and well-being to potential cancer risk, our understanding of the effects of sunlight—and specifically UV radiation—on the body has evolved markedly over the last century.
When was UV radiation discovered?
In 1800, British astronomer William Herschel attached coloured filters to his telescope to safely observe the sun.
He noticed that the telescope’s temperature altered when different colour filters were applied.
Intrigued, Herschel used a prism to split white light into a spectrum and placed a thermometer at different points along it. The temperature rose as the thermometer moved towards the red end, and Herschel saw that it was highest just beyond red, where there was no visible light.
This was infrared radiation.
One year later, inspired by Herschel’s findings, German scientist Johann Ritter looked to see if there were other invisible rays in the spectrum.
To do this, he exposed silver chloride to the spectrum. Silver chloride turns black under light, and Ritter found this occurred fastest beyond the violet end of the spectrum.
The ‘chemical rays’ to which Ritter attributed this phenomenon would later become known as ultraviolet.
The health benefits of ultraviolet light
Later in the 19th century, scientists made new discoveries about the nature of sunlight and its effects on the human body. Their findings positioned the sun as a powerful source of health and well-being.
In 1882, German physician and scientist Robert Koch discovered that tuberculosis was caused by a bacterium (Mycobacterium tuberculosis), which died when left in sunlight.
In his presentation to the Berlin TB Congress in 1890, Koch said:
The tubercle bacillus is killed quite rapidly by light. A few minutes' to several hours' direct sunlight kills. Diffuse sunlight, though slower, gives the same result. Tubercle cultures set by the window die in five to seven days.
But Koch was not the first to suggest that sunlight might be bactericidal. In 1877, chemists Arthur Downes and Thomas Blunt had shown that ultraviolet light had a destructive effect on bacteria.
Then in 1893, Danish physician Niels Ryberg Finsen found that lupus vulgaris, a tuberculosis infection of the skin, responded well to sunlight. Later, he advocated sun baths for all forms of tuberculosis.
By 1903, Finsen had treated several hundred tuberculosis patients with ultraviolet radiation from the arc lamps he had invented, an endeavour for which he was awarded the Nobel Prize.
The arc from the lamp emitted light with a spectrum similar to sunlight's. The rock crystal lenses the lamps used were specially designed to transmit UV while filtering out other wavelengths, and the treatment proved very effective.
Sanatoriums offering heliotherapy (the treatment of illness with sunlight) sprang up across Europe and America, becoming a mainstay of treatment for surgical and cutaneous tuberculosis and remaining so until the mid-20th century.
Discovering the link between sunlight and Vitamin D
By the middle of the 19th century, rickets, a serious condition affecting bone development in children, had become a scourge of poor urban populations, but its causes remained a mystery.
In 1890, British doctor and missionary Theobald A. Palm conducted a survey of fellow missionaries in Asia and Africa.
Palm's study revealed that the rate of rickets was higher in urban areas than in rural regions that received more sunlight. He began advocating therapeutic sunbathing for children with the disease.
It was years before Palm’s theory was validated. In 1920, physician Edward Mellanby linked rickets to dietary deficiency and after the First World War, British microbiologist Harriette Chick found that children with rickets responded well to a combination of good nutrition and sunbathing.
Identifying Vitamin D
Eventually, various research groups concluded that an unknown vitamin was at work. This later became known as Vitamin D.
Fish oil is one good source of Vitamin D, but it can also be generated in the skin, provided the skin receives enough energy from sunlight to do so.
There were two apparent solutions to the rickets problem then: adequate sunlight, and a diet providing sufficient Vitamin D.
But urban areas were mostly dark and densely populated, with smog and smoke blocking the shortest wavelengths of sunlight (those most effective in battling disease). To really benefit from sunlight, people with rickets would need to leave the city or receive artificial sunlight therapy.
If a parent wanted their child to remain rickets-free but did not live in a sunny climate, the more practical option was to ingest fish oil instead.
Could sun lamps have benefits for general health?
‘What is this lamp?’ Uncle Matthew asked Davey, who was still clad in the exiguous dressing-gown which he had put on for his sun-bath.
‘Well, you know how one can never digest anything in the winter months.’
‘I can, damn you,’ said Uncle Matthew. […]
‘You think you can, but you can’t really. Now this lamp pours its rays into the system, your glands begin to work, and your food does you good again.’
‘The Pursuit of Love’ (1945, Nancy Mitford)
Across the Atlantic, one of the most controversial proponents of light therapy was John Harvey Kellogg.
Famous for his breakfast cereal, Kellogg was also a doctor with a keen interest in nutrition, exercise and environmental treatments. He created a cabinet light bath to treat patients with a variety of ailments, including rheumatism, diabetes and anaemia.
Kellogg claimed that the artificial sunlight caused the skin to fill with blood and thus draw blood away from swollen areas and sick organs. The blood was then ‘fixed in the skin’ by cold water.
Kellogg’s theories were eventually disproved, but light baths like his were widely used in America and Europe both as medical treatment and an aid to relaxation.
By the 1920s, a tanned complexion had come to be a desirable sign of health and well-being for many people in Europe and America. This was thanks, in part, to the dissemination of medical ideas about sunlight.
If the weather was gloomy, or funds did not allow for a trip to warmer climes, sun worshippers could acquire a tan in the comfort of their home by means of a domestic sun lamp. Many models were available and most promised users both health and beauty.
Artificial sun lamps are still in use today, but they reached peak popularity in Europe and America in the 1920s and 1930s. Medical professionals now warn that the UV rays emitted by sun beds can be dangerous.
When were the risks of UV exposure first known?
Heliotherapy had firmly established the idea of the sun as a source of good health by the early 1900s, but it was not without naysayers.
In 1905, the New York Times ran an editorial lamenting the 'modern superstition that sunshine is always desirable':
They do it under the queer delusion, for which there is no foundation of fact whatever, that thus they are getting back to nature and laying in a stock of health for future use. The truth is that they are taking rather desperate chances of wrecking what health they have and are storing up large quantities of future trouble [...] it is full time that the utter irrelevancy of ‘tan’ to health should be generally understood.
New York Times (1905)
Protection from the Sun
Though the sun was seen by many as a source of well-being, it could still harm the body in a range of ways, from burning the skin to causing sunstroke or hurting the eyes.
This, of course, was nothing new. For centuries human beings have sought to shield themselves from the harmful effects of sunlight, devising protective attire and medicines to do so.
When did we first connect UV rays with skin cancer?
Today, there is a recognised causal link between exposure to sunlight and skin cancer, but ultraviolet rays have long been suspected of having damaging effects.
In 1894, dermatologist Paul G. Unna linked skin cancer in sailors with exposure to ultraviolet light in his paper 'Carcinom der Seemannshaut' ('carcinoma of sailors' skin').
Just a few years later, William Dubreuilh undertook an epidemiological study of skin cancer in rural labourers, finding that it was most common amongst those routinely exposed to the sun.
Further evidence emerged in 1928, when George Findlay induced skin cancer in mice by exposing them to mercury arc radiation.
By mid-century, a trickle of health warnings began to creep into popular consciousness and, slowly but surely, attitudes began to change. In 1963, an article in Time magazine reported that:
There is undeniable evidence that the effects of the sun are cumulative and at some point irreversible. The evidence is clear that chronic exposure to sunlight can be one of the major factors in the production of precancerous and cancerous conditions of the skin.
By the 1980s, attitudes to the sun had become almost panicked.
Reporter Matt Clark wrote in 1982 that almost all of the 400,000 new skin cancer cases 'can be blamed on overexposure to the sun.' At the World Congress on Cancers of the Skin in 1983, dermatologist Fred Urbach warned, 'even one day’s exposure can cause damage.'
How do we understand the Sun’s risk and benefits today?
Though fears have slightly subsided since the 1980s, skin cancer is still a huge global health concern.
The World Health Organization (WHO) estimates that between 2 and 3 million non-melanoma and 132,000 melanoma skin cancers occur globally every year. This, WHO suggests, is exacerbated by the depletion of the ozone layer.
On the other hand, the NHS recommends natural sunlight and nutrition to ensure the body gets enough Vitamin D, and around the world, ultraviolet light is used to disinfect hospitals and clinics.
Perhaps one of the most well-known medical applications of sunlight is the treatment of Seasonal Affective Disorder (SAD), a complex depressive illness which appears to be triggered by lack of sunlight in winter and is typically treated with a lamp simulating natural light.
Sunlight, it seems, still has the power to heal and to nourish the body.