It's hard to imagine a world without computers. They are found in every part of our lives, not just on our desks and in our pockets, but embedded in every conceivable industry.
It seems as if they can do anything—that they can be programmed to solve any problem we can imagine. But how did we get here?
Computing before computers
When we talk about computers today, we usually mean electronic devices that can be programmed to carry out many different tasks—better described as universal stored-program computers. The first were built in the 1940s. But there were ‘computers’ before this.
In some cases, the word ‘computer’ simply meant a person who carries out repetitive calculations. Many of these ‘computers’ were women. One specialist company, the Scientific Computing Service, founded in 1937, employed mathematically trained women using calculating machines to solve a wide range of military and scientific problems.
In other cases, one-off devices computed specific problems. But unlike today’s computers, these single-purpose machines could not be reprogrammed to handle different problems.
Then came Alan Turing.
Alan Turing's universal computer
In 1936 the mathematician Alan Turing wrote a seminal mathematics paper, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, which came to be seen as a theoretical basis for today’s computers.
In it he imagined a single machine that could compute any problem, effectively uniting all human or problem-specific ‘computers’ into one universal device.
A few years later, these modern computers started to be built in real life and, by the mid-1950s, were becoming common in large institutions, companies and university departments. Such early machines, like the Automatic Computing Engine (ACE), filled entire rooms.
Models of weather systems, the night sky and the terrain of far-off places, as well as the flight of missiles, the enrichment of uranium and countless other scientific and technological problems, could be described in code and manipulated by computers.
Ten years later, minicomputers were introduced, bringing the power of computers into laboratories and offices. By the 1980s, desktop computers as we know them today had started to become commonplace.
But to really understand the power of computers, we need to look much further back than Alan Turing.
Ada Lovelace and Charles Babbage
As an aristocratic woman working in the male-dominated fields of science and mathematics in the 19th century, Ada Lovelace was highly unconventional.
As a child, she had been given a scientific education and was introduced by her mother, the educational reform campaigner Annabella Milbanke, to some of the most significant scientists and mathematicians of the day.
As an adult in 1839, Lovelace took the eminent mathematician Augustus de Morgan as her tutor. She also became close friends with the mathematician and science writer Mary Somerville, who acted as her mentor.
In 1843, Lovelace published an account of Charles Babbage’s analytical engine in which she set out its possibilities as a mechanical general-purpose device.
Babbage’s analytical engine was more than a fast calculator—thanks to Ada Lovelace’s crucial insight, it offered the possibility of enormous computational power.
Lovelace's conceptual breakthrough
In her description, Lovelace explained that Babbage's device could do more than manipulate numbers as quantities. It could become a general-purpose machine that represented numbers as abstract items, such as symbols or musical notes. She concluded by commenting on the deep theoretical power of what we might now call computational mathematics.
The engine can arrange and combine its numerical quantities exactly as if they were letters or any other general symbols.
Ada Lovelace, on Babbage’s analytical engine (1843)
Why was this so important?
All computers, at their heart, are very fast calculators. They perform simple operations on the binary digits ‘one’ and ‘zero’ very quickly and in huge quantities. But fast calculators, while important, would not have changed the world.
The power of modern computers rests on the concept that numbers can represent other things—in fact, anything we choose.
Take music. We can represent musical notes, as well as factors such as speed, loudness and tone, using numbers. We can program these numbers into computers, and the computer can make music, distort it, transpose it, or do pretty much anything else we want with it.
The computer is just performing simple operations on numbers. But we have chosen for those numbers—in this example—to represent music.
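The idea above can be sketched in a few lines of code. This is a minimal, illustrative example, not anything from Lovelace's or Babbage's work: notes are represented as MIDI-style pitch numbers (an assumed convention in which 60 is middle C), and transposing a melody becomes nothing more than adding a constant to every number.

```python
# Numbers standing for music: a sketch of Lovelace's insight.
# Pitches follow the MIDI convention (60 = middle C); the melody
# and note names here are illustrative assumptions, not from the source.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def name_of(pitch: int) -> str:
    """Turn a numeric pitch back into a human-readable note name."""
    return f"{NOTE_NAMES[pitch % 12]}{pitch // 12 - 1}"

def transpose(melody: list[int], semitones: int) -> list[int]:
    """'Transposing' music is just adding a constant to every number."""
    return [pitch + semitones for pitch in melody]

melody = [60, 62, 64, 65, 67]        # C4 D4 E4 F4 G4
shifted = transpose(melody, 2)       # up a whole tone
print([name_of(p) for p in shifted])  # D4 E4 F#4 G4 A4
```

The machine only ever adds integers; it is our chosen encoding that makes the result count as music.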
This conceptual leap made by Lovelace, that calculators gain enormous power if the numbers they manipulate are symbolic of other things, revolutionised computing and underpinned the technological development of modern computers.
A new, a vast, and a powerful language is developed for the future use of analysis, in which to wield its truths so that these may become of more speedy and accurate practical application for the purposes of mankind than the means hitherto in our possession have rendered possible.
Ada Lovelace (1843)
Computers and mathematics today
It's remarkable how far we have come since Ada Lovelace’s pioneering insights.
Today, we have access not only to incredibly powerful computing technologies, but to software which enables the most abstract ideas as well as real-world problems to be modelled and manipulated.
The first digital computers required great skill to program.
Nowadays, packages such as MATLAB, Mathematica and Maple, each first released in the 1980s, let practitioners in all walks of life grapple with the toughest mathematical ideas.