Personal computers are an indispensable part of modern life—but the mouse, desktop and graphics we're so familiar with are relatively recent inventions. How did they come to define our experience of home computing?
When were the first home computers used?
Personal computing began in the mid-1970s with the launch of the affordable Altair 8800, a kit computer designed by American electronics hobbyist Ed Roberts in his garage.
Integrated circuits, and in particular the microprocessor, a complete processor on a single silicon chip, had made it possible to build a computer, with keyboard and monitor, that would sit on an ordinary office desk. Thanks to an enthusiastic write-up in Popular Electronics, Roberts sold thousands of kits in the first month, having expected the market to be no more than a few hundred.
The Altair was wonderful for hobby engineers like Roberts himself, who loved putting the device together and programming it to carry out simple tasks. They formed a dedicated community, swapping applications and know-how.
But interacting with the Altair meant working with banks of switches and lights and typing out complex codes, which limited its appeal beyond the hobbyist community.
The key features of an accessible computer interface
In the more serious world of research computing, work had already begun on the question of how to improve the interface between human and machine.
In December 1968, at the Fall Joint Computer Conference in San Francisco, Douglas Engelbart of the Stanford Research Institute had demonstrated almost all the features of modern personal computing, including multiple windows, hypertext, graphics and the computer mouse.
Known to computer history as 'The Mother of All Demos', Engelbart's presentation was the first to combine all these elements in a single system of hardware and software.
The computer mouse
One of the key design innovations that improved the accessibility of computers was the humble mouse, designed by Douglas Engelbart and built by his colleague Bill English.
In early trials against other pointing devices, such as light pens and joysticks, the prototype mouse proved superior to every other available form of screen navigation. English later improved on the design by replacing its two small wheels with a rolling ball.
On 1 July 1970, the Xerox Palo Alto Research Center (PARC) opened at the heart of what would become Silicon Valley.
Several of Engelbart’s colleagues moved there to work on developments in personal computing, including the Xerox Alto. Completed in 1973, this machine had a graphical user interface and many other advanced features.
The graphical user interface
One of the Alto's key developments was its graphical user interface (GUI).
Its programs allowed the user to navigate around the screen with the mouse, clicking on files. This was an early version of the kind of graphical navigation we're used to today.
This was a big change from the command-line interfaces of other computers, which required the user to type text instructions (commands) to run programs.
The Alto was never developed as a commercial product, but many in the computer industry visited Xerox PARC to see it put through its paces.
One of these was Steve Jobs. Jobs had founded Apple Computer with Steve Wozniak in 1976; both were college drop-outs who, like Ed Roberts, initially worked from a garage.
What was the first commercial computer with a graphical user interface?
By 1979, the Apple II, designed by Wozniak, was competing with machines such as the Commodore PET. An early spreadsheet application, VisiCalc, made the Apple II popular with business users, and the machine was also one of the first to store data on 5¼-inch floppy disks.
But it had nothing like the graphical user interface Jobs had seen on the Xerox Alto.
Teams of researchers at Apple set to work to launch a commercial computer with a GUI: in 1983 they hit the market with the Lisa.
But although the Lisa received favourable reviews (it was said to be 'goof-proof'), it was never a commercial success. Even for the business users it was designed for, it was too expensive ($9,995) and limited in its suite of applications.
What was the impact of the Macintosh computer?
It was the Macintosh, launched the following year, that caught the imagination of home and business users, and with its 'MacPaint' software won the hearts of artists and designers.
It was also one of the first commercially available personal computers to use the more compact and robust 3½-inch floppy disks for storage.
The Apple Mac launch
On 22 January 1984, millions of Americans were glued to their TV sets as the Washington Redskins took on the Los Angeles Raiders in the Super Bowl, American football's premier annual event.
In one of the ad breaks, without any introduction, they saw a female athlete in bright red shorts run through a crowd of hopeless, grey-clad workers and hurl a hammer through the face of 'Big Brother' on a giant screen.
"On January 24," intoned the concluding voiceover, "Apple Computers will introduce Macintosh. And you'll see why 1984 won't be like '1984'."
The ad, costing over $1m and directed by Ridley Scott, grabbed the attention of viewers. The product itself had novel features that made it very attractive in comparison with other machines on the market.
Apple's extravagant launch commercial for the Macintosh cocked a snook at the wealth and power of IBM.
The computer giant had come late to the personal computer party but in 1981 had marketed the IBM PC with great success, setting a standard by which other computer companies were measured.
The Macintosh's user-friendly advantages held some sway—for those who could afford one—until Microsoft developed the Windows operating system for the PC.
What is the legacy of these machines today?
More than 30 years after the launch of the Macintosh, GUIs remain standard on every personal computer, continuing to make computing accessible to almost everyone.
Touchscreens and tablets still echo the philosophy of the early pioneers who saw that people didn't want to learn a new language in order to talk to their computers.