Charting New Territory: The Microcomputer and the Development of Personal Computing

The computer is ingrained in the very fabric of modern society: you could be viewing this on a cell phone, tablet, or laptop computer, via projection or on an electronic display. Yet computers small enough to fit on a desk appeared only about thirty years ago.

The first working computers were enormous, appearing during the Second World War as Allied and Axis scientists raced to build cipher machines, code-breaking machines, and calculators. Alan Turing, who worked at Bletchley Park to defeat the German Enigma machine, had already described a theoretical model for general-purpose computing, that is, computing not limited to a single task. One of the first electronic computers in this general-purpose tradition was ENIAC (Electronic Numerical Integrator and Computer), designed during the war to calculate artillery firing tables and completed shortly after it ended. ENIAC weighed 27 tons and took up about 1,800 square feet, roughly the size of an average American house.

After the Second World War, the first commercially produced computers made their way into American businesses and universities. These were massive mainframe computers that took up entire rooms and were programmed with punched paper cards. Smaller machines that appeared in the early 1970s, known as “microcomputers,” were used only by computer hobbyists, who built and programmed the machines by hand.

This changed with the debut of the Altair 8800 in 1975. An affordable, relatively simple machine, the Altair 8800 expanded the market for the microcomputer. Inspired by the simplicity, affordability, and power of the Altair, hundreds of new companies formed a budding microcomputer industry. Bill Gates and Paul Allen started Micro-Soft to write software for the new Altair units, while Steve Jobs and Stephen Wozniak formed Apple Computer to compete with dozens of other computer manufacturers during the late 1970s and early 1980s.

Up until 1981, the microcomputer appeared to most Americans to be a hobbyist’s toy. This changed on 12 August 1981, with the unveiling of the IBM Personal Computer (the PC). IBM, or International Business Machines, was already an industry leader in the production of giant mainframe computers. Its embrace of the microcomputer gave the machine legitimacy in the eyes of the public. PCs were soon integrated into businesses, classrooms, and even the American household, as expanded capabilities and a low price point made the machines attractive to a range of new audiences.

Through the 1980s and 1990s, the computer industry began to consolidate. Only a minuscule number of the hundreds of computer companies started during the initial microcomputer craze survived market pressures. Among the survivors were Commodore International, Apple Computer, and Atari, which emerged as the most popular home computer manufacturers of the 1980s. The Apple II was especially popular among American consumers, carrying Apple Computer through the market shakeouts of the late 1980s and early 1990s.

The 1980s proved to be a testing ground for the microcomputer. During this time, the true purpose and future of the computer were uncertain: would it be used solely as an extension of the business machine? Would it serve as a new platform for video gaming? Or would it take on a new role, one that connected ordinary people to the emerging computer industry?

For Further Reading:

Campbell-Kelly, Martin, William Aspray, Nathan Ensmenger, and Jeffrey R. Yost. Computer: A History of the Information Machine. 3rd ed. Boulder, CO: Westview Press, 2014.