The history of computer development is often described in terms of five distinct eras, or "generations," of computing devices. Each generation is characterized by a major technological development that fundamentally changed the way computers operate, producing devices that were smaller, cheaper, more powerful, more efficient, and more reliable than their predecessors.

The First Generation: 1946 to 1955

The first computers used vacuum tubes for circuitry and magnetic drums and magnetic cores for memory, and were often enormous, taking up entire rooms. They were very expensive to operate: in addition to consuming a great deal of electricity, they generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could solve only one problem at a time. Input was based on punched cards and paper tape, and output was printed on teletype printers. The UNIVAC is the most famous first-generation computer. Manufactured by the Sperry-Rand Corporation, the first UNIVAC was delivered to the U.S. Census Bureau in 1951, becoming the first computer that was not used for military or scientific purposes.
[Figure: Vacuum tube processing unit in a first-generation computer.]

The Second Generation: 1956 to 1963

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. It was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.