The History of the Integrated Circuit
What Is a Microchip?
How do microchips work? How are microchips made?
By definition, the integrated circuit, also known as the microchip, is a set of interconnected electronic components, such as transistors and resistors, that are etched or imprinted onto a tiny chip of a semiconducting material, such as silicon or germanium.
The History of the Integrated Circuit
Jack Kilby and Robert Noyce
It seems that the integrated circuit was destined to be invented. Two separate inventors, unaware of each other's activities, invented almost identical integrated circuits or ICs at nearly the same time.
Jack Kilby, an engineer with a background in ceramic-based silk screen circuit boards and transistor-based hearing aids, started working for Texas Instruments in 1958. A year earlier, research engineer Robert Noyce had co-founded the Fairchild Semiconductor Corporation. From 1958 to 1959, both electrical engineers were working on an answer to the same dilemma: how to make more of less.
Why the Integrated Circuit Was Needed
In designing a complex electronic machine like a computer, it was always necessary to increase the number of components involved in order to make technical advances. The monolithic (formed from a single crystal) integrated circuit placed the previously separate transistors, resistors, capacitors and all the connecting wiring onto a single crystal (or 'chip') made of semiconductor material. Kilby used germanium for the semiconductor, while Noyce used silicon.
Commercial Release
In 1961, the first commercially available integrated circuits came from the Fairchild Semiconductor Corporation. All computers then started to be made using chips instead of individual transistors and their accompanying parts. Texas Instruments first used the chips in Air Force computers and the Minuteman Missile in 1962. They later used the chips to produce the first electronic portable calculators.