An integrated circuit is defined as:[2]
A circuit in which all or some of the circuit elements are inseparably associated and electrically interconnected so that it is considered to be indivisible for the purposes of construction and commerce.
Circuits meeting this definition can be constructed using many different technologies – see for example thin-film transistors, thick-film technology, or hybrid integrated circuits. However, in general usage, integrated circuit has since come to refer to the single-piece circuit construction originally known as the monolithic integrated circuit.[3][4]
Invention
Main article: Invention of the integrated circuit
The earliest developments of the integrated circuit date back to 1949, when the German engineer Werner Jacobi (Siemens AG)[5] filed a patent for an integrated-circuit-like semiconductor amplifying device[6] showing five transistors on a common substrate in a three-stage amplifier arrangement. Jacobi disclosed small and cheap hearing aids as typical industrial applications of his patent. No immediate commercial use of his patent has been reported.
The idea of the integrated circuit was conceived by Geoffrey W. A. Dummer (1909–2002), a radar scientist working for the Royal Radar Establishment of the British Ministry of Defence. Dummer presented the idea to the public at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on 7 May 1952.[7] He gave many public symposia to propagate his ideas, and in 1956 he unsuccessfully attempted to build such a circuit.
A precursor idea to the IC was to create small ceramic squares (wafers), each one containing a single miniaturized component. Components could then be integrated and wired into a two- or three-dimensional compact grid. This idea, which looked very promising in 1957, was proposed to the US Army by Jack Kilby and led to the short-lived Micromodule Program (similar to 1951's Project Tinkertoy).[8] However, as the project was gaining momentum, Kilby came up with a new, revolutionary design: the IC.