Intel Corporation
J. David Hunger
In 1968, Robert N. Noyce, co-inventor of the integrated circuit, and Gordon E. Moore left Fairchild Semiconductor International to form a new company. They took with them a young chemical engineer, Andrew Grove, and called the new firm Intel, short for integrated electronics. The company initially made its money manufacturing computer memory chips, and in 1971 it produced the first microprocessor (also called a “chip”). A key turning point for the young company came in the early 1980s, when IBM selected Intel’s processors to run its new line of personal computers. Today, more than 80% of the world’s PCs run on Intel microprocessors.
One of the company’s early innovations was centralizing its manufacturing in giant chip-fabrication plants. This allowed Intel to make chips at a lower cost than competitors, who produced custom chips in small factories. The founders fostered a corporate culture of “disagree and commit,” in which engineers were encouraged to constantly think of new ways of doing things faster, cheaper, and more reliably.
Massive investment by Japanese competitors in the late 1970s led to falling prices for computer memory chips. Faced with possible bankruptcy, CEO Moore, with Grove as his second-in-command (Noyce had retired from active management), made the strategic decision in 1985 to abandon the computer memory business and focus on microprocessors. Projected growth in microprocessors was based on Moore’s prediction that the number of transistors on a chip would double every 24 months. In what soon became known as “Moore’s Law,” Gordon Moore argued that microprocessor technology would improve exponentially, regardless of the state of the economy, the industry, or any one company. Thus, a company had to stay at the forefront of innovation or risk falling behind. According to Moore, “If you lag behind your competition by a generation, you don’t just fall behind in chip