Microprocessors, also called central processing units (CPUs), are frequently described as the "brains" of a computer because they act as the central control for the processing of data in personal computers (PCs) and other computers. Chipsets handle the supporting logic functions in computers built around Intel processors, and motherboards combine Intel microprocessors and chipsets to form the basic subsystem of a PC. Because the processor takes part in every one of a computer's functions, it takes a fast processor to make a fast PC.

These processors are all made of transistors. The first transistor was created in 1947 by a team of scientists at Bell Laboratories in New Jersey. Ever since 1947, transistors have shrunk dramatically in size, enabling more and more of them to be placed on a single chip. The transistor was not the only thing that had to be developed before a true CPU could be produced; there also had to be some kind of surface on which to assemble the transistors. The first chip built on semiconductor material, the integrated circuit, was invented in 1958 by Jack Kilby of Texas Instruments.
With the transistor and the integrated circuit, the major elements needed to produce a CPU were in place. In 1968 a company by the name of Intel was formed, and it began to produce CPUs shortly thereafter.
Gordon Moore, one of the founders of Intel, predicted that the number of transistors placed on each chip would double roughly every 18 to 24 months. This sounds almost impossible, yet it has proved a very accurate estimate of the evolution of CPUs.

Intel introduced its first processor, the 4004, in November of 1971. This first processor had a clock speed of 108 kilohertz and 2,300 transistors, and it was used mainly for simple arithmetic, such as in a calculator. Ever since that first processor was introduced, the market has done nothing but soar to unbelievable highs. The first processor common in personal computers was the 8088. It was introduced in 1979 as a close variant of the 8086, which had debuted in June of 1978.
It could be purchased in three