The evolution of electronic computers can be traced effectively by dividing the period into generations. Each generation is characterized by a major technological development that fundamentally changed the way computers operated, yielding smaller, cheaper, more powerful, more efficient and more reliable devices. Today, computers have become indispensable to daily life: you find computerization in almost every sphere and industry. Computer evolution has been a fascinating process, as we find out here.
The generations of computers may be broadly classified into five stages:
1. First Generation (1940–1956)
2. Second Generation (1956–1963)
3. Third Generation (1964–1971)
4. Fourth Generation (1971–Present)
5. Fifth Generation (Present and Beyond)
FIRST GENERATION (1940–1956)
World War II gave rise to numerous developments and started off the computer age. The Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer developed by John Presper Eckert and John W. Mauchly, was produced through a partnership between the University of Pennsylvania and the US government. It consisted of some 18,000 vacuum tubes and 70,000 resistors. In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both a stored program and data. Von Neumann's design allowed all of the computer's functions to be controlled from a single source.
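The significance of this stored-program design is easiest to see in miniature: instructions and data occupy the same memory, and a single control loop fetches and executes instructions one after another. The Python sketch below is hypothetical, with a three-instruction machine invented purely for illustration; it shows the principle, not EDVAC's actual design.

```python
# A toy stored-program (von Neumann) machine: the program and its data
# share one memory, and a single fetch-decode-execute loop controls
# everything. The instruction set is invented for illustration.

LOAD, ADD, HALT = 1, 2, 0             # hypothetical opcodes

memory = [LOAD, 8,                    # acc = memory[8]
          ADD, 9,                     # acc = acc + memory[9]
          HALT,                       # stop
          0, 0, 0,                    # unused
          40, 2]                      # the data, in the same memory

pc, acc = 0, 0                        # program counter and accumulator
while memory[pc] != HALT:
    opcode, operand = memory[pc], memory[pc + 1]
    if opcode == LOAD:
        acc = memory[operand]
    elif opcode == ADD:
        acc += memory[operand]
    pc += 2                           # fetch the next instruction

print(acc)                            # 42
```

Because the program itself lives in memory, changing what the machine does means loading new contents rather than rewiring hardware, which is precisely what set EDVAC's design apart from ENIAC's.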
Then, in 1951, came the Universal Automatic Computer (UNIVAC I), designed by Remington Rand and delivered to customers including the US Census Bureau and General Electric. UNIVAC famously predicted the winner of the 1952 presidential election, Dwight D. Eisenhower.
In first-generation computers, the operating instructions or programs were built specifically for the task for which the computer was manufactured. Machine language was the only way to tell these machines to perform operations. These computers were very difficult to program, and even more difficult to troubleshoot when malfunctions occurred.
First-generation computers used vacuum tubes for circuitry and magnetic drums for memory. They were enormous, occupying a great deal of space and producing tremendous heat. They were very expensive to operate and consumed large amounts of electricity. Input was based on punched cards and paper tape, and output was displayed on printouts. First-generation computers could solve only one problem at a time.
SECOND GENERATION (1956–1963)
The second generation of computers saw vacuum tubes replaced by transistors. The transistor was far superior: faster, cheaper, more energy-efficient and more reliable than its first-generation counterpart. Transistors still generated considerable heat, which sometimes caused the computer to malfunction, but they were a vast improvement over the vacuum tube. Second-generation computers continued to use punched cards for input and printouts for output.
Second-generation computers moved from machine language to assembly language, which allowed programmers to specify instructions in words. Though complex in itself, assembly language was much easier to use than binary code. High-level programming languages were also developed at this time, such as early versions of COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator). The computers stored their instructions in memory, which moved from magnetic-drum to magnetic-core technology. Throughout the early 1960s, a number of commercially successful second-generation computers were used in businesses, universities, and government, from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others. These computers were of solid-state design, containing transistors in place of vacuum tubes, and they included all the components we associate with the modern-day computer: printers, tape storage, disk storage, memory, and stored programs. One important example was the IBM 1401, which was widely accepted throughout industry and is considered by many to be the Model T of the computer industry. By 1965, most large businesses routinely processed financial information using second-generation computers (Gersting 218).
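The step from machine language to assembly language is easy to demonstrate. The toy assembler below, a Python sketch with an invented three-instruction set (so every mnemonic and opcode here is an assumption for illustration), translates word-based instructions into the raw numbers a machine would actually execute:

```python
# A toy assembler: it translates human-readable mnemonics into the
# numeric machine code the hardware executes. The instruction set is
# invented purely for illustration.

OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3}   # hypothetical opcodes

def assemble(source):
    """Turn lines like 'ADD 7' into a flat list of opcode/operand numbers."""
    machine_code = []
    for line in source.strip().splitlines():
        mnemonic, operand = line.split()
        machine_code += [OPCODES[mnemonic], int(operand)]
    return machine_code

program = """
LOAD 10
ADD 7
STORE 15
"""
print(assemble(program))   # [1, 10, 2, 7, 3, 15]
```

A second-generation programmer wrote the words on the left; the assembler produced the numbers on the right, which first-generation programmers had to write by hand.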
THIRD GENERATION (1964–1971)
The development of the integrated circuit in 1958 by Jack Kilby left its mark on the third generation of computers. Transistors were made smaller and placed on silicon chips, which dramatically increased the speed and efficiency of computers; as a result, computers became ever smaller as more components were squeezed onto each chip. Another third-generation development was the operating system, which allowed machines to run many different programs at once, with a central program monitoring and coordinating the computer's memory. Fairchild Camera and Instrument Corp. built the first standard metal-oxide-semiconductor product for data-processing applications, an eight-bit arithmetic unit and accumulator. The fundamental components of this semiconductor laid the groundwork for the invention of the microprocessor in 1971. Another company that took advantage of third-generation advancements was IBM, with the unveiling of the System/360. The company was making a transition from discrete transistors to integrated circuits, and its major source of revenue moved from punched-card equipment to electronic computer systems.
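The multiprogramming idea mentioned above rests on a scheduler: a central program that gives each job a slice of the machine in turn. The Python sketch below is a hypothetical round-robin scheduler, with generators standing in for programs; the job names are invented for illustration.

```python
# A toy round-robin scheduler: one central loop interleaves several
# "programs" (Python generators), echoing how a third-generation
# operating system let many programs share a single machine.
from collections import deque

def program(name, steps):
    for i in range(steps):
        yield f"{name}: step {i}"    # pause here until rescheduled

ready = deque([program("payroll", 2), program("inventory", 3)])
while ready:
    job = ready.popleft()            # pick the next program in line
    try:
        print(next(job))             # run it for one time slice
        ready.append(job)            # then move it to the back of the queue
    except StopIteration:
        pass                         # this program has finished
```

Each program runs only briefly before the central loop regains control, which is how one machine appeared to run many programs at once.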
UNIX: In 1969, AT&T Bell Laboratories programmers Kenneth Thompson and Dennis Ritchie developed the UNIX operating system on a spare DEC minicomputer. One of the first modern operating systems, UNIX provided a sound intermediary between software and hardware and quickly secured a wide following, particularly among engineers and scientists at universities and other computer science organizations.
FOURTH GENERATION (1971–Present)
The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. After the invention of the integrated circuit, the next step in computer design was to reduce overall size. Large-scale integration (LSI) could fit hundreds of components onto one chip. By the 1980s, very-large-scale integration (VLSI) squeezed hundreds of thousands of components onto a chip, and ultra-large-scale integration (ULSI) increased that number into the millions. The ability to fit so much onto an area about half the size of a U.S. dime helped diminish the size and price of computers while increasing their power, efficiency and reliability. The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a single minute chip. Whereas previously the integrated circuit had to be manufactured to fit a special purpose, now one microprocessor could be manufactured and then programmed to meet any number of demands. Soon everyday household items such as microwave ovens, television sets and automobiles with electronic fuel injection incorporated microprocessors (Gersting 35-39).

Such condensed power allowed everyday people to harness a computer's capabilities; computers were no longer developed exclusively for large business or government contracts. By the mid-1970s, computer manufacturers sought to bring computers to general consumers. These microcomputers came complete with user-friendly software packages that offered even non-technical users an array of applications, most popularly word processing and spreadsheet programs. Pioneers in this field were Commodore, Radio Shack and Apple Computer. In the early 1980s, arcade video games such as Pac-Man and home video game systems such as the Atari 2600 ignited consumer interest in more sophisticated, programmable home computers. In 1981, IBM introduced its personal computer (PC) for use in the home, office and schools. The 1980s saw an expansion of computer use in all three arenas as clones of the IBM PC made the personal computer even more affordable. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982; ten years later, 65 million PCs were in use. Computers continued their trend toward smaller sizes, working their way down from desktops to laptops to palmtops. In direct competition with IBM's PC was Apple's Macintosh line, introduced in 1984. Notable for its user-friendly design, the Macintosh offered an operating system that let users manipulate screen icons instead of typing instructions, controlling the cursor with a mouse, a device that mimicked the movement of one's hand on the screen.

As computers became more widespread in the workplace, new ways to harness their potential developed. As smaller computers became more powerful, they could be linked together, or networked, to share memory space, software and information, and to communicate with each other. As opposed to a mainframe, a single powerful computer that shared time among many terminals for many applications, networked computers allowed individual machines to form electronic gateways. Using either direct wiring, called a Local Area Network (LAN), or telephone lines, these networks could reach enormous proportions.
The Internet, for example, a global web of computer circuitry, links computers worldwide into a single network of information. During the 1992 U.S. presidential election, vice-presidential candidate Al Gore promised to make the development of this so-called "information superhighway" an administrative priority. The ideals expressed by Gore and others are realized every day through email, web browsing and e-commerce.
Fourth-generation computers also saw the development of GUIs (graphical user interfaces), the mouse and handheld devices. A new generation of computers is now emerging with the use of wireless communications and wide-area networking.
FIFTH GENERATION (Present and Beyond)
Fifth-generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are in use today. Artificial intelligence is the branch of computer science concerned with making computers behave like humans. The term was coined by John McCarthy in 1956, for the Dartmouth Conference that launched the field.
Fifth-generation computers will be able to take commands audio-visually and carry out instructions, and many operations that now require low-level human intelligence will be performed by them. Artificial intelligence includes:
Expert Systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms; a toy sketch follows this list)
Neural Networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains
Robotics: programming computers to see and hear and react to other sensory stimuli
Natural Language: programming computers to understand natural human languages
Games Playing: programming computers to play games such as chess and checkers
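To make the first of these concrete: at its simplest, an expert system is a set of if-then rules matched against known facts. The Python sketch below is hypothetical; its rules and symptoms are invented purely for illustration and are not medical advice.

```python
# A minimal rule-based "expert system": conclusions come from explicit
# if-then rules whose conditions are checked against a set of known facts.
# The rules and symptoms here are invented purely for illustration.

RULES = [
    ({"fever", "cough"}, "possible flu"),
    ({"sneezing", "itchy eyes"}, "possible allergy"),
    ({"fever", "stiff neck"}, "urgent: see a doctor"),
]

def diagnose(symptoms):
    """Return the conclusion of every rule whose conditions all hold."""
    return [conclusion for conditions, conclusion in RULES
            if conditions <= symptoms]   # subset test: all conditions present

print(diagnose({"fever", "cough", "headache"}))  # ['possible flu']
```

Real expert systems, such as the medical system MYCIN, worked on the same principle, with hundreds of rules and machinery for chaining them together.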
Currently, no computers exhibit full artificial intelligence. The greatest advances have occurred in games playing: the best computer chess programs are now capable of beating humans. In May 1997, an IBM supercomputer called Deep Blue defeated world chess champion Garry Kasparov in a chess match.
In the area of robotics, computers are now widely used in assembly plants, but they are capable only of very limited tasks. Robots have great difficulty identifying objects based on appearance or feel, and they still move and handle objects clumsily.
Today, the hottest area of artificial intelligence is neural networks, which are proving successful in a number of disciplines, such as voice recognition and natural-language processing. There are also several programming languages known as AI languages because they are used almost exclusively for AI applications.
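A single artificial neuron gives the flavor of the idea: inputs arrive over weighted connections, and the weights are adjusted from examples, loosely mimicking connections in a brain. The Python sketch below trains one perceptron to reproduce the logical AND function; it is a minimal, hypothetical illustration, not a modern network.

```python
# A single artificial neuron (a perceptron): it sums weighted inputs,
# fires if the total crosses a threshold, and nudges its weights after
# every mistake. Trained on logical AND as a minimal illustration.

def predict(weights, bias, x):
    total = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if total > 0 else 0

samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias, rate = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                      # a few passes over the examples
    for x, target in samples:
        error = target - predict(weights, bias, x)
        weights = [w + rate * error * xi for w, xi in zip(weights, x)]
        bias += rate * error

print([predict(weights, bias, x) for x, _ in samples])  # [0, 0, 0, 1]
```

Voice recognition and natural-language systems use far larger networks of such units, but the principle of learning connection weights from examples is the same.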