1. Business Data Processing (BDP)
Business data processing is characterized by the need to establish, retain, and process files of data to produce useful information. Generally, it involves a large volume of input data, limited arithmetical operations, and a relatively large volume of output. For example, a large retail store must maintain a record for each customer who purchases on account, update the balance owed on each account, and periodically present a bill to the customer for merchandise purchased. This type of record keeping requires reading a customer’s account number, name, address, and previous balance. The bill involves a few basic calculations, and the results are printed and mailed to the customer for collection. Tens of thousands of similar bills are commonly handled in the same way.
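The billing step described above can be sketched in a few lines. This is a hypothetical illustration only: the record layout, field names, and the 1.5% finance charge are assumptions, not details from the text.

```python
# Hypothetical sketch of the billing step: read one customer record,
# apply a few basic calculations, and format the result for printing.

def prepare_bill(record, interest_rate=0.015):
    """Compute the new balance owed on one account record."""
    previous = record["previous_balance"]
    purchases = sum(record["purchases"])
    # Finance charge on the previous balance (rate is an assumption).
    interest = round(previous * interest_rate, 2)
    new_balance = round(previous + purchases + interest, 2)
    return {"account": record["account_number"],
            "name": record["name"],
            "new_balance": new_balance}

customer = {"account_number": "10042", "name": "J. Smith",
            "previous_balance": 120.00, "purchases": [35.50, 19.99]}
bill = prepare_bill(customer)
print(f"Account {bill['account']}: {bill['name']} owes {bill['new_balance']:.2f}")
```

In practice the same function would be run over tens of thousands of account records, which is what makes the workload large-volume despite the arithmetic being simple.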
2. Scientific Data Processing (SDP)
In science, data processing involves a limited volume of input and many logical or arithmetic calculations. Unlike business problems, most scientific problems are non-repetitive, requiring a “one-time” solution. For example, in cancer research, data on cancer patients (collected over a period of time) are analyzed by computer in the search for a cure. Although a final cure has not been found, computer analysis has saved hundreds of man-years of computation and has brought us a step closer to an answer. Although scientific data may differ from business data, the processing pattern is quite similar.
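The scientific pattern (very little input, a great many calculations) can be illustrated with a generic example. The Monte Carlo estimate of pi below is an assumption chosen for illustration; it is not a computation mentioned in the text.

```python
# Illustration of the scientific pattern: two numbers of input
# (sample count and seed) drive hundreds of thousands of arithmetic
# operations, producing a single number of output.
import random

def estimate_pi(samples=100_000, seed=42):
    """Estimate pi by sampling random points in the unit square."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:  # point falls inside the quarter circle
            inside += 1
    return 4 * inside / samples

print(estimate_pi())  # approaches 3.14159... as samples grow
```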
Four Methods of Data Processing
1. Batch processing
- all input has to be ready beforehand
- time to get your result may be long if interleaved with other batch jobs
- any errors may ruin the whole run, but you won't know it until results are returned
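The batch drawbacks above can be sketched as code: all input is collected first, the whole run is processed in one pass, and a single bad record aborts the job. The record format is an assumption for illustration.

```python
# Sketch of a batch run: input must be complete before the run starts,
# and one malformed record fails the entire job. The user only learns
# of the failure when the results come back.

def run_batch(records):
    results = []
    for i, line in enumerate(records, start=1):
        try:
            name, amount = line.split(",")
            results.append((name, float(amount)))
        except ValueError:
            # Abort the whole run on the first bad record.
            raise RuntimeError(f"bad record on line {i}: {line!r}")
    return results

batch = ["alice,10.5", "bob,7.25"]  # all input ready beforehand
print(run_batch(batch))
```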
2. Online processing
- break in communication can leave your session in an unknown state
- may only be able to run a single session (multiple logins may not be allowed)
- a slow communication line can make the processing unworkable -- you can't get to the next step until the first one is processed
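The online drawbacks above come from the step-by-step request/reply shape of a session, which can be sketched as follows. The "server" here is a plain function standing in for a remote system; that, and the step names, are assumptions for illustration.

```python
# Sketch of an online (interactive) session: each request must be
# answered before the next step can proceed, so a slow or broken
# connection stalls the whole session.

def server(request):
    """Stand-in for a remote system answering one request at a time."""
    return {"status": "ok", "echo": request}

def session(steps):
    state = []
    for step in steps:
        reply = server(step)          # blocks until this step is processed
        if reply["status"] != "ok":   # a lost or bad reply leaves the
            break                     # session in an unknown state
        state.append(reply["echo"])
    return state

print(session(["login", "query balance", "logout"]))
```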
3. Real time processing
(when done properly I can't think of end-user restrictions, so here are some for programmers)
- operating systems that support it are often proprietary
- languages that support it are not the mainstream ones
- concepts in programming for it are unknown to the average programmer
4. Distributed processing
- processors may not be available when needed e.g. due to networking problems
- requires networking
- requires parallel programming, or at least design attention to distributing the tasks
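The design point in the last bullet, splitting a job into independent tasks before distributing it, can be sketched with Python's multiprocessing pool standing in for separate networked processors. The task itself (squaring numbers) is an arbitrary placeholder.

```python
# Sketch of distributed processing: the job is split into independent
# tasks, farmed out to worker processes, and the results gathered.
from multiprocessing import Pool

def work(n):
    """One independent task; real distributed jobs need this isolation."""
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # map() distributes the tasks and collects results in order.
        results = pool.map(work, range(10))
    print(results)
```

With real networked processors rather than local worker processes, the same structure also has to cope with workers becoming unreachable, which is the availability drawback noted above.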
Five Generations of Computers
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; the first unit was delivered to the U.S. Census Bureau in 1951.
Second Generation (1956-1963) Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat, which could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.
Third Generation (1964-1971) Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation (1971-Present) Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.
Fifth Generation (Present and Beyond) Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
References:
- http://www.mbaknol.com/management-information-systems/areas-of-data-processing/
- http://www.webopedia.com/DidYouKnow/Hardware_Software/2002/FiveGenerations.asp