Trade in Information Technology and U.S. Economic Growth

Entrepreneurial enterprises in the United States invented most of the information technology that we use today, including computer and communications hardware, software, and services. In the 1960s and 1970s, companies like IBM and DEC, which developed first mainframe and then midrange computers, led the information technology sector. In the 1980s, the locus of growth in the sector shifted to personal computers and the innovations of companies like Intel, Apple, IBM, Dell, and Compaq, which helped develop the mass market for the product.

Along the way, however, something happened to this uniquely American industry: it started to move the production of hardware offshore. In the early 1980s, production of "commodity components" for computers, such as dynamic random access memory chips (DRAMs), migrated to low-cost producers in Japan, and later to Taiwan and Korea. Soon hard disk drives, display screens, keyboards, computer mice, and a host of other components were outsourced to foreign manufacturers. By the early 2000s, American factories specialized in making only the highest-value components, such as the microprocessors made by Intel, and in final assembly (Dell, for example, assembles PCs at two North American facilities). Just about every other component was made overseas, simply because it cost less to do so.

This trend prompted a good deal of hand-wringing among politicians and journalists about its possible negative implications for the U.S. economy. According to the critics, high-paying manufacturing jobs in the information technology sector were being exported to foreign producers.

Was this trend bad for the U.S. economy, as the critics claimed? According to research, the globalization of production made information technology hardware about 20 percent less expensive than it would otherwise have been. The price declines supported additional investments in information technology.