Green computing, or green IT, refers to environmentally sustainable computing or IT. In the article "Harnessing Green IT: Principles and Practices", San Murugesan defines the field of green computing as "the study and practice of designing, manufacturing, using, and disposing of computers, servers, and associated subsystems—such as monitors, printers, storage devices, and networking and communications systems—efficiently and effectively with minimal or no impact on the environment."[1] The goals of green computing are similar to those of green chemistry: reduce the use of hazardous materials, maximize energy efficiency during the product's lifetime, and promote the recyclability or biodegradability of defunct products and factory waste. Research continues into key areas such as making the use of computers as energy-efficient as possible and designing algorithms and systems for efficiency-related computer technologies.
Origins

In 1992, the U.S. Environmental Protection Agency launched Energy Star, a voluntary labeling program designed to promote and recognize energy efficiency in monitors, climate-control equipment, and other technologies. This resulted in the widespread adoption of sleep mode among consumer electronics. The term "green computing" was probably coined shortly after the Energy Star program began.