2.0. However, there are dozens of different definitions for Cloud
Computing and there seems to be no consensus on what a Cloud is.
On the other hand, Cloud Computing is not a completely new concept; it has an intricate connection to the relatively new yet thirteen-year-established Grid Computing paradigm, and to other relevant technologies such as utility computing, cluster computing, and distributed systems in general. This paper strives to compare and contrast Cloud Computing with Grid Computing from various angles and give insights into the essential characteristics of both.
1. 100-Mile Overview
Cloud Computing hints at a future in which we won’t compute on local computers, but on centralized facilities operated by third-party compute and storage utilities. We certainly won’t miss the shrink-wrapped software we had to unwrap and install.
Needless to say, this is not a new idea. In fact, back in 1961, computing pioneer John McCarthy predicted that
“computation may someday be organized as a public utility,” and went on to speculate how this might occur.
In the mid-1990s, the term Grid was coined to describe technologies that would allow consumers to obtain computing power on demand. Ian Foster and others posited that by standardizing the protocols used to request computing power, we could spur the creation of a Computing Grid, analogous in form and utility to the electric power grid. Researchers subsequently developed these ideas in many exciting ways, producing, for example, large-scale federated systems (TeraGrid,
Open Science Grid, caBIG, EGEE, Earth System Grid) that provide not just computing power, but also data and software, on demand. Standards organizations (e.g., OGF, OASIS) defined relevant standards. More prosaically, the term was also co-opted by industry as a marketing term for clusters. But no viable commercial Grid Computing providers