August 13, 2010 – Cloud Computing refers to a model in which computing is treated as a shared resource: software and information are delivered to computers and other devices on demand, much like electricity from the existing grid.
The "Cloud" is a metaphor for a collection of computers, drawn from the cloud symbol long used in diagrams and flowcharts (in schools and training sessions) to represent a network.
This ingenious idea dates back to 1961, when the computer scientist John McCarthy (a pioneer of the mathematical theory of computation, artificial intelligence and programming languages, and inventor of the Lisp programming language) said: "Computation may someday be organised as a public utility."
He was right.
Users do not hold the resources on their own endpoints but can access them on a whenever-wherever basis. The main driver is a paradigm shift from mainframe technology to client-server architecture, in which information is stored at a central location, combined with the wider use of virtualisation techniques and the scalability of those resources.
It relies on an architecture in which applications are held at a central point, so that what happens on the client side is just the user-facing experience. The major factors driving the growth of Cloud Computing are the reduction in capital expenditure, since users need not invest in the entire infrastructure and pay only for what they use, and the availability of resources from anywhere.
Among the items that some cloud hosts charge for are data transfer in and out, storage (measured in gigabytes), input/output requests, IP addresses and load balancing. In some cases, users can bid for spare capacity and obtain cheaper service, with prices largely dependent on demand for the available instances and on the actual availability of the service.
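The pay-for-what-you-use model above can be sketched as a simple metered bill: multiply each usage quantity by a unit rate and sum. The rates and item names below are purely illustrative assumptions, not any provider's actual pricing.

```python
# Sketch of the pay-per-use billing model described above.
# All rates are hypothetical examples, not real provider pricing.
RATES = {
    "data_transfer_gb": 0.10,      # per GB transferred in/out
    "storage_gb_month": 0.15,      # per GB stored per month
    "io_requests_million": 0.50,   # per million I/O requests
    "ip_address_month": 2.00,      # per reserved IP address per month
    "load_balancer_hour": 0.025,   # per load-balancer hour
}

def monthly_bill(usage):
    """Charge only for what was used: sum of quantity * unit rate."""
    return sum(usage.get(item, 0) * rate for item, rate in RATES.items())

usage = {
    "data_transfer_gb": 50,
    "storage_gb_month": 100,
    "io_requests_million": 4,
    "ip_address_month": 1,
    "load_balancer_hour": 720,   # one load balancer running all month
}
print(round(monthly_bill(usage), 2))  # total for this month's usage
```

Unused items simply contribute nothing to the bill, which is the contrast with traditional capital expenditure: there is no charge for idle capacity.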
Cloud Computing is definitely the FUTURE.