Cloud vs Conventional Web Hosting
The cloud is widely regarded as the pinnacle of the hosting industry. It is assumed to virtualize everything: processing power, RAM, and disk space. It is assumed to expand without limit and to handle any amount of processing or data transfer.
When a business requiring high availability runs its application on cloud infrastructure, it assumes the application will never fail for want of resources. It assumes it can plan its growth without worrying about scarcity of RAM, bandwidth, processing power, or database capacity; that the application will keep running even if the customer base suddenly grows tenfold; and that it will survive any hardware failure.
Very large cloud providers such as Amazon Web Services and Microsoft Azure do provide this kind of scalability and reliability for enterprise applications.
A business running applications on the cloud pays per usage. If the application is hardly used, the business pays very little; when the application is used heavily, however, the cost climbs steeply.
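To make that trade-off concrete, here is a minimal sketch of how a pay-per-use bill compares with a flat-rate conventional plan. The hourly instance rate, per-GB transfer charge, and flat monthly fee below are hypothetical placeholders chosen for illustration, not any provider's actual pricing.

```python
# Illustrative comparison: pay-per-use cloud billing vs. flat-rate hosting.
# All rates are hypothetical assumptions, not real provider prices.

CLOUD_RATE_PER_INSTANCE_HOUR = 0.10  # assumed $/hour per server instance
CLOUD_RATE_PER_GB_TRANSFER = 0.09    # assumed $/GB of outbound transfer
CONVENTIONAL_FLAT_FEE = 200.00       # assumed flat monthly hosting fee

def cloud_monthly_cost(instances: int, hours: float, transfer_gb: float) -> float:
    """Pay-per-use: cost scales with instance-hours plus data transferred."""
    compute = instances * hours * CLOUD_RATE_PER_INSTANCE_HOUR
    transfer = transfer_gb * CLOUD_RATE_PER_GB_TRANSFER
    return compute + transfer

# One month is roughly 730 hours of continuous operation.
for label, instances, transfer_gb in [("light usage", 1, 50),
                                      ("heavy usage", 20, 5000)]:
    cloud = cloud_monthly_cost(instances, hours=730, transfer_gb=transfer_gb)
    print(f"{label}: cloud ${cloud:,.2f} vs conventional ${CONVENTIONAL_FLAT_FEE:,.2f}")
```

Under these assumed rates, the cloud bill is a fraction of the flat fee at light load (about $78 vs $200) but roughly ten times higher at heavy load (about $1,910 vs $200), which is exactly the behavior the pay-per-use model implies.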
Amazon operates the longest-running public cloud. Its recent outage broke the cloud myth: many businesses were down for days, and some lost data permanently.
So what is best for a business: the cloud, conventional web hosting, or something in between? A cloud is only as good as the applications that run on it, the hardware it is installed on, and the technical people who support it. Can the benefits of the cloud be achieved through customization and application-specific hardware architecture? I believe the answer is "Yes".