One of the cloud’s major selling points is that it’s cheap compared to a dedicated server. Cloud vendors shout from the rooftops about the cost benefits of moving to the cloud, and to the casual observer they have a good case: you can lease a cloud server for a few dollars a month, while a dedicated server costs significantly more. But the true situation is more complex, especially where web hosting is concerned. To make a fair comparison, we’d have to compare like for like, and cloud servers are a completely different beast.
When you think of a server, what probably comes to mind is a unit built from the familiar components of a modern computer: CPU, RAM, storage, and the various buses that connect them. That’s what a dedicated server is: a machine that sits in a rack in a data center, with everything in that machine at the disposal of its owner or lessee.
Cloud servers are different. The most obvious difference is that cloud servers are virtualized instances running on top of physical machines, which already consumes some of the underlying resources: the virtualization layer takes a share of the processing power and memory.
But I’d like to concentrate on storage and, specifically, on the I/O performance of cloud servers versus dedicated servers. In a dedicated server, the hard drive or SSD usually sits inside the machine itself, connected by short-range buses capable of moving large amounts of data extremely quickly, and those connections are not shared with any other server.
Cloud platforms usually separate storage from the server. The disks you see on a cloud server are actually abstractions over large pools of storage known as SANs (Storage Area Networks). The processor and memory of a cloud server reach their storage over a network connection that is shared by many different cloud servers.
As you can imagine, such connections are slower than a local bus: the distances are longer, latency is higher, and I/O performance depends on the condition of the network and on how many other servers are competing for bandwidth.
Additionally, the physical disks underlying the SAN are likely to be used by more than one server, so reading and writing to the disk are also not as responsive as in a dedicated server.
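You can get a rough feel for the difference yourself by timing small random reads on each kind of server. The sketch below is a simplified illustration, not a proper benchmark: the file name, block size, and read count are arbitrary choices, and the operating system’s page cache will make repeated reads look faster than the underlying disk really is. For serious measurements you’d reach for a dedicated tool such as fio with direct I/O enabled.

```python
import os
import random
import time

def measure_random_read_latency(path, block_size=4096, reads=200):
    """Time small random reads against a file to estimate storage latency.

    Note: the OS page cache will absorb many of these reads, so the
    numbers understate real disk latency; treat them as a rough proxy.
    """
    size = os.path.getsize(path)
    latencies = []
    with open(path, "rb", buffering=0) as f:  # unbuffered at the Python level
        for _ in range(reads):
            offset = random.randrange(0, max(1, size - block_size))
            start = time.perf_counter()
            f.seek(offset)
            f.read(block_size)
            latencies.append(time.perf_counter() - start)
    latencies.sort()
    return {
        "median_ms": latencies[len(latencies) // 2] * 1000,
        "p99_ms": latencies[int(len(latencies) * 0.99)] * 1000,
    }

# Create a scratch file so the script is self-contained (16 MiB of random data).
with open("scratch.bin", "wb") as f:
    f.write(os.urandom(16 * 1024 * 1024))

print(measure_random_read_latency("scratch.bin"))
os.remove("scratch.bin")
```

Run the same script on a dedicated server with local storage and on a cloud server backed by a SAN, and pay particular attention to the p99 figure: tail latency is where shared network storage tends to show its weakness.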
If you’re hosting a website or an eCommerce store in the cloud, these latencies can quickly add up, resulting in less than ideal performance for users, which, as we know, is likely to negatively impact revenue.
At the lower pricing tiers of cloud platforms, connections to the SAN (or SANs) can become saturated, and even on more expensive tiers servers still compete for I/O. Many cloud platforms let you pay for lower latency, but it’s costly, which is one of the reasons that cloud servers aren’t really less expensive than dedicated servers: to get equivalent performance, you end up paying more, as we’ve discussed in previous articles on this blog.