Red Hat’s CEO Is Right About Cloud’s “Obscene Cost”


Red Hat CEO Jim Whitehurst put the cat among the pigeons recently with comments about the cost of cloud platforms. He said:

“The public cloud gets to become obscenely expensive at scale … for workloads that don’t vary a lot in usage.”

Let’s be clear about what Whitehurst is saying here. He is not saying that the average cloud server is more expensive than the average dedicated server. He is making the specific claim that for most large-scale deployments with steady workloads, in-house infrastructure is more cost-effective than cloud deployment.

Many enterprise IT folks make the same discovery. The public cloud is useful where workloads are highly variable. In those cases, its elasticity offers enough value to outweigh performance compromises and the cost of maintaining the redundancy needed for a reliable platform.

But real-world enterprise workloads are rarely that variable; they tend to be relatively fixed and predictable. The ability to scale remains essential, but scaling physical infrastructure is not difficult when demand is predictable: it grows at a steady rate and does not expand or contract quickly enough to require an elastic platform.
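
To make that concrete, here is a minimal capacity-planning sketch in Python. Every number in it (growth rate, headroom, starting capacity) is a hypothetical assumption for illustration, not data from any real deployment: with predictable growth, scaling is a matter of ordering hardware ahead of the forecast, not of elastic provisioning.

# Capacity planning for a predictably growing workload.
# All figures are hypothetical assumptions for illustration only.

GROWTH_PER_QUARTER = 0.05   # assume demand grows 5% per quarter
HEADROOM = 0.20             # assume 20% spare capacity as a buffer

demand = 100.0              # current demand, in server-equivalents
capacity = 120              # servers currently installed

for quarter in range(1, 9):              # plan two years ahead
    demand *= 1 + GROWTH_PER_QUARTER     # forecast next quarter's demand
    needed = demand * (1 + HEADROOM)     # target capacity including buffer
    if needed > capacity:
        buy = int(needed - capacity) + 1  # order the shortfall in advance
        capacity += buy
        print(f"Q{quarter}: demand {demand:.0f}, buy {buy} servers -> {capacity} total")
    else:
        print(f"Q{quarter}: demand {demand:.0f}, existing {capacity} servers suffice")

Nothing in this loop requires on-demand elasticity; it can run on an ordinary quarterly purchase-order cadence.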

For massively variable workloads, the cloud is cost-effective, because an in-house deployment requires capital investment in infrastructure that exceeds average requirements by a considerable margin. Workloads need room to grow, which means redundant capacity, and they need the ability to shrink, which frequently leaves much of that capacity idle.
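
The trade-off can be put in back-of-envelope terms. The Python sketch below uses entirely hypothetical prices (not any vendor’s actual rates): the cloud bills per instance-hour consumed, while in-house infrastructure must be sized for peak load but costs the same whether busy or idle.

# Back-of-envelope three-year cost comparison.
# All prices are hypothetical assumptions, not real vendor pricing.

CLOUD_HOURLY_RATE = 0.50     # assumed $/hour per cloud instance
SERVER_CAPEX = 6000          # assumed upfront cost per dedicated server
SERVER_OPEX_MONTHLY = 150    # assumed power, space, admin per server/month
HOURS_PER_MONTH = 730
MONTHS = 36

def cloud_cost(peak_servers, avg_utilization):
    # Elastic: pay only for the instance-hours actually consumed.
    return peak_servers * avg_utilization * CLOUD_HOURLY_RATE * HOURS_PER_MONTH * MONTHS

def inhouse_cost(peak_servers):
    # Fixed: sized for peak, paid for whether busy or idle.
    return peak_servers * (SERVER_CAPEX + SERVER_OPEX_MONTHLY * MONTHS)

fleet = 100
print("Steady workload, 100% average utilization:")
print(f"  cloud:    ${cloud_cost(fleet, 1.0):,.0f}")
print(f"  in-house: ${inhouse_cost(fleet):,.0f}")
print("Bursty workload, 30% average utilization, same peak:")
print(f"  cloud:    ${cloud_cost(fleet, 0.3):,.0f}")
print(f"  in-house: ${inhouse_cost(fleet):,.0f}")

Under these assumed prices, the steady workload runs about $1.31M in the cloud against $1.14M in-house, while the bursty one drops to about $0.39M in the cloud. That is exactly the distinction Whitehurst is drawing.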

This is the point made by AWS Data Scientist Matt Wood in response to Whitehurst:

“You need an environment that is flexible and allows you to quickly respond to changing big data requirements.”

But that response both misses the point and rests on an assumption for which there is no clear evidence. The claim that businesses need a flexible environment that can respond quickly to changing big data requirements is all well and good, except that most enterprise IT deployments have no such requirements. They need to respond to predictably changing demand and to deliver reliability and performance over the long term.

As Whitehurst responds:

“A number of enterprise customers tell us that apps that don’t vary a lot in usage are significantly cheaper to run in their own data center than on the public cloud.”

The issue here is that the public cloud has been touted as the IT platform of the future, and its adherents will brook no implication that, in many cases, other solutions can generate a better ROI for businesses.

No one could reasonably argue that public clouds are not useful if you want to quickly deploy a server as a development or test environment. No one would reasonably argue that the public cloud doesn’t benefit a company like Netflix, whose scaling demands see thousands of servers spun up and down every day. But it is entirely sensible to maintain that for the large-scale, semi-invariant, predictable workloads that form the bulk of enterprise IT demand, in-house bare-metal infrastructure is often the best option.

Image: Flickr/perpop

Jun 16, 2015, 1:01 pm | By: Corey Northcutt
