
Understanding the New Economics of Cloud Computing




Cloud computing is changing the way we look at IT costs, according to industry experts on a recent Cloud Luminary Fireside Chat panel discussion.

Enterprise IT, traditionally viewed as a cost center, now plays a central role in the delivery of software-driven goods and services. Therefore, companies need to understand their cloud utilization and resulting costs in order to ensure profitability on their business offerings.

Led by Bernard Golden, this fireside chat offers valuable insights on how organizations can get a better handle on their use of cloud computing.

Enjoy the full recording below, and highlights further down.

Participating in this panel discussion were:

  • John Cowan (@cownet), Co-Founder and CEO, 6fusion
  • Sharon Wagner (@Sharon_Wagner), Founder and CEO, Cloudyn
  • Owen Rogers (@owenrog), Senior Analyst, Digital Economics, 451 Research

What "Cloud Utilization and Cost Analytics" Means and Why It's Important

Sharon Wagner: Utilization and cost analytics is the ability to understand how your cloud deployment behaves from a usage and cost perspective. In public cloud specifically, whenever you spin up new servers or databases, or you use more and more storage, you pay as you go, and therefore usage and cost are tied together. When you over-provision your resources, you pay more. Therefore, it's very important to define a set of cost and utilization metrics, as well as performance metrics, that will help us understand who used what, when, and how we can reduce cost and improve performance to avoid budget violations or even performance issues in our cloud deployments.

Bernard Golden: One of the key issues around a business offering is that, all of a sudden, the cost of provisioning a service is really critical. So, for example, if you're driving a marketing campaign to generate leads and you know the value of every lead is, say, $10, it makes a big difference whether the cost of getting that lead is $5 or $15. One is quite profitable and very good; with the other, you're going to lose money on every lead you generate. And in this cloud computing world, that gets affected by how much computing resource you use. This moves cost management from the IT-centric question, "How do I reduce my total cost of ownership?", to "How do I understand what my cost of goods sold is?"
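Bernard's lead-economics point can be reduced to a line of arithmetic. The sketch below uses entirely hypothetical numbers (the $10 lead value from his example, with invented marketing and compute costs) to show how per-lead compute spend flips a campaign from profitable to loss-making:

```python
# Illustrative unit economics for a lead-generation campaign.
# All dollar amounts are hypothetical, not from the panel.

def margin_per_lead(lead_value, marketing_cost, compute_cost):
    """Profit (or loss) on a single lead, with cloud compute
    counted as part of the cost of goods sold."""
    return lead_value - (marketing_cost + compute_cost)

# A $10 lead acquired for $5 all-in is profitable...
assert margin_per_lead(10.0, 4.0, 1.0) == 5.0
# ...while the same lead acquired for $15 loses money every time.
assert margin_per_lead(10.0, 12.0, 3.0) == -5.0
```

The point is that compute cost now sits inside cost of goods sold, so it has to be metered at the same granularity as the revenue it supports.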

John Cowan: I would go one step further on that and just say that if the philosophy of cloud computing is to treat compute, network, and storage as a legitimate and true utility, then utility economics needs to be applicable to this industry in order for buyers and sellers to make sense of it. And in that vein, I would go so far as to say that the concept of TCO (total cost of ownership) really doesn't make any sense to consumers or users of cloud computing. What we found is that a more appropriate term is total cost of consumption, or TCC. What your business unit owners really care about is their bill of IT for running or hosting an application. They couldn't care less that you got a good deal on your hard drives or your servers. What they care about is the running cost of an application that they are either bringing to market or using as an internal productivity suite for their business unit.

Owen Rogers: And for me, it comes back really to the concept of value rather than cost. I think businesses and organizations...don't mind consuming resources, they don't mind paying for things, but they naturally want to make sure that they are getting value from what they're purchasing. So for me, these cost and utilization analytical tools are a way of making sure that if an application is scaling, or if resources are being consumed, that they're actually delivering some kind of business benefits.

Are Other Companies Tracking Their Cloud Costs?

Owen: We have a commentator network called TheInfoPro, and these are thousands of end users who are consuming IT and cloud services as part of their job. And we survey them all the time to understand what their spending habits are...And it turns out that in one of these surveys, we discovered that 25 percent of the enterprise end users weren't doing any cost analysis at all on their use of cloud services, and for me that's a really terrifying statistic, because the whole point of the cloud is that it's variable and scalable, and the on-demand purchase and procurement of cloud means that you should be able to scale it up and down whenever needs change. And I think the fact that a quarter of users aren't keeping track of what they are doing explains why we found that 33 percent of users weren't confident that they had good control over their costs.

The Need for a Standard Unit of Measurement

Sharon: One analogy we typically use when clients ask us about the value of monitoring utilization and cost...is miles per gallon. When you ask them how they measure the value they get from cloud computing, they will give you 10 different parameters or operational metrics: CPU, utilization, throughput...And when you ask them, "Well, can you compare it to the way you buy a car?", it's very simple. You choose an efficient car by comparing the miles per gallon you're going to get. So we expect our clients to do the same.

John: It's interesting to talk about miles per gallon, but what happens when internal IT and nine different vendors out there in the market offering a service all define the gallon differently? How do you actually create economic transparency and interoperable comparisons in a meaningful way between legacy IT and on-demand services?...If I'm the business unit owner or I'm the CIO, how do I make decisions about where apps should go if everybody is measuring this thing differently?...We created a unit of value, the Workload Allocation Cube, that is representative of consumption across six vectors: CPU, memory, storage, disk I/O, LAN I/O, and WAN I/O..."My internal cost of operation is $X per WAC unit and my supplier's price is $Y per WAC unit." Now we're informing a much more real-time conversation about buying and selling, which is exactly what's going on in the CIO's office.
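The mechanics of a composite unit like this can be sketched in a few lines. The six vectors below are the ones John names, but the weights are invented purely for illustration; this is not 6fusion's actual WAC formula:

```python
# Toy sketch of a composite consumption unit in the spirit of the
# Workload Allocation Cube. Weights are illustrative assumptions,
# NOT 6fusion's published methodology.

WEIGHTS = {
    "cpu_ghz_hours": 1.0,
    "memory_gb_hours": 0.5,
    "storage_gb_hours": 0.1,
    "disk_io_gb": 0.2,
    "lan_io_gb": 0.05,
    "wan_io_gb": 0.3,
}

def consumption_units(usage):
    """Collapse the six consumption vectors into one comparable number."""
    return sum(WEIGHTS[k] * usage.get(k, 0.0) for k in WEIGHTS)

# Once every environment is metered in the same unit, internal cost and a
# supplier's price become directly comparable dollars-per-unit figures.
internal = consumption_units({"cpu_ghz_hours": 100,
                              "memory_gb_hours": 200,
                              "wan_io_gb": 10})
internal_cost_per_unit = 50.0 / internal  # e.g. $50 internal spend over this usage
```

Whatever the real weighting scheme, the design choice is the same one John describes: a single denominator so that "buy vs. build vs. move" becomes a price comparison rather than a metrics argument.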

Should We Use Excel?

Owen: We found that 50 percent of end users were still using Excel to really understand what their cost was. And again, Excel is great when we know what's going to happen, when we have everything up front, when costs are fixed and we understand the capital and the fixed operating costs. But when cloud costs vary month in, month out and we have different business objectives we want to meet, the whole "spreadsheet in advance" approach falls apart, and that's why we need tools to really understand what's going on.

John: To Owen's point, doing (a standard unit of measurement) on spreadsheets is interesting once. Try to do that in real time, which is the pace of the utility.

Sharon: I think real time is a good point. If you take a look at the nature of applications in the cloud, they're all very, very dynamic. I'll give you some statistics around it. We monitor around 100,000 virtual instances daily, which represents around 10 percent of Amazon's capacity worldwide. 86 percent of them are started and stopped two times a month, which means it's very dynamic. Now try to do a capacity management exercise in an Excel spreadsheet for servers that start and stop two times or more during the month. It's very, very difficult. And that's one of the reasons why you see so many instances and resources floating out there in the air (actually, the cloud), significantly underutilized.

Accountability is Key

Owen: It's one of those things, you don't appreciate how important it is until you have that bill at the end of the month when you realize you have this legal liability...Now, if I'm an enterprise and anyone can start up a virtual machine or can consume a resource and can make mistakes, which means they're left on and bringing no business value, then this is going to mount up to be a lot of costs in the long run. And it only takes one experience of looking on your AWS or your Google or your Microsoft bill at the end of the month, seeing you've got all these charges on the bill and suddenly realizing that this is a financial liability. You've already consumed those resources. It is now your job, it is your role and your liability to pay them back...Most CIOs, most CFOs realize the importance of understanding and getting involved in their cloud costs.

Sharon: I want to tell you an interesting story from one of our customers....Amazon has a specific pricing model called reserved capacity, so you can buy capacity up front. So the organization reserved capacity for...each one of the business units, but the reserved capacity was allocated separately, and some of the business units went ahead and spun up additional on-demand resources. So what happened is that...by the end of the year, the customer paid twice: once for the capacity they bought and didn't allocate properly, and again for all the on-demand resources that people in the different business units spun up without knowing there was reserved capacity out there...So organizational accountability is an important behavior, or practice, that customers have to implement.
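The double payment in Sharon's story is easy to quantify once you reconcile reservations against actual usage. The sketch below is a minimal, hypothetical version of that reconciliation (all hours and rates are invented, and real AWS billing is more involved):

```python
# Minimal sketch of reserved-vs-on-demand reconciliation.
# Hours and hourly rates are hypothetical illustration values.

def wasted_spend(reserved_hours, used_reserved_hours, on_demand_hours,
                 reserved_rate, on_demand_rate):
    """Cost of idle reservations, plus on-demand hours that those idle
    reservations could have covered (the 'paying twice' effect)."""
    idle_reserved = reserved_hours - used_reserved_hours
    avoidable_on_demand = min(on_demand_hours, idle_reserved)
    return idle_reserved * reserved_rate + avoidable_on_demand * on_demand_rate

# 1,000 reserved hours bought, only 400 used; meanwhile business units
# ran 500 on-demand hours that the idle reservations could have absorbed.
waste = wasted_spend(1000, 400, 500, reserved_rate=0.06, on_demand_rate=0.10)
```

A report like this, broken down per business unit, is essentially what "organizational accountability" means in billing terms: someone owns each reservation and each on-demand hour.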

On Public Cloud Costs

John: It's unacceptable for a cloud provider to not have customer APIs into things like consumption and billing. There will come a point in the very near future...when it becomes a minimum standard for cloud operators to provide that kind of data. Until recently in our world, it was really about AWS, who had a very mature API...vs. the well-entrenched install base of the virtualization platforms like VMware or Xen. Just providing the capability for a customer to compare their utilization across a virtualization environment...and give a view as to what that might look like if it were actually hosted on an AWS-type platform...that's an extremely important data point from the CIO's decision aspect.

Sharon: Specifically when speaking about scaling in and scaling out in public cloud computing, sometimes performance issues are mitigated with additional capacity... So just make sure that when you provision a resource in the public cloud, you take the safe-provisioning approach and not the over-provisioning approach, so that as you scale out more and more resources, you still keep the environment provisioned precisely to your performance needs.

Bernard: So, if you're somebody who drinks one glass of milk a day, you should buy it by the quart, whereas if you have a family with six kids, you should be buying it by the gallon.

On Private Cloud Costs

Bernard: At least in my experience, most internal IT organizations…don't really know what all their costs are. And it's particularly exacerbated by the fact that facilities may pay for the buildings and the electricity, somebody else buys the servers, and a third person handles the network connectivity coming through a telecoms group.

Sharon: I don't think that many people are actually putting effort into it; they are more focused on new workloads they want to introduce, and on deciding whether those workloads are going to run internally (OpenStack, VMware...), externally in public cloud, or sometimes in a hybrid model with resource bursting. At that point in time, they really need to get their arms around consumption and clearly understand the true cost of running their application in the private cloud. And they are using three different variables...based on the ABC model, activity-based costing...So they are doing this exercise, but they are specifically focusing on new workloads that they have to introduce.

John: I can give you a granular cost per unit of consumption for an application every five minutes of every hour of every day. However, if you lie to yourself about the inputs to your internal cost of production, the result is only as good as those inputs...This is why we monitor and watch the organizations that are emerging now to standardize cost methodology. There are at least three organizations that are very active now in building best practices around what large organizations should be thinking about when they're doing their cost allocation methodologies...So I think you'll see best practices emerge that produce standards around the things that you really want to incorporate when you're calculating your internal cost of production.

Owen: It used to be so easy before, when it was just a marketing team or a sales team and they would have their budget allocated. At the end of the month it would be so easy to say, "Oh well, the marketing team spent X and the net revenue they brought in was Y." Simple equation. But now organizations are correctly moving away from this functional level and looking at applications and where they derive value...In fact, when we were surveying all these end users, one of the revealing statistics for me was that 71% of respondents rated non-IT roadblocks as their biggest barriers to cloud adoption: resistance to change, the cost models, people, time, the organization's budget, and regulation and compliance.

On Legacy Systems

Bernard: I was going through a data center with a very large telco, and they were kind of going "this is our new cloud stuff"...so we got onto one floor, and there was a bunch of machines, and I said, "what is all that stuff?" And they said, "Well, nobody really knows, but we don't want to do anything to it because it might break something."

John: It's the expiring nuclear plant approach to IT reduction. Just leave it until it's absolutely necessary to pour cement on it.

On Hybrid Cloud Costs

Question from the audience: I'm running a hybrid deployment. How can I measure my current costs and identify my projected costs as a total?

Sharon: I would go with consumption first. So let's assume we know how to meter and measure costs in the public cloud, and go into the private cloud for a second. Assuming you're in a virtualized environment, like OpenStack or VMware...Our install base indicates that 75% of cloud cost comes from compute, with the rest from storage and network, so I would focus on compute first. I would try to define a cost per flavor in OpenStack, or per virtual instance in VMware, and multiply it by the number of hours it has been running. That will give us a very good indication of the cost in the hybrid cloud. That's where I would start.
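Sharon's first-cut estimate (cost per flavor times hours run) is simple enough to sketch directly. The flavor names and hourly rates below are hypothetical; in practice the internal rate per flavor is the hard part, coming out of an exercise like the activity-based costing she mentions:

```python
# First-cut private-cloud compute cost: cost per flavor x hours running.
# Flavor names and $/hour rates are hypothetical illustration values.

FLAVOR_COST_PER_HOUR = {
    "m1.small": 0.02,   # assumed fully loaded internal rate
    "m1.large": 0.08,
}

def compute_cost(instances):
    """instances: iterable of (flavor, hours_running) pairs."""
    return sum(FLAVOR_COST_PER_HOUR[flavor] * hours
               for flavor, hours in instances)

# One small instance running a full month plus a large one running part-time.
monthly = compute_cost([("m1.small", 720), ("m1.large", 300)])
```

With the same per-unit framing applied to the public-cloud bill, the two sides of a hybrid deployment can be summed into one number, which is exactly the apples-to-apples comparison John describes next.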

John: We view the starting point as having an apples-to-apples equation. Meter your internal environment using our technology and you will have a cost per unit established for your internal cloud...and then do the same thing on the public side. Then you've got cost uniformity across the hybrid environment, and as you collect data you can start to see where the patterns emerge and build a forward-looking analysis for projections.

Owen: I think the real challenge in all of this is the variability and the scalability. If you know how many virtual machines and how much storage you're going to use, and you have that at a fixed level for three years, then actually working out the cloud costs isn't so much of a challenge...I think what's more crucial is working out what your likely scalability demands over the next few years are going to be. And as well as having a likely case, you also need a worst case and a best case.

Predictive Analytics/Predictive Modelling and Risk

Bernard: You've almost sort of implied that you need to use Monte Carlo simulations for this. How much (are companies doing) in terms of forward projection?

Owen: I think this is all really a question of risk. If an organization is fully cut over to the on-demand way of doing things, then it can scale up and down and it can take risks, knowing it has bet very little on taking each risk. But if an organization is more traditional, needs budgets approved in advance, and has to make a huge expenditure before finding out whether it paid off, then that risk is a lot greater...I think that kind of granular risk management is important, but mainly if you're the type of organization that isn't fluid enough to cope by taking small risks, because you're not yet fully utilizing the cloud with the organization built around that consumption.

Sharon: The way we answer questions like "What will my consumption look like?" is to use a baseline of your existing cloud deployment and extrapolate or estimate the projected cost...We get to pretty accurate numbers...I would like to add one more thing to what Owen mentioned on risk. Many companies that are focused on growth will reduce their risk by going to the public cloud...Once they have reached a certain size, they will reconsider that decision and may go back into private cloud and invest in hardware and software. In order to do that, they need to keep their applications portable, in a way that they can move them back and forth between public and private clouds.

Bernard: Sounds to me like you're making a sales pitch for using a Platform-as-a-Service (PaaS) product...So John,...you might say this moves us forward to "you might want to hedge that risk."

John: At the end of the day, this is absolutely about risk mitigation and opportunities to de-risk this for both the buyer and the seller, which is why we did the deal with the Chicago Mercantile Exchange...to create, effectively, a legitimate spot exchange between heterogeneous providers and a plethora of buyers on the outside. The important point about this is really price discovery...the supply side can sell future capacity on volume, and...large-scale buyers or intermediaries can buy on volume and distribute those contracts, and ultimately resell them if they have to, to manage risk, to hedge risk, but most importantly to give the buyer that price-discovery data point: what is the market price for the type of application pattern I'm hosting, internally or externally, to deliver service?

Closing Thoughts

Bernard: If you're interested in that world that Sharon talked about, being able to migrate applications...I would encourage you to download the Stackato micro cloud or 20GB cluster.

Sharon: Invest in cloud but make sure that you always measure the ROI.

Owen: Watch out for 451 Research's Cloud Price Index for a handle on how much cloud costing is changing.

John: You can't have a utility if you don't have a single unit of measurement.



Published at DZone with permission of Mike Kanasoot, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.

