Cisco Blogs

How do you measure your Cloud Computing strategy?

January 27, 2011 - 2 Comments

In sports, when looking at the record books we often find that most of the records are held by current players. In most cases this gets attributed to the fact that the games change over time and certain elements are emphasized over others. Maybe it’s passing over running in football, home runs instead of stolen bases in baseball, or dunks over jump shots in basketball. Whatever it is, we eventually accept that the new approach to the game is going to lead us to view measurements differently.

For some reason, that same mentality doesn’t seem to apply to changes in how we leverage IT technologies to drive our businesses. We now live in a world where change happens at 2x, 3x and sometimes 5x the pace it did in the past, yet we’re still using measurements centered in a world where IT is primarily focused on keeping operations running and keeping costs down. Measurements like Return on Investment (ROI) and Total Cost of Ownership (TCO) focus primarily on the upfront costs of equipment, or on a level of labor that is often ignored after the initial ROI calculations are complete. But what happens when a new system can take on orders of magnitude more work than previous systems, as often happens with server virtualization projects, so that IT handles more capacity? Do ROI and TCO really account for that increased productivity properly? Are measurements centered on costs really the right way to motivate an IT organization to move from a maintenance-centric approach to an innovation-centric one? Unfortunately, this happens all too often when IT is viewed as an enforcer of the bottom line instead of an innovation engine for the top line.

Let me give an example to highlight this concept of IT-the-enforcer vs. IT-the-innovator, and how the two don’t always have to be mutually exclusive. Most of you have heard of Netflix, and I assume some of you now use their streaming service more than having DVDs delivered to the house. Netflix considers technology a core element of their business, but innovation is an even more important one (for good insight, follow @adrianco on Twitter). By using a cloud computing model to change their delivery method (DVDs in the mail vs. movies streamed over the Internet), they created an interesting paradigm within their income statement and P&L. Not only did they blow away their recent financial numbers, but they spent 20x less on delivery charges (bandwidth) than in the past (postage). Innovative thinking and execution from their technology teams was key to driving this significant change in their business.

So just as we’ve seen with the numbers in sports record books, it’s time to start thinking about how these radically new ways of leveraging IT, such as cloud computing, should be measured to encourage innovative thinking and account for the new dynamics of these environments.

The first measurement I’d suggest is Total Pace of Innovation (TPI). How do we measure the speed at which we’re able to get from great idea to market delivery? How are we leveraging IT to help us foster those great new ideas? How much does IT technology contribute to this pace, and could that contribution be increased to improve it?

The second measurement builds on something that James Staten (Forrester Research) recently mentioned in his predictions for 2011. In the context of Private Cloud computing, he talks about the need to fail fast and fail quietly. I’ve recently written about a similar concept as we look at how we measure the success or failure of IT project portfolios. Knowing that learning comes from mistakes, and learning will enable us to advance along the learning curve more quickly, how do we encourage a culture that measures Faster Time to Failure (FTF)?

The third measurement comes out of the rapid pace at which “Consumerization” is making its way into day-to-day business operations. This is where pace of change is critical and users make a conscious trade-off between speed and “hiccups” in service. Google became well known for popular services being “in beta” for long periods of time, but that didn’t stop them from attracting millions of users. A company mindset that allows Always Beta Enablement (ABE) to permeate certain areas of IT service delivery will be critical to combining rapid deployment with rapid iteration cycles driven by community-based feedback. Maybe this is 10-15% of projects, with a goal of increasing it to 25% within a year. Maybe it’s all new projects that touch your best partners, the ones willing to share the risk and reward with your business.

Cloud Computing is a different way of delivering IT services to business users, partners and customers. Not only is it rapidly changing the technologies being used, but it will ultimately require a new way of conceptualizing projects and measuring their successes or fast failures. Change is never easy, and it is often slowed by restrictions rooted in the past. But as Peter Drucker famously said, “If it can’t be measured, it can’t be managed.” So how are you measuring your cloud computing strategy, and how are you bringing in new metrics to encourage the right kinds of change?

Please visit Cisco’s Cloud Homepage.



  1. @paolo – I understand that ROI and TCO aren’t going away anytime soon, especially since they tie into larger financial metrics. But I’m curious about how you measure some of those elements I mention above that are more focused on taking risk, accelerating the pace of change and getting people to start using new approaches to driving the business. Do you have a way to do that today within your organization?

  2. Brian,
    Smart post, but we are still using the ROI and TCO concepts because, in the end, the purchase decisions are made by financial people and not by the CIO.