Is your organization moving to a cloud model through a well-thought-out RFP with at least 40 requirements? May I suggest you rethink that approach? The RFP model, with its committee-generated wish list, may work in some situations, or even be required, but in general the IT shops that really differentiate themselves go Agile for the cloud. What does that mean?
In our business unit we have moved development of our cloud automation platform, Intelligent Automation for Cloud, to an Agile methodology and process. This means that when I ask whether a certain feature will be in our 3.1 version, I get an unexpected answer: we won’t know until close to the ship date. Going Agile means we work off a backlog of user stories rather than a hard-and-fast set of features that MUST be in the release. With the right methodologies in place, we can ship at any time.
This approach also works for our customers building their clouds with our software stack. Agile cloud builders maintain a set of cloud user stories they are implementing, and they may release updated cloud functionality every quarter, or even every two to three weeks. When I relayed one customer’s approach to another customer who was considering our solution, I could see a twinkle in his eye as he said: I bet that could really help differentiate the value the IT organization provides. He got that right.
We sell to customers who have RFPs, and to those who look for capabilities, roadmaps, and, more important, an alignment of vision and approach to cloud automation. Many cloud builders look for a vendor that will grow with their agile cloud and that offers an open, extensible model for building new use cases. Why is that of paramount importance? If you think you know what your cloud will need six months from now, good luck. If you bet that your business and technology requirements will change before your next cloud release, you will need an agile cloud-builder methodology.
Back to responding to, SIGH, another RFP.
Tags: Cloud Builder, intelligent automation, orchestration, private cloud
So, goings-on with OpenFlow and the Open Networking Foundation (ONF) are always lively topics for discussion. Since our announcement of Cisco ONE at Cisco Live, a number of folks have asked me whether the announcement of our strategy changes our view of the ONF or the role of OpenFlow. The short answer is, simply, no.
We continue to strongly support the ONF and its efforts related to SDN, and our support has been, and will continue to be, demonstrated in tangible ways. One element of the Cisco ONE announcement is onePK, an enabling technology; one of the things it has enabled is the development of our OpenFlow agents. Similarly, we are introducing controllers and working with our customers to develop the technology.
What seems to surprise a lot of folks is that our contributions to ONF go beyond our own internal development efforts:
Technology Advisory Group – Chartered to provide high-level guidance on any technical issues on which the ONF Board requests feedback.
Hybrid Working Group – Document the requirements for a hybrid programmable forwarding plane (HPFP).
- Chaired by Jan Medved
- Hybrid Use Cases document: co-author Bhushan Kanekar
- Hybrid Switch Architecture – Integrated: co-author Bhushan Kanekar
- Hybrid Switch Architecture – Ships in the Night: co-author Dave Meyer
- Terminology document: co-authors Dave Meyer and Bhushan Kanekar
Beyond these two working groups, Cisco folks including Jan Medved, David Meyer, Josh Littlefield, Andrew Thurber, Alex Clemm, Mark Szczesniak, and Bhushan Kanekar have been active in other working groups, including the Configuration & Management Working Group and the Extensibility Working Group.
Beyond these efforts, David Meyer has been a rock star across the board, including contributions to the “OF futures” discussions, and he recently received an award from the ONF for his contributions.
To net things out, Cisco expects to be a pacesetter in network programmability and SDN, and our efforts with the ONF will continue to be part of that strategy.
Tags: ONF, Open Networking Foundation, OpenFlow, SDN, standards
It seems convergence is accelerating in our daily lives with new apps on our mobile devices, cars with Siri voice control, not to mention the way we consume video. Converged infrastructure is all the rage in IT as well. VCE, the joint venture between Cisco, VMware, and EMC, has a very successful converged infrastructure offering in Vblock. Now customers have another choice, VSPEX, a set of solutions offered through our mutual Cisco and EMC channel partners.
Check out this brief video from Cisco Live in San Diego. Cisco’s Josh Atwell (@Josh_Atwell) speaks with EMC’s Fred Nix (@Nixfred) and Nexus IS’s Colin McNamara (@colinmcnamara) about VSPEX.
Tags: Cisco Live 2012 San Diego, private cloud, reference architecture, vspex
Look around in your IT shop. Do you have a single large printout page denoting the graphic of the IT Enterprise Architecture in your company? Does Zachman ring a bell? Do you have Data, Process and Deployment views documented? Do you have an Enterprise Architect?
If you answered YES to most, if not all, of these, then you had better take a seat, and then throw it all out. Get the biggest shredder you can find, or just light a match to those artifacts. Big IT architecture is dead. Some would say we practitioners never really got there; I agree with that. Management turnover, and turnover again, ITIL deployments, imploding financial systems and reductions in funding, virtualization that sneaked in the back door, cloud that entered through the front door: all of this worked against us in building the perfect system model to live out the decade, let alone the most recent fiscal quarter.
If you answered MAYBE or NO to most of those questions, good for you, but be careful; I will explain why later. Monolithic IT architectures are gone. Do we really have a single version of the truth in that relational database? Probably not. Why is this important? The pace of innovation in deploying IT systems to solve real-life problems at speed and scale has increased. In some ways we are willing to compromise on that desire for five nines of reliability to get business results quickly.
Do you still need a well-thought-out architecture for your deployed systems? Of course! Do you need to design those deployment views for new models of application resiliency, for ecosystems of federated data models, and for the conclusion that even the CIO’s office can’t really control what end users do with technology? Absolutely.
Why is this important to you? No matter what part of IT or the business you are in, make a small, subtle shift in your psyche. Stop trying to control what you cannot. Focus on the end outcomes, and strive to make your piece of the IT process or technology listen to your customers. If you are an architect, go broad, real broad, but focus on the micro-architectures. If you are a technologist, don’t just dwell on the speeds and feeds; live a few days in the life of your users. Manage change through small, impactful steps.
Back to building flexible automation for fast moving architectures.
Tags: intelligent automation, IT architecture
In about two weeks there will be a great webinar panel discussion on the business and technology architecture concerns in automating your cloud, and on how to measure the value. Unleashing automation solutions to do what they do best may make or break a company’s IT strategy over the next few quarters as those cloud journeys begin.
The webinar, IT Automation Unplugged, a panel discussion moderated by Glenn O’Donnell of Forrester, will indeed be a cool event to listen in on. Not only has Glenn followed this space for many years, but he also has some really insightful perspectives on the journey to cloud. This webinar has the potential for some really pointed dialog among Brad Adams of rPath, Nand Mulchandani of ScaleXtreme, Luke Kanies of Puppet Labs, and me. I bet sparks will fly as we trade our perspectives on the huge demand for private and public clouds and the need for enterprises to show value quickly.
This brings me to a great phrase I heard this week from one of our customers, used in the context of their employees using the company’s private cloud: “High Governance.” It was seriously lacking in their current solution, which heavily leveraged their virtualization vendor’s software. When I probed them on what they meant by “High Governance,” it was mostly about ensuring that individuals who provision services get access only to the services, cloud data center locations, and specific providers they are entitled to. While this is not a new concept, what grabbed my attention was that IT shops have a strong need for different sourcing strategies based on end-user role, organization, location, and any number of policy settings in their Active Directory or LDAP.
“High Governance” means ensuring that your cloud users get ONLY what they are entitled to under your IT policy. No more generic UIs for generic users, or uber-UIs for unknown hypothetical users. The cloud is now a strongly governed personal experience. What a novel concept.
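To make the idea concrete, here is a minimal sketch of what entitlement-based catalog filtering could look like. This is purely illustrative, not how any particular product implements it: the classes, attribute names, and policy shapes are all assumptions, standing in for the role, organization, and location attributes a portal might pull from Active Directory or LDAP.

```python
# Hypothetical sketch of "High Governance" catalog filtering: show each user
# only the services, locations, and providers their attributes entitle them to.
# All names and policy shapes here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class User:
    role: str       # e.g. pulled from a directory attribute
    org: str
    location: str

@dataclass
class ServiceOffering:
    name: str
    provider: str           # "private" or "public" sourcing
    locations: set          # data center locations where it can be provisioned
    allowed_roles: set
    allowed_orgs: set = field(default_factory=set)  # empty set = any org

def entitled_catalog(user, catalog):
    """Return only the offerings this user is entitled to provision."""
    visible = []
    for svc in catalog:
        if user.role not in svc.allowed_roles:
            continue  # role not entitled to this service
        if svc.allowed_orgs and user.org not in svc.allowed_orgs:
            continue  # service restricted to other organizations
        if user.location not in svc.locations:
            continue  # not offered in the user's data center location
        visible.append(svc)
    return visible

catalog = [
    ServiceOffering("Dev VM (internal cloud)", "private",
                    {"us-east", "us-west"}, {"developer", "admin"}),
    ServiceOffering("Prod DB cluster", "private",
                    {"us-east"}, {"admin"}, {"ops"}),
    ServiceOffering("Burst capacity (public provider)", "public",
                    {"us-west"}, {"developer"}, {"engineering"}),
]

alice = User(role="developer", org="engineering", location="us-west")
print([s.name for s in entitled_catalog(alice, catalog)])
# → ['Dev VM (internal cloud)', 'Burst capacity (public provider)']
```

The point of the sketch is the sourcing dimension: the same policy check that hides a service can also steer a user toward a private or public provider based on who and where they are, which is exactly the per-user sourcing strategy the customer was describing.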
I wonder what the panel will think about this. Please attend if you get a chance.
Tags: automated provisioning, Cisco Intelligent Automation for Cloud, Governance, intelligent automation, orchestration