Think about it: when was the last time the business said “thank you” to IT? It’s probably been a while. Unfortunately, all too often we hear complaints that IT is too slow, or that IT is the department of “no”.
Deploying a private cloud is one way to help turn IT into the department of “yes”, with faster and more responsive IT service delivery. The customers of Cisco Intelligent Automation for Cloud have compressed the cycle time for IT provisioning from weeks to minutes. That means that project managers and application developers no longer have to wait for IT – they can speed up their projects and get business applications up and running more quickly.
And if there’s one golden rule to remember for your private cloud solution, it’s that the business wants apps. They’ll be thankful if you can provision and manage their applications in a cloud environment with consistency, reliability and speed.
So if you’re interested in on-demand application delivery for your private cloud, check out this presentation from Cisco Intelligent Automation and our ecosystem partner rPath:
Cloud Expo was indeed a very interesting juxtaposition of people espousing the value of cloud and how their stuff is really cloudy. One group of presenters and expo-floor booths talked about their open API and how that is the future of cloud; the other camp told us how their special mix of functions is so much better than that. All of this makes for a very interesting dialog. APIs are indeed very important: if your technology is truly a cloud operating model, then you must have an API. Solutions like Cisco’s Intelligent Automation for Cloud rely on those APIs to orchestrate cloud services. But APIs are not the be-all and end-all. The reality is that while the cloud discussions tend to center on the API and the model behind it, the real change enabling the move to cloud is the operating model of the users who are leveraging the cloud as a completely fresh game plan for their businesses.
James Urquhart’s recent blog post (http://gigaom.com/cloud/what-cloud-boils-down-to-for-the-enterprise-2/) highlights that the real change for users of the cloud is modifying how they do development, test, capacity management, production operations, and disaster recovery. My last blog talked about the world before cloud management and automation, and about the move from the old-world model to the new dev/test and dev/ops models that force application architects, developers, and QA folks to radically alter how they work. Those that adopt the cloud without changing their “software factory” from a model Henry Ford would recognize may not get the value they are looking for out of the cloud.
At Cloud Expo I saw a lot of very interesting software packages. Some of them went really deep into a specific use-case area, while others covered a lot of functional use cases but only about an inch deep. As product teams build out software packages for commercial use, they face a critical decision point that will drive the value proposition of the product. It seems to me that within two years, just about all entrants in the cloud management and automation marathon will begin to converge on a simple, focused, yet broad set of use cases. Each competitor will either drive their product to that point directly, or be forced to that spot by the practical reality of customers voting with their wallets. Interestingly enough, this whole process drives competition and will yield great value for the VPs of Operations and VPs of Applications at companies moving their applications to the cloud.
Early in my career I moved quite a bit: new job, growing family, whatever the reason, it seemed like every two or three years we were packing up, going to a new place, and meeting our new neighbors.
Each new place had its own protocol for getting to know the neighbors; sometimes they came to us, other times we had to walk around the block with the kids in tow to make that connection. The benefits of knowing your neighbors are many: who’ll lend you tools, who will help move furniture, and so on.
Knowing the device neighbors in your network is just as important, and fortunately there is a protocol for that: Cisco Discovery Protocol. This article is a guide to getting to know your UCS Fabric Interconnects’ neighbors in both a manual and an automated way.
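The manual way is a quick CLI check: `show cdp neighbors` from the Fabric Interconnect’s NX-OS shell lists each neighbor’s device ID, local interface, platform, and remote port. For the automated way, here is a minimal sketch that parses that command’s tabular output into records you can feed to inventory or monitoring tools. The sample output below is illustrative only — real output varies by platform and software version:

```python
# Illustrative `show cdp neighbors` output (hypothetical device names;
# real column spacing varies by platform and version).
SAMPLE = """\
Device-ID          Local Intrfce  Hldtme  Capability  Platform      Port ID
core-sw-01         Eth1/1         155     R S I       N5K-C5548UP   Eth1/17
core-sw-02         Eth1/2         142     R S I       N5K-C5548UP   Eth1/17
"""

def parse_cdp_neighbors(text):
    """Parse whitespace-separated CDP neighbor rows into dictionaries."""
    lines = [line for line in text.splitlines() if line.strip()]
    neighbors = []
    for line in lines[1:]:  # skip the header row
        parts = line.split()
        # Capability flags (e.g. "R S I") occupy the middle fields;
        # platform and port ID are always the last two.
        neighbors.append({
            "device_id": parts[0],
            "local_iface": parts[1],
            "holdtime": int(parts[2]),
            "capability": " ".join(parts[3:-2]),
            "platform": parts[-2],
            "port_id": parts[-1],
        })
    return neighbors

for n in parse_cdp_neighbors(SAMPLE):
    print(f"{n['local_iface']} -> {n['device_id']} ({n['platform']} {n['port_id']})")
```

In practice you would collect this output over SSH (for example with a library such as netmiko) rather than from a hard-coded string; the parsing step stays the same.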
Earlier in my career, I ran a corporate IT and managed-services tooling team. I wish it had been garage-type tools, but it was IT operational management tools. My team was responsible for developing and integrating a set of roughly 20 applications that were the “IT for the IT guys”. It was a great training ground for the 120 of us; we worked on the bleeding edge and we loved it. We did everything from product management, development, test, and quality engineering to deployment, production, and operational support. It was indeed an example of eating your own cooking. Applications were king in our group. We had .NET, J2EE, Java, C, and C++, among other languages, and both custom-built and COTS (commercial off-the-shelf) software applications.
On a fateful Friday night, with my teenagers happily asleep way past midnight (I guess that made it Saturday), I was biting my nails at 2 AM on a concall with my management and technical team, wondering what went wrong. We were 5 hours into a major yearly upgrade and Murphy was my co-pilot that night. I had DBAs, architects, Tomcat experts, QA, load-testing gurus, infrastructure jockeys, and everyone else on the phone. We had deployed 10 new servers that night and were simultaneously upgrading the software stack. I think we had 7 time zones covered on our concall; at least for my compatriots in France it was not too bad, since they were having their morning coffee. Our composite application was taking 12 seconds to process transactions; it should have taken no more than 1.5 seconds. The big question: could we fix this by Sunday at 10 PM, when our user base in EMEA showed up for work, or would we (don’t say this to the management) roll back the systems and application? I ran out of nails at this point. My wife came into my dark home office and wondered what the heck was going on.
Recently, a customer asked me what the value of using automation to operate a private cloud is. It was a good question. Working in the middle of the reality distortion field of the cloud industry, I take it for granted that everyone knows automation’s benefits.
Fundamentally, automation tools help reduce labor costs, rationalize consumption, and increase utilization.
Costs are lower because the labor required to configure and deploy is eliminated. This automation is made possible by creating standard infrastructure offerings. Standard infrastructure offerings enable a new operational model: moving from the artisanal approach to delivering infrastructure, where every system and configuration is unique, to the industrialized approach, which ensures repeatability, quality, and agility. It’s the difference between custom tailoring and standardized sizes at The Gap. Both have their place, but one costs more.