Think about it: when was the last time the business said “thank you” to IT? It’s probably been a while. Unfortunately, all too often we hear complaints that IT is too slow, or that IT is the department of “no”.
Deploying a private cloud is one way to help turn IT into the department of “yes”, with faster and more responsive IT service delivery. The customers of Cisco Intelligent Automation for Cloud have compressed the cycle time for IT provisioning from weeks to minutes. That means that project managers and application developers no longer have to wait for IT – they can speed up their projects and get business applications up and running more quickly.
And if there’s one golden rule to remember for your private cloud solution, it’s that the business wants apps. They’ll be thankful if you can provision and manage their applications in a cloud environment with consistency, reliability and speed.
So if you’re interested in on-demand application delivery for your private cloud, check out this presentation from Cisco Intelligent Automation and our ecosystem partner rPath:
The results are in for the first product Competition organized by Virtualization Security Group Russia, a non-profit association of experts, and its portal VirtualizationSecurityGroup.ru. Products for protecting virtual infrastructures from Cisco, HP, Trend Micro, Security Code, McAfee, IBM, and Symantec took part in the Competition; products from OKB SAPR and Stonesoft were also presented outside the Competition.
In the “Readers’ Choice” category, the winner was the Cisco Virtual Security Gateway product, which received the most votes in the open poll on the VirtualizationSecurityGroup.ru portal.
Customers and experts in Russia are realizing VSG’s strong capabilities: it not only protects virtual workloads and easily accounts for application mobility, but also lets you build sophisticated firewall policies based on the attributes of the virtual machine. This allows organizations to create trust zones for classes of applications (by tenant name/owner, by user group, for virtual desktops, by OS, etc.) that align with real-world policy requirements. Congratulations to VSG and to Cisco’s Server Access Virtualization Business Unit!
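To make the attribute-based idea concrete, here is a minimal Python sketch of how rules that match on VM attributes such as tenant or OS follow the workload rather than its IP address. The policy structure and function names are invented for illustration; this is not VSG’s actual policy syntax or API.

```python
# Illustrative sketch (not VSG configuration syntax): attribute-based firewall
# rules match on VM metadata, so a policy still applies after the VM moves to
# a different host or picks up a new IP address.

def matches(policy, vm_attrs):
    """A policy matches when every attribute it names has the required value."""
    return all(vm_attrs.get(key) == value for key, value in policy["match"].items())

def evaluate(policies, vm_attrs):
    """Return the action of the first matching policy, defaulting to deny."""
    for policy in policies:
        if matches(policy, vm_attrs):
            return policy["action"]
    return "deny"

# Hypothetical trust zones: one per tenant/OS class, one per user group.
policies = [
    {"match": {"tenant": "finance", "os": "Windows"}, "action": "permit"},
    {"match": {"user_group": "contractors"}, "action": "deny"},
]

# A finance Windows VM is permitted regardless of which host it lands on.
print(evaluate(policies, {"tenant": "finance", "os": "Windows", "host": "esx-07"}))
```

The point of the sketch is the contrast with address-based rules: the host attribute plays no role in the match, so vMotion-style mobility never invalidates the policy.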
I finally took a leap of faith and had LASIK surgery done recently, and without a doubt it’s been a life-changing decision. The daily hassle of glasses and contacts is gone, and my vision is now 20/15…it’s like going from regular TV to HiDef! Of course these benefits came with a cost, requiring investments both financial and mental. The financial cost was easy enough thanks to no-interest payments; the mental cost, however, required a careful weighing of risk vs. reward and a bit of blind faith (no pun intended). In the end, trust in the technology and the doctor, and the belief that I could find my happy place for 15 minutes to endure the procedure, was enough to take the leap. Looking back, it was one of my better life decisions.
Shortly after my procedure I was on site at a customer who was implementing a Vblock, and Cisco was engaged for UCS optimization services to follow up the install. For those new to integrated infrastructure solutions, a Vblock is a pre-integrated and tested infrastructure stack with various components across compute, network, and storage. My favorite component, hands down, is the Cisco Nexus 1000V. This product replaces the VMware vSwitch functionality with a feature-rich Cisco switch powered by NX-OS, which this particular customer had no knowledge of. Well, I’m a huge fan of the product, and I knew they would be too once they came to understand its use cases and capabilities. I gave their network and server admins a four-hour overview covering everything from architecture to troubleshooting. The light bulbs went on and they were exchanging smiles about 10 minutes into the presentation when I started talking about the non-disruptive operational model and VN-Link concepts. One of the network admins interrupted me and said, “Are you telling me I can get clear vision to the VM level without the hassle of dealing with these guys?” as he pointed at the closest server admin. I immediately thought of my new eyes and chuckled at the thought that server admins apparently were as annoying as glasses or contacts to deal with on a daily basis.
Cloud Expo was a very interesting juxtaposition of people espousing the value of cloud and how their stuff is really cloudy. You have a group of presenters and expo-floor booths talking about their open API and how that is the future of cloud. Then you have the other camp telling us how their special mix of functions is so much better than that. All of this is a very interesting dialog. APIs are important: if your technology truly follows a cloud operating model, then you must have an API. Solutions like Cisco’s Intelligent Automation for Cloud rely on those APIs to orchestrate cloud services. But APIs are not the end-all. The reality is that while cloud discussions tend to center on the API and the model behind that API, the real change enabling the move toward cloud is the operating model of the users, who are leveraging the cloud for a completely fresh game plan for their businesses.
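As a rough illustration of why orchestration depends on APIs, the sketch below composes a provisioning workflow by calling a stubbed platform API in sequence. The endpoint names and payloads are hypothetical, invented for this example; they are not Cisco Intelligent Automation for Cloud’s actual interface.

```python
# Hypothetical sketch: an orchestrator turns one service request into an
# ordered series of API calls against compute, network, and storage platforms.

class FakeCloudAPI:
    """Stand-in for a platform API client; a real orchestrator would issue REST calls."""
    def __init__(self):
        self.log = []  # record of (endpoint, payload) calls, in order

    def post(self, endpoint, payload):
        self.log.append((endpoint, payload))
        # Pretend the platform created a resource and returned its ID.
        return {"status": "ok", "id": f"res-{len(self.log)}"}

def provision_service(api, name):
    """Compose the steps: create the VM first, then wire up network and storage."""
    vm = api.post("/compute/vms", {"name": name})
    net = api.post("/network/ports", {"vm": vm["id"]})
    disk = api.post("/storage/volumes", {"vm": vm["id"], "size_gb": 50})
    return [vm["id"], net["id"], disk["id"]]

api = FakeCloudAPI()
print(provision_service(api, "app01"))  # → ['res-1', 'res-2', 'res-3']
```

Without a programmable API at each layer, this composition has to fall back to tickets and manual steps, which is exactly the weeks-to-minutes gap the surrounding posts describe.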
James Urquhart’s recent blog, http://gigaom.com/cloud/what-cloud-boils-down-to-for-the-enterprise-2/, highlights that the real change for users of the cloud is modifying how they do development, test, capacity management, production operations, and disaster recovery. My last blog talked about the world before cloud management and automation, and the move from the old-world model to the new models of dev/test or dev/ops that force application architects, developers, and QA folks to radically alter their approach. Those that adopt the cloud without changing their “software factory” from one that Henry Ford would recognize to the new models may not get the value they are looking for out of the cloud.
At Cloud Expo I saw a lot of very interesting software packages. Some went really deep into a specific use-case area, while others covered a lot of functional use cases but only about an inch deep. As product teams build out software packages for commercial use, they face a very interesting and critical decision point that will drive the value proposition of the product. It seems to me that within two years, just about all entrants in the cloud management and automation marathon will begin to converge on a simple, focused yet broad set of use cases. Each competitor will either drive their product to that point directly, or be forced to that spot by the practical reality of customers voting with their wallets. Interestingly enough, this whole process drives competition and will yield great value for the VPs of Operations and Applications at companies moving their applications to the cloud.
Big Data’s move into the enterprise has generated a lot of buzz: why big data, what are the components, and how do you integrate it? The “why” was covered in a two-part blog (Part 1 | Part 2) by Sean McKeown last week. To help answer the remaining questions, I presented Hadoop Network and Architecture Considerations last week at the sold-out Hadoop World event in New York. The goal was to examine what it takes to integrate Hadoop into enterprise architectures by demystifying what happens on the network and identifying key network characteristics that affect Hadoop clusters.
The presentation includes results from an in-depth testing effort to examine what Hadoop means to the network. We went through many rounds of testing that spanned several months (special thanks to Cloudera for their guidance).