At Cisco Live, I had the good fortune to sit down with Steve Kaplan of Presidio (@ROIdude) and Sreekanth Kannan of VMware to discuss the current landscape of desktop virtualization as seen through the experiences of our customers, key enabling technologies we’re excited about, and some thoughts about what to expect looking forward. Watch the session:
For the most part, my last post was concerned with what Cisco ONE is; this post explores a little more of the why. I am going to assume you read my last post, so let’s dig in. One of the fundamental concepts behind ONE is illustrated below: the idea of exposing the network in a highly granular way, emphasizing not only the ability to exert programmatic control over switch behavior, but also the ability of the network to present interesting and useful information back up to the applications.
Great news for SAP users! Cisco’s industry-leading IT process automation software, based upon Cisco Process Orchestrator and our knowledge-based automation solution, will be sold by SAP.
This solution addresses market demand for IT process automation, helping IT staff standardize and unify operational processes across the enterprise to support maximum uptime and optimal resource usage. This is a big deal for SAP customers, who will now get the advantages of this automation to achieve their business goals. We have well over 300 out-of-the-box automation workflows that drive operational excellence. Imagine being the person who used to do all of this manually. Sign me up for:
automated health checks
unified incident response
predefined corrective actions
the ability to create reusable workflows that capture your best practices through a drag-and-drop editor
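To make the pattern behind that list concrete, here is a minimal Python sketch of a reusable workflow that runs an automated health check and applies a predefined corrective action on failure. This is purely illustrative: the class and function names are hypothetical and are not the Cisco Process Orchestrator API, whose workflows are built graphically in its drag-and-drop editor.

```python
# Illustrative sketch only -- NOT the Cisco Process Orchestrator API.
# Shows the general shape of a reusable "health check + predefined
# corrective action" workflow, as described in the list above.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Workflow:
    """A reusable workflow: check health, remediate on failure, recheck."""
    name: str
    health_check: Callable[[], bool]       # returns True when healthy
    corrective_action: Callable[[], None]  # predefined remediation step
    log: List[str] = field(default_factory=list)

    def run(self) -> bool:
        if self.health_check():
            self.log.append(f"{self.name}: healthy")
            return True
        self.log.append(f"{self.name}: unhealthy, applying corrective action")
        self.corrective_action()
        healthy = self.health_check()
        self.log.append(f"{self.name}: {'recovered' if healthy else 'escalate to operator'}")
        return healthy


# Toy example: a "service" that is down until a restart is applied.
state = {"up": False}
wf = Workflow(
    name="example-service",                       # hypothetical target name
    health_check=lambda: state["up"],
    corrective_action=lambda: state.update(up=True),
)
result = wf.run()  # check fails, remediation runs, recheck succeeds
```

Because the checks and actions are plain callables, the same `Workflow` object can be reused against different targets, which is the point of packaging best practices as reusable workflows.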
In fact, the core software that drives this is the same software that orchestrates our Intelligent Automation for Cloud solution. It is very cool that we can solve such different problems as SAP IT process automation and cloud orchestration (provisioning of resources for physical and virtual servers) with the same software product.
This automation product will be co-marketed and sold with the SAP Landscape Virtualization Management (LVM) solution as part of SAP’s virtualization and cloud solution offering.
I want to extend a shout-out to Eric Robertson, our Product Manager for our SAP solutions. Stay tuned for even more exciting solutions based upon Cisco Intelligent Automation.
Last month, James Sharp wrote about Cisco Advanced Services and how, together with partners, Cisco can offer Canadian businesses the services they need, when they need them.
James goes on to describe specific ways advanced services can help, depending on which stage of the lifecycle you are currently in. This makes me think of food, and how meals throughout the day keep me running efficiently and effectively, just like the services Cisco offers for your data center during the different stages of the lifecycle process.
Didn’t your parents always tell you that breakfast is the most important meal of the day? Without proper planning, you will certainly be hard-pressed to begin building. Planning services assess your current data center and provide in-depth recommendations for building and managing it.
Lunch involves design and implementation. No, there’s no nap after.
If you don’t have a hearty dinner, you may not be at the top of your game the next day. Managing is all about being proactive: anticipating issues before they arise and optimizing performance.
And as for dessert? Read what James has to say about his personal experiences with advanced services and Cisco customers:
Last week we participated in the annual Hadoop Summit, held in San Jose, CA. When we first met with Hortonworks about the Summit many months back, they mentioned that this year’s Hadoop Summit would be promoting reference architectures from many companies in the Hadoop ecosystem. This was great to hear: we had previously presented results from a large round of testing on network and compute considerations for Hadoop at Hadoop World 2011 last November, and we were looking to do a second round of testing to take our original findings and develop a set of best practices around them, including failure scenarios and connectivity options. Further, this validation demystifies one key enterprise ask: “Can we use the same architecture and components for Hadoop deployments?” Since much of the value of Hadoop is realized once it is integrated into current enterprise data models, the goal of the testing was not only to define a reference architecture, but also to define a set of best practices so Hadoop can be integrated into current enterprise architectures.
Below are the results of this new testing effort, presented at Hadoop Summit 2012. Thanks to Hortonworks for their collaboration throughout the testing.