Let’s talk about Cisco Process Orchestrator. We recently released version 3.0, and it introduces a lot of exciting features to our IT Process Automation (ITPA) / Run Book Automation (RBA) software.
Cisco Process Orchestrator is the foundational engine on which Cisco has built a number of data center-, application-, and network-focused automation solutions. These include our cloud management solution (Cisco Intelligent Automation for Cloud), which embeds this orchestration engine for cloud service automation and helps organizations deploy private, public, or hybrid clouds; an SAP-focused solution resold by SAP (SAP IT Process Automation by Cisco), which lowers TCO of your SAP applications and databases by integrating events and alert management data with incident response information; and a network troubleshooting and triage solution, which helps customers manage repetitive tasks and aids remediation of common issues with network operations.
In this post, I’m going to feature a few of the major highlights of the new version 3.0 release:
- A new service-oriented methodology for service encapsulation: design workflows that match the services ordered by the business units
- Flexible automation packs and solution accelerators: build, version, and re-purpose content to drive solutions
- Service automation integrated with Cisco Prime Service Catalog: optimize the end-to-end service delivery process
Building ITPA and RBA workflows has never been easier. With Cisco Process Orchestrator’s service-oriented orchestration, you can move away from traditional static, script-based run-book automation and IT process-level automation. We have built a modeling platform where automation aligns with the highest-level services, allowing you to model an IT service the way that service is actually delivered.
A shift from static workflow design to dynamic, service-oriented design
In this “top down” approach, designing the services and their desired state is the initial step in automation design. The next step is defining the process actions for these services and then implementing the specific process workflows that traverse traditional IT boundaries to act on and automate the necessary elements to deliver the service.
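The "top down" steps above can be sketched in a few lines of code. This is only an illustration of the methodology, not Process Orchestrator's actual API; every class and method name here is invented for the example.

```python
# A minimal sketch of the "top down" approach: model the service and its
# desired state first (step 1), bind process actions to it (step 2), then
# implement the workflows that deliver it (step 3). All names here are
# illustrative -- this is not Process Orchestrator's real object model.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Service:
    name: str
    desired_state: Dict[str, str]  # step 1: the service and its target state
    actions: Dict[str, Callable[[], str]] = field(default_factory=dict)

    def define_action(self, verb: str, workflow: Callable[[], str]) -> None:
        """Step 2: attach a process action (e.g. 'provision') to a workflow."""
        self.actions[verb] = workflow

    def invoke(self, verb: str) -> str:
        """Step 3: run the workflow that delivers the service."""
        return self.actions[verb]()


def provision_workflow() -> str:
    # The concrete steps that traverse traditional IT boundaries.
    steps: List[str] = ["allocate-vm", "configure-network", "install-app"]
    return " -> ".join(steps)


db_service = Service("database", {"status": "running", "tier": "gold"})
db_service.define_action("provision", provision_workflow)
print(db_service.invoke("provision"))  # allocate-vm -> configure-network -> install-app
```

The point of the sketch is the ordering: the service and its desired state exist before any workflow is written, so the workflows stay replaceable behind the service definition.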
Tags: Cisco Intelligent Automation for Cloud, Cisco Network Operations Automation Service, Cisco Unified Management, cloud, Cloud Management, data center, intelligent automation, network automation, process automation, run book automation, SAP IT Process Automation, Service Orchestration, unified management
At Cisco Live! Milan, I talked with people from all types of organizations around the world and heard their excitement about how they envision the cloud changing the way they do business. Their stories are incredible and inspiring. They range from a small startup hoping to expand its business without having to create its own IT department, all the way to global companies looking for ways to deliver new services faster in a more secure and cost-effective manner.
No matter the size of their organization, they are all looking for similar things. Assured performance is top-of-list for most. Faster time to market is another important factor. Many are turning to the cloud for a competitive edge that lets them take advantage of continuous innovation without having to reinvest in a completely new data center. And most are looking forward to the cost savings of not having to manage their own IT infrastructure. However, what’s often not talked about is that behind every cloud service is a network that is responsible for the performance of your data and applications. And the truth is, not all cloud services are created equal.
Tags: Cisco Powered, cloud, cloud providers, partners
Recently I wrote about a few real-life examples from IDC Manufacturing Insights’ 2014 Worldwide Manufacturing report and its top 10 predictions. I want to continue with some more examples to illustrate what is happening now, and hopefully help you see real-life changes that are already taking place.
To follow up with IDC’s Prediction #4: supply chain technology investment will involve modernizing existing systems while also trying new approaches. Many systems already exist, but they sit in silos, and it is difficult to talk across them. If you have multiple vendors on multiple platforms, it is difficult to get to the information, let alone make sense of it. In fact, most of our customers use different systems within their environment, so even within a first-level silo (company A) it is difficult to see what information exists, let alone start to analyze it. When you get to a second-level silo (company A to company B information flows), you are looking at silos within silos trying to talk with other silos.
I do not see these systems being ‘ripped and replaced’ but augmented with a layer above them that builds visibility and correlation across the systems, which can then be tracked across companies. We have been working with one of our business partners, HCL, to build a cloud offering where we can quickly install a platform, extract data from your existing systems, and start to add value to your multiple-location operations and diverse portfolio.
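A toy illustration of what such an overlay layer does at its simplest: rename each silo's fields into one shared schema so records can finally be compared across systems. The systems, field names, and figures below are invented for the example.

```python
# Toy sketch of an overlay above siloed systems: map each silo's local
# field names onto a shared schema, then correlate across silos.
# All schemas and values here are invented for illustration.

def normalize(record: dict, mapping: dict) -> dict:
    """Rename a silo's local fields into the shared schema."""
    return {shared: record[local] for shared, local in mapping.items()}


# Company A's ERP and company B's warehouse system describe the same
# purchase order with different field names.
erp_record = {"po_num": "PO-1001", "qty_ordered": 500}
wms_record = {"purchase_order": "PO-1001", "units_received": 480}

unified = {
    "company_a": normalize(erp_record, {"order_id": "po_num", "quantity": "qty_ordered"}),
    "company_b": normalize(wms_record, {"order_id": "purchase_order", "quantity": "units_received"}),
}

# Once the schemas align, cross-silo correlation is a simple comparison.
shortfall = unified["company_a"]["quantity"] - unified["company_b"]["quantity"]
print(shortfall)  # 20
```

The real work in such a platform is of course the extraction and mapping at scale, but the principle is the same: the silos stay in place, and the layer above them makes their data comparable.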
The fifth IDC prediction was around the modernization of the B2B commerce backbone. I have seen this happening with many of our customers and business partners, who are taking the information they already have on hand and using it in new ways. Look at this article on Amazon and anticipatory shipping. Mining the data they already have is going to change how and what we order.
We are seeing this same use of analytics in the manufacturing market: not to predict what you are going to order, but when something may fail. Companies are taking sensor data, now much easier to access and track with new products and offerings, and driving it into an analytics engine. Using analytics, the ‘normal ranges’ become known, so any abnormalities the sensors report stand out. This helps you understand what is happening and where, and then start to predict where items may fail.
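The analytics step described above can be sketched in a few lines: learn a sensor's "normal range" from healthy baseline readings, then flag anything outside it. The thresholds and readings below are invented for illustration; a production engine would use far more sophisticated models.

```python
# Minimal sketch of "normal range" analytics on sensor data:
# learn the range from baseline readings, then flag abnormalities
# as candidates for failure prediction. Data is invented.
from statistics import mean, stdev
from typing import List, Tuple


def normal_range(baseline: List[float], k: float = 3.0) -> Tuple[float, float]:
    """Normal range as mean +/- k standard deviations of healthy readings."""
    mu, sigma = mean(baseline), stdev(baseline)
    return mu - k * sigma, mu + k * sigma


def abnormal(readings: List[float], lo: float, hi: float) -> List[float]:
    """Readings outside the learned range."""
    return [r for r in readings if not lo <= r <= hi]


# Temperature readings from a sensor during known-healthy operation.
baseline_temps = [70.1, 70.4, 69.8, 70.2, 70.0, 69.9, 70.3]
lo, hi = normal_range(baseline_temps)
print(abnormal([70.1, 70.2, 75.6, 69.9], lo, hi))  # [75.6]
```

The value comes from doing this continuously across many sensors and many sites, which is exactly the aggregation the next paragraph describes.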
A few of our customers are taking this information from their own customers, aggregating it across different locations, adding environmental sensor data, and feeding the resulting predictions back to those customers. Interestingly, some of our customers and partners see this as a service offering that enables better information and comparisons to stop failures and drive toward the 99.999% uptime every company would like to have.
How are you wrestling with modernizing or revamping your supply chain? Are you adopting analytics in your sales or manufacturing processes? Let us know. Thanks for reading!
Tags: B2B, cloud, Manufacturing, supply chain
Technology is changing the world at an ever-increasing rate. The exponential jumps in processing power, mobile technologies, storage, and connection speeds are enabling a whole new suite of possibilities. We are beginning to see new connected experiences that allow us to capture and document our lives in countless ways. Our memories and experiences can be digitally chronicled and preserved, perhaps forever. Where and how do we store our ever-expanding archive of personal history? How will we be able to find, share, or extract what we need, when we need it?
(Fun Fact: The volume of such data already being saved to the cloud is staggering: in Dropbox alone, one billion files are uploaded every 24 hours!)
Enter the era of the personal cloud. According to Gartner’s IT predictions for 2014, personal cloud will solidify the shift from devices to services. So it won’t matter which device captures information or images; your personal cloud will be the hub that centralizes everything. You’ll have almost unlimited potential in recording and archiving your life, with easier and faster access. While this notion of “lifestreaming” – i.e., creating a digital diary and archive of one’s life and activities – has been around for years, social media aggregators like Facebook have only begun to address the scope of what it can be.
Tags: Big Data, Cisco cloud, cloud
As information consumers who depend so much on the network or cloud, we sometimes indulge in imagining what will happen when we really begin to feel the combined effects of Moore’s Law and Nielsen’s Law at the edges: the amount of data, and our ability to consume it (let alone stream it to the edge), is simply too much for our minds to process. We have already begun to experience this today: how much information can you consume on a daily basis from the collective of your so-called “smart” devices, your social networks, and other networked services, and how much more data is left behind? The same holds machine to machine: a jet engine produces terabytes of data about its performance in just a few minutes; it would be impossible to send that data to some remote computer or network and still act on the engine locally in time. We already know Big Data is not just growing, it is exploding!
The conclusion is simple: one day we will no longer be able to cope unless the information is consumed differently, locally. Our brains may no longer be enough; we hope to get help. Artificial intelligence comes to the rescue and M2M takes off, but the new system must be highly decentralized in order to stay robust, or else it will crash like some kind of dystopian event from H2G2. Is it any wonder that even today a large portion, if not the majority, of the world’s Internet traffic is already P2P, and the majority of the world’s software is downloaded over open source P2P? Just think of Bitcoin and how it captures the imagination of the best or bravest developers and investors (and how ridiculous one of those categories could be, not realizing its potential current flaw, to the supreme delight of its developers, who will undoubtedly develop the fix; but that is the subject of another blog).
Consequently, centralized, high-bandwidth compute will break down at the bleeding edge; the cloud as we know it won’t scale, and a new form of computing emerges: fog computing, a direct consequence of Moore’s and Nielsen’s Laws combined. Fighting this trend equates to fighting the laws of physics; I don’t think I can say it more simply than that.
Thus the compute model has already begun to shift: we want our Big Data analyzed, visualized, private, secure, and ready when we are, and we are finally beginning to realize how vital it has become. Can you live without your network, data, connection, friends, or social network for more than a few minutes? Hours? Days? And when you rejoin it, how does it feel? And if you can’t, are you convinced that one day you must be in control of your own persona and your personal data, or else? Granted, while we shouldn’t worry too much about a Blade Runner dystopia or the H2G2 Krikkit story in Life, the Universe and Everything, there are some interesting things one could be doing, and more than just asking, as Philip K. Dick once did, do androids dream of electric sheep?
To enable this new beginning, we started in open source, looking to incubate a project or two. The first, in Eclipse M2M, is among a dozen-or-so dots we’d like to connect in the days and months to come; we call it krikkit. The possibilities afforded by this new compute model are endless. One of them could be the ability to put us back in control of our own local and personal data, rather than some central place, service, or bot currently sold as a matter of convenience, fashion, or scale. I hope that with the release of these new projects, we will begin to solve that together. What better way to collaborate than open? Perhaps this is what the Internet of Everything and data in motion should be about.
Tags: ai, Android, artificial intelligence, Big Data, BitCoin, Blade Runner, cloud, Do Androids Dream of Electric Sheep, Fog, Fog computing, H2G2, Internet of Everything, internet of things, IoE, IoT, krikkit, M2M, Moore Law, Nielsen Law, open source, p2p, privacy, security