I was talking to a few analysts this week about workload automation, and some interesting themes came up. Workload Automation (aka Job Scheduling) started with IBM mainframes and Job Control Language. Over the years, the automated execution of business processes (moving batch data on real-time schedules) became a major workload in the data center, and it was extended from the mainframe to distributed computing on Unix and then Windows. Along with this came all the Enterprise Resource Planning and Business Intelligence packaged applications that made SAP and Oracle famous. Moving the data around was absolutely mission critical. The huge demand on data center resources drove the need to control the resource states of the target compute engines so they would be ready for the high demands of processing millions of transactions or running critical reports for the enterprise.
Now here is the rub…
The 40-to-60-year-old set (of which I am a member) knows all about this “in the background” processing and its importance. The challenge now, with all the new Web applications being created and a new generation of IT professionals, is that this critical part of the IT ecosystem is being forgotten, downplayed and generally not paid attention to until there is a big outage and enterprise controls and automation are put in place. There are so many places where workload automation can be applied to help automate key processes. Many of our customers of this product line have increased the size of their operations 2-3x over the past 3 years. All this cool new technology and all the new products being sold are putting increased demands on workload automation. Learn more about this cost-saving automation technology; your CIO will thank you for it.
Tags: data center, intelligent automation, Oracle, SAP, Tidal Enterprise Scheduler, workload automation
The Intelligent Automation Solutions Business Unit hosts user groups for our Workload Automation software customers. Our Tidal Enterprise Scheduler is used by many enterprises to manage the execution of business processes and to move data around the data center. We recently met many customers during our user groups in Chicago, Boston and New York City, and we see some very interesting differences between these cities in how our customers use our product. For example, in our Chicago user group during the winter we had some key large customer implementations as well as many customers who were deploying job scheduling at the department level and wanting to drive usage throughout their enterprise. It is very common to start using Workload Automation in one key area and then expand into other areas as the success multiplies. It was good to see old friends who have used our scheduler for almost a decade as well as new users learning how to use our software to accomplish cool new technical use cases.
Read More »
Tags: analytics, application, enterprise job scheduling, intelligent automation, job scheduling, Tidal Enterprise Scheduler, workload automation
A few weeks ago I participated in a webinar panel, http://event.on24.com/r.htm?e=404085&s=1&k=639DAC16BAF88F2B7260152679635F00, on Big Data and the return on that data. I was joined by Ivan Chong, EVP at Informatica, http://blogs.informatica.com/perspectives/author/ivan-chong/, and Van Baltz, VP and CIO of Station Casinos, http://www.stationcasinos.com/, where we discussed Van’s aggressive project to roll out a real-time big data system. This deployment, using Informatica PowerCenter and the Cisco Tidal Enterprise Scheduler, was an impressive project that drove both a new architecture and critical changes to the people, processes and technologies. Ivan used a great analogy about the exhaust from applications. Nowadays, IT shops deploy and run many applications that run the real-time business for their companies. All these applications produce exhaust, namely the data left after all that work is done. This data is very valuable: it can tell you a lot about your business, your customers and your execution against the corporate financials.
Cisco’s Tidal Enterprise Scheduler is a master at moving this data around and processing it so that the business IT users have the data they need to run a real-time business.
Read More »
Tags: intelligent automation, Tidal Enterprise Scheduler, workload automation
Workload Automation is the set of triggered or scheduled activities that drive batch and real-time business process automation, moving data across an ecosystem of applications and compute environments. It is a mission-critical “back office” capability for processing business data. I introduced Workload Automation in a previous blog, http://blogs.cisco.com/datacenter/workload-automation-job-scheduling-applications-and-the-move-to-cloud/. We have customers who use the Cisco Tidal Enterprise Scheduler to perform the most critical activities in their enterprise. Many health care providers, insurance companies, manufacturing giants, and financial services outfits rely on these technologies to drive their business.
We recently made generally available the release of the Cisco Tidal Enterprise Scheduler that will be our workhorse for the next few years.
Read More »
Tags: enterprise job scheduling, intelligent automation, workload automation
They say that data about your data is more important than the data itself. Having the right data in the data warehouse at the right time, or loaded up for Hadoop analysis, is critical. I have heard stories where the wrong product was sent to the wrong store because of incorrect conclusions about what was selling best, with reports and decisions being made on the wrong data. That can be a resume-impacting decision in this modern world of data-driven product placement around the globe. In a previous blog about Enterprise Job Scheduling (aka Workload Automation), http://blogs.cisco.com/datacenter/workload-automation-job-scheduling-applications-and-the-move-to-cloud/, I discussed the basic uses of automating and scheduling batch workloads. Business intelligence, data warehousing and Big Data initiatives need to aggregate data from different sources and load it into very large data warehouses.
Let’s look into the life of the administrators and operators of a workload automation tool. The typical enterprise may have thousands, if not tens of thousands, of job definitions. Those are the individual jobs that get run: look for this file in a drop box, FTP data from that location, extract this specific set of data from an Oracle database, connect to that Windows server and launch this process, load this data into a data warehouse using Informatica PowerCenter, run this process chain in SAP BW and take that information to this location. All this occurs to get the right data in the right place at the right time. These jobs are then strung together in sequences that we in the Intelligent Automation Solutions Business Unit at Cisco call Job Groups. These groups can represent business processes that are automated. They may have tens to hundreds of steps. Each job may depend on other jobs for completion, and jobs may be waiting for resources to become available. This all leads to a very complex execution sequence. These job groups run every day; some run multiple times a day, some only run at the end of the quarter.
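A job group with dependencies is essentially a directed acyclic graph, and the scheduler has to find a run order that honors every dependency. Here is a minimal sketch in Python of that idea; the job names and structure are hypothetical illustrations of the examples above, not the Tidal Enterprise Scheduler’s actual API or data model:

```python
from graphlib import TopologicalSorter

# Hypothetical job group: each job maps to the set of jobs it depends on.
# A real scheduler attaches far richer metadata (calendars, agents,
# resource requirements, rerun rules) to each job definition.
job_group = {
    "watch_dropbox":  set(),                               # look for a file in a drop box
    "ftp_transfer":   {"watch_dropbox"},                   # FTP the data once it arrives
    "extract_oracle": set(),                               # pull a data set from Oracle
    "load_warehouse": {"ftp_transfer", "extract_oracle"},  # load the warehouse via ETL
    "run_sap_chain":  {"load_warehouse"},                  # run the SAP BW process chain
}

def execution_order(group):
    """Return one valid run order that respects every dependency."""
    return list(TopologicalSorter(group).static_order())

order = execution_order(job_group)
print(order)
```

In practice the scheduler also runs independent branches (here, the drop-box watch and the Oracle extract) in parallel rather than in a single sequence, which is part of what makes the execution so complex to track by hand.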
The typical IT operations team has a group of people who design, test and implement these job groups by working with the people in business IT who design and implement the business processes. Oftentimes these job groups need to finish by a certain time to meet the needs of the business. If you are a stock exchange, some job groups have to finish within so many hours after the market closes. If you have to get your data to a downstream business partner (or customer) by a certain time, you become very attached to watching those jobs execute. No pun intended, but your job may be on the line.
A new technology has hit the scene for customers of the Cisco Tidal Enterprise Scheduler. It is called JAWS Historical and Predictive Analytics, http://www.termalabs.com/products/cisco-tidal-enterprise-scheduler.html. These modules take all the historical and real-time performance data from the Scheduler and, through a set of algorithms, produce historical, real-time, predictive, and business analytics. This is the data about the data I mentioned previously. Our customers can run what-if analyses as well as get early indication that a particular job group will not be able to finish in time, so administrators can take action before it is too late. This is critical in getting the data to the right place so that analytics can be performed correctly, and you avoid sending 1,000 units of the wrong product to the wrong store location. Thanks to our partners at Terma Software Labs, http://info.termalabs.com/cisco-systems-and-terma-software-labs-to-join-forces-for-more-sla-aware-workload-processing/.
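To make the prediction idea concrete, here is a toy sketch of the general technique: project a running job group’s remaining time from its historical run durations and flag it if the projection overruns the SLA deadline. This is an assumption-laden illustration of the concept only, not the JAWS algorithm; the function, its parameters and the sample numbers are all hypothetical:

```python
import statistics

def at_risk(history_minutes, started_minutes_ago, deadline_in_minutes, k=2.0):
    """Flag a running job group as at risk of missing its SLA.

    Projects remaining runtime as (mean + k standard deviations of past
    run durations) minus elapsed time, then compares it to the deadline.
    Illustrative only; real predictive analytics work per-job, not per-group.
    """
    mean = statistics.mean(history_minutes)
    stdev = statistics.stdev(history_minutes)
    projected_remaining = (mean + k * stdev) - started_minutes_ago
    return projected_remaining > deadline_in_minutes

# Past nightly runs took 112-130 minutes; we are 60 minutes in,
# and the data must be in place 45 minutes from now.
history = [112, 118, 125, 130, 121]
print(at_risk(history, started_minutes_ago=60, deadline_in_minutes=45))  # True: raise an alert
```

The value of this kind of early warning is exactly what the paragraph above describes: the operator finds out mid-run, while there is still time to add resources or reroute work, rather than after the deadline has already been missed.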
Tags: data center, intelligent automation, job scheduling, workload automation