They say that the data about your data is more important than the data itself. Having the right data in the data warehouse at the right time, or loaded for Hadoop analysis, is critical. I have heard stories where the wrong product was sent to the wrong store because reports, and the decisions based on them, were built on the wrong data. In this modern world of data-driven product placement around the globe, that can be a resume-impacting mistake. In a previous blog post about Enterprise Job Scheduling (aka Workload Automation) http://blogs.cisco.com/datacenter/workload-automation-job-scheduling-applications-and-the-move-to-cloud/ I discussed the basics of automating and scheduling batch workloads. Business intelligence, data warehousing, and Big Data initiatives all need to aggregate data from different sources and load it into very large data warehouses.
Let’s look into the life of the administrator and operations team behind a workload automation tool. The typical enterprise may have thousands, if not tens of thousands, of job definitions. Those are the individual jobs that get run: look for this file in a drop box, FTP data from that location, extract this specific set of data from an Oracle database, connect to that Windows server and launch this process, load this data into a data warehouse using Informatica PowerCenter, run this process chain in SAP BW and move the results to that location. All of this occurs to get the right data in the right place at the right time. These jobs are then strung together in sequences we in the Intelligent Automation Solutions Business Unit at Cisco call Job Groups. These groups can represent automated business processes, and they may have tens to hundreds of steps. Each job may depend on other jobs completing, or may be waiting for resources to become available. This all leads to a very complex execution sequence. These job groups run every day; some run multiple times a day, some only at the end of the quarter.
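To make that dependency picture concrete, here is a minimal sketch of a job group modeled as a dependency graph. This is not an actual Tidal Enterprise Scheduler definition; the job names and the Python modeling are illustrative assumptions, using the standard-library graphlib to compute one valid execution order:

```python
from graphlib import TopologicalSorter

# Hypothetical job group: each job maps to the set of jobs it depends on.
job_group = {
    "ftp_pull":     set(),                       # fetch raw files from a drop box
    "extract_db":   set(),                       # pull rows from an Oracle database
    "load_dw":      {"ftp_pull", "extract_db"},  # load the data warehouse
    "run_bw_chain": {"load_dw"},                 # run the SAP BW process chain
    "publish":      {"run_bw_chain"},            # deliver results downstream
}

# Compute an execution order that honors every dependency.
order = list(TopologicalSorter(job_group).static_order())
print(order)
```

A real scheduler adds time windows, resource waits, and retries on top of this ordering, but the dependency graph is the core of what makes a job group's execution sequence so complex.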
The typical IT operations team has a group of people who design, test, and implement these job groups by working with the business IT people who design and implement business processes. Often these job groups need to finish by a certain time to meet the needs of the business. If you are a stock exchange, some job groups have to finish within a set number of hours after the market closes. If you have to get your data to a downstream business partner (or customer) by a certain time, you become very attached to watching those jobs execute. No pun intended: your job may be on the line.
A new technology has hit the scene for customers of the Cisco Tidal Enterprise Scheduler: JAWS Historical and Predictive Analytics. http://www.termalabs.com/products/cisco-tidal-enterprise-scheduler.html . These modules take all of the historical and real-time performance data from the Scheduler and, through a set of algorithms, produce historical, real-time, predictive, and business analytics. This is the data about the data I mentioned previously. Our customers can run what-if analyses and get early indication that a particular job group will not be able to finish in time, so administrators can take action before it is too late. This is critical to getting the data in the right place so that analytics can be performed correctly, and so that 1,000 units of the wrong product are not sent to the wrong store location. Thanks to our partners at Terma Software Labs http://info.termalabs.com/cisco-systems-and-terma-software-labs-to-join-forces-for-more-sla-aware-workload-processing/ .
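The underlying idea of predictive SLA alerting can be sketched simply. Assuming historical runtimes are kept per job (the job names, numbers, and the 90th-percentile estimate below are hypothetical illustrations, not how JAWS itself computes), a pessimistic estimate of the remaining work can flag an at-risk job group well before its deadline passes:

```python
from datetime import datetime, timedelta
from statistics import quantiles

# Hypothetical historical runtimes (in minutes) for the jobs still to run.
history = {
    "load_dw":      [42, 47, 44, 51, 45, 49],
    "run_bw_chain": [30, 28, 35, 31, 29, 33],
}

def p90(samples):
    # 90th percentile of observed runtimes: a pessimistic per-job estimate.
    return quantiles(samples, n=10)[-1]

def predicted_finish(now, remaining):
    # Sum pessimistic estimates for all remaining jobs in the group.
    minutes = sum(p90(history[job]) for job in remaining)
    return now + timedelta(minutes=minutes)

now = datetime(2012, 3, 1, 2, 0)        # job group currently running at 2:00 AM
deadline = datetime(2012, 3, 1, 3, 0)   # data must land by 3:00 AM
eta = predicted_finish(now, ["load_dw", "run_bw_chain"])
if eta > deadline:
    print("alert: job group at risk of missing its SLA")
```

The point of this kind of analysis is the early warning: operations can add capacity, reprioritize, or notify downstream consumers while there is still time to act.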
Tags: data center, intelligent automation, job scheduling, workload automation
In an earlier part of my career I learned the extreme importance of Workload Automation, aka Job Scheduling. Workload automation is the oldest IT technology on the planet, born from the need to schedule jobs on an IBM mainframe. Job scheduling has evolved from driving JCL (Job Control Language) to workload automation, where the scheduler stitches together batch and real-time activities across mainframes, proprietary OS systems, x86 systems, applications (both in-house packages and commercial off-the-shelf software such as SAP, Oracle, or Informatica), and now web-service-enabled applications, whether on site or in the cloud. Walk into the operations center of any data-driven company and you will see multiple screens where operators are monitoring the state of these jobs. Why are they so critical? Over 50% of all transactions that occur on this planet are batch in nature. They are scheduled based on specific times or on dependencies being met. These workloads can be a complex and interrelated set of activities. Effectively, these job streams are the business processes that drive modern enterprises.
Without these jobs, companies don’t get information (and large amounts of it) to the right place at the right time. Most companies today could not close out their financial quarters without enterprise schedulers to move data from their disparate systems into a consolidated place, whether for the general ledger to close or for a critical Business Intelligence report to run and drive placement of the correct product into the specific physical location that serves the global economy. Workload automation tools open and close stock exchanges and process all the transaction data from trades. They also drive compliance checks. This is important stuff for the global economy! That was my realization while touring key operations centers: half of the big monitors were tracking the movement of batch data in the enterprise.
Read More »
Tags: cloud, cloud_computing, datacenter, intelligent automation, scheduling, virtualization, workload automation
Next week is Cloud Connect in Santa Clara and Cisco’s Cloud Software group will have a big presence.
While we have plenty to talk about on how Cisco is helping customers build their clouds, we also want to listen to our customers’ plans and needs. We are bringing some of our engineers and architects so you can engage directly with them. Here are three things you can see next week.
CITEIS – Cisco’s in-production private cloud.
See how it was built, the results in agility and cost, and, best of all, see a demo: not a fake demo, but the real thing.
Of course, we will also be showcasing our award-winning cloud automation software, Cisco Intelligent Automation for Cloud (CIAC) (formerly newScale and Tidal), which provides the self-service catalog and orchestration for our private cloud.
Read More »
Tags: Big Data, CIAC, cloud portal, Hadoop, intelligent automation, newScale, OpenStack, Tidal, Tidal Enterprise Scheduler, unified management, workload automation
No doubt data is one of an organization’s most important assets. The trick is to turn it into timely and trusted information—information that can be used to rapidly uncover new markets, attract and retain customers, reduce operating costs, shrink time to market, and make smarter strategic decisions. In short, leveraging data can sharpen a company’s ability to navigate markets.
So when we combine Informatica’s world-class data integration platform with Cisco® Tidal Enterprise Scheduler, we are enabling organizations to gain a competitive advantage in today’s global information economy by empowering them with relevant and trustworthy information to support all their business decisions.
Read More »
Tags: business intelligence, enterprise scheduler, informatica, process automation, Tidal, workload automation
Looking back, the market was probably caught a bit off guard by the acquisition of an automation software company by the worldwide leader in networking. Looking around, the market may still be wondering: why was the acquisition made? Did all that cash the financial analysts keep talking about finally burn a hole in Cisco’s proverbial head as well as its pockets?
Neither conjecture is true, of course. As usual, Cisco mined the market for the next catalyst (pun fully intended) to transform its infrastructure, starting with the data center. The result was a formula for data center transformation that solves some of the most pressing problems in data center management, both today and well into the future. Here’s the formula: take one compute platform highly tuned for on-demand cloud environments, add third-party application delivery, then fuse in solutions that support Day 2 operations by automating manual tasks. The result automates the many repetitive tasks that are now done by hand, allowing data center administrators to invest the majority of their resources in aligning IT operations with business goals and creating new ways to generate revenue, rather than just maintaining the infrastructure. Read More »
Tags: enterprise orchestrator, enterprise scheduler, intelligent automation, Tidal, workload automation