Have you ever wondered how Cisco IT overcomes the challenges of deploying products and technologies in a large-scale, global enterprise – the same challenges that your customers face every day? Or how Cisco IT is transforming into a sharply competitive, services-centric organization? Cisco on Cisco: Inside Cisco IT shares our journey and lessons learned on these and many other fronts.
We’ve just redesigned our website from the ground up to make it easier and faster to find Cisco IT content relevant to you and your customers. Head over to the Cisco on Cisco website to benefit from our IT Success Stories that include case studies, best practices, videos, and interactive content.
The new site focuses on content that YOU are looking for:
- How does Cisco build a highly secure network that connects anyone, anywhere, on any device, at any time? Check out our Borderless Networks page.
- How does Cisco enable collaboration, boosting productivity and enhancing myriad business processes? Visit our Collaboration page.
- Want to know more about the cloud, virtualization, service-oriented infrastructure, and unified computing? Our Data Center page can help.
- How does Cisco IT achieve greater workplace efficiencies and help solve business process problems? Learn more on our Business of IT page.
- Click on our Events page to see when and where you can engage with the Cisco on Cisco team at industry events.
Take a look Inside Cisco IT today at http://www.cisco.com/go/ciscoit.
Tags: aaron chiles, Borderless Networks, business of it, Cisco, Cisco IT, cisco on cisco, collaboration, data center, information technology, IT
They say that data about your data is more important than the data itself. Having the right data in the data warehouse at the right time, or loaded and ready for Hadoop analysis, is critical. I have heard stories where the wrong product was sent to the wrong store because reports, and the decisions based on them, used the wrong data and drew incorrect conclusions about what was selling best. In this modern world of data-driven product placement around the globe, that can be a résumé-impacting decision. In a previous blog about Enterprise Job Scheduling (aka Workload Automation) http://blogs.cisco.com/datacenter/workload-automation-job-scheduling-applications-and-the-move-to-cloud/ I discussed the basic uses of automating and scheduling batch workloads. Business intelligence, data warehousing, and Big Data initiatives all need to aggregate data from different sources and load it into very large data warehouses.
Let’s look into the life of the administrator and operations staff of a workload automation tool. The typical enterprise may have thousands, if not tens of thousands, of job definitions. Those are individual jobs that get run: look for this file in a drop box, FTP data from that location, extract this specific set of data from an Oracle database, connect to that Windows server and launch this process, load this data into a data warehouse using Informatica PowerCenter, run this process chain in SAP BW and take that information to this location. All this occurs to get the right data in the right place at the right time. These jobs are then strung together in sequences that we in the Intelligent Automation Solutions Business Unit at Cisco call Job Groups. These groups can represent automated business processes and may have tens to hundreds of steps. Each job may depend on other jobs completing, or may be waiting for resources to become available, all of which leads to a very complex execution sequence. These job groups run every day; some run multiple times a day, some only at the end of the quarter.
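The idea of jobs with dependencies strung into a group can be sketched as a small dependency graph. This is a hypothetical illustration only, assuming simplified job and group structures; it is not the Tidal Enterprise Scheduler API, and all names here are invented:

```python
# Hypothetical sketch (not the Tidal Scheduler API): jobs with dependencies,
# collected into a job group and ordered so upstream work runs first.
from dataclasses import dataclass, field

@dataclass
class Job:
    name: str                  # e.g. "ftp_sales_extract"
    action: str                # description of the work this job performs
    depends_on: list = field(default_factory=list)  # upstream job names

@dataclass
class JobGroup:
    name: str
    jobs: dict = field(default_factory=dict)

    def add(self, job):
        self.jobs[job.name] = job

    def run_order(self):
        """Return job names in a dependency-respecting (topological) order."""
        order, seen = [], set()
        def visit(name):
            if name in seen:
                return
            seen.add(name)
            for dep in self.jobs[name].depends_on:
                visit(dep)
            order.append(name)
        for name in self.jobs:
            visit(name)
        return order

# A small nightly load: two extracts feeding one warehouse load step.
group = JobGroup("nightly_sales_load")
group.add(Job("ftp_sales_extract", "FTP data from the drop box"))
group.add(Job("oracle_extract", "Pull reference data from Oracle"))
group.add(Job("informatica_load", "Load the warehouse via PowerCenter",
              depends_on=["ftp_sales_extract", "oracle_extract"]))

print(group.run_order())  # the load step always comes after both extracts
```

A real scheduler adds calendars, resource gates, and failure handling on top of this ordering, but the core dependency structure is the same.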
The typical IT operations team has a group of people that design, test, and implement these job groups by working with people in business IT who design and implement business processes. Often these job groups need to finish by a certain time to meet the needs of the business. If you are a stock exchange, some job groups have to finish within a set number of hours after the market closes. If you have to get your data to a downstream business partner (or customer) by a certain time, you become very attached to watching those jobs execute. No pun intended, but your job may be on the line.
A new technology has hit the scene for customers of the Cisco Tidal Enterprise Scheduler: JAWS Historical and Predictive Analytics ( http://www.termalabs.com/products/cisco-tidal-enterprise-scheduler.html ). These modules take all of the historical and real-time performance data from the Scheduler and, through a set of algorithms, produce historical, real-time, predictive, and business analytics. This is the data about the data I mentioned earlier. Our customers can run what-if analyses and get early warning that a particular job group will not be able to finish in time, so administrators can take action before it is too late. This is critical to getting the data in the right place so that analytics can be performed correctly, rather than sending 1,000 units of the wrong product to the wrong store location. Thanks to our partners at Terma Software Labs http://info.termalabs.com/cisco-systems-and-terma-software-labs-to-join-forces-for-more-sla-aware-workload-processing/ .
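The basic idea behind this kind of predictive analytics can be sketched simply: project the current run's finish time from the historical durations of the jobs that have not yet completed, and flag the group if the projection overshoots the deadline. This is a minimal illustration of the concept, not the JAWS product; the job names, durations, and deadline below are invented:

```python
# Hypothetical sketch of predictive SLA monitoring for a job group:
# use historical run times to flag, mid-run, a group that is unlikely
# to finish before its deadline.
from statistics import mean

def predicted_finish(elapsed_minutes, done_jobs, historical_runs):
    """Project total runtime from the remaining jobs' historical averages.

    historical_runs maps job name -> list of past durations in minutes.
    done_jobs is the set of jobs already completed in the current run.
    """
    remaining = [mean(times) for job, times in historical_runs.items()
                 if job not in done_jobs]
    return elapsed_minutes + sum(remaining)

history = {
    "extract":   [30, 35, 32],
    "transform": [60, 55, 65],
    "load":      [45, 50, 40],
}

# Two hours into the run only "extract" has finished; the SLA is 150 minutes.
eta = predicted_finish(elapsed_minutes=120, done_jobs={"extract"},
                       historical_runs=history)
if eta > 150:
    print(f"SLA at risk: projected finish at {eta:.0f} min vs 150 min deadline")
```

The value of the early warning is exactly this gap between projection and deadline: the operator learns about the miss while there is still time to add resources or reroute work.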
Tags: data center, intelligent automation, job scheduling, workload automation
Based in the Cisco Scotland office in Glasgow, Stephen is a distinguished blogger on the Data Center and Cloud team in Cisco Services. Stephen joined Cisco in 2000 via the Atlantech Technologies acquisition and was a Senior Manager in Product Management within Cisco’s Network Management R&D team, where he focused on IP/MPLS service provider network management.
During this time, he brought to market the unique Cisco MPLS Diagnostics Expert product, taking it from (literally) a corridor conversation through definition to launch, and on to multiple industry awards. He has over 20 years of industry experience in IT, data center, and service provider network management, which he shares with the world through his writing. By keeping customers’ new-technology adoption challenges at the forefront of his mind and weaving novelty into his blogging best practices, Stephen has won a loyal readership and established himself as a role model for many other Cisco bloggers.
Stephen’s Customer-Centric Vision
Blogging is no one-way conversation for Stephen. He has the customer in mind at all times and is always conscious of what they care about. Before writing, he interviews customers and partners to better understand their viewpoints and present a more well-rounded perspective.
Tags: best practices, blogger, blogging, Cisco Services, cloud, customer focus, data center, innovative practices, lessons learned, meet our SMEs, SME, social media, subject matter expert, tips
IT shops deploying clouds over the past year have focused on Infrastructure as a Service ( http://en.wikipedia.org/wiki/Infrastructure_as_a_service#Infrastructure ) as a way to speed up virtual and physical server provisioning, cut operating costs, support proactive service level agreements, and increase control and governance. In an earlier blog I introduced Cisco Intelligent Automation for Cloud http://blogs.cisco.com/datacenter/the-secret-is-now-out-you-can-simplify-cloud-deployments-with-cisco-unified-management/ and how it addresses IaaS across private, hybrid, and public clouds. Key to this are the service catalog and self-service portal. Moving to cloud is NOT about taking hundreds of server configuration templates and making them immediately self-service; in that model, all you are doing is automating VM sprawl. The key is defining a limited set of services and options that your end users, such as application owners and technical staff, can order through a self-service portal and manage across their life cycle.
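The "limited set of services" idea can be sketched as a catalog that only accepts orders matching a few approved offerings. This is a hypothetical illustration, not the Cisco Intelligent Automation for Cloud product; the size names, specs, and function below are invented:

```python
# Hypothetical sketch: a small service catalog that constrains self-service
# orders to a few approved sizes, rather than exposing hundreds of raw
# server configuration templates.
APPROVED_SIZES = {
    "small":  {"vcpus": 1, "ram_gb": 4,  "disk_gb": 50},
    "medium": {"vcpus": 2, "ram_gb": 8,  "disk_gb": 100},
    "large":  {"vcpus": 4, "ram_gb": 16, "disk_gb": 200},
}

def order_server(size, owner, lease_days=30):
    """Validate a self-service request against the catalog."""
    if size not in APPROVED_SIZES:
        raise ValueError(f"'{size}' is not in the catalog; "
                         f"choose one of {sorted(APPROVED_SIZES)}")
    # A lease forces an end-of-life decision, which is what prevents VM sprawl.
    spec = dict(APPROVED_SIZES[size], owner=owner, lease_days=lease_days)
    return spec  # would be handed to the orchestration layer for provisioning

req = order_server("medium", owner="app-team-7")
print(req["vcpus"], req["ram_gb"])  # 2 8
```

Constraining the menu this way is a design choice: end users lose some flexibility, but every running VM maps to a known, governable offering with an owner and a lease.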
Tags: automated provisioning, Cisco CloudVerse, Cisco Intelligent Automation for Cloud, cloud, data center, data center provisioning, IaaS, intelligent automation, orchestration, private cloud, Public Cloud, self-service
For this week’s Data Center Deconstructed we’re setting the Wayback machine to 1998, when Cisco opened a new engineering Data Center at its headquarters in San Jose, California.
Tags: Cisco, coc-data-center, data center, datacenterdeconstructed, design, legacy, WABAC, wayback machine