In the past, conversations with customers showed a consistent alignment between the CIO and the IT staff on priorities and needs. They needed to improve internal operations by deploying a new ERP system. They needed to improve workforce productivity by rolling out wireless access within the office and smartphones to their sales force. Business needs drove technology implementation.
But over the past year, those conversations have been changing. More and more, the CIO is looking to the IT department to drive new innovation for the business. In parallel, CIOs realize that existing IT organizations were built in silos to address previous business demands, and this will need to change if those organizations are expected to have cycles to drive innovation. Quite often, the CIO is asking how their company can begin to offer IT as a Service to the business, drawn by the apparent simplicity of public and consumer Cloud Computing services.
When some of the largest companies in the world get together and demand choice, any vendor interested in delivering solutions had better be building in architectural flexibility. And not just architectural flexibility, but also an open ecosystem that enables market leaders across a broad range of capabilities to participate.
At a time when some vendors are moving towards vertically integrated solutions, limiting customer choice, Cisco continues to be committed to open standards and open architectures within the Data Center. This model has enabled tremendous growth in the Internet, and with more systems being interconnected (Internet, Public/Private Cloud, Mobile) the need for interoperability has never been greater.
Looking back, the market was probably caught a bit off guard when the worldwide leader in networking acquired an automation software company. Looking around, the market may still be wondering: why was the acquisition made? Did all that cash the financial analysts keep talking about finally burn a hole in Cisco’s proverbial head as well as its pockets?
Neither conjecture is true, of course. As usual, Cisco mined the market for the next catalyst (pun fully intended) to transform its infrastructure, starting with the data center. The result was a formula for data center transformation that solves some of the most pressing problems in data center management both today and well into the future. Here’s the formula: take one compute platform highly tuned for on-demand cloud environments, add third-party application delivery, then fuse in solutions that address the Day 2 operations requirement of automating manual tasks. The result is automation of the many repetitive tasks now done by hand, allowing data center administrators to invest the majority of their resources in aligning IT operations with business goals and creating new ways to generate revenue, rather than in just maintaining the infrastructure.
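To make the "Day 2 automation" idea concrete, here is a minimal sketch of the kind of repetitive operational task (a health check with automatic remediation across a fleet of hosts) that an orchestration layer can run unattended instead of an administrator working through a checklist. All function and host names are illustrative assumptions, not any vendor's API.

```python
# Illustrative sketch of automating a repetitive "Day 2" task:
# probe every host, remediate failures, and report what was touched.
# check_service/restart_service are stand-ins for real probes and
# remediation steps (SSH commands, API calls, etc.).

def check_service(host):
    """Pretend health probe; a real check would query the host."""
    # Simulated inventory state for this sketch: one host is unhealthy.
    return host != "app-03"

def restart_service(host):
    """Pretend remediation step; assume the restart succeeds."""
    return True

def nightly_runbook(hosts):
    """Check every host; remediate failures and return the list touched."""
    remediated = []
    for host in hosts:
        if not check_service(host):
            restart_service(host)
            remediated.append(host)
    return remediated

if __name__ == "__main__":
    fixed = nightly_runbook(["app-01", "app-02", "app-03"])
    print(fixed)  # the hosts that needed remediation
```

Run nightly by a scheduler, a runbook like this replaces a manual checklist with a repeatable, auditable procedure, which is the shift in administrator time the formula above is describing.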
Making it Easier to Deploy and Enhance Cloud Infrastructure
Open standards. Open ecosystem. Needs for higher bandwidth. Needs for greater levels of security. Faster application response times. Demand for new levels of flexibility in moving from legacy Data Center architectures to Cloud Computing models. At Cisco, we hear these demands from our customers and partners every day as we deliver solutions to help them drive greater productivity into their business.
But we realize that we can’t deliver innovative solutions to the market alone. Not only do we need to work with partners that create world-class technology, but we need partners that are committed to creating intelligent environments that drive participation from every part of the market.
Today Intel launched the Intel Cloud Builder program, part of its broader Cloud 2015 initiative. This multi-vendor initiative is committed to helping customers as they migrate to various forms of Cloud Computing. Intel Cloud Builder brings together industry leaders to drive new innovations, to educate customers about technology and trends, and to deliver solutions (via Reference Architectures) that can be deployed today.
There is a myth that all data processing occurs in real time. In reality, batch and event-based processing are still very much alive, and the majority of data processing is still done in batch. Our average customer uses Cisco Tidal Enterprise Scheduler to execute ~50K jobs on a daily basis, and we also have large financial services companies automating the execution of over 100K jobs daily. Not bad, right?
So, with over 50 percent of all business processes leveraging batch operations, keeping your batch production running smoothly is essential to keeping your business running smoothly. You cannot afford failed jobs. Failure is not acceptable because it directly affects the business and can cost revenue, such as through the inability to process orders or to generate invoices. What are we getting at here?
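The point about failed jobs can be sketched in a few lines: give each batch job a bounded number of retries, and surface anything that still fails so an operator can intervene before it blocks downstream work (such as invoicing). This is an illustration of the general pattern only, assuming hypothetical job callables; it is not the Tidal Enterprise Scheduler API.

```python
# Illustrative sketch of batch-job failure handling: bounded retries
# per job, and a list of hard failures collected for escalation.

def run_with_retries(job, max_attempts=3):
    """Run a job callable, retrying on failure up to max_attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return job()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted; let the caller escalate

def run_batch(jobs):
    """Execute a night's job list; return the names that hard-failed."""
    failed = []
    for name, job in jobs:
        try:
            run_with_retries(job)
        except Exception:
            failed.append(name)  # e.g. page the on-call operator
    return failed

def _unavailable_feed():
    # Stand-in for a job whose upstream dependency is down.
    raise RuntimeError("order feed unavailable")

if __name__ == "__main__":
    tonight = [
        ("generate_invoices", lambda: "ok"),
        ("process_orders", _unavailable_feed),
    ]
    print(run_batch(tonight))  # jobs that exhausted their retries
```

An enterprise scheduler adds much more on top of this (dependencies, calendars, SLAs, alerting), but the core discipline is the same: no job failure goes unnoticed.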