SAP® HANA™ is a next-generation database platform for real-time analytics and applications. Although the in-memory, columnar relational database debuted only recently (in June 2011), it has quickly become the fastest-growing product in the history of
SAP AG. Already proven enormously successful for analytics, SAP HANA now supports SAP Business Suite, SAP’s flagship enterprise resource planning (ERP) application, and it has been identified as SAP’s focus for innovation. To prepare for this shift, enterprises are considering ways to make SAP HANA “data center-ready.”
Since SAP HANA is the fastest-growing product in SAP history, the data center that runs it needs to be trusted as well. The chart below summarizes SAP’s requirements as they relate to SAP HANA.
The Five Essential Characteristics of an SAP HANA Hardware Platform
For the full story of how SAP HANA users are choosing the Cisco Unified Computing System server platform to build a trusted data center, read this comprehensive white paper.
When Cisco designed the concept of an Application Centric Infrastructure (ACI), we knew it wouldn’t reach its full potential without a comprehensive ecosystem of partners across a number of areas. Perhaps the most impressive aspect of our announcement was the breadth, quality, and scope of the data center infrastructure partners that aligned so quickly with our ACI vision, contributed their perspectives to the launch, and will be contributing key solutions to Cisco’s infrastructure-wide vision.
Yesterday, I blogged about the role that application delivery controllers, network monitoring solutions, WAN optimization, firewalls, and similar services play in setting up application networks and provisioning applications, and about how the ACI policy model incorporates these security and services solutions. I want to follow up that post with highlights of the support we received from ACI partners in this area, who are incorporating ACI policy support into their security, application delivery controller, load balancing, and other solutions.
Every server manufacturer has a server TCO tool, of sorts. Why do I say “of sorts”? Because rather than taking a straightforward approach to TCO, some tools color the input parameters with fixed presets, conditions, and assumptions. Certainly every tool has some assumptions and presets; they just need to be applied equitably across all scenarios and all vendors. If not, you get results that “…would strain credulity…” in the immortal words of Captain Barbossa (Pirates of the Caribbean: At World’s End; my daughters love these movies. OK, me too.)
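A straightforward TCO calculation really is that simple: acquisition cost plus operating cost over the service life, with the same categories and the same assumptions applied to every vendor. The sketch below illustrates the idea; the cost categories and every figure in it are hypothetical placeholders, not numbers from any vendor’s tool.

```python
# Minimal, vendor-neutral server TCO sketch. All categories and numbers
# are hypothetical placeholders; a real model would include many more inputs.

def server_tco(capex, annual_power, annual_admin, annual_support, years):
    """Total cost of ownership: one-time acquisition cost plus
    recurring operating costs over the service life."""
    annual_opex = annual_power + annual_admin + annual_support
    return capex + annual_opex * years

# The key to a fair comparison: identical assumptions (service life,
# cost categories) applied to both scenarios.
vendor_a = server_tco(capex=120_000, annual_power=8_000,
                      annual_admin=15_000, annual_support=6_000, years=5)
vendor_b = server_tco(capex=100_000, annual_power=9_000,
                      annual_admin=20_000, annual_support=7_000, years=5)
print(vendor_a, vendor_b)
```

The trouble starts when a tool silently fixes one of those inputs (say, the service life or the admin cost per server) differently for different vendors.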
Yesterday, November 6, Cisco unveiled details of the Application Centric Infrastructure with an ecosystem of partners that share our common view: IT needs a transformation to create the Application Economy. Several key technology leaders spoke about the application lifecycle impact of an open, centralized policy model for complete infrastructure automation, including configuration, operation, monitoring, and optimization. I’d like to recap a few of those comments here today.
During the ACI announcement, Brad Anderson, Corporate Vice President in Microsoft’s Windows Server and System Center Group (WSSC), said that
virtualization has unshackled applications from the hardware in the past. But now with ACI we can do much more. So first of all, we can have the applications be able to describe their needs for more rapid provisioning. So with the view we can get across physical and virtual, we can see what is happening with the application, we can optimize the infrastructure for the application, and do more rapid troubleshooting.
…the integration with Microsoft cloud OS and UCS is really remarkable. Literally you have a common way to automate everything from the application, down to the operating system, down to all of the hardware level components. But ACI gives us the ability to do some really remarkable things.
Imagine Exchange, SharePoint, and Lync being shipped with ACI policies that describe exactly how the network should be configured and optimized, and that let it be automatically provisioned across physical and virtual in a holistic way. That’s the kind of value we are going to be able to deliver together.
“…These new solutions are designed to improve business agility and reduce cost by driving infrastructure automation in support of core business processes and applications. This next-generation infrastructure will deliver increased application performance, resource pooling, visibility, automation and mobility through:
· Converged ACI stacks that include fully integrated versions of Windows Server 2012 R2 Hyper-V, System Center 2012 R2, SQL Server, Exchange and SharePoint”
I introduced the IT challenge posed by apps that behave differently in my earlier ACI post. Now I want to point out that the new converged ACI stacks will fully integrate the operating system, orchestration, applications, and server and network infrastructure. That integration gives an enterprise customer the application agility to rapidly deploy Exchange, SQL Server, and SharePoint, to scale and upgrade them, and ultimately to decommission them.
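To make the lifecycle idea concrete, here is a purely illustrative sketch of an application profile that carries its declared network policy through deploy, scale, and decommission. The class, method, and policy names are hypothetical and do not correspond to any real ACI or System Center API; the point is only that the policy travels with the application rather than being reconfigured by hand at each step.

```python
# Hypothetical sketch of a policy-driven application lifecycle.
# None of these names correspond to a real ACI or System Center API.

from dataclasses import dataclass


@dataclass
class AppProfile:
    """An application bundled with the network policy it declares."""
    name: str
    network_policy: dict  # e.g. required ports, QoS class (illustrative)
    instances: int = 0

    def deploy(self, count: int) -> None:
        # Conceptually, a controller would push network_policy to the
        # fabric before the instances come up.
        self.instances = count

    def scale(self, count: int) -> None:
        # Scaling reuses the same declared policy; no manual
        # reconfiguration of switches or firewalls.
        self.instances = count

    def decommission(self) -> None:
        # Tearing down the app also retracts its network policy.
        self.instances = 0


exchange = AppProfile("Exchange", {"ports": [25, 443], "qos": "gold"})
exchange.deploy(4)
exchange.scale(8)
exchange.decommission()
print(exchange.instances)  # 0
```

The design choice being modeled is declarative: the application states what it needs once, and every lifecycle operation reapplies or retracts that single statement.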
Many next generation distributed cloud applications are being written on open source platforms. For a view on what ACI means to a leading open source cloud platform, OpenStack, let me quote what Jim Whitehurst, President and CEO of Red Hat, said at the launch:
…there’s a whole set of functionality that is required to run a portfolio of true production applications and be able to run a diverse set of applications and to make sure that you can actually guarantee the performance levels that you need. The great thing about ACI is it provides that really differentiated functionality that enterprises need, even on open platforms, but at the same time, it does it with open standards, open APIs, and an open ecosystem so that customers get the benefit without being locked in and maintain the flexibility they are looking for going forward.
For more on OpenStack and ACI, see the video “Application Policy and OpenStack,” which explains how the DevOps community can extend agile processes to network infrastructure.
This year Cisco held Data Virtualization Day 2013 at the New York Palace in New York City. With 350 attendees from more than 130 organizations, it was the largest event to date, and it showed that data virtualization is top of mind for organizations trying to extract more value from their data.
Data Virtualization -- Different points of view
During the event, customers, analysts, and Cisco executives gathered to share best practices, discuss the trends driving data virtualization, and provide insight into Cisco’s go-forward strategy to expand and accelerate its data virtualization offerings. Some highlights included:
Customers such as Goldman Sachs, BMO, and British Sky Broadcasting shared insiders’ views of their implementations, explaining the significant profitability, agility, and risk management benefits their enterprises have achieved.
Top data virtualization analysts from Forrester and R20 Consultancy discussed the accelerating adoption of data virtualization and the business and technology trends behind it.
Looking ahead, Noel Yuhanna of Forrester described global information fabrics, powered by data virtualization, that integrate enterprise, partner, marketplace, social, and line-of-business information to provide connected data anytime, anywhere. Rick van der Lans of R20 Consultancy discussed how data virtualization, together with powerful networks that allow data to stay where it is collected, will become the dominant data integration method.
Mike Flannagan, General Manager of Cisco’s Integration Brokerage Technology Group, discussed why Cisco chose to enter the data virtualization business, noting that the big data, cloud computing, and “Internet of Everything” eras are making data virtualization a must-have for Cisco’s customers.
Jim Green, General Manager of Cisco’s Data Virtualization Business Unit, presented his vision for the next generation of data virtualization, arguing that achieving massive scale is its next frontier. He also discussed Cisco’s strategy to meet that scale challenge by innovating with a unique mix of data virtualization, networking, and compute assets.
Highlights from these presentations will soon be posted to our Data Virtualization Day resources page and the Cisco data virtualization offering page. So stay tuned.