Guest Post by Alex Jauch (Technical Architect, NetApp)
This week at Microsoft TechEd, NetApp and Cisco announced our joint Microsoft Hyper-V Cloud Fast Track submission. The submission is the result of a strong working partnership, and we're very pleased with the solutions and technical expertise that the Cisco team brought to the project.
Here's one example: because both NetApp and Cisco have been investing in Microsoft management integration, we were able to build those capabilities into our Fast Track submission. Both companies have worked closely with the Microsoft Opalis Integration Server team, combining our offerings into a single solution that enables rapid provisioning from a bare-metal blade to a fully running Windows cluster, followed by VM provisioning against that cluster. You can see a demonstration of this solution here:
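To give a feel for the three-stage flow the demo walks through, here is a minimal, hypothetical Python sketch of the same pipeline shape: bare-metal blade provisioning, cluster build-out, then VM provisioning. Every function and name here is illustrative only; the actual solution drives Cisco UCS and NetApp APIs through Opalis Integration Server runbooks, not this code.

```python
# Illustrative sketch of the provisioning pipeline shown in the demo.
# All stage names and functions are hypothetical; the real solution uses
# Opalis Integration Server runbooks against Cisco UCS and NetApp APIs.

from typing import Callable, List


def provision_blade(ctx: dict) -> None:
    # Stand up a bare-metal UCS blade (e.g., PXE boot, apply service profile).
    ctx["blade"] = "ucs-blade-01"


def build_cluster(ctx: dict) -> None:
    # Install Windows Server 2008 R2 and join the node to a failover cluster.
    ctx["cluster"] = f"hv-cluster({ctx['blade']})"


def provision_vm(ctx: dict) -> None:
    # Create and start a Hyper-V VM on the newly built cluster.
    ctx["vm"] = f"vm-on-{ctx['cluster']}"


def run_pipeline(stages: List[Callable[[dict], None]]) -> dict:
    """Run each stage in order with shared context, runbook-style."""
    ctx: dict = {}
    for stage in stages:
        stage(ctx)
        print(f"completed: {stage.__name__} -> {ctx}")
    return ctx


if __name__ == "__main__":
    run_pipeline([provision_blade, build_cluster, provision_vm])
```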
In a competitive market, differentiation makes all the difference. Whether you're selling sneakers or servers, offering the widest selection of products to meet a broad set of needs is critical to driving growth and profits.
That's why today's announcement of three new reference configurations focused on Microsoft applications and technologies is welcome news: our partners now have even more opportunities to sell a broader set of solutions, giving customers more choice.
Today, along with our storage partners, Cisco is greatly expanding our channel partners' ability to offer customers Microsoft-based private cloud, data warehouse, and OLTP configurations built on shipping Cisco UCS server and Nexus networking products. Three standalone reference architectures are now available: a Cisco-developed SQL Server 2008 R2 data warehouse solution; a Cisco-developed SQL Server 2008 R2 online transaction processing (OLTP) solution; and the NetApp for Private Cloud offer, part of Microsoft's Hyper-V Private Cloud program, with Cisco as the server partner.
This guest post is presented by Rex Backman, Global Marketing Manager for Microsoft Solutions, Cisco.
Cisco is very pleased to announce that Microsoft has invited us to be part of their Hyper-V Cloud Fast Track program. This is great news for our customers and partners, as it enables Cisco UCS-based server infrastructures for Windows Server 2008 R2-based private cloud deployments.
Like my colleague Tony Paikeday, I am somewhat preoccupied these days with the fast-changing world of the desktop and its impact on data center infrastructures. I wanted to pick up on Tony's desktop virtualization "just another workload" blog from back in November, because it is a subject of growing discussion, especially with "cloud" being all the buzz. While desktops are an increasingly popular workload for getting started with private cloud initiatives, does that mean data center architects are mixing desktops with more traditional data center workloads?
Talking to our system engineers, who help plan and design desktop virtualization deployments day in and day out, I keep learning that there are very good reasons for treating this workload as special and separate.
The first thing I hear about is sizing of the desktop workload. A "desktop" is not a "desktop": you can't just characterize a generic Win 7 desktop for compute, memory, I/O, and storage IOPS. You need to customize the infrastructure profile to the specific user type being deployed. Therein lies the danger of mixing virtual desktops with production workloads, where desktops could end up consuming valuable resources needed by mission-critical services. For example, consolidating a company procurement application on the same compute pool as your desktop workloads could result in a lot of unproductive, or even worse, unhappy employees.
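To make the sizing point concrete, here is a small Python sketch that aggregates per-desktop resource profiles by user type. All of the profile names and numbers are illustrative assumptions, not measured or vendor-published figures; the point is simply that a task worker and a power user can differ by several times in IOPS and memory, so a single generic desktop profile will under- or over-provision the pool.

```python
# Illustrative desktop-sizing arithmetic. Profile numbers are assumptions
# for demonstration only, not measured or vendor-published figures.

# Per-desktop resource profile by user type: vCPU, RAM (GB), steady-state IOPS.
PROFILES = {
    "task_worker": {"vcpu": 1, "ram_gb": 1.5, "iops": 5},
    "knowledge_worker": {"vcpu": 2, "ram_gb": 2.0, "iops": 12},
    "power_user": {"vcpu": 2, "ram_gb": 4.0, "iops": 25},
}


def aggregate(user_counts: dict) -> dict:
    """Sum resource demand across the whole user population."""
    totals = {"vcpu": 0.0, "ram_gb": 0.0, "iops": 0.0}
    for user_type, count in user_counts.items():
        profile = PROFILES[user_type]
        for key in totals:
            totals[key] += profile[key] * count
    return totals


if __name__ == "__main__":
    # A hypothetical 1,000-seat deployment mix.
    demand = aggregate(
        {"task_worker": 600, "knowledge_worker": 300, "power_user": 100}
    )
    print(demand)  # {'vcpu': 1400.0, 'ram_gb': 1900.0, 'iops': 9100.0}
```

Shifting even a tenth of the mix from task workers to power users in this toy model adds thousands of IOPS of demand, which is exactly the kind of swing that can starve a production application sharing the same pool.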
What is "in-memory"? In-memory computing takes data warehousing and business intelligence to a different level. CIOs want information at their fingertips. To get it, they engage in data modeling and "what if" scenarios, the answers to which give them a competitive edge in business. The biggest concern to date has been that this data modeling and these "what if" scenarios usually take days to process. SAP HANA's in-memory technology lets CIOs get answers to these complex questions in microseconds instead of the typical wait of days.
Who are the only server platform vendors certified to sell SAP HANA?
What are the benefits to users of SAP HANA?
Processes all transactions in memory instead of through disk I/O
Processes millions of lines of data in microseconds
Analytic processing runs outside normal operational data processing, so it doesn't burden production systems
Reduced hardware and maintenance costs, since SAP HANA is self-contained in a single appliance
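As a toy illustration of the first benefit above, the following Python sketch times the same aggregation over rows held in memory versus rows re-read from disk on each query. This is not SAP HANA, and the timings are machine-dependent; it only demonstrates the principle that removing disk I/O from the query path collapses response time.

```python
# Toy comparison of in-memory vs. disk-backed aggregation.
# An illustration of the principle only; this is not SAP HANA.

import csv
import os
import random
import tempfile
import time

# Build a sample dataset of (region, sales) rows.
rows = [(random.choice("NSEW"), random.random()) for _ in range(1_000_000)]

# Write it to disk to simulate a disk-resident table.
path = os.path.join(tempfile.gettempdir(), "sales.csv")
with open(path, "w", newline="") as f:
    csv.writer(f).writerows(rows)


def total_from_disk() -> float:
    # Re-read every row from disk for each query.
    with open(path, newline="") as f:
        return sum(float(sales) for _, sales in csv.reader(f))


def total_in_memory() -> float:
    # Rows are already resident in RAM; no I/O on the query path.
    return sum(sales for _, sales in rows)


for fn in (total_from_disk, total_in_memory):
    start = time.perf_counter()
    fn()
    print(f"{fn.__name__}: {time.perf_counter() - start:.3f}s")
```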
So is SAP HANA's in-memory technology disruptive? Absolutely. Watch Rajiv Thomas's video, "Cisco and SAP HANA," to learn more about HANA.