Please be aware that this product is no longer sold.
The recent release of the new Cisco Intelligent Automation for Cloud Starter Edition is good news for Cisco’s Partners.
Customers will have another way to purchase and implement a Cisco cloud solution. Cisco Intelligent Automation for Cloud is a sophisticated yet easy-to-use cloud solution: customers buy a software license, but typically need a Professional Services engagement to stand up the cloud. Most customers already know that they can buy this solution from Cisco and have Cisco Advanced Services perform the installation, configuration and customization. Now qualified Partners will be able to both sell and stand up cloud solutions as well.
The Cisco IAC Partner Enablement program is what makes this possible for Partners. Qualified Partners will be able to get pre-sales and post-sales training. By pre-sales training, I mean gaining competencies in how to identify and qualify a deal, how to present the value proposition for Cisco Intelligent Automation for Cloud, how to sell it strategically, and how it is deployed.
Post-sales training combines foundational material on cloud dynamics with seven days of hands-on labs with the technology, building competence in the installation, configuration, enhancement, and customization of a Cisco IAC environment.
In order to ensure quality and high customer satisfaction as Cisco IAC Starter Edition is rolled out, two dozen Authorized Technology Partners (ATPs) have been selected worldwide who have already built a cloud practice in their Professional Services organization. They have made investments and commitments to joint sales planning sessions, training classes and mentoring engagements. They have cloud business design and implementation service competencies matched by technical implementation qualifications that enable them to do multi-system integration with advanced enterprise software systems using standard web services and custom APIs. They are familiar with Cisco UCS, are VMware certified, and have done advanced data storage integrations. These consultants, architects and implementation engineers will receive conceptual as well as hands-on experience standing up a Cisco IAC solution.
A Phased Approach:
Starting later this month, the first phase of training will begin for these ATP Partners, with pre-sales and post-sales service delivery training classes. As these ATP Partners complete their training, a second phase of Partners who are motivated to obtain the training will be able to sign up for this enablement.
Where to learn more:
- Visit the Cisco IAC Partner Community. Cisco Partners are participating in the online community around Cisco IAC. With your Cisco Partner credentials, drop by cisco.com/go/iacloudpartner and join in the discussion, read the Q&A, and find other information designed specifically for Partners. The website will grow and develop based on your input.
- See it live. Cisco is doing live demos at InterOp in Las Vegas the week of May 6, at EMC World in Las Vegas the week of May 21, and Cisco Live in San Diego the week of June 11. Stop by the Cisco booth and say hello.
- See a demo of Cisco Intelligent Automation for Cloud Starter Edition online. Visit the website cisco.com/go/starteredition and click on the Video Demonstration. You can also find Data Sheets and Presentations there and learn more about the Cisco Cloud Portal and Cisco Process Orchestrator technologies that make up Cisco Intelligent Automation for Cloud.
- Join the live Cisco webcast on May 15, 2012 at 8 am Pacific Time to ask questions about Cisco Intelligent Automation for Cloud Starter Edition.
You’re connected to the CloudTone.
Tags: CIAC, Cisco Intelligent Automation for Cloud, Cisco UCS, cloud, Cloud Computing, Cloud Management, IaaS, intelligent automation, private cloud, starter edition, Unified Data Center
Industry standard benchmarks have played, and continue to play, a crucial role in the advancement of the computing industry. Demand for them has existed since buyers were first confronted with the choice of purchasing one system over another. Over the years, industry standard benchmarks have proven critical to both buyers and vendors: buyers use benchmark results when evaluating new systems in terms of performance, price/performance and energy efficiency, while vendors use benchmarks to demonstrate the competitiveness of their products and to monitor release-to-release progress of their products under development. Historically, we have seen that industry standard benchmarks enable healthy competition that results in product improvements and the evolution of brand new technologies.
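The buyer-side comparison described above can be made concrete with a small sketch. The systems and numbers below are invented purely for illustration; they are not actual TPC or SPEC publications:

```python
# Hypothetical benchmark results for two systems (invented figures,
# not real published results).
systems = {
    "System A": {"tpm": 1_200_000, "price_usd": 600_000, "watts": 4_000},
    "System B": {"tpm": 1_500_000, "price_usd": 900_000, "watts": 6_000},
}

def evaluate(results):
    """Derive the three comparison metrics buyers typically look at."""
    metrics = {}
    for name, r in results.items():
        metrics[name] = {
            "performance": r["tpm"],                  # raw throughput
            "price_perf": r["price_usd"] / r["tpm"],  # $ per tpm (lower is better)
            "energy_eff": r["tpm"] / r["watts"],      # tpm per watt (higher is better)
        }
    return metrics

m = evaluate(systems)
# With these figures, System B leads on raw throughput while
# System A leads on price/performance and energy efficiency.
best_price_perf = min(m, key=lambda name: m[name]["price_perf"])
```

The point of the sketch is that no single number decides a purchase: the same pair of results can rank differently depending on which metric matters to the buyer.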
Over the past quarter-century, industry standard bodies like the Transaction Processing Performance Council (TPC) and the Standard Performance Evaluation Corporation (SPEC) have developed several industry standards for performance benchmarking, which have been a significant driving force behind the development of faster, less expensive, and/or more energy efficient system configurations.
The world has been in the midst of an extraordinary information explosion over the past decade, punctuated by rapid growth in the use of the Internet and the number of connected devices worldwide. Today, we’re seeing a rate of change faster than at any point throughout history, and both enterprise application data and machine generated data, known as Big Data, continue to grow exponentially, challenging industry experts and researchers to develop new innovative techniques to evaluate and benchmark hardware and software technologies and products.
I am co-chairing a workshop with my distinguished colleagues Chaitanya Baru, Meikel Poess, Milind Bhandarkar, Tilmann Rabl and others entitled Workshop on Big Data Benchmarking (WBDB 2012), supported by the National Science Foundation (NSF.gov). This is a crucial initial step towards the development of an industry standard benchmark for providing objective measures of the effectiveness of hardware and software systems dealing with Big Data. Several industry experts and researchers have been invited to present and debate their vision on benchmarking big data platforms.
A report from this workshop will be presented at the just-announced 4th International Conference on Performance Evaluation and Benchmarking (TPCTC 2012), organized by the TPC, which will be collocated with the 38th International Conference on Very Large Data Bases (VLDB 2012), a premier forum for data management and database researchers, vendors and users. With this conference, we encourage industry experts and researchers to submit ideas and methodologies in performance evaluation, measurement and characterization in areas including, but not limited to: big data, cloud computing, business intelligence, energy and space efficiency, hardware and software innovations, and lessons learned in practice using TPC and other benchmark workloads.
Cisco has been an active member of the TPC since 2010 and the SPEC since 2009.
References:
- R. Nambiar, N. Wakou, P. Thawley, A. Masland, M. Lanken, M. Majdalany, F. Carman: "Shaping the Landscape of Industry Standard Benchmarks: Contributions of the Transaction Processing Performance Council," Springer, 2011
- Workshop on Big Data Benchmarking: http://clds.ucsd.edu/wbdb2012/
- TPC Press Release: http://finance.yahoo.com/news/transaction-processing-performance-council-announces-150000511.html
- TPCTC 2012 Call for Papers: http://www.tpc.org/tpctc2012/
Tags: Big Data, Big Data Benchmarks, Industry Standard, TPCTC 2012, WBDB 2012
So, some closing thoughts on ONS. I know it's a bit late, but hey, when you're out of the office for a few days, things pile up a bit. Overall, I think the ONF folks did a fine job with the event.
As I look back at ONS, I am reminded of one of my favorite IT quotes, courtesy of Bill Gates:
"We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten."
Long-term, I think SDN, or the concepts it represents, will certainly have a hand in shaping how we do networking a decade from now; how we get there and what that destination really looks like is a bit less certain.
First, I think we are early enough in the game that the technology is far from settled:
- Most folks are shipping 1.0 code, either literally or figuratively. I am betting there are unseen technologies in the wings that will help shape things, and I am sure folks will also find interesting ways to repurpose existing technology
- We can pretty much expect some wave of M&A to help shape the vendor and technology landscape
- As I have noted before, there is a lot of dogma about what SDN is right now that is not helpful, but I also believe it will eventually fall by the wayside
Eventually the market will sort this stuff out, and a handful of organizations are in a position to drive their own solutions. But for regular folks, I think there is enough near-term uncertainty to give people pause, both in terms of customer adoption and ecosystem investment.
Tags: Open Networking Summit, OpenFlow, SDN, software defined networking
A few weeks ago I participated in a webinar panel (http://event.on24.com/r.htm?e=404085&s=1&k=639DAC16BAF88F2B7260152679635F00) around Big Data and the return on that data. I was joined by Ivan Chong, EVP at Informatica (http://blogs.informatica.com/perspectives/author/ivan-chong/), and Van Baltz, VP and CIO of Station Casinos (http://www.stationcasinos.com/), and we discussed Van's aggressive project to roll out a real-time big data system. This deployment, using Informatica PowerCenter and the Cisco Tidal Enterprise Scheduler, was an impressive project that drove both a new architecture and critical changes to people, processes and technologies. Ivan used a great analogy about the exhaust from applications. Nowadays, IT shops deploy and run many applications that run the real-time business for their companies. All these applications produce exhaust: the data left behind after the work is done. This data is very valuable. It can tell you a lot about your business, your customers and your execution against the corporate financials.
Cisco’s Tidal Enterprise Scheduler is a master at moving this data around and processing it so that the business IT users have the data they need to run a real-time business.
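The "exhaust" idea above can be sketched in a few lines of Python. This is a toy illustration, not Tidal Enterprise Scheduler's actual API; every application name, field and function here is invented for the example:

```python
import csv
import io

def collect_exhaust(app_logs):
    """Flatten per-application event logs ("exhaust") into one
    analytics-ready list of rows tagged with the source app."""
    rows = []
    for app, events in app_logs.items():
        for event in events:
            rows.append({"app": app, **event})
    return rows

def to_csv(rows):
    """Render the collected rows as CSV for downstream BI tools."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["app", "customer", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Invented sample exhaust from two hypothetical business applications.
exhaust = {
    "billing":  [{"customer": "c1", "amount": 120.0}],
    "ordering": [{"customer": "c2", "amount": 75.5}],
}
table = to_csv(collect_exhaust(exhaust))
```

In a real deployment, a workload automation tool would run a job like this on a schedule, pulling from the live systems and loading the result into a warehouse rather than an in-memory string.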
Tags: intelligent automation, Tidal Enterprise Scheduler, workload automation
Recently I blogged on the rise of UCS and my own perspective on joining Cisco Data Center Services around the launch of Cisco UCS back in March 2009. I then posted a quick poll on the Cisco Data Center Facebook page asking which of a number of options we in Cisco Data Center Services do *not* offer to our customers today. Thanks to all who took the time to answer the poll. So let's look at the summary of our services I presented in my previous blog (diagram below), and discuss what you said via the poll.
Cisco Data Center Services Portfolio Evolution 2008-2012
Tags: architecture, cisco_services, cloud, cloud_computing, data center facilities, desktop virtualization, operations management, UCS