Our Common Platform Architecture (CPA) for Big Data has been gaining momentum as a viable platform for enterprise big data deployments. The newest addition to the portfolio is EMC’s Pivotal HD™, which natively integrates Greenplum MPP database technology with Apache Hadoop, enabling SQL applications and traditional business intelligence tools to run directly on the Hadoop framework. Announcing extended support for Pivotal HD on Cisco UCS, Satinder Sethi, Vice President of Cisco’s Data Center Group, said: “Hadoop is becoming a critical part of the enterprise data management portfolio that must coexist with and complement enterprise applications. EMC’s Pivotal HD is an important step in that direction, enabling native SQL processing for Hadoop.”
Built on our 3+ years of partnership spanning the Greenplum database distribution and Hadoop distributions, the joint solution offers all the architectural benefits of the CPA, including:
- Unified Fabric: a fully redundant, active-active fabric for server clustering
- Fabric Extender technology: highly scalable, cost-effective connectivity
- Unified Management: holistic, single-pane-of-glass management of the infrastructure through UCS Manager
- High performance: a high-speed fabric paired with Cisco UCS C240 M3 Rack Servers powered by Intel® Xeon® E5-2600 series processors
Unique to this solution are the management and data integration capabilities between Pivotal HD-based Big Data applications running on the CPA and enterprise applications running on Cisco UCS B-Series Blade Servers connected to EMC enterprise SAN storage, or running on integrated solutions such as Vblock.
The Cisco solution for Pivotal HD is offered as a reference architecture and as Cisco UCS SmartPlay solution bundles that can be purchased by ordering a single part number: UCS-EZ-BD-HC, a rack-level solution optimized for low cost per terabyte, and UCS-EZ-BD-HP, a rack-level solution offering a balance of compute power and I/O bandwidth, optimized for price/performance.
For more information, see Cisco Big Data SmartPlay Solution Bundles and Common Platform Architecture (CPA) for Big Data.
Cisco UCS Common Platform Architecture Version 2 (CPA v2) for Big Data with Pivotal HD and HAWQ
Tags: Big Data, Cisco UCS CPA, CPA, Greenplum, Hadoop, Pivotal, Pivotal HD
Mobile carriers face no shortage of pain points as new data streams create unprecedented and staggering amounts of information. But it is important to remember that pain points often arrive in tandem with new opportunities.
From my perspective, observing the driving forces shaping the mobile industry, five key trends stand out. All are laced with challenges and opportunities. And each represents a core element in an interconnected system that is pushing the entire marketplace forward, while demanding innovative breakthroughs in monetizing and optimizing data.
On February 25-28, I will be attending Mobile World Congress 2013 in Barcelona. This year’s event is expected to be the largest ever, with 1,500 exhibitors. I expect these five trends will be major sources of discussion:
- Video. We are already seeing the true inflection point in video where it becomes mainstream on multiple devices. The mobile and nomadic consumption of video—whether served by mobile carriers or localized Wi-Fi—is popular, commonplace, and growing rapidly. But video will completely reshape the demand side of the industry, creating enormous amounts of data. It threatens to load and clog networks, and it will demand new models for monetization.
- Accelerating connections. As the Read More »
Tags: Big Data, Cisco, Connected Life, data, data in motion, IBSG, Internet of Everything, IoE, mobile carriers, mobile devices, mobile world congress, multiscreen, offloading, Personalization, service providers, targeted ads, video, wi-fi
When customers look to deploy their Hadoop solutions, one of the first questions they ask is: which distro should we run it on? For many enterprise customers, the answer has been MapR. For those of you not familiar with MapR, they offer an enterprise-grade Hadoop software solution that provides customers with a robust set of tools for running Big Data workloads. A few months ago, Cisco announced the release of Tidal Enterprise Scheduler (TES) 6.1 and, with it, integrations for Hadoop software distributions such as Cloudera and MapR, as well as adapters to support Sqoop, Data Mover (HDFS), Hive, and MapReduce jobs. All of these jobs are managed through the same TES interface as other enterprise workloads.
Today, I’m pleased to announce that with the upcoming 6.1.1 release of Cisco’s Tidal Enterprise Scheduler, Cisco’s MapR integration will deepen further. Leveraging Big Data for competitive advantage and the rise of innovative product offerings are changing how enterprises store, manage, and analyze their most critical asset: data. Hadoop clusters will only grow harder to manage, and enterprises need tools that fold Hadoop processing into their existing operations. Cisco Tidal Enterprise Scheduler enables more efficient management of those environments by intelligently integrating Big Data jobs into an existing data center infrastructure. TES has adapters for a range of enterprise applications, including SAP, Informatica, Oracle, PeopleSoft, MSSQL, JD Edwards, and many others.
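To illustrate the core idea behind this kind of workload scheduling (a minimal sketch in standard-library Python, not TES’s actual implementation; the job names below are hypothetical): an enterprise scheduler tracks dependencies between jobs and runs them in an order where every job’s prerequisites finish first.

```python
from graphlib import TopologicalSorter

# Hypothetical job graph: a Sqoop import must finish before the Hive
# aggregation, which in turn feeds a MapReduce report job.
jobs = {
    "sqoop_import": set(),
    "hive_aggregate": {"sqoop_import"},
    "mapreduce_report": {"hive_aggregate"},
}

# static_order() yields the jobs so that dependencies always precede
# the jobs that depend on them.
order = list(TopologicalSorter(jobs).static_order())
print(order)
```

A real scheduler layers calendars, retries, and cross-application triggers on top of this dependency ordering, but the ordering itself is the foundation.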
Stay tuned for additional blog posts on Cisco’s Tidal Enterprise Scheduler version 6.
Tags: Big Data, Cloudera, enterprise scheduler, Hadoop, MapR, mapreduce, sqoop, tes, Tidal
Big Data is quickly becoming a critical priority for enterprises across all verticals. Yet there is currently no fair way to compare the performance and price-performance of the underlying platforms. The Big Data Benchmarking Community, established last year to fill this void, is announcing the efforts undertaken thus far toward defining a “BigData Top100 List”.
Tags: Big Data, Big Data Benchmarking Community, Big Data Top100 List, Cisco, data center
By Shaun Kirby, Director, Innovations Architecture
Internet Business Solutions Group
If anyone still doubts the overwhelming complexity of today’s data deluge, Eric Schmidt, the chairman of Google, offers some poignant perspective. In a recent book, “The Human Face of Big Data”, he observes that from the dawn of civilization until 2003 humankind generated five exabytes of data. Now, we produce more than five exabytes of data every two days.
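A quick back-of-envelope calculation shows just how stark that comparison is (assuming “the dawn of civilization” means roughly 5,000 years before 2003; that span is my assumption, not Schmidt’s):

```python
EXABYTES = 5
YEARS_TO_2003 = 5000                  # assumed span: dawn of civilization to 2003
days_then = YEARS_TO_2003 * 365

rate_then = EXABYTES / days_then      # exabytes per day, historically
rate_now = EXABYTES / 2               # exabytes per day today

print(round(rate_now / rate_then))    # → 912500, roughly a million-fold increase
```

Under those assumptions, the daily rate of data creation has jumped by nearly six orders of magnitude.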
Those torrents of information may be intimidating, but they also promise great opportunities. Indeed, Big Data has been touted as an answer to many problems. Looking for customer buying patterns? Retailers have petabytes of purchasing history. Need to test a new drug? There are terabytes of patient data to be analyzed. Launching a new product? A mountain of social media data awaits you. Read More »
Tags: Big Data, collaboration, Complexity, data analytics, data in motion, Enterprise, H2M, Human to Machine, Internet of Everything, internet of things, IoE, IoT, M2M, Machine to Machine, P2M, people to machine