Analytics will continue to take center stage as the volume of data generated by embedded systems increases and vast pools of structured and unstructured data within and outside the enterprise are analyzed. — Gartner
Big Data will continue to be important, but it’s critical to first understand how businesses can quickly gather relevant insights from it. The value lies in unlocking key takeaways from the data, because those insights enable agile decision-making and faster time to market. Discovering better business insights quickly requires a combination of software and hardware optimized for speed, scale, and flexibility. And that is exactly what you get when you combine Platfora and Cisco UCS.
When Platfora runs on Cisco UCS, business analysts can find patterns in their data in minutes or hours rather than months. For example, a joint customer was able to identify exactly which factors affected their customer experience using the Platfora solution, which was deployed at one-tenth the time and cost of traditional approaches.
Platfora enables users to analyze petabytes of data at scale and leverages cutting-edge technologies such as Spark and YARN (sometimes called MapReduce 2.0). The Platfora end-to-end platform replaces the need for separate ETL, data warehousing, and BI tools. And the combination of Platfora and UCS ensures that there are no performance, scalability, or TCO tradeoffs as new joint data discovery use cases are added. This joint solution is truly designed for enterprise-scale analytics.
Read More »
Tags: BigData, CiscoUCS, data center, ETL, Platfora, Spark, Strata Hadoop, tco, YARN
This June in San Diego, I had the pleasure of meeting Dan Stanton, Trainer and Subject Matter Expert at NterOne, a global IT training and consulting company. Dan shared the challenges of creating great digital experiences for NterOne’s students. He and his team support virtual IT training across many different time zones and must perform twenty or so dynamic reconfigurations every week. NterOne is like many enterprise customers, only operating at a much higher rate of change.
Dan runs a multi-hypervisor environment, which made ACI a perfect match. Please listen to Dan share his use cases and how they positively impact NterOne’s business in the interview below:
For more information and insights into ACI, see:
Cisco Application Centric Infrastructure Case Study: NterOne
Getting Started with Cisco Application Centric Infrastructure (ACI) in the Small-to-Midsize Commercial Data Center
Tags: #CiscoACI, ACI, Harry Petty, NterOne
According to scientists, the age of smartphones has left humans with such a short attention span that even a goldfish can hold a thought for longer. On average, the human attention span has fallen from 12 seconds in the year 2000 to 8 seconds in today’s smart world.
What does this mean for Splunk Enterprise? Read More »
Tags: analytics, Big Data, Cisco UCS, data center, Integrated infrastructure, Splunk, Splunk Enterprise, Splunkconf, UCS, ucsbigdata
Delivering on the promise of Big Data and Analytics takes an ecosystem of partners who collaborate to integrate the underlying technologies so your organization can turn data into business value – faster. That’s why Cisco and MapR are teaming to deliver integrated solutions that are transforming the way organizations deploy and capitalize on the value of Hadoop technology.
The Cisco UCS Integrated Infrastructure for Big Data with MapR solution combines the MapR Distribution including Apache Hadoop with Cisco UCS Integrated Infrastructure for Big Data, which unifies computing, storage, connectivity, and management capabilities. This validated solution delivers an industry-leading architectural platform for Hadoop-based applications.
Cisco and MapR continue to innovate to enable new customer use cases. MapR Senior Solutions Architect, Dr. James Sun, provides an excellent example on his latest blog on Dockerizing Apache Webservers with Cisco UCS, Apache Mesos and MapR.
Read More »
Tags: Big Data, Hadoop, MapR, Strata Hadoop, UCS
Guest Blog by Ron Graham
Ron Graham has served as a Data Center Architect and Systems Engineer for some of the largest IT companies in the U.S., including Cisco Systems, NetApp, Sun Microsystems, and Oracle. He currently works for Cisco Systems as a Big Data Analytics Engineer.
What is Data Virtualization? Our definition is: Agile data integration software that makes it easy to access all your data no matter where it’s managed, and query it across the network as if it were in a single place. I like to say it differently – the real value lies in its ability to provide business users with a single high-level view of data that is spread across their infrastructure.
Data Virtualization is essentially middleware that leverages a high-performance query engine and can take advantage of advanced computing architectures such as Cisco UCS. It’s a virtual data integration layer that can deliver data from multiple sources that are loosely coupled or have little or no knowledge of one another. Of course, this is done in a logically organized manner, as shown in the diagram below.
This is all nice, but where is the beef, or the sex appeal? The sexy part is in front-end business intelligence platforms and data visualization tools, such as Tableau, that can access and analyze the data. Tableau can simply access data through Cisco Data Virtualization with an ODBC driver. From there, business users can query data on demand from a single point of access (i.e., a common data model) without having to understand the different schemas or SQL dialects of the original data sources.
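To make the single-view idea concrete, here is a toy sketch in Python. Two independent in-memory SQLite databases stand in for separate enterprise sources (in a real deployment these would be heterogeneous systems reached through Cisco Data Virtualization and an ODBC driver); the table names, schemas, and sample rows are invented for illustration only.

```python
import sqlite3

# Two independent "sources" that know nothing about each other,
# standing in for separate enterprise systems.
sales = sqlite3.connect(":memory:")
sales.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
sales.executemany("INSERT INTO orders VALUES (?, ?)",
                  [(1, 250.0), (2, 99.5), (1, 80.0)])

crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme Corp"), (2, "Globex")])

def revenue_by_customer():
    """Join across both sources and return one unified result,
    hiding the fact that the data lives in separate systems."""
    totals = {cid: amt for cid, amt in sales.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id")}
    return {name: totals.get(cid, 0.0) for cid, name in crm.execute(
        "SELECT id, name FROM customers")}

print(revenue_by_customer())  # {'Acme Corp': 330.0, 'Globex': 99.5}
```

The business user sees only `revenue_by_customer()`, the single high-level view; the federation logic behind it is what the virtualization layer provides at enterprise scale.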
Read More »
Tags: Big Data, Cisco, Cisco UCS, data center, Hadoop, Strata Hadoop, Tableau