
Building an Enterprise Data Hub, Evaluating Infrastructure

Cisco, MapR, Informatica

 

Want to get the most out of your big data? Build an enterprise data hub (EDH).

Big data is rapidly getting bigger. That in itself isn’t a problem. The issue is what Gartner analyst Doug Laney describes as the three Vs of Big Data: volume, velocity, and variety.

 


Volume refers to the ever-growing amount of data being collected. Velocity is the speed at which data is produced and moves through enterprise information systems. Variety refers to the fact that we’re gathering information from many different sources: sensors, enterprise resource planning (ERP) systems, e-commerce transactions, log files, supply chain data, social media feeds, and the list goes on.

 

Data warehouses weren’t built to handle this fast-flowing stream of wildly dissimilar data. Using them for this purpose drains resources and slows response times as workers perform numerous extract, load, and transform (ELT) operations to make stored data accessible and usable for the task at hand.

Constructing Your Hub

An EDH addresses this problem. It serves as a central platform that lets organizations collect structured, semi-structured, and unstructured data from a slew of sources, process it quickly, and make it available throughout the enterprise.

Building an EDH begins with selecting the right technology in three key areas: infrastructure, a foundational system to drive EDH applications, and the data integration platform. Obviously, you want to choose solutions that fit your needs today and allow for future growth. You’ll also want to ensure they are tested and validated to work well together and with your existing technology ecosystem. In this post, we’ll focus on selecting the right hardware.

Cisco UCS Big Data Domain

 

The Infrastructure Component

Big data deployments must be able to handle continued growth, from both a data and user load perspective. Therefore, the underlying hardware must be architected to run efficiently as a scalable cluster. Important features such as the integration of compute and network, unified management, and fast provisioning all contribute to an elastic, cloud-like infrastructure that’s required for big data workloads. No longer is it satisfactory to stand up independent new applications that result in new silos. Instead, you should plan for a common and consistent architecture to meet all of your workload requirements.

 

Big data workloads represent a relatively new model for most data centers, but that doesn’t mean best practices must change. A big data workload should be viewed through the same lens as a traditional enterprise application deployment. As always, you want to standardize on reference architectures, optimize your spending, provision new servers quickly and consistently, and meet the performance requirements of your end users.

 

Cisco Unified Computing System to Run Your EDH

Cisco UCS for Big Data

The Cisco Unified Computing System™ (Cisco UCS®) Integrated Infrastructure for Big Data delivers a highly scalable platform that is proven for enterprise applications such as Oracle, SAP, and Microsoft. It brings the same required enterprise-class capabilities (performance, advanced monitoring, simplified management, QoS guarantees) to big data workloads. With lower switch and cabling infrastructure costs, lower power consumption, and lower cooling requirements, you can realize a 30 percent reduction in total cost of ownership. In addition, with its service profiles, you get fast and consistent time-to-value by leveraging provisioning templates to instantly set up a new cluster or add many new nodes to an existing one.
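To make the service profile workflow concrete, here’s a minimal sketch using the open-source Cisco UCS Python SDK (ucsmsdk). The UCS Manager hostname, credentials, organization, template, and node names are all hypothetical placeholders, not values from a validated design.

```python
# Minimal sketch: stamp out service profiles for new cluster nodes from an
# existing template via the UCS Manager XML API (ucsmsdk). The host,
# credentials, template, org, and node names below are hypothetical.
from ucsmsdk.ucshandle import UcsHandle
from ucsmsdk.ucsmethodfactory import ls_instantiate_n_named_template
from ucsmsdk.ucsbasetype import DnSet, Dn

handle = UcsHandle("ucsm.example.com", "admin", "password")
handle.login()

# Name the new Hadoop nodes to be created from the template.
dn_set = DnSet()
for name in ["hadoop-node-01", "hadoop-node-02", "hadoop-node-03"]:
    dn = Dn()
    dn.attr_set("value", name)
    dn_set.child_add(dn)

# Instantiate all of them from one provisioning template in a single call.
xml_req = ls_instantiate_n_named_template(
    cookie=handle.cookie,
    dn="org-root/ls-BigData-Template",
    in_error_on_existing="false",
    in_name_set=dn_set,
    in_target_org="org-root")
profiles = handle.process_xml_elem(xml_req)
print("Created:", [p.dn for p in profiles])

handle.logout()
```

Because every profile is stamped from the same template, each new node comes up with identical firmware, BIOS, and network policy, which is what makes cluster expansion fast and consistent.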

 

And when deploying an EDH, the MapR Distribution including Apache Hadoop® is especially well suited to take advantage of the compute and I/O bandwidth of Cisco UCS. Cisco and MapR have been working together for the past two years and have developed Cisco Validated Design guides to give customers the most value for their IT expenditures.

 

Cisco UCS for Big Data comes in optimized power/performance-based configurations, all of which are tested with the leading big data software distributions. You can customize these configurations further or use the system as is. Using one of the preconfigured options goes a long way toward ensuring a stress-free deployment. All Cisco UCS solutions also provide a single point of control for managing all computing, networking, and storage resources, for any fine-tuning you may do before deployment or as your hub evolves.

 

I encourage you to check out the latest Gartner video to hear Satinder Sethi, our VP of Data Center Solutions Engineering and UCS Product Management, share his perspective on how powering your infrastructure is an important component of building an enterprise data hub.

 

Gartner Video

In addition, you can read the MapR Blog, Building an Enterprise Data Hub, Choosing the Foundational Software.

Let me know if you have any comments or questions below, or reach out on Twitter at @CicconeScott.

Transparently Offloading Data Warehouse Data to Hadoop using Data Virtualization

More data allows for better and more expansive analysis. And better analysis is a critical success factor for businesses today.

But most data warehouses use the once-in-never-out principle when storing data. So whenever new business activities occur, new data is added without removing old data to make room. New data sources, such as data from social media networks, open data sources, and public web services further expand the warehouse. Unfortunately, all this growth comes at a cost.

Is there a way you can have your cake and eat it too?

With Hadoop and Cisco Big Data Warehouse Expansion, you can.

Disadvantages of More Data

While everyone understands the business advantage that can be derived from analyzing more data, not everyone understands the disadvantages, which include:

  • Expensive data storage: Data warehouse costs include hardware costs, management costs, and database server license fees.  These grow in line with scale.
  • Poor query performance: The bigger the database tables, the slower the queries.
  • Poor loading performance: As tables grow, loading new data also slows down.
  • Slow backup/recovery: The larger the database, the longer the backup and restore process.
  • Expensive database administration: Larger databases require more database administration including tuning and optimizing the database server, the tables, the buffer, and so on.

Three Options to Control Costs

The easiest way to control data warehouse costs is to simply remove data, especially the less-frequently used or older data. But then this data can no longer be analyzed.

Another option is to move the lesser-used data to tape. This option provides cost savings, and in an emergency the data can be reloaded from tape. But analysis has now become extremely difficult.

The third option is to offload lesser-used data to cheaper online data storage, with Hadoop the obvious choice. This provides a 10x cost savings over traditional databases, while retaining the online access required for analysis.

This is the “have your cake and eat it too” option.
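To make the offload mechanics concrete, here’s a minimal PySpark sketch of that third option. The JDBC URL, table, credentials, cutoff date, and HDFS path are hypothetical, and in practice Cisco Big Data Warehouse Expansion packages this movement (plus the virtual views that keep it transparent) rather than leaving you to hand-code it.

```python
# Minimal sketch of offloading cold warehouse rows to Hadoop. The JDBC URL,
# table name, credentials, cutoff date, and HDFS path are all hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("warehouse-offload").getOrCreate()

# Pull the infrequently used history out of the warehouse over JDBC...
sales = (spark.read.format("jdbc")
         .option("url", "jdbc:postgresql://dw.example.com/warehouse")
         .option("dbtable", "sales_history")
         .option("user", "etl").option("password", "secret")
         .load())

# ...keep only the rows older than the cutoff...
cold = sales.filter(F.col("order_date") < "2012-01-01")

# ...and land them on Hadoop as Parquet, where they stay queryable online.
cold.write.mode("overwrite").parquet("hdfs:///warehouse_archive/sales_history")
```

The data virtualization layer can then present the warehouse’s recent rows and the Hadoop archive as one logical table, so existing reports keep working unchanged.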

The Fast Path to Transparent Offloading

Cisco provides a packaged solution called Cisco Big Data Warehouse Expansion, which includes the data virtualization software, hardware, and services required to accelerate all the activities involved in offloading data from a data warehouse to Hadoop.

And to help you understand how it works, Rick van der Lans, data virtualization’s leading independent analyst, recently wrote a step-by-step white paper, Transparently Offloading Data Warehouse Data to Hadoop using Data Virtualization, that explains everything you need to do.

Read The White Paper

Download Transparently Offloading Data Warehouse Data to Hadoop using Data Virtualization here.

 

Learn More

To learn more about Cisco Data Virtualization, check out our page.

Join the Conversation

Follow us @CiscoDataVirt.


Can the Elephant Dance to a Security Tune?

There is a great debate in the security world right now: have SIEM and logging products run their course? Will Hadoop ride to the rescue? Can machines “learn” about security and reliably spot threats that no other approach can find?

Gartner calls this phenomenon Big Data Security Analytics, and they make a strong point to define BDSA solutions as a three-layer pyramid. At the bottom is the “data lake,” which is what most people equate with Hadoop. The next layer is context—the addition of relevant business, location, and other non-traditional security information to increase the precision of the next layer: applications and analytics (such as Machine Learning). It is this top layer where the real value of BDSA is realized in terms of finding new threats and remediating them before they do damage.
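As a toy illustration of how the pyramid’s layers build on each other, here’s a minimal Python sketch using pandas and scikit-learn. The event fields, the context table, and the model choice (an isolation forest) are hypothetical stand-ins, not a description of any particular BDSA product.

```python
# Minimal sketch of the BDSA pyramid: raw events, added context, analytics.
# All field names and values below are hypothetical.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Bottom layer (data lake): raw events, e.g. pulled from Hadoop.
events = pd.DataFrame({
    "user": ["alice", "bob", "alice", "mallory"],
    "bytes_out": [120, 80, 150, 9800],
    "hour": [9, 10, 14, 3],
})

# Middle layer (context): join in non-traditional security information,
# here whether the account holds privileged access.
context = pd.DataFrame({"user": ["alice", "bob", "mallory"],
                        "is_privileged": [0, 0, 1]})
enriched = events.merge(context, on="user")

# Top layer (analytics): an unsupervised model flags anomalous events (-1).
model = IsolationForest(contamination=0.25, random_state=0)
enriched["anomaly"] = model.fit_predict(
    enriched[["bytes_out", "hour", "is_privileged"]])
print(enriched[enriched["anomaly"] == -1])
```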


Unlock The Value of Big Data with Cisco Unified Computing System

Big Data is not just about gathering tons of data: the digital exhaust from the Internet, social media, and customer records. The real value is in being able to analyze the data to achieve a desired business outcome.

Those of us who follow the Big Data market closely never lack for something new to talk about. There is always a story about how a business is using Big Data in a different way or about some new breakthrough that has been achieved in the expansive big data ecosystem. The good news for all of us is that we have clearly only scratched the surface of the Big Data opportunity!

With the increasing momentum of the Internet of Everything (IoE) market transition, there will be 50 billion devices connected to the Internet by 2020, just five years from now. As billions of new people, processes, and things become connected, each connection will become a source of potentially powerful data for businesses and the public sector. Organizations that can unlock the intelligence in this data can create new sources of competitive advantage, not just from more data but from better access to better data.

What we haven’t heard about yet are examples of enterprises applying the power of this data pervasively across their organizations, giving them a competitive edge in marketing, supply chain, manufacturing, human resources, customer support, and many other departments. An enterprise that can apply the power of Big Data throughout its organization can create multiple, simultaneous sources of ongoing innovation, each one a constantly renewable competitive edge. Looking forward, the companies that accomplish this will be the ones setting the pace for the competition to follow.

Cisco has been working on making this vision of pervasive enterprise use of Big Data a reality. We’d like to share this vision with you in an upcoming blog series and an executive webcast, ‘Unlock Your Competitive Edge with Cisco Big Data Solutions’, that will air on October 21 at 9:00 AM PT.

Register Now

 

 

I have the honor of kicking off the multi-part blog series today. Each post will focus on a specific Cisco solution that customers can use to unlock the power of their big data, enterprise-wide, to gain a competitive edge. I’ll start the discussion by highlighting the infrastructure implications of Big Data in the IoE era, focusing initially on the Cisco Unified Computing System.

Enterprises that want to make strategic use of data throughout their organizations will need to take advantage of the power of all types of data. As IoE increasingly takes root, organizations will be able to access data from virtually anywhere in their value chain. No longer restricted to small sets of structured, historical data, they’ll have more comprehensive and even real-time data, including video surveillance feeds, social media output, and sensor data, that allows them to monitor behavior, performance, and preferences. These are just a few examples, but they underscore the fact that not all data is created equal. Real-time data coming from a sensor may be valuable for only minutes, or even seconds, so it is critical to act on that intelligence as quickly as possible. From an infrastructure standpoint, that means enterprises must be able to place computing resources as close as possible to the many sources and users of data. At the same time, historical data will continue to be critical to Big Data analytics.

Cisco UCS Common Platform Architecture for Big Data from Cisco Data Center

Cisco encourages our customers to take a long-term view and select a Big Data infrastructure that is distributed and designed for high scalability, management automation, outstanding performance, low TCO, and the comprehensive security approach needed for the IoE era. And that infrastructure must be open, because there is tremendous innovation going on in this industry, and enterprises will want to take full advantage of it.

One of the foundational elements of our Big Data infrastructure is the Cisco Unified Computing System (UCS). UCS integrated infrastructure uniquely combines server, network, and storage access, and has recently claimed the #1 x86 blade server market share position in the Americas. We are now applying the same innovation that earned us that leading position directly to Big Data workloads. With its highly efficient infrastructure, UCS lets enterprises manage up to 10,000 UCS servers as if they were a single pool of resources, so they can support the largest data clusters.

UCS Mini

Because enterprises will ultimately need to capture intelligence both from data at rest in the data center and from data at the edge of the network, Cisco’s broad portfolio of UCS systems gives customers the flexibility to process data where it makes the most sense. For instance, our UCS C240 rack server has been extremely popular for Hadoop-based Big Data deployments at the data center core, and Cisco’s recently introduced UCS Mini is designed to process data at the edge of the network.

Because the entire UCS portfolio uses the same unified architecture, enterprises can choose the right compute configuration for the workload while using the same powerful management and orchestration tools to speed deployment, maximize availability, and significantly lower operating expenses. The combination of UCS Manager and service profiles, Unified Fabric and SingleConnect technology, our virtual interface card technology, and industry-leading performance really sets Cisco apart from the competition.

So please consider this just an introduction to the first component of Cisco’s “bigger” big data story. To hear more, please plan to attend our upcoming webcast, ‘Unlock Your Competitive Edge With Cisco Big Data Solutions’, on October 21.

Register Now

Every Tuesday and Thursday from now until October 21st, we’ll post another blog in the series to provide you with additional details of Cisco’s full line of products, solutions and services.

View additional blogs in the series:

     9/25:    Unlock Big Data with Breakthroughs in Management Automation

     9/30:    Turbocharging New Hadoop Workloads with Application Centric Infrastructure

     10/2:    Enable Automated Big Data Workloads with Cisco Tidal Enterprise Scheduler

     10/7:    To Succeed with Big Data, Enterprises Must Drop an IT-Centric Mindset: Securing IoT Networks Requires New Thinking

     10/9:    Aligning Solutions to meet our Customers’ Data Challenges

    10/14:   Analytics for an IoE World

Please let me know if you have any comments or questions below, or reach out on Twitter at @CicconeScott.

How Cisco Helped Solutionary Boost Security and Improve their Hadoop Performance

Every day, security threats continue to evolve as cyber attackers exploit gaps in basic security controls. In fact, the federal government alone has experienced a 680 percent increase in cybersecurity breaches over the past six years, and cybersecurity attacks against the US average 117 per day. Globally, the estimated annual cost of cybercrime is over $100 billion. Often, even when security breaches are identified, it can be extremely difficult to figure out how they happened or who is responsible.

One company working hard to prevent these threats is Solutionary, a managed security services provider (MSSP) that actively monitors its customers’ technology systems to identify and thwart security events before any negative impact occurs.

To provide real-time analytics of client traffic and user activity, Solutionary, a wholly owned subsidiary of NTT Group, developed the patented ActiveGuard® Security and Compliance Platform, which correlates data across global threats and trends to quickly identify security events and provide clients with actionable alerts.

The patented, cloud-based ActiveGuard® Security and Compliance Platform is the technology behind Solutionary Managed Security Services


To keep up with growing data volumes, the need for fast security analytics, and an expanding client base, Solutionary needed a way to scale its infrastructure quickly; its traditional server infrastructure could not easily scale or support in-depth analysis. The challenge was to figure out how to:

 

1) Increase their data analytics capabilities and improve their clients’ security

2) Cost-effectively scale as their client base and data volumes grow

 

When a security threat occurred in the past, the legacy systems could only analyze log data; they couldn’t see the big picture. When an event happened, it would sometimes take weeks of forensics work to figure out what had occurred. To meet these challenges, Solutionary turned to the MapR Distribution for Hadoop running on the Cisco Unified Computing System™. With Hadoop, Solutionary can smoothly analyze both structured and unstructured data on a single data infrastructure, instead of relying on a costly traditional database solution that couldn’t pull both kinds of data into a single platform for analysis.
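As a simplified illustration of what analyzing structured and unstructured data on one platform can look like, here’s a minimal PySpark sketch. The HDFS paths, log format, and alert schema are hypothetical stand-ins rather than Solutionary’s actual pipeline.

```python
# Minimal sketch: correlate structured alerts with unstructured log text on
# one Hadoop platform. Paths, regexes, and schemas below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("security-analytics").getOrCreate()

# Structured data: alert records already stored in columnar form,
# assumed to include a src_ip column.
alerts = spark.read.parquet("hdfs:///security/alerts")

# Unstructured data: raw firewall log lines; extract fields with regexes.
logs = spark.read.text("hdfs:///security/raw/firewall/*.log")
parsed = logs.select(
    F.regexp_extract("value", r"src=(\S+)", 1).alias("src_ip"),
    F.regexp_extract("value", r"action=(\w+)", 1).alias("action"))

# Correlate the two: denied traffic from hosts that already raised alerts.
suspicious = (parsed.filter(F.col("action") == "deny")
              .join(alerts, "src_ip"))
print(suspicious.count())
```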

Cisco UCS Common Platform Architecture for Big Data


Specifically, the Cisco/MapR environment consists of two MapR clusters of 16 Cisco UCS C240 M3 Rack Servers. Solutionary uses the Cisco UCS Manager to provision and control their servers and network resources, while the Cisco UCS 6200 Series Fabric Interconnects provide high-bandwidth connections to servers, and act as centralized management points for the Cisco infrastructure, eliminating the need to manage each element in the environment separately. Because of the environment’s high scalability, it’s easy for the fabric interconnects to support the large number of nodes needed for MapR clusters. Scalability is improved even further by using the Cisco UCS 2200 Series Fabric Extenders to extend the network into each rack.
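As a small illustration of that single management point, here’s a hypothetical sketch using the Cisco UCS Python SDK (ucsmsdk) to inventory the fabric interconnects and rack servers through one API session; the address and credentials are placeholders.

```python
# Minimal sketch: one UCS Manager session sees the fabric interconnects and
# every rack server behind them. The host and credentials are hypothetical.
from ucsmsdk.ucshandle import UcsHandle

handle = UcsHandle("ucsm.example.com", "admin", "password")
handle.login()

# The fabric interconnects act as the centralized management point.
for fi in handle.query_classid("networkElement"):
    print("Fabric Interconnect:", fi.dn, fi.model)

# Each C240 node in the MapR clusters shows up as a rack unit; no per-node
# management is required.
for server in handle.query_classid("computeRackUnit"):
    print("Server:", server.dn, server.model, server.total_memory)

handle.logout()
```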

Cisco UCS Components


With MapR and the Cisco UCS CPA for Big Data environment, Solutionary can now draw on far more data analysis and contextual data, giving it a more informed picture of behavior patterns, anomalous activities, and attack indicators. By quickly identifying global patterns, Solutionary can spot new security threats and put them into context for its clients.

Let me know if you have any comments or questions below, or reach out on Twitter at @CicconeScott.