Cisco Blogs



Cisco Connects Data and Analytics

Billions of devices are changing how organizations compete and disrupting traditional data management and analytics.

This Internet of Everything world presents an exciting new opportunity to discover and act on market, customer, and operational insights. By making sense of captured data quickly, organizations can take action at that point, in that moment, in ways that differentiate them from competitors and drive significant new business value. Time Warner Cable’s intelligent home initiative is one example.


But all this data is massive, messy, and everywhere, spanning many sources – cloud, data warehouses, devices – and formats – video, voice, text, and images.  To address this challenge, new solutions beyond traditional data warehousing and even big data are required.

Cisco Enters the Data and Analytics Market

When Cisco acquired data virtualization market leader Composite Software in mid-2013, it signaled a clear intent to begin connecting this data via intelligent networking the same way it has connected LANs, the Internet, voice and video over IP, and more across its 30-year history.

And with our December 11, 2014, Connected Analytics Portfolio announcement, Cisco adds a rich suite of analytics solutions that help organizations capture insights that create new opportunities, simplify business operations, enhance the customer experience, and resolve potential threats.

New Methods for the New Challenges

Today’s analytic solutions need to advance beyond traditional methods that move data to a warehouse or data lake before analysis can begin. Cisco’s Connected Analytics Portfolio gives analytics immediate access to data and brings analytics to the data, no matter where on the network the data resides.

Further, Cisco is uniquely qualified to implement analytics at the point of data, because so much of the data worldwide resides on our networks, providing the ideal platform for embedded analytics.  Along with 30 years of networking experience, Cisco now has the data and analytics tools, software, and services to help our customers instantly capture, analyze, and interpret critical data out to the network edge.

 

Learn More

Visit our Data and Analytics website to learn more.

Join the Conversation

Follow us @CiscoDataVirt and @CiscoAnalytics.


Transparently Offloading Data Warehouse Data to Hadoop using Data Virtualization

More data allows for better and more expansive analysis. And better analysis is a critical success factor for businesses today.

But most data warehouses use the once-in-never-out principle when storing data. So whenever new business activities occur, new data is added without removing old data to make room. New data sources, such as data from social media networks, open data sources, and public web services further expand the warehouse. Unfortunately, all this growth comes at a cost.

Is there a way you can have your cake and eat it too?

With Hadoop and Cisco Big Data Warehouse Expansion, you can.

Disadvantages of More Data

While everyone understands the business advantage that can be derived from analyzing more data, not everyone understands the disadvantages that come with it, including:

  • Expensive data storage: Data warehouse costs include hardware costs, management costs, and database server license fees.  These grow in line with scale.
  • Poor query performance: The bigger the database tables, the slower the queries.
  • Poor loading performance: As tables grow, loading new data also slows down.
  • Slow backup/recovery: The larger the database, the longer the backup and restore process.
  • Expensive database administration: Larger databases require more database administration including tuning and optimizing the database server, the tables, the buffer, and so on.

Three Options to Control Costs

The easiest way to control data warehouse costs is to simply remove data, especially the less-frequently used or older data. But then this data can no longer be analyzed.

Another option is to move the lesser-used data to tape. This option provides cost savings, and in an emergency, the data can be reloaded from tape. But analysis has now become EXTREMELY difficult.

The third option is to offload lesser-used data to cheaper online data storage, with Hadoop the obvious choice. This provides a 10x cost savings over traditional databases, while retaining the online access required for analysis.

This is the “have your cake and eat it too” option.

The Fast Path to Transparent Offloading

Cisco provides a packaged solution called Cisco Big Data Warehouse Expansion, which includes the data virtualization software, hardware, and services required to accelerate all the activities involved in offloading data from a data warehouse to Hadoop.
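To make the offloading idea concrete, here is a minimal sketch of the general pattern this kind of solution automates: move rows older than a cutoff from the warehouse tier to a cheaper Hadoop tier, then answer queries through a single “virtual” view that unions both. The table, cutoff date, and the two in-memory SQLite databases standing in for the warehouse and Hadoop are purely illustrative assumptions, not the BDWE product or its interfaces.

```python
# Illustrative sketch only: NOT the Cisco BDWE product or its API.
# Two in-memory SQLite databases stand in for the warehouse ("hot")
# and Hadoop ("cold") tiers.
import sqlite3

CUTOFF = "2013-01-01"  # hypothetical age threshold for "lesser-used" data

warehouse = sqlite3.connect(":memory:")  # stands in for the data warehouse
hadoop = sqlite3.connect(":memory:")     # stands in for Hive/Impala on Hadoop

for db in (warehouse, hadoop):
    db.execute("CREATE TABLE orders (order_id INTEGER, order_date TEXT, amount REAL)")

warehouse.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "2012-06-30", 40.0), (2, "2014-03-15", 75.0), (3, "2014-11-02", 19.5)],
)

def offload_cold_rows(cutoff):
    """Move rows older than `cutoff` from the warehouse to the Hadoop tier."""
    cold = warehouse.execute(
        "SELECT order_id, order_date, amount FROM orders WHERE order_date < ?", (cutoff,)
    ).fetchall()
    hadoop.executemany("INSERT INTO orders VALUES (?, ?, ?)", cold)
    warehouse.execute("DELETE FROM orders WHERE order_date < ?", (cutoff,))

def query_all_orders():
    """A 'virtual view': callers see one table, unaware of where rows live."""
    hot = warehouse.execute("SELECT * FROM orders").fetchall()
    cold = hadoop.execute("SELECT * FROM orders").fetchall()
    return hot + cold

offload_cold_rows(CUTOFF)
print(query_all_orders())  # all three orders remain visible to analysis
```

The key point of the sketch is that consumers keep querying one logical table; only the physical placement of the older rows changes.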

And to help you understand how it works, Rick van der Lans, data virtualization’s leading independent analyst, recently wrote a step-by-step white paper, Transparently Offloading Data Warehouse Data to Hadoop using Data Virtualization, that explains everything you need to do.

Read The White Paper

Download Transparently Offloading Data Warehouse Data to Hadoop using Data Virtualization here.

 

Learn More

To learn more about Cisco Data Virtualization, check out our page.

Join the Conversation

Follow us @CiscoDataVirt.


Enhancing the Customer Experience with Data & Analytics

Last month, I had the honor of presenting at the Internet of Things (IoT) World Forum in Chicago. The event gave me the opportunity to do one of my favorite things: collaborate and network with peers who are doing creative work in the world of IoT. One of my colleagues, Kowsalya Arunprakash, Lead Architect of Virtual Data Integration Services for Time Warner Cable, co-presented and shared a use case in which Time Warner is using Cisco Data Virtualization to enhance its customer experience with analytics. In today’s blog, I’d like to share more details about this use case, because I think it’s a great example of how organizations are leveraging IoT solutions to better serve their customers and separate themselves from their competition.

Time Warner Cable IntelligentHome is a home security and energy management system that users can control from their smartphone, tablet, or computer to do things like view live video, arm or disarm their system, turn their home lights on or off, or adjust the temperature of their thermostat. As you can imagine, each one of these pieces of equipment creates a fair amount of data through radio-frequency identification (RFID) and sensors. On top of this, as consumers generally do, users are going to social media to share their experience.

Utilizing Cisco Data Virtualization, Time Warner is able to couple this data with sales, marketing, and historical customer data to get a full 360-degree view of operational analytics. By operational, I am referring to intelligence like customer trends, sales analytics, and resource allocation management.

This combination is an extremely powerful tool to understand the customer in order to create and adapt products and services that cater to their wants and needs. It’s an opportunity to see how the product is being used, via RFIDs and sensors, coupled with customers’ feedback and experience shared on social media as well as their long-term history of usage and preferences.
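As a rough illustration of what such a 360-degree view can look like, the sketch below joins a toy telemetry feed with a toy customer table and derives a simple churn-risk flag. The schemas, field names, and scoring rule are hypothetical assumptions for illustration, not Time Warner Cable’s actual data or logic.

```python
# Conceptual sketch only; sources, fields, and the scoring rule are invented.
import pandas as pd

# Device telemetry (stand-in for RFID/sensor events from home equipment)
telemetry = pd.DataFrame({
    "customer_id": [101, 101, 102],
    "event": ["thermostat_adjust", "arm_system", "camera_view"],
})

# CRM / sales history (stand-in for the historical customer data)
crm = pd.DataFrame({
    "customer_id": [101, 102],
    "tenure_months": [26, 3],
    "open_support_tickets": [0, 4],
})

# The "virtualized" 360-degree view: usage joined to customer records
usage = telemetry.groupby("customer_id").size().reset_index(name="events_last_30d")
view_360 = crm.merge(usage, on="customer_id", how="left").fillna({"events_last_30d": 0})

# A toy churn-risk flag: low engagement plus many open tickets suggests follow-up
view_360["churn_risk"] = (view_360["events_last_30d"] < 2) & (view_360["open_support_tickets"] > 2)
print(view_360)
```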

During her presentation, Kowsalya shared that by leveraging these insights Time Warner is able to improve sales, reduce customer churn and even work with local law enforcement and emergency services to respond faster to current events. To hear more, I encourage you to watch the video interview of Kowsalya and learn about how Time Warner is using Data Virtualization to derive value for their customers.

 

Learn More

To learn more about Cisco Data and Analytics, check out our page.

Join the Conversation

Follow @MikeFlannagan and @CiscoAnalytics.


New Research Identifies How to Accelerate Data Virtualization Adoption

The challenges of data management are getting exponentially harder. Ever-increasing data quantity, diversity, and distribution are revolutionizing data management and opening the door for new solutions such as data virtualization.

Data virtualization fulfills a range of business demands for data and supports rapid iteration and fast response times, all while saving significant IT costs.
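For readers new to the concept, here is a minimal sketch of the core idea: one logical view assembled at query time from several physical sources, with no data copied in advance. The source names and fields below are invented for illustration; this is not the Cisco Data Virtualization product or its API.

```python
# Minimal, illustrative sketch of what a data virtualization layer does:
# expose one logical view over several physical sources without copying data.
from typing import Callable, Dict, List

# Each "source" is a function that fetches rows on demand, standing in for a
# warehouse table, a cloud API, a Hadoop file, and so on.
SOURCES: Dict[str, Callable[[], List[dict]]] = {
    "crm_db": lambda: [{"cust": "A", "region": "East"}, {"cust": "B", "region": "West"}],
    "billing_api": lambda: [{"cust": "A", "mrr": 120.0}, {"cust": "B", "mrr": 95.0}],
}

def logical_view() -> List[dict]:
    """Join the sources at query time; consumers never see where data lives."""
    crm = {row["cust"]: row for row in SOURCES["crm_db"]()}
    merged = []
    for bill in SOURCES["billing_api"]():
        merged.append({**crm.get(bill["cust"], {}), **bill})
    return merged

print(logical_view())
# [{'cust': 'A', 'region': 'East', 'mrr': 120.0}, {'cust': 'B', 'region': 'West', 'mrr': 95.0}]
```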

Radiant Advisors is a leading strategic research and advisory firm that helps transform today’s organizations into tomorrow’s data-driven industry leaders.  They recently teamed with Cisco to better understand the barriers experienced by companies considering data virtualization and see how companies that have already adopted data virtualization overcame them.

The Data Virtualization Adoption Dilemma

Data virtualization is an advanced technology, and nearly every major research and consulting firm provides architectural blueprints that include a logical business semantic layer.

Yet only some companies have adopted it, leaving data virtualization an often-missed opportunity for business and IT to keep pace with today’s volatile data landscape.

To achieve the benefits of data virtualization, companies need to take the leap.

Unique Research Formula

To better understand data virtualization adoption barriers, Radiant Advisors’ Research Director, Lindy Ryan, interviewed a broad pre-adopter community to understand their perceptions and concerns. These top-of-mind issues were then posed to companies that have already adopted data virtualization, who anonymously shared the insights, best practices, and lessons they learned in overcoming barriers to adoption.

The resulting research report, Overcoming Barriers to Data Virtualization Adoption, consolidated these findings into clear counsel intended to guide potential adopters to overcome perceived barriers.

Read The Research

Download Overcoming Barriers to Data Virtualization Adoption here.

 

Learn More

To learn more about Cisco Data Virtualization, check out our page.

Join the Conversation

Follow us @CiscoDataVirt.


Keeping Up With Demand for Big Data Analytics

In my recent blogs about building the right data strategy and analytics solutions, I discussed how Cisco is helping our customers to meet one of the toughest challenges brought on by the Internet of Everything (IoE) – cost-effectively managing massive amounts of distributed data. With solutions such as Cisco Data Virtualization and Cisco Big Data Warehouse Expansion (BDWE), our customers can bring all of this data together in ways that are meaningful to them. Utilizing the network to securely connect data throughout the IoE and providing advanced analytics, we help our customers predict outcomes so they can drive better decisions in real-time.

To keep up with growing demands for big data analytics, now is the time to look at workload automation end to end. In this blog, I would like to share the other aspect of big data analytics: building the right workload automation strategy to integrate data, analytics, and operations.

In today’s big data analytics environments, IT staff is regularly managing increasingly complex processes that are co-dependent on one another and span applications and departments. Rather than implementing silos of automation, we are helping customers take advantage of workload automation from a unified perspective. Our enterprise-wide workload automation solution, Cisco Tidal Enterprise Scheduler (TES), simplifies end-to-end data management and automates diverse business processes across a broad set of applications, systems and environments.

For example, many business intelligence and analytics applications operate 24 hours a day. To handle a high volume of these jobs – many with service-level agreements – customers need tools to receive and respond to alerts wherever they are. Using the Cisco Tidal Enterprise Scheduler client on their iPhones and iPads, our customers can receive alerts, check logs, and remediate errors from anywhere. As organizations continue to require fast and timely business services, we put the power directly in the users’ hands. Through TES’ self-service portal, business users can monitor the progress of relevant workloads in a web browser and perform basic job control. In addition to reducing the time a business user spends waiting for IT to address an issue, this also takes some of the burden off IT.
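To show the shape of this kind of end-to-end automation, here is a toy sketch of a dependency-driven job chain with an alert hook. It is not the TES API; the job names, dependency table, and alert function are hypothetical stand-ins.

```python
# Toy sketch of cross-application workload automation with alerting.
# NOT the Cisco TES API; jobs, dependencies, and the alert hook are invented.
from typing import Callable, Dict, List

def alert(job: str, error: Exception) -> None:
    # Stand-in for pushing an alert to an operator's phone or a portal
    print(f"ALERT: job '{job}' failed: {error}")

def extract_orders() -> None:
    print("pulling orders from the ERP system")

def load_to_hadoop() -> None:
    print("loading the extract into Hadoop")

def run_analytics() -> None:
    print("running the nightly analytics job")

JOBS: Dict[str, Callable[[], None]] = {
    "extract_orders": extract_orders,
    "load_to_hadoop": load_to_hadoop,
    "run_analytics": run_analytics,
}
# Each job runs only after the jobs it depends on have succeeded
DEPENDS_ON: Dict[str, List[str]] = {
    "extract_orders": [],
    "load_to_hadoop": ["extract_orders"],
    "run_analytics": ["load_to_hadoop"],
}

def run_all() -> None:
    done: set = set()
    pending = list(JOBS)
    while pending:
        for name in list(pending):
            if all(dep in done for dep in DEPENDS_ON[name]):
                try:
                    JOBS[name]()
                    done.add(name)
                except Exception as exc:  # demo only: catch anything and alert
                    alert(name, exc)
                    return  # stop the downstream chain, as an SLA breach would
                pending.remove(name)

run_all()
```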

TES is a key piece of any company’s big data and analytics software strategy and a priority for us at Cisco. Cisco IT uses TES internally and, due to successful deployment, is now encouraging further widespread internal adoption. Since Cisco TES is the first workload automation solution in the industry to have a suite of Hadoop adapters (as well as other data source and application adapters), Cisco IT is now additionally leveraging TES in Cisco’s own internal Big Data initiatives to deliver an end-to-end big data workload solution -- as detailed here and here. We are continuously working to develop ways to integrate TES further with other Cisco Big Data solutions, such as the recently launched BDWE solution, to provide more value through holistic data management solutions.

Taking full advantage of big data gives our customers a tremendous competitive edge. Cisco TES facilitates end-to-end management of the entire process, from data acquisition, to value extraction, to action. Stay tuned as we continue to deliver new capabilities that further bridge today’s gap between enterprise IT capabilities and business requirements.

 

Learn More

To learn more about Cisco Tidal Enterprise Scheduler and the power of our integrated infrastructure for big data, check out our page.

Join the Conversation

Follow @MikeFlannagan.
