Announcing Connected Analytics for IoE

Over the past few weeks, I’ve shared how we are helping our customers address one of the toughest challenges brought on by the Internet of Everything (IoE), Big Data and hybrid IT environments: effectively managing massive amounts of data, of many different types, spread across many locations. With solutions like Data Virtualization, Big Data Warehouse Expansion and Cisco Tidal Enterprise Scheduler, we give our customers the tools to address this challenge head on.

Once you have access to all of your data…what next? The second challenge is to extract valuable information from that data in real time in order to make better business decisions. As I’ve said before, more data is only a good thing if you use that data to better respond to opportunities and potential threats. Our customers certainly understand this and, in a recent Cisco study, 40% of surveyed companies identified effectively capturing, storing and analyzing data generated by connected “things” (e.g., machines, devices, equipment) as the biggest challenge to realizing the value of IoT.

The majority of data analysis has historically been performed after moving all data into a centralized repository, but digital enterprises will have so many connections creating so much widely distributed data that moving it all to a central place for analysis will no longer be the optimal approach. For insights needed in real-time, or data sets that are too large to move, the ability to perform analytics at the edge will be a new capability that must be incorporated into any comprehensive analytics strategy.
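
As a rough illustration of what analytics at the edge can look like in practice, here is a minimal Python sketch. The sensor readings, window size, and anomaly threshold are hypothetical; the point is that an edge node summarizes raw data locally and forwards only summaries and anomalies, rather than every reading, to the central repository.

```python
# Minimal sketch: pre-aggregate sensor readings at the edge so only
# summaries and anomalies are forwarded to the central repository.
# The reading format, window size, and threshold are illustrative assumptions.

from statistics import mean

ANOMALY_THRESHOLD = 90.0   # hypothetical alarm level for this sensor
WINDOW_SIZE = 60           # readings per summary window

def process_window(readings):
    """Reduce a window of raw readings to one summary plus any anomalies."""
    summary = {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "min": min(readings),
    }
    anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
    return summary, anomalies

def edge_loop(stream):
    """Consume a raw stream and yield only what is worth sending upstream."""
    window = []
    for reading in stream:
        window.append(reading)
        if len(window) == WINDOW_SIZE:
            summary, anomalies = process_window(window)
            yield {"summary": summary, "anomalies": anomalies}
            window = []

if __name__ == "__main__":
    import random
    fake_stream = (random.gauss(70, 12) for _ in range(300))
    for upstream_msg in edge_loop(fake_stream):
        print(upstream_msg)
```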

Analytics 1.0 was all about structured data, in centralized data repositories.  Analytics 2.0 added unstructured data and gave rise to Big Data. Analytics 3.0 will require all of those existing capabilities but will also require data management and analytics capabilities closer to where the data is created…at the edge of the network.

With this new approach in mind, today we announced Connected Analytics for IoE, a portfolio of packaged, network-enriched analytics offerings that leverage Cisco technologies and data to extract valuable information in real time and help customers:

  • Optimize the fan experience -- Connected Analytics for Events monitors Wi-Fi, device and application usage along with social media to deliver insights on fan engagement and business operations.
  • Improve store operations and customer service -- Connected Analytics for Retail supports analysis of metrics, including customer and operational data in retail environments, to help stores take new steps to assure customer satisfaction and store performance.
  • Enhance service quality, customer experience and unveil opportunities for new business -- Connected Analytics for Service Providers provides near real-time operational and customer intelligence from patterns in networks, operations, and customer system data.
  • Understand how to get the most out of your IT assets -- Connected Analytics for IT provides advanced data management, data governance, business intelligence and insights to help align and get the most out of IT capabilities and services.
  • Reveal hidden patterns impacting network deployment and optimization -- Connected Analytics for Network Deployment analyzes devices, software, and features for inconsistencies that disrupt network operations and provides visualizations and actionable recommendations to prioritize network planning and optimization activities.
  • Understand customer patterns in order to meet quality expectations and uncover monetization strategies -- Connected Analytics for Mobility analyzes mobile networks to provide network, operations and business insights for proactive governance to Wi-Fi solution customers.
  • Gain a holistic view of customers across data silos -- Cisco Connected Analytics for Contact Center delivers actionable customer intelligence to impact behaviors and outcomes during the critical window of customer decision making. Having the right offer at the right time will drive market leadership.
  • Measure the impact of collaboration in comparison with best practices -- Cisco Connected Analytics for Collaboration measures the adoption of collaboration technologies internally. It leverages data collection using the Unified Communications Audit Tool, from sources such as WebEx, IP Phones, Video, Email and Jabber.

The portfolio also includes Cisco Connected Streaming Analytics, a scalable, real-time platform that combines quick and easy network data collection from a variety of sources with one of the fastest streaming analytics engines in the industry.
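
Cisco’s streaming engine itself is proprietary, but as a rough sketch of what streaming analytics means in practice, the following Python example computes per-source event counts over tumbling time windows as events arrive, rather than after the data has been batch-loaded into a warehouse. The event format and window length here are illustrative assumptions, not the Connected Streaming Analytics API.

```python
# Rough illustration of streaming analytics: compute per-source event counts
# over tumbling time windows as events arrive, instead of batch-processing
# stored data. Event shape and window length are illustrative assumptions.

from collections import Counter

WINDOW_SECONDS = 10

def windowed_counts(events):
    """events: iterable of (timestamp_seconds, source_id) tuples, in time order.
    Yields (window_start, Counter of events per source) for each window."""
    window_start = None
    counts = Counter()
    for ts, source in events:
        if window_start is None:
            window_start = ts - (ts % WINDOW_SECONDS)
        while ts >= window_start + WINDOW_SECONDS:
            yield window_start, counts
            window_start += WINDOW_SECONDS
            counts = Counter()
        counts[source] += 1
    if counts:
        yield window_start, counts

if __name__ == "__main__":
    sample = [(1, "router-a"), (3, "router-b"), (4, "router-a"),
              (12, "router-b"), (15, "router-b"), (27, "router-a")]
    for start, per_source in windowed_counts(sample):
        print(f"window starting {start}s: {dict(per_source)}")
```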

In the world of IoE, data is massive, messy, and everywhere, spanning many sources – cloud, data warehouses, devices – and formats – video, voice, text, and images. The power of an intelligent infrastructure is what brings all of this data together, regardless of its location or type. That is the Cisco difference.

The New Analytics Imperative

Cisco today announced a data and analytics strategy and a suite of analytics software that will enable customers to translate their data into actionable business insight regardless of where the data resides.

With the number of connected devices projected to grow from 10 billion today to 50 billion by 2020, the flood tide of new data — widely distributed and often unstructured — is disrupting traditional data management and analytics. Traditionally most organizations created data inside their own four walls and saved it in a centralized repository. This made it easy to analyze the data and extract valuable information to make better business decisions.

But the arrival of the Internet of Everything (IoE) — the hyper-connection of people, process, data, and things – is quickly changing all that. The amount of data is huge. It’s coming from widely disparate sources (like mobile devices, sensors, or remote routers), and much of that data is being created at the edge. Organizations can now get data from everywhere — from every device and at any time — to answer questions about their markets and customers that they never could before. But IT managers and key decision makers are struggling to find the useful business nuggets from this mountain of data.

As an example, take the typical offshore oil rig, which generates up to 2 terabytes of data per day. The majority of this data is time sensitive to both production and safety. Yet it can take up to 12 days to move a single day’s worth of data from its source at the network edge back to the data center or cloud. This means that analytics at the edge are critical to knowing what’s going on as it happens, not almost two weeks later.
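
A quick back-of-the-envelope check shows why (my own arithmetic, with assumed link speeds rather than figures from the post): moving 2 TB over an effective throughput of roughly 15 Mbps takes about 12 days, and a thinner satellite link takes far longer.

```python
# Back-of-the-envelope check of the oil-rig example above: how long does it
# take to move 2 TB of daily data over typical remote-site uplinks?
# The link speeds below are illustrative assumptions, not figures from the post.

DATA_BYTES = 2 * 10**12          # 2 TB generated per day
BITS = DATA_BYTES * 8

for label, mbps in [("satellite, 2 Mbps", 2),
                    ("microwave, 15 Mbps", 15),
                    ("fiber, 1000 Mbps", 1000)]:
    seconds = BITS / (mbps * 10**6)
    print(f"{label}: {seconds / 86400:.1f} days per day of data")
```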

How ‘Data’ and ‘Process’ Are Reshaping the Future Workforce

The sheer size, variety, and speed of data traversing today’s networks are increasing exponentially. This highly distributed data is generated by a wide range of cloud and enterprise applications, websites, social media, computers, smartphones, sensors, cameras, and much more — all coming in different formats and protocols.

Whether it is in the cloud or at the edge, data generated by the Internet of Everything (IoE) must be analyzed to identify actionable insights that can be used to create better outcomes (such as from process optimization or improved customer engagement). Without this critical step, data remains just “data.”

There is often an immense gap, however, between the amount of data with hidden value and the amount of value that is actually being extracted. According to IDC, less than 1 percent of the world’s data is currently being analyzed. What good is data if it isn’t analyzed to gain insights?

It’s no surprise, then, that in a recent survey conducted by Cisco Consulting Services, IT and Operational Technology leaders indicated that they perceive the Internet of Things (IoT) — a critical enabler of IoE — as being about much more than just “things.” When we asked them which area (people, process, data, or things) they needed to improve most to make effective use of IoT solutions, the largest number (40 percent) indicated “Data,” while “Process” (27 percent) ranked second. “People” placed third (20 percent) and “Things” finished last (13 percent).

Focus on Capturing Insights, Not on Connecting Things, to Attain IoT Value from Cisco Business Insights

Modern Art Goes Wireless With Cisco Connected Mobile Experiences (CMX)

Like any service organization, museums are looking for innovative ways to gain insight into consumption patterns, to better optimize resources and to improve the guest experience.

The Cisco Connected Mobile Experiences (CMX) solution helps organizations use their network as a platform to derive analytics that can be used to better understand how guests with mobile devices use and interact with a venue. These types of insights can allow venues like museums to optimize their operations and deliver a better experience to their guests.
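
As a rough illustration of the kind of insight involved (not the CMX data model or API), the following Python sketch derives per-zone visitor counts and average dwell time from hypothetical device-sighting records of the sort a Wi-Fi infrastructure can report.

```python
# Rough illustration of location analytics: derive per-zone visitor counts and
# average dwell time from device sightings. The sighting format
# (device_id, zone, timestamp) is a simplifying assumption, not the CMX API.

from collections import defaultdict

def zone_stats(sightings):
    """sightings: iterable of (device_id, zone, timestamp_seconds).
    Returns {zone: (unique_devices, avg_dwell_seconds)}."""
    # Track the first and last time each device was seen in each zone.
    spans = defaultdict(lambda: [float("inf"), float("-inf")])
    for device, zone, ts in sightings:
        key = (device, zone)
        spans[key][0] = min(spans[key][0], ts)
        spans[key][1] = max(spans[key][1], ts)

    per_zone = defaultdict(list)
    for (device, zone), (first, last) in spans.items():
        per_zone[zone].append(last - first)

    return {zone: (len(dwells), sum(dwells) / len(dwells))
            for zone, dwells in per_zone.items()}

if __name__ == "__main__":
    sample = [("aa:bb", "impressionists", 0), ("aa:bb", "impressionists", 900),
              ("cc:dd", "impressionists", 100), ("cc:dd", "modern", 1300),
              ("cc:dd", "modern", 2500)]
    for zone, (visitors, avg_dwell) in zone_stats(sample).items():
        print(f"{zone}: {visitors} visitors, avg dwell {avg_dwell:.0f}s")
```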

OpenSOC: An Open Commitment to Security

According to the Breach Level Index, between July and September of this year, an average of 23 data records were lost or stolen every second – close to two million records every day.1 This data loss will continue as attackers become increasingly sophisticated in their attacks. Given this stark reality, we can no longer rely on traditional means of threat detection. Technically advanced attackers often leave behind clues to their activities, but uncovering them usually involves filtering through mountains of logs and telemetry. The application of big data analytics to this problem has become a necessity.

To help organizations leverage big data in their security strategy, we are announcing the availability of an open source security analytics framework: OpenSOC. The OpenSOC framework helps organizations make big data part of their technical security strategy by providing a platform for the application of anomaly detection and incident forensics to the data loss problem. By integrating numerous elements of the Hadoop ecosystem such as Storm, Kafka, and Elasticsearch, OpenSOC provides a scalable platform incorporating capabilities such as full-packet capture indexing, storage, data enrichment, stream processing, batch processing, real-time search, and telemetry aggregation. It also provides a centralized platform to effectively enable security analysts to rapidly detect and respond to advanced security threats.

The OpenSOC framework provides three key elements for security analytics:

  1. Context

    A mechanism to capture, store, and normalize any type of security telemetry at extremely high rates. OpenSOC ingests data and pushes it to various processing units for advanced computation and analytics, providing the necessary context for security protection and efficient information storage. It provides the visibility and information required for successful investigation, remediation, and forensic work.

  2. Real-time

    Real-time processing and application of enrichments such as threat intelligence, geolocation, and DNS information to collected telemetry. The immediate application of this information to incoming telemetry provides greater context and situational awareness critical for detailed and timely investigations; a simplified sketch of this enrichment step follows the list.

  3. Centralized Perspective

    The interface presents alert summaries with threat intelligence and enrichment data specific to an alert on a single page. The advanced search capabilities and full packet-extraction tools are available for investigation without the need to pivot between multiple tools.
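
To make the second element more concrete, here is a simplified Python sketch of the enrichment step: tagging incoming telemetry events with threat-intelligence and geolocation context as they arrive. The lookup tables and event fields are illustrative assumptions; in an actual OpenSOC deployment this work is done by stream processing fed from Kafka, against real intelligence feeds.

```python
# Simplified sketch of real-time enrichment: tag incoming telemetry events
# with threat-intel and geolocation context as they arrive. The lookup tables
# and event fields below are illustrative assumptions, not OpenSOC internals.

from datetime import datetime, timezone

# Hypothetical enrichment sources (in production these would be external feeds).
THREAT_INTEL = {"203.0.113.7": "known C2 server"}
GEO_DB = {"203.0.113.7": ("NL", "Amsterdam"), "198.51.100.23": ("US", "Austin")}

def enrich(event):
    """Attach threat-intel and geo context to a raw telemetry event (a dict
    with at least a 'dst_ip' field) and stamp it with processing time."""
    dst = event.get("dst_ip", "")
    enriched = dict(event)
    enriched["threat_intel"] = THREAT_INTEL.get(dst)
    enriched["geo"] = GEO_DB.get(dst)
    enriched["enriched_at"] = datetime.now(timezone.utc).isoformat()
    return enriched

if __name__ == "__main__":
    raw_events = [
        {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.7", "bytes": 4096},
        {"src_ip": "10.0.0.9", "dst_ip": "198.51.100.23", "bytes": 120},
    ]
    for evt in raw_events:
        print(enrich(evt))
```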

During a breach, sensitive customer information and intellectual property are compromised, putting the company’s reputation and resources at risk. Quickly identifying and resolving the issue is critical, but traditional approaches to security incident investigation can be time-consuming. An analyst may need to take the following steps:

  1. Review reports from a Security Incident and Event Manager (SIEM) and run batch queries on other telemetry sources for additional context.
  2. Research external threat intelligence sources to uncover proactive warnings of potential attacks.
  3. Research a network forensics tool with full packet capture and historical records in order to determine context.

Apart from having to access several tools and information sets, searching and analyzing the volume of data collected can take minutes to hours using traditional techniques.

When we built OpenSOC, one of our goals was to bring all of these pieces together into a single platform.  Analysts can use a single tool to navigate data with narrowed focus instead of wasting precious time trying to make sense of mountains of unstructured data.

No two networks are created equal. Telemetry sources differ in every organization. The amount of telemetry that must be collected and stored in order to provide enough historical context also depends on the amount of data flowing through the network. Furthermore, relevant threat intelligence differs for each and every organization.

As an open source solution, OpenSOC opens the door for any organization to create an incident detection tool specific to their needs.  The framework is highly extensible: any organization can customize their incident investigation process. It can be tailored to ingest and view any type of telemetry, whether it is for specialized medical equipment or custom-built point of sale devices. By leveraging Hadoop, OpenSOC also has the foundational building blocks to horizontally scale the amount of data it collects, stores, and analyzes based on the needs of the network.  OpenSOC will continually evolve and innovate, vastly improving organizations’ ability to handle security incident response.

We look forward to seeing the OpenSOC framework evolve in the open source community. For more information and to contribute to the OpenSOC community, please visit the community website at http://opensoc.github.io/.


 

1. http://www.breachlevelindex.com/
