
Step-by-Step Setup of ELK for NetFlow Analytics


Intro

 

The ELK stack is a set of analytics tools. Its initials represent Elasticsearch, Logstash and Kibana. Elasticsearch is a flexible and powerful open source, distributed, real-time search and analytics engine. Logstash is a tool for receiving, processing and outputting logs, like system logs, webserver logs, error logs, application logs and many more. Kibana is an open source (Apache-licensed), browser-based analytics and search dashboard for Elasticsearch.

ELK is a useful and efficient open source analytics platform, and we wanted to use it to consume flow analytics from a network. We chose ELK because it can efficiently handle large amounts of data, it is open source, and it is highly customizable to the user’s needs. The flows were exported by various hardware and virtual infrastructure devices in NetFlow v5 format. Logstash was then responsible for processing them and storing them in Elasticsearch, and Kibana, in turn, was responsible for reporting on the data. Given that there were no complete guides on how to use NetFlow with ELK, below we present a step-by-step guide on how to set up ELK from scratch and enable it to consume and display NetFlow v5 information. Readers should note that the ELK ecosystem includes more tools, like Shield and Marvel, which are used for security and Elasticsearch monitoring, but their use falls outside the scope of this guide.

In our setup, we used

  • Elasticsearch 1.3.4
  • Logstash 1.4.2
  • Kibana 3.1.1

For the purposes of this example, we deployed a single-node Elasticsearch cluster, with that one node responsible for both collecting and indexing data. Experienced users could leverage Kibana to consume data from multiple Elasticsearch nodes. Elasticsearch, Logstash and Kibana were all running on our Ubuntu 14.04 server with IP address 10.0.1.33. For more information on clusters, nodes and shards, refer to the Elasticsearch guide.
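
To make the pipeline concrete, a Logstash configuration along the lines of the sketch below can receive NetFlow v5 datagrams over UDP, decode them with the netflow codec, and index the resulting flow records into Elasticsearch. The UDP port (9995) and the index name are example choices on our part rather than requirements; your flow exporters simply need to be pointed at the collector’s IP address and whatever port you configure here.

    input {
      udp {
        # NetFlow v5 exporters send to this port; 9995 is only an example.
        port  => 9995
        codec => netflow {
          versions => [5]
        }
        type => "netflow"
      }
    }

    output {
      elasticsearch {
        # The single-node cluster from this guide; adjust host and index to taste.
        host     => "10.0.1.33"
        protocol => "http"
        index    => "logstash_netflow5-%{+YYYY.MM.dd}"
      }
    }

Logstash can then be started against this file (for example, bin/logstash -f logstash-netflow.conf), and a Kibana dashboard pointed at the resulting index.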



A Safer Ride, with a Smarter Motorcycle Helmet

Connecting Dark Assets: An ongoing series on how the Internet of Everything is transforming the ways in which we live, work, play, and learn.

Racing down the wide, open highway on a beautifully crafted motorcycle is one of life’s most exhilarating rushes. At least I used to think so, before my wife talked me into taking up safer pastimes.

But Internet of Everything (IoE) technologies may be offering me a new lease on motorcycling. A new product called the Skully AR-1 is being billed as “The World’s Smartest Motorcycle Helmet.” And who am I to argue?


OpenSOC: An Open Commitment to Security

According to the Breach Level Index, between July and September of this year, an average of 23 data records were lost or stolen every second, which is close to two million records every day.1 This data loss will continue as attackers become increasingly sophisticated in their attacks. Given this stark reality, we can no longer rely on traditional means of threat detection. Technically advanced attackers often leave behind evidence of their activities, but uncovering it usually involves filtering through mountains of logs and telemetry. The application of big data analytics to this problem has become a necessity.

To help organizations leverage big data in their security strategy, we are announcing the availability of an open source security analytics framework: OpenSOC. The OpenSOC framework helps organizations make big data part of their technical security strategy by providing a platform for the application of anomaly detection and incident forensics to the data loss problem. By integrating numerous elements of the Hadoop ecosystem such as Storm, Kafka, and Elasticsearch, OpenSOC provides a scalable platform incorporating capabilities such as full-packet capture indexing, storage, data enrichment, stream processing, batch processing, real-time search, and telemetry aggregation. It also provides a centralized platform to effectively enable security analysts to rapidly detect and respond to advanced security threats.

The OpenSOC framework provides three key elements for security analytics:

  1. Context

    A mechanism to capture, store, and normalize any type of security telemetry at extremely high rates. OpenSOC ingests data and pushes it to various processing units for advanced computation and analytics, providing the context necessary for security protection and enabling efficient information storage. It provides the visibility and information required for successful investigation, remediation, and forensic work.

  2. Real-time

    Real-time processing and application of enrichments such as threat intelligence, geolocation, and DNS information to collected telemetry. The immediate application of this information to incoming telemetry provides greater context and situational awareness, which is critical for detailed and timely investigations (a simplified sketch of this enrichment step follows the list below).

  3. Centralized Perspective

    The interface presents alert summaries with threat intelligence and enrichment data specific to an alert on a single page. The advanced search capabilities and full packet-extraction tools are available for investigation without the need to pivot between multiple tools.
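
OpenSOC performs this kind of enrichment within its stream-processing layer; purely to illustrate the pattern, the simplified Python sketch below takes a raw telemetry event, attaches geolocation and threat-intelligence context, and returns the enriched record ready for indexing. The field names and the in-memory lookup tables are hypothetical stand-ins for real enrichment sources, not OpenSOC code.

    import json

    # Hypothetical lookup tables standing in for the GeoIP and
    # threat-intelligence feeds consulted during enrichment.
    GEO_DB = {"198.51.100.7": {"country": "US", "city": "Austin"}}
    THREAT_FEED = {"203.0.113.99"}  # known-bad source addresses

    def enrich(event):
        """Attach geolocation and threat-intel context to a raw telemetry event."""
        enriched = dict(event)
        src = event.get("src_ip", "")
        enriched["geo"] = GEO_DB.get(src, {})
        enriched["threat_match"] = src in THREAT_FEED
        return enriched

    if __name__ == "__main__":
        raw = {"src_ip": "203.0.113.99", "dst_ip": "10.1.1.20", "bytes": 4096}
        # In a real pipeline the enriched record would be indexed for search.
        print(json.dumps(enrich(raw), indent=2))

In production the lookups would be backed by live threat feeds and geolocation services, and the enriched events indexed into Elasticsearch to power the centralized views described above.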

During a breach, sensitive customer information and intellectual property are compromised, putting the company’s reputation and resources at risk. Quickly identifying and resolving the issue is critical, but traditional approaches to security incident investigation can be time-consuming. An analyst may need to take the following steps:

  1. Review reports from a Security Incident and Event Manager (SIEM) and run batch queries on other telemetry sources for additional context.
  2. Research external threat intelligence sources to uncover proactive warnings of potential attacks.
  3. Consult a network forensics tool with full packet capture and historical records in order to establish context.

Apart from requiring access to several tools and information sets, searching and analyzing the volume of data collected can take minutes to hours using traditional techniques.

When we built OpenSOC, one of our goals was to bring all of these pieces together into a single platform.  Analysts can use a single tool to navigate data with narrowed focus instead of wasting precious time trying to make sense of mountains of unstructured data.

No two networks are created equal. Telemetry sources differ in every organization. The amount of telemetry that must be collected and stored in order to provide enough historical context also depends on the amount of data flowing through the network. Furthermore, relevant threat intelligence differs for each and every organization.

As an open source solution, OpenSOC opens the door for any organization to create an incident detection tool specific to its needs. The framework is highly extensible: any organization can customize its incident investigation process, and the platform can be tailored to ingest and view any type of telemetry, whether for specialized medical equipment or custom-built point-of-sale devices. By leveraging Hadoop, OpenSOC also has the foundational building blocks to horizontally scale the amount of data it collects, stores, and analyzes based on the needs of the network. OpenSOC will continually evolve and innovate, vastly improving organizations’ ability to handle security incident response.

We look forward to seeing the OpenSOC framework evolve in the open source community. For more information and to contribute to the OpenSOC community, please visit the community website at http://opensoc.github.io/.


 

1. http://www.breachlevelindex.com/


Elasticsearch ELK + Cisco UCS turn massive data into massive insights


The Internet of Everything continues to gain momentum and every new connection is creating new data. Cisco UCS Integrated Infrastructure for Big Data is helping customers convert that data into powerful intelligence, and we’re working with a number of new partners to bring exciting new solutions to our customers.

Today, I want to spotlight Elasticsearch, Inc. and welcome them to the Cisco Solution Partner Program.

Press Release

Elasticsearch Blog

Elasticsearch excels at providing real-time insight into data, whether structured or unstructured, human- or machine-generated, by bringing a search-based architecture to data analytics. By combining the ELK stack with Cisco UCS, organizations benefit from a turnkey underlying infrastructure solution that provides real-time search and analytics for a variety of applications, from log analysis to structured, semi-structured, or unstructured search, as well as a web back end for custom applications that use search-based analytics as a core functionality.

Mozilla is just one of the companies already benefiting from the joint solution, with real-time search and analysis of data powering its defense platform, MozDef. The ELK stack leverages Cisco UCS’ fast connectivity for query, indexing and replication of data traffic, and Elasticsearch handles the full scale of event storage, archiving, indexing and searching of the data logs. The ELK stack and Cisco UCS also protect Mozilla’s network, services, systems, and audit data from hackers.

Partners like Elasticsearch are just one reason that Cisco UCS Integrated Infrastructure can help your company capitalize on the IoE data avalanche and deliver powerful and cost-effective analytics solutions throughout your enterprise.

Find out more at www.cisco.com/go/bigdata, or register for a webinar entitled “Learn How Mozilla Tackles their Security Logs with Elasticsearch and Cisco”.

Register Now

 

 

Webinar Details:

Thursday, November 13th 
9:00 AM PST / 12:00 PM EST / 5:00 PM GMT


Are you interested in learning how to build enterprise applications on top of Elasticsearch and Cisco’s Unified Computing System (UCS) infrastructure? We’re holding a webinar to delve more deeply into how to optimize ELK on Cisco UCS infrastructure.

Cisco UCS unites compute, network, and storage access into a single cohesive system. By combining the ELK stack with Cisco UCS, businesses benefit by having a turnkey hardware-software solution for their search and analytics applications. In this webinar you’ll learn about the various UCS hardware profiles you should consider when deploying ELK and how Mozilla built MozDef, their custom SIEM application, using ELK on Cisco UCS.

Agenda:

  • Introduction – Jobi George, Elasticsearch (5 minutes)
  • Overview of UCS + ELK reference architectures – Raghunath Nambiar, Distinguished Engineer, Data Center Business Group, Cisco (10 minutes)
  • How Mozilla Built MozDef on ELK and Cisco UCS – Jeff Bryner, Security Assurance, Mozilla (25 minutes)
  • Q&A – Jobi George, Elasticsearch (~20 minutes)

 

 

 


Enhancing the Customer Experience with Data & Analytics

Last month, I had the honor of presenting at the Internet of Things (IoT) World Forum in Chicago. The event gave me the opportunity to do one of my favorite things: collaborate and network with peers who are doing creative work within the world of IoT. One of my colleagues, Kowsalya Arunprakash, Lead Architect of Virtual Data Integration Services for Time Warner Cable, co-presented and shared a use case in which Time Warner is utilizing Cisco Data Virtualization to enhance its customer experience with analytics. In today’s blog, I’d like to share more details about this use case, because I think it’s a great example of how organizations are leveraging IoT solutions to better serve their customers and separate themselves from the competition.

Time Warner Cable IntelligentHome is a home security and energy management system that users can control from their smartphone, tablet or computer to do things like view live video, arm/disarm their system, turn their home lights on/off or adjust the temperature of their thermostat. As you can imagine, each one of these pieces of equipment creates a fair amount of data through radio-frequency identification (RFID) and sensors. On top of this, as consumers generally do, users are going to social media to share their experience.

Utilizing Cisco Data Virtualization, Time Warner is able to couple this data with sales, marketing and historical customer data to get a full 360-degree view of operational analytics. By operational, I am referring to intelligence like customer trends, sales analytics and resource allocation management.

This combination is an extremely powerful tool to understand the customer in order to create and adapt products and services that cater to their wants and needs. It’s an opportunity to see how the product is being used, via RFIDs and sensors, coupled with customers’ feedback and experience shared on social media as well as their long-term history of usage and preferences.

During her presentation, Kowsalya shared that by leveraging these insights Time Warner is able to improve sales, reduce customer churn and even work with local law enforcement and emergency services to respond faster to current events. To hear more, I encourage you to watch the video interview of Kowsalya and learn about how Time Warner is using Data Virtualization to derive value for their customers.

 

Learn More

To learn more about Cisco Data and Analytics, check out our page.

Join the Conversation

Follow @MikeFlannagan and @CiscoAnalytics.
