According to the Breach Level Index, between July and September of this year, an average of 23 data records were lost or stolen every second – close to two million records every day. This loss will continue as attackers grow increasingly sophisticated. Given this stark reality, we can no longer rely on traditional means of threat detection. Technically advanced attackers often leave behind evidence of their activity, but uncovering it usually involves filtering through mountains of logs and telemetry. Applying big data analytics to this problem has become a necessity.
To help organizations make big data part of their security strategy, we are announcing the availability of an open source security analytics framework: OpenSOC. OpenSOC provides a platform for applying anomaly detection and incident forensics to the data loss problem. By integrating numerous elements of the Hadoop and big data ecosystem such as Apache Storm, Apache Kafka, and Elasticsearch, it delivers a scalable platform with capabilities including full-packet capture indexing, storage, data enrichment, stream processing, batch processing, real-time search, and telemetry aggregation. It also gives security analysts a centralized platform from which to rapidly detect and respond to advanced security threats.
The OpenSOC framework provides three key elements for security analytics:
A mechanism to capture, store, and normalize any type of security telemetry at extremely high rates. OpenSOC ingests data and pushes it to various processing units for advanced computation and analytics, providing the context needed for security protection and efficient information storage. It delivers the visibility and information required for successful investigation, remediation, and forensic work.
Real-time processing and application of enrichments such as threat intelligence, geolocation, and DNS information to collected telemetry. The immediate application of this information to incoming telemetry provides the greater context and situational awareness critical for detailed and timely investigations.
A centralized interface that presents alert summaries, with threat intelligence and enrichment data specific to each alert on a single page. Advanced search capabilities and full packet-extraction tools are available for investigation without the need to pivot between multiple tools.
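The capture-normalize-enrich flow described above can be sketched in miniature. Note that the field names, lookup tables, and feed contents below are invented stand-ins for OpenSOC's real schema and enrichment sources (GeoIP databases, threat-intelligence feeds, DNS caches); this is only an illustration of the pattern, not the framework's actual code.

```python
import ipaddress

# Hypothetical static lookup tables standing in for real enrichment sources.
GEO_DB = {"198.51.100.0/24": "US", "203.0.113.0/24": "DE"}
THREAT_FEED = {"203.0.113.7"}

def normalize(raw):
    """Map a raw telemetry record into a common schema."""
    return {
        "src_ip": raw.get("source") or raw.get("src"),
        "dst_ip": raw.get("destination") or raw.get("dst"),
        "bytes": int(raw.get("bytes", 0)),
    }

def enrich(event):
    """Attach geolocation and threat-intel context to a normalized event."""
    ip = event["src_ip"]
    event["geo"] = next(
        (cc for net, cc in GEO_DB.items()
         if ipaddress.ip_address(ip) in ipaddress.ip_network(net)),
        "unknown",
    )
    event["threat_match"] = ip in THREAT_FEED
    return event

# Two differently shaped raw records land in the same enriched schema.
event = enrich(normalize({"source": "203.0.113.7", "dst": "10.0.0.5", "bytes": "1400"}))
```

In OpenSOC itself, stages like these run as distributed stream-processing steps over Kafka-fed telemetry rather than as in-process function calls, so enrichment happens in real time as data arrives.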
During a breach, sensitive customer information and intellectual property are compromised, putting the company’s reputation and resources at risk. Quickly identifying and resolving the issue is critical, but traditional approaches to security incident investigation can be time-consuming. An analyst may need to take the following steps:
- Review reports from a Security Incident and Event Manager (SIEM) and run batch queries on other telemetry sources for additional context.
- Research external threat intelligence sources to uncover proactive warnings of potential attacks.
- Query a network forensics tool with full packet capture and historical records in order to determine context.
Apart from having to access several tools and information sets, searching and analyzing the volume of data collected can take minutes to hours using traditional techniques.
When we built OpenSOC, one of our goals was to bring all of these pieces together into a single platform. Analysts can use a single tool to navigate data with narrowed focus instead of wasting precious time trying to make sense of mountains of unstructured data.
No two networks are created equal. Telemetry sources differ in every organization. The amount of telemetry that must be collected and stored to provide enough historical context also depends on the amount of data flowing through the network. Furthermore, relevant threat intelligence differs for each individual organization.
As an open source solution, OpenSOC opens the door for any organization to create an incident detection tool specific to its needs. The framework is highly extensible: any organization can customize its incident investigation process, and the platform can be tailored to ingest and view any type of telemetry, whether it comes from specialized medical equipment or custom-built point-of-sale devices. By leveraging Hadoop, OpenSOC also has the foundational building blocks to horizontally scale the amount of data it collects, stores, and analyzes based on the needs of the network. OpenSOC will continually evolve and innovate, vastly improving organizations’ ability to handle security incident response.
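One way to picture the extensibility described above is a registry that maps telemetry source types to parser functions, so new device types plug in without changing the core pipeline. The source names and record fields here are hypothetical; OpenSOC's real parser interface lives in its stream-processing topologies, and this is only a sketch of the pattern.

```python
# Registry mapping a telemetry source type to the function that parses it.
PARSERS = {}

def register(source_type):
    """Decorator that registers a parser for a given telemetry source."""
    def wrap(fn):
        PARSERS[source_type] = fn
        return fn
    return wrap

@register("pos_terminal")
def parse_pos(line):
    # Hypothetical pipe-delimited point-of-sale record: "terminal|amount"
    terminal_id, amount = line.split("|")
    return {"source": "pos_terminal", "terminal": terminal_id, "amount": float(amount)}

@register("medical_device")
def parse_medical(line):
    # Hypothetical comma-delimited sensor record: "device,reading"
    device_id, reading = line.split(",")
    return {"source": "medical_device", "device": device_id, "reading": float(reading)}

def ingest(source_type, line):
    """Dispatch a raw line to whichever parser was registered for its source."""
    return PARSERS[source_type](line)

record = ingest("pos_terminal", "T-1042|19.99")
```

Adding support for a new device type is then a matter of registering one more parser, leaving the downstream storage and analytics untouched.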
We look forward to seeing the OpenSOC framework evolve in the open source community. For more information and to contribute to the OpenSOC community, please visit the community website at http://opensoc.github.io/.
Tags: analytics, Big Data, data loss, detection, OpenSOC
The Internet of Everything continues to gain momentum and every new connection is creating new data. Cisco UCS Integrated Infrastructure for Big Data is helping customers convert that data into powerful intelligence, and we’re working with a number of new partners to bring exciting new solutions to our customers.
Today, I want to spotlight Elasticsearch, Inc. and welcome them to the Cisco Solution Partner Program.
Elasticsearch excels at providing real-time insight into data – whether structured or unstructured, human- or machine-generated – by bringing a search-based architecture to data analytics. By combining the ELK stack with Cisco UCS, organizations benefit from a turnkey infrastructure solution that provides real-time search and analytics for a variety of applications: log analysis; structured, semi-structured, or unstructured search; and web back ends for custom applications that use search-based analytics as core functionality.
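To make the search-based approach concrete, a log-analysis request in Elasticsearch's JSON query DSL might look like the following. The index and field names (`message`, `@timestamp`, `host.keyword`) are invented for the example, and exact DSL details vary by Elasticsearch version; the sketch just shows the shape of a full-text match combined with a time filter and an aggregation.

```python
import json

# Build the request body: match free text, filter to the last hour,
# and aggregate matching events by host.
query = {
    "query": {
        "bool": {
            "must": [{"match": {"message": "connection refused"}}],
            "filter": [{"range": {"@timestamp": {"gte": "now-1h"}}}],
        }
    },
    "aggs": {"by_host": {"terms": {"field": "host.keyword", "size": 10}}},
    "size": 0,  # return only the aggregation, not individual hits
}

# In practice this body would be POSTed to a search endpoint on the cluster,
# e.g. /logs-*/_search (endpoint name illustrative).
body = json.dumps(query)
```

Because the cluster indexes documents as they arrive, a query like this returns in near real time even over large log volumes – the property the joint ELK/UCS solution is built around.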
Mozilla is just one of the companies already benefiting from the joint solution, with real-time search and analysis of data powering its defense platform, MozDef. The ELK stack leverages Cisco UCS’s fast connectivity for query, indexing, and replication traffic, while Elasticsearch handles event storage, archiving, indexing, and searching of log data at full scale. Together, the ELK stack and Cisco UCS help protect Mozilla’s network, services, systems, and audit data from attackers.
Partners like Elasticsearch are just one reason that Cisco UCS Integrated Infrastructure can help your company capitalize on the IoE data avalanche and deliver powerful and cost-effective analytics solutions throughout your enterprise.
Find out more at www.cisco.com/go/bigdata, or register for a webinar entitled, “Learn How Mozilla Tackles their Security Logs with Elasticsearch and Cisco”.
Thursday, November 13th
9:00 AM PST / 12:00 PM EST / 5:00 PM GMT
Are you interested in learning how to build enterprise applications on top of Elasticsearch and Cisco’s Unified Computing System (UCS) infrastructure? We’re holding a webinar to delve more deeply into how to optimize ELK on Cisco UCS infrastructure.
Cisco UCS unites compute, network, and storage access into a single cohesive system. By combining the ELK stack with Cisco UCS, businesses benefit by having a turnkey hardware-software solution for their search and analytics applications. In this webinar you’ll learn about the various UCS hardware profiles you should consider when deploying ELK and how Mozilla built MozDef, their custom SIEM application, using ELK on Cisco UCS.
- Introduction – Jobi George, Elasticsearch (5 minutes)
- Overview of UCS + ELK reference architectures – Raghunath Nambiar, Distinguished Engineer, Data Center Business Group, Cisco (10 minutes)
- How Mozilla Built MozDef on ELK and Cisco UCS – Jeff Bryner, Security Assurance, Mozilla (25 minutes)
- Q&A – Jobi George, Elasticsearch (~20 minutes)
Tags: analytics, Big Data, Cisco, Cisco UCS, Cisco Unified Computing System, elasticsearch, UCS
Last month, I had the honor of presenting at the Internet of Things (IoT) World Forum in Chicago. The event gave me the opportunity to do one of my favorite things: collaborate and network with peers who are doing creative work within the world of IoT. One of my colleagues, Kowsalya Arunprakash, Lead Architect of Virtual Data Integration Services for Time Warner Cable, co-presented and shared a use case in which Time Warner is utilizing Cisco Data Virtualization to enhance its customer experience with analytics. In today’s blog, I’d like to share more details about this use case, because I think it’s a great example of how organizations are leveraging IoT solutions to better serve their customers and separate themselves from their competition.
Time Warner Cable IntelligentHome is a home security and energy management system which users can control from their smartphone, tablet or computer to do things like view live video, arm/disarm their system, turn their home lights on/off or adjust the temperature of their thermostat. As you can imagine, each one of these pieces of equipment creates a fair amount of data through radio-frequency identification (RFID) and sensors. On top of this, as consumers generally do, users are going to social media to share their experience.
Utilizing Cisco Data Virtualization, Time Warner is able to couple this data with sales, marketing, and historical customer data to get a full 360-degree view of operational analytics. By operational, I am referring to intelligence like customer trends, sales analytics, and resource allocation management.
This combination is an extremely powerful tool to understand the customer in order to create and adapt products and services that cater to their wants and needs. It’s an opportunity to see how the product is being used, via RFIDs and sensors, coupled with customers’ feedback and experience shared on social media as well as their long-term history of usage and preferences.
During her presentation, Kowsalya shared that by leveraging these insights Time Warner is able to improve sales, reduce customer churn and even work with local law enforcement and emergency services to respond faster to current events. To hear more, I encourage you to watch the video interview of Kowsalya and learn about how Time Warner is using Data Virtualization to derive value for their customers.
To learn more about Cisco Data and Analytics, check out our page.
Join the Conversation
Follow @MikeFlannagan and @CiscoAnalytics.
Tags: analytics, Big Data, Cisco data and analytics, data analytics, data virtualization, internet of things, Internet of Things World Forum, IoT, time warner cable
I introduced Cisco Entrepreneurs in Residence (Cisco EIR) earlier this year as a cornerstone in our strategy of embracing open innovation at Cisco. I also shared how we were extending Cisco EIR and open innovation across the US through local incubation partners, and I announced the launch of Cisco EIR in Europe. Now I would like to share updates on the great progress we are making with Cisco EIR as a catalyst of open innovation at Cisco.
Startups Selected to Join Cisco EIR in Europe
Last week we were excited to announce the six startups that will be joining our Cisco EIR program in Europe at the Pioneers Festival in Vienna. The six winners – innovating in the areas of Smart Cities, Internet of Everything (IoE)/cloud and Big Data/analytics – were chosen through a rigorous multiphase selection process conducted in collaboration with Pioneers. More than 350 applicants from 39 countries applied to join Cisco EIR Europe, with 15 finalists pitching live at the Pioneers Festival in front of Cisco experts and our European partners. Winners were selected based on the viability of their business plans, the strength of their teams and their alignment with Cisco’s IoE vision and strategy.
We were impressed beyond our expectations by the vision, passion, talent and technology of all 15 finalists. These startups made us more excited and convinced than ever that Europe was the right platform to discover and nurture the next generation of disruptive ideas for our industry and for Cisco.
Tags: analytics, Big Data, chicago, Cisco, Cisco Entrepreneurs in Residence, ciscoeir, entrepreneurs, entrepreneurship, Fresno, Hilton Romanski, innovation, Internet of Everything, internet of things, IoE, IoT, Mala Anand, pioneers, Pioneers14, San Diego, Smart Cities, Smart City, startups, Vienna, Wim Elfrink
In my recent blogs about building the right data strategy and analytics solutions, I discussed how Cisco is helping our customers to meet one of the toughest challenges brought on by the Internet of Everything (IoE) – cost-effectively managing massive amounts of distributed data. With solutions such as Cisco Data Virtualization and Cisco Big Data Warehouse Expansion (BDWE), our customers can bring all of this data together in ways that are meaningful to them. Utilizing the network to securely connect data throughout the IoE and providing advanced analytics, we help our customers predict outcomes so they can drive better decisions in real-time.
To keep up with growing demands for big data analytics, now is the time to look at workload automation end to end. In this blog, I would like to share the other aspect of big data analytics: building the right workload automation strategy to integrate data, analytics, and operations.
In today’s big data analytics environments, IT staff regularly manage increasingly complex, interdependent processes that span applications and departments. Rather than implementing silos of automation, we are helping customers take advantage of workload automation from a unified perspective. Our enterprise-wide workload automation solution, Cisco Tidal Enterprise Scheduler (TES), simplifies end-to-end data management and automates diverse business processes across a broad set of applications, systems, and environments.
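The core idea behind that kind of end-to-end automation is dependency-aware scheduling: a job runs only after every job it depends on has finished. TES itself is a commercial product, so the sketch below is only a toy illustration of the principle using Python's standard-library topological sort; the job names and dependency graph are invented for the example.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Each job maps to the set of jobs that must complete before it can start.
jobs = {
    "extract_sales": set(),
    "extract_web_logs": set(),
    "load_warehouse": {"extract_sales", "extract_web_logs"},
    "nightly_report": {"load_warehouse"},
}

def run(job, completed):
    # A real scheduler would launch the workload, watch for failures,
    # and raise alerts; here we just record the execution order.
    completed.append(job)

completed = []
for job in TopologicalSorter(jobs).static_order():
    run(job, completed)
```

A production scheduler layers much more on top of this ordering – calendars, SLAs, retries, cross-system adapters, and the alerting described below – but the dependency graph is the piece that lets one tool coordinate processes spanning many applications and departments.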
For example, many business intelligence and analytics applications operate 24 hours a day. To handle a high volume of these jobs – many with service-level agreements – customers need the tools to receive and respond to alerts from anywhere. Using the Cisco Tidal Enterprise Scheduler client on their iPhones and iPads, our customers can receive alerts, check logs, and remediate errors from anywhere. As organizations continue to require fast and timely business services, we put the power directly in the users’ hands. Through TES’s self-service portal, business users can monitor the progress of relevant workloads in a web browser and perform basic job control. In addition to sparing business users the wait for IT to address their issues, this also takes some of the burden off IT.
TES is a key piece of any company’s big data and analytics software strategy and a priority for us at Cisco. Cisco IT uses TES internally and, due to successful deployment, is now encouraging further widespread internal adoption. Since Cisco TES is the first workload automation solution in the industry to have a suite of Hadoop adapters (as well as other data source and application adapters), Cisco IT is now additionally leveraging TES in Cisco’s own internal Big Data initiatives to deliver an end-to-end big data workload solution, as detailed here and here. We are continuously working to develop ways to integrate TES further with other Cisco Big Data solutions, such as the recently launched BDWE solution, to provide more value through holistic data management solutions.
Taking full advantage of big data delivers a tremendous competitive advantage to our customers. Cisco TES facilitates the end-to-end management of the entire process from data acquisition, to value extraction, to action. Stay tuned as we continue to deliver new capabilities to further bridge today’s gap between enterprise IT capabilities and business requirements.
To learn more about Cisco Tidal Enterprise Scheduler and the power of our integrated infrastructure for big data, check out our page.
Tags: analytics, Big Data, big data warehouse expansion, data virtualization, Tidal Enterprise Scheduler