Intro

The ELK stack is a set of analytics tools whose initials stand for Elasticsearch, Logstash and Kibana. Elasticsearch is a flexible and powerful open source, distributed, real-time search and analytics engine. Logstash is a tool for receiving, processing and outputting logs of many kinds: system logs, webserver logs, error logs, application logs and more. Kibana is an open source (Apache-licensed), browser-based analytics and search dashboard for Elasticsearch.

ELK is a useful, efficient, open source analytics platform, and we wanted to use it to consume flow analytics from a network. We chose ELK because it can efficiently handle large volumes of data and is open source and highly customizable to the user’s needs. The flows were exported by various hardware and virtual infrastructure devices in NetFlow v5 format. Logstash was responsible for processing the flows and storing them in Elasticsearch; Kibana, in turn, was responsible for reporting on the data. Given that there were no complete guides on how to use NetFlow with ELK, below we present a step-by-step guide on how to set up ELK from scratch and enable it to consume and display NetFlow v5 information. Readers should note that the ELK ecosystem includes more tools, such as Shield and Marvel, which are used for security and Elasticsearch monitoring, but their use falls outside the scope of this guide.

In our setup, we used

  • Elasticsearch 1.3.4
  • Logstash 1.4.2
  • Kibana 3.1.1

For our example purposes, we deployed a single-node Elasticsearch cluster: one node responsible for both collecting and indexing data. Experienced users could leverage Kibana to consume data from multiple Elasticsearch nodes. Elasticsearch, Logstash and Kibana were all running on our Ubuntu 14.04 server with IP address 10.0.1.33. For more information on clusters, nodes and shards, refer to the Elasticsearch guide.
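
A minimal sketch of the kind of Logstash configuration such a setup relies on is shown below: a UDP input decoding flows with the NetFlow codec restricted to v5, feeding the local Elasticsearch node. Treat it as an illustration rather than the guide’s exact configuration: port 2055 is just a common choice for NetFlow exporters, and the option names follow Logstash 1.4.x.

    input {
      udp {
        # Listen where your devices export NetFlow v5
        # (2055 is a common choice; match your exporters' configuration)
        port  => 2055
        codec => netflow {
          versions => [5]   # accept only NetFlow v5 records
        }
      }
    }

    output {
      elasticsearch {
        # The single-node Elasticsearch instance described above
        host => "10.0.1.33"
      }
    }

With something like this in place, each flow record arrives as a structured event in Elasticsearch, ready for Kibana dashboards to query.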

Continue reading “Step-by-Step Setup of ELK for NetFlow Analytics”



Authors

Panos Kampanakis

Product Manager

Security & Trust Organization


#CiscoChampion Radio is a podcast series by Cisco Champions as technologists. Today we’re talking with Cisco Technical Marketing Engineer and TechWiseTV host, Jimmy Ray Purser (@JimmyRay_Purser), about all things IT. Amy Lewis (@CommsNinja) moderates, and Sven Kutzer, Amy Arnold and Robert Novak are this week’s Cisco Champion guest hosts.

Listen to the Podcast.

Learn about the Cisco Champions Program HERE.
See a list of all #CiscoChampion Radio podcasts HERE.

Cisco SME
Cisco Technical Marketing Engineer and TechWiseTV host, Jimmy Ray Purser (@JimmyRay_Purser)

Cisco Champion
Sven Kutzer (@svenkutzer), Senior Systems Engineer
Amy Arnold (@amyengineer), Senior Network Engineer
Robert Novak (@gallifreyan), Cisco Consulting Systems Engineer

Continue reading “#CiscoChampion Radio S1|Ep 42. Jimmy Ray Purser Talks Tech”



Authors

Rachel Bakker

Social Media Advocacy Manager

Digital and Social

London’s Big Ben at Night

Last week I had the opportunity to attend the Gartner Data Center conference in London. I attended three sessions on SDN-related topics. Here are some of my observations from what was a very good conference. Also, since the Gartner Data Center conference runs this week (w/c 1 December 2014) in the US, if you are going, here are some questions to think about when you attend the SDN sessions.

(1) What does “lack of visibility” in Virtual Overlays really mean?

(2) In multi-layer SDN, will SDN be cheaper than our current networking approach?

(3) Are vendors guilty of using NFV for SDN “washing”?

(4) If OpenStack is part of your SDN solution, can you help us on OpenStack?

(5) What is the best hardware server platform for NFV/virtualised workloads?

(6) How exactly does SDN deliver better network management?

I’ll cover a few questions today and some tomorrow.

Continue reading “SDN Questions to Ask at the Gartner Data Center Conference”



Authors

Stephen Speirs

SP Product Management

Cisco Customer Experience (CX)


The game is afoot – and this time around, it pits marketing automation tools against data-science-driven data products: specialized digital marketing service providers against dyed-in-the-wool data geeks; campaign management tools versus correlation-driven, random-forest, gradient-boosting data products.

Marketing automation tools have, in their short history, focused on delivering specialized services (serving up display advertising, managing paid search ads and the like). But over the last few years, these tools have broadened their reach and gained access to considerable data, emerging beyond their specific channel of interest. Tools such as Tag Management Systems, with their data layer, seek to become the data broker of record. Offerings such as Data Management Platforms (DMPs) integrate deeply with re-targeting platforms as well as with first-party data, and deliver fine-tuned audience segmentation and other behavior profiles in near real time that can be acted on immediately to serve better content and deliver richer engagements with the visitor.

Data Scientists in the digital realm have often prospered by:

  • A cross-channel online focus – bringing together online data that, until now, was siloed and that typical marketing automation tools had no visibility into
  • Combining online and offline (company-owned) data sources to generate insights that an online-only focus cannot deliver
  • Adding third-party data (frequently purchased) to further enrich information about their customers, and using it to yield further insights

The new wave of marketing automation tools is beginning to step into the first of these areas by combining data across multiple online channels (website, mobile, social media, email and so on). Offerings such as Audience Stream from Tealium are examples of such products.

The marketing automation tools are also beginning to tap into the third area, bringing in third-party data to combine with company-owned data, though the scope is presently limited to the third-party online behavior of the visitors the company is interested in. This is the specialization that DMPs from companies such as BlueKai offer.

How long before these marketing automation tools provide a method for pulling in offline data to significantly improve the analysis they can deliver? Once that data is available, push-button models can be layered on top to deliver interesting insights. Some variation of this is beginning to happen already, and by 2020 we may well be dealing with marketing automation tools that cover multi-channel online data as well as online + offline data, with all of it leveraged by push-button models that yield interesting new insights.

So then, where to for the Data Scientist of digital proclivity?

For now, there are still plenty of gaps, identified above, through which the home-grown (or consultant-driven) Data Science team can deliver insights. But the team has to continue to expand its focus: onto new data sources (in the world of IoT), onto discovering new relationships (graph-like approaches to discover and ask more interesting questions), onto processing larger amounts of data to generate new insights (using new statistical approaches as math begins to catch up with Big Data), and onto spending more time answering the frequently asked, but seldom answered, questions (“What is the customer’s intent when they arrive at a site?” “How best to move the customer along the conversion funnel?”).

The age of marketing automation has arrived, coinciding with the increased burden on digital marketing to generate revenue-producing leads. The scope and reach of marketing automation will continue to grow, continuously challenging data science teams to move up the value chain, deliver deeper insights, and answer the harder questions for the marketer.

 



Authors

Sri Srikanth

Advanced Data Analytics & Strategy, Senior Data Scientist, Cisco Digital


How can you make the most of every opportunity to delight your existing client base?
How do you identify new areas to expand your business?
How can you improve employee engagement?

These are just a few of the challenges that organizations face today.

Deloitte’s Human Capital Trends 2014 survey explores “two major themes underlying this year’s trends: globalization and the speed and extent of technological change and innovation.” The survey authors also describe the challenge of employee retention and engagement as — after leadership — the second-most urgent issue for the organizations surveyed. The challenge is not simply identifying and keeping staff, but creating a “passionate and compassionate” place for them to work.

In its own survey, Virgin Media found that 78% of U.K. employees believe that companies need to offer “anytime, anywhere” work flexibility to attract and retain staff.

Therefore, with increasing globalization and the need to support flexible working environments to retain and nurture talent, we rely more on collaboration technology to build strong relationships with colleagues, customers, and partners.

Create the Environment for Good Connections

How can you ensure that employees stay connected with colleagues and with customers and partners?

Continue reading “Put People First: Using Video to Enhance Relationships”



Authors

Angela Murphy

Senior Product Marketing Manager

Cisco IoT


At the Cisco Boxborough, Massachusetts office, we are taking part in Giving Tuesday by encouraging our colleagues to participate in the 2014 Global Hunger Relief Campaign, which helps 13 local nonprofit hunger relief organizations. This is part of Cisco’s larger global campaign, which helps more than 160 food organizations worldwide. The goal of the campaign is to raise $1.8 million to end hunger around the world, and so far we are more than halfway toward that goal.

To date, we’ve raised over US$24,000 in employee donations, and we continue to make significant headway toward our 2014 goal of $43,000. We owe a big thanks to Director of Engineering David Abe, who leads the New England Development Center and is an executive champion for this year’s Campaign.

In addition to David Abe’s leadership, my fellow Civic Council members, and a vibrant culture of giving back, our local Campaign launched with a beautiful artistic wall created by Lynne Abell.

Boxborough, Massachusetts Cisco Civic Council member Lynne Abell designed this artistic wall to commemorate the 2014 Global Hunger Relief Campaign

Continue reading “Cisco Volunteers in Massachusetts Join Global Campaign to Give Back”



Authors

Darlene McNamara

Manager, Technical Support


This post was written by Marcin Noga with contributions by Earl Carter and Martin Lee.

New vulnerabilities for old operating systems may not seem particularly interesting, until you consider the large number of legacy machines running outdated versions of Windows. Windows XP has reached its end of life, meaning that new vulnerabilities will not be patched. In this post we will show that a recent vulnerability can be used as a platform for exploiting Windows XP.

In October, Microsoft released a bulletin for a privilege escalation vulnerability in the FASTFAT driver:

MS14-063 — Vulnerability in FAT32 Disk Partition Driver Could Allow Elevation of Privilege (2998579), CVE-2014-4115.

Let me present some of the most interesting parts of the advisory and add some details from my own research.

When the bug kicks in…

In the advisory, Microsoft indicates that the following operating systems are vulnerable:

  • Microsoft Windows Server 2003 SP2
  • Microsoft Windows Vista SP2
  • Microsoft Windows Server 2008 SP2

The Microsoft bulletin does not mention Windows XP, since Windows XP is no longer supported. According to my research, however, this vulnerability is also present in the Windows XP FASTFAT driver.

See the following video.

This vulnerability can be exploited on Windows XP SP3 using a malicious USB stick with a malformed FAT32 partition. Let’s examine the system’s reaction when the USB stick is inserted.

Continue reading “MS14-063 A Potential XP Exploit”



Authors

Talos Group

Talos Security Intelligence & Research Group


Public Sector IT organisations are wary of vendor lock-in. And rightfully so: it is hard to buy cloud services from any supplier you choose and then freely manage those services as if they were part of your own extended private cloud. The main reason is the lack of ability to connect different clouds: private, partner, public and so on. Luckily, this barrier is vanishing…

Thirty years ago, Cisco pioneered a strategy to connect previously isolated, heterogeneous networks, which led to the rise of the Internet as we know it. Now, Cisco is embarking on a journey just as ambitious: connecting multiple isolated clouds to create the Intercloud, an interconnected cloud of clouds.

Intercloud

The Intercloud relies on five key principles and technologies, summarised below:

Continue reading “Understand how the Intercloud elegantly meets Public Sector IT requirements”



Authors

Patrick Bikar

Global Systems Engineer Transformation Programs Lead

Global Systems Engineering

Avatar

The multi-stakeholder Internet Governance process is safe from being replaced by a government-only, top-down process. At least for now.

The Internet as we know it has added huge social and economic value to the world, as well as to our personal lives, and is governed by a broad multi-stakeholder process that includes the private sector, the technical community, academia, civil society and governments. Each group has an important role to play, and the success of the process is due in large part to each doing what it does best and working together when and where appropriate. For example, technical issues are best left to the technical community, while national security issues are primarily the domain of governments.

This multi-stakeholder, bottom-up process is distinct from, and in contrast to, a multi-lateral process that includes only governments and their multi-lateral organizations. Internet governance broadly has been, and needs to remain, a multi-stakeholder process. It’s a proven approach that created the open Internet: an interconnected network of networks in which anyone can access content and use applications from anywhere on the globe.

Earlier this month, the International Telecommunication Union (ITU) concluded its important quadrennial Plenipotentiary conference in Busan, Korea, where the UN organization’s 193 member countries reviewed the ITU Constitution and Convention, elected its officials and set its agenda for the next four years.

Going into the Plenipot, there were concerns that some governments would use the meeting to impose the traditional top-down, government-led multi-lateral approach and counterproductive regulation in place of the bottom-up multi-stakeholder process. Some observers expressed concern about a “UN takeover of the Internet.” Others worried that heavy-handed, blunt regulation that failed to recognize the open and global architecture of the Internet would fragment the Internet into national, government-controlled intranets.

The good news is that none of the radical, dangerous or even just counterproductive proposals (such as regulating Internet routing) introduced in Busan survived the Plenipotentiary’s consensus-based process. In fact, the broad consensus acknowledged the importance of Internet governance processes and venues outside of the ITU while, at the same time, recognizing the important role the ITU plays, especially with respect to radio spectrum, capacity building, and working with emerging economies on development agendas.

This success was not by accident. It was the result of more than a year and a half of hard work and patient consultations among policy makers from governments around the world that are dedicated to the Open Internet and the multi-stakeholder process. The US Delegation (including private sector, civil society and technical community members as well as government), led by Ambassador Daniel Sepulveda, played a key role in Busan, along with many like-minded countries, in building a consensus around the value of an Open Internet and the multi-stakeholder process. They changed the debate by understanding the importance of relationships and of listening when working with other governments to address genuine concerns, while at the same time building consensus to reject destructive proposals.

As successful as the Plenipot was, it’s not the end of the story. Governments that want to exert more control over the Internet and replace the multi-stakeholder process are not giving up. They are playing a long game, and there are important international meetings in 2015 where they will try again. There is a lot of hard work and many difficult discussions to come. But an important lesson learned from Busan is that successful diplomacy and policy through relationships, listening, collaboration and engagement (attributes of the Open Internet itself) can be a winning combination.



Authors

Robert Pepper

No Longer with Cisco