
Can the Elephant Dance to a Security Tune?

There is a great debate in the security world right now: have SIEM and logging products run their course? Will Hadoop ride to the rescue? Can machines “learn” about security and reliably spot threats that no other approach can find?

Gartner calls this phenomenon Big Data Security Analytics (BDSA) and defines BDSA solutions as a three-layer pyramid. At the bottom is the “data lake,” which is what most people equate with Hadoop. The next layer is context: the addition of relevant business, location, and other non-traditional security information, which increases the precision of the top layer, applications and analytics (such as machine learning). It is in this top layer that the real value of BDSA is realized, in finding new threats and remediating them before they do damage.
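To make the pyramid concrete, here is a minimal sketch of the three layers in Python. The event fields, context table, and scoring rule are illustrative assumptions, not any vendor’s actual schema or model:

```python
# A hypothetical sketch of the three BDSA layers described above.

# Layer 1: the "data lake" -- raw security events as they might land in Hadoop.
raw_events = [
    {"user": "alice", "bytes_out": 120_000, "hour": 14},
    {"user": "bob", "bytes_out": 9_800_000, "hour": 3},
]

# Layer 2: context -- business and location data joined onto each event.
user_context = {
    "alice": {"department": "engineering", "usual_hours": range(8, 19)},
    "bob": {"department": "finance", "usual_hours": range(8, 19)},
}

def enrich(event):
    """Attach non-traditional security context to a raw event."""
    return {**event, **user_context.get(event["user"], {})}

# Layer 3: analytics -- a toy rule standing in for a machine-learning model.
def anomaly_score(event):
    score = 0.0
    if event["hour"] not in event.get("usual_hours", range(24)):
        score += 0.5  # activity outside the user's normal working hours
    if event["bytes_out"] > 1_000_000:
        score += 0.5  # unusually large outbound transfer
    return score

for event in map(enrich, raw_events):
    print(event["user"], anomaly_score(event))  # bob scores 1.0 -> investigate
```

The point of the sketch is the layering: the raw events alone are ambiguous, and only after the context join can the analytics layer separate normal from suspicious behavior.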



Paradigm Shift with Edge Intelligence

In my Internet of Things keynote at LinuxCon 2014 in Chicago last week, I touched upon a new trend: the rise of a new kind of utility or service model, the so-called IoT-specific service provider model, or IoT SP for short.

I had a recent conversation with a team of physicists at the Large Hadron Collider at CERN. I told them they would be surprised to hear how computer scientists talk these days, about Data Gravity. Programmers are notorious for overloading common words, adding connotations galore and messing with meanings entrenched in our natural language.

We all laughed and then the conversation grew deeper:

  • Big data is very difficult to move around: it takes energy, time, and bandwidth, which makes it expensive. And it is growing exponentially at the outer edge, with tens of billions of devices producing it at an ever-faster rate, from an ever-increasing set of places on our planet and beyond.
  • As a consequence of the laws of physics, we have an impedance mismatch between the core and the edge. I coined this the Moore-Nielsen paradigm (described in my talk as well): data gets accumulated at the edges faster than the network can push it into the core (a back-of-the-envelope sketch follows this list).
  • Therefore, big data accumulated at the edge will attract applications (little data, or procedural code), so apps will move to data, not the other way around, behaving as if data has “gravity.”
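To see why the mismatch matters, consider a back-of-the-envelope calculation. The device counts and rates below are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope illustration of the core/edge impedance mismatch.
# All numbers are illustrative assumptions.

devices = 1_000                       # sensors at one edge site
rate_per_device = 0.5e6               # bytes/second produced per device
uplink = 100e6 / 8                    # a 100 Mb/s uplink, in bytes/second

produced = devices * rate_per_device  # 500 MB/s generated at the edge
backlog_per_day = (produced - uplink) * 86_400

print(f"produced: {produced / 1e6:.0f} MB/s, uplink: {uplink / 1e6:.1f} MB/s")
print(f"unshippable data per day: {backlog_per_day / 1e12:.1f} TB")
# ~42 TB/day can never leave the site, so the analysis has to move to it.
```

Under these assumptions the site generates data roughly forty times faster than the uplink can drain it, which is exactly why the apps, not the data, end up doing the traveling.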

Therefore, the notion of a very large centralized cloud that would control the massive rise of data spewing from tens of billions of connected devices is pitched against the laws of physics and Open Source alike, not to mention the thirst for freedom (no vendor lock-in) and privacy (no data lock-in). The paradigm has shifted: we have entered the third big wave (after the mainframe’s decentralization to client-server, which in turn centralized to the cloud), the move to a highly decentralized compute model, where intelligence shifts to the edge as apps come to the data, at much larger scale, machine to machine, with little or no human interface or intervention.

The age-old dilemma pops up again: do we go vertical (domain specific) or horizontal (application development or management platform)? The answer has to be based on necessity, not fashion; we have to do this well, and hence vertical domain knowledge is overriding. With the declining cost of computing, we finally have the technology to move to a much more scalable and empowering model: the new opportunity in our industry, the mega trend.

Very reminiscent of the early ’90s and the beginning of the ISP era, isn’t it? This time it is much more vertical, with deep domain knowledge: connected energy, connected manufacturing, connected cities, connected cars, connected home, safety and security. These innovation hubs all share something in common: an Open and Interconnected model, made easy by dramatically lower compute cost and the ubiquity of open source, which together overcome the barriers to adoption, including the previously weak security and privacy models predicated on a central core. We can divide and conquer, dealing with data in motion differently than we deal with data at rest.

The so-called “wheel of computer science” has completed one revolution, just as the socio-economic observation behind it predicted: the next generation has arrived, ready to help evolve or replace its aging predecessor. Which one, or which vertical, will it be first?


My Top 7 Predictions for Open Source in 2014

My 2014 predictions are finally complete. If Open Source equals collaboration or credibility, 2013 was nothing short of spectacular. As an eternal optimist, I believe 2014 will be even better:

  1. Big data’s biggest play will be in meatspace, not cyberspace. There is just so much data that we produce and give away; the great opportunity for analytics lies in the real world.
  2. Privacy and security will become ever more important, particularly in Open Source rather than closed systems. Paradoxically, this is good news: Open Source shows us once again that transparency wins, and, just as in biological systems, the most robust mechanisms get by with fewer secrets than we think.
  3. The rise of “fog” computing as a consequence of the Internet of Things (IoT) will unfortunately be driven by fashion for now (wearable computers), but it will make us think again about what we have done in giving up our data, and make us reread #1 and #2 above with a different and more open mind. Again!
  4. Virtualization will have its biggest year yet in networking. Just as the hypervisor rode Moore’s Law in server virtualization and found a neat application in #2 above, a different breed of projects, such as OpenDaylight, will emerge. But the drama is greater here, because the network scales very differently from CPU and memory; it is a much harder problem. Networking vendors that embrace Open Source may therefore fare well.
  5. Those that didn’t quite “get” Open Source as the ultimate development model will rediscover it as Inner Source (ACM, April 1999), as the only long-term viable development model. Or so they think: the glamor of new-style Open Source projects (OpenStack, OpenDaylight, AllSeen), with big budgets, big marketing, and big drama, may in fact be too seductive. Only those that truly understand the two key things that make an Open Source project successful will endure.
  6. AI, recently morphed, will make a comeback: not just robotics, but something AI did not anticipate a generation ago, what some call cognitive computing, perhaps indeed the third era in computing! The story of Watson going beyond obliterating Jeopardy contestants, opening up to find commercial applications, is a truly remarkable thing to observe in our lifetime. This may in fact be a much nobler use of big data analytics (and other key Open Source projects) than #1 above. But can it exist without it?
  7. Finally, Gen Z developers will discover Open Source and embrace it just like their Millennial (Gen Y) predecessors. The level of sophistication and interaction will rise, and projects ranging from Bitcoin to qCraft will become intriguing, presenting a different kind of challenge. More importantly, the previous generation can now begin to relax, knowing that the gap is closing and the ultimate development model is in good hands, and can begin to give back more than ever before. Ah, the beauty of Open Source…


Omnianalytics for an Omnichannel World

At Cisco, we’re about ready for the NRF trade show, being held in New York on Jan. 12-15. We’re at the show expo on Jan. 13-14 and will be featuring four company thought leaders in the highly popular annual Big Idea sessions. Kathryn Howe, senior advisor for retail at Cisco, will be discussing one of the industry’s most forward-looking trends: how to use omnianalytics to help retailers extract the most value from the data in omnichannel environments.

Q: The concept of omnianalytics is a new one for many retailers. Can you tell us more about it?

A: In pursuit of the personalized customer experience, retailers are increasingly moving toward omnichannel selling across stores, websites, mobile platforms and applications, phones, kiosks, and so on. Each of these channels adds another layer to the customer experience, and each layer generates a new set of data. These data sets offer a new opportunity for stores to engage with the customer.  Omnianalytics is the process of managing and correlating these large amounts of data to transform your business.
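As a rough illustration of that correlation step, here is a minimal sketch that joins two hypothetical channel data sets on a shared customer identifier. All field names and values are made up for the example:

```python
# A minimal sketch of correlating per-channel data sets; hypothetical data.

in_store = [
    {"customer": "c1", "store": "NYC-5", "dwell_minutes": 12},
    {"customer": "c2", "store": "NYC-5", "dwell_minutes": 3},
]
web_sessions = [
    {"customer": "c1", "pages_viewed": 8, "cart_items": 2},
]

# Index one channel by customer so the other can be joined against it.
web_by_customer = {s["customer"]: s for s in web_sessions}

# Correlate channels: which in-store visitors also browsed online?
for visit in in_store:
    web = web_by_customer.get(visit["customer"])
    if web:
        print(visit["customer"], "viewed", web["pages_viewed"],
              "pages online and dwelled", visit["dwell_minutes"],
              "minutes in store", visit["store"])
```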

Q: Why is this data so important?

A: For the first time in history, retailers can collect truly objective, quantifiable customer data. Traditional shop-alongs, simulations, and focus groups are inevitably somewhat inaccurate, as simply being observed can change shopper behavior. Today’s automated systems, on the other hand, collect completely unbiased information on dwell times, traffic patterns, and other behaviors. They are also extremely scalable, meaning that consistent metrics can be gathered across thousands of stores to provide very high quality data.

Q: What do you think are the most important topics you’ll discuss at NRF?

A: Knowing which metrics are game changers for your business is the art and science of executing on omnianalytics. We’ll talk about how to get started and how to understand which metrics you need for your business. We’ll also be joined by John Goedert of Starbucks, who provides a wonderful case study on how his company is using omnianalytics to drive consumer interactions.

Time and Place:

“Omnianalytics: Knowledge is Good, Now How Can It Transform My Business?” with Kathryn Howe takes place on Tuesday, Jan. 14, at 1:15-2:15 p.m., in Room 4 on Level 3 of the Expo Hall. For those who can’t be there, a recording of the session will be available after the show. Visit Cisco’s NRF website to learn more, and do take the time to stop by Cisco booth #1954.

I’ll see you at NRF!


The Correlation Is Clear: Measuring Collaboration Is Directly Tied to Better Adoption of Collaboration Tools

November 11, 2013 at 11:59 am PST

At Collaboration Summit, Cisco announced a number of exciting new technologies designed to make collaboration simple, fun, and intuitive. My friend Rowan Trollope, who leads Cisco’s Collaboration Technology Group, is working hard to “make technology in the office better than what you have at home.” With Cisco Expressway, Intelligent Proximity, and Jabber Guest, a few of the new products Cisco just unveiled, we are breaking down the barriers between home and work, creating a seamless experience for staying connected. And in Rowan’s words, “You haven’t seen anything yet.” Rowan and his team are dead set on perfecting one aspect of our collaboration technology, its usability – making it beautiful, affordable, and easy to assemble – and my services team has the charge of perfecting another: extracting its value.

In a 2013 Forbes study that Cisco commissioned to understand business executives’ attitudes toward collaboration, we found that those who see the greatest value in collaboration technology are the ones who use it the most. Heavy users, or collaboration “leaders,” perceive a strong correlation between using collaboration tools and achieving transformational business metrics in areas like productivity, knowledge sharing, customer satisfaction, cost control, and more.

From a services perspective, collaboration success is dependent on two things:
