
Analytics at the Edge: Where the Network Becomes the Database

In 1984, John Gage of Sun Microsystems coined the phrase “the network is the computer” as computing functions started to become increasingly distributed across the network. Today, boundaries that once separated individual computers have disappeared and application processing is enabled—and managed—by the network. We are now at the forefront of a new market transition, as eloquently explained by Rick van der Lans in his paper, “The Network Is the Database.”

The network is indeed becoming the database. Big Data, and the approach to database management that goes with it, are moving away from the centralized data warehouse model and starting to flow across the network. We are virtualizing data management by leaving data in the network instead of copying it into a data center. Data stays in motion, wherever and whenever it's needed across the network, instead of sitting at rest.

What does this mean for business value? A distributed, virtualized approach to data management addresses the three major challenges of Big Data: volume, variety, and velocity.
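
To make "data in motion" a little more concrete, here is a minimal sketch of edge-side aggregation: raw readings stay where they are produced, and only compact summaries travel upstream, which is one practical way to attack the volume problem. This is an illustration only, not tied to any Cisco product API; names such as EdgeAggregator and forward_upstream are hypothetical.

```python
# Minimal sketch of edge-side aggregation: raw readings stay local,
# only compact summaries move across the network.
# All names here (EdgeAggregator, forward_upstream) are illustrative.
from dataclasses import dataclass, field
from statistics import mean
from typing import List


def forward_upstream(summary: dict) -> None:
    # Placeholder for whatever transport the deployment actually uses
    # (MQTT, HTTPS, ...). Here we just print the summary.
    print("forwarding summary:", summary)


@dataclass
class EdgeAggregator:
    window_size: int = 100                      # raw readings per summary
    buffer: List[float] = field(default_factory=list)

    def ingest(self, reading: float) -> None:
        """Accept one raw sensor reading; summarize when the window fills."""
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            self.flush()

    def flush(self) -> None:
        if not self.buffer:
            return
        summary = {
            "count": len(self.buffer),
            "mean": round(mean(self.buffer), 2),
            "min": min(self.buffer),
            "max": max(self.buffer),
        }
        forward_upstream(summary)               # one small record instead of many raw values
        self.buffer.clear()


if __name__ == "__main__":
    agg = EdgeAggregator(window_size=5)
    for value in [20.1, 20.3, 20.2, 35.7, 20.0, 19.9]:
        agg.ingest(value)
    agg.flush()                                 # push out the partial window
```

The window size is the knob: the larger the window, the less data crosses the network, at the cost of coarser visibility upstream.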

Open Source is just the other side, the wild side!

March is a rather event-laden month for Open Source and Open Standards in networking: the 89th IETF, EclipseCon 2014, RSA 2014, the Open Networking Summit, the IEEE International Conference on Cloud (where I'll be talking about the role of Open Source as we morph the Cloud down to Fog computing) and my favorite, the one and only Open Source Think Tank, where this year we dive into the not-so-small world (there is plenty of room at the bottom!) of machine-to-machine (M2M) and Open Source, which some call the Internet of Everything.

There is a lot more to March Madness, of course. For Open Source it is also a good time to celebrate the first anniversary of "Meet Me on the Equinox," that fleeting moment when daylight conquered the night and project Daylight became OpenDaylight. As I reflect on how quickly it started and grew from the hearts and minds of folks more interested in writing code than talking about standards, I think about how much the Network, previously dominated, as it should be, by Open Standards, is now beginning to run with Open Source, as it should. We captured that dialog with our partners and friends at the Linux Foundation in this webcast, which I hope you'll enjoy. I also hope you'll join us this month at one of these neat places.

As Open Source has become dominant in just about everything (Virtualization, Cloud, Mobility, Security, Social Networking, Big Data, the Internet of Things, the Internet of Everything, you name it), we get asked how to get the balance right. How does one work with the rigidity of Open Standards and the fluidity of Open Source, particularly in the Network? There is only one answer: think of it as the Yang of Open Standards and the Yin of Open Source. They need each other; they cannot function without the other, particularly in the Network. Open Source is just the other side, the wild side!

Cisco IOx: Real World Benefits

February 17, 2014 at 5:30 am PST

In my previous blog I attempted to describe some of the distributed computing and data processing challenges that have to be solved in order to release the full potential and value of the Internet of Things, and how Cisco is addressing these challenges by enabling a Fog computing model via Cisco IOx. Let's now review some real-world scenarios where the application enablement capabilities I have described can have a measurable and relevant impact on everyday life and business.

SAFER TRANSPORTATION

Whether it’s a passenger train in a bustling city or a freight train slithering through the mountainside, news of derailment is a tragic story. You may have heard about the fatal train accident in New York City’s Bronx or the recent incident in Philadelphia where a train hauling crude oil was dangling over a river. The US federal government has seen more oil spilled in rail incidents in 2013 than was spilled in the nearly four decades since it began collecting data. The demand for preventative measures is greater than ever. Read More »

Cisco IOx: Making Fog Real for IoT

February 10, 2014 at 9:10 am PST

As I mentioned in my previous blog, Fog computing supports emerging Internet of Things (IoT) applications that demand real-time response and predictable latency, such as industrial automation, transportation, and networks of sensors and actuators. Thanks to its wide geographical distribution, the Fog model is well positioned for real-time Big Data and real-time analytics. But how can we make Fog real for IoT? We believe Cisco IOx is the answer.

Cisco IOx delivers an application enablement framework that brings the Fog concept to life: it provides distributed computing capabilities at the network edge and creates an intermediate layer between the "things" and the cloud.

[Figures: IOx_trad, IOx_IoT]

So what exactly is Cisco IOx? In simple terms, Cisco is combining the communication and computing resources that are required for IoT into a single platform for application enablement at the network edge.
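
To make the pattern concrete, here is a minimal, hypothetical sketch of the kind of application the Fog model is meant to host: it runs next to the data source, reacts locally within a tight latency budget, and sends only exception events north to the cloud. It does not use any actual Cisco IOx API; every name in it (read_local_sensor, actuate_locally, notify_cloud) is illustrative.

```python
# Hypothetical fog-node control loop: act locally, escalate rarely.
# None of these names come from an actual Cisco IOx SDK.
import random
import time

LOCAL_LATENCY_BUDGET_S = 0.1   # aim to act within 100 ms at the edge
TEMP_ALARM_C = 90.0            # local rule: overheating threshold


def read_local_sensor() -> float:
    """Stand-in for reading a directly attached temperature sensor."""
    return 70.0 + random.random() * 30.0


def actuate_locally(reading: float) -> None:
    """Immediate local action, no round trip to the cloud required."""
    print(f"local action: throttling equipment at {reading:.1f} C")


def notify_cloud(event: dict) -> None:
    """Only exceptional events travel north (e.g. over MQTT or HTTPS)."""
    print("northbound event:", event)


def control_loop(iterations: int = 20) -> None:
    for _ in range(iterations):
        start = time.monotonic()
        reading = read_local_sensor()
        if reading > TEMP_ALARM_C:
            actuate_locally(reading)            # fast path stays at the edge
            notify_cloud({"type": "overheat", "value": round(reading, 1)})
        elapsed = time.monotonic() - start
        if elapsed > LOCAL_LATENCY_BUDGET_S:
            print("warning: edge loop missed its latency budget")
        time.sleep(0.05)


if __name__ == "__main__":
    control_loop()
```

The point of the sketch is the division of labor: the time-critical decision never leaves the edge node, while the cloud receives only the events worth keeping.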

Back to the Future: Do Androids Dream of Electric Sheep?

As information consumers who depend so much on the Network, or the Cloud, we sometimes indulge in imagining what will happen when we really begin to feel the effects of Moore's Law and Nielsen's Law combined, at the edges: the amount of data simply exceeds our ability to consume it (let alone stream it all to the edge), and far exceeds what our minds can process. We have already begun to experience this today: how much information can you actually consume on a daily basis from the collective of your so-called "smart" devices, your social networks and other networked services, and how much more data is left behind? The same is true machine to machine: a jet engine produces terabytes of data about its performance in just a few minutes; it would be impossible to ship all of that data to some remote computer or network and still act on the engine locally in time. We already know Big Data is not just growing, it is exploding!
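
To put a rough number on that jet-engine example, the arithmetic below assumes, purely for illustration, one terabyte of telemetry generated over five minutes; the required sustained uplink is far beyond any realistic air-to-ground link, which is exactly why the analysis has to happen next to the engine.

```python
# Back-of-envelope bandwidth check for the jet-engine example.
# Assumption (illustrative only): 1 TB of telemetry in 5 minutes.
data_bytes = 1e12                  # 1 terabyte
window_seconds = 5 * 60            # "a few minutes"

required_bps = data_bytes * 8 / window_seconds
print(f"sustained uplink needed: {required_bps / 1e9:.1f} Gbit/s")
# -> about 26.7 Gbit/s, orders of magnitude beyond typical in-flight links.
```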

The conclusion is simple: one day we will no longer be able to cope unless the information is consumed differently, locally. Our brains may no longer be enough; we hope to get help. Artificial Intelligence comes to the rescue and M2M takes off, but the new system must be highly decentralized in order to stay robust, or else it will crash like some kind of dystopian event out of H2G2. Is it any wonder that even today a large portion, if not the majority, of the world's Internet traffic is in fact already P2P, and that the majority of the world's software downloads are Open Source, delivered P2P? Just think of Bitcoin and how it captures the imagination of the best or bravest developers and investors (and how ridiculous one of those categories could look for not realizing its potential current flaw, to the supreme delight of its developers, who will undoubtedly develop the fix; but that's the subject of another blog).

Consequently, centralized, high-bandwidth-style compute will break down at the bleeding edge and the cloud as we know it won't scale; a new form of computing emerges: fog computing, a direct consequence of Moore's and Nielsen's Laws combined. Moore's Law keeps multiplying the data that devices at the edge can generate and process, while Nielsen's Law says that user bandwidth grows only about 50 percent a year, so the gap between what is produced and what can realistically be shipped to a central cloud keeps widening. Fighting this trend equates to fighting the laws of physics; I don't think I can say it more simply than that.

Thus the compute model has already begun to shift: we will want our Big Data analyzed, visualized, private, secure, and ready when we are, and we are finally beginning to realize how vital it has become. Can you live without your network, data, connection, friends, or social network for more than a few minutes? Hours? Days? And when you rejoin it, how does it feel? And if you can't, are you convinced that one day you must be in control of your own persona, your own personal data, or else? Granted, we shouldn't worry too much about a Blade Runner dystopia or the H2G2 Krikkit story in Life, the Universe and Everything, but there are some interesting things one could be doing, and more than just asking, as Philip K. Dick once did, do androids dream of electric sheep?

To enable this new beginning, we started in Open Source, looking to incubate a project or two. The first, in Eclipse M2M, is one of a dozen-or-so dots we'd like to connect in the days and months to come; we call it krikkit. The possibilities afforded by this new compute model are endless. One of them could be the ability to put us back in control of our own local and personal data, rather than leaving it in some central place, service, or bot currently sold as a matter of convenience, fashion, or scale. I hope that with the release of these new projects we will begin to solve that together. What better way to collaborate than in the open? Perhaps this is what the Internet of Everything and data in motion should be about.
