
Accelerating and Innovating the Internet of Everything in Japan

After a whirlwind week in Tokyo, it’s clear that Japan, the world’s third-largest economy, is embracing the potential economic value of the Internet of Everything (IoE). For Japan, we estimate an IoE opportunity of $870 billion over the next decade (out of a global economic value of $19 trillion).

With its proud history of industry, technology and innovation leadership, Japan is an ideal location for Cisco’s 7th IoE Center of Innovation, a $20 million investment for Cisco, which opened last Thursday with nine Japan-based ecosystem partners. The excitement is high around our open lab’s charter to bring together customers, industry partners, startups, accelerators, government agencies and research communities to collaborate on next-generation technology. Photos of the center’s opening are here.

In Tokyo, we will be working with partners to develop Fog Computing solutions focused on Manufacturing, Sports and Entertainment, and Public Sector. These Fog solutions extend cloud storage, computing and services to the edge of the network, a critical element of realizing value from IoE.
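To make "extending the cloud to the edge" concrete, here is a minimal Python sketch of the fog pattern: a node sitting next to the devices aggregates raw sensor readings locally and forwards only a compact summary upstream. The machine, readings and payload shape are illustrative assumptions, not a Cisco API.

```python
# Hypothetical fog-node sketch: compute runs next to the devices, and only a
# small summary travels over the WAN to the cloud.
import json
import statistics

def summarize_at_edge(raw_readings):
    """Aggregate locally so the cloud receives a summary, not the raw stream."""
    return json.dumps({
        "count": len(raw_readings),
        "mean": round(statistics.mean(raw_readings), 2),
        "max": max(raw_readings),
    })

# One second of imaginary vibration samples from a factory machine:
raw = [0.92, 0.95, 1.71, 0.90, 0.93, 0.94]

payload = summarize_at_edge(raw)
print(f"forwarding {len(payload)} bytes instead of {len(json.dumps(raw))}")
print(payload)
```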


Paradigm Shift with Edge Intelligence

In my Internet of Things keynote at LinuxCon 2014 in Chicago last week, I touched upon a new trend: the rise of a new kind of utility or service model, the so-called IoT-specific service provider model, or IoT SP for short.

I had a recent conversation with a team of physicists at the Large Hadron Collider at CERN. I told them they would be surprised to hear how computer scientists talk these days about Data Gravity. Programmers are notorious for overloading common words, adding connotations galore and messing with meanings entrenched in our natural language.

We all laughed and then the conversation grew deeper:

  • Big data is very difficult to move around; it takes energy, time and bandwidth, and is therefore expensive. And it is growing exponentially at the outer edge, with tens of billions of devices producing it at an ever-faster rate, from an ever-increasing set of places on our planet and beyond.
  • As a consequence of the laws of physics, we have an impedance mismatch between the core and the edge. I coined this the Moore-Nielsen paradigm (described in my talk as well): data accumulates at the edges faster than the network can push it into the core (see the sketch just after this list).
  • Therefore, big data accumulated at the edge will attract applications (little data, or procedural code); apps will move to the data, not the other way around, behaving as if the data has “gravity”.
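A back-of-the-envelope sketch of that mismatch. The growth rates are assumptions for illustration (Nielsen’s Law gives roughly 50 percent per year for high-end bandwidth; a doubling per year for edge data is hypothetical), not measurements:

```python
# Back-of-the-envelope: edge data production compounds faster than network
# bandwidth, so the share of edge data the core can absorb shrinks every year.
BANDWIDTH_GROWTH = 1.5   # Nielsen's Law: high-end bandwidth grows ~50%/year
EDGE_DATA_GROWTH = 2.0   # assumption for illustration: edge output doubles/year

bandwidth = 1.0          # normalized so that in year 0 the network keeps up
edge_data = 1.0

for year in range(11):
    share = min(bandwidth / edge_data, 1.0)
    print(f"year {year:2d}: core can absorb {share:6.1%} of edge data")
    bandwidth *= BANDWIDTH_GROWTH
    edge_data *= EDGE_DATA_GROWTH
```

Under these assumed rates the absorbable share falls from 100 percent to under 6 percent within a decade, which is the quantitative sense in which apps must move to the data.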

Therefore, the notion of a very large centralized cloud that would control the massive rise of data spewing from tens of billions of connected devices is pitted against both the laws of physics and Open Source, not to mention the thirst for freedom (no vendor lock-in) and privacy (no data lock-in). The paradigm has shifted; we have entered the third big wave (after the mainframe’s decentralization to client-server, which in turn centralized to cloud): the move to a highly decentralized compute model, in which intelligence shifts to the edge as apps come to the data, at much larger scale, machine to machine, with little or no human interface or intervention.

The age-old dilemma pops up again: do we go vertical (domain specific) or horizontal (application development or management platform)? The answer has to be based on necessity, not fashion, and we have to do this well; hence vertical domain knowledge is overriding. With the declining cost of computing, we finally have the technology to move to a much more scalable and empowering model: the new opportunity in our industry, the megatrend.

Very reminiscent of the early ’90s and the beginning of the ISP era, isn’t it? This time it is much more vertical, with deep domain knowledge: connected energy, connected manufacturing, connected cities, connected cars, connected homes, safety and security. These innovation hubs all share something in common: an open and interconnected model, made easy by dramatically lower compute costs and the ubiquity of open source, to overcome all barriers to adoption, including the previously weak security and privacy models predicated on a central core. We can divide and conquer, dealing with data in motion differently than we deal with data at rest; a quick sketch of that distinction follows.
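A toy sketch of that split, with made-up readings: the same out-of-range question answered once against data in motion (per event, as it arrives) and once against data at rest (a stored batch queried afterwards).

```python
# Toy illustration: the same out-of-range question answered two ways.
THRESHOLD = 30.0
readings = [12.1, 12.4, 35.9, 12.2, 41.7, 12.0]   # made-up sensor feed

# Data in motion: inspect each event as it arrives; nothing is stored.
def stream_alerts(feed):
    for index, value in enumerate(feed):
        if value > THRESHOLD:
            yield index, value   # react per event, at the edge

for index, value in stream_alerts(readings):
    print(f"in motion: alert at event {index}: {value}")

# Data at rest: the whole batch lands in storage first, then gets queried.
stored = list(readings)
print("at rest: anomalies =", [v for v in stored if v > THRESHOLD])
```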

The so-called “wheel of computer science” has completed one revolution, just as its socio-economic observation predicted: the next generation has arrived, ready to help evolve or replace its aging predecessor. Which one, or which vertical, will be first?


Connecting the Unconnected

“The Internet of Things is the next technology transition, where devices will allow us to sense and control the physical world by making objects smarter and connecting them through an intelligent network.” (Lindsay Hiebert, Senior Marketing Manager, Internet of Things, Cisco Systems)

The Internet of Things in a Manufacturing Plant Environment

The Internet of Things is the network of physical objects accessed through the Internet. These objects contain embedded technology that lets them interact with their internal states or the external environment. That technology allows objects in places such as manufacturing floors, energy grids, healthcare facilities and transportation systems to be controlled from virtually anywhere in the world. This connectivity also means more data can be gathered from more places, with more ways to increase efficiency and improve safety and security. The Internet of Things and the Internet of Everything (people, process, data and things) are about connecting the unconnected.
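A minimal, hypothetical Python sketch of that embedded-technology pattern: a plant-floor machine samples its own internal state, applies a local safety check and emits JSON telemetry that a supervisory system anywhere in the world could consume. The machine ID, sensor and threshold are invented for illustration, not a real plant or Cisco API.

```python
import json
import random
import time

SAFE_TEMP_C = 85.0   # assumed safety limit for this hypothetical machine

def read_spindle_temp_c():
    """Stand-in for an embedded sensor read on the machine itself."""
    return random.gauss(70.0, 10.0)

def telemetry(machine_id):
    """Sample internal state and package it for any remote supervisor."""
    temp = read_spindle_temp_c()
    return json.dumps({
        "machine_id": machine_id,
        "spindle_temp_c": round(temp, 1),
        "over_limit": temp > SAFE_TEMP_C,   # safety check made at the edge
        "ts": time.time(),
    })

if __name__ == "__main__":
    for _ in range(5):
        print(telemetry("press-07"))   # in practice: publish onto the network
        time.sleep(1)
```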


Open Source at The Large Hadron Collider and Data Gravity

I am delighted to announce a new Open Source cybergrant awarded to the Caltech team developing the ANSE project at the Large Hadron Collider. The project team, led by Caltech Professor Harvey Newman, will be further developing the world’s fastest data-forwarding network with OpenDaylight. The LHC experiment is a collaboration of the world’s top universities and research institutions; its network is designed and developed by the California Institute of Technology’s High Energy Physics department in partnership with CERN and the scientists in search of the Higgs boson, adding new dimensions to the meaning of “big data analytics”. The same project team has set most, if not all, world records in data-forwarding speeds over the last decade and is quickly approaching the remarkable 1 Tbps milestone.

Unique in its nature and remarkable in its discovery, the LHC experiment and its search for the elusive particle, the very thing that imparts mass to observable matter, is not only stretching the bleeding edge of physics but also shows that data behaves as if it has gravity too. With the exponential rise in data (2 billion billion bytes per day and growing!), services and applications are drawn to it. Moving data around is neither cheap nor trivial. Though advances in network bandwidth are in fact observed to be exponential (Nielsen’s Law), advances in compute are even faster (Moore’s Law), and advances in storage faster still. The resulting impedance mismatch forces us to feel and deal with the rising force of data gravity, a natural consequence of the laws of physics. Since not all data can be moved to the applications, moved to the core or captured in the cloud, the applications will be drawn to the data: a great opportunity for Fog computing, the natural evolution from cloud into the Internet of Things.
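Taking the figures in this post at face value, a quick worked check of why moving that data is not trivial: sustaining 2 billion billion bytes per day in real time would require

```latex
\[
\frac{2 \times 10^{18}\ \text{bytes/day} \times 8\ \text{bits/byte}}
     {86{,}400\ \text{s/day}}
\approx 1.85 \times 10^{14}\ \text{bit/s} \approx 185\ \text{Tbps},
\]
```

so even a record-setting 1 Tbps link could carry well under one percent of that stream. That arithmetic is the sense in which applications must come to the data rather than the reverse.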

Congratulations to the Caltech physicists, mathematicians and computer scientists working on this exciting project. We look forward to learning from them and to their remarkable contribution flowing into Open Source, made possible by this cybergrant, so that everyone can benefit from it, not just the elusive search for gravity and dark matter. After all, there was a method to the madness of picking elements such as Hydrogen and Helium for OpenDaylight. I wonder what comes next…


4 Key Requirements to Scale the Internet of Things

April 15, 2014 at 8:00 am PST

Today the Internet of Things (IoT) is everywhere: you can easily see smart meters on houses, parking sensors in the ground, cameras attached to traffic posts, and people wearing intelligent wristbands and glasses, all of them connected to the Internet. And this is only the tip of the iceberg: while you are reading this blog post, factories, trains and trucks around the world are also being connected to the Internet.

Many traditional industries have historically asked different types of engineers for help improving their processes and gaining efficiency. Now they are asking us, the Internet engineers, to contribute to solving new industrial-world challenges by connecting billions of new devices.

The more ambitious part of this journey is the integration of the two worlds: Information Technology (IT) and Operational Technology (OT). For that, a systems approach is required to scale the existing Internet infrastructure to accommodate IoT use cases, while making IT technology easy to adopt for OT operators. We are facing a historic opportunity to converge massive-scale systems in a way we have never seen before, and such an effort will unlock a multibillion-dollar business.

Scaling IoT

In order to be ready to capture this opportunity and scale in a sustainable manner, four requirements must be met:
