
Cisco Blog > Internet of Everything

Cisco IOx: An Application Enablement Framework for the Internet of Things

January 29, 2014
at 8:00 am PST

This is a three-part blog series that will explore some of the issues still holding back the Internet of Things (IoT), what Cisco is doing to help solve them (via Cisco IOx), and some of the real-life benefits that can be achieved.

Helping to solve the “Data Tsunami” for the Internet of Things

Big Data is a term being used a lot these days. A "Data Tsunami" would be a better descriptor. In roughly 2,000 years of recorded history, humans created 2 exabytes of data. The pace of data creation has accelerated dramatically in the last few years; we now generate over 2.5 exabytes of data every day:

  • Energy utility companies process 1.1 billion data points (0.5 TB) per day
  • A large offshore oil field produces 0.75 TB of data weekly
  • A large refinery generates 1 TB of raw data per day
  • An airplane generates 10 TB of data for every 30 minutes of flight

The Internet of Things (IoT) is enabling the proliferation of connected objects, and these objects are creating a data explosion, with data coming from billions of disparate devices located all around the world. But unless these disparate devices can work together to create meaning, all of this data is relatively useless.

Now, think about the idea that every single bit of data would have to be backhauled to a cloud-based application before it can be analyzed… we are going to run into the "Data Gravity" issue pretty fast. You can put all your data somewhere, but as it grows in size, it becomes very expensive to move around.

It is becoming very clear that the Internet of Things requires a different computing model: one that enables distributed processing of data with the resiliency, scale, speed, and mobility required to efficiently and effectively deliver the value that properly processed data can create across the network. This distributed computing model is called Fog.

Fog Computing is a paradigm that extends Cloud computing and services to the edge of the network. Like the Cloud, Fog provides data, compute, storage, and application services to end users. The distinguishing characteristics of Fog are its proximity to end users, its dense geographical distribution, and its support for mobility. Services are hosted where they're used: at the network edge, or even on end devices such as set-top boxes or access points. By hosting services locally, the Fog paradigm reduces service latency and improves QoS, resulting in a superior user experience.
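To make the pattern concrete, here is a purely illustrative sketch (not the Cisco IOx API): a hypothetical `FogNode` class that applies a real-time rule to every raw sensor reading locally and forwards only periodic summaries upstream, so the cloud sees a fraction of the raw traffic.

```python
# Illustrative fog-style edge aggregation (hypothetical, not Cisco IOx):
# react to each raw reading locally, send only compact summaries upstream.

from statistics import mean


class FogNode:
    """Buffers raw readings at the network edge; emits periodic summaries."""

    def __init__(self, window=60):
        self.window = window          # readings per summary interval
        self.buffer = []

    def ingest(self, reading):
        """Called for every raw sample; acts locally, buffers briefly."""
        if reading > 100.0:           # local rule: react in real time
            self.actuate_alarm(reading)
        self.buffer.append(reading)
        if len(self.buffer) >= self.window:
            return self.flush()       # only a summary leaves the edge
        return None

    def flush(self):
        summary = {
            "count": len(self.buffer),
            "mean": mean(self.buffer),
            "max": max(self.buffer),
        }
        self.buffer.clear()
        return summary                # this is all the cloud ever sees

    def actuate_alarm(self, reading):
        pass                          # e.g. trip a relay over a local interface


node = FogNode(window=4)
summaries = [s for r in [98.5, 99.1, 101.2, 97.0] if (s := node.ingest(r))]
# four raw readings leave the edge as a single summary record
```

The real-time rule runs entirely at the edge, so a latency-sensitive decision (tripping an alarm) never waits on a round trip to a cloud application.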

If you think about it for a second, a distributed computing model similar to Fog is already in use by many of the sensors people use in their everyday life; some of them are called "wearables." Fitness gear such as Fitbit devices, the Nike+ FuelBand, and the Jawbone UP use local communication methods (such as Bluetooth, or even direct connectivity via USB or the audio jack) to make short-range connections that enable quick data processing on the user's smartphone, instead of relying on a direct internet connection from the wearable (sensor) to a cloud-based app. The smartphone provides the bit of local, distributed computing the user needs through the app. The local interface allows the wearable to provide information and status updates that are processed locally and, if needed, communicated over the internet to the central data analysis and storage application in the cloud. The intelligence, the app, and the interface are all distributed in the most efficient way.

For industrial IoT environments, Cisco is now enabling the network infrastructure to play, for industrial sensors, the part that the smartphone plays for personal gear. In essence, Cisco is opening up computing and storage resources within network devices to host applications and interfaces as close to the devices as possible, thus enabling Fog via the network.

As for why you would not use your smartphone as that distributed computing layer for IoT: very simply, distributed intelligence (much like real estate) is all about location. You need local, near-range communication methods and protocols to interact with the devices and make decisions in real time. Unless you plan to place a smartphone or tablet close to every one of these devices (hardened for the harsh conditions of outdoor and industrial environments, supporting every interface these devices need, and compliant with the regulations for those environments), a smartphone app for these IoT environments would still require you to blindly connect every device to the internet first, traverse all the data across the internet to the smartphone so it can be processed and acted upon, relay the appropriate action back to the device, and then link it to a cloud-based app. You would need twice the number of connections and twice the bandwidth. You would also have the delays, latency, and all the other problems inherent in fully traversing the network for each data point. For IoT, that just does not work. For IoT you need local processing for much of your data; thus, you need a Fog model.
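The bandwidth side of this argument can be made concrete with a back-of-the-envelope calculation. All numbers below are assumptions chosen for illustration, not Cisco figures: a field of sensors whose every sample is backhauled to the cloud, versus a fog node that forwards one summary record per sensor per interval.

```python
# Back-of-the-envelope upstream bandwidth comparison (assumed numbers):
# cloud-only backhaul of every raw sample vs. fog-node summaries.

SENSORS = 10_000
SAMPLES_PER_SEC = 10          # raw samples per sensor
SAMPLE_BYTES = 200            # payload plus headers per raw sample
SUMMARY_BYTES = 200           # one aggregate record per sensor per interval
INTERVAL_SEC = 60             # fog node summarizes once a minute

cloud_only_bps = SENSORS * SAMPLES_PER_SEC * SAMPLE_BYTES * 8
fog_bps = SENSORS * SUMMARY_BYTES * 8 / INTERVAL_SEC

print(f"cloud-only backhaul: {cloud_only_bps / 1e6:.1f} Mbit/s")
print(f"fog summaries:       {fog_bps / 1e6:.3f} Mbit/s")
print(f"reduction factor:    {cloud_only_bps / fog_bps:.0f}x")
```

Under these assumptions the reduction factor is simply samples-per-second times the summary interval (here 10 × 60 = 600x), before even counting the second leg of the smartphone detour described above.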

Quick interview with Guido Jouret, Cisco VP of IoT



1 Comment


  1. I believe that Cloud was designed for business or consumer applications and systems, but the requirements of IoT are not the same. Trying to fit IoT into the existing Cloud(s) would be a great mistake. A new global Internet of Things needs to be designed. I am not familiar with Fog, but maybe it can be a start.
