The Internet of Things (IoT) has been with us for a while, but in recent years we have seen a change in scale, in part due to the emergence of cheaper sensors. Cities are deploying sensors to improve the quality of life for their citizens, while factories are connecting more and more machines and collecting more data about their production processes. Supply chains are being revolutionized by tracking in real time not only position but also movement (shaking, dropping), humidity, and more. In almost every industry you can see the impact of IoT.
Due to this change in scale, new challenges are emerging that demand a rethink of the Cloud-only paradigm and the siloed approach to IoT.
IoT typically means deploying application intelligence and analytics at the edge (the area between Cloud/data centers and end points such as sensors, factory robots, etc…), or pushing data directly to the Cloud for processing. Both approaches have their advantages as well as potential drawbacks.
The network between the edge and the Cloud can be relatively expensive (especially if you send all data to the Cloud) or have limited capacity (capacity is, of course, correlated with price). Latency to the Cloud can also be relatively high, and often lacks determinism. For example, changing the color of traffic lights via the Cloud might not be optimal.
More and more solutions are being deployed at the edge to address the challenges the Cloud faces, but these solutions have their drawbacks too. The proliferation of different hardware and software solutions makes it challenging to manage edge services in a consistent and coherent manner.
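The bandwidth trade-off above can be illustrated with a small sketch (the sensor values, threshold, and function name are hypothetical, not tied to any specific product): an edge node aggregates raw readings locally and forwards only a compact summary plus any anomalies, rather than streaming every sample to the Cloud.

```python
from statistics import mean

def summarize_at_edge(readings, anomaly_threshold):
    """Aggregate raw sensor readings at the edge; forward only a summary
    plus any anomalous samples, instead of every data point."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,  # only these need immediate Cloud attention
    }

# 1,000 temperature samples collected locally...
samples = [20.0] * 997 + [85.0, 90.0, 20.0]
summary = summarize_at_edge(samples, anomaly_threshold=80.0)
# ...but only one small record crosses the network to the Cloud.
print(summary["count"], len(summary["anomalies"]))  # → 1000 2
```

The point of the sketch is simply that the edge-to-Cloud link carries one small record instead of a thousand raw samples; determinism-sensitive decisions (like the traffic-light example) can likewise act on local data without a Cloud round trip.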
An IoT deployment is typically not confined to the traditional enterprise IT domain (au contraire). This means that traditional security solutions do not always apply, resulting in potentially high-risk security breaches: it is not only about stealing data, but also about controlling machines, for example manufacturing robots or the location of vehicles.
One trend we are seeing is that providers of edge services want to focus on their service (application), as this is where their expertise lies. Today, however, many providers also need to supply the hardware (not always a good source of revenue), a certain level of security (not always their primary area of expertise), and a way to manage their services and devices (which can pose a challenge if a customer deploys multiple silos of IoT services).
A unified platform beyond Cloud
To address the challenges described above, a rethink is needed. On the one hand, the Cloud-only paradigm is not sufficient; on the other, any new platform needs to support a Cloud-like methodology for the edge. The emphasis here is on "like", as the edge differs from a Cloud/data center in several important aspects: limited resources, limited network capacity, security challenges, and resource distribution. However, such a platform will also have things in common with Clouds: just as in a Cloud environment, it needs to manage the (edge) service life cycle and orchestrate deployment.
Fog, a driver for IoT
With such a platform in place, edge service providers can focus on their core business as this new platform provides them with hooks to develop, deploy, scale, monitor, and manage their services in a secure and safe environment while seamlessly connecting to the Cloud.
Moving to Cloud and beyond
The vision of such a unified platform has been described by Bonomi et al. and labeled Fog Computing. We are now seeing this vision unfold in several distinct stages.
Unified (IP-based) connectivity is typically the first stage: for example, cities offering free Wi-Fi in the city center, or factories consolidating their different networks.
Once unified connectivity is in place, it becomes easier to deploy services at the edge by connecting hardware to this IP network. This can lead to service silos, which are sometimes difficult to avoid due to legacy applications and hardware.
The next stage is the deployment of a unified platform (a Fog platform) between the Cloud and the endpoints, both to extend service deployment beyond the Cloud and to spur innovation by making it easier to share data between these services. This stage is where the true added value lies, as service management is unified and hardware platforms can be consolidated.
This paradigm shift towards thinking beyond the Cloud to a unified platform will lead to new products, services, and business models, but it can also increase the risk of fragmentation due to a lack of standards, architectural vision, and abstraction. For this paradigm shift to truly succeed, it is therefore important to have a continuous conversation between the IT and OT industries.
To ensure companies capture the value of IoT, it is important to start the thought process on a Fog and IoT vision early on: service deployments, connectivity capacity beyond the Cloud, data filtering and analytics at the edge, device consolidation, real-time requirements, and more.
Such an IoT vision will enable companies to better prepare and understand the risks and opportunities in an increasingly connected world.
Tags: Corporate Technology Group, CTG, Fog, IoT
Ten large oil refineries produce about 10 terabytes of data each day, which equates to the entire printed collection of the U.S. Library of Congress.
One modernized city the size of Singapore can generate about 2.5 petabytes of data every day, which translates to all U.S. academic research libraries combined.
And with more than 14 billion data-transmitting devices connected to the Internet today, growing to 50 billion by 2020, it is little wonder that most of us are overwhelmed by this mind-boggling explosion of data.
Turning this flood of raw data into useful information, and even wisdom, for better business decisions and quality-of-life experiences is what the Internet of Everything (IoE) is all about. This is a daunting task. According to IDC Research, just 0.5% of all data is used or analyzed, and online data volumes are doubling every two years, driven by a combination of mobile devices, video, sensors, M2M, social media, applications, and much more.
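The "doubling every two years" claim implies a compound growth rate that is easy to check with a couple of lines (a back-of-the-envelope sketch, not a figure from the IDC report):

```python
# "Doubling every two years" is compound growth of 2**(1/2), i.e. ~41% per year.
annual_growth = 2 ** (1 / 2) - 1
print(f"{annual_growth:.0%}")  # → 41%

# After a decade that's five doublings: 2**5 = 32x today's data volume.
decade_factor = 2 ** (10 / 2)
print(int(decade_factor))  # → 32
```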
Connected Analytics Portfolio
Last Thursday, however, Cisco unveiled our Connected Analytics portfolio for the Internet of Everything, a unique approach that includes software packages to bring analytics to the data, regardless of its location or whether it is in motion or at rest. This new generation of analytics tools for IoE can convert more and more data into valuable intelligence, from the Intercloud to the data center to the network's edge.
Tags: analytics, Big Data, Cisco, Fog, Internet of Everything, internet of things, IoE, IoT, Process Improvement, Wim Elfrink
This week at the Internet of Things World Forum we are challenging the industry to accelerate the adoption of the Internet of Things (IoT). IoT leverages many off-the-shelf technologies but also has some unique requirements that must be met. How do we make sure that the right critical information is being processed while conserving bandwidth and keeping the network resilient? Here at Cisco, "fog computing" is a clear technology vision with the means to provide greater visibility and control: having the network and applications process critical data in concert with the cloud. With today's announcement at the IoT World Forum, Cisco continues to deliver on its vision for fog computing with an increase in the number of platforms supporting Cisco IOx and the addition of application management capabilities.
Earlier this year we announced the availability of Cisco IOx as part of the Cisco Fog portfolio of technologies. Cisco IOx allows customers and solution providers across all industries to develop, manage, and run software applications directly on Cisco industrial networked devices, including hardened routers, switches, and other devices. We have seen tremendous market traction for Cisco IOx over the last few months, alongside accelerating IoT market growth. As IoT transitions from early adoption to wide deployment, Cisco IOx is enabling solution providers across many industries to create innovative software solutions. Today's announcement of the second phase of the IOx platform builds on the continuing momentum of Cisco's vision for Fog computing.
Tags: Fog, internet of things, IoT, IoTWF, IOx
In 1984, John Gage of Sun Microsystems coined the phrase “the network is the computer” as computing functions started to become increasingly distributed across the network. Today, boundaries that once separated individual computers have disappeared and application processing is enabled—and managed—by the network. We are now at the forefront of a new market transition, as eloquently explained by Rick van der Lans in his paper, “The Network Is the Database.”
The network is indeed becoming the database. Big Data and the related approach to database management are moving away from a centralized data warehouse model and literally starting to flow across the network. We are virtualizing data management by leaving data in the network, instead of copying it into a data center. Data stays in motion wherever and whenever it’s needed across the network, instead of being at rest.
What does this mean for business value? A distributed—and virtualized—data management approach solves the three major issues of Big Data: volume, variety, and velocity.
Tags: 3 Vs, analytics, analytics at the edge, Big Data, Cisco, Cisco Consulting Services, distributed network architecture, Fog, Internet of Everything
March is a rather event-laden month for Open Source and Open Standards in networking: the 89th IETF, EclipseCon 2014, RSA 2014, the Open Networking Summit, the IEEE International Conference on Cloud (where I'll be talking about the role of Open Source as we morph the Cloud down to Fog computing), and my favorite, the one and only Open Source Think Tank, where this year we dive into the not-so-small world (there is plenty of room at the bottom!) of machine-to-machine (M2M) and Open Source, which some call the Internet of Everything.
There is a lot more to March Madness, of course. In the case of Open Source, it is a good time to celebrate the first anniversary of "Meet Me on the Equinox", the fleeting moment when daylight conquered the night: the day that project Daylight became OpenDaylight. As I reflect on how quickly it started and grew from the hearts and minds of folks more interested in writing code than talking about standards, I think about how much the Network, previously dominated (as it should be) by Open Standards, is now beginning to run with Open Source, as it should. We captured that dialog with our partners and friends at the Linux Foundation in this webcast, which I hope you'll enjoy. I hope you'll join us this month in one of these neat places.
As Open Source has become dominant in just about everything (Virtualization, Cloud, Mobility, Security, Social Networking, Big Data, the Internet of Things, the Internet of Everything, you name it), we get asked: how do we get the balance right? How does one work with the rigidity of Open Standards and the fluidity of Open Source, particularly in the Network? There is only one answer: think of it as the Yang of Open Standards and the Yin of Open Source. They need each other; they cannot function without the other, particularly in the Network. Open Source is just the other side, the wild side!
Tags: Big Data, cloud, Eclipse, Fog, IEEE, ietf, internet of things, IoE, IoT, Linux, Linux Foundation, M2M, network, Open Daylight, open source, open standards, social networking, virtualization, Yin Yang