Cisco Blogs

Fog Computing: The New Model for the IoE

January 14, 2015

Last month, we proudly announced Connected Analytics for the Internet of Everything (IoE), a set of easy-to-deploy software packages that bring analytics to data regardless of its location. It continues our commitment to delivering on our vision for fog computing, also called edge computing, a model that does not require moving data back to a centralized location for processing. If you’ve been reading my blog, you’ve seen me write about this as ‘Analytics 3.0’: the ability to do analytics in a widely distributed manner, at the edge of the network and on streaming data. This capability is unique to Cisco and critical for deriving real-time insights in the IoE era.

In the traditional computing method, data is aggregated, moved, and stored in a central repository, such as a data lake or enterprise data warehouse, as soon as it is generated, so that it can be analyzed for insight. In the IoE, data is massive, messy, and everywhere – spanning centralized repositories in multiple clouds and data warehouses. Increasingly, data is also being created in massive volume in a very distributed way…from sensors on offshore oil rigs, ships at sea, airplanes in flight, and machines on factory floors. In this new world, the traditional method runs into serious problems: not only is it expensive and time consuming to move all of this data to a central place, but critical data can also lose its real-time value along the way. In fact, many companies have stopped moving all of their data into a central repository and accepted that data will live in multiple places.


Analytics 3.0 offers a more appropriate model, in which the path to insight is different: it combines traditional centralized data storage and analysis with data management and analytics that happen at the edge of the network…much closer to where the huge volume of new data is being created. Analytics involves complicated statistical models and software, but the concept is simple: use software to look for patterns in data so you can make better decisions. It makes sense, then, to have this software close to where data is created, so you can find those patterns more quickly…and that is the key concept behind Analytics 3.0. Once the data is analyzed at the edge, we can make more intelligent decisions about what should be stored, moved, or discarded. This model lets us get to the ‘interesting data’ faster and avoids the cost of storing and moving the ‘non-interesting data.’
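To make the idea concrete, here is a minimal sketch of edge-side filtering in Python. Everything in it is illustrative – the sensor names, fields, and thresholds are hypothetical, not a Cisco product API – but it shows the pattern: analyze readings where they are created, forward only the ‘interesting’ ones to central storage, and discard the rest locally.

```python
# Hypothetical edge-filtering sketch: score each reading as it arrives and
# decide whether it is worth moving to a central repository.

def is_interesting(reading, expected_mean=70.0, tolerance=15.0):
    """Flag readings that deviate sharply from the expected operating range."""
    return abs(reading["temp_c"] - expected_mean) > tolerance

def process_at_edge(stream):
    """Split a stream of sensor readings into 'forward' and 'discard' piles."""
    forward, discarded = [], 0
    for reading in stream:
        if is_interesting(reading):
            forward.append(reading)   # worth moving to central storage
        else:
            discarded += 1            # analyzed locally, never shipped
    return forward, discarded

# Illustrative readings from a single (hypothetical) offshore-rig sensor.
readings = [
    {"sensor": "rig-7", "temp_c": 71.2},
    {"sensor": "rig-7", "temp_c": 69.8},
    {"sensor": "rig-7", "temp_c": 98.4},  # anomaly worth forwarding
]
interesting, dropped = process_at_edge(readings)
```

In this sketch, only one of the three readings crosses the wire; the other two are counted and dropped at the edge, which is exactly the storage-and-transport savings the model describes.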

Analytics 3.0 is not about replacing big data analytics, cloud analytics, and other centralized analytics. Those elements are all part of Analytics 3.0, but they are not sufficient to handle the volume of massively distributed data created in the IoE, so they must be augmented with the ability to process and analyze data closer to where it is created. By combining centralized data sources with streaming data at the edge, you will find new patterns in your data. Those patterns will help you make better decisions about growing your business, optimizing your operations, or better serving your customers…and that is the power of Analytics for the IoE.


Join the Conversation

Follow @MikeFlannagan and @CiscoAnalytics.

Learn More from My Colleagues

Check out the blogs of Mala Anand, Bob Eve, and Nicola Villa to learn more.


