In my Internet of Things keynote at LinuxCon 2014 in Chicago last week, I touched upon a new trend: the rise of a new kind of utility or service model, the so-called IoT-specific service provider model, or IoT SP for short.
I had a recent conversation with a team of physicists at the Large Hadron Collider at CERN. I told them they would be surprised to hear how computer scientists talk these days about Data Gravity. Programmers are notorious for overloading common words, adding connotations galore, and messing with meanings entrenched in our natural language.
We all laughed and then the conversation grew deeper:
- Big data is very difficult to move around: it takes energy, time, and bandwidth, and is therefore expensive. And it is growing exponentially larger at the outer edge, with tens of billions of devices producing it at an ever faster rate, from an ever-increasing set of places on our planet and beyond.
- As a consequence of the laws of physics, we have an impedance mismatch between the core and the edge. I coined this the Moore-Nielsen paradigm (described in my talk as well): data accumulates at the edge faster than the network can push it into the core (see the sketch after this list).
- Therefore, big data accumulated at the edge will attract applications (little data, or procedural code), so apps will move to the data, not the other way around, behaving as if data has “gravity”.
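To put rough numbers on that mismatch, here is a minimal back-of-the-envelope sketch in Python. The growth rates and normalized starting values are illustrative assumptions of mine, not measured figures: edge data generation paced by Moore's Law (doubling roughly every 18 months) and network bandwidth paced by Nielsen's Law (roughly 50% growth per year).

```python
# Back-of-the-envelope sketch of the Moore-Nielsen impedance mismatch.
# Illustrative assumptions: edge data generation doubles every 18 months
# (a Moore's-Law pace), while network bandwidth into the core grows ~50%
# per year (Nielsen's Law). Both start from a normalized value of 1.0.

MOORE_ANNUAL_GROWTH = 2 ** (12 / 18)   # ~1.59x per year (doubling every 18 months)
NIELSEN_ANNUAL_GROWTH = 1.5            # ~1.5x per year (Nielsen's Law)

edge_data = 1.0   # normalized volume of data produced at the edge
bandwidth = 1.0   # normalized network capacity into the core

for year in range(1, 11):
    edge_data *= MOORE_ANNUAL_GROWTH
    bandwidth *= NIELSEN_ANNUAL_GROWTH
    print(f"year {year:2d}: edge data {edge_data:6.1f}x, "
          f"bandwidth {bandwidth:5.1f}x, mismatch {edge_data / bandwidth:4.2f}x")
```

However you tune the assumed rates, as long as data production outpaces bandwidth growth the gap compounds year over year, which is exactly why the apps end up moving to the data.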
Therefore, the notion of a very large centralized cloud that would control the massive rise of data spewing from tens of billions of connected devices runs against both the laws of physics and Open Source, not to mention the thirst for freedom (no vendor lock-in) and privacy (no data lock-in). The paradigm has shifted; we have entered the third big wave (after the mainframe's decentralization to client-server, which in turn centralized into the cloud): the move to a highly decentralized compute model, where the intelligence shifts to the edge as apps come to the data, at much larger scale, machine to machine, with little or no human interface or intervention.
The age-old dilemma pops up again: do we go vertical (domain specific) or horizontal (an application development or management platform)? The answer has to be based on necessity, not fashion; we have to do this well, hence vertical domain knowledge is overriding. With the declining cost of computing, we finally have the technology to move to a much more scalable and empowering model, the new opportunity in our industry, the mega trend.
Very reminiscent of the early '90s and the beginning of the ISP era, isn't it? This time it is much more vertical, with deep domain knowledge: connected energy, connected manufacturing, connected cities, connected cars, connected home, safety and security. These innovation hubs all share something in common: an Open and Interconnected model, made easy by dramatically lower compute costs and the ubiquity of open source, to overcome all barriers to adoption, including the previously weak security and privacy models predicated on a central core. We can divide and conquer, dealing with data in motion differently than we deal with data at rest.
The so-called “wheel of computer science” has completed one revolution, just as its socio-economic observation predicted. The next generation has arrived, ready to help evolve or replace its aging predecessor. Which one, or rather which vertical, will be first…?
Tags: Big Data, big data analytics, CERN, cloud, Data Gravity, Fog computing, gravity, IoT, IoTSP, ISP, keynote, LHC, Linux, LinuxCon, M2M, Moore’s law, Nielsen's Law, open source, SP
I am delighted to announce a new Open Source cybergrant awarded to the Caltech team developing the ANSE project at the Large Hadron Collider. The project team, led by Caltech Professor Harvey Newman, will be further developing the world's fastest data forwarding network with OpenDaylight. The LHC experiment is a collaboration of the world's top universities and research institutions; its network is designed and developed by the California Institute of Technology High Energy Physics department in partnership with CERN and the scientists in search of the Higgs boson, adding new dimensions to the meaning of “big data analytics”. This is the same project team that has set most, if not all, of the world records in data forwarding speeds over the last decade and is quickly approaching the remarkable 1 Tbps milestone.
Unique in its nature and remarkable in its discovery, the LHC experiment and its search for the elusive particle, the very thing that imparts mass to observable matter, is not only stretching the bleeding edge of physics; it also lets us observe that data behaves as if it has gravity too. With the exponential rise in data (2 billion billion bytes per day and growing!), services and applications are drawn to “it”. Moving data around is neither cheap nor trivial. Though advances in network bandwidth are in fact observed to be exponential (Nielsen's Law), advances in compute are even faster (Moore's Law), and storage faster still. Thus the impedance mismatch between them forces us to feel and deal with the rising force of data gravity, a natural consequence of the laws of physics. Since not all data can be moved to the applications, nor moved to the core, nor captured in the cloud, the applications will be drawn to the data instead: a great opportunity for Fog computing, the natural evolution from the cloud into the Internet of Things.
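To get a feel for why moving that much data is neither cheap nor trivial, here is a quick illustrative calculation; the assumption that a single 1 Tbps link (the milestone the team is approaching) runs flat out with no protocol overhead is mine:

```python
# Rough estimate: how long would it take to move one day of LHC-scale data
# (~2 billion billion bytes, i.e. ~2 exabytes, the figure quoted above)
# over a single 1 Tbps link, assumed to run at full rate with no overhead?

bytes_per_day = 2e18     # ~2 exabytes per day, as cited in the post
link_bps = 1e12          # 1 Tbps

seconds = bytes_per_day * 8 / link_bps
print(f"{seconds:,.0f} seconds ≈ {seconds / 86400:.0f} days")
# -> about 16,000,000 seconds, i.e. roughly 185 days to move one day's data.
```

Even at record-setting forwarding speeds, a day of data takes the better part of a year to ship, which is precisely why the applications come to the data.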
Congratulations to the Caltech physicists, mathematicians and computer scientists working on this exciting project. We look forward to learning from them and to their remarkable contributions flowing into Open Source, made possible by this cybergrant, so that everyone can benefit from it, not just the elusive search for gravity and dark matter. After all, there was a method to the madness of picking elements such as Hydrogen and Helium for OpenDaylight. I wonder what comes next…
Tags: ANSE, California Institute of Technology, Caltech, CERN, cloud, Data Gravity, Fog computing, Hadron, Hadron Collider, Helium, Higgs boson, Hydrogen, Internet of Things (IoT), IoT, LHC, Open Daylight, open source, opendaylight, physics
Faster than the speed of light?
The biggest news to hit the world of physics recently was that neutrinos – “ethereal particles which pervade the universe but rarely interact with anything while they are doing so” – were shown in an experiment to have traveled faster than the speed of light. According to Einstein’s Theory of Relativity, this is impossible.
The experiment was conducted at CERN in September. Researchers shot neutrinos from Switzerland to Italy and measured the time it took for the neutrinos to arrive. The results showed them arriving 60 nanoseconds earlier than light would have.
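As a rough sanity check on what 60 nanoseconds means here, consider this small back-of-the-envelope calculation. The ~730 km CERN-to-Gran-Sasso baseline is my assumption (the post only says Switzerland to Italy), and the arithmetic is approximate:

```python
# Back-of-the-envelope check of what a 60 ns early arrival would imply.
# Assumption (not stated above): a ~730 km baseline from CERN to the
# detector in Italy, the commonly cited distance for this experiment.

C = 299_792_458        # speed of light in vacuum, m/s
baseline_m = 730e3     # ~730 km
early_ns = 60          # reported early arrival, in nanoseconds

light_time_ns = baseline_m / C * 1e9    # light travel time over the baseline
excess = early_ns / light_time_ns       # approximate fractional excess (v - c)/c
print(f"light travel time ≈ {light_time_ns / 1e6:.2f} ms, "
      f"(v - c)/c ≈ {excess:.1e}")
# -> roughly 2.4 ms of flight time and a fractional excess of about 2.5e-5.
```

In other words, the claim amounted to neutrinos beating light by a few parts in a hundred thousand over the whole trip, small in absolute terms, but enormous if Relativity is right that nothing can beat light at all.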
Is Einstein wrong? Are the laws of theoretical physics about to come crashing down around us? Um, not so fast (pun intended).
Tags: cache engines, CERN, Cisco, Cisco Router, einstein, experiment, Hadron Collider, Neutrinos, Sir Tim Berners-Lee, WWW