Dreamhack Winter 2013, held in the small city of Jönköping in central Sweden, claims to be the world’s largest digital event. I was skeptical at first, but when I arrived, I was completely amazed at the scale of things in this tiny part of the world, where it’s only light for five or six hours a day at this time of year.
Of course, the amount of daylight didn’t impact one bit the large numbers of people streaming into the event venue well before opening time on Thursday afternoon.
“Software is Eating the World” is a quote attributed to Marc Andreessen and further explored by his business partner Ben Horowitz. Andreessen gives compelling reasons to validate this quote. To some extent I have to agree with his reasons (though I am a little biased as a software engineer). On the other hand, having worked on software in several different domains, I wonder whether software is really that disruptive. If you look “under the hood” of software applications, you find that a lot of software is based on fundamental principles that are already 20–30 years old, yet they are still frequently used (and for good reason). That does not mean there are no new advances in software, but old and proven technologies still play an important role (as we say in mathematics, it does not become old, it becomes classic).
So maybe “Software is Eating the World” is really due to advances in hardware? Could you have run modern enterprise applications in the Cloud 20 years ago? Bandwidth alone would certainly have been a challenge. Was the iPhone a victory for software or for hardware? Much of the iPhone GUI was not that revolutionary, in my opinion, but the combination of hardware and software made for a potent technology disruption.
The abstract for this conference was designed to be a bit provocative, specifically:
“Virtualization as a concept is not new. However, in the context of Software Defined Networking, the virtualization discussion has been focusing on overlay functions, e.g. networking. What about virtualization overlays and interworking with existing architectures? What are the implications for performance and management? Are we speaking the same language?
The panelists will have an opportunity to articulate the virtualization problem space for the industry and the opportunity for the industry to address.”
My panelists included the following individuals:
I am attending South Korea’s Big Data Forum in Seoul, and one question here is, “How big is Big Data?” My friend and colleague Dave Evans has pointed out that by the end of this year, more data will be created every 10 minutes than in the entire history of the world up to 2008. Now, that’s big!
Much of this data is being created by billions of sensors that are embedded in everything from traffic lights and running shoes to medical devices and industrial machinery—the backbone of the Internet of Things (IoT). But the real value of all this data can be realized only when we look at it in the context of the Internet of Everything (IoE). While IoT enables automation through machine-to-machine (M2M) communication, IoE adds the elements of “people” and “process” to the “data” and “things” that make up IoT. Analytics is what brings intelligence to these connections, creating endless possibilities.
To understand why, let’s step back and take a look at the classic approach to Big Data and analytics. Traditionally, organizations have tended to store all the data they collect from various sources in centralized data centers. With this model, if a retailer wants to know something about the buying patterns of a certain store’s customers, it can create an analysis of loyalty card purchases based on data in the data warehouse. Collecting, cleansing, overlaying, and manipulating this data takes time. By the time the analysis is run, the customer has already left the store.
Big Data today is characterized by volume, variety, and velocity. This phenomenon is putting a tremendous strain on the centralized model, as it is no longer feasible to duplicate and store all that data in a centralized data warehouse. Decisions and actions need to take place at the edge, where and when the data is created; that is where the data and analysis need to be as well. That’s what Cisco calls “Data in Motion.” With sensors gaining more processing power and becoming more context-aware, it is now possible to bring intelligence and analytic algorithms close to the source of the data, at the edge of the network. Data in Motion stays where it is created, and presents insights in real time, prompting better, faster decisions.
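The contrast between the two models can be sketched in a few lines of code. The snippet below is purely illustrative (the class names and thresholds are hypothetical, not any Cisco API): instead of shipping every raw sensor reading to a central warehouse for later batch analysis, a small edge node keeps only a short window of recent readings, runs its analytic locally, and forwards a summary upstream only when something actionable happens.

```python
from collections import deque

class EdgeAnalyzer:
    """Toy edge node: analyzes sensor readings where they are created
    and forwards only compact summaries upstream, rather than raw data.
    Illustrative sketch only -- names and thresholds are invented."""

    def __init__(self, window=3, threshold=80.0):
        self.window = deque(maxlen=window)  # keep only recent readings
        self.threshold = threshold
        self.forwarded = []                 # what we would send upstream

    def ingest(self, reading):
        self.window.append(reading)
        avg = sum(self.window) / len(self.window)
        # The decision happens at the edge, in real time,
        # while the "customer is still in the store":
        if avg > self.threshold:
            self.forwarded.append({"event": "high_load", "avg": round(avg, 1)})
        return avg

edge = EdgeAnalyzer(window=3, threshold=80.0)
for r in [70, 75, 78, 85, 90, 92]:
    edge.ingest(r)

# Six raw readings came in, but only the threshold-crossing
# summaries would travel over the network.
print(len(edge.forwarded))  # → 2
```

In the centralized model, all six readings would be duplicated into a warehouse and analyzed later; here the insight is produced where the data is born, which is the essence of the Data in Motion idea described above.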