As information consumers who depend so much on the network or the cloud, we sometimes indulge in imagining what will happen when we really begin to feel the combined effects of Moore’s Law and Nielsen’s Law at the edge: the amount of data, let alone our ability to stream it to the edge and consume it, is simply too much for our minds to process. We have already begun to experience this today: how much information can you actually consume each day from the collective of your so-called “smart” devices, your social networks, and your other networked services, and how much more data is left behind? The same holds machine to machine: a jet engine produces terabytes of performance data in just a few minutes; it would be impossible to ship that data to some remote computer or network and still act on the engine locally in time. We already know Big Data is not just growing, it is exploding!
The conclusion is simple: one day we will no longer be able to cope unless the information is consumed differently, locally. Our brains may no longer be enough; we hope to get help. Artificial Intelligence comes to the rescue and M2M takes off, but the new system must be highly decentralized in order to stay robust, or else it will crash like some dystopian event out of H2G2. Is it any wonder that, even today, a large portion if not the majority of the world’s Internet traffic is already P2P, and that most of the world’s open source software is downloaded over P2P? Just think of Bitcoin and how it captures the imagination of the best or bravest developers and investors (and how ridiculous one of those categories could look for not realizing its potential current flaw, to the supreme delight of its developers, who will undoubtedly develop the fix, but that’s the subject of another blog).
Consequently, centralized, high-bandwidth compute will break down at the bleeding edge; the cloud as we know it won’t scale, and a new form of computing emerges: fog computing, a direct consequence of Moore’s and Nielsen’s Laws combined. Fighting this trend equates to fighting the laws of physics; I don’t think I can say it any more simply than that.
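To make the pattern concrete, here is a minimal sketch of the fog idea in Python: high-volume telemetry is summarized and acted on locally, and only a compact summary travels upstream. Every name, value, and threshold below is an illustrative assumption, not any particular edge platform’s API.

```python
import statistics

def summarize(readings):
    """Reduce a burst of raw readings to a compact summary record."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
        "min": min(readings),
    }

def act_locally(readings, threshold):
    """React at the edge immediately; no round trip to a remote data center."""
    if max(readings) > threshold:
        print("local alarm: threshold exceeded, taking corrective action")

def send_upstream(summary):
    """Stand-in for forwarding a few bytes of summary to a cloud service."""
    print("upstream payload:", summary)

# A burst of raw telemetry stays on the device; only the summary travels.
burst = [212.0, 214.5, 213.1, 288.9, 215.0]
act_locally(burst, threshold=280.0)
send_upstream(summarize(burst))
```

The design point is simply that the decision loop and the raw data stay together at the edge, while the cloud receives orders of magnitude less traffic.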
Thus the compute model has already begun to shift: we will want our Big Data analyzed, visualized, private, secure, and ready when we are. And we finally begin to realize how vital it has become: can you live without your network, data, connection, friends, or social network for more than a few minutes? Hours? Days? And when you rejoin it, how does it feel? And if you can’t, are you convinced that one day you must be in control of your own persona and your personal data, or else? Granted, while we shouldn’t worry too much about a Blade Runner dystopia or the H2G2 Krikkit story from Life, the Universe and Everything, there are some interesting things one could be doing, beyond just asking, as Philip K. Dick once did, whether androids dream of electric sheep.
To enable this new beginning, we started in open source, looking to incubate a project or two. The first, in Eclipse M2M, is one of a dozen-or-so dots we’d like to connect in the days and months to come; we call it krikkit. The possibilities afforded by this new compute model are endless. One of them could be the ability to put us back in control of our own local and personal data, rather than some central place, service, or bot currently sold as a matter of convenience, fashion, or scale. I hope that with the release of these new projects we will begin to solve that together. What better way to collaborate than in the open? Perhaps this is what the Internet of Everything and data in motion should be about.
Considering all the hype around the cloud, it’s easy to forget that we live in a world of many clouds. Organizations can’t simply tap into a single all-powerful entity located everywhere and nowhere, all at once. In reality, they must dip in and out of a complex and often challenging array of public, private, and hybrid clouds.
But what is the future of cloud? The Internet of Everything (IoE) is driving an unprecedented explosion in connectivity and transformation, and cloud is the key delivery system that makes it all possible. In the enterprise, cloud has already upended traditional IT consumption models, turning IT departments into brokers of services that are increasingly available through third-party vendors and accessed through a variety of clouds. Facing an increasingly cloudy future, service providers are focused on moving beyond their traditional telecom roles, while new players continue to enter their core markets.
But how will enterprises and service providers meet the security and operational challenges of an ever-expanding and increasingly complicated cloud universe? Part of the answer lies in the industry’s evolution toward an ecosystem of cloud providers. Incorporating a cloud “brokerage” and a cloud “federation,” this ecosystem will give customers a choice of cloud solutions that meet their specific needs.
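As a toy illustration (not any real brokerage API), a cloud broker in this sense might match a customer’s constraints against a catalog of clouds and pick the cheapest qualifying offer; the providers, fields, and prices below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class CloudOffer:
    name: str
    model: str            # "public", "private", or "hybrid"
    certified: bool       # e.g. meets the customer's compliance bar
    price_per_hour: float

CATALOG = [
    CloudOffer("ProviderA", "public",  certified=False, price_per_hour=0.08),
    CloudOffer("ProviderB", "public",  certified=True,  price_per_hour=0.12),
    CloudOffer("ProviderC", "private", certified=True,  price_per_hour=0.30),
]

def broker(model_needed, must_be_certified):
    """Return the cheapest offer that satisfies the customer's constraints."""
    candidates = [
        o for o in CATALOG
        if o.model == model_needed and (o.certified or not must_be_certified)
    ]
    return min(candidates, key=lambda o: o.price_per_hour, default=None)

print(broker("public", must_be_certified=True))  # picks ProviderB
```

A real federation would layer on identity, SLAs, and data-residency rules, but the matching idea stays the same.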
I’m happy to report that Cisco, along with some of our key partners, is helping to smooth the cloud transformation journey both on the demand (enterprise) and supply (service provider) sides.
NRF 2014 was held last week at the Javits Center in New York City. It’s the biggest retail event of the year, one where vendors show off the future of the industry to the delegates through both inspiring keynotes and exciting demos on the Expo floor.
2014 and beyond:
It wasn’t hard to identify some common themes. On Tuesday afternoon I stood on the main Expo floor and, just looking around, could quickly see the industry’s top-of-mind phrases and buzzwords popping out.
We live in a time of tremendous and challenging technological disruptions. Yet it is also a time when the opportunities for business transformation are equally vast and impactful. This is particularly true for the retail industry.
The wave of change, which Cisco calls the Internet of Everything (IoE), is fast-moving, and retailers will need to adapt quickly or be left behind. After all, this explosion in connectivity — from 10 billion things today to 50 billion in 2020 — will demand a new paradigm: the IoE-Ready Retailer. And it will enable vast improvements in customer experience, employee productivity, and supply-chain efficiency, while allowing retailers to know their customers like never before.
Cisco’s research into this new dimension in connectivity among people, process, data, and things, and the overall Value at Stake over the next 10 years, presents some mind-boggling numbers: $14.4 trillion for the private sector overall and another $4.6 trillion for public sector organizations.
By Cisco’s estimate, the retail industry will account for 11 percent of the total IoE private-sector Value at Stake over the next 10 years, second only to the manufacturing industry. Cisco believes that success for retailers will hinge on their ability to apply technology to improve the “people” and “process” aspects of their businesses, and to offer unique new connected experiences to the average shopper.
Cisco’s new research, which explores how the average consumer thinks about and adopts these connected experiences, uncovers some startling facts. Consumers now research, compare, and purchase products with one-click ease. The population of ever-connected digital natives is increasing at unprecedented rates (60%+ year over year). This affords sellers a wealth of real-time data insights that can help them stock the right products and present them in novel ways.
In this webinar you will learn:
How to manage data as an asset and transform data into business value.
Why Big Data projects fail and how to avoid common pitfalls.
How to optimize a Big Data supply chain.
What infrastructure is required to run and automate data workload processing.
Big Data has become the next big thing, not only for the promise of finding the “needle in the haystack” of untapped revenues, but also for the possibility of uncovering and delivering exciting new business models.
For IT, Big Data can be both exciting and daunting: exciting because delivering analysis from huge amounts of data more quickly than with existing Business Intelligence (BI) tools lets data center managers deliver new services to business end users; daunting because researching, selecting, buying, staffing, and managing Big Data tool sets pretty much demands a separate data center.
We invited our partner Informatica to share its knowledge of managing complex data supply chains, and together we have produced this webinar, with insight from the leading analyst firm Gartner, to help you align how you think about your data with the infrastructure it runs on.
By incorporating these Big Data essentials into your planning, you can prevent your sandbox from turning into quicksand and avoid project delays and cost overruns.
This webinar consists of three parts:
Doug Laney, Research VP for business analytics at Gartner, begins with a lesson in Infonomics: managing data as an asset, and the skills required for Big Data analytics.
John Haddad, Senior Director of Big Data Product Marketing at Informatica, explains how to turn those data assets into actionable information that generates business value, and how to develop a Big Data supply chain.
Ronnie Ray, Director of Product Management for Cisco Cloud and Systems Management, completes the discussion by describing a Big Data infrastructure and operations platform built to meet, and exceed, your business service levels.
Many of our customers already agree: Cisco delivers Big Data infrastructure and data workload processing management with the Cisco Common Platform Architecture (CPA). CPA is based on the Cisco Unified Computing System (UCS), Unified Fabric, and Unified Management. The Cisco Big Data solution delivers:
Integrated Server Management
Cisco UCS unifies computing, networking, management, virtualization, and storage access into a single integrated architecture. It’s an ideal platform for Big Data applications.
Integrated Network Management
The same Cisco network architecture can serve traditional applications and databases as well as Big Data processing solutions.
Integrated Data Processing Management
Cisco provides workload automation that manages the flow of data from a wide variety of applications into and out of your Big Data processing environments.
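As a rough, purely illustrative sketch of what such workload automation does conceptually (this is generic Python, not Cisco’s product or its API), the steps of a Big Data pipeline can be run in dependency order: ingest raw data into the cluster, process it, then export the results. The step names and graph below are assumptions made up for the example.

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Each step lists the steps it depends on; the scheduler orders them.
STEPS = {
    "ingest":  [],           # copy raw files into the cluster
    "process": ["ingest"],   # e.g. a batch analytics job over the ingested data
    "export":  ["process"],  # push results back out to downstream systems
}

def run(step):
    """Stand-in for launching one unit of data-movement or processing work."""
    print(f"running step: {step}")

# TopologicalSorter yields each step only after its dependencies complete.
for step in TopologicalSorter(STEPS).static_order():
    run(step)
```

The point of automation at this layer is that the dependency graph, not an operator, decides when data moves into and out of the Big Data environment.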