Considering all the hype around the cloud, it’s easy to forget that we live in a world of many clouds. Organizations can’t simply tap into a single all-powerful entity located everywhere and nowhere, all at once. In reality, they must dip in and out of a complex and often challenging array of public, private, and hybrid clouds.
But what is the future of cloud? The Internet of Everything (IoE) is driving an unprecedented explosion in connectivity — and transformation — and cloud is the key delivery system that makes it all possible. In the enterprise, cloud has already upended traditional IT consumption models, transitioning IT departments into brokers of services that are increasingly available through third-party vendors — and accessed through a variety of clouds. Facing an increasingly cloudy future, service providers are focused on moving beyond their traditional roles as telecom providers, while new players continue to enter the core markets of traditional service providers.
But how will enterprises and service providers meet the security and operational challenges of an ever-expanding and increasingly complicated cloud universe? Part of the answer lies in the industry’s evolution toward an ecosystem of cloud providers. Incorporating a cloud “brokerage” and a cloud “federation,” this ecosystem will give customers a choice of cloud solutions that meet their specific needs.
I’m happy to report that Cisco, along with some of our key partners, is helping to smooth the cloud transformation journey both on the demand (enterprise) and supply (service provider) sides.
This is a three-part blog series that will explore some of the issues still holding back the Internet of Things (IoT), what Cisco is doing to help solve them (via Cisco IOx), and some of the real-life benefits that can be achieved.
Helping to solve the “Data Tsunami” for the Internet of Things
Big Data is a term being used a lot these days, but “Data Tsunami” would be a better descriptor. In roughly 2,000 years of recorded history, humans created 2 exabytes of data. The pace of data creation has accelerated dramatically in the last few years; we now generate over 2.5 exabytes of data every day:
Energy utility companies process 1.1 billion data points (0.5TB) per day
A large offshore oil field produces 0.75TB of data weekly
A large refinery generates 1TB of raw data per day
An airplane will generate 10TB of data for every 30 minutes of flight
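To put those rates in perspective, here is a quick back-of-the-envelope calculation using only the figures quoted above (the 8-hour flight day is an illustrative assumption, not from the text):

```python
# Back-of-the-envelope daily data volumes from the rates quoted above.
# All source figures come from the text; the 8-hour flight day is assumed.

TB = 1.0                          # work in terabytes
airplane_per_30min = 10 * TB      # 10TB per 30 minutes of flight
refinery_per_day = 1 * TB         # 1TB of raw data per day
oilfield_per_week = 0.75 * TB     # 0.75TB per week
utility_per_day = 0.5 * TB        # 1.1 billion data points (0.5TB) per day

# One airplane flying an assumed 8 hours a day:
airplane_per_day = airplane_per_30min * (8 * 60 / 30)

# Daily totals for one of each source:
oilfield_per_day = oilfield_per_week / 7
total_per_day = (airplane_per_day + refinery_per_day
                 + oilfield_per_day + utility_per_day)

print(f"airplane/day: {airplane_per_day:.1f} TB")   # a single plane dominates
print(f"total/day:    {total_per_day:.2f} TB")
```

Even this toy tally of four sources lands above 160TB a day, and a single connected airplane accounts for almost all of it, which is why the “tsunami” framing is apt.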
The Internet of Things (IoT) is enabling the proliferation of connected objects, and these objects are creating a data explosion, with data coming from billions of disparate devices located all around the world. But unless these devices can work together to create meaning, all of this data is relatively useless.
We live in a time of tremendous and challenging technological disruptions. Yet it is also a time when the opportunities for business transformation are equally vast and impactful. This is particularly true for the retail industry.
The wave of change, which Cisco calls the Internet of Everything (IoE), is fast-moving, and retailers will need to adapt quickly or be left behind. After all, this explosion in connectivity — from 10 billion things today to 50 billion in 2020 — will demand a new paradigm: the IoE-Ready Retailer. And it will enable vast improvements in customer experience, employee productivity, and supply-chain efficiency, while allowing retailers to know their customers like never before.
Cisco’s research into this new dimension in connectivity among people, process, data, and things — and the overall Value at Stake over the next 10 years — presents some mind-boggling numbers: $14.4 trillion for the private sector overall and another $4.6 trillion for public sector organizations.
According to Cisco’s estimate, the retail industry will account for 11 percent of the total IoE private sector Value at Stake over the next 10 years — second only to the manufacturing industry. Cisco believes that success for retailers will hinge particularly on their ability to apply technology to improve the “people” and “process” aspects of their businesses, and to offer unique, new connected experiences to the average shopper.
Cisco’s new research, which explores how the average consumer thinks about and adopts these connected experiences, uncovers some startling facts. Consumers now research, compare, and purchase products with one-click ease. The population of ever-connected digital natives is increasing at unprecedented rates (60%+ year over year). This affords sellers a wealth of real-time data insights that can help them stock the right products and present them in novel ways.
My 2014 predictions are finally complete. If Open Source equals collaboration and credibility, 2013 was nothing short of spectacular. As an eternal optimist, I believe 2014 will be even better:
Big data’s biggest play will be in meatspace, not cyberspace. We produce and give away so much data that the great opportunity for analytics now lies in the real world.
Privacy and security will become ever more important, particularly through Open Source rather than closed software. Paradoxically, this is actually good news: as Open Source shows us again and again, transparency wins, and just as in biological systems, the most robust mechanisms get by with fewer secrets than we think.
The rise of “fog” computing as a consequence of the Internet of Things (IoT) will, unfortunately, be driven by fashion for now (wearable computers). It will make us think again about what we have done in giving up our data, and prompt us to reread #1 and #2 above with a different and more open mind. Again!
Virtualization will have its biggest year yet in networking. Just as the hypervisor rode Moore’s Law in server virtualization and found a neat application in #2 above, a different breed of projects like OpenDaylight will emerge. But the drama here is more challenging, because the network scales very differently than CPU and memory; it is a much harder problem. Thus, networking vendors embracing Open Source may fare well.
Those that didn’t quite “get” Open Source as the ultimate development model will rediscover it as Inner Source (ACM, April 1999), the only long-term viable development model. Or so they think, as the glamor of new-style Open Source projects (OpenStack, OpenDaylight, AllSeen) with big budgets, big marketing, and big drama may in fact be too seductive. Only those that truly understand the two key things that make an Open Source project successful will endure.
AI will make a comeback in a recently morphed form: not just robotics, but something the AI of a generation ago did not anticipate, something now called cognitive computing — perhaps indeed the third era in computing! The story of Watson going beyond obliterating Jeopardy contestants and looking to open up and find commercial applications is a truly remarkable thing to observe in our lifetimes. This may in fact be a much nobler use of big data analytics (and other key Open Source projects) than #1 above. But can it exist without it?
Finally, Gen Z developers will discover Open Source and embrace it just like their Millennial (Gen Y) predecessors. The level of sophistication and interaction will rise, and projects ranging from Bitcoin to qCraft will become intriguing, presenting a different kind of challenge. More importantly, the previous generation can now begin to relax knowing the gap is closing, the ultimate development model is in good hands, and it can begin to give back more than ever before. Ah, the beauty of Open Source…
More and more enterprises are managing distributed infrastructures and applications that need to share data. This data sharing can be viewed as data flows that connect (and flow through) multiple applications. Applications are partly managed on-premise, and partly in (multiple) off-premise clouds.
With the advent of the Internet of Things (IoT), the need to share data between applications, sensors, infrastructure, and people (specifically at the edge) will only increase. This raises fundamental questions about how we develop scalable distributed systems: How do we manage the flow of events (data flows)? How do we facilitate frictionless integration of new components into distributed systems and their various data flows, in a scalable manner? What primitives do we need to support the variety of protocols?
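One way to picture the kind of primitive those questions ask for is a minimal publish/subscribe event bus, where producers and consumers attach to named data flows without knowing about each other. This is only an illustrative in-process sketch — the names (`EventBus`, `subscribe`, `publish`, the `"temperature"` flow) are hypothetical and not from any Cisco product:

```python
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    """Minimal in-process pub/sub sketch: components attach to named data
    flows (topics), so new components can be integrated without modifying
    existing producers or consumers."""

    def __init__(self) -> None:
        # flow name -> list of handler callables
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, flow: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[flow].append(handler)

    def publish(self, flow: str, event: Any) -> None:
        # Deliver the event to every handler registered on this flow.
        for handler in self._subscribers[flow]:
            handler(event)

# An edge sensor publishes readings; an on-premise alerting app and a
# cloud uploader both consume the same flow, independently.
bus = EventBus()
received = []
bus.subscribe("temperature", lambda e: received.append(("alerting", e)))
bus.subscribe("temperature", lambda e: received.append(("cloud-sync", e)))
bus.publish("temperature", {"sensor": "edge-42", "celsius": 21.5})
print(received)
```

The frictionless-integration property shows up in the last few lines: adding the cloud uploader required no change to the sensor or to the alerting consumer, only a new `subscribe` call. Real distributed data-flow systems add durability, ordering, and protocol bridging on top of this basic shape.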