Whenever I hear about a serious train accident, mugging or shootout on the streets of a city, my thoughts often turn to Fog Computing. The same is true when I'm stuck idling in a traffic jam, or when a power outage hits home during a winter storm or a summer heat wave.
Why do I think about Fog Computing? Well, my job at Cisco is not only to identify the latest disruptive Internet of Things (IoT) technologies, but also to validate where they might be applied to improve overall quality of life. Whether it’s drones, artificial intelligence or robotics, my passion is to accelerate the art of the possible.
Consider Fog Computing. Fog extends cloud computing to the edge of the network. This provides a virtualized platform for compute, storage and network services between devices and data storage centers in the cloud. Because of its low latency, location awareness, real-time interactions and wide geographic distribution, Fog Computing can sense and respond to situations in the real physical world almost instantly.
The speed and power of Fog to connect people, data, processes and things opens up a new world of practical solutions. For example, Fog Computing, when combined with sensors and wireless networks, can immediately alert the train operator as soon as there is trouble on the tracks, such as a slow-walking pedestrian or a stalled vehicle. With Fog, energy loads can be automatically re-balanced or re-routed to alternative sources during spikes in demand or low availability.
In a Smart+Connected Community, acoustic sensors deployed around streets and connected to Fog Computing infrastructure can identify gunshots, perpetrators, victims, accidents, or even cries for help with high accuracy, while also alerting the appropriate authorities.
Tags: Biren Gandhi, Cisco, cloud, Cloud Computing, Fog computing, innovation, Intel, Internet of Everything, internet of things, Smart Cities, Smart+Connected Communities
When we think of “cloud” we think of a vast collection of compute, network, and storage capabilities that resides somewhere high above us—a massive repository of functionality that can be accessed from anywhere and any device with enough bandwidth to handle the data flow.
With practically unlimited power and scalability, cloud technology has been a key enabler of the Internet. But the Internet of Things (IoT) demands something more. IoT is a broad collection of sensors, cameras, smartphones, computers, and machines—all connected to and communicating with applications, websites, social media, and other devices. To maximize value, much of the data generated by these “things” must be processed and analyzed in real time. For example, sensors and cameras in and around a large retail store may continuously collect data about customer volume and traffic flow. The store can derive some value from all this data by sending it back to the cloud to analyze long-term trends. But the value is multiplied if the system can process the data locally, in real time, and then act on it immediately by sending more cashiers to the check-out line just before a surge in customer traffic.
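The retail example above can be sketched as a simple edge-side decision loop: keep a short rolling window of recent customer counts, and act locally the moment the latest reading spikes past the recent average. This is an illustrative sketch only; the class name, window size and surge threshold are assumptions, not part of any real store system.

```python
from collections import deque

# Hypothetical edge-side surge detector. Names and thresholds are
# illustrative, not drawn from any real retail deployment.
class SurgeDetector:
    def __init__(self, window=5, threshold=1.5):
        self.counts = deque(maxlen=window)   # recent per-minute customer counts
        self.threshold = threshold           # surge = latest count vs. window average

    def observe(self, count):
        """Record a new per-minute count; return True if it signals a surge."""
        surge = (
            len(self.counts) == self.counts.maxlen
            and count > self.threshold * (sum(self.counts) / len(self.counts))
        )
        self.counts.append(count)
        return surge

detector = SurgeDetector()
readings = [10, 11, 9, 10, 10, 25]           # camera-derived counts per minute
alerts = [detector.observe(r) for r in readings]
# only the final reading (25, vs. a recent average of 10) triggers an alert
```

The point of the sketch is where the decision is made: the loop runs on a fog node next to the cameras, so the "send more cashiers" action fires immediately, while the raw counts can still be shipped to the cloud later for long-term trend analysis.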
This sort of real-time, high-bandwidth application requires a new distributed cloud model that brings cloud networking, compute, and storage capabilities down to earth—to the very edge of the network. My friend Flavio Bonomi has worked tirelessly with both academia and other industry partners to advance the concept of fog, inspired by the way the San Francisco fog extends the cloud to the ground. Fog computing creates a platform—what we call a fog node—that provides a layer of compute, storage, and networking services between end devices “on the ground” and cloud computing data centers. Fog is not a separate architecture; it merely extends the existing cloud architecture to the edge of the network—as close to the source of the data as possible—to enable real-time data processing and analytics.
Tags: Cisco, Connected Vehicles, edge computing, Flavio Bonomi, Fog computing, Internet of Everything, internet of things, IoE, IoT, Maciej Kranz
Sitting in traffic the other day, I turned off my Peter Frampton Spotify channel and started to listen to New York area local AM radio. The president of a mortgage business came on in an advertisement. He ended his pitch with the tag line “I promise you the best service humanly possible.”
This caught my attention. At one time I would have viewed this as a very positive statement. But is this still the case?
Consumers consistently cite the “indifference of one person” as a key reason they leave a supplier relationship (source: TARP).
The contact center industry has long counted on non-human software applications that attempt to emulate human interaction. These range from simple “if-then” routing schemes to sophisticated avatars that emulate human engagement (just “Ask Jenn” at Alaska Airlines – http://www.alaskaair.com/content/about-us/site-info/ask-jenn.aspx).
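An “if-then” routing scheme of the kind mentioned above can be sketched in a few lines. The queue names, caller fields and rules below are hypothetical, purely to illustrate how mechanical this class of logic is compared with genuine human engagement.

```python
# Minimal sketch of an "if-then" contact-center routing scheme.
# Queue names, fields, and rules are hypothetical.
def route_call(caller):
    """Return a queue name for a caller dict using simple if-then rules."""
    if caller.get("vip"):
        return "priority-queue"
    if caller.get("intent") == "billing":
        return "billing-queue"
    if caller.get("language", "en") != "en":
        return "multilingual-queue"
    return "general-queue"

queue = route_call({"intent": "billing", "language": "en"})  # "billing-queue"
```

Each rule fires in order and the first match wins; anything the rules don't anticipate falls through to a catch-all queue, which is exactly where such systems stop short of the “best service humanly possible.”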
Tags: Cisco, collaboration, customer experience, internet of things
Once upon a time, the world’s greatest inventions always seemed to come from individual geniuses locked in a room day and night on their own. We often think of Alexander Graham Bell inventing the telephone by himself, Thomas Edison inventing the light bulb solo or Johannes Gutenberg working mostly alone to develop a mold that led to the first printing press.
Solo inventors will always play pivotal roles in developing “the next big thing,” even now that we’re halfway through the second decade of the 21st century. Think Mark Zuckerberg masterminding Facebook on his own in his dorm room at Harvard.
More and more, however, we’re discovering that in today’s Internet of Everything world, where complex technologies increasingly connect and converge, innovation hinges on all types of hyper-collaboration. Today, innovation requires open interaction among businesses, universities, startups, incubators, developers and others. Now, collaboration makes innovation happen!
Tags: Alex Goryachev, Cisco, innovation, Internet of Everything, internet of things, IoE, IoE Innovation Center, IoT
Traditional to Big Data to IoT: Transaction Processing Performance Council Establishes Internet of Things Working Group (TPC-IoT)
Over the past quarter century, the Transaction Processing Performance Council (TPC) has developed several industry standard benchmarks for database performance, pretty much in line with major technology trends. The two most influential benchmark standards have been TPC-C (the standard for benchmarking transaction processing systems), introduced in 1991, and TPC-D and its successor TPC-H (the standards for benchmarking decision support systems), introduced in 1994. These standards have been a significant driving force behind the development and advancement of several database, server and storage related technologies. In addition, the TPC laid a solid foundation for complete system-level performance measurement, and a methodology for calculating total system price and price-performance, both of which have been widely adopted in the industry.
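The price-performance methodology mentioned above boils down to simple arithmetic: divide the total system price by the reported performance metric, so that lower values are better. The figures below are made up for illustration, not published TPC results.

```python
# Illustrative price-performance arithmetic in the spirit of TPC metrics.
# The prices and throughput numbers here are invented examples.
def price_performance(total_system_price, performance_metric):
    """Dollars per unit of performance: lower is better."""
    return total_system_price / performance_metric

# A hypothetical system priced at $500,000 achieving 250,000 tpmC
ratio = price_performance(500_000, 250_000)   # 2.0 dollars per tpmC
```

Publishing the total system price alongside raw throughput is what keeps vendors from winning a benchmark with an unaffordable configuration; the ratio, not the peak number, is the comparable figure.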
There is no doubt that industry and technology landscapes have changed and continue to change at a fast pace. Two of the technologies that will change the world over the next 10 years are Big Data and the Internet of Things (IoT).
Big Data: Big Data is a popular term describing the exponential growth of data, often defined by the 5Vs, and the associated technologies to store and process it effectively and drive business value. The Big Data technology and services market is one of the fastest-growing, multi-billion-dollar worldwide markets, expected to grow to $60 billion and to drive $300 billion in worldwide IT spending, directly or indirectly, by 2020.
Foreseeing its importance, in 2014 the TPC developed TPC Express Benchmark HS (TPCx-HS) to provide the industry with verifiable performance, price-performance, and availability metrics for hardware and software systems dealing with Big Data. The standard can be used to assess a broad range of system topologies and Hadoop implementations in a technically rigorous, directly comparable, and vendor-neutral manner. This is a first major step, and the TPC continues to enhance and develop new standards in this area, such as TPC-DS with support for Hadoop and TPC-Big Bench.
Internet of Things (IoT): IoT has emerged in the last few years, poised to transform virtually every major market segment. It encompasses a complex mix of technologies and products, from data collection and data curation to complex analytics exploiting the data generated by an exploding number of connected devices. According to IDC, the global IoT market will grow from $665 billion in 2014 to $1.7 trillion in 2020. To put that in perspective, it’s an absolutely enormous figure; only 16 economies in the world had gross domestic products exceeding $1 trillion in 2014.
As the IoT ecosystem evolves in the enterprise, it is imperative to have a set of standards that enables effective comparison of hardware and software systems and topologies in a technology- and vendor-neutral manner. Continuing its commitment to bringing relevant standards to the industry, the TPC today announced the formation of the TPC-IoT benchmark committee, tasked with developing industry-standard benchmarks for hardware and software platforms associated with IoT.
We’d like to connect with companies, research institutions, and government institutions to ensure a holistic perspective during the benchmark development process. Anyone interested in our efforts can visit our membership page.
Tags: Big Data, internet of things, IoT