The Internet of Everything (IoE) is a juggernaut of change, transforming organizations in profound ways. It sows disruption, and it grants enormous opportunities. But this sweeping wave of change is not reserved for what we normally think of as “technology companies.” In the IoE economy, even seemingly “analog” endeavors must be endowed with network connectivity, no matter how venerable a company’s roots or how old its traditions.
In a world where Everyone Is a Tech Company, there are some great examples of older companies that are heeding this new reality. Retail, manufacturing, transportation, and education are just a few of the places where people, process, data, and things are being connected in startling new ways. Companies that are ahead of the IoE transformation curve will ensure their competitiveness in marketplaces that are ever more vulnerable to disruption.
Dundee Precious Metals provides a great example of a company that is embracing change. A far-flung global organization, the company runs Europe’s largest mine in Chelopech, Bulgaria, from which it ships gold-rich copper ore to a smelter in Namibia. Yet through IoE-related technologies, executives at the company’s headquarters in Toronto, Canada, have gained unprecedented visibility into all aspects of their operations.
The end result? Gains in safety, efficiency, and productivity.
Tags: Big Data, canada, Chelopech, Cisco, Cisco Consulting Services, Dundee Precious Metals, employee productivity, Fast IT, Future of IT, innovation, Internet of Everything, internet of things, IoE, IoE Value Index, IoT, job creation, Manufacturing, Mark Gelsomini, mining, Namibia, Network programmability, toronto, value at stake
By now it is clear that big data analytics opens the door to unprecedented opportunities for business innovation, customer retention, and profit growth. However, a shortage of data scientists is creating a bottleneck as organizations move from early big data experiments into larger-scale adoption. This constraint limits big data analytics and the positive business outcomes that could be achieved.
It’s All About the Data
As every data scientist will tell you, the key to analytics is data. The more data the better, including big data as well as the myriad other data sources both in the enterprise and across the cloud. But accessing and massaging this data, in advance of data modeling and statistical analysis, typically consumes 50% or more of any new analytic development effort.
• What would happen if we could simplify the data aspect of the work?
• Would that free up data scientists to spend more time on analysis?
• Would it open the door for non-data scientists to contribute to analytic projects?
SQL is the key. Because of its ease and power, it has been the predominant method for accessing and massaging data for the past 30 years. Nearly all non-data scientists in IT can use SQL to access and massage data, but very few know MapReduce, the programming model traditionally used to access data in Hadoop.
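To make that skills gap concrete, here is a small, self-contained sketch (illustrative only; the table, column, and function names are invented, and this is not any vendor’s tooling). It computes the same per-page byte totals twice: once as a single declarative SQL statement, and once as the explicit map and reduce phases a Hadoop developer would hand-code.

```python
import sqlite3
from collections import defaultdict

# Sample "web log" records: (page, bytes_served)
records = [("home", 120), ("search", 80), ("home", 200), ("search", 50)]

# --- The SQL way: one declarative statement ---
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (page TEXT, bytes INTEGER)")
conn.executemany("INSERT INTO logs VALUES (?, ?)", records)
sql_totals = dict(conn.execute(
    "SELECT page, SUM(bytes) FROM logs GROUP BY page"))

# --- The MapReduce way: explicit map, shuffle, and reduce phases ---
def map_phase(record):
    page, nbytes = record
    yield (page, nbytes)                 # emit key/value pairs

def reduce_phase(pairs):
    totals = defaultdict(int)
    for key, value in pairs:             # the "shuffle" groups by key
        totals[key] += value
    return dict(totals)

mr_totals = reduce_phase(p for r in records for p in map_phase(r))

assert sql_totals == mr_totals           # same answer, very different skill sets
print(sql_totals)                        # {'home': 320, 'search': 130}
```

The two approaches produce identical results, but the first is one line of SQL that almost any IT professional can write, while the second demands a programming mindset far fewer analysts have.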
How Data Virtualization Helps
“We have a multitude of users…from BI to operational reporting, they are constantly coming to us requesting access to one server or another…we now have that one central place to say ‘you already have access to it’ and they immediately have access rather than having to grant access outside of the tool” – Jason Hull, Comcast
Data virtualization offerings, like Cisco’s, can help organizations bridge this gap and accelerate their big data analytics efforts. Cisco was the first data virtualization vendor to support Hadoop integration with its June 2011 release. This standardized SQL approach augments specialized MapReduce coding of Hadoop queries. By simplifying access to Hadoop data, organizations could for the first time use SQL to include big data sources, as well as enterprise, cloud and other data sources, in their analytics.
In February 2012, Cisco became the first data virtualization vendor to enable MapReduce programs to easily query virtualized data sources, on-demand with high performance. This allowed enterprises to extend MapReduce analyses beyond Hadoop stores to include diverse enterprise data previously integrated by the Cisco Information Server.
In 2013, Cisco maintained its big data integration leadership with updates of its support for Hive access to the leading Hadoop distributions including Apache Hadoop, Cloudera Distribution (CDH) and Hortonworks (HDP). In addition, Cisco now also supports access to Hadoop through HiveServer2 and Cloudera CDH through Impala.
Others, beyond Cisco, recognize this beneficial trend. In fact, Rick van der Lans, noted data virtualization expert and author, recently blogged on future developments in this area in “Convergence of Data Virtualization and SQL-on-Hadoop Engines.”
So if your organization’s big data efforts are slowed by a shortage of data scientists, consider data virtualization as a way to break the bottleneck.
Tags: apache, Big Data, Cisco Data Center, Cisco Data virtualization, Cloudera, Composite Software, data integration, data virtualization, Hadoop, HiveServer2, Hortonworks, mapreduce, query, SQL, video
Not all workdays begin with a convoy of cyclists hailing from India, Saudi Arabia, Europe, and America. And fewer still wind up with creations made of LEGOs, spaghetti, string, and marshmallows.
Yet every workday — no matter how challenging — should have the same spirit of diversity, adventure, and assumption-busting repartee that I experienced at THNK — The Amsterdam School of Creative Leadership.
Once our Cisco Consulting Services colleagues finished winding through the streets of central Amsterdam each morning, we got down to the serious business of “hacking” some key global issues, together with our friends at THNK.
One of those issues has evolved into a Cisco/THNK partnership challenge, in which we will share Cisco’s expertise on the Internet of Everything (IoE) to solve some global problems around food safety and food distribution. I will speak more about the Internet of Food initiative in a subsequent blog.
Another key challenge was to foster digital disruption in the Internet of Everything (IoE) age — a time when our enterprise customers, and especially their end users, are demanding rapid transformation.
That level of change stems from the kind of open innovation and inclusive creative processes promoted by THNK in Amsterdam. Those processes are also being embraced by Cisco at our innovation hubs in such places as Rio de Janeiro, Toronto, and Songdo, South Korea. At these centers, IoE cornerstones such as cloud, mobility, Big Data analytics, and social media are already enabling digital disruption — and will continue to accelerate it.
Tags: Big Data, Cisco, Cisco Consulting Services, cloud, connected supply chain, customer experience, design thinking, food distribution, Internet of Everything, IoE, Problem solving
If you’re an Operations Technology (OT) pro, then the buzz about the Internet of Everything (IoE) should have you pretty excited, because it will likely impact your work. You won’t want to miss the chance to find out more at Cisco Live San Francisco, May 18–22.
Cisco has been hard at work building solutions to address your OT challenges. Cisco Live San Francisco is the place to find out the details…
Here are five reasons not to miss this pivotal event:
#1. A Targeted OT Learning Track: We’ve put together a special program to bring OT and IT issues together and make it crystal clear how the Internet of Everything (IoE), the convergence of people, process, data, and things, is going to make your job a lot more interesting.
Tags: Big Data, Cisco, cisco live, Cisco Live! San Francisco, city network, Internet of Everything, internet of things, IoE, IoT, Operations technology, OT, S+CC, Smart Cities, Smart+Connected Communities, urban services, wi-fi
Data, Data Everywhere!
The challenge of making business decisions in a networked world isn’t a lack of data. It’s that data resides in multiple systems and global locations, locked away in spreadsheets and in people’s heads.
Almost every enterprise faces this data-silo challenge to a greater or lesser degree. But how a business addresses it makes the difference between becoming a market leader and an “also-ran.” The fact is, better information leads to better decisions and better business outcomes. Harvard Business Review (“Big Data: The Management Revolution,” October 2012) reported that data-driven companies are 5 percent more productive and 6 percent more profitable than their competitors.
Being able to easily access and use vast data stores has always been difficult. But in just the past few years, the problem has become 10 times worse. If it was just more data, then more compute and database horsepower could fix it. The bigger issues for businesses are proliferating data silos and ever-expanding distribution.
Data Virtualization to the Rescue
Industry-leading businesses are addressing the challenge with data virtualization. Data virtualization is an agile data integration approach that organizations use to:
- Gain more insight from their data
- Respond faster to accelerating analytics and business intelligence requirements
- Reduce costs by 50 to 75 percent compared to data replication and consolidation approaches
Data virtualization abstracts data from multiple sources and transparently brings it together to give users a unified, friendly view of the data that they need.
Armed with quick and easy access to critical data, users can analyze it with their favorite business intelligence and analytic tools to drive a wide range of business outcomes: increasing customer profitability, bringing products to market faster, reducing costs, and lowering risk.
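As a toy illustration of the idea (using SQLite, not any specific data virtualization product; the table and view names are invented), the “virtual” layer can be thought of as a view that joins sources at query time, without replicating any rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Two tables standing in for separate source systems, e.g. a CRM and a
# billing warehouse. (A real data virtualization layer federates live,
# remote sources; SQLite here just keeps the sketch self-contained.)
conn.executescript("""
CREATE TABLE crm_customers (id INTEGER, name TEXT);
CREATE TABLE billing_invoices (customer_id INTEGER, amount REAL);
INSERT INTO crm_customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO billing_invoices VALUES (1, 250.0), (1, 100.0), (2, 75.0);

-- The "virtual" layer: a view that joins the sources on demand.
-- No data is copied; the join runs whenever a user queries the view.
CREATE VIEW customer_revenue AS
  SELECT c.name, SUM(b.amount) AS revenue
  FROM crm_customers c
  JOIN billing_invoices b ON b.customer_id = c.id
  GROUP BY c.name;
""")

rows = conn.execute(
    "SELECT name, revenue FROM customer_revenue ORDER BY name").fetchall()
print(rows)  # [('Acme', 350.0), ('Globex', 75.0)]
```

Because consumers query only the view, the underlying sources can move or change format without disturbing the reports and tools built on top, which is the agility the approach promises.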
To read more about what data virtualization might mean to your enterprise, check out our new white paper, “Data Virtualization: Achieve Better Business Outcomes, Faster.”
Tags: Big Data, Cisco, data virtualization, white paper