In Part 1 of this blog series, I talked about how data integration provides a critical foundation for capturing actionable insights that generate improved outcomes. Now, in Part 2, I’ll focus on the two other challenges that must be met to extract value from data: 1) automating the collection of data, and 2) analyzing the data to effectively identify business-relevant, actionable insights. This is where things, data, processes, and people come together.

Let’s start with automation.

After IoT data is captured and integrated, organizations must get the data to the right place at the right time (and to the right people) so it can be analyzed. This includes automatically assessing the data to determine whether it needs to be moved to the “center” (a data center or the cloud) or analyzed where it is, at the “edge” of the network (“moving the analytics to the data”).

Analytics at the Edge

The edge of the network is essentially the place where data is captured, and it could be almost anywhere: a manufacturing plant floor, a retail store, even a moving vehicle. The “center” of the network, by contrast, refers to offsite locations such as the cloud and remote data centers, where data is transmitted for offsite storage and processing, usually for traditional reporting purposes.

In “edge computing,” therefore, applications, data, and services are pushed to the logical extremes of a network, away from the center, to enable analytics, knowledge generation, and immediate decision-making at the source of the data.

Maximum value comes from employing a combination of edge computing and the “center” (data center or cloud), not one or the other. Organizations, therefore, require a connected infrastructure that enables insight from the data center to the edge. Here are some important considerations that typically drive the need for edge computing/analytics:

  • Performance requirements: Does the use case demand low latency? Where latency requirements are tight, the data often must be processed at or near its source rather than sent to the center.
  • Data preprocessing opportunities: In many instances, it will not be appropriate to transmit all the data generated by a particular solution to the cloud for processing. It may make sense to process or compress data before transmitting it to the cloud, or to transmit only select data (e.g., anomalies, exceptions, averages). In other words, only enriched information, rather than raw data sets, is sent from the edge to the center (see the sketch after this list).
  • Highly distributed applications: Some applications (e.g., pipeline monitoring, connected oil rigs, smart grid) may involve a high degree of distribution, making processing at the edge more attractive.
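
To make the preprocessing point concrete, here is a minimal sketch in Python of the pattern described in the second bullet: raw readings are buffered at the edge, and only window averages plus flagged anomalies are forwarded to the center. The window size, anomaly threshold, simulated sensor, and function names are all illustrative assumptions, not part of the Cisco analysis.

```python
import random
import statistics

# Hypothetical parameters -- illustrative values only.
WINDOW_SIZE = 60        # raw samples per summary window (e.g., one reading per second)
ANOMALY_SIGMA = 3.0     # flag readings more than 3 standard deviations from the window mean

def summarize_window(readings):
    """Reduce a window of raw readings to the 'enriched information' worth transmitting:
    the window average plus any anomalous samples."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings) or 1e-9   # avoid division by zero on flat windows
    anomalies = [r for r in readings if abs(r - mean) / stdev > ANOMALY_SIGMA]
    return {"average": round(mean, 2), "anomalies": anomalies}

def edge_loop(sensor_stream, send_to_center):
    """Buffer raw samples at the edge and forward only the summaries to the center."""
    window = []
    for reading in sensor_stream:
        window.append(reading)
        if len(window) == WINDOW_SIZE:
            send_to_center(summarize_window(window))
            window.clear()

if __name__ == "__main__":
    # Simulated sensor: readings near 20.0 with an occasional large spike.
    stream = (20 + random.gauss(0, 0.5) + (15 if random.random() < 0.01 else 0)
              for _ in range(600))
    edge_loop(stream, send_to_center=print)   # stand-in for an MQTT/HTTP publish to the center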

To witness the economic advantages of edge computing/analytics, look no further than the security and video analytics cameras employed by a typical retail store. By processing these cameras’ data locally — at the edge — rather than pushing it to a centralized data center, the store can reduce the overall load on its network. According to analysis conducted by economics colleagues in Cisco Consulting Services, for a retail store with $20 million in annual sales and 100 security and video analytics cameras, edge computing/analytics can deliver annualized savings of $33,800 — and a 1.7 percent annual EBIT increase — versus employing a traditional data center/cloud computing approach.
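
To put those figures in perspective: a 1.7 percent EBIT lift from $33,800 in annual savings implies annual EBIT of roughly $33,800 / 0.017 ≈ $2 million, or an EBIT margin of about 10 percent on the store’s $20 million in sales.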

IT and operational technology (OT) leaders appear to understand the growing importance of edge computing — particularly when it comes to the Internet of Things (IoT). In fact, in a recent survey conducted by Cisco Consulting Services, nearly 40 percent of respondents indicated that within the next three years, “most” of the data produced by their IoT solutions will be processed at the edge of the network using intelligent devices and appliances.

Whether it is in the cloud or at the edge, IoT data must be analyzed to identify actionable insights that can be used to create better outcomes (such as from process optimization or improved customer engagement). Without this critical step, data remains just “data.”
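
Continuing the earlier sketch, here is an equally simplified, hypothetical illustration of that analysis step: a period’s worth of edge summaries is reduced to a single business-relevant recommendation, such as scheduling a maintenance inspection (process optimization in miniature). The decision rule and thresholds are made up for the example.

```python
from typing import Dict, List

# Hypothetical decision rule with made-up thresholds -- not an actual analytics pipeline.
ANOMALY_BUDGET = 3   # number of summaries containing anomalies before we act
DRIFT_LIMIT = 1.5    # allowed change in the average reading across the period

def recommend_action(summaries: List[Dict]) -> str:
    """Turn a period's worth of edge summaries (as produced by the earlier sketch)
    into a business-relevant recommendation."""
    windows_with_anomalies = sum(1 for s in summaries if s["anomalies"])
    drift = abs(summaries[-1]["average"] - summaries[0]["average"])
    if windows_with_anomalies >= ANOMALY_BUDGET or drift > DRIFT_LIMIT:
        return "Schedule a maintenance inspection"   # the actionable insight
    return "No action required"

if __name__ == "__main__":
    day = [
        {"average": 20.1, "anomalies": []},
        {"average": 20.3, "anomalies": [35.2]},
        {"average": 21.0, "anomalies": [34.8]},
        {"average": 21.9, "anomalies": [36.1]},
    ]
    print(recommend_action(day))  # -> Schedule a maintenance inspection
```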

Organizations, however, often lack analytical capabilities due to an absence of both the skill sets (such as those possessed by data scientists) and tools to deal with the exploding size, speed, variety, and distribution of data.

To maximize their ability to capture actionable data insights, organizations will need to prepare for the workforce of the future — one that can drive the transformational opportunities promised by IoT and data, with competencies aligned to industry-specific concerns and outcomes.

Already, we are witnessing tremendous interest from those looking to enter these areas of opportunity. For example, the online Big Data course taught by MIT’s Computer Science and Artificial Intelligence Laboratory attracted more than 3,500 students (including some members of my team) from 88 countries for its inaugural session in 2013.

The greatest value, however, will come from employees whose knowledge intersects data science, design, and enterprise architecture. To deliver true value, data insights must link to specific business processes and outcomes.

To help enable this linkage, the role of chief data officer (CDO) is becoming increasingly common in organizations around the world. CDOs are essentially responsible for determining how data can be used across an organization and its operational environment to drive better business outcomes. Gartner predicts that 25 percent of large global organizations will have appointed CDOs (also known as “Big Data Czars”) by January 2015.

When combined with data integration, automation and analytics round out the capabilities required to turn raw data into actionable insights that have the power to change outcomes for private and public sector organizations.

 



Authors

Nicola Villa

Managing Director, Global Analytics Practice

Cisco Consulting Services