I caught up with Stewart Young, Global Alliance Manager at OSIsoft LLC, a Cisco partner, to find out more about ‘Edge Computing’, or, as some call it, ‘Fog Computing’. With the huge amount of data coming off industrial sensors and outlying infrastructure, customers are looking for ways to rationalize that data while extracting information they can turn into business intelligence.
As we find out in the “A New Reality for Oil & Gas” thought leadership report I contributed to:
“The oil and gas industry provides a prime example of the need for “edge computing.” A typical offshore oil platform generates between 1TB and 2TB of data per day.1 Most of this data is time-sensitive, pertaining to platform production and drilling-platform safety. The most common communication link for offshore oil platforms is transmitting data via a satellite connection, with data speeds ranging from 64Kbps to 2Mbps. This means it would take more than 12 days to move one day’s worth of oil-platform data to a central repository.”
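A quick back-of-the-envelope calculation (my own illustration, not from the report) shows why that quoted figure holds up: even at the fastest cited satellite rate of 2 Mbps, moving a single terabyte takes well over 12 days.

```python
def transfer_days(data_tb: float, link_mbps: float) -> float:
    """Days needed to move data_tb terabytes (decimal TB) over a link_mbps link."""
    bits = data_tb * 1e12 * 8            # TB -> bits
    seconds = bits / (link_mbps * 1e6)   # link rate in bits per second
    return seconds / 86400               # seconds -> days

# One day's worth of platform data (1 TB) over the fastest cited link (2 Mbps)
print(round(transfer_days(1.0, 2.0), 1))  # -> 46.3 days
```

At the low end of the cited range (64 Kbps), the same transfer stretches into years, which is why backhauling everything simply isn't an option.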
There’s a better, more efficient, more ‘digitized’ way: analyze the data in real time at the edge of the network. Flag anomalies and out-of-line situations, and send on to the central repository only what’s needed for decision making and for the historian. Cisco equipment and solutions are getting even more intelligent, so that they can help do this. That’s thanks, in part, to IOx.
What Stewart is showing is how that works in real life. The OSIsoft PI connector runs on IOx on the edge routing equipment. That way ‘lightweight’ analytics (looking only for the key anomalies) can be done. And it can be done right next to where the data is generated – in harsh environments, next to oil rigs, refineries, and sensor networks. Products like the Cisco GSR and the 8X9 series have these capabilities, and you’ll see more IOx-enabled products and solutions over time, with Cisco working with other partners too.
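To make ‘lightweight analytics’ concrete, here’s a minimal, hypothetical sketch of the idea in plain Python (this is my own illustration of the pattern, not the actual PI connector API): keep the raw stream local and forward upstream only the readings that fall outside an expected band.

```python
# Hypothetical edge filter: forward only out-of-band (anomalous) readings.
# The thresholds and sample data below are invented for illustration.

def filter_at_edge(readings, low, high):
    """Return only the (timestamp, value) pairs outside the expected band."""
    return [(t, v) for t, v in readings if v < low or v > high]

# e.g. wellhead pressure samples (timestamp, psi); expected band 1800-2200 psi
samples = [(0, 1950), (1, 2050), (2, 2600), (3, 1990), (4, 1500)]
anomalies = filter_at_edge(samples, low=1800, high=2200)
print(anomalies)  # only (2, 2600) and (4, 1500) get sent upstream
```

Real deployments would of course use richer rules (rolling statistics, rate-of-change, equipment-specific models), but the principle is the same: five readings in, two readings out, and the satellite link carries only what matters.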
When I asked Stewart to elaborate on the business benefits (no point in reading this if there aren’t any, right?!), he explained that customers are able to expand the sources of data they collect further out in the field, plant, rig, or refinery, giving more visibility into what’s happening in real time across the organizational infrastructure. They’re also able to do some of the analysis sooner, without passing it all back to a central processing environment. So:
- Better visibility in real time into operations
- Broader reach for analytics into remote areas
- Faster local analysis and response
Coming back to the thought leadership report “A New Reality for Oil & Gas”, here’s what IDC and Gartner are saying:
IDC forecasts that, with a business case built on predictive analytics and optimization in drilling, production, and asset integrity, 50 percent of oil and gas companies will have advanced analytics capabilities in place by 2016. As a result, IDC believes that O&G CEOs, for example, will expect immediate and accurate information about top shale opportunities to be available by the end of 2015, improving asset value by 30 percent.2
According to Gartner, O&G firms’ ability to leverage analytics to reduce operating costs and increase production rates “may be an essential survival skill for upstream companies.”3 Gartner mentioned several new analytics methods that are already benefiting the performance of subsurface activities:
- Digital completion technologies are boosting ultimate recovery rates for unconventional reservoirs from 3-5 percent to 12-16 percent, vastly improving those assets’ competitiveness.
- Advanced sensor technologies such as down-hole fiber generate high-resolution reservoir data for conventional assets, enabling more accurate modeling, simulation, and decision-making.
- Expanded integration of real-time data from field sensors (old and new) with the reservoir model is enabling more robust 4D modeling and, in turn, more dynamic reservoir management.