Analytics for Oil and Gas – real-time data enabling real-time intelligence and response

Andrew Miller, Sr. Sales Engineer from Bit Stew Systems, showed me an oil and gas use case a short while ago. He was using real-time analytics to demonstrate how an energy company could monitor and respond to a natural disaster that might affect its oil and gas pipeline infrastructure. I was impressed by the visualizations he showed me and by how he could interact with an on-screen map showing real-time analytical data fed in from a number of sources.

The demo shows some critical assets in the service territory in the northwest of North America, including pipelines, block valve stations, initial injection stations, pumps, regulators and the like. We were able to see plenty of detail on the assets, along with their real-time status. Andrew simulated a natural disaster event (a fire); in a live deployment that information would generally come from a real-time data feed and be overlaid on the map. We then looked at the affected assets and which ones were at risk. The really cool thing was the modeling of where the fire might spread (based on weather). Response teams could be dispatched in real time to tackle the situation, and the situation could be monitored so that crews were sent to where the fire was likely to spread. Crew locations could be seen, again in real time.
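
For a sense of how the "at risk" piece might work, here's a minimal sketch that flags assets sitting in a crude downwind corridor from a fire location. It's a toy illustration only, not Bit Stew's model; the asset names, coordinates, wind bearing, and corridor dimensions are all invented for the example.

```python
# Toy illustration (not Bit Stew's model): flag pipeline assets that lie in a
# simple wind-driven spread corridor downwind of a fire. Asset names, coordinates,
# corridor width, and spread distance are made-up assumptions.
import math

ASSETS = {
    "block_valve_7":  (50.12, -122.95),   # (lat, lon) - hypothetical coordinates
    "pump_station_3": (50.30, -122.60),
    "regulator_12":   (49.98, -123.10),
}

def at_risk(fire, wind_bearing_deg, spread_km, corridor_km, assets):
    """Return assets inside a crude downwind corridor from the fire location."""
    flagged = []
    for name, (lat, lon) in assets.items():
        # Approximate local offsets in km (fine for short distances).
        dy = (lat - fire[0]) * 111.0
        dx = (lon - fire[1]) * 111.0 * math.cos(math.radians(fire[0]))
        # Project the offset onto the wind direction (bearing measured from north).
        theta = math.radians(wind_bearing_deg)
        along = dx * math.sin(theta) + dy * math.cos(theta)          # downwind distance
        across = abs(-dx * math.cos(theta) + dy * math.sin(theta))   # lateral offset
        if 0 <= along <= spread_km and across <= corridor_km:
            flagged.append(name)
    return flagged

print(at_risk(fire=(50.05, -123.00), wind_bearing_deg=45,
              spread_km=40, corridor_km=10, assets=ASSETS))
```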

Bit Stew’s MIx Director™ (formerly Grid Director™) works with Cisco IOx and enables industrial companies to discover actionable insights that optimize operational performance. Andrew talks about how it leverages Cisco IOx, so that hardened infrastructure such as Cisco CGR or ISR solutions can be deployed to enable ‘fog computing’, or edge computing as it’s sometimes called.

IOx enables the analytics to run closer to the edge of the network, closer to the sensors and devices that energy companies want to monitor. That relieves the pressure on the central IT sites and allows faster analysis at the source.


Foggy weather is coming to a place near you with analytics at the edge

I caught up with Stewart Young, Global Alliance Manager at OSIsoft LLC, a Cisco partner, to find out more about ‘Edge Computing’, or, as some call it, ‘Fog Computing’. With the huge amount of data coming off industrial sensors and outlying infrastructure, customers are trying to find more ways to rationalize the data while extracting information that they can turn into business intelligence.

As we find out in the “A New Reality for Oil & Gas” Thought Leadership paper I contributed to:

“The oil and gas industry provides a prime example of the need for “edge computing.” A typical offshore oil platform generates between 1TB and 2TB of data per day.¹ Most of this data is time-sensitive, pertaining to platform production and drilling-platform safety. The most common communication link for offshore oil platforms is transmitting data via a satellite connection, with data speeds ranging from 64Kbps to 2Mbps. This means it would take more than 12 days to move one day’s worth of oil-platform data to a central repository.”
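
The back-of-the-envelope arithmetic is worth seeing laid out. Here’s a minimal sketch of the calculation; the exact day count depends, of course, on the assumed data volume and the effective throughput of the satellite link.

```python
# Rough transfer-time arithmetic for the figures quoted above (1-2 TB per day,
# 64 Kbps to 2 Mbps links). Effective throughput in practice is lower still once
# protocol overhead and link contention are taken into account.
def transfer_days(volume_tb: float, link_mbps: float) -> float:
    """Days needed to move `volume_tb` terabytes over a `link_mbps` link."""
    bits = volume_tb * 8e12                 # 1 TB = 8 * 10^12 bits (decimal TB)
    seconds = bits / (link_mbps * 1e6)      # link speed in bits per second
    return seconds / 86400                  # seconds per day

for volume in (1, 2):                       # one day of platform data, in TB
    for speed in (0.064, 2.0):              # 64 Kbps and 2 Mbps, expressed in Mbps
        days = transfer_days(volume, speed)
        print(f"{volume} TB over {speed} Mbps link: about {days:,.0f} days")
```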

There’s a better, more efficient, more ‘digitized’ way. Analyze the data in real time at the edge of the network. Take notice of anomalies and out-of-line situations. Send on to the central repository only what’s needed for decision making and for the historian. Cisco equipment and solutions are getting even more intelligent so that they can help do this. That’s thanks, in part, to IOx.

What Stewart is showing is how that works in real life. The OSIsoft PI connector runs on IOx on the edge routing equipment. That way, ‘lightweight’ analytics (just looking for the key anomalies) can be done right next to where it’s happening: in harsh environments, next to oil rigs, refineries, and sensor networks. Products like the Cisco CGR and the 8x9 products have these capabilities, and you’ll see more IOx-enabled products and solutions over time, with Cisco working with other partners too.
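
As a rough illustration of what ‘lightweight’ analytics at the edge can mean in practice, here’s a minimal sketch that keeps a rolling baseline per sensor and forwards only out-of-band readings to the central historian. It is not the OSIsoft PI connector; the sensor name, window size and thresholds are assumptions made up for the example.

```python
# Minimal edge "report the anomalies, keep the rest local" filter (illustrative only).
from collections import defaultdict, deque
from statistics import mean, pstdev

class EdgeFilter:
    def __init__(self, window=60, sigmas=3.0):
        # Rolling window of recent readings per sensor, plus an alert threshold.
        self.history = defaultdict(lambda: deque(maxlen=window))
        self.sigmas = sigmas

    def process(self, sensor, value):
        """Return an alert dict if the reading is worth back-hauling, else None."""
        hist = self.history[sensor]
        alert = None
        if len(hist) >= 10:                       # need a baseline before judging
            mu, sd = mean(hist), pstdev(hist)
            if sd > 0 and abs(value - mu) > self.sigmas * sd:
                alert = {"sensor": sensor, "value": value, "baseline": round(mu, 2)}
        hist.append(value)
        return alert

# Steady wellhead-pressure readings stay local; only the spike is forwarded.
f = EdgeFilter()
readings = [("wellhead_pressure", 101 + (i % 3) * 0.5) for i in range(30)]
readings.append(("wellhead_pressure", 140.0))
for sensor, value in readings:
    anomaly = f.process(sensor, value)
    if anomaly:
        print("forward to historian:", anomaly)
```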

When I asked Stewart to elaborate on the business benefits (no point in reading this if there aren’t any, right?!), he explained that he’s finding customers are able to expand the sources of data they’re collecting further out in the field or at the plant/rig/refinery, giving more visibility into what’s happening in real time across the organizational infrastructure. They’re also then able to do some of the analysis sooner, without having to pass it all back to a central processing environment. So:

  • Better visibility in real time into operations
  • Broader reach for analytics into remote areas
  • Faster local analysis and response



Coming back to the Thought Leadership report “A New Reality for Oil & Gas”, and what IDC and Gartner are saying:

IDC forecasts that, with a business case built on predictive analytics and optimization in drilling, production, and asset integrity, 50 percent of oil and gas companies will have advanced analytics capabilities in place by 2016. As a result, IDC believes that O&G CEOs, for example, will expect immediate and accurate information about top shale opportunities to be available by the end of 2015, improving asset value by 30 percent.²

According to Gartner, O&G firms’ ability to leverage analytics to reduce operating costs and increase production rates “may be an essential survival skill for upstream companies.” Gartner mentioned several new analytics methods that are already benefiting the performance of subsurface activities:

  • Digital completion technologies are boosting ultimate recovery rates for unconventional reservoirs from 3–5 percent to 12–16 percent, vastly improving those assets’ competitiveness.
  • Advanced sensor technologies such as down-hole fiber generate high-resolution reservoir data for conventional assets, enabling more accurate modeling, simulation, and decision-making.
  • Expanded integration of real-time data from field sensors (old and new) with the reservoir model is enabling more robust 4D modeling and, in turn, more dynamic reservoir management.



It’s Not Just the Connections, It’s the Applications

The Internet of Things (IoT) is connecting sensors, cameras, machines, and other devices at an amazing rate. But what drives the value of these digitized devices is not just the connections—it’s the applications that the connections enable. Think, for example, of a connected transportation system. It is not enough that buses have GPS and can connect to the Internet—what could really make a difference is an application that dynamically plans bus routes based on where people are, how long they have been waiting, and where they are going. That’s where the true value is.
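
To make the bus example a little more concrete, here’s a deliberately simple sketch of the kind of decision such an application might make: score each candidate stop by how many riders are waiting and how long they have waited, then dispatch to the highest-scoring stop. The stop names, weights and figures are invented for the illustration.

```python
# Toy "dynamic dispatch" scoring sketch - not a real transit-planning algorithm.
def next_stop(stops, wait_weight=1.0, rider_weight=2.0):
    """Pick the stop with the highest demand score."""
    def score(stop):
        return rider_weight * stop["riders_waiting"] + wait_weight * stop["max_wait_min"]
    return max(stops, key=score)

stops = [
    {"name": "Main & 1st", "riders_waiting": 4,  "max_wait_min": 6},
    {"name": "Harbour Rd", "riders_waiting": 12, "max_wait_min": 3},
    {"name": "Depot Loop", "riders_waiting": 1,  "max_wait_min": 18},
]
print("dispatch to:", next_stop(stops)["name"])   # -> Harbour Rd
```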

You might even say that applications are the reason we connect things and collect data from those things. So those of us who are building the IoT infrastructure must understand what application developers need, and then enable them to take advantage of the IoT infrastructure and the data it carries. This means we need more than open APIs—we must make it easy for an application to get the data it requires from the infrastructure and to provide input into the infrastructure.

Additionally, we need to respond to the changing ways people want to interact with the devices at the edge. Traditionally, a process engineer might control or program a production line using a fixed human-machine interface (HMI) screen physically attached to the production machinery. Today, there is a growing need for remote and mobile interface capabilities—especially for the growing ranks of Millennials who want to be able to use iPads and other mobile devices to interact with IoT deployments. Cisco’s IOx platform is a flexible application development environment with a goal of enabling developers to connect applications with any protocol, interface, or device. In the future, this could even enable a control engineer in the factory to look at a robot’s operation through smart goggles, instantly viewing maintenance statistics and malfunction alerts.
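
As a generic illustration of that edge-application pattern (and not the IOx SDK or any Cisco API), here’s a small sketch of a service that samples values next to the machinery and serves the latest readings as JSON, so a tablet or other mobile client could simply poll it. The sensor names and values are simulated.

```python
# Generic edge app sketch: sample sensors locally, expose the latest values over HTTP.
import json
import random
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

latest = {}  # most recent reading per sensor

def sample_sensors():
    """Stand-in for polling real devices over Modbus, OPC UA, or a vendor protocol."""
    while True:
        latest["line_speed_rpm"] = round(random.gauss(1200, 15), 1)
        latest["motor_temp_c"] = round(random.gauss(65, 2), 1)
        time.sleep(1)

class ReadingsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Any GET returns the latest snapshot as JSON for a mobile or HMI client.
        body = json.dumps(latest).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    threading.Thread(target=sample_sensors, daemon=True).start()
    HTTPServer(("0.0.0.0", 8080), ReadingsHandler).serve_forever()
```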

Millennials in the workforce demand flexibility and mobility in interacting with IoT deployments




Bit Stew shows off Analytics for Utilities with MIx Director at Cisco Live

We caught up with Andrew Miller, Sr. Sales Engineer from Bit Stew Systems at Cisco Live this year. Bit Stew is a Cisco partner that focuses on the analytics space with a platform that they call ‘Software Defined Operations for the Industrial Internet’. Their solution works with Cisco IOx on a number of Cisco platforms. The demonstration in this video shows just a small part of what they do, but does showcase analytics at the edge (Fog Computing) in a practical way with, in this case, an electrical utility customer.

Bit Stew’s MIx Core platform automates data ingestion and applies machine intelligence to learn patterns in the data, allowing industrial companies to discover actionable insights that optimize operational performance. MIx Director™ (formerly Grid Director™) is powered by the MIx Core platform and is the application that industrial enterprises rely on for a contextual, real-time view of their operations, assets and customers.

In the video, Andrew talks about the “Fog Computing” aspects of the MIx Director solution. With this solution running in Cisco grid routers at the edge of the network, a lot of the filtering of data can be done locally, without back-hauling to the data center or elsewhere. So long as everything is ticking along nicely, there’s no need to burden central resources or comms networks with unnecessary traffic. But if something untoward should happen, operators will get alerts and see in real time what’s happening. Service crews or emergency services can be dispatched and potential disasters minimized. Well, don’t let me steal too much of Andrew’s thunder. Watch the video to see what happens next!
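
Here is a minimal sketch of that report-by-exception idea: readings inside a normal band stay at the edge, and only excursions generate an alert for the operations center. The feeder names and the voltage band are assumptions made up for the example, not part of the MIx Director product.

```python
# Report-by-exception sketch for a utility feed: quiet data stays local, excursions alert.
NORMAL_BAND = (114.0, 126.0)   # volts; roughly ANSI C84.1 Range A for 120 V service

def check(feeder, volts, band=NORMAL_BAND):
    """Return an alert dict for an out-of-band reading, or None to keep it local."""
    low, high = band
    if volts < low or volts > high:
        return {"feeder": feeder, "volts": volts, "status": "ALERT"}
    return None

for feeder, volts in [("F-102", 119.8), ("F-102", 121.3), ("F-217", 109.4)]:
    alert = check(feeder, volts)
    if alert:
        print("send to operations center:", alert)
```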

You can find out more about Bit Stew and some of their key people by reading these other blogs:

There’s a great “Point of View” document that talks about the business benefits of the partnership here:


Cisco and Bit Stew Set Out to Transform the Industrial Internet of Things (IIoT) – One Customer at a Time

Kevin Collins, CEO and co-founder, Bit Stew

We’ve introduced several of the key figures within Bit Stew and shared with you the ways they are working to bring the IIoT to fruition, particularly within the energy sector.

I had a chat with Kevin Collins, CEO of Bit Stew, to discuss the next opportunities for the company:

“It’s an exciting day at Bit Stew, with the announcement of additional funding from Cisco Investments and GE Ventures. With this support, we will continue to bring our experience in managing massive data sets and optimizing edge and fog computing to automate industrial operations in utilities and other industries as well.”

Kevin told me that the new funding will help fuel Bit Stew’s ongoing technology innovation and customer adoption: “This investment will open doors to new market opportunities for Bit Stew, and positions the company as a global leader in Software Defined Operations for IIoT. Bit Stew has quickly become the hot company to watch”.

Bit Stew was recently named to Greentech Media’s prestigious Grid Edge 20 list as one of the top 20 innovators architecting the future of the electric power industry, alongside Tesla, Duke Energy and SolarCity. “Making the Grid Edge 20 provides validation of our strong market traction, and is a tribute to what we’ve achieved since Bit Stew was incorporated in 2009. It also serves as a reminder of the responsibility we have to our utility customers, partners, and the industry as we work towards transforming the power sector to one that is more efficient, reliable and agile.”

Purpose-built for the Industrial Internet
The MIx Core platform is the culmination of years of industry-hardened machine learning derived from trillions of data points analyzed across the utility and oil and gas industries. Purpose-built for the Industrial Internet, MIx Core processes and analyzes greater volumes of data every day than most of the largest social networks in the world.

Bit Stew’s MIx Core takes full advantage of Cisco’s IOx technology by embedding its core technology inside Cisco fog devices, providing data analysis at the edge of the network and in cloud-based systems, all in real time. Running MIx Core in the “fog” brings a significant new advantage for organizations that are dealing with massive amounts of data running on complex networks in the IIoT.

“Bit Stew’s collaboration with Cisco and the synergy between our MIx products and Cisco’s IOx platform has allowed us to utilize fog computing to completely revolutionize the way the energy sector operates,” Kevin said. “By using the edge of the network in the computing and analysis process, together we can create instant intelligence that is shared simultaneously in the operations center and in the field. This contextual analysis of industrial operations enables decision-making with a confidence that wasn’t necessarily available before. This expanded awareness results in increased up-time, faster issue resolution and optimized dispatch of resources,” Kevin added.

Clearly Bit Stew is going places, and not just with utilities anymore. Find out more here:
