Fog Computing Reduces Risks and Improves Data Flows

Similar to cloud, fog brings powerful analysis to the network edge


Companies across all sectors of energy will have new opportunities to improve operational performance and reduce risk as “fog computing” emerges from the mist in the months ahead.

Fog is similar to cloud computing in that it collects and processes machine data in a cloud environment and applies powerful analysis. The major difference is the speed at which analysis and response occur. In fog computing, intelligence is applied at the network edge, enabling much faster control response than the cloud can deliver.

Fog computing also addresses data overload issues by processing data locally and reducing the volumes moving to and from storage or the cloud. Data overload is already acute at utilities and in E&P, due to the growing numbers of machine sensors and smart meters. Enterprises will benefit from segregating the data into streams: the fast ones that go to fog nodes for instant processing and control response, and the slower ones that are stored for longer-term analysis.
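As a rough illustration of that segregation (sensor names and the routing rule are hypothetical, not drawn from any particular deployment), a gateway might split incoming readings into a fast stream bound for the fog node and a slow stream batched for long-term storage:

```python
# Hypothetical sketch: route time-critical sensor readings to a fast
# stream (processed at the fog node for immediate control response) and
# everything else to a slow stream (batched for long-term analysis).

FAST_SENSORS = {"pressure", "vibration"}  # assumed to need instant response

fast_stream = []  # handled locally at the fog node
slow_stream = []  # uploaded later to storage or the cloud

def route(reading):
    """Send time-critical readings to the fog node; archive the rest."""
    if reading["type"] in FAST_SENSORS:
        fast_stream.append(reading)   # low-latency local processing
    else:
        slow_stream.append(reading)   # slower path for trend analysis

for r in [
    {"type": "pressure", "value": 870},
    {"type": "temperature", "value": 41},
    {"type": "vibration", "value": 0.7},
]:
    route(r)

print(len(fast_stream), len(slow_stream))
```

In a real system the streams would be message queues or network connections rather than lists, but the design choice is the same: classify at the edge so only the slow stream consumes wide-area bandwidth.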

Cloud computing is enormous and centralized, and data movement between devices and cloud servers is not optimized for speed. As TechCrunch wrote in 2016, “The cloud paradigm is like having your brain command your limbs from miles away — it won’t help you where you need quick reflexes. … The fog layer is the perfect junction where there are enough compute, storage and networking resources to mimic cloud capabilities at the edge and support the local ingestion of data and the quick turnaround of results.”

Control at the Network Edge

One early target is the Advanced Metering Infrastructure (AMI). Fog computing can make smart grids and AMI more secure and reduce the threat of meters and controllers being hacked and taken over. Utilities are already deploying cloud-based security to protect AMI, and fog processing can strengthen security measures by moving threat intelligence to the network edges where it enables fast detection, response and mitigation. A fog model can also reduce the transmission of raw smart meter data into the cloud, where the risk of theft is higher.

Management and processing at the network edge are promising in building management, traffic control, production systems, data center control, drone operation, and emerging niches such as autonomous vehicle operation. With fog computing, data from vehicles can be collected, processed and analyzed with minimal latency, which enables remote control systems to direct vehicles away from collisions.

A pure cloud computing model is inadequate for that application, because latency compromises both the sensor data and the control response. As Harish Vadala wrote on Medium.com in May, “An increasing amount of data is priceless in the seconds after it’s collected … but it loses all value during the time it takes to move it to the centralized data center infrastructure or upload it to the cloud.”

Speed of response is similarly valuable in oil and gas pipelines. If a pipeline sensor detects a leak, the response must be instantaneous, or the control measures will be too slow to prevent a sizable spill.
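A minimal sketch of that pattern (the pressure threshold and control command are illustrative assumptions, not from any real pipeline system): a fog node checks each reading locally and issues a shutoff immediately, instead of waiting for a cloud round trip, while still queuing the reading for later analysis.

```python
# Hypothetical fog-node leak check: a sharp pressure drop between
# consecutive readings triggers an immediate local shutoff command.
# The reading is still queued for cloud upload so the event can be
# analyzed later, but the control action never waits on the network.

LEAK_DROP_THRESHOLD = 50  # psi drop per interval that suggests a leak

cloud_queue = []  # slower path: batched upload for long-term analysis

def check_reading(previous_psi, current_psi):
    """Return the local control action for this pressure reading."""
    if previous_psi - current_psi > LEAK_DROP_THRESHOLD:
        return "CLOSE_VALVE"  # acted on at the edge, no cloud round trip
    return "OK"

def process(previous_psi, current_psi):
    """Apply the local check, then queue the reading for the cloud."""
    action = check_reading(previous_psi, current_psi)
    cloud_queue.append({"psi": current_psi, "action": action})
    return action

print(process(900, 895))  # normal fluctuation
print(process(895, 700))  # sharp drop: local shutoff
```

The key design choice is that `check_reading` runs entirely on the fog node; the cloud sees the event only after the valve is already closed.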

The control of wind and solar resources is another likely application for fog computing or edge processing: faster response lets wind turbines and solar panels adjust quickly to changing conditions.

Fog can also improve voltage optimization and service restoration on the grid, as well as theft detection, outage detection and analysis, and demand response.

Consortium Creates Standards

Cisco introduced fog computing in 2014, and several major players joined the networking giant in 2015 to found the OpenFog Consortium. Dell, Microsoft, ARM and the Princeton University Edge Laboratory are working together to develop an open architecture for fog, with some deliverables expected this year.

While the speed has obvious value, fog computing can also address the management issues relating to Big Data. In oil and gas exploration, where engineers apply powerful analysis to massive data sets such as seismic surveys, fog computing can apply cloud-level analysis from a local area network and preserve bandwidth.

Railroads are another strong candidate for fog solutions, because the rapid response could make the difference between the derailment of a train carrying crude oil and the same train staying safely on the track.

Fog computing sounds a lot like edge processing, and both technologies have plenty of applicability in the energy industry. Fog computing sends data from endpoints to a gateway for processing in a local area network (LAN), whereas traditional edge processing uses dedicated resources such as programmable automation controllers.

Internet of Things (IoT) devices can use both fog and cloud simultaneously for different purposes. The fog computing piece provides instantaneous, purposeful processing aimed at device and environment control, while the cloud provides storage and longer-term analysis.

— By John MacKenna, Energy Writer
