Next Steps in Virtualization


Market parameters driving innovation

According to the research experts at IDC, by 2019 some 40% of energy production companies will have the basic platforms in place for managing data and generating analytics, as well as artificial intelligence (AI) for performance-related insights. The same analysts anticipate that most of the oil and gas companies that executed digital transformation plans this year (2016) will have reduced costs and increased efficiencies by 10% to 15%.

Around the same time, simulation and virtual reality will have helped companies improve modelling, optimize asset performance, and reduce risk. Software from Siemens is already being used for this purpose, creating a virtual 3D model of a drilling platform; such technology can assist with training as well as helping workers prepare for maintenance assignments. GE is another example with its ‘Digital Twin’ black box, which combines analytic models, advanced sensor technology, and AI – all of which can be applied to model the present state of every asset in a digital power plant or wind farm.
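To make the idea concrete, here is a minimal sketch of the digital-twin concept. It is not GE’s implementation; the asset, readings, and tolerance are hypothetical, and a real twin would draw on far richer physics-based and analytic models.

```python
# Minimal digital-twin sketch (hypothetical asset and values): mirror an
# asset's state from incoming sensor readings and flag drift from the model.
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class TurbineTwin:
    asset_id: str
    expected_temp_c: float                      # nominal operating temperature
    readings: list = field(default_factory=list)

    def ingest(self, temp_c: float) -> None:
        """Record a new sensor reading for this asset."""
        self.readings.append(temp_c)

    def drift(self) -> float:
        """How far the recent average strays from the expected value."""
        recent = self.readings[-10:] or [self.expected_temp_c]
        return mean(recent) - self.expected_temp_c

    def needs_inspection(self, tolerance_c: float = 5.0) -> bool:
        """Flag the asset when observed behaviour departs from the model."""
        return abs(self.drift()) > tolerance_c


twin = TurbineTwin(asset_id="turbine-3", expected_temp_c=65.0)
for reading in (64.8, 71.9, 73.4, 78.2):
    twin.ingest(reading)
print(f"{twin.asset_id}: drift {twin.drift():.1f} C, "
      f"inspect = {twin.needs_inspection()}")
```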

Cost benefits of virtual modelling

The constant need to send vast amounts of data around the world, in a strategic and timely fashion, creates headaches for IT teams and hefty bandwidth bills. Virtualization can minimize both the manpower and the bandwidth required by helping companies work out what needs to be done – and where – before anyone sets foot onsite. Other energy companies are trialling more analytics in-situ (on the devices and sensors themselves), further reducing the need for a physical human presence and the enormous costs of getting teams out to service sites.
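As a rough illustration of the in-situ approach, the sketch below filters sensor readings on the device itself so that only out-of-band values are sent over the uplink. The pressure band, readings, and function name are hypothetical; real edge analytics would be considerably more sophisticated.

```python
# Hypothetical edge-filtering sketch: keep routine readings on the device and
# only transmit the ones that fall outside an assumed "normal" band.
NORMAL_BAND = (40.0, 80.0)   # assumed acceptable pressure range, in bar


def select_for_upload(readings):
    """Return only the readings worth sending over the (costly) uplink."""
    low, high = NORMAL_BAND
    return [r for r in readings if not (low <= r <= high)]


stream = [52.1, 55.0, 54.8, 91.3, 53.9, 38.7, 55.2]
to_send = select_for_upload(stream)
saved = 1 - len(to_send) / len(stream)
print(f"uploading {len(to_send)} of {len(stream)} readings "
      f"(~{saved:.0%} bandwidth saved)")
```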

Another initiative is the collaboration of academics and sector-specific innovators at a new research center set up to solve data access problems – specifically for well exploration and in-field operations. The initiative, called SIRUS, opened earlier this year and is supported by the Research Council of Norway. Its purpose is to find better ways to extract and exploit the Big Data opportunity for the oil and gas sector in particular.

The Cloud

GE estimates that, by 2020, more than seven billion devices will be installed across the energy chain, generating some 24 exabytes of data, at least 50% of which will reside in the cloud – be it public, private, or hybrid.

The catch is that this sector’s reliance on heavy data – which, for virtualization, will likely require high processing capability and powerful graphics cards – brings some rather specific technological requirements. These requirements often fall outside the ‘usual’ cloud-based systems currently being adopted in other markets. Ways to handle very large files through cloud-based systems are in development, but this kind of technology – in the energy sector at least – is still finding its feet.

One work-around, from the Oil & Gas Council’s website, is that “the logical solution is to continue to maintain high power workstations under the operational control of the specialists that need them, whilst seamlessly integrating the software of the workstations with the cloud system for all other functions.” This results in a hybrid cloud – put simply, two systems that appear to operate as one.
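A toy sketch of that hybrid split might look like the following. The job types and routing rule are assumptions for illustration only, not a description of any particular vendor’s system.

```python
# Hypothetical routing sketch for the hybrid-cloud arrangement quoted above:
# compute- and graphics-heavy jobs stay on specialists' local workstations,
# while everything else is handed to the cloud system.
LOCAL_ONLY = {"seismic_rendering", "reservoir_simulation"}   # assumed job types


def route(job_type: str) -> str:
    """Decide where a job runs under the hybrid arrangement."""
    return "local workstation" if job_type in LOCAL_ONLY else "cloud"


for job in ("seismic_rendering", "document_storage", "reservoir_simulation", "email"):
    print(f"{job:22s} -> {route(job)}")
```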

Cyber security remains a top priority given the slew of emerging ‘access points’ – robotics, ‘wearables’, and other offerings intrinsic to the proliferation of the Industrial Internet of Things (IIoT). While IT experts are looking at ways to embed security into their virtualization efforts, any security considerations will likely need to encompass cross-border regulations, differing local and regional standards, and governmental policies – suggesting a need for interoperability on a global scale. This is not going to be easy.

So, what’s next in terms of virtualization?

While computing costs are coming down (servers, IoT enablement, etc.), other barriers to adoption persist: executive buy-in, budget allocation and prioritization, security, and regulation. So, for now, virtualization remains a progressive and collective effort. There are technological frontiers as well as geographical ones, and it will be years before the entire energy chain becomes saturated.

What we do know is that there will be more robotics, more analytics, faster and more cost-effective methods of operation, more virtual modelling, and potentially far fewer people doing dangerous jobs in remote places. So, in terms of new frontiers for virtualization, we can deliberate how – rather than if – this will impact the price of producing a barrel of oil or a GW of electricity.

— By Rachael Corry, Energy Writer
