Imagine a dozen people in a room for a meeting. Each one from a different country, and each one only able to speak and understand their own native language. Now imagine that you are the note-taker for this meeting. A daunting task to say the least.
So how do you do it? Well, frankly I would probably leave the room screaming, but a similar situation happens every day in the oil and gas industry with big data.
You see, massive amounts of data are produced in oilfields, pipelines, and refineries every moment of every day, 24 hours a day, seven days a week, and all of it has to be understood, translated, analyzed, and acted upon.
In almost all instances, the large volumes of data come from many different sources in a plethora of formats, pretty much like a dozen people speaking different languages.
But new technologies are being developed and introduced almost daily that can help take the petabytes of big data now coming in from the oil and gas fields and turn them into cost-saving, real-world tools. Remote monitoring, subsea seismic surveys, and sensors tracking oil flow rates and pressures all generate useful data, but the real test comes in translating all of that information into a single language, and then putting in place the processes that turn that data into usable, functional, actionable knowledge.
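In software terms, "translating into a single language" usually means normalizing records from different sources into one common schema. Here is a minimal sketch of that idea; the field names, units, and well identifiers are hypothetical illustrations, not any particular vendor's format.

```python
def normalize(record):
    """Map heterogeneous sensor records to one common schema:
    {'well': str, 'pressure_psi': float, 'timestamp': str}.
    The two source formats below are hypothetical examples."""
    if "psi" in record:  # e.g. a wellhead sensor feed reporting psi
        return {"well": record["well_id"],
                "pressure_psi": float(record["psi"]),
                "timestamp": record["ts"]}
    if "pressure_kpa" in record:  # e.g. a SCADA export reporting kilopascals
        return {"well": record["site"],
                "pressure_psi": float(record["pressure_kpa"]) * 0.145038,
                "timestamp": record["time"]}
    raise ValueError("unknown source format")

# Two records saying roughly the same thing in two different "languages":
records = [
    {"well_id": "W-12", "psi": "2150", "ts": "2015-06-01T08:00"},
    {"site": "W-12", "pressure_kpa": 14823.7, "time": "2015-06-01T09:00"},
]
unified = [normalize(r) for r in records]
```

Once every source speaks the common schema, downstream analysis code only has to be written once.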
The addition of these new technologies creates an added responsibility for key executives within the industry as well. They need to make sure they’re not just adding technology for technology’s sake. The right methods have to be established to make sure that there is actually value being extracted.
It’s a little like going to Costco and buying a five-gallon tub of Greek yogurt. It’s most definitely worth something if you have a plan on how to use it before it all spoils.
It’s the same with data. Like yogurt, data spoils quickly. It flows in, but it’s pretty much worthless if it isn’t used quickly and properly, and certainly if it isn’t used for the purposes for which it was gathered in the first place.
As an example, one oil company has a program that uses the big data being gathered to teach its field and office employees abnormal-event detection procedures. The program is designed to get employees to respond to abnormal readings before a problem occurs. That saves money, time, and most importantly, lives.
Companies like that, and for that matter most major oil producers, service companies, and engineering firms, don’t want to simply bring in data without also figuring out a way to turn that data into cost-saving and life-saving knowledge. And they certainly want analytical procedures in place that extract true value from that data. Without those procedures, all you have is more data. Or, in our example, five gallons of really bad Greek yogurt.
In their rush to keep up with the flood of big data and the technology that creates it, energy companies would benefit from doing their homework in advance and having their systems planned before diving in.
But how does it save a company money?
Easy. Automated system modules can deliver historical reports and graphs that characterize normal operating conditions for a specific well or refinery run. And because those readings feed into a larger database, they can be used to calculate precise production figures. As in the case mentioned above, they can also detect irregularities.
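The core of that irregularity detection can be surprisingly simple. Here is a minimal sketch, assuming a baseline of historical readings for one well: flag any reading that drifts far from the well's normal operating range. The pressure values and the threshold are hypothetical illustrations, not any company's actual method.

```python
from statistics import mean, stdev

def find_anomalies(readings, z_threshold=2.5):
    """Return indices of readings more than z_threshold standard
    deviations from the mean of the series (a simple z-score check)."""
    mu = mean(readings)
    sigma = stdev(readings)
    return [i for i, r in enumerate(readings)
            if sigma > 0 and abs(r - mu) / sigma > z_threshold]

# Hypothetical hourly wellhead pressure readings (psi); the spike at
# index 6 is the kind of abnormal reading an operator should catch
# before it becomes a bigger problem.
pressures = [2150, 2148, 2152, 2149, 2151, 2150, 2600, 2151, 2149, 2150]
alarms = find_anomalies(pressures)
```

A production system would compute the baseline from long-running history rather than the same window it is testing, but the principle — characterize normal, then alert on deviation — is the same.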
Setting up a big data analysis system can allow a company to redirect their manpower, and therefore focus more on specific optimization procedures. This redistribution of the existing personnel to work on more immediate concerns is effectively equal to hiring additional staff, thereby saving additional money.
But getting that message across to C-Suites isn’t easy. The fact is that system changes and upgrades are largely a financial decision and not always one of efficiency. Many oil companies are reluctant, if not obstinate, about getting rid of their legacy systems in favor of newer systems which are designed to handle big data.
And who can blame them? Investments in new IT systems large enough to handle, translate, amalgamate, and analyze big data can run into the millions of dollars. But one life saved, or a ten percent increase in a single well’s production, more than offsets the costs.
No matter how you slice it, more companies within oil and gas need the ability to access and analyze the tons of big data being generated every day.
The end result is lower lifting costs, increased safety, a healthier bottom line, and employees who won’t be stuck with the equivalent of gallons of spoiled yogurt.
— By Jeff Miller, energy writer