July 2017 /// Vol 238 No. 7
That frustrated sigh you hear is a freshly minted engineer tasked with sorting through voluminous sensor-generated data streams, in an attempt to isolate a trend that could explain why a current drilling campaign is sustaining higher rates of nonproductive time (NPT) than offsets. For all practical purposes, the assignment likely will prove as ineffectual as asking your teenage babysitter to decipher Grover's quantum search algorithm, suggests the founder of a Houston data analytics company.
“No matter how good the insight you provide me, if I have limited capability, or I’m the new engineer on the block, having just graduated and with no experience, how on earth do you expect me to provide good judgment on that insight?” asks Moblize Inc. CEO Amit Mehta.
Therein lies but one of the issues restraining the conversion of raw data into interpretable real-time tools that can be used to short-circuit problems and optimize drilling efficiency. In keeping with what can best be described as industry-wide "datamania," the IADC Drilling Engineering Committee (DEC) devoted its June technology forum in Houston to what it appropriately designated "Making Data Actionable: Operators' Perspective."
One such operator says costs, system incompatibility and data quality stand as barriers to the transition from manual, visual analysis to automated analysis of nearly limitless data streams.
"The problem is, we have more data than we can look at," says John Willis, Occidental Oil & Gas Corp. drilling and completions manager of Permian Resources Delaware Basin New Mexico. "And, the thing about manual data analysis is, it's really tedious. You download one well's worth of one-second data, and you might have a million rows of data. It might take about as long as it took to drill the well to get it loaded and into charts," he told the quarterly forum. "It also requires a considerable amount of skill to manipulate it."
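The scale Willis describes is easy to picture: one well logged at 1 Hz for roughly 11 to 12 days is about a million rows. A minimal sketch, using hypothetical hookload and ROP channels (not Oxy's actual data), shows how even simple scripted downsampling shrinks that to something chartable:

```python
import numpy as np
import pandas as pd

# Hypothetical one-second EDR feed: ~1 million rows, as in the quote.
n = 1_000_000
idx = pd.date_range("2017-06-01", periods=n, freq="s")
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "hookload_klbs": rng.normal(250, 5, n),   # assumed channel names/units
    "rop_ft_hr": rng.normal(80, 10, n),
}, index=idx)

# Downsampling to one-minute means cuts the row count ~60x,
# making the data practical to load into charts and eyeball for trends.
minute = df.resample("1min").mean()
print(len(df), len(minute))
```

The point is not the statistics but the workflow: a few lines of scripting replace the tedious manual download-and-chart cycle the quote describes.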
"Smarter humans." Mehta claims his company's big data analytics platform cuts through the quagmire by compressing the mélange of wellsite data "into actionable insights to make the engineer's life easier." According to Moblize, the end-to-end platform automatically aggregates real-time sensor (machine) data, unstructured daily dialogues from emails and other sources, as well as historical data from third-party vendor hubs. Mehta said the technology is now employed on nearly 100 U.S. land rigs, where it is used to accelerate an engineer's capacity to identify myriad trends and make sound decisions, be it selecting the most effective BHA for a given application or resolving an NPT issue.
He explained, “When you have an NPT event on one well, it’s easy to blame the tool. With our system, you can click on the NPT event and bring up a real-time signature that shows the corresponding drilling parameters that potentially contributed to that event. The business value of that is we can show our clients that an NPT event is not always the fault of a tool.”
“Everybody is focused on making the machine smarter, but we took the approach of applying the data and making the human smarter,” he said. “In our opinion, the machine is already very smart and will only get smarter, so let’s focus on the human rather than the machine.”
Keeping it home. The human component likewise figures prominently in Apache Corp.'s well-documented drive toward developing an automated, rig-based drilling advisory system. The foundation has been laid with the installation of data aggregation boxes on 21 Apache rigs, which integrate seamlessly with a rig's electronic drilling recorder (EDR) to run physics-based models built around a basic Bayesian probabilistic network, says Senior Drilling Advisor Michael Behounek. As constructed, the system is able to quickly pinpoint sensor, equipment and process faults, as well as run real-time hydraulics, and torque and drag models. The catalyst behind the initiative, Behounek said, is the capacity to make decisions at the rig site, rather than on the basis of interpretations from real-time operations centers.
"The (end) goal for Apache is not just to aggregate data, but have a rig-centric rig advisory system," he said. "The concept is a reverse of real-time centers, in that the focus is on the rig. The data and models are actually being used at the rig, and the calculations are being made, the pattern-matching and the analytics are all in place and run at the rig. Instead of depending on interpretations from watching squiggly lines (in operations centers), the information is now going straight to the rig site personnel, to help them make better decisions."
To ensure that the human factor remains firmly in the loop, the system’s visual interface, for instance, is very similar to the all-too-familiar EDR display. “The system is not only going to recommend what they do, but, more importantly, why. We don’t want a disconnect at the wellsite. The driller still needs to understand what is going on downhole.”
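The Bayesian idea behind fault pinpointing can be illustrated with a minimal sketch. This is not Apache's actual system; the prior, likelihoods, and "anomalous reading" evidence are hypothetical numbers chosen only to show how confidence in a sensor-fault diagnosis accumulates as readings repeatedly disagree with a physics-model prediction:

```python
def update(prior_fault, p_evidence_given_fault, p_evidence_given_ok):
    """One Bayes step: returns P(fault | evidence)."""
    num = p_evidence_given_fault * prior_fault
    den = num + p_evidence_given_ok * (1.0 - prior_fault)
    return num / den

# Assumed numbers: sensor faults are rare a priori (2%), but a reading
# far from the model's prediction is far likelier under a fault (0.80)
# than under normal operation (0.05).
p = 0.02
for _ in range(3):              # three consecutive anomalous readings
    p = update(p, 0.80, 0.05)
print(round(p, 3))              # confidence rises toward certainty
```

A full probabilistic network chains many such nodes (sensor, equipment, and process faults) together, but each edge is doing this same evidence-weighing arithmetic, which is cheap enough to run on a box at the rig rather than in a remote operations center.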
Meanwhile, from Oxy’s perspective, the compatibility issues of rigid, vendor-controlled, data analytics packages represent a major drawback. “As a buyer, I want plug-n-play and for things to fit together. I don’t know exactly what it is I want right now, so I need something that is easy to implement and economical, so engineers can experiment and try things out,” Willis said.
Toss data quality into the mix, and operators face a quandary when it comes to finding easy-to-use, universally applicable analytics programs. "It's a classic Catch-22," he says. "Why spend a lot of money on data quality, when all you do is look at it? On the other hand, why invest in analytics when you have bad data quality?"