INTERVIEW: How We Use Machine Learning To Detect Faults In Oil Wells – Datagration

On the sidelines of the 2024 SPE Artificial Lift Conference and Exhibition in The Woodlands, Texas, United States, Oscar Skaer, Datagration’s Director of Sales in the U.S., spoke to Tayo OLU of THE WHISTLER on how the company’s EcoVisor platform leverages machine learning to predict failures of artificial lift systems in oil wells.

Skaer explained the data sources and machine learning models the Software as a Service (SaaS) company uses to detect such failures, as well as how its “unified data model” differs from traditional data models used in the oil and gas industry. He also spoke on how oil companies that use its services automate financial reporting using custom analytics. Excerpts…

How Does Your EcoVisor Platform Detect Anomalies In Oil Wells?

With the EcoVisor ESG data reporting tool, we’re predicting the likelihood of failure of artificial lift systems. We have about an 80% accuracy rate within a 30-day period. We say, ‘Hey, something is going to happen to this Electric Submersible Pump (ESP). You need to go take a look at it.’ So we use machine learning for that.
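Skaer does not go into implementation details here, but the kind of prediction he describes is commonly framed as a binary classification problem: given recent readings from a well, estimate whether the ESP will fail within the next 30 days. The sketch below is purely illustrative; the file name, sensor columns, label column, and choice of a gradient-boosted classifier are assumptions, not Datagration’s actual pipeline.

```python
# Illustrative sketch only -- not Datagration's implementation.
# Frames ESP failure prediction as binary classification:
# "will this well's pump fail within the next 30 days?"
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical daily well telemetry; file and column names are assumptions.
df = pd.read_csv("esp_telemetry.csv")
features = ["intake_pressure", "motor_temp", "vibration", "current_draw"]

X = df[features]
y = df["failed_within_30_days"]          # 1 if a failure was recorded within 30 days

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=False   # keep time order; no leakage from the future
)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```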

We also have scoring mechanisms to highlight underperforming wells based on certain parameters. That way, when oil companies come in, they do not have to sift through each well looking for anomalies. Our tool highlights those wells, brings them up, and says, ‘Hey, these are the five to 10 wells that you probably need to look at today or this week.’
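A scoring mechanism like the one Skaer mentions can be pictured as normalising a handful of performance parameters and ranking wells by a weighted score. The parameters, weights, and values below are invented purely for illustration.

```python
# Illustrative well-scoring sketch; parameters, weights, and data are invented.
import pandas as pd

wells = pd.DataFrame({
    "well": ["A-1", "A-2", "B-7", "C-3"],
    "decline_vs_forecast": [0.05, 0.30, 0.12, 0.45],   # fraction below forecast
    "downtime_hours": [2, 40, 8, 60],
    "failure_probability": [0.10, 0.75, 0.20, 0.60],   # e.g. from an ML model
})

weights = {"decline_vs_forecast": 0.4, "downtime_hours": 0.2, "failure_probability": 0.4}

# Normalise each parameter to 0..1, then combine into a single score.
scored = wells.copy()
for col, w in weights.items():
    rng = scored[col].max() - scored[col].min()
    scored[col + "_norm"] = (scored[col] - scored[col].min()) / (rng or 1.0)
scored["score"] = sum(scored[c + "_norm"] * w for c, w in weights.items())

# Surface the worst performers first -- the "5 to 10 wells to look at today."
print(scored.sort_values("score", ascending=False)[["well", "score"]].head(10))
```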

What Data Sources Does Datagration Use To Train The Machine Learning Models For Predicting Artificial Lift Failures?

There are different ways. What we do with machine learning is use well history. In other words, we usually take one to two years of well history (data) and train our models on that history for the ESPs. Especially in the ESP world, it’s going to be based on that well history and the problems the well has had before, and that’s how we come up with the machine learning algorithm. So we’re not using someone else’s data or some public data. We make sure that the data we’re using is unique to those wells within that area.

Yes, that’s sometimes a limitation, right? They need to have a history of having problems on those wells, so we can actually train the model to detect (future problems).
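One common way to turn the one to two years of per-well history Skaer describes into training data is to label each daily record according to whether a recorded failure occurred within the following 30 days. The sketch below shows that labelling step under assumed file names, column names, and failure dates; it is not Datagration’s pipeline.

```python
# Illustrative labelling sketch: build 30-day failure labels from a single
# well's own history, rather than from public or third-party data.
import pandas as pd

history = pd.read_csv("well_A1_daily.csv", parse_dates=["date"])   # assumed file/columns
failures = pd.to_datetime(["2023-04-12", "2023-11-03"])            # hypothetical recorded ESP failures

def failed_within(date, horizon_days=30):
    """Return 1 if any recorded failure falls within `horizon_days` after `date`."""
    return int(any(0 <= (f - date).days <= horizon_days for f in failures))

history["failed_within_30_days"] = history["date"].apply(failed_within)
```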

Can You Explain The Concept Of ‘Unified Data Model’ That Datagration Utilises And How It Differs From Traditional Data Models Used In Oil and Gas Industry?

At the very core of Datagration, we connect data like nobody else in the industry can. We have a few pieces of proprietary technology that allow us to do so. One is what we call the unified data model, and it’s unlike any other data model utilised by any other platform or software in the industry because it’s what we call an elastic model: it expands as you add and remove data. The other thing is that you don’t have to build a data structure on the back end of it.

So, with every other software or data lake, you have to think about how you want to store that data so people can access it, and then slice and dice it and combine it, right?

But with our unified data model, all you have to do is start with an entity. An entity can be anything you want, anything that you want to associate data with; in oil and gas, that’s typically an oil well.

You may say, ‘I want all this data associated with these types of wells,’ right? And then you give it a hierarchy based on how you want that data to flow through that hierarchy and at which level of that hierarchy you want it to be aggregated.

Our PetroVisor platform and the unified data model behind it systematically build that data structure on the back end, so you don’t have to worry about how you structure your data.
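Datagration does not publish the internals of the unified data model, but the entity-plus-hierarchy idea Skaer outlines can be illustrated with a schema-less structure: each entity (typically a well) carries arbitrary named signals, and a hierarchy (say, well to pad to field) defines the level at which values are aggregated. The sketch below is a toy illustration of that idea, not the PetroVisor data model; all names and numbers are made up.

```python
# Toy illustration of an "elastic", entity-based model: entities carry
# arbitrary signals, and a hierarchy defines where values are aggregated.
# This is NOT the PetroVisor unified data model, just a sketch of the idea.
from collections import defaultdict

# Hierarchy: well -> pad -> field (assumed levels).
hierarchy = {"WELL-1": "PAD-A", "WELL-2": "PAD-A", "WELL-3": "PAD-B"}
pad_to_field = {"PAD-A": "FIELD-X", "PAD-B": "FIELD-X"}

# Entities expand as data is added -- no fixed schema on the back end.
entities = defaultdict(dict)
entities["WELL-1"]["oil_rate_bopd"] = 420.0
entities["WELL-2"]["oil_rate_bopd"] = 310.0
entities["WELL-3"]["oil_rate_bopd"] = 150.0
entities["WELL-1"]["esp_motor_temp"] = 212.0      # new signal, no schema change needed

def aggregate(signal, level="pad"):
    """Sum a signal at the pad or field level of the hierarchy."""
    totals = defaultdict(float)
    for well, values in entities.items():
        if signal not in values:
            continue
        pad = hierarchy[well]
        key = pad if level == "pad" else pad_to_field[pad]
        totals[key] += values[signal]
    return dict(totals)

print(aggregate("oil_rate_bopd", level="pad"))     # {'PAD-A': 730.0, 'PAD-B': 150.0}
print(aggregate("oil_rate_bopd", level="field"))   # {'FIELD-X': 880.0}
```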

How Does Your Platform Automate Financial Reporting?

Yes, we can also systematically connect financial systems and bring that into our platform. The way a lot of people are doing that today, even with a data warehouse, you still have to go in and query that data. You have to find it, create tables, and pull all that data together in something like Excel. Then you have to go talk to your finance guys and say, ‘Hey, I need this financial information from this period to this period.’ If you’re lucky, you probably get it in a week or a month, because they’re busy, right?

And then you run all your calculations, massage all that data, do all the operational calculations, merge it all together, and then send that to something like a Spotfire or Power BI dashboard.

What we do is automate that entire process. So, if you have to do that daily, it’s going to show up every day. Whenever you walk in, all those results will show up in your dashboard systematically. Overnight, we’re updating any changes that have happened to that data, and we bring in new data that gets automatically served up to a Power BI dashboard.
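The nightly refresh Skaer describes can be pictured as a scheduled job that pulls finance and operations data, runs the calculations, and lands the result where the dashboard reads it. Everything in the sketch below, from the connection string to the table and column names, is a placeholder; it is a generic illustration, not how EcoVisor or PetroVisor is built.

```python
# Illustrative nightly-refresh sketch: pull finance and operations data,
# run the calculations, and land the result where a Power BI dashboard
# reads it. Connection string, tables, and columns are placeholders.
import pandas as pd
import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://user:pass@warehouse/db")   # placeholder

def nightly_refresh():
    finance = pd.read_sql("SELECT well_id, period, lease_opex FROM finance_actuals", engine)
    ops = pd.read_sql("SELECT well_id, period, oil_bbl FROM daily_production", engine)

    merged = ops.merge(finance, on=["well_id", "period"], how="left")
    merged["opex_per_bbl"] = merged["lease_opex"] / merged["oil_bbl"]       # example calculation

    # Land results in the table the dashboard is pointed at; a push-dataset
    # REST call would be another option.
    merged.to_sql("dashboard_well_economics", engine, if_exists="replace", index=False)

if __name__ == "__main__":
    nightly_refresh()   # in practice, run on a scheduler (cron, Airflow, etc.)
```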

We can ping code that oil companies have written that is kind of their own secret sauce, whether it’s in Excel, VBA, Python, or something really sophisticated.

We systematically send data to where that code sits, ping it to run, and then extract the results to the back end to be viewed with all the other data. So, instead of having to do all that data wrangling and redo all those calculations, we automate that process so they can focus more on doing analytics than on trying to get all that data just to get an answer.
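The ‘ping their code’ step can be pictured as handing the prepared inputs to whatever script the operator maintains, invoking it in place, and reading the results back. The subprocess-based sketch below is one generic way to do that for a Python script; the paths, arguments, and file formats are assumptions, not Datagration’s integration mechanism.

```python
# Illustrative sketch of running a customer's own "secret sauce" script
# and collecting its output; paths, flags, and file formats are assumptions.
import json
import subprocess
import pandas as pd

def run_customer_model(inputs: pd.DataFrame) -> pd.DataFrame:
    # Send data to where the customer's code sits.
    inputs.to_csv("/shared/model_inputs.csv", index=False)

    # "Ping it to run": invoke the operator's script in place.
    subprocess.run(
        ["python", "/customer/secret_sauce.py",
         "--in", "/shared/model_inputs.csv",
         "--out", "/shared/model_outputs.json"],
        check=True,
    )

    # Extract the results back so they can be viewed with all the other data.
    with open("/shared/model_outputs.json") as fh:
        return pd.DataFrame(json.load(fh))
```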
