Bridging Silos

How the Lumi™ data and AI platform integrates planning and operations

Traditionally, subsurface data has primarily been used in exploration and field development workflows. Similarly, production data from operations has often been confined to operational workflows such as flow assurance. The organizational divide between the domains is not limited to users and data: different data systems are used to store the data, giving rise to data silos. Data exchange between the planning and operations domains is difficult and time consuming, so it is kept to a minimum.

However, many workflows and critical business decisions rely on information from both domains, or on data being consistently shared across domains. For example, fluid models are required in subsurface workflows to compute the inflow of fluid from the reservoir to the well. The same data is used in network models to compute the change in fluid composition as the fluid rises up the well and flows through pipes and equipment.

Reservoir simulation workflows provide another example. Here, the reservoir model typically consists of two parts: the subsurface description and the operational history. The subsurface description captures the initial condition of the reservoir at the time of discovery, including the location, shape, and size of the reservoir, the properties of the rock within it, its hydrocarbon content, how easily reservoir fluids can move through the rock, and many other relevant characteristics. This is complemented by the operational history of how the reservoir has been developed. When and where have wells been drilled? How much fluid has been produced or injected? Have wells been plugged and abandoned? What processes have been put in place to increase recovery, such as water flooding, artificial lift, stimulation, or fracking? All of these are relevant details about the operational history of the reservoir. Both subsurface and operational data are required to forecast future reservoir production and to evaluate field development scenarios.

Because of the data silos, inconsistencies may arise between the domains, casting doubt on the conclusions drawn from the workflows, requiring rework, and ultimately leading to suboptimal business decisions.

What customers are missing is the ability to exchange data between the domain silos: an integrated and consistent environment where relevant data can be shared in a timely and efficient manner.

The Lumi™ data and AI platform consists of best-of-breed data platforms: the Open Group’s OSDU® Data Platform as our planning data foundation and Cognite Data Fusion (CDF) as our operations data foundation. The OSDU Data Platform is a corporate data store, primarily aimed at subsurface and planning data. It is based on an open data model, APIs, data contracts, and domain data management services (DDMS). As an emerging standard, the technology is maturing and its data footprint is growing.
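By way of illustration, a planning service might retrieve well master data through the OSDU Search API. The sketch below is illustrative only: the host, data partition, token handling, and schema kind are assumptions and will differ between OSDU deployments.

```python
import requests

# Illustrative values; host, partition, token, and schema kind are assumptions.
OSDU_HOST = "https://osdu.example.com"
DATA_PARTITION = "opendes"
TOKEN = "<access-token>"

def search_wells(limit: int = 10) -> list[dict]:
    """Query the OSDU Search API for Well master-data records."""
    body = {
        "kind": "osdu:wks:master-data--Well:1.*.*",  # assumed well-known schema kind
        "query": "*",
        "limit": limit,
    }
    response = requests.post(
        f"{OSDU_HOST}/api/search/v2/query",
        json=body,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "data-partition-id": DATA_PARTITION,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("results", [])
```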

CDF is a data platform that supports connected applications in the operations domain. It connects to data acquisition systems and historians, and allows users to manipulate, aggregate, and deliver operations data at speed. It also manages catalogues of assets and equipment, allowing users to create digital twins of their upstream and midstream operations.
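As a simple example, an operations workflow might use the Cognite Python SDK to pull a daily-averaged production rate into a dataframe. This is a minimal sketch assuming a pre-configured client and a hypothetical time series external ID; exact method names can vary between SDK versions.

```python
from cognite.client import CogniteClient

# Assumes client configuration (project, cluster, credentials) is already provided,
# e.g. via environment variables; details omitted for brevity.
client = CogniteClient()

# Retrieve daily-averaged datapoints for an assumed production-rate time series.
df = client.time_series.data.retrieve_dataframe(
    external_id="well-A1:oil-production-rate",  # hypothetical external ID
    start="30d-ago",
    end="now",
    aggregates=["average"],
    granularity="1d",
)
print(df.head())
```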

The E&P Data Bridge is a set of data services that creates consistency between the systems and domains: consistency for subsurface workflows, by automatically making the latest production data available to planning applications connected to the OSDU Data Platform, and consistency for operations, by sharing relevant corporate master data such as well headers, trajectories, and completions.

This is achieved by establishing managed data flows between the OSDU Data Platform and CDF that guarantee relevant data is available for consumption workflows in connected applications. It is important to appreciate that these are not data duplications in the typical sense, which good data management practice rightly discourages. CDF is a system of engagement: a system where operations data, such as high-frequency production data, is transformed, aggregated, and prepared for consumption. CDF uses contextualization to assign incoming data to the correct assets, such as wells or equipment. These assets are often recorded in a corporate system of record, such as the OSDU Data Platform. In this context, it makes sense for CDF to use curated and trusted corporate data when contextualizing incoming operations data.
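A minimal sketch of one such managed flow, under stated assumptions: well headers retrieved from the OSDU Data Platform (for example via the search sketch above) are upserted as CDF assets so that incoming production time series can be contextualized against trusted corporate master data. The field names and upsert mode are assumptions for illustration.

```python
from cognite.client import CogniteClient
from cognite.client.data_classes import Asset

client = CogniteClient()  # pre-configured client; credentials omitted

def sync_well_headers_to_cdf(osdu_wells: list[dict]) -> None:
    """Upsert OSDU well master data as CDF assets used for contextualization.

    `osdu_wells` is assumed to be the `results` list returned by an OSDU search.
    """
    assets = [
        Asset(
            external_id=well["id"],  # OSDU record id as a stable key (assumption)
            name=well["data"].get("FacilityName", well["id"]),  # assumed schema field
            metadata={"source": "OSDU", "kind": well.get("kind", "")},
        )
        for well in osdu_wells
    ]
    # Upsert keeps the flow idempotent: existing assets are updated, new ones created.
    client.assets.upsert(assets, mode="replace")
```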

The OSDU Data Platform is the long-term repository for valuable aggregated production data, and CDF is the system that prepares such aggregations. The OSDU Data Platform not only stores and archives these data, but also allows them to be consumed in workflows, such as the reservoir modelling workflow mentioned before, where operational history is an integral part of the simulation model and the forecasts derived from it.
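Conceptually, the bridge can then persist CDF-prepared aggregations as records through the OSDU Storage API. In this sketch the record kind, ACL groups, legal tags, and data fields are placeholders, not actual Lumi or OSDU schema definitions.

```python
import requests

# Placeholder deployment values, as in the earlier OSDU sketch.
OSDU_HOST = "https://osdu.example.com"
DATA_PARTITION = "opendes"
TOKEN = "<access-token>"

def store_daily_production(well_id: str, date: str, oil_sm3: float) -> None:
    """Write one daily production aggregate as an OSDU storage record."""
    record = {
        "kind": "osdu:wks:work-product-component--ProductionData:1.0.0",  # assumed kind
        "acl": {
            "viewers": ["data.default.viewers@opendes"],   # placeholder ACL groups
            "owners": ["data.default.owners@opendes"],
        },
        "legal": {
            "legaltags": ["opendes-public-usa-dataset"],   # placeholder legal tag
            "otherRelevantDataCountries": ["US"],
        },
        "data": {"WellID": well_id, "Date": date, "OilVolume_sm3": oil_sm3},  # illustrative fields
    }
    response = requests.put(
        f"{OSDU_HOST}/api/storage/v2/records",
        json=[record],  # the Storage API accepts a batch of records
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "data-partition-id": DATA_PARTITION,
        },
        timeout=30,
    )
    response.raise_for_status()
```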

The Lumi platform establishes an integrated data system for planning and operations. Cross-domain workflows are enabled by removing the traditional domain silos. Data is exchanged through managed data flows that automate the process and guarantee consistency between the domains. This allows higher-fidelity data to be exchanged at higher cadence, with less latency and effort. This not only improves the efficiency of data-driven workflows but also ensures that consistent and up-to-date data is used throughout, leading to better business decisions and outcomes.

 


Tom Dombrowsky

Digital and Integration Product Manager