TU Delft Institute for Computational Science and Engineering (DCSE)

About DCSE

Computational Science and Engineering (CSE) is a rapidly developing field that brings together applied mathematics, engineering and (social) science. DCSE is represented within all eight faculties of TU Delft. About forty research groups and more than three hundred faculty members are connected to, and actively involved in, DCSE and its activities. Over 250 PhD students perform research related to computational science.

CSE is a multidisciplinary, application-driven field that deals with the development and application of computational models and simulations, often coupled with high-performance computing, to solve complex physical problems arising in engineering analysis and design (computational engineering) as well as natural phenomena (computational science). CSE has been described as the "third mode of discovery" (next to theory and experimentation). In many fields, computer simulation and the development of problem-solving methodologies and robust numerical tools are integral to business and research. Computer simulations make it possible to study problems that are either inaccessible to traditional experimentation or prohibitively expensive to investigate empirically.
 

Agenda

21 March 2025 12:30 till 13:15

[NA] Antoine Lechevallier: Enhancing History Matching with Machine Learning: A Hybrid Newton Approach

"History matching involves adjusting model parameters to improve the fit between observed and simulated data, ensuring better forecasting and reservoir management. Traditional ensemble-based methods, while effective, are computationally intensive, particularly for large datasets and complex reservoir models. This study introduces an innovative methodology that builds on the ""Hybrid Newton"" method by leveraging machine learning (ML) for non-linear preconditioning to accelerate history matching, with a specific focus on incremental learning for near-well prediction.

Part 1: Methodology Introduction on the Drogon Field (Rehearsal for SPE RSC25)

The iterative nature of history matching requires frequent updates to parameters such as porosity and permeability to account for uncertainties. Ensemble-based techniques refine multiple model realizations but at a high computational cost, largely due to repeated simulations. Machine learning presents a promising solution, with deep learning capturing non-linear and time-dependent dynamics. Surrogate models help reduce computational demands while maintaining accuracy, yet training these models for ensemble-based history matching remains challenging due to large datasets and generalization issues.

To address this, we enhance standard solvers with ML-based optimizations. Deep neural networks can predict reservoir outputs efficiently and improve solver performance by refining initial guesses. The Hybrid Newton method reduces the number of iterations required for convergence by leveraging ML-assisted initial guesses. Specifically, the local hybrid Newton strategy focuses on near-well regions, where well events introduce non-linear stiffness. Well dynamics often cause significant changes in reservoir behavior, necessitating drastic time-step reductions for iterative non-linear solvers to achieve convergence. Well behavior tends to exhibit similarity across spatial and temporal dimensions, presenting an opportunity for incremental learning.
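As a rough illustration of this idea (not the speaker's implementation), the Python sketch below contrasts a plain Newton loop with a hybrid step in which an ML surrogate predicts the new state and that prediction is used as the initial guess; residual, jacobian and surrogate are assumed, user-supplied callables.

import numpy as np

def newton(residual, jacobian, x0, tol=1e-8, max_iter=50):
    # Plain Newton iteration; returns the solution and the iteration count.
    x = np.asarray(x0, dtype=float).copy()
    for k in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            return x, k
        x = x - np.linalg.solve(jacobian(x), r)
    return x, max_iter

def hybrid_newton_step(residual, jacobian, x_prev, surrogate):
    # Hybrid Newton idea: the surrogate predicts the state at the new time
    # step, and that prediction replaces the previous state x_prev as the
    # initial guess, typically reducing the number of Newton iterations.
    x0 = surrogate(x_prev)
    return newton(residual, jacobian, x0)

Starting closer to the solution is what saves iterations; the Newton update itself, and hence its convergence guarantees, is left untouched.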

We capitalize on this by training deep learning models tailored to each well’s near-well region after each history matching ensemble. These models predict solutions in the near-well region and serve as initial guesses for Newton’s method, ensuring faster convergence. Our methodology integrates ML models as local non-linear preconditioners for subsequent history matching ensembles, leveraging nested parameter distributions from prior simulations. This approach is cost-effective, as it does not require additional simulations and remains computationally feasible due to its localized application. Implemented using the open-source Python Ensemble Toolbox for ensemble modeling and OPM Flow for simulation, our methodology is validated on the Drogon field dataset. Results demonstrate a significant reduction in non-linear solver iterations required for convergence. Moreover, training deep learning models for each well can be carried out in parallel on CPUs, further enhancing efficiency.
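The driver below sketches how this workflow could be organized, under stated assumptions: run_simulation, collect_near_well_data and train_or_update are hypothetical placeholders standing in for the Python Ensemble Toolbox / OPM Flow calls and the per-well training routine, not their actual interfaces.

def run_history_matching(ensembles, wells, run_simulation,
                         collect_near_well_data, train_or_update):
    # Hypothetical driver illustrating the workflow described above.
    well_models = {}  # one surrogate per well, carried over between ensembles
    for realizations in ensembles:
        results = []
        for realization in realizations:
            # Previously trained per-well surrogates act as local non-linear
            # preconditioners: their near-well predictions seed Newton's
            # initial guess (as in the earlier sketch).
            results.append(run_simulation(realization,
                                          near_well_guess=well_models))
        # After each ensemble, incrementally (re)train one model per well on
        # the freshly simulated near-well data; wells are independent, so
        # this training can run in parallel on CPUs.
        for well in wells:
            data = collect_near_well_data(results, well)
            well_models[well] = train_or_update(well_models.get(well), data)
    return well_models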

Part 2: Assessment of SWIM on a Simplified Test Case

While the methodology has shown promising results on the Drogon field dataset, further assessment is necessary to evaluate computational efficiency and generalization capabilities. The Sample Where It Matters (SWIM) strategy has been proposed to reduce training time by optimizing data selection during supervised learning, particularly when integrating ML into solvers. High offline data generation costs for training neural networks have historically limited computational benefits, and SWIM aims to address this issue.

To validate SWIM, we employ a simplified test case with basic reservoir geometries and black-oil equations within the OPM Flow simulator. The strategy reduces computational demand by selectively training ML models where discrepancies are most pronounced, ensuring targeted learning without excessive redundant computations. Fully connected neural networks enhance Newton’s method, with SWIM ensuring efficient training on parallel CPUs. The combination of ensemble-based history matching with nested parameter distributions and SWIM optimization results in significant improvements in efficiency and solver robustness.
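As a toy illustration of the sampling idea described here (not the published SWIM algorithm), one could rank labelled candidate samples by the current network's prediction error and keep only the worst offenders for the next training round; the sketch assumes a scikit-learn-style regressor with a predict method.

import numpy as np

def sample_where_it_matters(model, X_candidates, y_candidates, budget):
    # Rank labelled candidates (e.g. near-well states from earlier
    # simulations) by the current model's prediction error and keep only
    # the `budget` samples with the largest discrepancies.
    residuals = model.predict(X_candidates) - y_candidates
    errors = np.linalg.norm(residuals.reshape(len(X_candidates), -1), axis=1)
    worst = np.argsort(errors)[-budget:]
    return X_candidates[worst], y_candidates[worst]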

Conclusion

This research highlights the potential of machine learning to augment traditional history matching processes. By integrating ML-enhanced solvers and incremental learning strategies, we achieve a scalable and efficient framework that accelerates history matching while maintaining robustness. The proposed methodology can be seamlessly integrated into existing workflows, complementing conventional solvers while preserving their theoretical guarantees. Ultimately, our work paves the way for broader ML applications in reservoir engineering, optimizing computational resources and improving predictive accuracy.