scispace - formally typeset
Author

Akhil Datta-Gupta

Bio: Akhil Datta-Gupta is an academic researcher at Texas A&M University. He has contributed to research on topics including reservoir modeling and the fast marching method. He has an h-index of 42 and has co-authored 274 publications receiving 6,007 citations. Previous affiliations of Akhil Datta-Gupta include Lawrence Berkeley National Laboratory and the University of Texas System.


Papers
Journal ArticleDOI
TL;DR: A semianalytic approach for modeling tracer motion in heterogeneous permeable media is presented: the solution is analytic along streamlines, and the streamlines are derived from an underlying velocity field obtained numerically from a conventional fluid flow simulator.

140 citations

Proceedings ArticleDOI
TL;DR: This work proposes a new analytic technique that uses an extremely efficient three-dimensional multiphase streamline simulator as a forward model and the inverse method is analogous to seismic waveform inversion and thus, allows us to utilize efficient methods from geophysical imaging.
Abstract: One of the outstanding challenges in reservoir characterization is to build high-resolution reservoir models that satisfy static as well as dynamic data. However, integration of dynamic data typically requires the solution of an inverse problem that can be computationally intensive and becomes practically infeasible for fine-scale reservoir models. A critical issue here is computation of sensitivity coefficients, the derivatives of dynamic production history with respect to model parameters such as permeability and porosity. We propose a new analytic technique that has several advantages over existing approaches. First, the method utilizes an extremely efficient three-dimensional multiphase streamline simulator as a forward model. Second, the parameter sensitivities are formulated in terms of one-dimensional integrals of analytic functions along the streamlines. Thus, the computation of sensitivities for all model parameters requires only a single simulation run to construct the velocity field and generate the streamlines. The integration of dynamic data is then performed using a two-step iterative inversion that involves (i) 'lining up' the breakthrough times at the producing wells and then (ii) matching the production history. Our approach follows from an analogy between streamlines and ray tracing in seismology. The inverse method is analogous to seismic waveform inversion and thus allows us to utilize efficient methods from geophysical imaging. The feasibility of our proposed approach for large-scale field applications has been demonstrated by integrating production response directly into three-dimensional reservoir models consisting of 31,500 grid blocks in less than 3 hours on a Silicon Graphics workstation, without any artificial reduction of parameter space, for example, through the use of 'pilot points'. Use of 'pilot points' will allow us to substantially increase the model size without any significant increase in computation time.
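The abstract's key computational claim, that travel-time sensitivities reduce to one-dimensional integrals along streamlines, can be illustrated with a minimal sketch. This is not the paper's formulation: it assumes each streamline segment's velocity scales linearly with the local cell permeability (fixed pressure gradient), so that the time of flight is a sum of segment transit times and each sensitivity is analytic. All names and values are illustrative.

```python
# Hedged sketch: travel-time sensitivities as 1D sums along a streamline.
# Assumption (not from the paper): segment velocity v_i = k_i * c for a
# fixed factor c, so tau = sum_i ds_i / v_i and d(tau)/d(k_i) = -ds_i/(v_i*k_i).
import numpy as np

def travel_time_and_sensitivities(ds, k, c=1.0):
    """Time of flight along one streamline and d(tau)/d(k_i) for every cell."""
    v = k * c                       # segment velocities (illustrative Darcy-like scaling)
    tau = np.sum(ds / v)            # one forward pass gives the time of flight
    dtau_dk = -ds / (v * k)         # analytic sensitivities: no extra simulation runs
    return tau, dtau_dk

ds = np.array([10.0, 20.0, 15.0])   # segment lengths traversed in each cell
k  = np.array([100.0, 50.0, 200.0]) # cell permeabilities (illustrative units)
tau, grad = travel_time_and_sensitivities(ds, k)
```

The point mirrored from the abstract: once the streamline geometry is known, sensitivities for all cells the streamline crosses come from a single pass, rather than one perturbed simulation per parameter.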

138 citations

Journal ArticleDOI
TL;DR: Numerical results demonstrate that the proposed method leads to a severalfold increase in the acceptance rate of MCMC and provides a practical approach to uncertainty quantification during subsurface characterization.
Abstract: [1] In this paper, we use a two-stage Markov chain Monte Carlo (MCMC) method for subsurface characterization that employs coarse-scale models. The purpose of the proposed method is to increase the acceptance rate of MCMC by using inexpensive coarse-scale runs based on single-phase upscaling. Numerical results demonstrate that our approach leads to a severalfold increase in the acceptance rate and provides a practical approach to uncertainty quantification during subsurface characterization.
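The two-stage idea in this abstract can be sketched in a few lines: a cheap coarse-scale likelihood screens each proposal, and only survivors pay for a fine-scale run, with a second acceptance step that keeps the fine-scale posterior exact. The toy "fine" and "coarse" models below are illustrative stand-ins, not the paper's upscaled simulators.

```python
# Hedged sketch of two-stage Metropolis-Hastings (coarse-scale screening).
import math
import random

def log_like_fine(m):    # stand-in for the expensive fine-scale misfit
    return -0.5 * (m - 1.0) ** 2

def log_like_coarse(m):  # stand-in for the cheap, slightly biased coarse model
    return -0.5 * (m - 0.9) ** 2

def two_stage_mcmc(n_steps, step=0.5, seed=0):
    rng = random.Random(seed)
    m, chain, fine_runs = 0.0, [], 0
    for _ in range(n_steps):
        prop = m + rng.gauss(0.0, step)
        # Stage 1: accept/reject using only the coarse model.
        if math.log(rng.random()) < log_like_coarse(prop) - log_like_coarse(m):
            # Stage 2: correct with the fine model; the combined rule
            # targets the fine-scale posterior exactly.
            fine_runs += 1
            log_a2 = (log_like_fine(prop) - log_like_fine(m)
                      + log_like_coarse(m) - log_like_coarse(prop))
            if math.log(rng.random()) < log_a2:
                m = prop
        chain.append(m)
    return chain, fine_runs

chain, fine_runs = two_stage_mcmc(2000)
```

The payoff matches the abstract's claim: proposals that the coarse model would reject never trigger a fine-scale run, so the acceptance rate of the (expensive) stage-2 step rises severalfold.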

121 citations


Cited by
01 Apr 2003
TL;DR: The EnKF has a large user group, and numerous publications have discussed its applications and theoretical aspects. This paper reviews the important results from these studies and also presents new ideas and alternative interpretations that further explain the success of the EnKF.
Abstract: The purpose of this paper is to provide a comprehensive presentation and interpretation of the Ensemble Kalman Filter (EnKF) and its numerical implementation. The EnKF has a large user group, and numerous publications have discussed applications and theoretical aspects of it. This paper reviews the important results from these studies and also presents new ideas and alternative interpretations which further explain the success of the EnKF. In addition to providing the theoretical framework needed for using the EnKF, there is also a focus on the algorithmic formulation and optimal numerical implementation. A program listing is given for some of the key subroutines. The paper also touches upon specific issues such as the use of nonlinear measurements, in situ profiles of temperature and salinity, and data which are available with high frequency in time. An ensemble based optimal interpolation (EnOI) scheme is presented as a cost-effective approach which may serve as an alternative to the EnKF in some applications. A fairly extensive discussion is devoted to the use of time correlated model errors and the estimation of model bias.
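The EnKF analysis step the abstract describes can be sketched compactly: each ensemble member is nudged toward perturbed observations using a Kalman gain built from the ensemble's own sample covariance. This is a minimal textbook-style sketch with a linear observation operator, not the paper's optimal implementation (which avoids forming the full covariance); all dimensions and values are illustrative.

```python
# Hedged sketch of the EnKF analysis step (perturbed-observations variant).
import numpy as np

def enkf_analysis(X, H, y, R, rng):
    """X: n x N state ensemble; H: m x n observation operator;
    y: m observations; R: m x m observation-error covariance."""
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    P = A @ A.T / (N - 1)                          # sample covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    # Perturbing the observations keeps the analysis spread consistent
    # with the underlying Kalman filter statistics.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(2, 50))   # prior ensemble, mean near 0
H = np.array([[1.0, 0.0]])               # observe the first state component
y = np.array([2.0])
R = np.array([[0.1]])
Xa = enkf_analysis(X, H, y, R, rng)      # analysis mean pulled toward y
```

In production codes the n x n covariance `P` is never formed explicitly; the gain is computed from the anomaly matrices directly, which is part of the "optimal numerical implementation" focus the abstract mentions.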

2,975 citations

Journal ArticleDOI
TL;DR: The Bayesian approach to regularization is reviewed, developing a function space viewpoint on the subject, which allows for a full characterization of all possible solutions, and their relative probabilities, whilst simultaneously forcing significant modelling issues to be addressed in a clear and precise fashion.
Abstract: The subject of inverse problems in differential equations is of enormous practical importance, and has also generated substantial mathematical and computational innovation. Typically some form of regularization is required to ameliorate ill-posed behaviour. In this article we review the Bayesian approach to regularization, developing a function space viewpoint on the subject. This approach allows for a full characterization of all possible solutions, and their relative probabilities, whilst simultaneously forcing significant modelling issues to be addressed in a clear and precise fashion. Although expensive to implement, this approach is starting to lie within the range of the available computational resources in many application areas. It also allows for the quantification of uncertainty and risk, something which is increasingly demanded by these applications. Furthermore, the approach is conceptually important for the understanding of simpler, computationally expedient approaches to inverse problems.
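The link between Bayesian inversion and classical regularization that this review develops can be seen in the simplest finite-dimensional case: for a linear model with Gaussian prior and Gaussian noise, the posterior mean coincides with the Tikhonov (ridge) solution. The sketch below verifies that identity numerically; the model, dimensions, and hyperparameters are illustrative, not from the paper.

```python
# Hedged illustration: Bayesian posterior mean == Tikhonov-regularized solution
# for y = A x + noise, prior x ~ N(0, (1/alpha) I), noise ~ N(0, (1/beta) I).
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(5, 3))          # illustrative forward operator
y = rng.normal(size=5)               # illustrative data
alpha, beta = 0.5, 2.0               # prior and noise precisions (assumed)

# Bayesian route: mean of the Gaussian posterior.
post_cov = np.linalg.inv(beta * A.T @ A + alpha * np.eye(3))
x_post = beta * post_cov @ A.T @ y

# Regularization route: minimize beta*||A x - y||^2 + alpha*||x||^2.
lam = alpha / beta
x_ridge = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ y)
```

The Bayesian route additionally delivers `post_cov`, the posterior covariance, which is exactly the "quantification of uncertainty and risk" the abstract says the function-space viewpoint provides beyond a single regularized estimate.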

1,695 citations

11 Jun 2010
Abstract: The validity of the cubic law for laminar flow of fluids through open fractures consisting of parallel planar plates has been established by others over a wide range of conditions with apertures ranging down to a minimum of 0.2 µm. The law may be given in simplified form by Q/Δh = C(2b)³, where Q is the flow rate, Δh is the difference in hydraulic head, C is a constant that depends on the flow geometry and fluid properties, and 2b is the fracture aperture. The validity of this law for flow in a closed fracture where the surfaces are in contact and the aperture is being decreased under stress has been investigated at room temperature by using homogeneous samples of granite, basalt, and marble. Tension fractures were artificially induced, and the laboratory setup used radial as well as straight flow geometries. Apertures ranged from 250 down to 4 µm, which was the minimum size that could be attained under a normal stress of 20 MPa. The cubic law was found to be valid whether the fracture surfaces were held open or were being closed under stress, and the results are not dependent on rock type. Permeability was uniquely defined by fracture aperture and was independent of the stress history used in these investigations. The effects of deviations from the ideal parallel plate concept only cause an apparent reduction in flow and may be incorporated into the cubic law by replacing C by C/f. The factor f varied from 1.04 to 1.65 in these investigations. The model of a fracture that is being closed under normal stress is visualized as being controlled by the strength of the asperities that are in contact. These contact areas are able to withstand significant stresses while maintaining space for fluids to continue to flow as the fracture aperture decreases. The controlling factor is the magnitude of the aperture, and since flow depends on (2b)³, a slight change in aperture evidently can easily dominate any other change in the geometry of the flow field.
Thus one does not see any noticeable shift in the correlations of our experimental results in passing from a condition where the fracture surfaces were held open to one where the surfaces were being closed under stress.
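The cubic dependence the abstract emphasizes is easy to make concrete: with Q/Δh = (C/f)(2b)³ and the geometry/fluid constant C held fixed, halving the aperture cuts the flow rate by a factor of eight. A minimal sketch, with illustrative values for C and the head difference:

```python
# Hedged sketch of the cubic law Q/Δh = (C/f)(2b)^3 from the abstract.
# C and delta_h are illustrative placeholders; f is the deviation factor
# (reported in the abstract as 1.04-1.65 for non-ideal fracture surfaces).
def cubic_law_flow(delta_h, aperture_2b, C=1.0, f=1.0):
    """Flow rate through a parallel-plate fracture of aperture 2b."""
    return (C / f) * aperture_2b ** 3 * delta_h

# Apertures from the experiments: 250 um held open, 4 um under 20 MPa stress.
q_open   = cubic_law_flow(delta_h=1.0, aperture_2b=250e-6)
q_closed = cubic_law_flow(delta_h=1.0, aperture_2b=4e-6, f=1.65)
# Halving the aperture reduces flow eightfold, regardless of C and f.
ratio = cubic_law_flow(1.0, 250e-6) / cubic_law_flow(1.0, 125e-6)
```

This eightfold-per-halving scaling is why, as the abstract notes, a slight aperture change dominates any other change in the flow-field geometry.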

1,557 citations

Journal ArticleDOI
TL;DR: This review paper will summarize key developments in history matching and then review many of the accomplishments of the past decade, including developments in reparameterization of the model variables, methods for computation of the sensitivity coefficients, and methods for quantifying uncertainty.
Abstract: History matching is a type of inverse problem in which observed reservoir behavior is used to estimate reservoir model variables that caused the behavior. Obtaining even a single history-matched reservoir model requires a substantial amount of effort, but the past decade has seen remarkable progress in the ability to generate reservoir simulation models that match large amounts of production data. Progress can be partially attributed to an increase in computational power, but the widespread adoption of geostatistics and Monte Carlo methods has also contributed indirectly. In this review paper, we will summarize key developments in history matching and then review many of the accomplishments of the past decade, including developments in reparameterization of the model variables, methods for computation of the sensitivity coefficients, and methods for quantifying uncertainty. An attempt has been made to compare representative procedures and to identify possible limitations of each.

726 citations