
Showing papers on "Reservoir modeling published in 2000"


Book
15 Mar 2000
TL;DR: In this book, the author presents the fundamentals of reservoir engineering, from reservoir-fluid and rock properties and relative permeability through reservoir fluid flow, well performance, and the material balance equation, to waterflooding, vapor-liquid phase equilibria, and decline and type curve analysis.
Abstract: Contents: Fundamentals of Reservoir Fluid Behavior; Reservoir-Fluid Properties; Laboratory Analysis of Reservoir Fluids; Fundamentals of Rock Properties; Relative Permeability Concepts; Fundamentals of Reservoir Fluid Flow; Oil Well Performance; Gas Well Performance; Gas and Water Coning; Water Influx; Oil Recovery Mechanisms and the Material Balance Equation; Predicting Oil Reservoir Performance; Gas Reservoirs; Principles of Waterflooding; Vapor-Liquid Phase Equilibria; Decline and Type Curve Analysis; Index.
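As a rough illustration of the material balance concepts the book develops (a sketch, not taken from the book), the snippet below applies the equation to an undersaturated oil reservoir with no gas cap and negligible water influx; all input values are hypothetical.

```python
# Minimal material-balance sketch for an undersaturated oil reservoir:
# N * Boi * ce * dP = Np * Bo, solved for original oil in place N (STB).
# All parameter values below are hypothetical.
def ooip_undersaturated(Np, Bo, Boi, co, cw, cf, Swi, dP):
    """Estimate original oil in place N from production above the bubble point."""
    # Effective compressibility of the oil/water/rock system
    ce = (co * (1.0 - Swi) + cw * Swi + cf) / (1.0 - Swi)
    return Np * Bo / (Boi * ce * dP)

N = ooip_undersaturated(Np=1.0e6, Bo=1.25, Boi=1.24, co=1.5e-5,
                        cw=3.0e-6, cf=4.0e-6, Swi=0.25, dP=500.0)
```

With these (hypothetical) inputs the estimate comes out on the order of 10^8 STB; the same routine would be re-run as more production history accumulates.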

932 citations




Proceedings ArticleDOI
01 Jan 2000
TL;DR: In this article, a new approach that combines the use of continuum and discrete fracture modeling methods has been developed, which provides the unique opportunity to constrain the fractured models to all existing geologic, geophysical, and engineering data, and hence derive conditioned discrete fracture models.
Abstract: A new approach that combines the use of continuum and discrete fracture modeling methods has been developed. The approach provides the unique opportunity to constrain the fractured models to all existing geologic, geophysical, and engineering data, and hence derive conditioned discrete fracture models. Such models exhibit greater realism, since the spatial distribution of fractures reflects the underlying drivers that control fracture creation and growth. The modeling process is initiated by constructing continuous fracture models that are able to capture the underlying complex relationships that may exist between fracture intensity (defined by static measures, such as fracture count, or dynamic measures, such as hydrocarbon production), and many possible geologic drivers (e.g. structure, thickness, lithology, faults, porosity). Artificial intelligence tools are used to correlate the multitude of geologic drivers with the chosen measure of fracture intensity. The resulting continuous fracture intensity models are then passed to a discrete fracture network (DFN) method. The current practice in DFN modeling is to assume that fractures are spatially distributed according to a stationary Poisson process or simple clustering rules, or are controlled by a single geologic driver. All these approaches will in general be overly simplistic and lead to unreliable predictions of fracture distribution away from well locations. In contrast, the new approach determines the number of fractures in each grid-block based on the value of the fracture intensity provided by the continuous model. As a result, the discrete fracture models honor all the geologic conditions reflected in the continuous models and exhibit all the observed fracture features. The conditioned DFN models are used to build a realistic and detailed model of flow in discrete conduits.
There are two main areas where detailed discrete fracture models can be used: (1) upscaling of fracture properties (permeability, porosity and shape factor) for input into reservoir simulators; and (2) optimization of well design, completion and operation based on an understanding of the inter-well scale flows. For accurate results, the full permeability tensor is calculated for each grid-block based on flow calculations using generalized linear boundary conditions. Inter-well flows are analyzed in terms of the variability in flow paths, characterized by distance and time traveled, through the fracture network connecting injectors and producers.
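The conditioning step described above can be sketched as follows (an illustration, not the authors' implementation): the expected number of fractures in each grid block is scaled by the continuous fracture-intensity model before the discrete fractures are drawn. Intensity values and the calibration constant below are hypothetical.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method: Poisson-distributed integer with mean lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def fracture_counts(intensity, mean_per_unit, rng):
    """Discrete fracture count per grid block, conditioned on the
    continuous intensity model (non-stationary Poisson draw)."""
    return [poisson(i * mean_per_unit, rng) for i in intensity]

rng = random.Random(42)
intensity = [0.2, 1.0, 3.5, 0.8]   # continuous-model output, one value per block
counts = fracture_counts(intensity, mean_per_unit=4.0, rng=rng)
```

Blocks with higher continuous intensity receive, on average, more discrete fractures, which is the sense in which the DFN honors the continuous model.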

51 citations



Journal ArticleDOI
TL;DR: In this paper, a neural network was used as a multivariate correlative tool to learn the non-linear relationships between multiple input and output variables, which can be used for reservoir characterization.

50 citations


Journal ArticleDOI
TL;DR: Dolberg et al. as discussed by the authors studied the significance of seismic amplitude variations over the crest of Lavrans Field (Figure 2), examining whether these amplitudes were caused by fluid effects, porosity, pressure changes, or processing and illumination artifacts.
Abstract: Several authors have changed employment since this paper was written. Those authors and their new locations are: David Dolberg, Mobil Oil Canada, Calgary, Alberta, Canada; Jan Helgesen, CGG Norge, Hovik, Norway; Tore Hakon Hanssen, Fortum Petroleum, Oslo, Norway; Ingrid Magnus, Norsk Hydro ASA, Bergen, Norway; Girish Saigal, Norsk Hydro ASA, Oslo, Norway; and Bengt K. Pedersen, Norwegian University of Science and Technology, Trondheim, Norway. Lavrans Field lies offshore Norway on the western extreme of Halten Terrace, 15 km south of Smorbukk Field (Figure 1). Exploration in this area has been extremely active, yielding two large gas and condensate discoveries within Petroleum License 199—Lavrans Field to the east and Kristin Field to the west. Combined reserves are approximately 1200 million barrels oil equivalent of gas and condensate. Unique conditions have come together to provide hydrocarbons and preserved porosity at depths greater than 5 km. Figure 1. Lavrans and Kristin fields relative to Smorbukk. Field outlines (rose) are drawn on the map of the Base Cretaceous Unconformity. View is to the north. Hydrocarbon-bearing sandstones at Lavrans have a thickness of 600 m. These reservoirs consist of shallow-marine deposits of Jurassic age. Although the facies can be laterally extensive, the overprint of diagenesis makes prediction of reservoir quality difficult. Seismic inversion provides insight into porosity variations away from limited well control. Thus, seismic inversion can be a valuable tool for reservoir characterization prior to field development. This study developed out of a need to better understand the significance of seismic amplitude variations over the crest of Lavrans Field (Figure 2). Hypotheses examined to explain amplitude variations were fluid effects, porosity, pressure changes, or processing and illumination artifacts. Figure 2. Cross-line 3185 is a dip line through well 6406/2-2.
Seismic anomalies are seen below the Top Ile Formation. Understanding the meaning of these amplitudes was a primary motivation …

47 citations


Journal ArticleDOI
TL;DR: In this paper, a streamline-based approach for estimating relative permeabilities from production data is presented, with which the sensitivity of the production response with respect to relative permeability parameters can be computed analytically.
Abstract: One of the outstanding challenges in reservoir characterization is to build high-resolution reservoir models that satisfy static as well as dynamic data. Integration of dynamic data so far has mainly focused on estimating spatial distribution of absolute permeability. Among the various properties important for simulating reservoir behavior, the relative permeability curves may be by far the most poorly determined by present methods. Estimation of relative permeability simultaneously with absolute permeability is a strongly nonlinear and ill-posed estimation problem. In this paper we present a streamline-based approach for estimating relative permeabilities from production data. The streamline approach offers two principal advantages. First, we can analytically compute the sensitivity of the production response with respect to relative permeability parameters. The approach is extremely fast and requires a single streamline simulation run. Second, we can exploit the analogy between streamlines and seismic ray tracing to develop a formalism for efficient inversion of production data. Thus, estimation of relative permeabilities is carried out in two steps: (i) matching of breakthrough or first arrival times and (ii) matching of amplitudes of the production response. For relative permeability representations we have used the commonly used power functions and also a more flexible representation through the use of B-splines. The relative advantages of these representations are examined through inversions of water-cut data from a nine-spot pattern. Finally, we address the underlying challenges associated with the simultaneous estimation of absolute and relative permeabilities from production data. We systematically investigate the non-uniqueness associated with the inverse problem and quantitatively evaluate the role of additional data.
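The "commonly used power functions" mentioned above are often written in Corey form. A minimal sketch (not the authors' parameterization; all endpoint and exponent values are hypothetical):

```python
# Power-law (Corey-type) relative permeability curves.
# Endpoints and exponents below are illustrative, not field-calibrated.
def corey_relperm(Sw, Swc=0.2, Sor=0.2, krw_max=0.4, kro_max=0.9, nw=2.0, no=2.0):
    """Return (krw, kro) at water saturation Sw from a power-law model."""
    S = (Sw - Swc) / (1.0 - Swc - Sor)   # normalized mobile saturation
    S = min(max(S, 0.0), 1.0)            # clamp outside the mobile range
    krw = krw_max * S ** nw
    kro = kro_max * (1.0 - S) ** no
    return krw, kro

krw, kro = corey_relperm(0.5)
```

In an inversion of the kind described above, the exponents and endpoints (or B-spline coefficients, in the more flexible representation) are the unknowns adjusted to match production data.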

44 citations


Journal ArticleDOI
TL;DR: In this paper, the authors compared object-based and pixel-based reservoir modeling techniques; the resulting realizations from the two methods were assessed by visual inspection and by evaluating the values and ranges of the single-phase effective permeability tensors obtained through upscaling.
Abstract: To assess differences between object-based and pixel-based reservoir modeling techniques, ten realizations of a UK Continental Shelf braided fluvial reservoir were produced using Boolean Simulation (BS) and Sequential Indicator Simulation (SIS). Various sensitivities associated with geological input data as well as with technique-specific modeling parameters were analyzed for both techniques. The resulting realizations from the object-based and pixel-based modeling efforts were assessed by visual inspection and by evaluation of the values and ranges of the single-phase effective permeability tensors, obtained through upscaling. The BS method performed well for the modeling of two types of fluvial channels, yielding well-confined channels, but failed to represent the complex interaction of these with sheetflood and other deposits present in the reservoir. SIS gave less confined channels and had great difficulty in representing the large-scale geometries of one type of channel while maintaining its appropriate proportions. Adding an SIS background to the Boolean channels, as opposed to a Boolean background, resulted in an improved distribution of sheetflood bodies. The permeability results indicated that the SIS method yielded models with much higher horizontal permeability values (20–100%) and lower horizontal anisotropy than the BS versions. By widening the channel distribution and increasing the range of azimuths, however, the BS-produced models gave results approaching the SIS behavior. For this reservoir, we chose to combine the two methods by using object-based channels and a pixel-based heterogeneous background, resulting in moderate permeability and anisotropy levels.
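The upscaled single-phase effective permeabilities used to compare the two methods are bracketed by two classical averages: the arithmetic mean (flow parallel to layers) and the harmonic mean (flow across layers). A minimal sketch with hypothetical cell values:

```python
# Classical bounds on upscaled single-phase permeability for a layered block.
# Cell values (mD) are hypothetical.
def arithmetic_mean(ks):
    """Upper bound: flow parallel to layers."""
    return sum(ks) / len(ks)

def harmonic_mean(ks):
    """Lower bound: flow in series across layers."""
    return len(ks) / sum(1.0 / k for k in ks)

cells = [120.0, 85.0, 400.0, 15.0, 230.0]   # one permeability per fine cell
k_upper = arithmetic_mean(cells)
k_lower = harmonic_mean(cells)
```

The full-tensor upscaling referred to in the abstract solves local flow problems instead, but any diagonal component of the result must fall between these two bounds.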

43 citations


Journal ArticleDOI
TL;DR: In this article, the correlation between two seismic surveys acquired over South Timbalier Block 295 field (offshore Louisiana) record significant differences in amplitude that are correlated to hydrocarbon production at multiple reservoir levels.
Abstract: Two seismic surveys acquired over South Timbalier Block 295 field (offshore Louisiana) record significant differences in amplitude that are correlated to hydrocarbon production at multiple reservoir levels. The K8 sand, a solution-gas-drive reservoir, shows increases in seismic amplitude associated with gas exsolution. The K40 sand, a water-drive reservoir, shows decreases in seismic amplitude associated with increases in water saturation. A methodology is presented to optimize the correlation between two seismic surveys after they have been individually processed (poststack). This methodology includes rebinning, crosscorrelation, band-pass filtering, and cross-equalization. A statistical approach is developed to characterize the correlation between the seismic surveys. This statistical analysis is used to discriminate seismic amplitude differences that record change in rock and fluid properties from those that could be the result of miscorrelation of the seismic data. Time-lapse seismic analysis provides an important new approach to imaging hydrocarbon production; it may be used to improve reservoir characterization and guide production decisions.
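The crosscorrelation step in the methodology above can be sketched as follows (an illustration with synthetic traces, not the authors' processing code): find the lag, in samples, that best aligns a monitor-survey trace with the base-survey trace.

```python
# Find the integer lag maximizing the cross-correlation of two traces.
# The traces below are synthetic: monitor is base shifted by one sample.
def best_lag(base, monitor, max_lag):
    """Lag (samples) at which the two traces correlate most strongly."""
    def xcorr(lag):
        return sum(base[i] * monitor[i + lag]
                   for i in range(len(base))
                   if 0 <= i + lag < len(monitor))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

base    = [0.0, 0.0, 1.0, 0.5, -0.3, 0.0, 0.0, 0.0]
monitor = [0.0, 0.0, 0.0, 1.0, 0.5, -0.3, 0.0, 0.0]
lag = best_lag(base, monitor, max_lag=3)
```

In cross-equalization, lags estimated this way (together with band-pass filtering and amplitude balancing) are used to remove survey-to-survey differences that do not reflect production.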

43 citations


Journal ArticleDOI
TL;DR: The use of the k-fold cross-validation technique is demonstrated to obtain confidence bounds on an ANN's accuracy statistic from a finite sample set, and its classification accuracy is dramatically improved by transforming the ANN's input feature space to a dimensionally smaller, new input space.
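The k-fold idea described above can be sketched in a few lines (not the paper's procedure): the spread of the per-fold accuracies yields an approximate confidence interval on the classifier's accuracy. The fold scores below are hypothetical.

```python
import math

# Approximate confidence bounds on mean accuracy from k fold estimates,
# using a normal approximation. Fold accuracies below are hypothetical.
def kfold_confidence(fold_acc, z=1.96):
    """Return (lower, upper) ~95% bounds on the mean accuracy."""
    k = len(fold_acc)
    mean = sum(fold_acc) / k
    var = sum((a - mean) ** 2 for a in fold_acc) / (k - 1)  # sample variance
    half = z * math.sqrt(var / k)
    return mean - half, mean + half

lo, hi = kfold_confidence([0.81, 0.78, 0.85, 0.80, 0.83])
```

For small k, a Student-t multiplier would be more defensible than z = 1.96; the structure of the computation is the same.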

Journal ArticleDOI
TL;DR: Dolberg et al. as mentioned in this paper applied porosity prediction techniques only to the Ile Formation, whose total thickness is close to 50 ms TWT (80–100 m) at a depth of approximately 4.1 s (4600 m).
Abstract: Kristin Field is a recently discovered gas and condensate field offshore Norway (See Figure 1 of Dolberg et al.). Reservoir sandstones are tidally influenced, laterally continuous shallow marine deposits of Jurassic age (Figure 1). A diagenetic overprint during progressive burial to depths of 5 km creates a patchwork of reservoir quality making 3-D mapping of these properties challenging. At the start of reservoir modeling, two exploration wells had penetrated the reservoir. One main focus was to predict lateral variations in reservoir quality (porosity and permeability). During the project, two additional appraisal wells were drilled and used for reservoir characterization refinement. Four potential reservoirs exist at Kristin Field (Figure 1). Only the Garn and Ile Formations are proven to be hydrocarbon-bearing. For this study, porosity prediction techniques are applied only to the Ile Formation. Total thickness of Ile is close to 50 ms TWT (80–100 m), with a depth of approximately 4.1 s (4600 m).


Proceedings ArticleDOI
01 Jan 2000
TL;DR: The next generation of reservoir characterization tools for the new millennium – soft computing is proposed and the unique roles of the three major methodologies of soft computing – neurocomputing, fuzzy logic and evolutionary computing are outlined.
Abstract: This paper presents an overview of soft computing techniques for reservoir characterization. The key techniques include neurocomputing, fuzzy logic and evolutionary computing. A number of documented studies show that these intelligent techniques are good candidates for seismic data processing and characterization, well logging, reservoir mapping and engineering. Future research should focus on the integration of data and disciplinary knowledge for improving our understanding of reservoir data and reducing our prediction uncertainty. Introduction Accurate prediction of reservoir performance is a difficult problem. This is mainly due to gaps in our understanding of the spatial distribution of lithofacies and petrophysical properties. Because of this, the recovery factors in many reservoirs are unacceptably low. The current technologies based on conventional methodologies are inadequate and/or inefficient. In this paper, we propose the next generation of reservoir characterization tools for the new millennium – soft computing. Reservoir characterization plays a crucial role in modern reservoir management. It helps to make sound reservoir decisions and improves the asset value of the oil and gas companies. It maximizes integration of multi-disciplinary data and knowledge and improves the reliability of the reservoir predictions. The ultimate product is a reservoir model with realistic tolerance for imprecision and uncertainty. Soft computing aims to exploit such a tolerance for solving practical problems. Soft computing is an ensemble of various intelligent computing methodologies which include neurocomputing, fuzzy logic and evolutionary computing. Unlike conventional or hard computing, it is tolerant of imprecision, uncertainty and partial truth. It is also tractable, robust, efficient and inexpensive.
In reservoir characterization, these intelligent techniques can be used for uncertainty analysis, risk assessment, data fusion and data mining, which are applicable to feature extraction from seismic attributes, well logging, reservoir mapping and engineering. Figure 1 shows schematically the flow of information and techniques to be used for intelligent reservoir characterization. The main goal is to integrate soft data such as geological data with hard data such as 3D seismic and production data to build a reservoir and stratigraphic model. While some individual methodologies (esp. neurocomputing) have gained much popularity during the past few years, the true benefit of soft computing lies in the integration of its constituent methodologies rather than use in isolation. This paper firstly outlines the unique roles of the three major methodologies of soft computing – neurocomputing, fuzzy logic and evolutionary computing. We will summarize a number of relevant and documented reservoir characterization applications. Lastly, we will provide a list of recommendations for the future use of soft computing. This includes hybrids of various methodologies (e.g. neural-fuzzy or neuro-fuzzy, neural-genetic, fuzzy-genetic and neural-fuzzy-genetic) and the latest tool of "computing with words" (CW). CW provides a completely new insight into computing with imprecise, qualitative and linguistic phrases and is a potential tool for geological modeling which is based on words rather than exact numbers. An appendix is also provided for introducing the basics of soft computing. Neurocomputing Neurocomputing represents general computation with the use of artificial neural networks. An artificial neural network is a computer model that attempts to mimic simple biological learning processes and simulate specific functions of the human nervous system.
It is an adaptive, parallel information processing system which is able to develop associations, transformations or mappings between objects or data. It is also the most popular intelligent technique for pattern recognition to date. [SPE 59397, Soft Computing for Intelligent Reservoir Characterization: D. Tamhane, SPE, and P.M. Wong, SPE, University of New South Wales; F. Aminzadeh, SPE, FACT Inc. & dGB-USA; M. Nikravesh, SPE, Energy and Geoscience Institute (EGI), University of Utah & Zadeh Institute of Information Technology.] The basic elements of a neural network are the neurons and their connection strengths (weights). Given a topology of the network structure expressing how the neurons (the processing elements) are connected, a learning algorithm takes an initial model with some "prior" connection weights (usually random numbers) and produces a final model by numerical iterations. Hence "learning" implies the derivation of the "posterior" connection weights when a performance criterion is matched (e.g. the mean square error is below a certain tolerance value). Learning can be performed by a "supervised" or an "unsupervised" algorithm. The former requires a set of known input-output data patterns (or training patterns), while the latter requires only the input patterns. Figure 2 depicts a typical structure of a neural network, showing three layers of neurons. The lines represent how the neurons are connected. Each line is represented by a weight value. In this case, the inputs are passed to each layer and the results are obtained at the output layer. This is commonly known as the feedforward model, in which no lateral or backward connections are used. The full technical details can be found in Bishop. Applications. The major applications of neurocomputing are seismic data processing and interpretation, well logging and reservoir mapping and engineering.
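The feedforward pass described above can be sketched in a few lines (a toy illustration, not trained on any data; the weights are arbitrary):

```python
import math

# One forward pass through a three-layer feedforward network:
# input layer -> sigmoid hidden layer -> linear output neuron.
# The weights below are illustrative, not trained values.
def forward(x, W1, W2):
    """Return the scalar output for input vector x."""
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    hidden = [sig(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    return sum(w * h for w, h in zip(W2, hidden))

x  = [0.5, -1.2]                    # e.g. two normalized log inputs
W1 = [[0.3, -0.6], [0.8, 0.2]]      # input -> hidden weights (one row per neuron)
W2 = [1.1, -0.7]                    # hidden -> output weights
y  = forward(x, W1, W2)
```

"Learning", in the sense used above, is the iterative adjustment of W1 and W2 until a performance criterion (e.g. mean square error on training patterns) is met.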
Good quality seismic data is essential for realistic delineation of reservoir structures. Seismic data quality depends largely on the efficiency of data processing. The processing step is time consuming and complex. The major applications include first arrival picking, noise elimination, structural mapping, horizon picking and event tracking. A detailed review can be found in Nikravesh and Aminzadeh. For interwell characterization, neural networks have been used to derive reservoir properties from crosswell seismic data. In Chawathé et al., the authors used a neural network to relate five seismic attributes (amplitude, reflection strength, phase, frequency and quadrature) to gamma ray (GR) logs obtained at two wells in the Sulimar Queen field (Chaves County). Then the GR response was predicted between the wells and was subsequently converted to porosity based on a field-specific porosity-GR transform. The results provided good delineation of various lithofacies. Feature extraction from 3D seismic attributes is an extremely important area. Most statistical methods fail due to the inherent complexity and nonlinear information content. Figure 3 shows an example use of neural networks for segmenting seismic characters, thus deducing information on the seismic facies and reservoir properties (lithology, porosity, fluid saturation and sand thickness). A display of the level of confidence (degree of match) between the seismic character at a given point versus the representative wavelets (centers of clusters) is also shown. Combining this information with the seismic model derived from the well logs while perturbing for different properties gives physical meaning to the different clusters. Monson and Pita applied neural networks to find relationships between 3D seismic attributes and well logs. The study provided realistic prediction of log responses far away from the wellbore.
Boadu also used similar technology to relate seismic attributes to rock properties for sandstones. In Nikravesh et al., the authors applied a combination of k-means clustering, neural networks and fuzzy c-means (a clustering algorithm in which each data vector belongs to each of the clusters to a degree specified by a membership grade) techniques to characterize a field that produces from the Ellenburger Dolomite. The techniques were used to perform clustering of 3D seismic attributes and to establish relationships between the clusters and the production log. The production log was then estimated away from the wellbore, and the production log and the clusters were superimposed at each point of a 3D seismic cube. They also identified the optimum locations for new wells based on the connectivity, size and shape of the clusters related to the pay zones (see Figure 4). The use of neural networks in well logging has been popular for nearly a decade. Many successful applications have been documented. The most recent work by Bruce et al. presented a state-of-the-art review of the use of neural networks for predicting permeability from well logs. In this application, the network is used as a nonlinear regression tool to develop a transformation between well logs and core permeability. Such a transformation can be used for estimating permeability in uncored intervals and wells. One example is shown in Figure 5. In this work, the permeability profile was predicted by a Bayesian neural network. The network was trained by a training set with four well logs (GR, NPHI, RHOB and RT) and core permeability. The network also provided a measure of confidence (the standard deviation of a Gaussian function): the higher the standard deviation ("sigma"), the lower the prediction reliability. This is very useful for understanding the risk of data extrapolation. The same tool can be applied to estimate porosity and fluid saturations.
Another important application is the clustering of well logs for the recognition of lithofacies. This provides useful information for improved petrophysical estimates and well correlation. Neurocomputing has also been applied to reservoir mapping. In Wong et al. and Wang et al., the authors applied a radial basis function neural network to relate the conceptual distribution of geological facies (in the form of hand drawings) to reservoir porosity. It is able to incorporate the general property trend provided …



Journal ArticleDOI
TL;DR: It is shown how transforming compressional and shear wave velocity data to the (rho/lambda, mu/lambda)-plane results in a set of quasi-orthogonal coordinates for porosity and liquid saturation that greatly aids in the interpretation of seismic data for the physical parameters of most interest.
Abstract: For wave propagation at low frequencies in a porous medium, the Gassmann–Domenico relations are well-established for homogeneous partial saturation by a liquid. They provide the correct relations for seismic velocities in terms of constituent bulk and shear moduli, solid and fluid densities, porosity and saturation. It has not been possible, however, to invert these relations easily to determine porosity and saturation when the seismic velocities are known. Also, the state (or distribution) of saturation, i.e., whether or not liquid and gas are homogeneously mixed in the pore space, is another important variable for reservoir evaluation. A reliable ability to determine the state of saturation from velocity data continues to be problematic. It is shown how transforming compressional and shear wave velocity data to the (ρ/λ,μ/λ)-plane (where λ and μ are the Lame parameters and ρ is the total density) results in a set of quasi-orthogonal coordinates for porosity and liquid saturation that greatly aids in the interpretation of seismic data for the physical parameters of most interest. A second transformation of the same data then permits isolation of the liquid saturation value, and also provides some direct information about the state of saturation. By thus replotting the data in the (λ/μ, ρ/μ)-plane, inferences can be made concerning the degree of patchy (inhomogeneous) versus homogeneous saturation that is present in the region of the medium sampled by the data. Our examples include igneous and sedimentary rocks, as well as man-made porous materials. These results have potential applications in various areas of interest, including petroleum exploration and reservoir characterization, geothermal resource evaluation, environmental restoration monitoring, and geotechnical site characterization.
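The coordinate transformation described above can be sketched numerically (an illustration, not the authors' code): compute the Lame parameters from the P- and S-wave velocities and density, then form the (ρ/λ, μ/λ) pair. The velocity and density values below are hypothetical.

```python
# Transform (vp, vs, rho) measurements into the (rho/lambda, mu/lambda)
# coordinates discussed above. Input values are hypothetical.
def lame_coordinates(vp, vs, rho):
    """Return (rho/lambda, mu/lambda) from vp, vs (m/s) and rho (kg/m^3)."""
    mu = rho * vs ** 2               # shear modulus from S-wave velocity
    lam = rho * vp ** 2 - 2.0 * mu   # first Lame parameter from P-wave velocity
    return rho / lam, mu / lam

x, y = lame_coordinates(vp=3500.0, vs=2000.0, rho=2400.0)
```

Plotting many such (x, y) points is what produces the quasi-orthogonal porosity/saturation coordinates the abstract describes; the second transformation, to the (λ/μ, ρ/μ)-plane, follows from the same three inputs.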

Journal ArticleDOI
TL;DR: In this paper, an integrated geoscientific study of the Appleton field structure and reservoir was undertaken to determine whether drilling additional wells in the field would extend the productive life of the reservoir, and the conclusion from the integrated study, which included advanced carbonate reservoir characterization, three-dimensional geologic visualization modeling, seismic forward modeling, porosity distribution analysis, and field production analysis, was that a sidetrack well drilled on the western paleohigh should result in improved oil recovery from the field.
Abstract: Appleton oil field, located in Escambia County, Alabama, was discovered in 1983 through the use of two-dimensional seismic reflection data. The field structure is a northwest-southeast–trending paleotopographic ridge comprised of local paleohighs. The field produces from microbial reef boundstones and shoal grainstones and packstones of the Upper Jurassic Smackover Formation. Because Appleton field is approaching abandonment, owing to reduced profitability, an integrated geoscientific study of the field structure and reservoir was undertaken to determine whether drilling additional wells in the field would extend the productive life of the reservoir. The conclusion from the integrated study, which included advanced carbonate reservoir characterization, three-dimensional geologic visualization modeling, seismic forward modeling, porosity distribution analysis, and field production analysis, was that a sidetrack well drilled on the western paleohigh should result in improved oil recovery from the field. The sidetrack well was drilled and penetrated porous Smackover reservoir near the crest of the western paleohigh. The well tested 136 bbl oil/day.

Journal ArticleDOI
TL;DR: In this paper, an algorithm based on the concept of cokriging with block-average data is woven into probability field simulation for building facies models, and the results are compared with those obtained from facies indicator simulation without integrating seismic data.

Journal ArticleDOI
TL;DR: In this paper, bitumen was extracted from core samples of Jurassic reservoir rock from the Rind discovery (offshore Norway) using a rapid procedure known as "micro-extraction".
Abstract: Well logging is used routinely to delineate variations in the physical properties of the water, gas and oil zones in a petroleum reservoir. The analysis of petroleum extracted from reservoir samples is a complementary approach to reservoir screening. For this study, we examined core samples of Jurassic reservoir rock from the Rind discovery (offshore Norway). Bitumen was extracted from these samples using a rapid procedure known as "micro-extraction". The extracts were then separated into compound classes (saturated hydrocarbons, aromatic hydrocarbons and polar compounds) by Iatroscan thin-layer chromatography with flame ionisation detection (TLC-FID). Using this approach, we show how problematic transition zones in this reservoir, which had not been resolved by logging techniques, were targeted, analysed and categorised. Furthermore, we demonstrate how oil and water zones can be …


Proceedings ArticleDOI
17 Oct 2000
TL;DR: In this paper, the authors presented a new and novel methodology to generate synthetic magnetic resonance logs using readily available conventional wireline logs such as spontaneous potential, gamma ray, density, and induction logs.
Abstract: Magnetic resonance logs provide the capability of in-situ measurement of reservoir characteristics such as effective porosity, fluid saturation, and rock permeability. This study presents a new and novel methodology to generate synthetic magnetic resonance logs using readily available conventional wireline logs such as spontaneous potential, gamma ray, density, and induction logs. The study also examines and provides alternatives for situations in which all required conventional logs are unavailable for a particular well. Synthetic magnetic resonance logs for wells with an incomplete suite of conventional logs are generated and compared with actual magnetic resonance logs for the same well. In order to demonstrate the feasibility of the concept being introduced here, the methodology is applied to a highly heterogeneous reservoir in East Texas. The process was verified by applying it to a well away from the wells used during the development process. This technique is capable of providing a better image of the reservoir properties (effective porosity, fluid saturation, and permeability) and more realistic reserve estimation at a much lower cost.

Journal ArticleDOI
Chip Story, Patrick Peng, Christoph Heubeck, Claire Sullivan, Jian Dong Lin
TL;DR: Liuhua 11-1 Field is 130 miles southeast of Hong Kong in 1000 ft of water (Figure 1). The field, discovered in 1987, is being developed by the consortium of BP Amoco, China National Offshore Oil, and Kerr-McGee.
Abstract: Liuhua 11-1 Field is 130 miles southeast of Hong Kong in 1000 ft of water (Figure 1). The field, discovered in 1987, is being developed by the consortium of BP Amoco, China National Offshore Oil, and Kerr-McGee. The reservoir zone at 3850 ft subsea is producing 16–22° API oil through 25 long-radius horizontal wells. Success depends on limiting bottom-water production, which in turn makes accurate reservoir description critically important. Figure 1. Pearl River Mouth Basin index map and geologic setting. To better define reservoir heterogeneity, an ultrahigh-resolution 3-D seismic survey was acquired in July 1997. Acquisition was conducted during calm seas with short (1500 m) streamers and shallow (3.5 m) tow depths. The 180-Hz field data, enhanced during processing, produced peak frequencies of 240 Hz. Approximately four million traces were processed at a bin spacing of 5 × 5 m over 100 km². The seismic data were first converted to acoustic impedance by using geologically constrained inversion techniques and then to porosity based on a linear impedance versus porosity relationship. Drilling data were integrated to create detailed maps of reservoir structure and stratigraphy. Petrophysical data and modeling were combined with the seismic inversion to create a spatial distribution of porosity, permeability, and saturation. Faults, fractures, and solution-collapse phenomena were analyzed using coherence technology. Complex attribute analyses added additional understanding of rock matrix continuity. This information has been used to build reservoir characterization and simulation models that have been tuned and validated using history matching to predict future reservoir performance. The geoscience interpretation and characterization of the reservoir for simulation are the focus of this paper. Liuhua reef carbonates are composed largely of shallow-water foram-algal packstones and boundstones belonging to the Miocene Zhujiang Formation.
In-place reserves are projected at 1.2 billion barrels over the entire closure. Only the western part of …
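The abstract's impedance-to-porosity step can be sketched in a few lines: calibrate a linear trend between acoustic impedance and porosity at the wells, then apply it to the inverted impedance volume. The numbers below are hypothetical, chosen only to show the typical inverse carbonate trend (higher impedance, lower porosity); they are not from the Liuhua study.

```python
import numpy as np

def fit_impedance_porosity(impedance_logs, porosity_logs):
    """Least-squares fit of phi = a * Z + b from co-located well-log samples."""
    a, b = np.polyfit(impedance_logs, porosity_logs, 1)
    return a, b

def impedance_to_porosity(impedance_volume, a, b):
    """Apply the calibrated trend to an inverted impedance volume, clipped
    to a physically plausible porosity range."""
    return np.clip(a * impedance_volume + b, 0.0, 0.4)

# Hypothetical well calibration data: impedance in kg/m^2/s, fractional porosity
z_well = np.array([6.0e6, 7.5e6, 9.0e6, 1.05e7, 1.2e7])
phi_well = np.array([0.30, 0.25, 0.19, 0.14, 0.09])

a, b = fit_impedance_porosity(z_well, phi_well)

# Apply to a toy 2x2 patch of an inverted impedance volume
cube = np.full((2, 2), 8.0e6)
phi_cube = impedance_to_porosity(cube, a, b)
```

In practice the calibration would be done per facies or per zone, and a linear trend is only one of several plausible rock-physics transforms.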

Proceedings ArticleDOI
01 Jan 2000
TL;DR: In this paper, the authors generalize the streamline approach to transient pressure applications by introducing a "diffusive" time of flight along streamlines, which allows to define drainage areas or volumes associated with primary recovery and compressible flow under the most general conditions.
Abstract: Streamline models have shown significant potential in integrating dynamic data into high-resolution reservoir models in a computationally efficient manner. However, previous efforts towards production data integration using streamline models have been limited to tracer data and multiphase production history such as water-cut at the wells. In this paper we generalize the streamline approach to transient pressure applications by introducing a 'diffusive' time of flight along streamlines. We show that the 'diffusive' time of flight allows us to define drainage areas or volumes associated with primary recovery and compressible flow under the most general conditions. We then utilize developments in seismic tomography and waveform imaging to formulate an efficient approach to integrating transient pressure data into high-resolution reservoir models. Our proposed approach exploits an analogy between a propagating wave and a propagating 'pressure front'. In particular, we adopt a high-frequency asymptotic solution to the transient pressure equation to compute travel times associated with a propagating 'pressure front'. The asymptotic approach has been widely used in modeling wave propagation phenomena. A key advantage of the asymptotic approach is that parameter sensitivities required for solving inverse problems related to production data integration can be obtained analytically using a single streamline simulation. Thus, the approach can be orders of magnitude faster than current techniques that can require multiple flow simulations. We have applied our proposed approach to both synthetic and field examples. The synthetic example utilizes the transient pressure response from an interference test in a nine-spot pattern. The spatial distribution of permeability is estimated by matching arrival times of the 'pressure front' in each of the observation wells. The field example is from the Conoco Borehole Test Facility in Kay County, Oklahoma.
A series of pressure interference tests were performed in a skewed five-spot pattern to identify the distribution and orientation of the natural fracture system in the Fort Riley formation. We have inverted the pressure drawdowns at the observation wells to create a conceptual model for the Fort Riley formation. The predominant fracture patterns emerging from the inversion are shown to be consistent with outcrop mapping and crosswell seismic imaging.
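The 'diffusive' time of flight the abstract introduces can be illustrated with a minimal 1-D sketch: accumulate the diffusive slowness sqrt(phi*mu*ct/k) along a discretized streamline. All property values below are hypothetical SI placeholders, and the tau-squared-over-six peak-arrival relation is the commonly quoted asymptotic result for a 3-D point source, not a detail taken from this paper.

```python
import numpy as np

def diffusive_time_of_flight(dr, k, phi, mu, ct):
    """Cumulative 'diffusive' time of flight tau = integral of
    sqrt(phi * mu * ct / k) dr along a streamline discretized into
    segments of length dr (units of sqrt(seconds))."""
    slowness = np.sqrt(phi * mu * ct / k)   # s^0.5 per metre
    return np.cumsum(slowness * dr)

# Hypothetical homogeneous streamline: 100 segments of 10 m each
dr = np.full(100, 10.0)        # segment lengths, m
k = np.full(100, 1.0e-13)      # permeability, m^2 (~100 mD)
phi = np.full(100, 0.2)        # porosity
mu = 1.0e-3                    # viscosity, Pa.s
ct = 1.0e-9                    # total compressibility, 1/Pa

tau = diffusive_time_of_flight(dr, k, phi, mu, ct)

# Asymptotic theory (3-D point source) places the peak of the pressure
# impulse response at roughly t = tau^2 / 6.
t_peak = tau[-1] ** 2 / 6.0    # seconds
```

Matching observed arrival times of the pressure front against such computed travel times is what turns this forward calculation into the tomography-style inverse problem the abstract describes.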

Proceedings ArticleDOI
01 Oct 2000
TL;DR: In this paper, a more powerful methodology for evaluating naturally fractured reservoirs is created by combining two techniques, reservoir characterization and simulation modeling, that have historically been applied in relative isolation.
Abstract: Reservoir characterization and simulation modeling of naturally fractured reservoirs (NFRs) presents unique challenges that differentiate it from conventional, single-porosity continuum reservoirs. Not only do the intrinsic characteristics of the fractures, as well as the matrix, have to be characterized, but the interaction between matrix and fractures must also be modeled accurately. Three field case studies have been evaluated combining the forward modeling approach, typically used by geoscientists, with inverse techniques, usually employed by reservoir engineers. The forward approach examines the various causes of natural fractures and their associated properties (e.g. fracture spacing, height, stress distribution, etc.), while the inverse approach focuses more on the effects created by the NFR (e.g. decline analysis, material balance, productivity, etc.). This study shows how a more powerful methodology for evaluating naturally fractured reservoirs is created by combining two techniques that have historically been applied in relative isolation.
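As a concrete instance of the "inverse" techniques the abstract lists, decline analysis fits a rate model to observed production. Below is a minimal sketch of fitting an Arps hyperbolic decline q(t) = qi / (1 + b*Di*t)^(1/b) by grid-searching the b exponent; all rates and decline parameters are synthetic placeholders, not data from the case studies.

```python
import numpy as np

def arps_rate(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*Di*t)^(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def fit_decline(t, q, b_grid=np.linspace(0.05, 1.0, 96)):
    """Grid-search b; for each candidate b, note that
    q^(-b) = qi^(-b) * (1 + b*Di*t) is linear in t, so qi and Di
    follow from a least-squares line fit. Keep the best-fitting triple."""
    best = None
    for b in b_grid:
        y = q ** (-b)                         # linear in t
        slope, intercept = np.polyfit(t, y, 1)
        qi = intercept ** (-1.0 / b)
        di = slope / (intercept * b)
        resid = np.sum((arps_rate(t, qi, di, b) - q) ** 2)
        if best is None or resid < best[0]:
            best = (resid, qi, di, b)
    return best[1:]

# Synthetic rate history generated with qi=1000, Di=0.002/day, b=0.5
t = np.linspace(0.0, 1000.0, 50)
q_obs = arps_rate(t, 1000.0, 0.002, 0.5)
qi, di, b = fit_decline(t, q_obs)
```

On real field data the rates would be noisy and the recovered (qi, Di, b) would then be interpreted jointly with forward-model indicators such as fracture spacing and stress orientation, which is the combined workflow the paper advocates.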


Journal ArticleDOI
TL;DR: In the E&P business, by far the largest component of geophysical spending is driven by the need to characterize (potential) reservoirs as mentioned in this paper, and increased investments in subsurface work are paying off in an overall reduction in exploration cost.
Abstract: In the E&P business, by far the largest component of geophysical spending is driven by the need to characterize (potential) reservoirs. The simple reason is that better reservoir characterization means higher success rates and fewer wells for reservoir exploitation … and that increased investments in subsurface work are paying off in an overall reduction in E&P cost. Future geophysical spending will depend primarily on our continued contribution to overall E&P cost control by further improvements in reservoir characterization.