
Showing papers in "SEG Technical Program Expanded Abstracts in 2008"



Proceedings ArticleDOI
TL;DR: A high-resolution Radon-based separation technique that separates the data sufficiently to allow subsequent standard noise attenuation techniques to complete the task; the work was motivated by observations made on a 3D dataset acquired over the Petronius field in the Gulf of Mexico with two source vessels.
Abstract: The term “simultaneous source” refers to the idea of firing several seismic sources so that their combined energy is recorded into the same set of receivers during a single conventional shotpoint timing cycle. The idea is to collect the equivalent of two or more shots’ worth of data in the time it takes to collect one. The potential advantages include cost or time savings in field acquisition, which is of renewed interest due to the popularity and expense of WATS data. We were motivated to do the work presented here by observations made on a 3D dataset acquired over the Petronius field in the Gulf of Mexico with two source vessels. The second source was fired with a random delay relative to the first, so that the energy from the secondary source appears similar to asynchronous noise. While the random nature of the crosstalk in combination with the two known geometries had been enough to successfully apply relatively standard processing techniques for other studied datasets, we found that this one required an improvement on those techniques. This paper describes a high-resolution (sparse) Radon-based separation technique with that aim. We find that while the technique does not by itself do all the required separation, it sufficiently separates the data to allow subsequent standard noise attenuation techniques to complete the task.
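As a rough illustration of why the random delay makes the second source's energy look like asynchronous noise, the sketch below blends two synthetic records with a per-shot random dither; the array sizes, delay range, and the use of random stand-in data are assumptions for illustration, not the paper's processing.

```python
# A minimal sketch (not the authors' code): in a gather aligned on source A,
# source B's energy arrives with a different random shift on every trace,
# so it behaves like incoherent, asynchronous noise.
import numpy as np

rng = np.random.default_rng(0)
nt, ntr, dt = 2000, 120, 0.004           # samples per trace, traces, dt (s)

shot_a = rng.standard_normal((ntr, nt))  # stand-in for source-A energy
shot_b = rng.standard_normal((ntr, nt))  # stand-in for source-B energy

dither = rng.uniform(0.0, 1.0, size=ntr)         # random firing delays (s)
shift = np.rint(dither / dt).astype(int)

blended = shot_a.copy()
for i in range(ntr):                     # each trace ~ a different shot pair
    s = shift[i]
    blended[i, s:] += shot_b[i, :nt - s] # delayed crosstalk from source B

# Aligned on source A, the crosstalk is incoherent trace to trace, which is
# what sparse-Radon and standard noise attenuation can then exploit.
```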

211 citations


Proceedings ArticleDOI
TL;DR: In this article, wave equations for modeling and reverse-time migration in acoustic VTI media were derived directly from Hooke's law and the equations of motion, and the resulting set of five first-order differential equations in the three particle velocity components and two independent stress components involves no approximations apart from the acoustic-VTI approximation itself.
Abstract: We present wave equations for modeling and reverse-time migration in acoustic VTI media, derived directly from Hooke’s law and the equations of motion. The resulting set of five first-order differential equations in the three particle velocity components and two independent stress components involves no approximations apart from the acoustic VTI approximation itself and allows handling variable density in a natural way.
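For orientation, a first-order acoustic-VTI system of the kind described (three particle-velocity components and two stress components) is commonly written as follows, with density $\rho$, vertical P-wave velocity $v_{P0}$, and Thomsen parameters $\varepsilon$ and $\delta$; this is a reconstruction from standard acoustic-VTI formulations, not necessarily the authors' exact notation:

$$
\begin{aligned}
\rho\,\partial_t v_x &= \partial_x \sigma_H, \qquad
\rho\,\partial_t v_y = \partial_y \sigma_H, \qquad
\rho\,\partial_t v_z = \partial_z \sigma_V,\\
\partial_t \sigma_H &= \rho v_{P0}^2\Big[(1+2\varepsilon)\big(\partial_x v_x + \partial_y v_y\big) + \sqrt{1+2\delta}\,\partial_z v_z\Big],\\
\partial_t \sigma_V &= \rho v_{P0}^2\Big[\sqrt{1+2\delta}\,\big(\partial_x v_x + \partial_y v_y\big) + \partial_z v_z\Big].
\end{aligned}
$$

Variable density enters naturally because $\rho$ appears explicitly in both the velocity and stress updates.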

193 citations


Proceedings ArticleDOI
Ian Moore, Bill Dragoset, Tor Ommundsen, David M. Wilson, Daniel Eke, Camille Ward
TL;DR: In conventional data acquisition, the delay time between the firing of one source and the next is such that the energy from the previous source has decayed to an acceptable level before data associated with the following source arrives, which imposes constraints on the data acquisition rate.
Abstract: In conventional data acquisition, the delay time between the firing of one source and the next is such that the energy from the previous source has decayed to an acceptable level before data associated with the following source arrives. This minimum delay time imposes constraints on the data acquisition rate. For marine data, the minimum delay time also implies a minimum inline shot interval, because the vessel’s minimum speed is limited.
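As a back-of-the-envelope illustration (numbers assumed, not taken from the paper), the minimum inline shot interval follows directly from the minimum delay time and the vessel speed:

$$\Delta x_{\min} = v_{\text{vessel}}\,t_{\min},\qquad \text{e.g. } t_{\min}=10\ \text{s},\ v_{\text{vessel}}=2.5\ \text{m/s}\ \Rightarrow\ \Delta x_{\min}=25\ \text{m},$$

so halving the effective delay time (for example by overlapping sources) halves the achievable inline shot interval at the same vessel speed.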

184 citations


Proceedings ArticleDOI
TL;DR: The resulting Cadzow filtering method is superior to both f-xy prediction (deconvolution) and projection filtering, especially for very noisy data; in particular, it preserves signal better and can be made much harsher.
Abstract: Summary Cadzow filtering has previously been applied along constant-frequency slices to remove random noise from 2-D seismic data. Here I extend Cadzow filtering to two or more spatial dimensions. The resulting method is superior to both f-xy prediction (deconvolution) and projection filtering, especially for very noisy data. In particular, it preserves signal better and can be made much harsher.
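A minimal sketch of the underlying rank-reduction step on a single constant-frequency slice; the rank choice and window handling here are illustrative assumptions, and the multi-dimensional extension the abstract introduces stacks such Hankel blocks into a block-Hankel matrix before the SVD.

```python
# Sketch of Cadzow (rank-reduction) filtering of one constant-frequency slice:
# Hankel embedding -> truncated SVD -> anti-diagonal averaging.
import numpy as np

def cadzow_1d(slice_fx: np.ndarray, rank: int) -> np.ndarray:
    n = slice_fx.size
    nrow = n // 2 + 1                    # Hankel rows
    ncol = n - nrow + 1                  # Hankel columns
    H = np.array([slice_fx[i:i + ncol] for i in range(nrow)])

    u, s, vh = np.linalg.svd(H, full_matrices=False)
    H_low = (u[:, :rank] * s[:rank]) @ vh[:rank]      # keep strongest components

    out = np.zeros(n, dtype=H_low.dtype)              # anti-diagonal averaging
    count = np.zeros(n)
    for i in range(nrow):
        for j in range(ncol):
            out[i + j] += H_low[i, j]
            count[i + j] += 1
    return out / count
```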

173 citations



Proceedings ArticleDOI
TL;DR: It is argued that strategies favoring computational complexity over memory (to the point where disk I/O can be avoided) are attractive for 3D prestack migrations.
Abstract: The imaging condition used in reverse-time migration requires that the source wavefield (computed via a forward recursion) and the receiver wavefield (computed via a backward recursion) be made available at the same time in an implementation of the algorithm. Several strategies to organize the calculation can be employed, differing in the balance between memory and computation. This paper describes and compares these different approaches, and argues that strategies favoring computational complexity over memory (to the point where disk I/O can be avoided) are attractive for 3D prestack migrations. An example of 3D reverse-time migration applied to wide-azimuth data from the Gulf of Mexico is presented to support the claim.
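A schematic of the cross-correlation imaging condition at the heart of the trade-off (function and variable names here are illustrative, not the authors' implementation):

```python
# The image is the zero-lag time cross-correlation of source and receiver
# wavefields.  The source snapshots must be delivered in reverse time order,
# either read back from storage (memory/disk heavy) or re-propagated from
# saved boundary values or checkpoints (compute heavy) -- the balance the
# paper compares.
import numpy as np

def rtm_image(source_snapshots_reversed, receiver_snapshots, shape):
    """Both iterables yield wavefield snapshots for t = nt-1, ..., 0."""
    image = np.zeros(shape)
    for s_snap, r_snap in zip(source_snapshots_reversed, receiver_snapshots):
        image += s_snap * r_snap         # zero-lag cross-correlation
    return image
```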

127 citations


Proceedings ArticleDOI
TL;DR: In this article, the authors propose to operate any number of sources on a recording spread without needing to coordinate their activity, since continuous recording removes the need for real-time synchronization of sources and recording systems.
Abstract: Recent developments in recording systems allow a recording spread to be continually active, which will be referred to as continuous recording, although it may be more accurately described as the recording of a set of contiguous records. This removes the necessity for real-time synchronization of sources and recording systems. As long as the continuously recorded data and the source initiation can both be linked to the same time standard (e.g., GPS time), the traditional shot records can be combed from the continuous dataset at any later stage. This new freedom allows us to operate any number of sources on a recording spread, without needing to coordinate their activity.
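A minimal sketch of the "combing" step (names and numbers are assumptions): because recorder samples and source firings share a common GPS time base, a conventional shot record is just a slice of the continuous trace.

```python
import numpy as np

def comb_shot(continuous: np.ndarray, dt: float, record_start_gps: float,
              shot_gps: float, record_length: float) -> np.ndarray:
    """Extract one shot record from a continuously recorded trace.
    All times are seconds on the shared (GPS) time standard."""
    i0 = int(round((shot_gps - record_start_gps) / dt))
    n = int(round(record_length / dt))
    return continuous[i0:i0 + n]

# Example with made-up numbers: a source fired 3600.25 s into the continuous
# record, combed into an 8 s record sampled at 2 ms.
trace = np.zeros(4_000_000)                      # ~2.2 hours at 2 ms
shot_record = comb_shot(trace, dt=0.002, record_start_gps=0.0,
                        shot_gps=3600.25, record_length=8.0)
```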

116 citations


Proceedings ArticleDOI
TL;DR: It is shown that a PEF-based adaptive subtraction of the estimated wavefield due to a secondary source provides an effective separation of the sources.
Abstract: The acquisition of n shots, more or less simultaneously, increases acquisition efficiency and collects a wider range of information for imaging and reservoir characterisation. Its success relies critically on the ability to separate the n shots from one recording. Stefani et al. (2007) showed that while some datasets may be easily separated, others are more difficult. Using the more difficult data example from Stefani et al. (loc. cit.), we show that a PEF-based adaptive subtraction (Spitz, 2007) of the estimated wavefield due to a secondary source provides an effective separation of the sources.
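The sketch below illustrates the general adaptive-subtraction idea with a single-channel least-squares matching filter; this is a deliberately simplified stand-in, not the PEF-based scheme of Spitz (2007) that the paper applies.

```python
# Estimate a short filter f that best shapes the modeled secondary-source
# wavefield to the recorded data, then subtract the shaped model.
import numpy as np

def adaptive_subtract(data: np.ndarray, model: np.ndarray, nf: int = 21) -> np.ndarray:
    n = data.size
    cols = []
    for k in range(nf):                  # build the convolution matrix of `model`
        c = np.roll(model, k)
        c[:k] = 0.0                      # remove the wrap-around of np.roll
        cols.append(c)
    A = np.stack(cols, axis=1)           # shape (n, nf)
    f, *_ = np.linalg.lstsq(A, data, rcond=None)
    return data - A @ f                  # residual: data minus matched model
```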

113 citations




Proceedings ArticleDOI
TL;DR: The classical iterative velocity update made of several iterations of RMO picking, pre-stack migration and velocity update can be replaced by a more efficient sequential approach involving a single preSDM and a single residual move-out (RMO) picking followed by a non-linear tomographic inversion.
Abstract: We present a fast-turnaround strategy for building depth velocity models from kinematic invariants. Our approach is based on the concept of kinematic invariants describing locally coherent events by their position and slopes in the unmigrated pre-stack domain. 3D slope tomography can be based on kinematic invariants that fully characterize the events in terms of positioning and focusing. Kinematic invariants offer a versatile tool for velocity model building as they can be derived from dip and move-out picks made either in pre-stack depth migrated (preSDM) or pre-stack time migrated (preSTM) domains, or even in the unmigrated domain. Since the invariants are in the unmigrated domain, they only need to be picked once. The classical iterative velocity update, consisting of several iterations of residual move-out (RMO) picking, pre-stack migration and velocity update, can thus be replaced by a more efficient sequential approach involving a single preSDM and a single RMO picking followed by a non-linear tomographic inversion, provided the quality of the initial preSDM is appropriate for automated volumetric picking.


Proceedings ArticleDOI
Robert Soubaras, Yu Zhang
TL;DR: A new way of solving the two-way wave equation called the two-step Explicit Marching method, which is based on a high-order differential operator and allows arbitrarily large time steps with guaranteed numerical stability and minimized dispersion.
Abstract: We describe in this paper a new way of solving the two-way wave equation called the two-step Explicit Marching method. Compared to conventional explicit finite-difference algorithms, which can be second or fourth order but are subject to stability conditions and dispersion problems that limit the magnitude of the time steps used to propagate the wavefields, the proposed method is based on a high-order differential operator and allows arbitrarily large time steps with guaranteed numerical stability and minimized dispersion. Synthetic and real data examples show that it allows reverse-time migration to be performed with the Nyquist time step, based on the maximum frequency of the input data, which is the maximum time step that can be used for proper imaging.
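For scale (illustrative numbers, not from the paper), the Nyquist time step is set only by the maximum frequency of the data:

$$\Delta t_{\mathrm{Nyq}} = \frac{1}{2 f_{\max}},\qquad \text{e.g. } f_{\max}=60\ \text{Hz}\ \Rightarrow\ \Delta t_{\mathrm{Nyq}} \approx 8.3\ \text{ms},$$

whereas explicit finite-difference schemes are typically held to far smaller steps (often around a millisecond or less on fine grids) by their stability condition.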

Proceedings ArticleDOI
TL;DR: The f-x EMD method as discussed by the authors is equivalent to an autoadaptive f-k filter with a frequency-dependent, high-wavenumber cut filtering property, and can be applied to entire data sets without user interaction.
Abstract: We have devised a new filtering technique for random and coherent noise attenuation in seismic data by applying empirical mode decomposition (EMD) on constant-frequency slices in the frequency-offset (f-x) domain and removing the first intrinsic mode function. The motivation behind this development is to overcome the potential low performance of f-x deconvolution for signal-to-noise enhancement when processing highly complex geologic sections, data acquired using irregular trace spacing, and/or data contaminated with steeply dipping coherent noise. The resulting f-x EMD method is equivalent to an autoadaptive f-k filter with a frequency-dependent, high-wavenumber cut filtering property. Removing both random and steeply dipping coherent noise in either prestack or stacked/migrated sections is useful and compares well with other noise-reduction methods, such as f-x deconvolution, median filtering, and local singular value decomposition. In its simplest implementation, f-x EMD is parameter-free and can be applied to entire data sets without user interaction.

Proceedings ArticleDOI
TL;DR: The principle of blended acquisition is summarized and it is shown how to process blended data; two processing routes can be followed: reconstructing the unblended data followed by conventional processing, or directly processing the blended measurements.
Abstract: Seismic acquisition surveys are designed such that the time interval between shots is sufficiently large to prevent the tail of the previous source response from interfering with the next one (zero overlap in time). To economize on survey time and processing effort, the current compromise is to keep the number of shots to some acceptable minimum. The result is that the source domain is poorly sampled. In this paper it is proposed to abandon the condition of non-overlapping shot records. Instead, a plea is made to move to densely sampled and wide-azimuth source distributions with relatively small time intervals between shots (‘blended acquisition’). The underlying rationale is that interpolating missing shot records, i.e., generating data that have not been recorded (an aliasing problem), is much harder than separating the data of overlapping shot records (an interference problem). In this paper we summarize the principle of blended acquisition and show how to process blended data. Two processing routes can be followed: reconstructing the unblended data (‘deblending’) followed by conventional processing, or directly processing the blended measurements. Both approaches will be described and illustrated with numerical examples. A theoretical framework is presented that enables the design of blended 3D seismic surveys.
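In generic notation (ours, not necessarily the authors'), a blended record is the delayed sum of the individual shot records, and deblending is the corresponding inverse problem:

$$d_{\mathrm{bl}}(t) = \sum_{i=1}^{n} d_i\!\left(t - \tau_i\right)
\quad\Longleftrightarrow\quad
\mathbf{d}_{\mathrm{bl}} = \boldsymbol{\Gamma}\,\mathbf{d},
\qquad
\hat{\mathbf{d}} = \arg\min_{\mathbf{d}} \left\lVert \mathbf{d}_{\mathrm{bl}} - \boldsymbol{\Gamma}\,\mathbf{d} \right\rVert^2 + R(\mathbf{d}),$$

where $\tau_i$ are the firing times, $\boldsymbol{\Gamma}$ is the blending operator, and $R$ is a regularization term (e.g. coherency in the common-receiver domain) that makes the underdetermined separation tractable; the alternative route processes $\mathbf{d}_{\mathrm{bl}}$ directly.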

Proceedings ArticleDOI
TL;DR: Mineral exploration using Z-TEM (Z-Axis Tipper Electromagnetics), an airborne AFMAG system, has been guided by numerical modeling of the target types, as mentioned in this paper.
Abstract: Summary Mineral exploration using the Z-TEM or Z-Axis Tipper Electromagnetics, an airborne AFMAG system, has been guided by numerical modeling of the target types. Numerical modeling is used to plan the survey in terms of survey line spacing, survey height, and expected signatures of the targets. Knowing the Z-TEM response of the type of deposit being explored for aids in the interpretation of the results. Numerical modeling has demonstrated that the Z-TEM system is ideally suited for large, deep deposits of low to high resistivity contrasts, such as porphyry copper and SEDEX deposits.

PatentDOI
TL;DR: In this paper, a correlation window is selected for each of the plurality of seismic signals, and each correlation window has a selected time interval including an arrival time of the at least one seismic event in each seismic signal.
Abstract: A method for determining the presence of seismic events in seismic signals includes determining the presence of at least one seismic event in seismic signals corresponding to each of a plurality of seismic sensors. A correlation window is selected for each of the plurality of seismic signals. Each correlation window has a selected time interval including an arrival time of the at least one seismic event in each seismic signal. Each window is correlated with the respective seismic signal between a first selected time and a second selected time. The presence of at least one other seismic event in the seismic signals is determined from a result of the correlating.
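An illustrative reading of the correlation step (not the patent's implementation): a window around a detected arrival is slid along the signal, and samples where the normalized correlation is high are flagged as other events of similar waveform.

```python
import numpy as np

def detect_by_correlation(signal: np.ndarray, window: np.ndarray,
                          threshold: float = 0.7) -> np.ndarray:
    """Return sample indices where the (approximately normalized)
    cross-correlation of `window` with `signal` exceeds `threshold`."""
    w = window - window.mean()
    w /= np.linalg.norm(w) + 1e-12
    corr = np.correlate(signal, w, mode="valid")
    local_energy = np.sqrt(np.convolve(signal ** 2, np.ones(w.size),
                                       mode="valid")) + 1e-12
    score = corr / local_energy
    return np.flatnonzero(score > threshold)
```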


Proceedings ArticleDOI
TL;DR: In this article, a Kirchhoff approach was proposed to compute frequency-dependent traveltimes for imaging geologic structures below a viscoacoustic overburden, where it is important to account for amplitude dimming, frequency loss, and phase distortion.
Abstract: Summary When imaging geologic structures below a viscoacoustic overburden, it is important to account for amplitude dimming, frequency loss, and phase distortion. Several viscoacoustic wave-equation-based methods have been developed to solve this problem. We present a Kirchhoff approach. The technical significance of this approach is that we have found an efficient way to compute frequency-dependent traveltimes, which are crucial in the Kirchhoff integral. The validity of our technology is illustrated by data examples.
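For context, the standard constant-Q relations that make traveltimes effectively frequency dependent (a textbook Futterman-type formulation, not necessarily the paper's exact scheme) are

$$A(f) \propto \exp\!\left(-\pi f\, t^{*}\right),\qquad
t^{*} = \int_{\text{ray}} \frac{dt}{Q},\qquad
\frac{1}{c(f)} \approx \frac{1}{c(f_0)}\left[1 - \frac{1}{\pi Q}\ln\frac{f}{f_0}\right],$$

where the logarithmic dispersion term makes the phase velocity, and hence the traveltime along the ray, vary with frequency $f$ relative to a reference frequency $f_0$.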


Proceedings ArticleDOI
Yaxun Tang
TL;DR: The proposed method modifies the original explicit Hessian formula, enabling efficient computation of the operator, and a particular advantage of this method is that it reduces or eliminates storage of Green’s functions on the hard disk.
Abstract: I demonstrate a method for computing wave-equation Hessian operators, also known as resolution functions or point-spread functions, under the Born approximation. The proposed method modifies the original explicit Hessian formula, enabling efficient computation of the operator. A particular advantage of this method is that it reduces or eliminates storage of Green’s functions on the hard disk. The modifications, however, also introduce undesired crosstalk artifacts. I introduce two different phase-encoding schemes, namely, plane-wave phase encoding and random phase encoding, to suppress the crosstalk. I applied the Hessian operator obtained by using random phase encoding to the Sigsbee2A synthetic data set, obtaining a better subsalt image with higher resolution.
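The phase-encoding idea can be written generically (our notation, not necessarily the paper's): instead of keeping every source-side Green's function $G(\mathbf{x},\mathbf{x}_s)$, one stores a single encoded sum,

$$\tilde G(\mathbf{x}) = \sum_{s} e^{i\phi_s}\,G(\mathbf{x},\mathbf{x}_s),
\qquad
\tilde G(\mathbf{x})\,\tilde G^{*}(\mathbf{y})
= \sum_{s} G(\mathbf{x},\mathbf{x}_s)\,G^{*}(\mathbf{y},\mathbf{x}_s)
\;+\; \sum_{s\neq s'} e^{i(\phi_s-\phi_{s'})}\,G(\mathbf{x},\mathbf{x}_s)\,G^{*}(\mathbf{y},\mathbf{x}_{s'}),$$

so the desired sum over sources comes from one stored field at the cost of the second (crosstalk) term, which the random or plane-wave phase choices are designed to suppress.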

Proceedings ArticleDOI
Antonio Pica, Laurie Delmas
TL;DR: In this paper, model-based surface-related multiple modeling (3D SRMM) is achieved by the use of pre-stack demigration algorithms, thus avoiding the constraints on the shot-position distribution required by the data-based methods.
Abstract: Summary Model-based surface-related multiple modeling (3D SRMM) can be achieved by the use of pre-stack demigration algorithms, thus avoiding the constraints on the shot-position distribution required by the data-based methods. In the following, we show a new method for modeling internal multiples in 3D by using a model-based technique. This method follows a parallel flow with respect to those employed by the data-based multiple modeling techniques, and allows for the construction of internal multiple events produced between upper layers reflecting the energy downward and lower layers reflecting the energy upward. The method has been applied to Wide Azimuth Towed Streamer data from the Gulf of Mexico.

Proceedings ArticleDOI
Zvi Koren, Igor Ravve, Evgeny Ragoza, Allon Bartana, Dan Kosloff
TL;DR: In this paper, a seismic imaging system for generating and extracting high-resolution information about subsurface angle dependent reflectivity, with simultaneous emphasis on both continuous structural surfaces and discontinuous objects, such as faults and small-scale fractures, is presented.
Abstract: This work presents a new seismic imaging system for generating and extracting high-resolution information about subsurface angle-dependent reflectivity, with simultaneous emphasis on both continuous structural surfaces and discontinuous objects, such as faults and small-scale fractures. The system enables full-azimuth, angle-dependent seismic imaging using reflection data recorded through seismic acquisition surveys, especially wide-azimuth and long-offset data. Geometrical attributes, such as dip-azimuth and continuity of the local reflecting surfaces, can be automatically extracted directly from the full-azimuth angle gathers. Azimuthal anisotropy can be detected, leading to an accurate anisotropy model representation.

Proceedings ArticleDOI
TL;DR: In this paper, a local nonlinear filter is proposed to construct a coherent noise model in a localized time-space window and performs the noise attenuation by adaptively subtracting the noise model from the input data.
Abstract: A new method uses eigenimages to construct a coherent noise model in a localized time-space window and performs noise attenuation by adaptively subtracting the noise model from the input data. Advantages of this method include minimal spatial-amplitude smearing, effective attenuation of various types of coherent noise, such as ground roll, air waves, and near-surface scattered energy, and good handling of both aliased and non-aliased noise. This new nonlinear filter significantly outperforms conventional techniques. We demonstrate the performance of this local nonlinear filter with real data examples.

Proceedings ArticleDOI
TL;DR: In this paper, the authors investigate the pros and cons of waveform inversion for each domain along with some discussion on their ability to be used for 3D surveys in the time or frequency domain.
Abstract: The Earth model parameters are essential to hydrocarbon exploration. In particular, the velocity representation of the subsurface permanently engages creative minds to find ways to derive more accurate fields. One of these tools is full waveform inversion. This compute-intensive tool uses acquired seismic data and forward modeling to obtain a velocity field in an iterative manner. Since the mid-eighties, the geophysical community has been devoting considerable research to waveform inversion. The inversion can be implemented in either the time or frequency domain. In this paper, we investigate the pros and cons for each domain along with some discussion on their ability to be used for 3D surveys.

Proceedings ArticleDOI
TL;DR: This work extends the prior work to VTI tomography, modifies the process of regularization optimization, and proposes an updated way for uncertainty and resolution quantification using the apparatus of eigendecomposition.
Abstract: Tomographic velocity model building has become an industry standard for depth migration. Anisotropy of the Earth challenges tomography because the inverse problem becomes severely ill-posed. Singular value decomposition (SVD) of tomographic operators or, similarly, eigendecomposition of the corresponding normal equations, are well known as a useful framework for analysis of the most significant dependencies between model and data. However, application of this approach in velocity model building has been limited, primarily because of the perception that it is computationally prohibitively expensive, especially for the anisotropic case. In this paper, we extend our prior work (Osypov et al., 2008) to VTI tomography, modify the process of regularization optimization, and propose an updated way for uncertainty and resolution quantification using the apparatus of eigendecomposition. We demonstrate the simultaneous tomographic estimation of VTI parameters on a real dataset. Our approach provides extra capabilities for regularization optimization and uncertainty analysis in anisotropic model parameter space which can be further translated into the structural uncertainty within the image.
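As one standard (damped-least-squares) reading of how eigendecomposition feeds resolution and uncertainty analysis, with the normal-equations operator $\mathbf{G}^{\mathsf T}\mathbf{G} = \mathbf{V}\boldsymbol{\Lambda}\mathbf{V}^{\mathsf T}$ and regularization $\epsilon$ (textbook notation, not necessarily the authors' exact formulation):

$$\mathbf{R} = \left(\mathbf{G}^{\mathsf T}\mathbf{G} + \epsilon\mathbf{I}\right)^{-1}\mathbf{G}^{\mathsf T}\mathbf{G}
= \mathbf{V}\,\mathrm{diag}\!\left(\frac{\lambda_j}{\lambda_j+\epsilon}\right)\mathbf{V}^{\mathsf T},
\qquad
\mathbf{C}_{\hat m} = \sigma_d^2\,\mathbf{V}\,\mathrm{diag}\!\left(\frac{\lambda_j}{(\lambda_j+\epsilon)^2}\right)\mathbf{V}^{\mathsf T},$$

so the leading eigenvectors identify the best-resolved combinations of VTI parameters, while directions with small $\lambda_j$ carry large uncertainty and are controlled mainly by the regularization.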


Proceedings ArticleDOI
TL;DR: In this paper, the authors calibrate the response of various seismic attributes to a well-understood fluvial system and show that seismic data has difficulty in distinguishing shale-filled channels from sand-filled channels.
Abstract: Detection of channels and their infill lithology has always posed a challenge for exploration geologists and geophysicists, and the Red Fork channels in the Anadarko Basin do not fall outside of this challenge. The goal of this study is to take a new look at seismic attributes given the considerable well control that has been acquired during the past decade. By using this well-understood reservoir as a natural laboratory, we calibrate the response of various attributes to a well-understood fluvial system. The extensive drilling program shows that seismic data has difficulty in distinguishing shale-filled channels from sand-filled channels, where the ultimate exploration goal is to find sand-filled channels. Furthermore, the drill bit has encountered many seismically ‘invisible’ channels that are of economic value. Since the original work done in 1998, both seismic attributes and seismic geomorphology have undergone rapid advancement. The findings of this work will be applicable to nearby active areas as well as other intervals in the area that exhibit the same challenge, such as the Springer channels.

Proceedings ArticleDOI
TL;DR: In this paper, a high-frequency wavefield is time-stepped with no loss of frequency content and with a much larger time step than is commonly used; the technique is adapted to variable velocity using a localized Fourier transform (Gabor transform).
Abstract: As a result of the numerical performance of finite-difference operators, reverse-time migration (RTM) images are typically low frequency. We consider an alternative to wavefield propagation with finite differences: a high-fidelity time-stepping equation based on the Fourier transform, which is exact for homogeneous media if an aliasing condition is met. The technique is adapted to variable velocity using a localized Fourier transform (Gabor transform). The feasibility of using the time-stepping equation for RTM is demonstrated by studying its stability properties and its impulse response, and by migrating a synthetic example of a salt dome. We show that a high-frequency wavefield can be time-stepped with no loss of frequency content and with a much larger time step than is commonly used.
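A minimal sketch of the exact constant-velocity two-step recursion that such Fourier time stepping builds on (2-D, absorbing boundaries and the Gabor-based variable-velocity adaptation omitted; names are ours, not the authors'):

```python
# For constant velocity v the wave equation gives, exactly,
#   u(t+dt) + u(t-dt) = 2 * IFFT[ cos(v*|k|*dt) * FFT[u(t)] ],
# so arbitrarily large dt remains stable as long as the wavefield is not aliased.
import numpy as np

def fourier_time_step(u_now: np.ndarray, u_prev: np.ndarray,
                      v: float, dt: float, dx: float) -> np.ndarray:
    ny, nx = u_now.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=dx)
    kmag = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    propagator = 2.0 * np.cos(v * kmag * dt)          # exact for constant v
    u_next = np.real(np.fft.ifft2(propagator * np.fft.fft2(u_now))) - u_prev
    return u_next
```

The Gabor-transform adaptation described in the abstract replaces the single global cos(v|k|dt) multiplier with windowed, locally constant-velocity versions.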