Friday, March 27. 2015

Fast 3D velocity scan
A new paper is added to the collection of reproducible documents:
A fast algorithm for 3D azimuthally anisotropic velocity scan

Conventional velocity scan can be computationally expensive for large-size seismic data, particularly when the presence of anisotropy requires multiparameter estimation. We introduce a fast algorithm for 3D azimuthally anisotropic velocity scan, which is a generalization of the previously proposed 2D butterfly algorithm for the hyperbolic Radon transform. To compute the semblance in a two-parameter residual-moveout domain, the numerical complexity of our algorithm is roughly O(N^3 log N), as opposed to O(N^5) for the straightforward velocity scan, with N being representative of the number of points in either dimension of the data space or the parameter space. We provide both synthetic and field-data examples to illustrate the efficiency and accuracy of the algorithm.

Multiple suppression using PEF
Another old paper is added to the collection of reproducible documents:
Multiple suppression using prediction-error filter

I present an approach to multiple suppression that is based on the moveout difference between primary and multiple events in the CMP gather. After normal-moveout correction, primary events will be horizontal, whereas multiple events will not be. For each NMO-corrected CMP gather, I reorder the offsets in random order. Ideally, this process has little influence on the primaries, but it destroys the shape of the multiples. In other words, after randomization of the offset order, the multiples appear as random noise. This "man-made" random noise can be removed using a prediction-error filter (PEF). The randomization of the offset order can be regarded as a random process, so we can apply it to the CMP gather many times and obtain many different samples. All the samples can be arranged into a 3D cube, which is further divided into many small subcubes. A 3D PEF can then be estimated from each subcube and reapplied to it to remove the multiple energy. After that, all the samples are averaged back into one CMP gather, which should be free of multiple events. To improve the efficiency of estimating the PEF for each subcube, all subcubes except the first, which starts with a zero-valued initial guess, take the last estimated PEF as an initial guess. The iteration count can therefore be reduced to one step for all subsequent subcubes with little loss of accuracy. Three examples demonstrate the performance of this new approach, especially in removing the near-offset multiples.

FWI on GPU
A new paper is added to the collection of reproducible documents:
A graphics processing unit implementation of time-domain full-waveform inversion

The graphics processing unit (GPU) has become a popular device for seismic imaging and inversion due to its superior speedup performance. In this paper we implement GPU-based full-waveform inversion (FWI) using the wavefield-reconstruction strategy. Because computation on the GPU is much faster than CPU-GPU data communication, in our implementation the boundaries of the forward modeling are saved on the device to avoid transferring data between host and device. The Clayton-Engquist absorbing boundary is adopted to maintain the efficiency of GPU computation. A hybrid nonlinear conjugate-gradient algorithm combined with a parallel reduction scheme is utilized to do the computation in GPU blocks. The numerical results confirm the validity of our implementation.

Thursday, March 26. 2015

Tutorial on phase and the Hilbert transform
The example in rsf/tutorials/hilbert reproduces the tutorial from Steve Purves on phase and the Hilbert transform. The tutorial was published in the October 2014 issue of The Leading Edge.
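For readers who want a feel for the attribute computations the tutorial covers, here is a minimal sketch of instantaneous amplitude and phase via the analytic signal, using NumPy and SciPy rather than the Madagascar programs the tutorial itself employs; the synthetic trace is an illustrative choice, not data from the tutorial:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic "seismic trace": a decaying 25 Hz oscillation
t = np.linspace(0.0, 1.0, 501)
trace = np.exp(-3.0 * t) * np.cos(2.0 * np.pi * 25.0 * t)

# Analytic signal: trace + i * (Hilbert transform of trace)
analytic = hilbert(trace)

envelope = np.abs(analytic)             # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))   # instantaneous phase (radians)
```

The envelope is the modulus of the analytic signal, so it bounds the trace from above everywhere, which is what makes it useful as a smooth amplitude attribute.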
Madagascar users are encouraged to try improving the results.

Antialiasing in Kirchhoff migration
Another old paper is added to the collection of reproducible documents:
When is antialiasing needed in Kirchhoff migration?

We present criteria to determine when numerical integration of seismic data will incur operator aliasing. Although there are many ways to handle operator aliasing, they add expense to the computational task. This is especially true in three dimensions. A two-dimensional Kirchhoff migration example illustrates that the image zone of interest may not always require antialiasing and that considerable cost may be spared by not incorporating it.

Wednesday, March 25. 2015

Stratigraphic coordinates
A new paper is added to the collection of reproducible documents:
Stratigraphic coordinates, a coordinate system tailored to seismic interpretation

In certain seismic data processing and interpretation tasks, such as spiking deconvolution, tuning analysis, impedance inversion, and spectral decomposition, it is commonly assumed that the vertical direction is normal to reflectors. This assumption is false in the case of dipping layers and may therefore lead to inaccurate results. To overcome this limitation, we propose a coordinate system in which the geometry follows the shape of each reflector and the vertical direction corresponds to the direction normal to reflectors. We call this coordinate system stratigraphic coordinates. We develop a constructive algorithm that transfers seismic images into the stratigraphic coordinate system. The algorithm consists of two steps. First, local slopes of seismic events are estimated by plane-wave destruction; then structural information is spread along the estimated local slopes, and horizons are picked everywhere in the seismic volume by the predictive-painting algorithm. These picked horizons represent level sets of the first axis of the stratigraphic coordinate system. Next, an upwind finite-difference scheme is used to find the two other axes, which are perpendicular to the first axis, by solving the appropriate gradient equations. After seismic data are transformed into stratigraphic coordinates, seismic horizons should appear flat, and seismic traces should represent the direction normal to the reflectors. Immediate applications of the stratigraphic coordinate system are in seismic image flattening and spectral decomposition. Synthetic and real data examples demonstrate the effectiveness of stratigraphic coordinates.

Diffraction imaging of carbonate reservoirs
A new paper is added to the collection of reproducible documents:
Carbonate reservoir characterization using seismic diffraction imaging

Although extremely prolific worldwide, carbonate reservoirs are challenging to characterize using traditional seismic reflection imaging techniques. We use computational experiments with synthetic models to demonstrate the potential of seismic diffraction imaging to overcome common obstacles associated with seismic reflection imaging and to aid interpreters of carbonate systems. Diffraction imaging improves the horizontal resolution of individual voids in a karst reservoir model and the identification of heterogeneous regions below the resolution of reflections in a reservoir-scale model.

Signal and noise orthogonalization
A new paper is added to the collection of reproducible documents:
Random noise attenuation using local signal-and-noise orthogonalization

We propose a novel approach to attenuate random noise based on local signal-and-noise orthogonalization. In this approach, we first remove noise using one of the conventional denoising operators, and then apply a weighting operator to the initially denoised section in order to predict the signal-leakage energy and retrieve it from the initial noise section. The weighting operator is obtained by solving a least-squares minimization problem via shaping regularization with a smoothness constraint. Next, the initially denoised section and the retrieved signal are combined to form the final denoised section. The proposed denoising approach corresponds to orthogonalizing the initially denoised signal and noise in a local manner. We evaluate denoising performance by using local similarity. In order to test the orthogonalization property of the estimated signal and noise, we calculate the local similarity map between the denoised signal section and the removed noise section. Low values of local similarity indicate good orthogonalization and thus good denoising performance. Synthetic and field data examples demonstrate the effectiveness of the proposed approach in applications to noise attenuation for both conventional and simultaneous-source seismic data.

Tuesday, March 24. 2015

wxPython
There are many different libraries for building GUIs (graphical user interfaces), many of them with Python bindings: PyGTK, PyQt, PySide, etc. Tkinter is one of the oldest Python GUI libraries and is considered to be the standard one. Another popular choice is wxPython, a Python interface to the wxWidgets C++ library.
A quick example of wxPython is provided in wxvpconvert, a silly GUI for Madagascar's vpconvert script. Compare with tkvpconvert.

Sunday, March 22. 2015

Passive seismic imaging
Another old paper is added to the collection of reproducible documents:
Passive seismic imaging applied to synthetic data

It can be shown that for a 1D Earth model illuminated by random plane waves from below, the cross-correlation of noise traces recorded at two points on the surface is the same as what would be recorded if one location contained a shot and the other a receiver. If this is true for real data, it could provide a way of building `pseudo-reflection seismograms' from background noise, which could then be processed and used for imaging. This conjecture is tested on synthetic data from simple 1D and point-diffractor models, and in all cases the kinematics of the observed events appear to be correct. The signal-to-noise ratio was found to increase as the square root of T, where T is the length of the time series. The number of incident plane waves does not directly affect the signal-to-noise ratio; however, each plane wave contributes only its own slowness to the common-shot domain, so that if complete hyperbolas are to be imaged then upcoming waves must be incident from all angles.

Tuesday, March 10. 2015

AVO of methane hydrates
Another old paper is added to the collection of reproducible documents:
Seismic AVO analysis of methane hydrate structures

Marine seismic data from the Blake Outer Ridge offshore Florida show strong ``bottom simulating reflections'' (BSR) associated with methane hydrate occurrence in deep marine sediments. We use a detailed amplitude-versus-offset (AVO) analysis of these data to explore the validity of models that might explain the origin of the bottom simulating reflector. After careful preprocessing steps, we determine a BSR model that can successfully reproduce the observed AVO responses. The P- and S-velocity behavior predicted by the forward modeling is further investigated by estimating the P- and S-impedance contrasts at all subsurface positions. Our results indicate that the Blake Outer Ridge BSR is compatible with a model of methane hydrate in sediment overlying a layer of free methane gas-saturated sediment. The hydrate-bearing sediments appear to be characterized by a high P-wave velocity of approximately 2.5 km/s, an anomalously low S-wave velocity of approximately 0.5 km/s, and a thickness of around 190 meters. The underlying gas-saturated sediments have a P-wave velocity of 1.6 km/s, an S-wave velocity of 1.1 km/s, and a thickness of approximately 250 meters.

Monday, March 9. 2015

Tutorial on tuning and AVO
The example in rsf/tutorials/tuning reproduces the tutorial from Wes Hamlyn on thin-bed tuning and AVO analysis in seismic interpretation. The tutorial was published in the December 2014 issue of The Leading Edge.
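The core of thin-bed tuning can be reproduced in a few lines: convolve a Ricker wavelet with a two-spike, opposite-polarity reflectivity wedge and watch the composite amplitude peak at the tuning thickness, below which the two reflections interfere rather than resolve. This is a minimal NumPy sketch under illustrative assumptions (25 Hz wavelet, 1 ms sampling), not code from the tutorial:

```python
import numpy as np

def ricker(t, f):
    """Ricker wavelet with peak frequency f (Hz)."""
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt = 0.001                       # 1 ms sampling
t = np.arange(-0.1, 0.1, dt)
wav = ricker(t, 25.0)            # 25 Hz Ricker wavelet

# Wedge: top reflector (+1) fixed, base reflector (-1) moves deeper
thicknesses = np.arange(1, 41)   # wedge thickness in samples (1-40 ms)
peak_amp = []
for n in thicknesses:
    refl = np.zeros(400)
    refl[100] = 1.0              # top of wedge
    refl[100 + n] = -1.0         # base of wedge
    synth = np.convolve(refl, wav, mode="same")
    peak_amp.append(synth.max())

peak_amp = np.array(peak_amp)
# Tuning thickness: where constructive interference of the two
# reflections makes the composite amplitude largest
tuning_samples = thicknesses[peak_amp.argmax()]
```

For a 25 Hz Ricker wavelet the amplitude maximum lands around the peak-to-trough separation of the wavelet (roughly 15-16 ms here), and the composite amplitude there exceeds that of either reflection alone.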
Madagascar users are encouraged to try improving the results.

Tuesday, March 3. 2015

CiSE Paper on Madagascar Community
The paper Reproducible Research as a Community Effort: Lessons from the Madagascar Project was published in the January/February 2015 issue of Computing in Science and Engineering, a special issue on Scientific Software Communities.
Reproducible research is the discipline of attaching software code and data to publications, which enables the reader to reproduce, verify, and extend published computational experiments. Instead of being the responsibility of an individual author, computational reproducibility should become the responsibility of open-source scientific-software communities. A dedicated community effort can keep a body of computational research alive by actively maintaining its reproducibility. The Madagascar open-source software project offers an example of such a community.

Sunday, March 1. 2015

Program of the month: sfhistogram
sfhistogram computes a histogram of the distribution of values in the input dataset.
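In Madagascar, the histogram axis is a regular grid, so the output carries the usual n1/o1/d1 axis sampling. A rough NumPy illustration of this kind of computation (the parameter names mirror sfhistogram's axis convention, but this is a sketch, not the program's source):

```python
import numpy as np

def histogram(data, n1, o1, d1):
    """Count input values in n1 bins starting at o1 with spacing d1,
    mimicking a regular-grid output axis. Values outside the axis
    range are discarded."""
    ibin = np.floor((data.ravel() - o1) / d1).astype(int)
    counts = np.zeros(n1, dtype=int)
    inside = (ibin >= 0) & (ibin < n1)
    np.add.at(counts, ibin[inside], 1)
    return counts

# Histogram of normally distributed random noise
rng = np.random.default_rng(2015)
noise = rng.standard_normal(100000)
hist = histogram(noise, n1=61, o1=-3.0, d1=0.1)
```

For standard normal input, nearly all counts fall inside the [-3, 3.1) axis and the largest bin sits near zero, which is the shape the sfnoise example below plots.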
The following example from rsf/rsf/sfnoise plots the histogram of normally distributed random noise. The output of sfhistogram contains integer values arranged in a one-dimensional array. The sampling is specified by the n1=, d1=, and o1= parameters.

Saturday, January 31. 2015

Acoustic staggered grid in IWAVE
A new paper is added to the collection of reproducible documents:
Acoustic staggered grid modeling in IWAVE

IWAVE is a framework for time-domain regular-grid finite-difference and finite-element methods. The IWAVE package includes source code for infrastructure components and implementations of several wave-physics modeling categories. This paper presents two sets of examples using IWAVE acoustic staggered-grid modeling. The first set illustrates the effectiveness of a simple version of perfectly matched layer (PML) absorbing boundary conditions. The second set reproduces illustrations from a recent paper on error propagation for heterogeneous-medium simulation using finite differences and demonstrates the interface error effect, which renders all FD methods effectively first-order accurate. The source code for these examples is packaged with the paper source and supports the user in duplicating the results presented here and in using IWAVE in other settings.