Friday, April 17, 2015
Seislet-based MCA
A new paper is added to the collection of reproducible documents:
Seislet-based morphological component analysis using scale-dependent exponential shrinkage

Morphological component analysis (MCA) is a powerful tool used in image processing to separate different geometrical components (cartoons and textures, curves and points, etc.). MCA is based on the observation that many complex signals may not be sparsely represented using only one dictionary/transform, but can have a sparse representation when several overcomplete dictionaries/transforms are combined. In this paper, we propose seislet-based MCA for seismic data processing. The MCA algorithm is reformulated in the shaping-regularization framework. Successful seislet-based MCA depends on reliable slope estimation of seismic events, which is done by plane-wave destruction (PWD) filters. An exponential shrinkage operator unifies many existing thresholding operators and is adopted in scale-dependent shaping regularization to promote sparsity. Numerical examples demonstrate the superior performance of the proposed exponential shrinkage operator and the potential of seislet-based MCA in applications to trace interpolation and multiple removal.

Thursday, April 16, 2015
Tutorial on image resolution
The example in rsf/tutorials/images reproduces the tutorial from Matt Hall on playing with image resolution. For more explanation, see Matt's blog post R is for Resolution.
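The tutorial's subject invites a quick back-of-the-envelope calculation. The formulas below are standard textbook resolution limits (quarter-wavelength tuning and the first Fresnel zone radius), offered only as a sketch of the topic; they are not necessarily the measures Matt's tutorial uses:

```python
import math

def vertical_resolution(velocity_mps, frequency_hz):
    """Classical quarter-wavelength tuning limit (a textbook rule of thumb)."""
    return velocity_mps / frequency_hz / 4.0

def fresnel_radius(velocity_mps, frequency_hz, depth_m):
    """First Fresnel zone radius for unmigrated data: r = sqrt(lambda * z / 2)."""
    wavelength = velocity_mps / frequency_hz
    return math.sqrt(wavelength * depth_m / 2.0)

# 2500 m/s sediment, 30 Hz dominant frequency, 2000 m depth
print(vertical_resolution(2500, 30))   # ~20.8 m vertical limit
print(fresnel_radius(2500, 30, 2000))  # ~289 m horizontal limit before migration
```

The contrast between the two numbers is the usual motivation for migration, which collapses the Fresnel zone toward the vertical limit.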
Madagascar users are encouraged to try improving the results.

Monday, April 13, 2015
madagascar-1.7 released
The 1.7 stable release features 21 new reproducible papers and multiple other enhancements including improved tools for parallel computing developed at the Second Working Workshop.
According to the SourceForge statistics, the previous 1.5 stable distribution has been downloaded more than 4,000 times. The top country (with 24% of all downloads) was the USA, followed by China, Colombia, Germany, and Brazil. According to Openhub.net (last updated in January 2015), the year 2014 was a period of high development activity, with 33 contributors making 1,876 commits to the repository (up 16% from the previous year). Openhub.net says that Madagascar "has a well established, mature codebase maintained by a very large development team with stable year-over-year commits" and estimates 239 man-years of effort (an estimated development cost of $13 million).

Thursday, April 2, 2015
How to make your research irreproducible
Yesterday (April 1, 2015), a group of computer scientists from the UK (Neil Chue Hong, Tom Crick, Ian Gent, and Lars Kotthoff) announced a seminal paper, Top Tips to Make Your Research Irreproducible.
Here are the tips that the authors share:

These tips will undoubtedly be embraced by all scientists trying to make their research irreproducible. The paper ends with an important conjecture: "We make a simple conjecture: an experiment that is irreproducible is exactly equivalent to an experiment that was never carried out at all. The happy consequences of this conjecture for experts in irreproducibility will be published elsewhere, with extremely impressive experimental support."

Friday, March 27, 2015
Fast 3D velocity scan
A new paper is added to the collection of reproducible documents:
A fast algorithm for 3D azimuthally anisotropic velocity scan

Conventional velocity scan can be computationally expensive for large-size seismic data, particularly when the presence of anisotropy requires multiparameter estimation. We introduce a fast algorithm for 3D azimuthally anisotropic velocity scan, which is a generalization of the previously proposed 2D butterfly algorithm for hyperbolic Radon transform. To compute the semblance in a two-parameter residual moveout domain, the numerical complexity of our algorithm is roughly O(N^3 log N), as opposed to O(N^5) for the straightforward velocity scan, with N being representative of the number of points in either dimension of the data space or parameter space. We provide both synthetic and field-data examples to illustrate the efficiency and accuracy of the algorithm.

Multiple suppression using PEF
Another old paper is added to the collection of reproducible documents:
Multiple suppression using prediction-error filter

I present an approach to multiple suppression that is based on the moveout between primary and multiple events in the CMP gather. After normal moveout correction, primary events will be horizontal, whereas multiple events will not be. For each NMO-corrected CMP gather, I reorder the offsets in random order. Ideally, this process has little influence on the primaries, but it destroys the shape of the multiples. In other words, after randomization of the offset order, the multiples appear as random noise. This "man-made" random noise can be removed using a prediction-error filter (PEF). The randomization of the offset order can be regarded as a random process, so we can apply it to the CMP gather many times and obtain many different samples. All the samples can be arranged into a 3D cube, which is further divided into many small subcubes. A 3D PEF can then be estimated from each subcube and reapplied to it to remove the multiple energy. After that, all the samples are averaged back into one CMP gather, which should be free of multiple events. To improve the efficiency of estimating the PEF for each subcube, all subcubes except the first (which starts with a zero-valued initial guess) take the last estimated PEF as an initial guess. The iteration count can therefore be reduced to one step for all the subsequent subcubes with little loss of accuracy. Three examples demonstrate the performance of this new approach, especially in removing the near-offset multiples.

FWI on GPU
A new paper is added to the collection of reproducible documents:
A graphics processing unit implementation of time-domain full-waveform inversion

The graphics processing unit (GPU) has become a popular device for seismic imaging and inversion due to its superior speedup performance. In this paper, we implement GPU-based full-waveform inversion (FWI) using the wavefield-reconstruction strategy. Because computation on the GPU is much faster than CPU-GPU data communication, in our implementation the boundaries of the forward modeling are saved on the device to avert the issue of data transfer between host and device. The Clayton-Engquist absorbing boundary is adopted to maintain the efficiency of GPU computation. A hybrid nonlinear conjugate-gradient algorithm combined with a parallel reduction scheme is utilized to do the computation in GPU blocks. The numerical results confirm the validity of our implementation.

Thursday, March 26, 2015
Tutorial on phase and the Hilbert transform
The example in rsf/tutorials/hilbert reproduces the tutorial from Steve Purves on phase and the Hilbert transform. The tutorial was published in the October 2014 issue of The Leading Edge.
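The tutorial's central tool can be sketched compactly. The snippet below computes the analytic signal, whose magnitude is the envelope and whose argument is the instantaneous phase; a plain O(n^2) DFT is used for self-containment (real code, like the tutorial, would use an FFT-based routine such as scipy.signal.hilbert):

```python
import cmath
import math

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert transform:
    take the DFT, zero the negative frequencies (doubling the positive
    ones), and transform back."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n / 2:
            X[k] *= 2          # double positive frequencies
        elif k > n / 2:
            X[k] = 0           # remove negative frequencies
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

# For a pure cosine the envelope is ~1 everywhere and the instantaneous
# phase advances linearly -- the attributes the tutorial plays with.
trace = [math.cos(2 * math.pi * 4 * t / 64) for t in range(64)]
env = [abs(z) for z in analytic_signal(trace)]
phase = [cmath.phase(z) for z in analytic_signal(trace)]
```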
Madagascar users are encouraged to try improving the results.

Antialiasing in Kirchhoff migration
Another old paper is added to the collection of reproducible documents:
When is antialiasing needed in Kirchhoff migration?

We present criteria to determine when numerical integration of seismic data will incur operator aliasing. Although there are many ways to handle operator aliasing, they add expense to the computational task. This is especially true in three dimensions. A two-dimensional Kirchhoff migration example illustrates that the image zone of interest may not always require antialiasing and that considerable cost may be spared by not incorporating it.

Wednesday, March 25, 2015
Stratigraphic coordinates
A new paper is added to the collection of reproducible documents:
Stratigraphic coordinates, a coordinate system tailored to seismic interpretation

In certain seismic data processing and interpretation tasks, such as spiking deconvolution, tuning analysis, impedance inversion, and spectral decomposition, it is commonly assumed that the vertical direction is normal to the reflectors. This assumption is false in the case of dipping layers and may therefore lead to inaccurate results. To overcome this limitation, we propose a coordinate system in which the geometry follows the shape of each reflector and the vertical direction corresponds to normal reflectivity. We call this coordinate system stratigraphic coordinates. We develop a constructive algorithm that transfers seismic images into the stratigraphic coordinate system. The algorithm consists of two steps. First, local slopes of seismic events are estimated by plane-wave destruction; structural information is then spread along the estimated local slopes, and horizons are picked everywhere in the seismic volume by the predictive-painting algorithm. These picked horizons represent level sets of the first axis of the stratigraphic coordinate system. Next, an upwind finite-difference scheme is used to find the two other axes, which are perpendicular to the first axis, by solving the appropriate gradient equations. After seismic data are transformed into stratigraphic coordinates, seismic horizons should appear flat, and seismic traces should represent the direction normal to the reflectors. Immediate applications of the stratigraphic coordinate system are in seismic image flattening and spectral decomposition. Synthetic and real data examples demonstrate the effectiveness of stratigraphic coordinates.

Diffraction imaging of carbonate reservoirs
A new paper is added to the collection of reproducible documents:
Carbonate reservoir characterization using seismic diffraction imaging

Although extremely prolific worldwide, carbonate reservoirs are challenging to characterize using traditional seismic reflection imaging techniques. We use computational experiments with synthetic models to demonstrate the potential of seismic diffraction imaging to overcome common obstacles associated with seismic reflection imaging and to aid interpreters of carbonate systems. Diffraction imaging improves the horizontal resolution of individual voids in a karst reservoir model and the identification of heterogeneous regions below the resolution of reflections in a reservoir-scale model.

Signal and noise orthogonalization
A new paper is added to the collection of reproducible documents:
Random noise attenuation using local signal-and-noise orthogonalization

We propose a novel approach to attenuating random noise based on local signal-and-noise orthogonalization. In this approach, we first remove noise using one of the conventional denoising operators, and then apply a weighting operator to the initially denoised section in order to predict the signal-leakage energy and retrieve it from the initial noise section. The weighting operator is obtained by solving a least-squares minimization problem via shaping regularization with a smoothness constraint. Next, the initially denoised section and the retrieved signal are combined to form the final denoised section. The proposed denoising approach corresponds to orthogonalizing the initially denoised signal and noise in a local manner. We evaluate the denoising performance by using local similarity. In order to test the orthogonalization property of the estimated signal and noise, we calculate the local similarity map between the denoised signal section and the removed noise section. Low values of local similarity indicate good orthogonalization and thus good denoising performance. Synthetic and field-data examples demonstrate the effectiveness of the proposed approach in applications to noise attenuation for both conventional and simultaneous-source seismic data.

Tuesday, March 24, 2015
wxPython
There are many different libraries for building graphical user interfaces (GUIs), many of them with Python bindings: PyGTK, PyQt, PySide, etc. Tkinter is one of the oldest Python GUI libraries and is considered the standard one. Another popular choice is wxPython, a Python interface to the wxWidgets C++ library.
A quick example of wxPython is provided in wxvpconvert, a silly GUI for Madagascar's vpconvert script. Compare with tkvpconvert.

Sunday, March 22, 2015
Passive seismic imaging
Another old paper is added to the collection of reproducible documents:
Passive seismic imaging applied to synthetic data

It can be shown that, for a 1D Earth model illuminated by random plane waves from below, the cross-correlation of noise traces recorded at two points on the surface is the same as what would be recorded if one location contained a shot and the other a receiver. If this is true for real data, it could provide a way of building "pseudo-reflection seismograms" from background noise, which could then be processed and used for imaging. This conjecture is tested on synthetic data from simple 1D and point-diffractor models, and in all cases the kinematics of the observed events appear to be correct. The signal-to-noise ratio was found to increase as the square root of T, where T is the length of the time series. The number of incident plane waves does not directly affect the signal-to-noise ratio; however, each plane wave contributes only its own slowness to the common-shot domain, so that if complete hyperbolas are to be imaged, then upcoming waves must be incident from all angles.

Tuesday, March 10, 2015
AVO of methane hydrates
Another old paper is added to the collection of reproducible documents:
Seismic AVO analysis of methane hydrate structures

Marine seismic data from the Blake Outer Ridge offshore Florida show strong "bottom-simulating reflections" (BSR) associated with methane hydrate occurrence in deep marine sediments. We use a detailed amplitude-versus-offset (AVO) analysis of these data to explore the validity of models that might explain the origin of the bottom-simulating reflector. After careful preprocessing steps, we determine a BSR model that can successfully reproduce the observed AVO responses. The P- and S-velocity behavior predicted by the forward modeling is further investigated by estimating the P- and S-impedance contrasts at all subsurface positions. Our results indicate that the Blake Outer Ridge BSR is compatible with a model of methane hydrate in sediment overlying a layer of free methane-gas-saturated sediment. The hydrate-bearing sediments appear to be characterized by a high P-wave velocity of approximately 2.5 km/s, an anomalously low S-wave velocity of approximately 0.5 km/s, and a thickness of around 190 meters. The underlying gas-saturated sediments have a P-wave velocity of 1.6 km/s, an S-wave velocity of 1.1 km/s, and a thickness of approximately 250 meters.
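The velocity contrast quoted above admits a quick normal-incidence sanity check. Densities are not given in the abstract, so the values in the snippet below are placeholder assumptions, chosen only to illustrate the strong negative polarity expected of a BSR (fast hydrate-bearing sediment over slow gas-charged sediment):

```python
def reflection_coefficient(v1, rho1, v2, rho2):
    """Normal-incidence P-wave reflection coefficient from acoustic impedances:
    R = (Z2 - Z1) / (Z2 + Z1), with Z = velocity * density."""
    z1, z2 = v1 * rho1, v2 * rho2
    return (z2 - z1) / (z2 + z1)

# Velocities (km/s) from the abstract; densities (g/cc) are assumed here.
r_bsr = reflection_coefficient(2.5, 2.0, 1.6, 1.9)  # hydrate layer over gas layer
```

With these assumed densities the coefficient comes out around -0.24, a strong reflection with reversed polarity relative to the seafloor, consistent with the BSR interpretation.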