Wednesday, November 19. 2014
Talitrus saltator
In the excellent reproducible science tutorial at SciPy2014, a reproducible data processing example involved segmenting the eye in an image of Talitrus saltator.
The example is reproduced, with modifications, in rsf/tutorials/talitrus. Madagascar users are encouraged to try improving the results.

Thursday, November 13. 2014
Madagascar school in Harbin
A Madagascar school will take place on January 7-8, 2015, in Harbin, China, and will be hosted by the Harbin Institute of Technology (HIT) in conjunction with the International Workshop on Mathematical Geophysics.
More information will be available soon on the school webpage.

Wednesday, November 12. 2014
Program of the month: sfthreshold
sfthreshold filters the input by soft thresholding (shrinkage).
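A minimal numpy sketch of the operation (the percentile rule for picking the threshold from pclip= is an assumption about the behavior, not sfthreshold's exact internals):

```python
import numpy as np

def soft_threshold(u, mu):
    # Shrink every value toward zero by mu; values within [-mu, mu] vanish
    return np.sign(u) * np.maximum(np.abs(u) - mu, 0.0)

def pclip_threshold(u, pclip=5.0):
    # Pick mu so that roughly pclip percent of |u| survives the thresholding
    return np.percentile(np.abs(u), 100.0 - pclip)
```

For example, `soft_threshold(np.array([3.0, -0.5, 1.0]), 1.0)` shrinks 3.0 to 2.0 and zeroes out the two small values.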
Soft thresholding is a point-by-point operation, which can be described mathematically as
$$T_{\mu}[u] = \begin{cases} u - \mu\,\operatorname{sign}(u) & \text{if}\ |u| > \mu \\ 0 & \text{if}\ |u| \le \mu \end{cases}$$
Soft thresholding was analyzed by Donoho (1995) and became particularly popular thanks to the iterative shrinkage-thresholding algorithm by Daubechies et al. (2004).

Donoho, D. L. (1995). De-noising by soft-thresholding. IEEE Transactions on Information Theory, 41(3), 613-627.

Daubechies, I., Defrise, M., & De Mol, C. (2004). An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Communications on Pure and Applied Mathematics, 57(11), 1413-1457.

The following example from tccs/seislet/lena shows an image (Seismic Lena) and its reconstruction after soft thresholding in the seislet domain using 5% thresholding (pclip=5). sfthreshold uses the percentage parameter pclip= to set the threshold at the corresponding quantile of the data values. To do soft or hard thresholding with a fixed threshold, use sfthr. An alternative thresholding-like operation is provided by sfsharpen.

10 previous programs of the month:

Seismic data analysis using SSWT
A new paper is added to the collection of reproducible documents:
Time-frequency analysis of seismic data using synchrosqueezing wavelet transform

Time-frequency (TF) decomposition is used for characterizing the nonstationary relation between time and instantaneous frequency, which is very important in the processing and interpretation of seismic data. Conventional time-frequency analysis approaches suffer from a tradeoff between time resolution and frequency resolution. A new time-frequency analysis approach is proposed based on the synchrosqueezing wavelet transform (SSWT). The SSWT is an empirical-mode-decomposition-like tool but uses a different approach to constructing the components. With the help of synchrosqueezing techniques, the SSWT can obtain noticeably higher time and frequency resolution. Synthetic examples show that SSWT-based TF analysis can accurately capture variable frequency components. Field data tests show the potential of the proposed approach in detecting anomalies of high-frequency attenuation and in detecting deep-layer weak signals.

Monday, November 10. 2014
Deblending using NMO median filtering
A new paper is added to the collection of reproducible documents:
Deblending using normal moveout and median filtering in common-midpoint gathers

The benefits of simultaneous-source acquisition are compromised by the challenge of dealing with intense blending noise. In this paper, we propose a processing workflow for blended data. The incoherence of blending noise in common-midpoint (CMP) gathers is exploited by applying median filtering along the spatial direction after normal-moveout (NMO) correction. The key step in the proposed workflow is obtaining the precise velocity estimate required by the subsequent NMO correction. Because of the intense blending noise, the velocity scan cannot be obtained in one step. We recursively polish both the deblended result and the velocity estimate: deblending using the updated velocity estimate, and velocity scanning using the updated deblended result. We use synthetic and field data examples to demonstrate the performance of the proposed approach. The migrated image of the deblended data is cleaner than that of the blended data and is similar to that of the unblended data.

Journals unite for reproducibility
Simultaneous editorials in Science and Nature state
Reproducibility, rigour, transparency and independent verification are cornerstones of the scientific method. Of course, just because a result is reproducible does not make it right, and just because it is not reproducible does not make it wrong. A transparent and rigorous approach, however, will almost always shine a light on issues of reproducibility. This light ensures that science moves forward, through independent verifications as well as the course corrections that come from refutations and the objective examination of the resulting data.

The editorials describe the Proposed Principles and Guidelines for Reporting Preclinical Research, developed this summer and endorsed by dozens of leading scientific journals publishing in the field of biomedical research. The guidelines focus on the issue of reproducibility of scientific experiments and include provisions for sharing data and software. Nature explains its software-sharing policy further in the following statement:

Nature and the Nature journals have decided that, given the diversity of practices in the disciplines we cover, we cannot insist on sharing computer code in all cases. But we can go further than we have in the past, by at least indicating when code is available. Accordingly, our policy now mandates that when code is central to reaching a paper’s conclusions, we require a statement describing whether that code is available and setting out any restrictions on accessibility. Editors will insist on availability where they consider it appropriate: any practical issues preventing code sharing will be evaluated by the editors, who reserve the right to decline a paper if important code is unavailable.

These changes in publication policies by the leading scientific journals may lead to a fundamental change in scientific standards for reproducibility of computational experiments in different fields. See also:
Sunday, November 9. 2014
Robust time-to-depth conversion
A new paper is added to the collection of reproducible documents:
A robust approach to time-to-depth conversion and interval velocity estimation from time migration in the presence of lateral velocity variations

The problem of converting time-migration velocity to interval velocity in depth in the presence of lateral velocity variations can be reduced to solving a system of partial differential equations. In this paper, we formulate the problem as a nonlinear least-squares optimization for seismic interval velocity and seek its solution iteratively. The input for the inversion is the Dix velocity, which also serves as an initial guess. The inversion gradually updates the interval velocity to account for lateral velocity variations that are neglected in the Dix inversion. The algorithm has a moderate cost thanks to regularization that speeds up convergence while ensuring a smooth output. The proposed method is numerically robust compared to previous approaches, which amount to extrapolating monotonically in depth. For a successful time-to-depth conversion, image-ray caustics should be either nonexistent or excluded from the computational domain. The resulting velocity can be used in subsequent depth-imaging model building. Both synthetic and field data examples demonstrate the applicability of the proposed approach.

Wednesday, October 22. 2014
Tutorial on parameter testing
The example in rsf/tutorials/parameters reproduces the tutorial from Matt Hall on parameter testing.
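One simple way to explore such a parameter in Madagascar is to loop over candidate values in the project SConstruct. The fragment below is only a sketch of that idea: the input name 'image' and the choice of sfsmooth with rect1=/rect2= radii are illustrative assumptions, not the tutorial's actual workflow.

```python
# SConstruct sketch (assumes a working Madagascar project)
from rsf.proj import *

# Loop over candidate smoothing radii and plot each result,
# so the frames can be compared side by side or as a movie
radii = [1, 2, 5, 10, 20]
for rect in radii:
    smooth = 'smooth-r%d' % rect
    Flow(smooth, 'image', 'smooth rect1=%d rect2=%d' % (rect, rect))
    Plot(smooth, 'grey title="rect=%d" ' % rect)
Plot('radius-movie', ['smooth-r%d' % r for r in radii], 'Movie')
```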
Madagascar users are encouraged to try improving the results. In his blog post and in the discussion that follows, Matt brings up an interesting question about finding the best way to select parameters. For lack of a better approach, parameter selection in seismic attributes remains an interactive game. In the Madagascar version, the key parameter for the Canny edge detector is the amount of prior anisotropic-diffusion smoothing, controlled by the smoothing radius (the rect= parameter). We can do different things with it: for example, make a movie of different images looping through different values of the radius, or, by exposing the parameter to the command-line SCons interface, build a simple GUI script for controlling it. The question posted by Matt still awaits a better answer. See also:

Saturday, October 18. 2014
Tutorial on colormaps
The example in rsf/tutorials/colormaps reproduces the tutorial from Matteo Niccoli on how to evaluate and compare color maps. The tutorial was published in the August 2014 issue of The Leading Edge.
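As a small illustration of how such a palette can be built and inspected, here is a numpy sketch of a red-white-black ramp; the anchor colors and their positions are guesses for illustration, not Madagascar's actual color tables.

```python
import numpy as np

def rwb_palette(n=256):
    # Piecewise-linear interpolation through red -> white -> black
    anchors = np.array([[1.0, 0.0, 0.0],   # red at position 0
                        [1.0, 1.0, 1.0],   # white at position 0.5
                        [0.0, 0.0, 0.0]])  # black at position 1
    pos = np.array([0.0, 0.5, 1.0])
    x = np.linspace(0.0, 1.0, n)
    # Interpolate each RGB channel independently
    return np.column_stack([np.interp(x, pos, anchors[:, c])
                            for c in range(3)])
```

Plotting the three channel curves of such a table against position is exactly the kind of evaluation the tutorial performs for perceptual comparison of colormaps.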
Madagascar users are encouraged to try improving the results. See also:
Several new color palettes have recently been added to Madagascar (thanks to Aaron Stanton): color=seismic (red-yellow-white-black, popular among seismic interpreters), color=owb (orange-white-black), and color=rwb (red-white-black).

Friday, October 17. 2014
Petition to raise awareness about the role of software in research

The Software Sustainability Institute in the UK has created an online petition addressed to "everyone in the research community", which states: "We must accept that software is fundamental to research, or we will lose our ability to make groundbreaking discoveries."

1. We want software to be treated as a valuable research object which merits the same level of investment and effort as any other aspect of the research infrastructure.

You can sign the petition at Change.org.

Wednesday, October 8. 2014
Program of the month: sfsigmoid
sfsigmoid generates a 2D synthetic reflectivity model, created by Jon Claerbout.
One of the first occurrences of this model is in the SEP-73 sponsor report from 1992, where it appeared in several papers.
The model was described as "a synthetic model that illustrates local variation in bedding. Notice dipping bedding, curved bedding, unconformity between them, and a fault in the curved bedding." Later, the sigmoid model made an appearance in Claerbout's book Basic Earth Imaging. The following example from bei/krch/sep73 illustrates the effect of aliasing on Kirchhoff modeling and migration. The model has appeared in numerous other tests. The following example from tccs/flat/flat shows automatic flattening of the sigmoid model by predictive painting. sfsigmoid has several parameters that control the model: the usual n1=, n2=, o1=, o2=, d1=, d2= parameters control the mesh size and sampling, taper= indicates whether to taper the sides of the model, and large= controls the length of the synthetic reflectivity series. The program takes no input.

10 previous programs of the month:

High-performance computing and open-source software
A recent Report on High Performance Computing by the US Secretary of Energy Advisory Board contains a bizarre section on open source software, which states
There has been very little open source that has made its way into broad use within the HPC commercial community, where great emphasis is placed on serviceability and security.

In his thoughtful blog post in response to this report, Will Schroeder, the CEO and co-founder of the legendary Kitware Inc., makes a number of strong points defending the role of open source in the past and future development of HPC. He concludes:

The basic point here is that issues of scale require us to remove inefficiencies in researching, deploying, funding, and commercializing technology, and to find ways to leverage the talents of the broader community. Open source is a vital, strategic tool to do this as has been borne out by the many OS software systems now being used in HPC application... It’s easy to overlook open source as a vital tool to accomplish this important goal, but in a similar way that open source Linux has revolutionized commercial computing, open source HPC software will carry us forward to meet the demands of increasingly complex computing systems.

See also Will Schroeder's presentation The New Scientific Publishers at SciPy2013.

Wednesday, September 24. 2014
Program of the month: sfmax1
sfmax1 finds local maxima along the first axis of the input. It takes floating-point input but outputs complex numbers, where the real part stands for the location of a local maximum and the imaginary part stands for the value of the input at that maximum.
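The parabolic refinement involved can be sketched as follows; this is a plausible reconstruction of the method rather than sfmax1's actual source code.

```python
import numpy as np

def local_max1(y, o=0.0, d=1.0):
    # Take the largest interior sample and refine its position by
    # fitting a parabola through it and its two neighbors
    i = int(np.argmax(y[1:-1])) + 1
    y0, y1, y2 = y[i - 1], y[i], y[i + 1]
    shift = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)  # vertex offset in samples
    peak = y1 - 0.25 * (y0 - y2) * shift            # parabola value at vertex
    return o + d * (i + shift), peak
```

For samples of sin(x) at x = 0, 1, 2, 3, 4, this sketch locates the maximum near 1.581 with a value near 0.9826.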
The number of maxima to output is controlled by the np= parameter. To restrict the range of the maxima locations (in case it is smaller than the full range of the data), use min= and max=. The output is sorted by value, so that the largest maxima appear first. Here is a quick example. Let us create some data:

bash$ sfmath n1=5 output="sin(x1)" > data.rsf

Observing the data values, we can suspect that a local maximum lies between 1 and 2.

bash$ < data.rsf sfmax1 np=1 | sfdisfil

sfmax1 uses local parabolic interpolation to locate the maximum at 1.581 with the value of 0.9826. In the following example, from tccs/flat/flat, sfmax1 is used to locate the strongest-amplitude horizons for predictive painting.

10 previous programs of the month:

Wednesday, August 20. 2014
Tutorial on data slicing
The example in rsf/tutorials/slicing reproduces the tutorial from Evan Bianco of simple data slicing.
Madagascar users are encouraged to try improving the results. See also:

Iterative deblending using shaping regularization
A new paper is added to the collection of reproducible documents:
Iterative deblending of simultaneous-source seismic data using seislet-domain shaping regularization

We introduce a novel iterative estimation scheme for the separation of blended seismic data from simultaneous sources. The scheme is based on an augmented estimation problem, which can be solved by iteratively constraining the deblended data using shaping regularization in the seislet domain. We formulate the forward modeling operator in the common-receiver domain, where two sources are assumed to be blended using a random time-shift dithering approach. The nonlinear shaping-regularization framework offers some freedom in designing a shaping operator to constrain the model in an underdetermined inverse problem. We design the backward operator and the shaping operator for the shaping-regularization framework. The backward operator can be optimally chosen as half of the identity operator in the two-source case, and the shaping operator can be chosen as a coherency-promoting operator. Three numerically blended synthetic datasets and one numerically blended field dataset demonstrate the high-performance deblending effect of the proposed iterative framework. Compared with alternative f-k domain thresholding and f-x predictive filtering, seislet-domain soft thresholding exhibits the most robust behavior.
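The iteration at the heart of shaping regularization can be written generically as m ← S[m + B(d − F m)]. Below is a toy numpy version with stand-in operators (identity modeling, a half-scaled backward operator, no-op shaping), meant only to show the structure of the loop, not the paper's seislet-domain deblending.

```python
import numpy as np

def shaping_iteration(d, F, B, S, niter=20):
    # Generic shaping-regularization loop: apply the backward operator B
    # to the data residual, then constrain the model with the shaping S
    m = np.zeros_like(d)
    for _ in range(niter):
        m = S(m + B(d - F(m)))
    return m

# Toy operators: identity forward, half-identity backward, trivial shaping;
# the iterates converge geometrically toward the data d
d = np.array([1.0, -2.0, 3.0])
m = shaping_iteration(d, F=lambda x: x, B=lambda r: 0.5 * r, S=lambda x: x)
```

In the paper's setting, F would be the blending operator, B roughly half its adjoint, and S a coherency-promoting operator such as seislet-domain soft thresholding.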