A numerical tour of wave propagation

Next: Conjugate gradient (CG) implementation Up: Full waveform inversion (FWI) Previous: Full waveform inversion (FWI)

## The Newton, Gauss-Newton, and steepest-descent methods

In terms of Eq. (64),

$$\nabla E_{\mathbf{m}} = \frac{\partial E(\mathbf{m})}{\partial \mathbf{m}} = \frac{1}{2}\frac{\partial}{\partial \mathbf{m}}\left(\Delta\mathbf{d}^{\dagger}\Delta\mathbf{d}\right). \tag{71}$$

That is to say,

$$\nabla E_{\mathbf{m}} = \mathrm{Re}\left[\mathbf{J}^{T}\Delta\mathbf{d}^{*}\right], \tag{72}$$

where $\mathrm{Re}[\cdot]$ takes the real part, and $\mathbf{J}=\partial\mathbf{d}_{cal}/\partial\mathbf{m}$ is the Jacobian matrix, i.e., the sensitivity or the Fréchet derivative matrix.

Differentiation of the gradient expression (71) with respect to the model parameters gives the following expression for the Hessian $\mathbf{H}$:

$$\frac{\partial^{2}E}{\partial m_{i}\,\partial m_{j}} = \mathrm{Re}\left[\sum_{k}\frac{\partial d_{k}}{\partial m_{i}}\left(\frac{\partial d_{k}}{\partial m_{j}}\right)^{*} + \sum_{k}\frac{\partial^{2}d_{k}}{\partial m_{i}\,\partial m_{j}}\,\Delta d_{k}^{*}\right]. \tag{73}$$

In matrix form

$$\mathbf{H} = \frac{\partial^{2}E}{\partial\mathbf{m}^{2}} = \mathrm{Re}\left[\mathbf{J}^{T}\mathbf{J}^{*}\right] + \mathrm{Re}\left[\frac{\partial\mathbf{J}^{T}}{\partial\mathbf{m}^{T}}\left(\Delta\mathbf{d}^{*}\;\Delta\mathbf{d}^{*}\;\cdots\;\Delta\mathbf{d}^{*}\right)\right]. \tag{74}$$

In many cases, the second-order term of Eq. (74) is neglected for nonlinear inverse problems. In the following, the remaining term in the Hessian, i.e., $\mathbf{H}_{a}=\mathrm{Re}\left[\mathbf{J}^{T}\mathbf{J}^{*}\right]$, is referred to as the approximate Hessian. It is the auto-correlation of the derivative wavefield. Eq. (68) becomes

$$\Delta\mathbf{m} = -\mathbf{H}_{a}^{-1}\nabla E_{\mathbf{m}} = -\mathrm{Re}\left[\mathbf{J}^{T}\mathbf{J}^{*}\right]^{-1}\mathrm{Re}\left[\mathbf{J}^{T}\Delta\mathbf{d}^{*}\right]. \tag{75}$$

The method which solves equation (74) when only the approximate Hessian $\mathbf{H}_{a}=\mathrm{Re}\left[\mathbf{J}^{T}\mathbf{J}^{*}\right]$ is estimated is referred to as the Gauss-Newton method. To guarantee the stability of the algorithm (avoiding the singularity), we can use $\mathbf{H}_{a}+\varepsilon\mathbf{I}$, leading to

$$\Delta\mathbf{m} = -\left(\mathbf{H}_{a}+\varepsilon\mathbf{I}\right)^{-1}\mathrm{Re}\left[\mathbf{J}^{T}\Delta\mathbf{d}^{*}\right]. \tag{76}$$
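To make the damped Gauss-Newton update of Eq. (76) concrete, here is a minimal NumPy sketch on a hypothetical single-parameter toy model $d(m)=e^{i\omega m}$ standing in for the wave-equation solver; the model, the frequencies `w`, and the damping value are illustrative choices, not part of the tutorial:

```python
import numpy as np

# Toy frequency-domain forward model d(m) = exp(1j * w * m) for a few
# "frequencies" w -- a hypothetical stand-in for the wave-equation solver.
w = np.array([1.0, 2.0, 3.0])

def forward(m):
    return np.exp(1j * w * m[0])           # complex "data", one parameter m

def jacobian(m):
    # J[j, i] = partial d_j / partial m_i  (here a 3x1 matrix)
    return (1j * w * np.exp(1j * w * m[0])).reshape(-1, 1)

m_true = np.array([0.5])
d_obs = forward(m_true)

m = np.array([0.3])                        # starting model
eps = 1e-6                                 # damping, as in Eq. (76)
for _ in range(20):
    dres = forward(m) - d_obs              # data residual  Delta d
    J = jacobian(m)
    grad = np.real(J.T @ np.conj(dres))    # Eq. (72): Re[J^T Delta d*]
    Ha = np.real(J.T @ np.conj(J))         # approximate Hessian Re[J^T J*]
    dm = -np.linalg.solve(Ha + eps * np.eye(1), grad)   # Eq. (76)
    m = m + dm

print(m)   # converges to m_true = 0.5
```

For a zero-residual problem like this one, the neglected second-order term of the Hessian vanishes at the solution, so the Gauss-Newton iteration converges essentially as fast as full Newton.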

Alternatively, the inverse of the Hessian in Eq. (68) can be replaced by $\alpha\mathbf{I}$, where $\alpha$ is a scalar step length, leading to the gradient or steepest-descent method:

$$\Delta\mathbf{m} = -\alpha\,\nabla E_{\mathbf{m}} = -\alpha\,\mathrm{Re}\left[\mathbf{J}^{T}\Delta\mathbf{d}^{*}\right]. \tag{77}$$
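The steepest-descent update of Eq. (77) can be sketched the same way on a hypothetical toy model $d(m)=e^{i\omega m}$; the fixed trial step `alpha` is an illustrative choice (in practice it is estimated, as in the next paragraph):

```python
import numpy as np

# Steepest descent, Eq. (77), on a toy complex forward model
# d(m) = exp(1j*w*m); alpha is a fixed trial step (hypothetical choice).
w = np.array([1.0, 2.0, 3.0])
forward = lambda m: np.exp(1j * w * m)
jac = lambda m: (1j * w * np.exp(1j * w * m)).reshape(-1, 1)

d_obs = forward(0.5)                   # "observed" data for m_true = 0.5
m, alpha = 0.3, 0.05
for _ in range(200):
    dres = forward(m) - d_obs          # data residual  Delta d
    grad = np.real(jac(m).T @ np.conj(dres))[0]   # Eq. (72)
    m = m - alpha * grad               # Eq. (77): dm = -alpha * grad

print(m)   # approaches m_true = 0.5
```

Unlike the Gauss-Newton step, convergence here is only linear and depends strongly on the choice of `alpha`, which motivates the step-length estimation below.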

At the $k$-th iteration, the misfit function can be presented using the 2nd-order Taylor-Lagrange expansion

$$E(\mathbf{m}_{k}+\alpha_{k}\Delta\mathbf{m}_{k}) = E(\mathbf{m}_{k}) + \alpha_{k}\,\nabla E_{k}^{T}\Delta\mathbf{m}_{k} + \frac{\alpha_{k}^{2}}{2}\Delta\mathbf{m}_{k}^{T}\mathbf{H}_{k}\Delta\mathbf{m}_{k} + O(\alpha_{k}^{3}). \tag{78}$$

Setting $\partial E(\mathbf{m}_{k}+\alpha_{k}\Delta\mathbf{m}_{k})/\partial\alpha_{k}=0$ gives

$$\alpha_{k} = -\frac{\nabla E_{k}^{T}\Delta\mathbf{m}_{k}}{\Delta\mathbf{m}_{k}^{T}\mathbf{H}_{k}\Delta\mathbf{m}_{k}}. \tag{79}$$
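The step-length formula of Eq. (79) is exact when the misfit is quadratic. A small NumPy check on a hypothetical quadratic misfit with a known Hessian (all values illustrative):

```python
import numpy as np

# Quadratic misfit E(m) = 1/2 (m - m_true)^T H (m - m_true), whose exact
# Hessian H is known, to check the line-search formula of Eq. (79).
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])
m_true = np.array([1.0, -2.0])

def E(m):
    r = m - m_true
    return 0.5 * r @ H @ r

def gradE(m):
    return H @ (m - m_true)

m = np.array([0.0, 0.0])
d = -gradE(m)                            # steepest-descent direction
alpha = -(gradE(m) @ d) / (d @ H @ d)    # Eq. (79)

# For a quadratic misfit, alpha minimizes E(m + a*d) exactly along d:
vals = [E(m + a * d) for a in (0.5 * alpha, alpha, 1.5 * alpha)]
print(vals[1] <= vals[0] and vals[1] <= vals[2])   # True
```

For a nonlinear misfit, Eq. (79) instead gives the minimizer of the local quadratic model (78), so it is an estimate of the step length rather than an exact one.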


2021-08-31