Regularized nonstationary regression

Regularized nonstationary regression (Fomel, 2009) is based on the following simple model. Let $d(\mathbf{x})$ represent the data as a function of data coordinates $\mathbf{x}$, and $b_n(\mathbf{x})$, $n=1,2,\ldots,N$, represent a collection of basis functions. The goal of stationary regression is to estimate coefficients $a_n$, $n=1,2,\ldots,N$, such that the prediction error

\begin{displaymath}
e(\mathbf{x}) = d(\mathbf{x}) - \sum_{n=1}^{N} a_n\,b_n(\mathbf{x})
\end{displaymath} (1)

is minimized in the least-squares sense. In the case of regularized nonstationary regression (RNR), the coefficients are allowed to vary with $\mathbf{x}$,
\begin{displaymath}
\hat{e}(\mathbf{x}) = d(\mathbf{x}) - \sum_{n=1}^{N} \hat{a}_n(\mathbf{x})\,b_n(\mathbf{x})\;.
\end{displaymath} (2)
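
As a minimal sketch (not from the paper), the stationary problem of equation 1 reduces to ordinary least squares once the basis functions are stacked as columns of a matrix; in equation 2, by contrast, the coefficients must vary with $\mathbf{x}$. The signal and basis below are hypothetical:

\begin{verbatim}
# Stationary regression (equation 1): stack the basis functions b_n(x)
# as columns of B and solve B a ~= d in the least-squares sense.
# The data and basis here are made up for illustration.
import numpy as np

x = np.linspace(0.0, 1.0, 200)
B = np.column_stack([np.ones_like(x), x, np.sin(2 * np.pi * x)])  # b_n(x)
d = 0.5 + 2.0 * x + 0.3 * np.sin(2 * np.pi * x)                   # d(x)

a, *_ = np.linalg.lstsq(B, d, rcond=None)  # constant coefficients a_n
e = d - B @ a                              # prediction error of equation 1
\end{verbatim}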

The problem in this case is underdetermined but can be constrained by regularization (Engl et al., 1996). I use shaping regularization (Fomel, 2007) to impose explicit control on the resolution and variability of the regression coefficients. Shaping regularization applied to RNR amounts to linear inversion,
\begin{displaymath}
\mathbf{a} = \mathbf{M}^{-1}\,\mathbf{c}\;,
\end{displaymath} (3)

where $\mathbf{a}$ is a vector composed of $\hat{a}_n(\mathbf{x})$, the elements of vector $\mathbf{c}$ are
\begin{displaymath}
c_i(\mathbf{x}) = \mathbf{S}\left[b_i^{*}(\mathbf{x})\,d(\mathbf{x})\right]\;,
\end{displaymath} (4)

the elements of matrix $\mathbf{M}$ are
\begin{displaymath}
M_{ij}(\mathbf{x}) = \lambda^2\,\delta_{ij} +
\mathbf{S}\left[b_i^{*}(\mathbf{x})\,b_j(\mathbf{x}) -
\lambda^2\,\delta_{ij}\right]\;,
\end{displaymath} (5)

$\lambda$ is a scaling coefficient, and $\mathbf{S}$ represents a shaping (typically smoothing) operator. When the inversion in equation 3 is implemented by an iterative method, such as conjugate gradients, strong smoothing makes $\mathbf{M}$ close to the identity and easier to invert (requiring fewer iterations), whereas weaker smoothing slows down the inversion but allows for more detail in the solution. This intuitive behavior distinguishes shaping regularization from alternative methods (Fomel, 2009).
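
A direct, non-iterative sketch of equations 3-5 follows for real-valued 1-D data, with the shaping operator $\mathbf{S}$ taken as Gaussian smoothing and the system of equation 3 solved independently at each point; both choices are simplifying assumptions (the paper's approach solves equation 3 iteratively with conjugate gradients), and the function name rnr and its parameters are hypothetical:

\begin{verbatim}
import numpy as np
from scipy.ndimage import gaussian_filter1d

def rnr(d, basis, lam=1.0, sigma=20.0):
    """Regularized nonstationary regression for real 1-D data.

    d     : data d(x), shape (nx,)
    basis : basis functions b_n(x), shape (N, nx)
    Returns the nonstationary coefficients a_n(x), shape (N, nx).
    """
    smooth = lambda f: gaussian_filter1d(f, sigma)  # shaping operator S
    N, nx = basis.shape
    # c_i(x) = S[ b_i(x) d(x) ]                               (equation 4)
    c = np.array([smooth(b * d) for b in basis])
    # M_ij(x) = lam^2 d_ij + S[ b_i(x) b_j(x) - lam^2 d_ij ]  (equation 5)
    M = np.empty((N, N, nx))
    for i in range(N):
        for j in range(N):
            delta = lam**2 if i == j else 0.0
            M[i, j] = delta + smooth(basis[i] * basis[j] - delta)
    # a(x) = M(x)^{-1} c(x), solved at every x                (equation 3)
    a = np.linalg.solve(M.transpose(2, 0, 1), c.T[:, :, None])
    return a[:, :, 0].T
\end{verbatim}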

Regularized nonstationary autoregression (RNAR) corresponds to the case in which the basis functions are causal translations of the input data itself. In 1D, with $\mathbf{x}=t$, this condition implies $b_n(t) = d(t-n\,\Delta t)$.
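
Under the same assumptions, the RNAR basis in 1D can be built from causally shifted copies of the data and fed to the hypothetical rnr helper sketched above:

\begin{verbatim}
# b_n(t) = d(t - n dt): each basis function is a causal shift of d,
# padded with zeros before the start of the record.
import numpy as np

def rnar_basis(d, N):
    basis = np.zeros((N, d.size))
    for n in range(1, N + 1):
        basis[n - 1, n:] = d[:-n]
    return basis

# usage on a toy nonstationary signal (a chirp)
t = np.arange(500)
d = np.sin(0.2 * t * (1.0 + 0.001 * t))
a = rnr(d, rnar_basis(d, N=2), lam=0.1, sigma=30.0)
\end{verbatim}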

