Regression analysis

Let us first review classic stationary regression theory. Let $d(t)$ be a time series; it can be represented in terms of the basis functions $b_n(t)\,(n=1,2,\cdots,N)$ under the least-squares criterion:

$\displaystyle \min\parallel d(t)-\sum_{n=1}^{N}a_nb_n(t)\parallel_2^2,$ (1)

where $a_n$ is the $n$th regression coefficient and $\parallel \cdot \parallel_2^2$ denotes the squared $L_2$ norm of a function. In the non-stationary case, the regression coefficients vary with time, which can be expressed as:
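As a minimal illustration of the stationary case in equation 1, the following sketch (with synthetic data and basis functions chosen here for demonstration, not taken from the paper) fits constant coefficients $a_n$ by ordinary least squares:

```python
import numpy as np

# Synthetic example: d(t) is a linear combination of two basis
# functions b_1(t), b_2(t) plus a little noise.
t = np.linspace(0.0, 1.0, 200)
basis = np.stack([np.sin(2 * np.pi * t),
                  np.cos(2 * np.pi * t)], axis=1)   # columns are b_n(t)
true_a = np.array([1.5, -0.7])                      # "unknown" coefficients
rng = np.random.default_rng(0)
d = basis @ true_a + 0.05 * rng.standard_normal(t.size)

# Solve min || d(t) - sum_n a_n b_n(t) ||_2^2 for constant a_n.
a, *_ = np.linalg.lstsq(basis, d, rcond=None)
print(a)  # close to [1.5, -0.7]
```

Because the coefficients are constants, the problem is overdetermined (200 samples, 2 unknowns) and has a unique least-squares solution.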

$\displaystyle \min\parallel d(t)-\sum_{n=1}^{N}a_n(t)b_n(t)\parallel_2^2.$ (2)

The minimization of equation 2 is ill-posed because there are more unknowns than data: each time sample provides one equation but $N$ unknown coefficient values. In the theory of SDRNAR, Fomel (2013) used shaping regularization to constrain equation 2.
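A minimal sketch of how shaping regularization resolves this ill-posedness, assuming a single basis function $b(t)$ and a simple fixed-point iteration with edge-normalized triangle smoothing as the shaping operator $S$ (the smoothing radius, step size, and iteration count are illustrative choices, not the paper's settings):

```python
import numpy as np

def smooth(x, radius):
    """Edge-normalized triangle smoothing, used as the shaping operator S."""
    box = np.ones(2 * radius + 1)
    num = np.convolve(x, box, mode="same")
    den = np.convolve(np.ones_like(x), box, mode="same")
    y = num / den                       # one box pass, normalized at edges
    num = np.convolve(y, box, mode="same")
    return num / den                    # second pass makes it a triangle

# Synthetic non-stationary example: d(t) = a(t) b(t) + noise,
# where a(t) varies slowly compared with b(t).
t = np.linspace(0.0, 1.0, 400)
b = np.sin(8 * np.pi * t)                    # single basis function b(t)
a_true = 1.0 + 0.5 * np.sin(2 * np.pi * t)   # slowly varying coefficient
rng = np.random.default_rng(1)
d = a_true * b + 0.02 * rng.standard_normal(t.size)

# Shaping-regularized fixed-point iteration:
#   a <- S[a + lam * b * (d - a * b)],
# where smoothing by S constrains a(t) to vary slowly, making the
# otherwise underdetermined pointwise estimate well-posed.
a = np.zeros_like(t)
lam = 1.0 / np.max(b * b)
for _ in range(200):
    a = smooth(a + lam * b * (d - a * b), radius=25)

print(np.abs(a - a_true).mean())  # small average error
```

Without the smoothing step, each sample where $b(t)\approx 0$ leaves $a(t)$ undetermined; the shaping operator fills those gaps by enforcing smoothness of the coefficient.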