
Least squares

For a comparatively low polynomial degree $|I_{\boldsymbol{N}}| < M$, the system (3.2) is over-determined, so that in general the given data $y_j \in \mathbb{C}$, $j = 0, \dots, M-1$, can only be approximated up to the residual $\boldsymbol{r} := \boldsymbol{y} - \boldsymbol{A} \hat{\boldsymbol{f}}$. One considers the weighted approximation problem

$\displaystyle \Vert \boldsymbol{y} - \boldsymbol{A} \hat{\boldsymbol{f}} \Vert_{\boldsymbol{W}} := \left( \sum_{j=0}^{M-1} w_j \vert y_j - f(\boldsymbol{x}_j) \vert^2 \right)^{1/2} \stackrel{\hat{\boldsymbol{f}}}{\rightarrow} \min,$

which may incorporate weights $w_j > 0$, $\boldsymbol{W} := \mathrm{diag}(w_j)_{j=0,\dots,M-1}$, to compensate for clusters in the sampling set ${\cal X}$. This least squares problem is equivalent to the weighted normal equations of the first kind

$\displaystyle \boldsymbol{A}^{\mathsf{H}} \boldsymbol{W} \boldsymbol{A} \hat{\boldsymbol{f}} = \boldsymbol{A}^{\mathsf{H}} \boldsymbol{W} \boldsymbol{y},$ (3.3)

where $\boldsymbol{A}^{\mathsf{H}}$ denotes the conjugate transpose of $\boldsymbol{A}$.
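As a concrete illustration, the following minimal numpy sketch sets up a small one-dimensional instance and solves the weighted normal equations (3.3) directly; the node distribution, uniform weights, and problem sizes are arbitrary choices of this sketch, and the dense matrix only stands in for the fast, matrix-free treatment discussed below.

\begin{verbatim}
import numpy as np

# Small 1-d instance: M samples, |I_N| = N unknowns, N < M (over-determined).
M, N = 64, 16
rng = np.random.default_rng(0)
x = rng.uniform(-0.5, 0.5, M)                # nonequispaced nodes x_j
k = np.arange(-N // 2, N // 2)               # frequency index set I_N
A = np.exp(-2j * np.pi * np.outer(x, k))     # A = (exp(-2 pi i k x_j))_{j,k}
y = rng.standard_normal(M) + 1j * rng.standard_normal(M)
w = np.ones(M)                               # weights w_j > 0 (uniform here)
W = np.diag(w)

# Weighted normal equations of the first kind, eq. (3.3):
#   A^H W A fhat = A^H W y
AH = A.conj().T
fhat = np.linalg.solve(AH @ W @ A, AH @ W @ y)

# Weighted residual norm ||y - A fhat||_W
r = y - A @ fhat
print(np.sqrt(np.sum(w * np.abs(r) ** 2)))
\end{verbatim}

For realistic problem sizes one would not form $\boldsymbol{A}$ explicitly; the products $\boldsymbol{A} \hat{\boldsymbol{f}}$ and $\boldsymbol{A}^{\mathsf{H}} \boldsymbol{y}$ would instead be carried out by an NFFT and an adjoint NFFT, respectively.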

Applying the Landweber iteration (also known as the Richardson iteration, ...), the steepest descent method, or the conjugate gradient scheme to (3.3) yields Algorithms 5-7 below.
[Algorithm 5: Landweber. Output: $\hat{\boldsymbol{f}}_l$.]
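The listing for Algorithm 5 is truncated in this extraction; as a rough substitute, the sketch below implements the textbook Landweber (Richardson) iteration applied to (3.3), not necessarily the exact steps of the original listing. The function name, the zero initial guess, and the fixed relaxation parameter alpha are choices of this sketch, and dense products again stand in for NFFT calls.

\begin{verbatim}
import numpy as np

def landweber(A, w, y, alpha, n_iter):
    # Landweber iteration for A^H W A fhat = A^H W y, W = diag(w).
    # Converges for 0 < alpha < 2 / ||A^H W A||_2 (standard condition).
    fhat = np.zeros(A.shape[1], dtype=complex)   # fhat_0 = 0
    r = y - A @ fhat                             # r_0 = y - A fhat_0
    for _ in range(n_iter):
        zhat = A.conj().T @ (w * r)              # zhat_l = A^H W r_l
        fhat = fhat + alpha * zhat               # fhat_{l+1} = fhat_l + alpha zhat_l
        r = y - A @ fhat                         # r_{l+1} = y - A fhat_{l+1}
    return fhat
\end{verbatim}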


[Algorithm 6: Steepest descent. Output: $\hat{\boldsymbol{f}}_l$.]
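Algorithm 6 is likewise truncated; as a substitute, this sketch shows the textbook steepest descent method for (3.3), using the step size that is exact along the gradient direction. Names and the zero initial guess are again hypothetical choices of the sketch.

\begin{verbatim}
import numpy as np

def steepest_descent(A, w, y, n_iter):
    # Steepest descent for A^H W A fhat = A^H W y, W = diag(w).
    fhat = np.zeros(A.shape[1], dtype=complex)
    r = y - A @ fhat
    for _ in range(n_iter):
        zhat = A.conj().T @ (w * r)          # negative gradient direction
        v = A @ zhat
        # step size minimizing the weighted residual along zhat
        alpha = np.vdot(zhat, zhat).real / np.vdot(v, w * v).real
        fhat = fhat + alpha * zhat
        r = r - alpha * v                    # maintains r = y - A fhat
    return fhat
\end{verbatim}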


[Algorithm 7: Conjugate gradient. Output: $\hat{\boldsymbol{f}}_l$.]
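For the truncated Algorithm 7, the sketch below uses conjugate gradients on the normal equations (the classical CGNR scheme), which is one standard way to apply CG to (3.3); the variant in the original listing may differ in details.

\begin{verbatim}
import numpy as np

def cgnr(A, w, y, n_iter):
    # Conjugate gradients on the normal equations (CGNR):
    # solves A^H W A fhat = A^H W y with W = diag(w).
    fhat = np.zeros(A.shape[1], dtype=complex)
    r = y - A @ fhat                     # residual in sample space
    zhat = A.conj().T @ (w * r)          # residual of the normal equations
    phat = zhat.copy()                   # search direction
    z_norm2 = np.vdot(zhat, zhat).real
    for _ in range(n_iter):
        v = A @ phat
        alpha = z_norm2 / np.vdot(v, w * v).real
        fhat = fhat + alpha * phat
        r = r - alpha * v
        zhat = A.conj().T @ (w * r)
        z_norm2_new = np.vdot(zhat, zhat).real
        beta = z_norm2_new / z_norm2
        phat = zhat + beta * phat
        z_norm2 = z_norm2_new
    return fhat
\end{verbatim}

Calling, e.g., cgnr(A, w, y, 20) on the small instance above drives the weighted residual toward its least squares minimum; in practice the conjugate gradient scheme typically needs far fewer iterations than the Landweber iteration for comparable accuracy.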


