**Uncertainty Quantification Working Group**

January 22, 2004, 11:30 AM, CNLS Conf. Room, TA-3, Bldg. 1690

## Tutorial on the Kalman filter and the extended Kalman filter - Part I

John M. Finn, T-15

(slides)

### Abstract

In this first of two lectures on Kalman Filtering, I will
introduce the concepts by means of very simple examples. My first
approach will be to apply least squares to find the optimal
estimate and its covariance matrix for several examples with Gaussian
noise. These examples include sampling statistics and the random
walk, cases with measurement noise and dynamical noise, respectively.
I will then show how the least-squares results can be put in recursive
Kalman Filter form. This form has the advantage that an update to the
estimate and its covariance matrix can be made in terms of only the
previous estimate, the previous covariance matrix, and the new data;
I will discuss how this form is useful in control and guidance. I will
also show how the same results can be obtained by finding the estimate
with minimum variance.
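The recursive form described above can be sketched in the simplest setting: estimating a constant from measurements corrupted by Gaussian noise. This is a minimal illustration of my own, not an example taken from the talk; the noise variance `R` and the synthetic data are arbitrary choices. With no dynamical noise, the recursive update reproduces the batch least-squares answer, the sample mean, and the covariance shrinks as `R/k`:

```python
import random

# Scalar Kalman filter estimating a constant x from noisy measurements
# z_k = x + v_k, v_k ~ N(0, R).  Illustrative sketch (values not from the
# talk): with no dynamical noise, the recursive estimate reduces to the
# running sample mean, and the covariance to R/k after k measurements.

def kalman_constant(measurements, R):
    """Recursive update: each new estimate uses only the previous
    estimate, the previous covariance, and the new measurement."""
    x_hat, P = measurements[0], R      # initialize from the first measurement
    for z in measurements[1:]:
        K = P / (P + R)                # Kalman gain
        x_hat = x_hat + K * (z - x_hat)  # correct with the innovation
        P = (1.0 - K) * P              # updated covariance
    return x_hat, P

random.seed(0)
x_true, R = 2.0, 0.5
zs = [x_true + random.gauss(0.0, R ** 0.5) for _ in range(100)]
x_hat, P = kalman_constant(zs, R)
sample_mean = sum(zs) / len(zs)
# The recursive result agrees with the batch least-squares (sample mean)
# answer, and P equals R divided by the number of measurements.
```

Note that each pass through the loop discards the old measurement entirely; this is what makes the form attractive for real-time control and guidance, where data arrive sequentially.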

In the second lecture I will show extensions of these basic ideas.
These will include 1) generalization to an n-th order linear
stochastic system (i.e., one with dynamical noise) plus
measurement noise, 2) treatment in terms of conditional probabilities,
and 3) the nonlinear or Extended Kalman Filter.
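As a preview of point 3), the extended Kalman filter handles a nonlinear model by linearizing about the current estimate at each step. The one-dimensional sketch below is my own illustrative choice, not one of the lecture's examples: nonlinear dynamics `x_{k+1} = x_k + dt*sin(x_k)` with dynamical noise, observed through a linear measurement with noise.

```python
import math
import random

# One-dimensional extended Kalman filter sketch.  Illustrative model
# (not from the talk): x_{k+1} = x_k + dt*sin(x_k) + w_k with dynamical
# noise w ~ N(0, Q), and measurements z_k = x_k + v_k with v ~ N(0, R).

def ekf_step(x_hat, P, z, dt, Q, R):
    # Predict: propagate the estimate through f, and the covariance
    # through the Jacobian F = f'(x) evaluated at the current estimate.
    F = 1.0 + dt * math.cos(x_hat)
    x_pred = x_hat + dt * math.sin(x_hat)
    P_pred = F * P * F + Q             # dynamical noise enters here
    # Update: here h(x) = x, so H = 1 and the correction is linear.
    K = P_pred / (P_pred + R)          # Kalman gain
    x_new = x_pred + K * (z - x_pred)  # correct with the innovation
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

random.seed(1)
dt, Q, R = 0.1, 0.01, 0.04
x_true, x_hat, P = 0.8, 0.5, 1.0
for _ in range(50):
    x_true = x_true + dt * math.sin(x_true) + random.gauss(0.0, Q ** 0.5)
    z = x_true + random.gauss(0.0, R ** 0.5)
    x_hat, P = ekf_step(x_hat, P, z, dt, Q, R)
```

The only change from the linear filter is that the dynamics matrix is replaced by the Jacobian re-evaluated at each step, which is why the EKF inherits the same recursive structure.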