# Particle Filtering

AI and Robotics

Nov 24, 2013

Advanced Topics in Statistical Signal Processing

Particle Filtering

By Steffen Barembruch

Outline: Introduction, Bayesian Inference, Importance Sampling, Tracking, Summary


Particle Filtering: A Bayesian Approach


I. Introduction

Why Particle Filtering?

Particle Filtering is very powerful for nonlinear and non-Gaussian systems.

Used in sequential signal processing.

Wide range of applications.

Alternative to the (Extended) Kalman Filter.

Discrete Approximation of the probability distribution,
rather than (linear) approximation of the model.


What is Particle Filtering?

[Figure: schematic of Bayesian estimation — prior knowledge $P(\theta)$ over the parameter space, observed data $X$, and the resulting estimate $\hat{\theta}$.]


What is Particle Filtering?

Particle Filtering is based on Bayesian Inference.

Theory of conditional probabilities.

Goal: get information on the distribution of the random quantity $\theta$.

Final estimate: the probability distribution of $\theta$ given the data, i.e.

$P(\theta \mid y_{0:n})$, where $y_{0:n} := \{y_t,\ 0 \le t \le n\}$

Distribution approximations are computed with sequential importance sampling and sometimes with Bootstrap methods.


What is Particle Filtering?

Distribution approximated by discrete random measures
composed of particles (= samples from the state space).

Weights (= Probabilities) are assigned to the particles
computed with Bayesian Inference.

In sequential signal processing, the distribution of the signal $x_t$ is derived sequentially.

The distribution of $x_n$ is approximated with the help of the previously derived distribution of $x_{n-1}$. Given the measured values $y_{0:n}$ and the signals $x_{0:n-1}$ we get

$P(x_n \mid y_{0:n}, x_{0:n-1})$

Applications

Blind equalization for channels

Positioning

Tracking

Wireless communication

...


II. Bayesian Inference

Two main differences from classical statistics:

The quantity of interest is considered as a random variable; classically it is assumed to be deterministic.

Estimate is a probability distribution rather than a single
value.

The statistician must specify prior knowledge about the estimated quantity.

This Prior is subjective and may e.g. reflect scepticism
concerning a sample estimate.

More prior knowledge → less data needed for same
performance.


Difference to Bootstrap Methods

With Bootstrap methods, the distribution of the data is a priori completely unknown and is approximated from the sample.

In Bayesian theory it is necessary to know the class of distributions of the data (e.g. Gaussian, Binomial, Uniform, or any other class of distributions).

Inference is only made on the parameters of the distribution (e.g. the mean and the variance for a Gaussian distribution).

Posterior distribution gives information about how the
parameters are distributed, rather than the distribution of
the data.


The (discrete) Bayes' Theorem

Conditional probability of A given B:

$P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$

Bayes' Theorem:

$P(A \mid B) = \dfrac{P(B \mid A)\, P(A)}{P(B)}$

or equivalently

$P(A \mid B) \propto P(B \mid A)\, P(A)$
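As a numerical sanity check of these identities, a short sketch with made-up probabilities (the values 0.3, 0.8, 0.1 are purely illustrative):

```python
# Hypothetical two-event example checking Bayes' theorem numerically.
p_a = 0.3            # P(A)
p_b_given_a = 0.8    # P(B|A)
p_b_given_not_a = 0.1

# Total probability: P(B) = P(B|A) P(A) + P(B|not A) P(not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # 0.24 / 0.31 ≈ 0.7742
```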


Bayes' Theorem

Bayes' Theorem may be extended to the continuous case:

$P(\theta \mid X) = \dfrac{P(X \mid \theta)\, P(\theta)}{f(X)} \propto P(X \mid \theta)\, P(\theta)$

$P(\theta)$ is called the Prior (distribution). It reflects a priori knowledge.

$P(X \mid \theta)$ is the likelihood function of the data.

$P(\theta \mid X)$ is the Posterior (distribution): proportional to the product of the Prior and the likelihood of the data.

$f(X)$ is a normalization constant.

Deriving the Posterior

Before observing the data, all the knowledge is contained in the Prior.

After obtaining data, the Prior is updated with the information contained in the data.

If nothing is known a priori, a vague prior can be used, e.g. a Uniform distribution.


Example

Consider the stochastic process X(t) = A + e(t), where e(t) is Gaussian noise with mean 0 and known variance $\sigma^2$.

Inference shall be made on A.

Assume person 1 places a Gaussian prior $p_1(A)$ with mean -4 and person 2 chooses a Gaussian prior $p_2(A)$ with mean 4.

The posterior distribution $P(A \mid X_{0:n})$ is (with Gaussian likelihood and Gaussian Prior) also Gaussian.
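In this conjugate Gaussian case the posterior can be computed in closed form. A minimal sketch, where the prior means -4 and 4 come from the slide but the prior variances (= 1), the true A, the noise variance, and the simulated data are assumed for illustration:

```python
import random

def gaussian_posterior(mu0, tau2, sigma2, data):
    """Posterior N(mu_n, var_n) for the mean A of X(t) = A + e(t),
    given a Gaussian prior N(mu0, tau2) and noise variance sigma2."""
    prec = 1.0 / tau2 + len(data) / sigma2           # posterior precision
    mu_n = (mu0 / tau2 + sum(data) / sigma2) / prec  # precision-weighted mean
    return mu_n, 1.0 / prec

random.seed(0)
A_true, sigma2 = 1.0, 1.0
data = [A_true + random.gauss(0.0, sigma2 ** 0.5) for _ in range(50)]

# The two sceptical priors with means -4 and +4 (prior variance assumed = 1)
m1, v1 = gaussian_posterior(-4.0, 1.0, sigma2, data)
m2, v2 = gaussian_posterior(4.0, 1.0, sigma2, data)
print(m1, m2)  # both posteriors concentrate near A_true as n grows
```

With 50 samples the two posterior means differ only by 8/51 ≈ 0.16: the data swamps the disagreement between the priors.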


Example

[Figure: the two priors $p_1(A)$ and $p_2(A)$.]


Example

[Figure: the posteriors $P(A \mid X_{0:3})$ and $P(A \mid X_{0:50})$ for the two priors.]

Importance Sampling

Complicated Priors and complicated Likelihoods result
in complicated Posterior distributions.

Estimates might not be analytically derivable.

In that case the distribution is approximated with the
help of the particles.

If the posterior distribution is not too complicated the
particles can be directly sampled from the posterior.

Direct sampling is often not applicable in practical problems, because the distributions are too complicated or because sampling from them is inefficient.


How does Importance Sampling work?

Samples $x^{(m)}$ are drawn from an arbitrary distribution function, called the Importance function $f(x)$.

The support of the Importance function needs to include the support of the Posterior.

Weights (= probabilities) are assigned to the samples. The weights are computed as

$w^{(m)} \propto \dfrac{P(x^{(m)})}{f(x^{(m)})}$

The closer the Importance function is to the Posterior, the better the approximation.
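A minimal sketch of this weighting, with an assumed target (a standard normal known only up to its normalization constant) and a uniform importance function; the truncation of the support to [-5, 5] is a deliberate simplification, since virtually all of the target's mass lies there:

```python
import math
import random

random.seed(1)
M = 100_000

# Target known only up to a constant: P(x) ∝ exp(-x^2 / 2)
def p_unnorm(x):
    return math.exp(-0.5 * x * x)

# Importance function: uniform on [-5, 5], density f(x) = 1/10
samples = [random.uniform(-5.0, 5.0) for _ in range(M)]
weights = [p_unnorm(x) / 0.1 for x in samples]  # w ∝ P(x) / f(x)

# Normalize the weights -> a discrete probability measure on the particles
total = sum(weights)
w_norm = [w / total for w in weights]

# Integrals become sums: estimate the target's mean and variance
mean = sum(w * x for w, x in zip(w_norm, samples))
var = sum(w * (x - mean) ** 2 for w, x in zip(w_norm, samples))
print(mean, var)  # close to the true values 0 and 1
```

Note that only the unnormalized target needs to be evaluated pointwise; the normalization constant cancels when the weights are normalized.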


The result of Importance sampling

Discrete probability space (with the normalized weights):

$\{x^{(m)}, \tilde{w}^{(m)}\}$

Discrete approximation of the Posterior distribution.

Statistical quantities (e.g. mean, variance) may now be approximated in the discrete probability space, i.e. integrations are simplified to sums.

Sequential Importance Sampling

In the context of Particle Filtering, a sequence of parameters has to be estimated, i.e. the signal $x_n$.

The Importance Sampling is conducted sequentially.

When sampling the probability of $x_n$, the probability distributions of $x_{0:n-1}$ are used.

The previously drawn particles for $x_{0:n-1}$ enter the importance function for $x_n$.


Tracking

What is Tracking?

Tracking means to find some state parameters of an object, e.g. an airplane.

The state parameters might be position, speed, or acceleration.

What must be given?

A model for the evolution of the state with time; in tracking this is usually a (possibly nonlinear) Markov equation.

A model relating the noisy measurements to the state.

The distribution of the noise in the system, not necessarily Gaussian.


State model

Evolution of the state sequence $x_k,\ k = 0, 1, \ldots$:

$x_k = f_k(x_{k-1}, v_{k-1})$

Sequence of the measurements:

$y_k = h_k(x_k, n_k)$

$v_k$ and $n_k$ are noise terms; $f_k$ and $h_k$ are some transformations.
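Such a model is easy to simulate once $f_k$ and $h_k$ are fixed. A sketch with assumed (hypothetical) choices of the transformations and Gaussian noise, purely to make the recursion concrete:

```python
import math
import random

random.seed(2)

# Assumed instance of the general model:
#   state:       x_k = f(x_{k-1}, v_{k-1})  -- nonlinear damped drift
#   measurement: y_k = h(x_k, n_k)          -- nonlinear (squared) observation
def f(x_prev, v):
    return 0.9 * x_prev + math.sin(x_prev) + v

def h(x, n):
    return x * x / 2.0 + n

x, xs, ys = 0.5, [], []
for k in range(100):
    x = f(x, random.gauss(0.0, 0.1))   # process noise v_{k-1}
    y = h(x, random.gauss(0.0, 0.2))   # measurement noise n_k
    xs.append(x)
    ys.append(y)

print(len(xs), len(ys))
```

Note that neither transformation is linear, so an (Extended) Kalman Filter would have to linearize; a particle filter works with the model as-is.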

Example for a state model

Quantity of interest: position $p_t$

Input measurements: speed $v_t$

Sample period: $T$

$p_t = p_{t-1} + T v_t + T f_t$

where $f_t$ denotes the noise on the speed measurement.
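This recursion amounts to dead-reckoning the position from noisy speed readings. A short sketch in which the sample period, the true speed, and the noise level are all assumed values:

```python
import random

random.seed(4)
T = 0.1                    # sample period (assumed)
p, positions = 0.0, [0.0]

# p_t = p_{t-1} + T * v_t + T * f_t, with f_t the speed-measurement noise
for t in range(100):
    v = 2.0                           # true speed (assumed constant)
    f_noise = random.gauss(0.0, 0.5)  # noise f_t on the speed reading
    p = p + T * v + T * f_noise
    positions.append(p)

print(positions[-1])  # roughly 100 * 0.1 * 2.0 = 20, plus accumulated noise
```

The noise terms accumulate over time, which is why position estimated from speed alone drifts and benefits from filtering against direct measurements.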

Bayesian Tracking

Get knowledge on $x_t$ given the measurements $y_{0:t}$: the Posterior distribution

$P(x_t \mid y_{0:t})$

Prior $p(x_0)$ is assumed to be available.

Posterior is computed recursively:

$P(x_k \mid y_{1:k}) \propto P(y_k \mid x_k)\, P(x_k \mid y_{1:k-1})$

where the left part of the product corresponds to the distribution of the noise and the right part (prediction) can be obtained via integration.
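The recursion above can be sketched as a bootstrap particle filter. The scalar random-walk model, the noise levels, and the particle count below are all assumed for illustration; the structure (predict with the state model, weight by the likelihood, resample) is the generic one:

```python
import math
import random

random.seed(3)
M = 2000                 # number of particles (assumed)
sig_v, sig_n = 0.5, 0.5  # process / measurement noise std (assumed)

def normal_pdf(y, mu, sig):
    return math.exp(-0.5 * ((y - mu) / sig) ** 2) / (sig * math.sqrt(2 * math.pi))

# Simulate a random-walk state x_k with direct noisy measurements y_k
x_true, xs, ys = 0.0, [], []
for _ in range(50):
    x_true += random.gauss(0.0, sig_v)
    xs.append(x_true)
    ys.append(x_true + random.gauss(0.0, sig_n))

# Bootstrap filter: the state model serves as importance function,
# the weights are the likelihoods P(y_k | x_k), then resample.
particles = [random.gauss(0.0, 1.0) for _ in range(M)]  # samples from p(x_0)
estimates = []
for y in ys:
    particles = [p + random.gauss(0.0, sig_v) for p in particles]  # predict
    weights = [normal_pdf(y, p, sig_n) for p in particles]         # update
    total = sum(weights)
    weights = [w / total for w in weights]
    estimates.append(sum(w * p for w, p in zip(weights, particles)))
    particles = random.choices(particles, weights=weights, k=M)    # resample

mse = sum((e - x) ** 2 for e, x in zip(estimates, xs)) / len(xs)
print(mse)  # filtered MSE; should beat the raw measurement variance
```

The resampling step here is what distinguishes the bootstrap filter from plain sequential importance sampling: it discards low-weight particles before degeneracy sets in.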

Implementation with Importance Sampling

Several implementations of the Bayes' approach exist for the tracking problem.

Posterior approximated by discrete random measures with Importance Sampling.

Main difference between the implementations: the choice and accuracy of the importance function.

In some implementations, resampling is done to increase the number of relevant particles.


V. Summary

Quite powerful even in nonlinear systems or with non-Gaussian noise.

A priori knowledge may be included in the Prior.

Model not linearized around current estimates.

In several cases better performance than the extended
Kalman Filter.

Integrations reduce to sums.


Summary

Very high computational complexity.

A Prior must be included.

The distributions of the noise, the state dynamics, and the measurement functions must be known.

The likelihood function needs to be available for
pointwise evaluation.

Degeneracy Problem.

Sample Impoverishment.

Not powerful in high state dimensions (curse of dimensionality).