Page 1

The Unscented Particle Filter

2000/09/29
이 시은

Page 2

Introduction

• Filtering
– estimate the states (parameters or hidden variables) as a set of observations becomes available on-line

• To solve it
– model the evolution of the system and the noise

• Resulting models
– non-linearity and non-Gaussian distributions

Page 3

• Extended Kalman filter
– linearizes the measurement and evolution models using a Taylor series

• Unscented Kalman Filter
– does not apply to general non-Gaussian distributions

• Seq. Monte Carlo methods: particle filters
– represent the posterior distribution of the states
– any statistical estimate can be computed
– deal with nonlinearities and non-Gaussian distributions

Page 4

• Particle Filter
– relies on importance sampling
– design of the proposal distribution

• Proposals for the Particle Filter
– EKF Gaussian approximation
– UKF proposal
• control of the rate at which the tails go to zero
• heavy-tailed distributions

Page 5

Dynamic State Space Model

• Transition equation and measurement equation: $p(x_t \mid x_{t-1})$ and $p(y_t \mid x_t)$ (a toy instance is sketched below)

• Goal
– approximate the posterior $p(x_{0:t} \mid y_{1:t})$ and one of its marginals, the filtering density $p(x_t \mid y_{1:t})$, recursively
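As a point of reference for the sketches on later slides, here is a minimal toy instance of such a model in Python; the particular `transition`/`measure` functions and the noise scales are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def transition(x_prev, t):
    """Sample x_t ~ p(x_t | x_{t-1}): a hypothetical nonlinear evolution."""
    return (0.5 * x_prev + 25.0 * x_prev / (1.0 + x_prev**2)
            + 8.0 * np.cos(1.2 * t)
            + rng.normal(0.0, np.sqrt(10.0), size=np.shape(x_prev)))

def measure(x_t):
    """Sample y_t ~ p(y_t | x_t): a hypothetical nonlinear measurement."""
    return x_t**2 / 20.0 + rng.normal(0.0, 1.0, size=np.shape(x_t))

# simulate a short trajectory and its observations
x, xs, ys = 0.1, [], []
for t in range(1, 51):
    x = transition(x, t)
    xs.append(x)
    ys.append(measure(x))
```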

Page 6

Extended Kalman Filter

• MMSE estimator based on a Taylor-series expansion of the nonlinear $f$ and $g$ around the estimate $\hat{x}_{t \mid t-1}$ of the state $x_t$

Page 7

Unscented Kalman Filter

• Does not approximate the non-linear process and observation models

• Uses the true nonlinear models and approximates the distribution of the state random variable

• Unscented transformation (a minimal sketch follows below)
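As a rough illustration of the unscented transformation, here is a small NumPy sketch that propagates a Gaussian through a nonlinearity via sigma points; the kappa-scaling and the function name are illustrative assumptions, not the slide's notation:

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=None):
    """Propagate a Gaussian N(mean, cov) through a nonlinearity f using
    2n+1 sigma points (plain unscented transform, kappa-scaling only)."""
    n = mean.shape[0]
    if kappa is None:
        kappa = 3.0 - n                     # common heuristic choice
    scale = n + kappa
    L = np.linalg.cholesky(scale * cov)     # matrix square root of (n+kappa)*cov

    # sigma points: the mean plus/minus the columns of L
    sigmas = np.vstack([mean, mean + L.T, mean - L.T])

    # weights: kappa/(n+kappa) for the centre point, 1/(2(n+kappa)) otherwise
    w = np.full(2 * n + 1, 0.5 / scale)
    w[0] = kappa / scale

    y = np.array([np.atleast_1d(f(s)) for s in sigmas])   # propagate each sigma point
    y_mean = w @ y
    diff = y - y_mean
    y_cov = (w[:, None] * diff).T @ diff                   # weighted outer products
    return y_mean, y_cov
```

For example, `unscented_transform(np.array([0.0]), np.array([[1.0]]), np.sin)` returns an approximation of the mean and covariance of sin(x) for x ~ N(0, 1).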

Page 8

Particle Filtering

• Does not require a Gaussian approximation

• Many variations, but all based on sequential importance sampling
– degenerates with time

• Includes a resampling stage

Page 9

Perfect Monte Carlo Simulation

• A set of weighted particles (samples) drawn from the posterior

$$\hat{p}(x_{0:t} \mid y_{1:t}) = \frac{1}{N}\sum_{i=1}^{N}\delta_{x_{0:t}^{(i)}}(dx_{0:t})$$

• Expectation

$$E\!\left(g_t(x_{0:t})\right) = \int g_t(x_{0:t})\, p(x_{0:t} \mid y_{1:t})\, dx_{0:t}$$

Page 10

$$\overline{E}\!\left(g_t(x_{0:t})\right) = \frac{1}{N}\sum_{i=1}^{N} g_t\!\left(x_{0:t}^{(i)}\right)$$

$$\overline{E}\!\left(g_t(x_{0:t})\right) \xrightarrow[N \to \infty]{\text{a.s.}} E\!\left(g_t(x_{0:t})\right)$$

$$\sqrt{N}\left(\overline{E}\!\left(g_t(x_{0:t})\right) - E\!\left(g_t(x_{0:t})\right)\right) \Longrightarrow \mathcal{N}\!\left(0,\ \operatorname{var}_{p(\cdot \mid y_{1:t})}\!\left(g_t(x_{0:t})\right)\right)$$

Page 11

Bayesian Importance Sampling

• Impossible to sample directly from the posterior

• sample instead from an easy-to-sample proposal distribution $q(x_{0:t} \mid y_{1:t})$

$$
\begin{aligned}
E\!\left(g_t(x_{0:t})\right) &= \int g_t(x_{0:t})\,\frac{p(x_{0:t} \mid y_{1:t})}{q(x_{0:t} \mid y_{1:t})}\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t} \\
&= \int g_t(x_{0:t})\,\frac{p(y_{1:t} \mid x_{0:t})\, p(x_{0:t})}{p(y_{1:t})\, q(x_{0:t} \mid y_{1:t})}\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t} \\
&= \int g_t(x_{0:t})\,\frac{w_t(x_{0:t})}{p(y_{1:t})}\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t},
\qquad w_t(x_{0:t}) \triangleq \frac{p(y_{1:t} \mid x_{0:t})\, p(x_{0:t})}{q(x_{0:t} \mid y_{1:t})}
\end{aligned}
$$

Page 12

$$
\begin{aligned}
E\!\left(g_t(x_{0:t})\right) &= \frac{1}{p(y_{1:t})}\int g_t(x_{0:t})\, w_t(x_{0:t})\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t} \\
&= \frac{\int g_t(x_{0:t})\, w_t(x_{0:t})\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}}
        {\int p(y_{1:t} \mid x_{0:t})\, p(x_{0:t})\,\dfrac{q(x_{0:t} \mid y_{1:t})}{q(x_{0:t} \mid y_{1:t})}\, dx_{0:t}} \\
&= \frac{\int g_t(x_{0:t})\, w_t(x_{0:t})\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}}
        {\int w_t(x_{0:t})\, q(x_{0:t} \mid y_{1:t})\, dx_{0:t}}
= \frac{E_{q(\cdot \mid y_{1:t})}\!\left[w_t(x_{0:t})\, g_t(x_{0:t})\right]}{E_{q(\cdot \mid y_{1:t})}\!\left[w_t(x_{0:t})\right]}
\end{aligned}
$$

Drawing the samples from the proposal gives the self-normalized Monte Carlo estimate

$$
\overline{E}\!\left(g_t(x_{0:t})\right)
= \frac{\frac{1}{N}\sum_{i=1}^{N} g_t\!\left(x_{0:t}^{(i)}\right) w_t\!\left(x_{0:t}^{(i)}\right)}
       {\frac{1}{N}\sum_{i=1}^{N} w_t\!\left(x_{0:t}^{(i)}\right)}
= \sum_{i=1}^{N} g_t\!\left(x_{0:t}^{(i)}\right) \tilde{w}_t\!\left(x_{0:t}^{(i)}\right),
\qquad
\tilde{w}_t^{(i)} = \frac{w_t^{(i)}}{\sum_{j=1}^{N} w_t^{(j)}}
$$
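A tiny numerical sketch of this self-normalized estimate, with a hypothetical Gaussian target and proposal standing in for the posterior and $q$ (all names and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

log_p = lambda x: -0.5 * ((x - 1.0) / 0.5) ** 2   # unnormalized stand-in target N(1, 0.25)
log_q = lambda x: -0.5 * (x / 2.0) ** 2           # unnormalized proposal density N(0, 4)
x = rng.normal(0.0, 2.0, size=N)                  # samples from the proposal

log_w = log_p(x) - log_q(x)                       # unnormalized log-weights
w = np.exp(log_w - log_w.max())
w_tilde = w / w.sum()                             # normalized weights

print("self-normalized IS estimate of E[x]:", np.sum(w_tilde * x))   # close to 1.0
```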

Page 13

• Asymptotic convergence and a central limit theorem hold for $\overline{E}(g_t(x_{0:t}))$ under the following assumptions
– the $x_{0:t}^{(i)}$ are i.i.d. samples drawn from the proposal, the support of the proposal includes the support of the posterior, and $E(g_t(x_{0:t}))$ exists and is finite
– the expectations of $w_t(x_{0:t})$ and $w_t(x_{0:t})\, g_t^2(x_{0:t})$ exist and are finite

$$\hat{p}(x_{0:t} \mid y_{1:t}) = \sum_{i=1}^{N} \tilde{w}_t\!\left(x_{0:t}^{(i)}\right)\, \delta_{x_{0:t}^{(i)}}(dx_{0:t})$$

Page 14

Sequential Importance Sampling

• Proposal distribution

• Assumptions
– the states follow a Markov process
– the observations are independent given the states

$$q(x_{0:t} \mid y_{1:t}) = q(x_0)\prod_{j=1}^{t} q\!\left(x_j \mid x_{0:j-1},\, y_{1:j}\right)$$

Page 15

– we can sample from the proposal and evaluate the likelihood and transition probability, generate a prior set of samples, and iteratively compute the importance weights

$$w_t = \frac{p(y_{1:t} \mid x_{0:t})\, p(x_{0:t})}{q(x_{0:t-1} \mid y_{1:t-1})\, q(x_t \mid x_{0:t-1},\, y_{1:t})}$$

$$w_t = w_{t-1}\,\frac{p(y_t \mid x_t)\, p(x_t \mid x_{t-1})}{q(x_t \mid x_{0:t-1},\, y_{1:t})}$$

Page 16

Choice of proposal distribution

• Minimize the variance of the importance weights

$$q(x_t \mid x_{0:t-1},\, y_{1:t}) = p(x_t \mid x_{0:t-1},\, y_{1:t})$$

• popular choice: the transition prior (see the simplification below)

$$q(x_t \mid x_{0:t-1},\, y_{1:t}) = p(x_t \mid x_{t-1})$$

• move the particles towards the regions of high likelihood
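With the transition prior as proposal, the weight recursion from the previous slide simplifies, since the transition densities cancel:

$$w_t = w_{t-1}\,\frac{p(y_t \mid x_t)\, p(x_t \mid x_{t-1})}{q(x_t \mid x_{0:t-1},\, y_{1:t})}\bigg|_{\,q\, =\, p(x_t \mid x_{t-1})} = w_{t-1}\, p(y_t \mid x_t)$$

The new observation then enters only through the likelihood, which is why this convenient choice can place particles poorly when the likelihood is peaked relative to the prior.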

Page 17

Degeneracy of SIS algorithm

• Variance of importance ratios increases stochastically over time

Page 18

Selection (Resampling)

• Eliminate samples with low importance ratios and multiply samples with high importance ratios

• Associate to each particle $x_{0:t}^{(i)}$ a number of children $N_i$ such that $\sum_{i=1}^{N} N_i = N$

Page 19

SIR and Multinomial sampling

• Mapping the weighted Dirac random measure $\{x_{0:t}^{(i)},\, \tilde{w}_t^{(i)}\}$ onto an equally weighted random measure $\{x_{0:t}^{(j)},\, 1/N\}$

• Multinomial distribution (sketched below)
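A minimal NumPy sketch of multinomial resampling implementing that mapping (function and argument names are illustrative; the weights are assumed normalized):

```python
import numpy as np

def multinomial_resample(particles, weights, rng=None):
    """Map {x^(i), w~^(i)} onto {x^(j(i)), 1/N} by drawing ancestor indices
    from a multinomial distribution with probabilities w~."""
    if rng is None:
        rng = np.random.default_rng()
    N = len(weights)
    idx = rng.choice(N, size=N, p=weights)       # N_i children ~ Multinomial(N, w~)
    return particles[idx], np.full(N, 1.0 / N)   # equal weights after resampling
```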

Page 20

Residual resampling

• Set $\tilde{N}_i = \lfloor N \tilde{w}_t^{(i)} \rfloor$

• Perform an SIR procedure to select the remaining $N - \sum_i \tilde{N}_i$ samples with new weights proportional to $N \tilde{w}_t^{(i)} - \tilde{N}_i$

• Add the results to the current $\tilde{N}_i$ (a sketch follows below)
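A sketch of residual resampling following the steps above (illustrative NumPy code, assuming normalized weights):

```python
import numpy as np

def residual_resample(particles, weights, rng=None):
    """Keep floor(N * w~_i) deterministic copies of each particle, then select
    the remaining samples by SIR on the residual weights."""
    if rng is None:
        rng = np.random.default_rng()
    N = len(weights)
    weights = np.asarray(weights)
    n_det = np.floor(N * weights).astype(int)     # deterministic children counts
    idx = np.repeat(np.arange(N), n_det)          # guaranteed copies

    n_rest = N - n_det.sum()                      # remaining slots
    if n_rest > 0:
        residual = N * weights - n_det            # leftover weight mass
        residual = residual / residual.sum()
        idx = np.concatenate([idx, rng.choice(N, size=n_rest, p=residual)])
    return np.asarray(particles)[idx], np.full(N, 1.0 / N)
```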

Page 21

Minimum variance sampling

When to sample

Page 22

Generic Particle Filter

1. Initialization, t = 0
2. For t = 1, 2, …
(a) Importance sampling step: for i = 1, …, N, sample $\hat{x}_t^{(i)} \sim q(x_t \mid x_{0:t-1}^{(i)},\, y_{1:t})$, evaluate the importance weights, and normalize the importance weights
(b) Selection (resampling)
(c) Output

(A compact instance of this loop is sketched below.)
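A compact instance, using the transition prior as proposal and the toy `transition`/`measure` model sketched earlier; the model, noise scales, and particle count are assumptions, so this is a sketch rather than the presented algorithm:

```python
import numpy as np

def particle_filter(y, N=500, rng=None):
    """Generic particle filter with the transition prior as proposal and
    multinomial resampling at every step."""
    if rng is None:
        rng = np.random.default_rng(2)
    x = rng.normal(0.0, 2.0, size=N)                    # 1. initialization, t = 0
    estimates = []
    for t, y_t in enumerate(y, start=1):                # 2. for t = 1, 2, ...
        # (a) importance sampling: propose from p(x_t | x_{t-1})
        x = (0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)
             + rng.normal(0.0, np.sqrt(10.0), size=N))
        # weights proportional to the likelihood p(y_t | x_t), then normalized
        log_w = -0.5 * (y_t - x**2 / 20.0) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # (c) output: filtered mean from the weighted particles
        estimates.append(np.sum(w * x))
        # (b) selection: multinomial resampling
        x = x[rng.choice(N, size=N, p=w)]
    return np.array(estimates)
```

Running it on observations generated by the toy model, e.g. `particle_filter(ys)`, returns the sequence of filtered state means.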

Page 23

Improving Particle Filters

• Monte Carlo (MC) assumption
– the Dirac point-mass approximation provides an adequate representation of the posterior

• Importance sampling (IS) assumption
– samples from the posterior can be obtained by sampling from a suitable proposal and applying importance sampling corrections

Page 24

MCMC Move Step

• Introduce MCMC steps with invariant distribution $p(x_{0:t} \mid y_{1:t})$

• If the particles $\tilde{x}_{0:t}$ are already distributed according to the posterior $p(\tilde{x}_{0:t} \mid y_{1:t})$, then applying a Markov chain transition kernel $K(x_{0:t} \mid \tilde{x}_{0:t})$ with that invariant distribution still yields particles distributed according to the posterior

Page 25

Designing Better Importance Proposals

• Move samples to regions of high likelihood

• Prior editing
– ad-hoc acceptance test of proposed particles

• Local linearization
– Taylor-series expansion of the likelihood and transition prior
– e.g.

$$q\!\left(x_t \mid x_{0:t-1}^{(i)},\, y_{1:t}\right) = \mathcal{N}\!\left(\hat{x}_t^{(i)},\, \hat{P}_t^{(i)}\right)$$

– improved simulated annealing sampling algorithm

Page 26

Rejection methods

• If the likelihood is bounded, $p(y_t \mid x_t) \le M_t$, one can sample from the optimal importance distribution

Page 27

Auxiliary Particle Filters

• Obtain approximate samples from the optimal importance distribution by introducing an auxiliary variable $k$

• draw samples from the joint distribution of $x_t$ and $k$

Page 28

Unscented Particle Filter

• Use the UKF to generate the proposal distribution within the particle filter framework (an illustrative sketch follows below)
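To make the idea concrete, here is a self-contained scalar sketch in which each particle runs a UKF step to build its Gaussian proposal; the toy model, the sigma-point scaling, and all function names are assumptions for illustration, so this should be read as a sketch of the scheme rather than the paper's exact algorithm:

```python
import numpy as np

def sigma_points(m, P, lam=1.0):
    """Sigma points and weights for a scalar Gaussian (n = 1)."""
    s = np.sqrt((1.0 + lam) * P)
    pts = np.array([m, m + s, m - s])
    w = np.array([lam, 0.5, 0.5]) / (1.0 + lam)
    return pts, w

def ukf_proposal(x_prev, P_prev, y_t, t, q_var=10.0, r_var=1.0):
    """One scalar UKF step on the toy model; returns the proposal N(m_t, P_t)."""
    # time update through the (assumed) transition function
    pts, w = sigma_points(x_prev, P_prev)
    fx = 0.5 * pts + 25.0 * pts / (1.0 + pts**2) + 8.0 * np.cos(1.2 * t)
    m_pred = w @ fx
    P_pred = w @ (fx - m_pred) ** 2 + q_var
    # measurement update through the (assumed) measurement function
    pts, w = sigma_points(m_pred, P_pred)
    hy = pts**2 / 20.0
    y_pred = w @ hy
    S = w @ (hy - y_pred) ** 2 + r_var            # innovation variance
    C = w @ ((pts - m_pred) * (hy - y_pred))      # state-measurement covariance
    K = C / S                                     # Kalman gain
    return m_pred + K * (y_t - y_pred), P_pred - K * S * K

def unscented_particle_filter(y, N=200, rng=None):
    """UPF sketch: each particle uses its own UKF step as the proposal."""
    if rng is None:
        rng = np.random.default_rng(3)
    x = rng.normal(0.0, 2.0, size=N)
    P = np.full(N, 1.0)
    estimates = []
    for t, y_t in enumerate(y, start=1):
        x_new, log_w = np.empty(N), np.empty(N)
        for i in range(N):
            m_i, P_i = ukf_proposal(x[i], P[i], y_t, t)
            x_new[i] = rng.normal(m_i, np.sqrt(P_i))        # sample from UKF proposal
            # importance weight: likelihood * transition / proposal (constants drop)
            f_prev = 0.5 * x[i] + 25.0 * x[i] / (1.0 + x[i]**2) + 8.0 * np.cos(1.2 * t)
            log_lik = -0.5 * (y_t - x_new[i]**2 / 20.0) ** 2
            log_trans = -0.5 * (x_new[i] - f_prev) ** 2 / 10.0
            log_prop = -0.5 * (x_new[i] - m_i) ** 2 / P_i - 0.5 * np.log(P_i)
            log_w[i] = log_lik + log_trans - log_prop
            P[i] = P_i
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        estimates.append(np.sum(w * x_new))                  # filtered mean
        idx = rng.choice(N, size=N, p=w)                     # resample
        x, P = x_new[idx], P[idx]
    return np.array(estimates)
```

The outer loop is identical to the generic particle filter above; only the proposal and the corresponding weight expression change.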

Page 29

Theoretical Convergence

• Theorem 1
If the importance weight

$$w_t \propto \frac{p(y_t \mid x_t)\, p(x_t \mid x_{t-1})}{q(x_t \mid x_{0:t-1},\, y_{1:t})}$$

is upper bounded for any $(x_{t-1}, y_t)$, and if one of the selection (resampling) schemes is used, then for all $t \ge 0$ there exists $c_t$ independent of $N$ such that for any bounded function $f_t$

$$E\!\left[\left(\frac{1}{N}\sum_{i=1}^{N} f_t\!\left(x_{0:t}^{(i)}\right) - \int f_t(x_{0:t})\, p(dx_{0:t} \mid y_{1:t})\right)^{2}\right] \le \frac{c_t\, \|f_t\|^{2}}{N}$$