Particle Filter
Like the histogram filter, the particle filter approximates the posterior by a finite number of parameters. However, the two differ in how these parameters are generated and in how they populate the state space.
The key idea of the particle filter is to represent the posterior by a set of random state samples drawn from this posterior. Such a representation is approximate, but it is non-parametric, and can therefore represent a much broader space of distributions than, for example, a Gaussian. Another advantage of the sample-based representation is its ability to model nonlinear transformations of random variables.
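A quick way to see the nonlinear-transformation advantage: push samples from a Gaussian through a nonlinear function and the sample set transforms exactly, with no linearization. The function `g` below is an arbitrary illustrative nonlinearity, not from the original.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw samples from a Gaussian prior over a 1-D state.
samples = rng.normal(loc=0.0, scale=1.0, size=10_000)

def g(x):
    # An arbitrary nonlinear function; a parametric (Gaussian) filter
    # would have to linearize this, a sample set just applies it.
    return np.sin(x) + 0.1 * x**2

# Each sample is transformed exactly; the resulting set is non-Gaussian.
transformed = g(samples)
print(transformed.mean())  # close to E[sin X] + 0.1*E[X^2] = 0.1
```

The transformed samples are the Monte Carlo representation of the pushed-forward distribution, which is exactly how a particle set propagates through a nonlinear motion model.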
In a particle filter, the samples of the posterior distribution are called particles, denoted

$\mathcal{X}_t := x_t^{[1]}, x_t^{[2]}, \dots, x_t^{[M]}$

Each particle $x_t^{[m]}$ (with $1 \le m \le M$) is a concrete instantiation of the state at time $t$. In other words, a particle is a hypothesis as to what the true world state may be at time $t$. Here $M$ denotes the number of particles in the particle set $\mathcal{X}_t$.
The intuition behind the particle filter is to approximate the belief $bel(x_t)$ by the set of particles $\mathcal{X}_t$. Ideally, the likelihood for a state hypothesis $x_t$ to be included in the particle set $\mathcal{X}_t$ should be proportional to its Bayes filter posterior $bel(x_t) = p(x_t \mid z_{1:t}, u_{1:t})$.
Just like all other Bayes filter algorithms, the particle filter constructs the belief $bel(x_t)$ recursively from the belief $bel(x_{t-1})$ one time step earlier. Since beliefs are represented by sets of particles, the filter constructs the particle set $\mathcal{X}_t$ recursively from the previous particle set $\mathcal{X}_{t-1}$.
Notice that `move` produces a new particle state: it samples a new state given the control, which consists of `turn` and `forward`, and the previous state, which is `(self.x, self.y, self.theta)`.
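A minimal sketch of what such a `move` method could look like, assuming a simple unicycle-style motion model with Gaussian noise on both control components. The class name `Particle` and the noise parameters are illustrative assumptions, not taken from the original code.

```python
import math
import random

class Particle:
    """One hypothesis of the robot pose (x, y, theta)."""

    def __init__(self, x=0.0, y=0.0, theta=0.0,
                 turn_noise=0.05, forward_noise=0.05):
        self.x, self.y, self.theta = x, y, theta
        self.turn_noise = turn_noise
        self.forward_noise = forward_noise

    def move(self, turn, forward):
        """Sample a new state from p(x_t | u_t, x_{t-1}).

        The control u_t is (turn, forward); the previous state is
        (self.x, self.y, self.theta). Gaussian noise on the control
        makes the sampling stochastic.
        """
        theta = (self.theta + turn +
                 random.gauss(0.0, self.turn_noise)) % (2 * math.pi)
        dist = forward + random.gauss(0.0, self.forward_noise)
        return Particle(self.x + dist * math.cos(theta),
                        self.y + dist * math.sin(theta),
                        theta,
                        self.turn_noise, self.forward_noise)
```

Note that `move` returns a fresh particle rather than mutating the old one, which matches how the prediction loop builds the new particle set from the previous one.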
The re-sampling step is a probabilistic implementation of the Darwinian idea of survival of the fittest: it refocuses the particle set on regions of the state space with high posterior probability. By doing so, it concentrates the computational resources of the filter algorithm on the regions of the state space where they matter the most.
Think of particles as samples of state sequences. One sequence can be written as $x_{0:t}^{[m]} = x_0^{[m]}, x_1^{[m]}, \dots, x_t^{[m]}$. The particle filter calculates the posterior over all state sequences, $p(x_{0:t} \mid z_{1:t}, u_{1:t})$.
Using the same technique as in the Bayes filter derivation, this posterior factors into

$p(x_{0:t} \mid z_{1:t}, u_{1:t}) = \eta\, p(z_t \mid x_t)\, p(x_t \mid x_{t-1}, u_t)\, p(x_{0:t-1} \mid z_{1:t-1}, u_{1:t-1})$
We generate a hypothetical state $x_t^{[m]}$ based on the particle $x_{t-1}^{[m]}$ from the previous time step and the control $u_t$ at the current time step. The set of particles obtained from the first for loop is the filter's representation of the prediction $\overline{bel}(x_t)$.
The factor $w_t^{[m]}$ is known as the importance factor, or particle score. Importance factors are used to incorporate the measurement $z_t$ into the particle set: the importance is the probability of the measurement $z_t$ under the particle $x_t^{[m]}$, that is, $w_t^{[m]} = p(z_t \mid x_t^{[m]})$. We compute the particle weight (importance) and then add the new particle and its weight to the prediction set, denoted $\bar{\mathcal{X}}_t$.
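As a concrete sketch of computing $w_t^{[m]} = p(z_t \mid x_t^{[m]})$, suppose the measurement is a set of ranges to known landmarks with independent Gaussian noise. The landmark map, the range sensor model, and the noise level `sigma` are all illustrative assumptions here.

```python
import math

def gaussian(mu, sigma, x):
    """Density of x under a normal distribution N(mu, sigma^2)."""
    return (math.exp(-0.5 * ((x - mu) / sigma) ** 2)
            / (sigma * math.sqrt(2 * math.pi)))

def importance_weight(particle_xy, landmarks, measured_ranges, sigma=0.5):
    """w = p(z_t | x_t^[m]): likelihood of the measured ranges if the
    particle's pose were the true pose, assuming independent Gaussian
    noise per landmark range."""
    px, py = particle_xy
    w = 1.0
    for (lx, ly), z in zip(landmarks, measured_ranges):
        predicted = math.hypot(lx - px, ly - py)  # range the particle expects
        w *= gaussian(predicted, sigma, z)
    return w
```

A particle whose pose explains the measurement well receives a large weight; a particle far from the true pose receives a weight near zero, which is what drives the re-sampling step below.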
The real trick occurs in the re-sampling step. The algorithm draws, with replacement, $M$ particles from the temporary set $\bar{\mathcal{X}}_t$; the probability of drawing each particle is given by its importance weight. Re-sampling transforms a particle set into another particle set of the same size. By incorporating the importance weights into the re-sampling process, the distribution of the particles changes: before re-sampling they are distributed according to $\overline{bel}(x_t)$, and after re-sampling they are distributed approximately according to the posterior $bel(x_t)$.
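Drawing with replacement in proportion to the weights can be sketched in one line with the standard library; this is multinomial re-sampling, one of several common schemes (low-variance re-sampling is a popular alternative not shown here).

```python
import random

def resample(particles, weights):
    """Draw len(particles) particles with replacement, each chosen with
    probability proportional to its importance weight (multinomial
    re-sampling)."""
    return random.choices(particles, weights=weights, k=len(particles))

# Illustrative usage: one heavily weighted particle dominates the
# re-sampled set, so low-weight hypotheses tend to die out.
parts = ["heavy"] + ["light"] * 9
w = [0.91] + [0.01] * 9
new_set = resample(parts, w)
```

Duplicated high-weight particles diversify again at the next prediction step, because the noisy motion model pushes each copy to a slightly different state.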
Notice that only the state $x_0$ has a value at $t = 0$, because the first control action $u_1$ and measurement $z_1$ are applied at $t = 1$.
Apply the Markov assumption, i.e., that $x_t$ is complete and no variables prior to $x_t$ may influence the stochastic evolution of future states. (not yet completed)