Mislocalization Heatmap
General Math Concepts
Joint Distribution
The joint distribution of two random variables X and Y is written as p(x, y) = p(X = x and Y = y).
If they are independent, p(x, y) = p(x) p(y).
If X is conditioned on Y, p(x∣y) = p(x, y) / p(y).
Theorem of Total Probability
The theorem of total probability expresses p(x) by summing (or integrating) over all possible values of y.
In discrete form, p(x) = Σy p(x∣y) p(y).
In integral form, p(x) = ∫ p(x∣y) p(y) dy.
Bayes' Rule
Since x and y depend on each other, Bayes' rule lets us express p(x∣y) in terms of p(y∣x) and p(x), and the theorem of total probability expands the normalizing denominator p(y).
In discrete form, p(x∣y) = p(y∣x) p(x) / Σx′ p(y∣x′) p(x′).
In integral form, p(x∣y) = p(y∣x) p(x) / ∫ p(y∣x′) p(x′) dx′.
Prior & Posterior Distribution
If x is a quantity that we would like to infer from y, then
p(x) is called the prior probability distribution.
p(x∣y) is called the posterior probability distribution over X.
p(y∣x) is called the generative model, because it describes how the data y are generated from the state x.
In general, Y is called the data, e.g. laser range finder measurements or control actions.
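As a small worked illustration (my own example, not from the original notes): suppose a robot wants to infer whether a door is open (x) from one noisy sensor reading (y). All numbers below are assumed for illustration.

```python
# Hypothetical door example: infer state x (door open/closed)
# from a sensor reading y ("sensed open"). All numbers are assumed.

# Prior p(x): belief before seeing any data.
p_open, p_closed = 0.5, 0.5

# Generative model p(y|x): likelihood of sensing "open" in each true state.
p_sense_open_given_open = 0.8
p_sense_open_given_closed = 0.3

# Theorem of total probability: p(y) = sum over x of p(y|x) p(x).
p_sense_open = (p_sense_open_given_open * p_open
                + p_sense_open_given_closed * p_closed)

# Bayes' rule: posterior p(x|y) = p(y|x) p(x) / p(y).
posterior_open = p_sense_open_given_open * p_open / p_sense_open
print(posterior_open)  # ~0.73, higher than the 0.5 prior
```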
Glossary
Let xt denote the robot state at time t.
Let ut denote the control action we apply to the robot at time t.
Let zt denote the measurement at time t.
State Transition Probability
The state transition probability p(xt∣xt−1, ut) describes how likely it is to arrive at a new state xt given the previous state xt−1 and the control action ut.
Measurement Probability
The measurement probability p(zt∣xt) describes the likelihood of observing the measurement zt given the current state xt.
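As a sketch of what these two distributions can look like in practice (my own illustration, assuming a 1-D robot with additive Gaussian noise; the noise values are made up):

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of a 1-D normal distribution at x."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2.0 * math.pi))

def motion_model(x_t, x_prev, u_t, noise_std=0.1):
    """State transition probability p(x_t | x_{t-1}, u_t):
    the robot moves by u_t with Gaussian noise (assumed std)."""
    return gaussian_pdf(x_t, x_prev + u_t, noise_std)

def measurement_model(z_t, x_t, noise_std=0.2):
    """Measurement probability p(z_t | x_t):
    the sensor observes the state directly with Gaussian noise (assumed std)."""
    return gaussian_pdf(z_t, x_t, noise_std)

# Likelihood of ending up at 1.05 after commanding a move of 1.0 from 0.0,
# and of then measuring 1.1 while actually at 1.05.
print(motion_model(1.05, 0.0, 1.0))
print(measurement_model(1.1, 1.05))
```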
Belief
A belief reflects the robot's internal knowledge about the state of the environment. A belief distribution assigns a probability to each possible hypothesis about the true state. Belief distributions are posterior probabilities over state variables, conditioned on the available data.
Using zero-based indices, a belief is defined as bel(xt) = p(xt∣z0:t, u0:t).
For each time step, before we incorporate the measurement data, we make a prediction. The prediction belief, written with a bar to indicate that the latest measurement has not yet been incorporated, is bel̄(xt) = p(xt∣z0:t−1, u0:t).
Calculating a belief bel(xt) from a prediction belief bel̄(xt) is called correction or measurement update.
Bayes Filter
The general Bayes filter repeats two steps at every time step (a minimal code sketch follows the list).
1. Generate a prediction of the current state xt using the previous state xt−1 and the control action ut.
2. Perform a correction, also known as a measurement update, by incorporating the measurement zt.
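Here is a minimal sketch of these two steps over a discrete (histogram) belief. It is an illustration under assumed interfaces, not the text's implementation: motion_model(x_t, x_prev, u_t) evaluates the state transition probability, measurement_model(z_t, x_t) evaluates the measurement probability, and states is a finite set of possible states.

```python
def bayes_filter_step(belief, u_t, z_t, states, motion_model, measurement_model):
    """One Bayes filter iteration over a discrete set of states.

    belief            -- dict mapping each state to bel(x_{t-1})
    u_t, z_t          -- control action and measurement at time t
    motion_model      -- evaluates p(x_t | x_{t-1}, u_t)
    measurement_model -- evaluates p(z_t | x_t)
    """
    # Prediction (control update):
    # bel_bar(x_t) = sum over x_{t-1} of p(x_t | x_{t-1}, u_t) * bel(x_{t-1})
    bel_bar = {
        x_t: sum(motion_model(x_t, x_prev, u_t) * belief[x_prev] for x_prev in states)
        for x_t in states
    }

    # Correction (measurement update): bel(x_t) is proportional to
    # p(z_t | x_t) * bel_bar(x_t).
    unnormalized = {x_t: measurement_model(z_t, x_t) * bel_bar[x_t] for x_t in states}

    # Normalize so the belief sums to 1.
    eta = sum(unnormalized.values())
    return {x_t: p / eta for x_t, p in unnormalized.items()}
```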
Prediction
Formally speaking, it is impossible to know the true state xt; at best we can describe what we know about the current or previous state as a probability density function, denoted bel(xt).
The prediction step is also called the control update. It computes the prediction belief by integrating the state transition probability over the previous belief: bel̄(xt) = ∫ p(xt∣xt−1, ut) bel(xt−1) dxt−1.
Measurement Update
As noted before, the final belief is a probability density function that gives the probability of the random variable X taking the value xt. The measurement update weights the prediction belief by the measurement probability and then normalizes: bel(xt) = η p(zt∣xt) bel̄(xt), where η is a normalizing constant.
MCL Particle Filter
The Monte Carlo Localization (MCL) particle filter is an algorithm derived from the Bayes filter. It is suitable for representing beliefs that cannot be modeled by a Gaussian or other parametric models.
A parametric model is a class of probability distributions described by a finite number of parameters; a Gaussian, for example, is fully specified by its mean and covariance.
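To make the contrast concrete (my own illustration): a parametric belief is stored as a handful of parameters, while a particle filter stores the belief as a set of samples, which can take arbitrary, multi-modal shapes.

```python
import random

# Parametric representation: a 1-D Gaussian belief is fully described
# by two parameters, its mean and its variance.
gaussian_belief = {"mean": 0.0, "variance": 1.0}

# Nonparametric (particle) representation: the same belief carried by
# M samples; no fixed functional form is assumed.
M = 1000
particles = [random.gauss(0.0, 1.0) for _ in range(M)]
```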
Particle filters represent the belief by a set of particles. One iteration usually involves four major steps (a code sketch follows the list).
1. Initialize a set of M particles.
2. Iterate through each particle, for m = 1 to m = M:
   - Perform the control update on particle pm.
   - Perform the measurement update on particle pm.
   - Compute the weight wm of the particle.
   - Add pm to a sample set.
3. Iterate M times:
   - Draw a particle pm from the sample set with replacement, with probability proportional to its weight wm.
   - Add pm to the final sample set.
4. Return the final sample set, which should have length M.
Repeat steps 2 to 4 for each subsequent control action and measurement.
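Putting the steps together, here is a minimal sketch of one MCL iteration. sample_motion_model and measurement_model are placeholders for robot-specific models: the first draws a sample from p(xt∣xt−1, ut), the second evaluates p(zt∣xt).

```python
import random

def mcl_step(particles, u_t, z_t, sample_motion_model, measurement_model):
    """One iteration of the Monte Carlo Localization particle filter.

    particles           -- list of M states representing bel(x_{t-1})
    sample_motion_model -- draws x_t ~ p(x_t | x_{t-1}, u_t)  (placeholder)
    measurement_model   -- evaluates p(z_t | x_t)             (placeholder)
    """
    M = len(particles)

    # Step 2: control update, measurement update, and weight for each particle.
    sample_set, weights = [], []
    for x_prev in particles:
        x_t = sample_motion_model(x_prev, u_t)   # control update
        w_m = measurement_model(z_t, x_t)        # measurement update -> weight
        sample_set.append(x_t)
        weights.append(w_m)

    # Step 3: draw M particles with replacement,
    # with probability proportional to their weights.
    final_sample_set = random.choices(sample_set, weights=weights, k=M)

    # Step 4: the resampled set represents bel(x_t); it has length M.
    return final_sample_set
```

Calling this function repeatedly with each new (ut, zt) pair implements the "repeat steps 2 to 4" loop; the resampling in step 3 uses random.choices, which draws with replacement according to the given weights.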