Probabilistic Robotics

Gaussian Filters


We are going to introduce an important family of recursive state estimators, collectively called Gaussian filters. Gaussian techniques all share the basic idea that beliefs are represented by multivariate normal distributions.

$$p(x) = \det(2\pi\Sigma)^{-1/2} \exp\left(-\frac{1}{2}(x-\mu)^{T}\Sigma^{-1}(x-\mu)\right)$$

The density over the variable $x$ is characterized by two sets of parameters: the mean $\mu$ and the covariance matrix $\Sigma$. Gaussians are unimodal; they possess a single maximum. This may be suitable for some problems, but it is not appropriate for problems in which many distinct hypotheses exist.
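As a quick sketch of the density formula above (assuming NumPy is available; the function name `gaussian_pdf` is illustrative, not from the text), evaluating a standard 2-D Gaussian at its mean should give the peak value $1/(2\pi)$:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    # p(x) = det(2*pi*Sigma)^{-1/2} * exp(-1/2 (x-mu)^T Sigma^{-1} (x-mu))
    d = x - mu
    norm = np.linalg.det(2 * np.pi * sigma) ** -0.5
    return norm * np.exp(-0.5 * d @ np.linalg.inv(sigma) @ d)

mu = np.array([0.0, 0.0])
sigma = np.eye(2)                  # identity covariance: a standard 2-D Gaussian
p = gaussian_pdf(mu, mu, sigma)    # density at the mean, i.e. the single maximum
```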

Parameterization

The parameterization of a Gaussian by its mean and covariance is called the moments parameterization, because the mean and covariance are the first and second moments of a probability distribution. There is an alternative parameterization, called the canonical parameterization, which will be discussed later.
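To make the "moments" interpretation concrete, here is a small sketch (assuming NumPy; the sample size and the example values of $\mu$ and $\Sigma$ are illustrative) that recovers the mean and covariance as the first and second central moments of a sample:

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu = np.array([1.0, -2.0])
true_sigma = np.array([[2.0, 0.5],
                       [0.5, 1.0]])

# Draw many samples from the Gaussian defined by (true_mu, true_sigma)
samples = rng.multivariate_normal(true_mu, true_sigma, size=50_000)

mu_hat = samples.mean(axis=0)              # first moment  -> estimates mu
sigma_hat = np.cov(samples, rowvar=False)  # second central moment -> estimates Sigma
```

With enough samples, `mu_hat` and `sigma_hat` converge to the parameters of the moments parameterization.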