Probabilistic Fundamentals in Robotics
Nonparametric Filters

Basilio Bona
DAUIN, Politecnico di Torino
June 2011

Course outline
- Basic mathematical framework
- Probabilistic models of mobile robots
- Mobile robot localization problem
- Robotic mapping
- Probabilistic planning and control

Reference textbook: Thrun, Burgard, Fox, Probabilistic Robotics, MIT Press, 2006
http://www.probabilisticrobotics.org/
Probabilistic models of mobile robots
- Recursive state estimation
- Basic concepts in probability
- Robot environment
- Bayes filters
- Gaussian filters: Kalman filter, Extended Kalman filter, Unscented Kalman filter, Information filter
- Nonparametric filters: histogram filter, particle filter

Introduction
Nonparametric filters do not rely on a fixed functional form of the posterior probability, such as Gaussian pdfs. They approximate posteriors over continuous state spaces (CSS) with infinitely many values.
- Decomposition of the CSS into finitely many regions, with the posterior represented by a histogram: histogram filters
- Representation of the CSS by finitely many samples: particle filters
Nonparametric filters represent multimodal distributions well, i.e., distinct hypotheses (as in mobile robotics). Their computational complexity is greater than that of parametric filters.
Histogram filter
The state space can be discrete or continuous. The random variable X_t can take finitely many values. Examples of discrete spaces are:
- Grid element in a grid map: occupied/free
- Doors: open/closed
- Terrain slopes: limited/mid/high
- Terrain characteristics: sand/rock/grass/...
Discrete Bayes filters can be used for this type of problem.

Discrete Bayes Filter (DBF)
Discrete probability distribution
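The discrete Bayes filter above can be sketched as a prediction step (sum over previous states) followed by a measurement correction and normalization. The binary "door" state and the transition/measurement probabilities below are illustrative assumptions, not values from the slides.

```python
# A minimal discrete Bayes filter sketch for a binary "door" state
# (open/closed). Models are hypothetical, chosen only for illustration.

def discrete_bayes_update(bel, u, z, p_trans, p_meas):
    """One prediction + correction step over a finite state set.

    bel:     dict state -> belief bel(x)
    p_trans: p_trans[u][x_prev][x] = p(x | u, x_prev)
    p_meas:  p_meas[z][x] = p(z | x)
    """
    # Prediction: total probability over all previous states
    bel_bar = {x: sum(p_trans[u][xp][x] * bel[xp] for xp in bel)
               for x in bel}
    # Correction: weight by the measurement likelihood, then normalize
    unnorm = {x: p_meas[z][x] * bel_bar[x] for x in bel}
    eta = sum(unnorm.values())
    return {x: v / eta for x, v in unnorm.items()}

# Hypothetical models: "push" tends to open a closed door,
# and the sensor reports "open" correctly 80% of the time.
p_trans = {"push": {"open":   {"open": 1.0, "closed": 0.0},
                    "closed": {"open": 0.8, "closed": 0.2}}}
p_meas = {"sense_open": {"open": 0.8, "closed": 0.3}}

bel = {"open": 0.5, "closed": 0.5}
bel = discrete_bayes_update(bel, "push", "sense_open", p_trans, p_meas)
# After one push and one "open" reading, belief in "open" is 0.96
```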
Continuous state
The DBF can be used to approximate continuous state spaces; it is then called a histogram filter. The space is divided into mutually non-overlapping intervals (bins, or grid elements).

Approximation
If the state is discrete, the following pdfs are well defined. If the state is continuous, an approximation is necessary; e.g., the mean value of the state over each bin (normalized).
Histogram transformation
The histogram of the transformed random variable is computed by passing multiple points from each histogram bin through the nonlinear function.

Decomposition of the state space
Bin definition: the histogram approach requires a decomposition of the state space.
- Static decomposition partitions the state space into a fixed, predefined number of mutually non-overlapping subsets: easier to implement, but demands high computational resources.
- Dynamic decomposition adapts the decomposition to the shape of the posterior distribution: reduced computational resources, added algorithmic complexity.
Density trees are an example of dynamic decomposition.
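The histogram-transformation idea above (push several points per bin through the nonlinearity, then re-histogram their weights) can be sketched as follows. The bin edges and the function g(x) = x^2 are illustrative assumptions.

```python
import numpy as np

# Sketch: transform a histogram through a nonlinear function by sampling
# a few points inside each bin, mapping them through g, and accumulating
# their (shared) bin mass into an output histogram.

def transform_histogram(probs, edges, g, out_edges, points_per_bin=10):
    samples, weights = [], []
    for p, lo, hi in zip(probs, edges[:-1], edges[1:]):
        xs = np.linspace(lo, hi, points_per_bin)       # points in this bin
        samples.append(g(xs))
        weights.append(np.full(points_per_bin, p / points_per_bin))
    hist, _ = np.histogram(np.concatenate(samples),
                           bins=out_edges,
                           weights=np.concatenate(weights))
    return hist / hist.sum()

edges = np.linspace(-1.0, 1.0, 11)      # 10 input bins over [-1, 1]
probs = np.full(10, 0.1)                # uniform input histogram
out = transform_histogram(probs, edges, lambda x: x**2,
                          np.linspace(0.0, 1.0, 6))
# The mass of y = x**2 concentrates near 0, as the exact density would
```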
Density trees
The space decomposition is recursive. It adapts the resolution to the posterior probability: the less likely a region is, the coarser its decomposition. This gives a compact representation: higher approximation quality with the same number of bins.

Topological and grid maps
Static state and the binary Bayes filter
Binary Bayes filters are used when the state is both static and binary.

Binary Bayes filter
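The binary Bayes filter for a static state is usually implemented in log-odds form, where each measurement simply adds a term. The inverse sensor model values below are illustrative assumptions.

```python
import math

# Log-odds binary Bayes filter for a static binary state, e.g. a grid
# cell being occupied/free. Measurement values are hypothetical.

def logodds(p):
    return math.log(p / (1.0 - p))

def binary_bayes(l, p_x_given_z, l0=0.0):
    # l_t = l_{t-1} + log-odds of the inverse sensor model - prior log-odds
    return l + logodds(p_x_given_z) - l0

l = 0.0                      # prior belief p(x) = 0.5, i.e. log-odds 0
for p in (0.7, 0.7, 0.3):    # three inverse-model readings
    l = binary_bayes(l, p)

# Recover the probability from the final log-odds
p_occ = 1.0 - 1.0 / (1.0 + math.exp(l))
# Two 0.7 readings and one 0.3 reading cancel to a net 0.7 belief
```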
Particle filters: key concepts
Particle filters (PF) are another nonparametric implementation of the Bayes filter. They approximate the posterior by a finite number of parameters (as in histogram filters). Key idea: the posterior belief is represented by a set of state samples drawn from the distribution. These state samples are called particles. A particle is a hypothesis on what we think the true world state might be at time t. The likelihood for a state hypothesis x_t to be included in the particle set shall be proportional to its Bayes filter posterior bel(x_t).

Particle filters: examples
The denser a region is populated by samples, the more likely it is that the true state belongs to that region.
- Normal distribution
- Multimodal distribution
Particle filters: mathematical description

Particle filter algorithm
- A hypothetical state is generated for each particle.
- The importance factor is used to incorporate the measurement into the particle set.
- Resampling (importance resampling): M particles are drawn with replacement from the temporary set; the probability of drawing each particle is given by its importance weight.
At the end of the process
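The three steps above (sample a hypothetical state, weight it by the measurement likelihood, resample with replacement) can be sketched for a 1-D state. The additive-noise motion model and Gaussian measurement likelihood are illustrative assumptions.

```python
import math
import random

# One particle filter step for a 1-D state: predict, weight, resample.
# Motion noise and measurement noise levels are hypothetical.

def particle_filter_step(particles, u, z, sigma_motion=0.5, sigma_meas=1.0):
    # 1. Hypothetical next state per particle (prediction)
    proposed = [x + u + random.gauss(0.0, sigma_motion) for x in particles]
    # 2. Importance factors: measurement likelihood p(z | x)
    weights = [math.exp(-0.5 * ((z - x) / sigma_meas) ** 2) for x in proposed]
    # 3. Draw M particles with replacement, probability given by the weights
    return random.choices(proposed, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(-10.0, 10.0) for _ in range(1000)]
true_x = 0.0
for _ in range(5):
    true_x += 1.0                                   # command u = 1
    particles = particle_filter_step(particles, 1.0, true_x)
mean = sum(particles) / len(particles)              # close to true_x = 5
```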
Resampling
Resampling is an important step to correctly approximate the posterior belief. It can be seen as a probabilistic implementation of the survival-of-the-fittest model: it concentrates the particles on regions of the state space that have high posterior probability. We will discuss resampling in more detail later.

Importance sampling
Starting with samples drawn from a distribution g, we want to compute an expectation over a probability function f.
Importance sampling
Target distribution f: this is what we want.

Importance sampling
Proposal distribution g: this is what we have.
Importance sampling
This is what we obtain.

Practical considerations and properties
- Density estimation (density extraction): we want a continuous description of the belief, not the discrete approximation given by the particles.
- Sampling variance: statistics computed from the particles differ from statistics of the original densities.
- Resampling
- Sampling bias and particle deprivation: not treated here.
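The importance sampling scheme above (sample from the proposal g, correct each sample by the weight w = f(x)/g(x)) can be sketched as follows. The particular target and proposal densities are illustrative assumptions.

```python
import math
import random

# Self-normalized importance sampling: estimate the mean of a target f
# (here a normal with mean 2, std 1) using samples from a proposal g
# (a wider normal centred at 0). Both densities are hypothetical.

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

random.seed(1)
xs = [random.gauss(0.0, 3.0) for _ in range(100_000)]          # from g
ws = [normal_pdf(x, 2.0, 1.0) / normal_pdf(x, 0.0, 3.0) for x in xs]

# Weighted average corrects for sampling from the "wrong" distribution
estimate = sum(w * x for w, x in zip(ws, xs)) / sum(ws)        # close to 2
```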
Density estimation
Methods:
1. Gaussian approximation. It is simple, but captures only a rough, unimodal approximation of the distribution.
2. Histogram approximation. It can represent multimodal distributions and is computationally highly efficient: the complexity of computing the density at any state point is independent of the number of particles.
3. Kernel density approximation. It can represent multimodal distributions, with smoothness and algorithmic simplicity; the complexity of computing the density at any state point is linear in the number of particles.

Gaussian approximation
Histogram approximation

Kernel approximation
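Two of the density-extraction methods above can be sketched directly from a particle set: the Gaussian approximation uses the sample mean and variance, and the kernel approximation places one kernel per particle (hence the per-query cost linear in the number of particles). The particle values and bandwidth are illustrative assumptions.

```python
import math
import statistics

# Hypothetical 1-D particle set
particles = [1.8, 2.1, 2.0, 2.4, 1.7, 2.0]

# 1. Gaussian approximation: fit mean and variance of the particles
mu = statistics.fmean(particles)
var = statistics.pvariance(particles, mu)

# 3. Kernel density approximation: one Gaussian kernel per particle,
#    so evaluating the density is linear in the number of particles.
def normal_pdf(x, m, s):
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def kernel_density(x, ps, h=0.3):
    return sum(normal_pdf(x, p, h) for p in ps) / len(ps)

# The extracted density is continuous: it can be queried at any state,
# and is highest where the particles are densest (around 2.0 here).
```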
Sampling variance
250 particles vs. 25 particles

Resampling
Sampling variance is amplified through repetitive resampling. Look at step 3 of the algorithm: it may happen that no command signal u_t is applied, so no new states are introduced at successive steps. Particles are erased and new ones are not created, until M identical copies of a single particle survive. The variance of the particle set decreases, but the variance of the particle set as an estimator of the true belief increases.
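A standard remedy for the variance amplification described above, not detailed on the slides, is systematic ("low-variance") resampling: instead of M independent draws, a single random offset selects M evenly spaced points along the cumulative weights, so a particle with weight w is copied close to its expected w*M times.

```python
import random

# Systematic (low-variance) resampling: one random number for all M
# draws, which reduces the extra variance of independent resampling.

def systematic_resample(particles, weights):
    M = len(particles)
    step = sum(weights) / M
    r = random.uniform(0.0, step)     # single random offset
    out, cum, i = [], weights[0], 0
    for m in range(M):
        target = r + m * step         # evenly spaced pointers
        while cum < target:           # advance to the particle owning it
            i += 1
            cum += weights[i]
        out.append(particles[i])
    return out

random.seed(2)
# All the weight on "b": every resampled particle must be "b"
res = systematic_resample(["a", "b", "c"], [0.0, 1.0, 0.0])
```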
Summary and conclusions
1. The histogram filter decomposes the state space into finitely many convex regions.
2. It represents the cumulative posterior probability of each region by a single numerical value.
3. Many state space decomposition techniques exist. The granularity of the decomposition may or may not depend on the structure of the environment; when it does, the decomposition is called topological.
4. An alternative nonparametric technique is the particle filter algorithm. It is easy to implement and, with due care, is the most versatile of all Bayes filter algorithms.
5. Specific strategies exist to reduce the error in particle filters:
   a) reduction of the variance of the estimate that arises from the randomness of the algorithm;
   b) adaptation of the number of particles to the complexity of the posterior.

Thank you. Any questions?