Parameter Inference and Uncertainty Quantification Using Information Geometry: An Overview
Kevin Burrage
(Queensland University of Technology)
J. Sharp, A. Browning, M. Simpson
There are many ways to quantify the distance between probability distributions: the Kullback-Leibler divergence, the Hellinger distance, and Pearson's discrepancy measure, among others. Information geometry is a branch of mathematics connecting aspects of information theory, including probability theory and statistics, with concepts and techniques from differential geometry. The fundamental idea is that of a statistical manifold: a geometric representation of a space of probability distributions, in which each point on a Riemannian manifold corresponds to a distribution. The distance between two distributions is then the length of the geodesic joining the corresponding points.
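As a concrete illustration (not taken from the talk itself), the univariate Gaussian family N(mu, sigma^2) forms a two-dimensional statistical manifold whose Fisher information metric, ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2, is hyperbolic, and the resulting Fisher-Rao geodesic distance has a known closed form. The minimal Python sketch below compares this symmetric geodesic distance with the asymmetric Kullback-Leibler divergence; the function names are illustrative.

    import math

    def fisher_rao_gaussian(mu1, sigma1, mu2, sigma2):
        """Fisher-Rao (geodesic) distance between N(mu1, sigma1^2) and
        N(mu2, sigma2^2), via the closed form that follows from the
        hyperbolic geometry of the univariate Gaussian manifold."""
        arg = 1.0 + ((mu1 - mu2) ** 2 / 2.0
                     + (sigma1 - sigma2) ** 2) / (2.0 * sigma1 * sigma2)
        return math.sqrt(2.0) * math.acosh(arg)

    def kl_gaussian(mu1, sigma1, mu2, sigma2):
        """Kullback-Leibler divergence KL(N(mu1, sigma1^2) || N(mu2, sigma2^2))."""
        return (math.log(sigma2 / sigma1)
                + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * sigma2 ** 2)
                - 0.5)

    p, q = (0.0, 1.0), (1.0, 2.0)
    print(fisher_rao_gaussian(*p, *q))               # equals the value with p and q swapped
    print(kl_gaussian(*p, *q), kl_gaussian(*q, *p))  # the two orderings differ in general

Unlike the KL divergence, the geodesic distance is a true metric (symmetric and satisfying the triangle inequality), which is one reason the geometric viewpoint is attractive for inference.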
This talk gives a gentle overview of how information geometry can be used for parameter inference in mathematical models.