## information theory and inference

**2021**

Information entropies, such as the Kullback-Leibler divergence (relative entropy) or the Rényi entropies, are measures of the statistical randomness of distributions. Applied to a posterior distribution, they quantify the remaining statistical uncertainty, i.e. how well measurements have been able to improve the knowledge of a given physical model. We work on connecting more conventional tools of statistics and inference, such as likelihoods and statistical tests, with these information-theoretic concepts, with applications to cosmological data sets.
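
As a minimal sketch of this idea, assuming a toy model with Gaussian prior and posterior on a single parameter (all values below are hypothetical), the Kullback-Leibler divergence between posterior and prior measures the information gained from a measurement:

```python
import numpy as np

def kl_gaussian(mu_post, sigma_post, mu_prior, sigma_prior):
    """KL divergence D(posterior || prior) for two 1-D Gaussians, in nats.

    Quantifies the information gained when data update the prior
    N(mu_prior, sigma_prior^2) to the posterior N(mu_post, sigma_post^2).
    """
    return (np.log(sigma_prior / sigma_post)
            + (sigma_post**2 + (mu_post - mu_prior)**2) / (2.0 * sigma_prior**2)
            - 0.5)

# Hypothetical example: a unit-width prior on a parameter, sharpened by
# a measurement to a posterior centred at 0.5 with width 0.2.
info_gain = kl_gaussian(0.5, 0.2, 0.0, 1.0)
print(f"information gain: {info_gain:.3f} nats")  # larger = more informative data
```

A sharper posterior (smaller width, or a mean shifted further from the prior) yields a larger divergence, i.e. the data were more informative; an unchanged distribution gives zero.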