Title: Critique of information geometry

Abstract:
As applied to probability, information geometry fails because probability distributions do not form a metric space. Probability theory rests on a compelling foundation of elementary symmetries, which also single out information (also known as negative entropy, or Kullback-Leibler divergence) H(p;q) as the unique measure of divergence from a source probability distribution q to a destination p. Because the only compatible connective H is asymmetric in source and destination, H(p;q) ≠ H(q;p), there can be no compatible geometrical distance, which would necessarily be symmetric. Hence there is no distance relationship compatible with the structure of probability theory. Metrics g, and densities sqrt(det(g)) interpreted as prior probabilities, follow from the definition of distance and must fail likewise. Various metrics and corresponding priors have been proposed, Fisher's being the most popular, but all must behave unacceptably. This is illustrated with simple counter-examples.
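The abstract's central objection is easy to check numerically. The following is a minimal sketch (not taken from the paper), assuming the standard discrete Kullback-Leibler form H(p;q) = Σ p log(p/q) for the divergence from source q to destination p. It shows the from/to asymmetry on a pair of Bernoulli distributions, and then shows that to second order H reduces to half the Fisher metric g(θ) = 1/(θ(1−θ)), whose square root is the Jeffreys prior density that the abstract alludes to via sqrt(det(g)). The Bernoulli example is an illustrative choice, not one of the paper's counter-examples.

```python
import numpy as np

def H(p, q):
    """Kullback-Leibler information H(p;q) = sum_i p_i log(p_i / q_i):
    divergence from source distribution q to destination distribution p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def bernoulli(theta):
    """Two-cell distribution (theta, 1 - theta)."""
    return np.array([theta, 1.0 - theta])

# 1. H is from/to asymmetric, so it cannot serve as a geometric distance.
p, q = bernoulli(0.5), bernoulli(0.9)
print(H(p, q))  # ~0.5108
print(H(q, p))  # ~0.3681, != H(p, q)

# 2. Locally, H(theta + d; theta) ~ (1/2) g(theta) d^2, where
#    g(theta) = 1 / (theta * (1 - theta)) is the Fisher metric and
#    sqrt(g) = 1 / sqrt(theta * (1 - theta)) is the Jeffreys prior density.
theta, d = 0.3, 1e-4
g = 1.0 / (theta * (1.0 - theta))
print(H(bernoulli(theta + d), bernoulli(theta)))  # ~2.38e-08
print(0.5 * g * d**2)                             # ~2.38e-08
```

Because the quadratic expansion is symmetric by construction, any metric built from it (Fisher's included) necessarily discards the asymmetry that the full H carries, which is the incompatibility the abstract asserts.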
Authors:
  1. Maximum Entropy Data Consultants Ltd, Kenmare (Ireland)
Publication Date:
2014
OSTI Identifier:
22390759
Resource Type:
Journal Article
Resource Relation:
Journal Name: AIP Conference Proceedings; Journal Volume: 1636; Journal Issue: 1; Conference: MaxEnt 2013, the 33rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Canberra, ACT (Australia), 15-20 Dec 2013; Other Information: (c) 2014 AIP Publishing LLC; Country of input: International Atomic Energy Agency (IAEA)
Country of Publication:
United States
Language:
English
Subject:
71 CLASSICAL AND QUANTUM MECHANICS, GENERAL PHYSICS; ASYMMETRY; DISTANCE; ENTROPY; FOUNDATIONS; GEOMETRY; INFORMATION; METRICS; PROBABILITY; SPACE; SYMMETRY