OSTI.GOV · U.S. Department of Energy
Office of Scientific and Technical Information

Title: Critique of information geometry

Journal Article · AIP Conference Proceedings
DOI: https://doi.org/10.1063/1.4903705 · OSTI ID: 22390759
[1] Maximum Entropy Data Consultants Ltd, Kenmare (Ireland)

As applied to probability, information geometry fails because probability distributions do not form a metric space. Probability theory rests on a compelling foundation of elementary symmetries, which also support information (also known as negative entropy, or Kullback-Leibler divergence) H(p;q) as the unique measure of divergence from a source probability distribution q to a destination p. Because the only compatible connective H is asymmetric in source and destination, H(p;q) ≠ H(q;p), there can be no compatible geometrical distance (which would necessarily be symmetric). Hence there is no distance relationship compatible with the structure of probability theory. Metrics g, and densities sqrt(det(g)) interpreted as prior probabilities, follow from the definition of distance, and must fail likewise. Various metrics and corresponding priors have been proposed, Fisher's being the most popular, but all must behave unacceptably. This is illustrated with simple counter-examples.
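The asymmetry at the heart of the abstract's argument is easy to verify numerically. The following sketch (not from the paper; distributions p and q are arbitrary illustrative choices) computes the Kullback-Leibler divergence in both directions for a pair of two-state distributions and shows that H(p;q) and H(q;p) differ, which is why no symmetric distance can be built from it:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence H(p;q) = sum_i p_i * log(p_i / q_i),
    measuring divergence from source distribution q to destination p (in nats).
    Terms with p_i = 0 contribute zero by the usual convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative two-state distributions (arbitrary choice, not from the paper)
p = [0.9, 0.1]
q = [0.5, 0.5]

forward = kl_divergence(p, q)  # H(p;q)
reverse = kl_divergence(q, p)  # H(q;p)

# The two directions give different values: H(p;q) != H(q;p),
# so H cannot serve as a geometrical (symmetric) distance.
print(forward, reverse)
```

Any symmetrized combination (e.g. averaging the two directions) would be a different functional, no longer the unique divergence singled out by the symmetry arguments the abstract invokes.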

Journal Information:
AIP Conference Proceedings, Vol. 1636, Issue 1; Conference: MaxEnt 2013: 33rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Canberra, ACT (Australia), 15-20 Dec 2013; Other Information: (c) 2014 AIP Publishing LLC; Country of input: International Atomic Energy Agency (IAEA); ISSN 0094-243X
Country of Publication:
United States
Language:
English