U.S. Department of Energy
Office of Scientific and Technical Information

Generalizing Information to the Evolution of Rational Belief

Journal Article · Entropy
DOI: https://doi.org/10.3390/e22010108 · OSTI ID: 1605735

Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s concept of entropy include realization information, Kullback–Leibler divergence, Lindley’s information in experiment, cross entropy, and mutual information. We derive a general theory of information from first principles that accounts for evolving belief and recovers all of these measures. Rather than simply gauging uncertainty, information is understood in this theory to measure change in belief. We may then regard entropy as the information we expect to gain upon realization of a discrete latent random variable. This theory of information is compatible with the Bayesian paradigm in which rational belief is updated as evidence becomes available. Furthermore, this theory admits novel measures of information with well-defined properties, which we explored in both analysis and experiment. This view of information illuminates the study of machine learning by allowing us to quantify information captured by a predictive model and distinguish it from residual information contained in training data. We gain related insights regarding feature selection, anomaly detection, and novel Bayesian approaches.
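As a concrete illustration of the measures named in the abstract, the sketch below computes Shannon entropy, cross entropy, and Kullback–Leibler divergence for discrete distributions. This is a minimal illustration of the standard definitions, not code from the paper; the example prior/posterior distributions are invented for demonstration. Under the paper's view, the KL divergence from prior to posterior quantifies the information gained when belief is updated.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i log q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = H(p, q) - H(p).

    Nonnegative, and zero exactly when p == q; read here as the
    information gained in moving belief from q to p.
    """
    return cross_entropy(p, q) - entropy(p)

# Hypothetical belief update over three outcomes.
prior = [0.5, 0.25, 0.25]
posterior = [0.7, 0.2, 0.1]

print(entropy(prior))                     # expected information on realization
print(kl_divergence(posterior, prior))    # information gained by the update
```

Entropy here plays the role the abstract describes: the information we expect to gain upon realization of a discrete latent random variable, since H(p) equals the expected KL divergence from p to the degenerate distribution on the realized outcome.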

Research Organization:
Sandia National Laboratories (SNL-CA), Livermore, CA (United States)
Sponsoring Organization:
USDOE National Nuclear Security Administration (NNSA)
Grant/Contract Number:
AC04-94AL85000
OSTI ID:
1605735
Alternate ID(s):
OSTI ID: 1618088
Report Number(s):
SAND-2020-3103J; SAND-2019-13298J; 684733
Journal Information:
Entropy, Vol. 22, Issue 1; ISSN 1099-4300
Publisher:
MDPI
Country of Publication:
United States
Language:
English

References (39)

Principal Information Theoretic Approaches journal December 2000
Information Properties of Order Statistics and Spacings journal January 2004
On a Measure of the Information Provided by an Experiment journal December 1956
On the Distribution Function of Additive Functions journal January 1946
Sur une généralisation des intégrales de M. J. Radon journal January 1930
A Mathematical Theory of Communication journal October 1948
A Mathematical Theory of Communication journal July 1948
On rational betting systems journal March 1962
On rational betting systems journal April 1964
Understanding predictive information criteria for Bayesian models journal August 2013
Generalized statistics: yet another generalization journal September 2004
Generalized Shannon–Khinchin axioms and uniqueness theorem for pseudo-additive entropies journal October 2014
Friendship stability in adolescence is associated with ventral striatum responses to vicarious rewards journal January 2021
Skillful statistical models to predict seasonal wind speed and solar radiation in a Yangtze River estuary case study journal May 2020
Kullback-Leibler information as a basis for strong inference in ecological studies journal January 2001
Capturing the Intangible Concept of Information journal December 1994
Dynamic Coherence and Probability Kinematics journal March 1987
Information Theory and Statistical Mechanics journal May 1957
Group entropies, correlation laws, and zeta functions journal August 2011
Maximum Entropy Principle in Statistical Inference: Case for Non-Shannonian Entropies journal March 2019
Gradient-based learning applied to document recognition journal January 1998
A new look at the statistical model identification journal December 1974
Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy journal January 1980
Properties of cross-entropy minimization journal July 1981
Information Measures in Perspective journal December 2010
The Theory of Information journal January 1951
Probability, Frequency and Reasonable Expectation journal January 1946
A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions journal January 2011
Bayes' Method for Bookies journal August 1969
Maximum Entropy for Hypothesis Formulation, Especially for Multidimensional Contingency Tables journal September 1963
On Information and Sufficiency journal March 1951
Estimating the Dimension of a Model journal March 1978
Expected Information as Expected Utility journal May 1979
On confirmation and rational betting journal September 1955
Understanding predictive information criteria for Bayesian models preprint January 2013
Generalized Shannon-Khinchin Axioms and Uniqueness Theorem for Pseudo-additive Entropies text January 2013
Maximum Entropy Principle in statistical inference: case for non-Shannonian entropies text January 2018
Generalized statistics: yet another generalization text January 2003

Cited By (1)


Similar Records

Jensen-Shannon divergence as a measure of distinguishability between mixed quantum states
Journal Article · November 2005 · Physical Review. A · OSTI ID: 20786462

On variational definition of quantum entropy
Journal Article · January 2015 · AIP Conference Proceedings · OSTI ID: 22390863