OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Generalizing Information to the Evolution of Rational Belief

Journal Article · Entropy
DOI: https://doi.org/10.3390/e22010108 · OSTI ID: 1605735

Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s concept of entropy include realization information, Kullback–Leibler divergence, Lindley’s information in experiment, cross entropy, and mutual information. We derive a general theory of information from first principles that accounts for evolving belief and recovers all of these measures. Rather than simply gauging uncertainty, information is understood in this theory to measure change in belief. We may then regard entropy as the information we expect to gain upon realization of a discrete latent random variable. This theory of information is compatible with the Bayesian paradigm in which rational belief is updated as evidence becomes available. Furthermore, this theory admits novel measures of information with well-defined properties, which we explore in both analysis and experiment. This view of information illuminates the study of machine learning by allowing us to quantify information captured by a predictive model and distinguish it from residual information contained in training data. We gain related insights regarding feature selection, anomaly detection, and novel Bayesian approaches.
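The measures the abstract lists are related by a standard identity: cross entropy decomposes as entropy plus Kullback–Leibler divergence, H(p, q) = H(p) + D(p‖q), which matches the paper's framing of information as change in belief (D(p‖q) is the information gained when belief q is updated to p). A minimal numerical sketch in Python with NumPy; the function names and example distributions are illustrative, not taken from the paper:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats (0 log 0 := 0)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q): the information gained
    when belief q is updated to belief p."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i log q_i."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(q[nz]))

p = [0.5, 0.5]  # realized belief: a fair coin
q = [0.9, 0.1]  # prior belief: a heavily biased coin
print(entropy(p))           # ~0.693 nats (log 2)
print(kl_divergence(p, q))  # ~0.511 nats
print(cross_entropy(p, q))  # ~1.204 nats

# The decomposition H(p, q) = H(p) + D(p || q) holds numerically:
assert np.isclose(cross_entropy(p, q), entropy(p) + kl_divergence(p, q))
```

Because D(p‖q) is zero exactly when p = q, the cross entropy reduces to the entropy when belief does not change, consistent with reading entropy as the information expected upon realization.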

Research Organization:
Sandia National Lab. (SNL-CA), Livermore, CA (United States)
Sponsoring Organization:
USDOE National Nuclear Security Administration (NNSA)
Grant/Contract Number:
AC04-94AL85000
OSTI ID:
1605735
Alternate ID(s):
OSTI ID: 1618088
Report Number(s):
SAND-2020-3103J; SAND-2019-13298J; ENTRFG; 684733
Journal Information:
Entropy, Vol. 22, Issue 1; ISSN 1099-4300
Publisher:
MDPI
Country of Publication:
United States
Language:
English
Citation Metrics:
Cited by: 4 works (citation information provided by Web of Science)
