DOE PAGES
U.S. Department of Energy
Office of Scientific and Technical Information

Title: Generalizing Information to the Evolution of Rational Belief

Abstract

Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s concept of entropy include realization information, Kullback–Leibler divergence, Lindley’s information in experiment, cross entropy, and mutual information. We derive a general theory of information from first principles that accounts for evolving belief and recovers all of these measures. Rather than simply gauging uncertainty, information is understood in this theory to measure change in belief. We may then regard entropy as the information we expect to gain upon realization of a discrete latent random variable. This theory of information is compatible with the Bayesian paradigm in which rational belief is updated as evidence becomes available. Furthermore, this theory admits novel measures of information with well-defined properties, which we explore in both analysis and experiment. This view of information illuminates the study of machine learning by allowing us to quantify information captured by a predictive model and distinguish it from residual information contained in training data. We gain related insights regarding feature selection, anomaly detection, and novel Bayesian approaches.
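As a quick orientation to the measures named above, the following minimal NumPy sketch (illustrative only, not code from the paper; the distributions p and q are arbitrary examples) computes realization information, Shannon entropy, Kullback–Leibler divergence, and cross entropy for discrete distributions, and checks the abstract's reading of entropy as the information we expect to gain upon realization.

import numpy as np

def realization_information(p, i):
    # Self-information of outcome i under belief p: -log p_i.
    # Equivalently, the change in belief D(delta_i || p) when
    # realization replaces p with a point mass on outcome i.
    return -np.log(p[i])

def entropy(p):
    # Shannon entropy H(p) = -sum_i p_i log p_i, with 0 log 0 = 0.
    p = np.asarray(p, dtype=float)
    logs = np.log(p, out=np.zeros_like(p), where=p > 0)
    return -np.sum(p * logs)

def kl_divergence(q, p):
    # Kullback-Leibler divergence D(q || p) = sum_i q_i log(q_i / p_i),
    # read here as the information gained in updating belief from p to q.
    # Assumes q_i = 0 wherever p_i = 0.
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    logs = np.log(q / p, out=np.zeros_like(q), where=q > 0)
    return np.sum(q * logs)

def cross_entropy(q, p):
    # Cross entropy H(q, p) = H(q) + D(q || p).
    return entropy(q) + kl_divergence(q, p)

p = np.array([0.5, 0.25, 0.25])  # prior belief over three outcomes

# Entropy as expected information gain upon realization:
# E_p[-log p_i] = H(p).
expected_gain = sum(p[i] * realization_information(p, i)
                    for i in range(len(p)))
assert np.isclose(expected_gain, entropy(p))

# A Bayesian update that rules out the third outcome changes belief
# from p to q; the information gained is D(q || p).
q = np.array([2.0 / 3.0, 1.0 / 3.0, 0.0])
print(f"H(p)      = {entropy(p):.4f} nats")
print(f"D(q || p) = {kl_divergence(q, p):.4f} nats")
print(f"H(q, p)   = {cross_entropy(q, p):.4f} nats")

Each quantity here is a change in belief between two distributions, which is the unifying view the abstract describes; the generalized measures derived in the paper itself are not sketched here.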

Authors:
Duersch, Jed A. [1]; Catanach, Thomas A. [1]
  1. Sandia National Lab. (SNL-CA), Livermore, CA (United States)
Publication Date:
January 16, 2020
Research Org.:
Sandia National Lab. (SNL-CA), Livermore, CA (United States)
Sponsoring Org.:
USDOE National Nuclear Security Administration (NNSA)
OSTI Identifier:
1605735
Alternate Identifier(s):
OSTI ID: 1618088
Report Number(s):
SAND-2020-3103J; SAND-2019-13298J
Journal ID: ISSN 1099-4300; ENTRFG; 684733
Grant/Contract Number:  
AC04-94AL85000
Resource Type:
Accepted Manuscript
Journal Name:
Entropy
Additional Journal Information:
Journal Volume: 22; Journal Issue: 1; Journal ID: ISSN 1099-4300
Publisher:
MDPI
Country of Publication:
United States
Language:
English
Subject:
97 MATHEMATICS AND COMPUTING; information; Bayesian inference; entropy; self information; mutual information; Kullback–Leibler divergence; Lindley information; maximal uncertainty; proper utility

Citation Formats

Duersch, Jed A., and Catanach, Thomas A. Generalizing Information to the Evolution of Rational Belief. United States: N. p., 2020. Web. doi:10.3390/e22010108.
Duersch, Jed A., & Catanach, Thomas A. Generalizing Information to the Evolution of Rational Belief. United States. https://doi.org/10.3390/e22010108
Duersch, Jed A., and Catanach, Thomas A. 2020. "Generalizing Information to the Evolution of Rational Belief". United States. https://doi.org/10.3390/e22010108. https://www.osti.gov/servlets/purl/1605735.
@article{osti_1605735,
title = {Generalizing Information to the Evolution of Rational Belief},
author = {Duersch, Jed A. and Catanach, Thomas A.},
abstractNote = {Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s concept of entropy include realization information, Kullback–Leibler divergence, Lindley’s information in experiment, cross entropy, and mutual information. We derive a general theory of information from first principles that accounts for evolving belief and recovers all of these measures. Rather than simply gauging uncertainty, information is understood in this theory to measure change in belief. We may then regard entropy as the information we expect to gain upon realization of a discrete latent random variable. This theory of information is compatible with the Bayesian paradigm in which rational belief is updated as evidence becomes available. Furthermore, this theory admits novel measures of information with well-defined properties, which we explore in both analysis and experiment. This view of information illuminates the study of machine learning by allowing us to quantify information captured by a predictive model and distinguish it from residual information contained in training data. We gain related insights regarding feature selection, anomaly detection, and novel Bayesian approaches.},
doi = {10.3390/e22010108},
journal = {Entropy},
number = 1,
volume = 22,
place = {United States},
year = {2020},
month = {jan}
}

Journal Article:
Free Publicly Available Full Text
Publisher's Version of Record

Citation Metrics:
Cited by: 4 works
Citation information provided by Web of Science


Works referenced in this record:

A Mathematical Theory of Communication
journal, July 1948


Estimating the Dimension of a Model
journal, March 1978


The Theory of Information
journal, January 1951


On rational betting systems
journal, March 1962

  • Adams, Ernest W.
  • Archiv für Mathematische Logik und Grundlagenforschung, Vol. 6, Issue 1-2
  • DOI: 10.1007/BF02025803

On confirmation and rational betting
journal, September 1955

  • Lehman, R. Sherman
  • Journal of Symbolic Logic, Vol. 20, Issue 3
  • DOI: 10.2307/2268221

On the Distribution Function of Additive Functions
journal, January 1946

  • Erdős, P.
  • The Annals of Mathematics, Vol. 47, Issue 1
  • DOI: 10.2307/1969031

Kullback-Leibler information as a basis for strong inference in ecological studies
journal, January 2001

  • Burnham, Kenneth P.; Anderson, David R.
  • Wildlife Research, Vol. 28, Issue 2
  • DOI: 10.1071/WR99107

Information Theory and Statistical Mechanics
journal, May 1957


Maximum Entropy for Hypothesis Formulation, Especially for Multidimensional Contingency Tables
journal, September 1963


Group entropies, correlation laws, and zeta functions
journal, August 2011


A new look at the statistical model identification
journal, December 1974


Generalized Shannon–Khinchin axioms and uniqueness theorem for pseudo-additive entropies
journal, October 2014

  • Ilić, Velimir M.; Stanković, Miomir S.
  • Physica A: Statistical Mechanics and its Applications, Vol. 411
  • DOI: 10.1016/j.physa.2014.05.009

On Information and Sufficiency
journal, March 1951

  • Kullback, S.; Leibler, R. A.
  • The Annals of Mathematical Statistics, Vol. 22, Issue 1
  • DOI: 10.1214/aoms/1177729694

Properties of cross-entropy minimization
journal, July 1981


On a Measure of the Information Provided by an Experiment
journal, December 1956


Understanding predictive information criteria for Bayesian models
journal, August 2013


Probability, Frequency and Reasonable Expectation
journal, January 1946


Principal Information Theoretic Approaches
journal, December 2000


Generalized statistics: yet another generalization
journal, September 2004

  • Jizba, Petr; Arimitsu, Toshihico
  • Physica A: Statistical Mechanics and its Applications, Vol. 340, Issue 1-3
  • DOI: 10.1016/j.physa.2004.03.085

Expected Information as Expected Utility
journal, May 1979


Bayes' Method for Bookies
journal, August 1969

  • Freedman, David A.; Purves, Roger A.
  • The Annals of Mathematical Statistics, Vol. 40, Issue 4
  • DOI: 10.1214/aoms/1177697494

Dynamic Coherence and Probability Kinematics
journal, March 1987

  • Skyrms, Brian
  • Philosophy of Science, Vol. 54, Issue 1
  • DOI: 10.1086/289350

Information Measures in Perspective
journal, December 2010


Gradient-based learning applied to document recognition
journal, January 1998

  • Lecun, Y.; Bottou, L.; Bengio, Y.
  • Proceedings of the IEEE, Vol. 86, Issue 11
  • DOI: 10.1109/5.726791

Capturing the Intangible Concept of Information
journal, December 1994


Sur une généralisation des intégrales de M. J. Radon [On a generalization of the integrals of M. J. Radon]
journal, January 1930


Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
journal, January 1980


On rational betting systems
journal, April 1964

  • Adams, Ernest W.
  • Archiv für Mathematische Logik und Grundlagenforschung, Vol. 6, Issue 3-4
  • DOI: 10.1007/bf01969549


A Mathematical Theory of Communication
journal, October 1948


Understanding predictive information criteria for Bayesian models
preprint, January 2013


Generalized statistics: yet another generalization
text, January 2003


Information Properties of Order Statistics and Spacings
journal, January 2004

  • Ebrahimi, N.; Soofi, E. S.; Zahedi, H.
  • IEEE Transactions on Information Theory, Vol. 50, Issue 1
  • DOI: 10.1109/tit.2003.821973

Works referencing / citing this record:

Friendship stability in adolescence is associated with ventral striatum responses to vicarious rewards
journal, January 2021

  • Schreuders, Elisabeth; Braams, Barbara R.; Crone, Eveline A.
  • Nature Communications, Vol. 12, Issue 1
  • DOI: 10.1038/s41467-020-20042-1

Skillful statistical models to predict seasonal wind speed and solar radiation in a Yangtze River estuary case study
journal, May 2020