Entropy and its Relationship with Statistics
The purpose of this report is to discuss the notion of entropy and its relationship with statistics. Our goal is to provide a way of thinking about entropy, its central role within information theory, and its relationship with statistics. We review various relationships between information theory and statistics; nearly all are well known, but unfortunately they are often not recognized. Entropy quantifies the "average amount of surprise" in a random variable and lies at the heart of information theory, which studies the transmission, processing, extraction, and utilization of information. For us, data is information. What, then, is the distinction between information theory and statistics? Information theorists work with probability distributions, whereas statisticians work with samples. In short, information theory applied to samples is the practice of statistics.
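As a concrete illustration of the distinction drawn above (a minimal sketch, not taken from the report; the function names are our own), the following Python snippet computes the Shannon entropy H(X) = -sum_x p(x) log2 p(x) of a known distribution, then contrasts it with the plug-in estimate a statistician would form from samples by substituting empirical frequencies for the true probabilities.

```python
# Sketch only: contrasts entropy of a known distribution (information theory)
# with a plug-in entropy estimate from samples (statistics).
import math
import random
from collections import Counter

def entropy(probs):
    """Shannon entropy H(p) = -sum p log2 p of a known distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def plugin_entropy(samples):
    """Estimate entropy from data by using empirical frequencies as probabilities."""
    n = len(samples)
    counts = Counter(samples)
    return entropy(c / n for c in counts.values())

if __name__ == "__main__":
    # Fair coin: the true entropy is exactly 1 bit.
    print(entropy([0.5, 0.5]))            # -> 1.0
    # From a finite sample, the plug-in estimate is close but not exact.
    random.seed(0)
    flips = [random.choice("HT") for _ in range(10_000)]
    print(plugin_entropy(flips))          # -> approximately 1.0
```

The gap between the two printed values shrinks as the sample grows, which is one way to read the abstract's closing claim: statistics is information theory carried out on samples rather than on known distributions.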
- Research Organization: Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
- Sponsoring Organization: USDOE National Nuclear Security Administration (NNSA); USDOE Laboratory Directed Research and Development (LDRD) Program
- DOE Contract Number: NA0003525
- OSTI ID: 1895025
- Report Number(s): SAND2022-15094; 711399
- Country of Publication: United States
- Language: English