An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio
Abstract
The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.
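As a minimal sketch of the abstract's central point, the example below uses the textbook Gaussian mean-shift detection model (an illustration chosen here, not drawn from the paper itself): under this model the LLR is a simple linear, information-preserving (sufficient) scalar statistic, and its classical deflection SNR reduces to the familiar mu^2/sigma^2 form that the paper generalizes as GSNR.

```python
import math

def llr_gaussian_mean_shift(x, mu, sigma2):
    """LLR log[p(x|H1)/p(x|H0)] for H1: N(mu, sigma2) vs H0: N(0, sigma2).

    The quadratic terms in x cancel, leaving a statistic linear in x --
    the 'simplest possible information-preserving transformation' in the
    sense described in the abstract (for this illustrative model).
    """
    return (mu * x - 0.5 * mu**2) / sigma2

# Hypothetical parameters for illustration only.
mu, sigma2 = 2.0, 4.0

# Deflection SNR of the observation under this model: mu^2 / sigma2.
deflection_snr = mu**2 / sigma2

print(llr_gaussian_mean_shift(1.0, mu, sigma2))
print(deflection_snr)
```

Because the LLR here is an invertible affine function of the raw observation, no detection-relevant information is discarded by the transformation, which is the sense in which the abstract calls the LLR information-preserving.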
 Authors:
 Polcari, J.
 Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Computer Science and Mathematics Division
 Publication Date:
 August 2013
 Research Org.:
 Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
 Sponsoring Org.:
 USDOE
 OSTI Identifier:
 1344244
 Grant/Contract Number:
 AC05-00OR22725
 Resource Type:
 Accepted Manuscript
 Journal Name:
 IEEE Access
 Additional Journal Information:
 Journal Volume: 1; Journal ID: ISSN 2169-3536
 Publisher:
 IEEE
 Country of Publication:
 United States
 Language:
 English
 Subject:
 97 MATHEMATICS AND COMPUTING; data compression; decision theory; detection algorithms; information measures; information theory; Kullback-Leibler divergence; log likelihood ratio; performance evaluation; performance measures; self-scaling property; signal processing algorithms; signal-to-noise ratio; statistical analysis
Citation Formats
Polcari, J. An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio. United States: N. p., 2013.
Web. doi:10.1109/ACCESS.2013.2277930.
Polcari, J. An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio. United States. https://doi.org/10.1109/ACCESS.2013.2277930
Polcari, J. "An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio". United States. https://doi.org/10.1109/ACCESS.2013.2277930. https://www.osti.gov/servlets/purl/1344244.
@article{osti_1344244,
title = {An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio},
author = {Polcari, J.},
abstractNote = {The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.},
doi = {10.1109/ACCESS.2013.2277930},
journal = {IEEE Access},
volume = {1},
place = {United States},
year = {2013},
month = {8}
}