Entropy vs. energy waveform processing: A comparison based on the heat equation
Virtually all modern imaging devices collect electromagnetic or acoustic waves and use the energy carried by these waves to determine pixel values, creating what is essentially an “energy” picture. However, waves also carry “information”, as quantified by some form of entropy, and this may also be used to produce an “information” image. Numerous published studies have demonstrated the advantages of entropy, or “information imaging”, over conventional methods. The most sensitive information measure appears to be the joint entropy of the collected wave and a reference signal. The sensitivity of repeated experimental observations of a slowly changing quantity may be defined as the mean variation (i.e., observed change) divided by the mean variance (i.e., noise). Wiener integration permits computation of the required mean values and variances as solutions to the heat equation, permitting estimation of their relative magnitudes. There always exists a reference such that joint entropy has larger variation and smaller variance than the corresponding quantities for signal energy, matching the observations of several studies. Moreover, a general prescription for finding an “optimal” reference for the joint entropy emerges, which also has been validated in several studies.
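The contrast the abstract draws — an “energy” pixel value versus an “information” (joint-entropy) pixel value, and sensitivity as mean variation over mean variance — can be illustrated with a minimal sketch. This is not the paper's method (the paper works with the joint amplitude density and Wiener integration); it is a hypothetical histogram-based estimator with made-up signals, included only to make the quantities concrete.

```python
import numpy as np

def joint_entropy(f, g, bins=64):
    """Estimate the joint entropy H(f, g) of two sampled waveforms from a
    2D histogram of their amplitude pairs (illustrative stand-in for the
    density-based joint entropy discussed in the paper)."""
    hist, _, _ = np.histogram2d(f, g, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return -np.sum(p * np.log(p))

def signal_energy(f):
    """Conventional 'energy' pixel value: sum of squared samples."""
    return np.sum(f ** 2)

def sensitivity(before, after):
    """Sensitivity as defined in the abstract: mean variation (observed
    change between repeated observations) divided by mean variance (noise)."""
    variation = abs(np.mean(after) - np.mean(before))
    variance = 0.5 * (np.var(before) + np.var(after))
    return variation / variance

# Hypothetical data: a collected wave that is a noisy copy of a reference.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2048)
reference = np.sin(2 * np.pi * 5 * t)
wave = reference + 0.1 * rng.standard_normal(t.size)

print("joint entropy:", joint_entropy(wave, reference))
print("signal energy:", signal_energy(wave))
```

The choice of reference matters: the abstract's claim is that some reference always makes the joint-entropy statistic both more responsive (larger variation) and quieter (smaller variance) than the energy statistic, which is why the paper seeks an “optimal” reference rather than fixing one in advance.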
 Authors:
 Hughes, Michael S.^{[1]}; McCarthy, John E.^{[2]}; Bruillard, Paul J.^{[1]}; Marsh, Jon N.^{[2]}; Wickline, Samuel A.^{[2]}
 ^{[1]} Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)
 ^{[2]} Washington University in St. Louis, St. Louis, MO (United States)
 Publication Date:
 May 2015
 Grant/Contract Number:
 NIH EB002168; 5R21EB018095; NSF DMS 1300280; AC05-76RL01830
 Type:
 Accepted Manuscript
 Journal Name:
 Entropy
 Additional Journal Information:
 Journal Volume: 17; Journal Issue: 6; Journal ID: ISSN 1099-4300
 Publisher:
 MDPI
 Research Org:
 Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
 Sponsoring Org:
 USDOE
 Country of Publication:
 United States
 Language:
 English
 Subject:
 46 INSTRUMENTATION RELATED TO NUCLEAR SCIENCE AND TECHNOLOGY; information wave; optimal detection; entropy image; joint entropy
 OSTI Identifier:
 1211541
Hughes, Michael S., McCarthy, John E., Bruillard, Paul J., Marsh, Jon N., and Wickline, Samuel A. 2015. "Entropy vs. energy waveform processing: A comparison based on the heat equation". Entropy 17(6). United States. doi:10.3390/e17063518. https://www.osti.gov/servlets/purl/1211541.
@article{osti_1211541,
title = {Entropy vs. energy waveform processing: A comparison based on the heat equation},
author = {Hughes, Michael S. and McCarthy, John E. and Bruillard, Paul J. and Marsh, Jon N. and Wickline, Samuel A.},
abstractNote = {Virtually all modern imaging devices collect electromagnetic or acoustic waves and use the energy carried by these waves to determine pixel values, creating what is essentially an “energy” picture. However, waves also carry “information”, as quantified by some form of entropy, and this may also be used to produce an “information” image. Numerous published studies have demonstrated the advantages of entropy, or “information imaging”, over conventional methods. The most sensitive information measure appears to be the joint entropy of the collected wave and a reference signal. The sensitivity of repeated experimental observations of a slowly changing quantity may be defined as the mean variation (i.e., observed change) divided by the mean variance (i.e., noise). Wiener integration permits computation of the required mean values and variances as solutions to the heat equation, permitting estimation of their relative magnitudes. There always exists a reference such that joint entropy has larger variation and smaller variance than the corresponding quantities for signal energy, matching the observations of several studies. Moreover, a general prescription for finding an “optimal” reference for the joint entropy emerges, which also has been validated in several studies.},
doi = {10.3390/e17063518},
journal = {Entropy},
number = 6,
volume = 17,
place = {United States},
year = {2015},
month = {5}
}