
SciTech Connect

Title: Entropy vs. energy waveform processing: A comparison based on the heat equation

Virtually all modern imaging devices collect electromagnetic or acoustic waves and use the energy carried by these waves to determine pixel values, creating what is essentially an “energy” picture. However, waves also carry “information”, as quantified by some form of entropy, which may likewise be used to produce an “information” image. Numerous published studies have demonstrated the advantages of entropy-based, or “information”, imaging over conventional methods. The most sensitive information measure appears to be the joint entropy of the collected wave and a reference signal. The sensitivity of repeated experimental observations of a slowly changing quantity may be defined as the mean variation (i.e., observed change) divided by the mean variance (i.e., noise). Wiener integration allows the required mean values and variances to be computed as solutions of the heat equation, so that their relative magnitudes can be estimated. There always exists a reference such that the joint entropy has larger variation and smaller variance than the corresponding quantities for signal energy, which matches the observations of several studies. Moreover, a general prescription emerges for finding an “optimal” reference for the joint entropy, which has also been validated in several studies.
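For concreteness, a minimal sketch of the two quantities named above, assuming the standard definitions of sensitivity and differential joint entropy (the paper's exact notation and normalization may differ):

\[
\mathrm{sensitivity}(Q) \;=\; \frac{\overline{\Delta Q}}{\overline{\sigma_{Q}^{2}}},
\qquad
H(W,R) \;=\; -\iint p_{W,R}(w,r)\,\log p_{W,R}(w,r)\,\mathrm{d}w\,\mathrm{d}r ,
\]

where \(Q\) denotes the measured quantity (either the joint entropy \(H(W,R)\) or the signal energy \(\int W(t)^{2}\,\mathrm{d}t\)), \(\overline{\Delta Q}\) is its mean observed change across repeated observations, \(\overline{\sigma_{Q}^{2}}\) is its mean variance, \(W\) is the collected waveform, \(R\) is the reference signal, and \(p_{W,R}\) is their joint density. The claim of the abstract is that a reference \(R\) can always be chosen so that the entropy-based sensitivity exceeds the energy-based one.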
Affiliations:
  1. Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)
  2. Washington University in St. Louis, St. Louis, MO (United States)
Publication Date:
OSTI Identifier:
Grant/Contract Number:
NIH EB002168; 5R21EB018095; NSF DMS 1300280; AC05-76RL01830
Resource Type:
Accepted Manuscript
Journal Name:
Entropy
Additional Journal Information:
Journal Volume: 17; Journal Issue: 6; Journal ID: ISSN 1099-4300
Research Org:
Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)
Sponsoring Org:
Country of Publication:
United States
Subject:
46 INSTRUMENTATION RELATED TO NUCLEAR SCIENCE AND TECHNOLOGY; information wave; optimal detection; entropy image; joint entropy