OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: DeepCMB: Lensing Reconstruction of the Cosmic Microwave Background with Deep Neural Networks

Abstract

Next-generation cosmic microwave background (CMB) experiments will have lower noise and therefore increased sensitivity, enabling improved constraints on fundamental physics parameters such as the sum of neutrino masses and the tensor-to-scalar ratio r. Achieving competitive constraints on these parameters requires high signal-to-noise extraction of the projected gravitational potential from the CMB maps. Standard methods for reconstructing the lensing potential employ the quadratic estimator (QE). However, the QE performs suboptimally at the low noise levels expected in upcoming experiments. Other methods, like maximum likelihood estimators (MLE), are under active development. In this work, we demonstrate reconstruction of the CMB lensing potential with deep convolutional neural networks (CNNs), specifically a ResUNet. The network is trained and tested on simulated data, and otherwise has no physical parametrization related to the physical processes of the CMB and gravitational lensing. We show that, over a wide range of angular scales, ResUNets recover the input gravitational potential with a higher signal-to-noise ratio than the QE method, reaching levels comparable to analytic approximations of MLE methods. We demonstrate that the network outputs quantifiably different lensing maps when given input CMB maps generated with different cosmologies. We also show that we can use the reconstructed lensing map for cosmological parameter estimation. This application of CNNs provides a few innovations at the intersection of cosmology and machine learning. First, while training and regressing on images, we predict a continuous-variable field rather than discrete classes. Second, we are able to establish uncertainty measures for the network output that are analogous to those of standard methods. We expect this approach to excel in capturing hard-to-model non-Gaussian astrophysical foreground and noise contributions.
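The abstract describes an image-to-image regression: a residual U-Net takes the lensed polarization maps (Q̃, Ũ) as input channels and outputs continuous-valued fields. The paper's actual architecture and training configuration are not reproduced on this page; the sketch below is only a minimal illustration, in PyTorch, of what such a ResUNet-style regressor could look like. The depth, channel widths, 128x128 patch size, and mean-squared-error loss are assumptions for illustration, not the authors' choices.

# Minimal ResUNet-style sketch (PyTorch) for image-to-image regression:
# two input channels (lensed Q~, U~) -> two output channels (unlensed E, kappa).
# Depth, channel widths, and patch size are illustrative assumptions.
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Two 3x3 convolutions with an additive skip connection (residual block)."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.body(x))

class ResUNet(nn.Module):
    """U-Net-style encoder/decoder built from residual blocks with skip connections."""
    def __init__(self, in_ch=2, out_ch=2, width=32):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, width, 3, padding=1), ResBlock(width))
        self.enc2 = nn.Sequential(nn.Conv2d(width, 2 * width, 3, stride=2, padding=1), ResBlock(2 * width))
        self.enc3 = nn.Sequential(nn.Conv2d(2 * width, 4 * width, 3, stride=2, padding=1), ResBlock(4 * width))
        self.up2 = nn.ConvTranspose2d(4 * width, 2 * width, 2, stride=2)
        self.dec2 = ResBlock(2 * width)
        self.up1 = nn.ConvTranspose2d(2 * width, width, 2, stride=2)
        self.dec1 = ResBlock(width)
        # Linear output layer: the targets are continuous-valued fields, not class labels.
        self.head = nn.Conv2d(width, out_ch, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        e3 = self.enc3(e2)
        d2 = self.dec2(self.up2(e3) + e2)   # skip connection from encoder level 2
        d1 = self.dec1(self.up1(d2) + e1)   # skip connection from encoder level 1
        return self.head(d1)

# Illustrative forward pass on a batch of 128x128-pixel simulated patches.
if __name__ == "__main__":
    model = ResUNet()
    qu_maps = torch.randn(4, 2, 128, 128)   # stand-in for lensed (Q~, U~) map pairs
    e_kappa = model(qu_maps)                # predicted (E, kappa) maps
    loss = nn.functional.mse_loss(e_kappa, torch.randn_like(e_kappa))  # pixel-wise regression loss
    print(e_kappa.shape, loss.item())

A network of this kind would be trained on many simulated (Q̃, Ũ) to (E, κ) map pairs. Because the output layer is linear and the loss is a pixel-wise regression loss, the prediction is a continuous-variable field rather than a set of class probabilities, matching the first innovation noted in the abstract.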

Authors:
Caldeira, J. [1]; Wu, W. L.K. [2]; Nord, B. [3]; Avestruz, C. [2]; Trivedi, S. [4]; Story, K. T. [5]
  1. Chicago U., EFI
  2. Chicago U., KICP
  3. Chicago U., Astron. Astrophys. Ctr.
  4. Brown U. (main)
  5. Descartes Labs, Santa Fe
Publication Date:
July 2019
Research Org.:
Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)
Sponsoring Org.:
USDOE Office of Science (SC), High Energy Physics (HEP) (SC-25)
OSTI Identifier:
1487051
Report Number(s):
arXiv:1810.01483; FERMILAB-PUB-18-515-A-CD
oai:inspirehep.net:1696852
Grant/Contract Number:  
AC02-07CH11359
Resource Type:
Journal Article: Accepted Manuscript
Journal Name:
Astronomy and Computing
Additional Journal Information:
Journal Volume: 28
Country of Publication:
United States
Language:
English
Subject:
79 ASTRONOMY AND ASTROPHYSICS

Citation Formats

Caldeira, J., Wu, W. L.K., Nord, B., Avestruz, C., Trivedi, S., and Story, K. T. DeepCMB: Lensing Reconstruction of the Cosmic Microwave Background with Deep Neural Networks. United States: N. p., 2019. Web. doi:10.1016/j.ascom.2019.100307.
Caldeira, J., Wu, W. L.K., Nord, B., Avestruz, C., Trivedi, S., & Story, K. T. DeepCMB: Lensing Reconstruction of the Cosmic Microwave Background with Deep Neural Networks. United States. doi:10.1016/j.ascom.2019.100307.
Caldeira, J., Wu, W. L.K., Nord, B., Avestruz, C., Trivedi, S., and Story, K. T. 2019. "DeepCMB: Lensing Reconstruction of the Cosmic Microwave Background with Deep Neural Networks". United States. doi:10.1016/j.ascom.2019.100307.
@article{osti_1487051,
title = {DeepCMB: Lensing Reconstruction of the Cosmic Microwave Background with Deep Neural Networks},
author = {Caldeira, J. and Wu, W. L.K. and Nord, B. and Avestruz, C. and Trivedi, S. and Story, K. T.},
abstractNote = {Next-generation cosmic microwave background (CMB) experiments will have lower noise and therefore increased sensitivity, enabling improved constraints on fundamental physics parameters such as the sum of neutrino masses and the tensor-to-scalar ratio r. Achieving competitive constraints on these parameters requires high signal-to-noise extraction of the projected gravitational potential from the CMB maps. Standard methods for reconstructing the lensing potential employ the quadratic estimator (QE). However, the QE performs suboptimally at the low noise levels expected in upcoming experiments. Other methods, like maximum likelihood estimators (MLE), are under active development. In this work, we demonstrate reconstruction of the CMB lensing potential with deep convolutional neural networks (CNNs), specifically a ResUNet. The network is trained and tested on simulated data, and otherwise has no physical parametrization related to the physical processes of the CMB and gravitational lensing. We show that, over a wide range of angular scales, ResUNets recover the input gravitational potential with a higher signal-to-noise ratio than the QE method, reaching levels comparable to analytic approximations of MLE methods. We demonstrate that the network outputs quantifiably different lensing maps when given input CMB maps generated with different cosmologies. We also show that we can use the reconstructed lensing map for cosmological parameter estimation. This application of CNNs provides a few innovations at the intersection of cosmology and machine learning. First, while training and regressing on images, we predict a continuous-variable field rather than discrete classes. Second, we are able to establish uncertainty measures for the network output that are analogous to those of standard methods. We expect this approach to excel in capturing hard-to-model non-Gaussian astrophysical foreground and noise contributions.},
doi = {10.1016/j.ascom.2019.100307},
journal = {Astron.Comput.},
volume = 28,
place = {United States},
year = {2019},
month = {7}
}

Journal Article:
Free Publicly Available Full Text
This content will become publicly available on July 24, 2020
Publisher's Version of Record

Figures / Tables:

Figure 1: We train neural networks to learn a mapping from the lensed (Q̃, Ũ) maps into the unlensed E map and the gravitational convergence map κ, extracting the underlying fields from the observed quantities. Here we illustrate this mapping using one of the realizations in the training set. The maps correspond to a patch of the sky five degrees across.
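The caption describes flat five-degree patches in which a reconstructed convergence map can be compared against the true input κ as a function of angular scale, which is how statements such as "higher signal-to-noise ratio than the QE" are typically quantified. The sketch below is a rough illustration, not the paper's analysis pipeline: it bins the flat-sky cross-correlation coefficient between a reconstructed map and the true map by multipole ell. The 128-pixel grid, the Gaussian toy maps, and the binning scheme are assumptions for demonstration only.

# Sketch: binned cross-correlation coefficient between a reconstructed kappa map
# and the true input kappa on a flat five-degree patch, as a function of multipole ell.
# Patch size, pixelization, and binning are illustrative assumptions.
import numpy as np

def flat_sky_ell(npix, patch_deg):
    """2-D multipole magnitude |ell| for an npix x npix flat-sky patch."""
    patch_rad = np.deg2rad(patch_deg)
    freq = np.fft.fftfreq(npix, d=patch_rad / npix)      # cycles per radian
    lx, ly = np.meshgrid(2 * np.pi * freq, 2 * np.pi * freq)
    return np.sqrt(lx**2 + ly**2)

def binned_cross_corr(map_a, map_b, patch_deg=5.0, nbins=20):
    """Cross-correlation coefficient r(ell) = C_ab / sqrt(C_aa * C_bb), binned in ell."""
    npix = map_a.shape[0]
    ell = flat_sky_ell(npix, patch_deg)
    fa, fb = np.fft.fft2(map_a), np.fft.fft2(map_b)
    bins = np.linspace(ell.min(), ell.max(), nbins + 1)
    idx = np.digitize(ell.ravel(), bins) - 1
    def bin_spec(x):
        # Sum Fourier-mode products falling into each ell bin.
        return np.bincount(idx, weights=x.ravel(), minlength=nbins)[:nbins]
    c_ab = bin_spec((fa * np.conj(fb)).real)
    c_aa = bin_spec(np.abs(fa) ** 2)
    c_bb = bin_spec(np.abs(fb) ** 2)
    centers = 0.5 * (bins[:-1] + bins[1:])
    return centers, c_ab / np.sqrt(c_aa * c_bb + 1e-30)

# Toy usage with correlated Gaussian maps standing in for true and reconstructed kappa.
rng = np.random.default_rng(0)
kappa_true = rng.standard_normal((128, 128))
kappa_rec = kappa_true + 0.5 * rng.standard_normal((128, 128))   # "reconstruction" = truth + noise
ells, r = binned_cross_corr(kappa_rec, kappa_true)
print(np.round(r[:5], 3))

A cross-correlation coefficient r(ell) near one over a range of multipoles would indicate that the reconstruction tracks the true convergence well on those angular scales.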
