Analytic continuation of noisy data using Adams Bashforth residual neural network
Abstract
We propose a data-driven learning framework for the analytic continuation problem in numerical quantum many-body physics. Designing an accurate and efficient framework for the analytic continuation of imaginary-time computational data is a grand challenge that has hindered meaningful links with experimental data. The standard Maximum Entropy (MaxEnt)-based method is limited by the quality of the computational data and the availability of prior information. Moreover, MaxEnt is unable to solve the inversion problem when the noise level in the data is high. Here we introduce a novel learning model for the analytic continuation problem using an Adams-Bashforth residual neural network (AB-ResNet). An advantage of this deep learning network is that it is model independent and, therefore, does not require prior information concerning the quantity of interest given by the spectral function. More importantly, the ResNet-based model achieves higher accuracy than MaxEnt for data with higher noise levels. Finally, numerical examples show that the developed AB-ResNet is able to recover the spectral function with accuracy comparable to MaxEnt when the noise level is relatively small.
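The core idea behind an AB-ResNet is that a residual block, x_{n+1} = x_n + h f(x_n), is the forward-Euler step of an ODE, and replacing Euler with a two-step Adams-Bashforth update reuses the previous block's residual: x_{n+1} = x_n + h (3/2 f_n(x_n) - 1/2 f_{n-1}(x_{n-1})). The following is a minimal NumPy sketch of that update rule, not the authors' implementation; the layer function (a tanh branch with random fixed weights), the width, depth, and step size h are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_layer(dim):
    """One residual branch f(x) = tanh(W x + b) with fixed random weights."""
    W = rng.normal(scale=0.1, size=(dim, dim))
    b = np.zeros(dim)
    return lambda x: np.tanh(W @ x + b)

def ab2_resnet(x0, layers, h=0.1):
    """Two-step Adams-Bashforth residual network:
    x_{n+1} = x_n + h * (3/2 * f_n(x_n) - 1/2 * f_{n-1}(x_{n-1})).
    The first block falls back to forward Euler (a plain ResNet step),
    since no previous residual is available yet."""
    x, f_prev = x0, None
    for f in layers:
        fx = f(x)
        if f_prev is None:
            x = x + h * fx                      # Euler start-up step
        else:
            x = x + h * (1.5 * fx - 0.5 * f_prev)
        f_prev = fx                             # reuse this residual next step
    return x

dim = 8
layers = [make_layer(dim) for _ in range(6)]
out = ab2_resnet(np.ones(dim), layers)
print(out.shape)  # prints (8,)
```

In the analytic continuation setting, the network input would be the noisy imaginary-time data and the output the discretized spectral function; this sketch only shows how the multistep structure threads residuals between consecutive blocks.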
- Authors:
- Xie, Xuping; Bao, Feng; Maier, Thomas; Webster, Clayton
- New York Univ. (NYU), NY (United States)
- Florida State Univ., Tallahassee, FL (United States)
- Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
- Univ. of Tennessee, Knoxville, TN (United States)
- Publication Date:
- April 1, 2021
- Research Org.:
- Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
- Sponsoring Org.:
- National Science Foundation (NSF); USDOE Office of Science (SC), Basic Energy Sciences (BES). Materials Sciences & Engineering Division; USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR). Scientific Discovery through Advanced Computing (SciDAC)
- OSTI Identifier:
- 1814399
- Grant/Contract Number:
- AC05-00OR22725; DMS-1620280
- Resource Type:
- Accepted Manuscript
- Journal Name:
- Discrete and Continuous Dynamical Systems - Series S
- Additional Journal Information:
- Journal Volume: 15; Journal Issue: 4; Journal ID: ISSN 1937-1632
- Publisher:
- American Institute of Mathematical Sciences (AIMS)
- Country of Publication:
- United States
- Language:
- English
- Subject:
- 97 MATHEMATICS AND COMPUTING; analytic continuation; inverse problem; stochastic optimization; machine learning; neural network
Citation Formats
Xie, Xuping, Bao, Feng, Maier, Thomas, and Webster, Clayton. Analytic continuation of noisy data using Adams Bashforth residual neural network. United States: N. p., 2021.
Web. doi:10.3934/dcdss.2021088.
Xie, Xuping, Bao, Feng, Maier, Thomas, & Webster, Clayton. Analytic continuation of noisy data using Adams Bashforth residual neural network. United States. https://doi.org/10.3934/dcdss.2021088
Xie, Xuping, Bao, Feng, Maier, Thomas, and Webster, Clayton. "Analytic continuation of noisy data using Adams Bashforth residual neural network". United States, 2021. https://doi.org/10.3934/dcdss.2021088. https://www.osti.gov/servlets/purl/1814399.
@article{osti_1814399,
title = {Analytic continuation of noisy data using Adams Bashforth residual neural network},
author = {Xie, Xuping and Bao, Feng and Maier, Thomas and Webster, Clayton},
abstractNote = {We propose a data-driven learning framework for the analytic continuation problem in numerical quantum many-body physics. Designing an accurate and efficient framework for the analytic continuation of imaginary-time computational data is a grand challenge that has hindered meaningful links with experimental data. The standard Maximum Entropy (MaxEnt)-based method is limited by the quality of the computational data and the availability of prior information. Moreover, MaxEnt is unable to solve the inversion problem when the noise level in the data is high. Here we introduce a novel learning model for the analytic continuation problem using an Adams-Bashforth residual neural network (AB-ResNet). An advantage of this deep learning network is that it is model independent and, therefore, does not require prior information concerning the quantity of interest given by the spectral function. More importantly, the ResNet-based model achieves higher accuracy than MaxEnt for data with higher noise levels. Finally, numerical examples show that the developed AB-ResNet is able to recover the spectral function with accuracy comparable to MaxEnt when the noise level is relatively small.},
doi = {10.3934/dcdss.2021088},
journal = {Discrete and Continuous Dynamical Systems - Series S},
number = 4,
volume = 15,
place = {United States},
year = {2021},
month = {apr}
}
Works referenced in this record:
Fast and efficient stochastic optimization for analytic continuation
journal, September 2016
- Bao, F.; Tang, Y.; Summers, M.
- Physical Review B, Vol. 94, Issue 12
Stable architectures for deep neural networks
journal, December 2017
- Haber, Eldad; Ruthotto, Lars
- Inverse Problems, Vol. 34, Issue 1
Implementation of the maximum entropy method for analytic continuation
journal, June 2017
- Levy, Ryan; LeBlanc, J. P. F.; Gull, Emanuel
- Computer Physics Communications, Vol. 215
Deep learning
journal, May 2015
- LeCun, Yann; Bengio, Yoshua; Hinton, Geoffrey
- Nature, Vol. 521, Issue 7553
Sparse modeling approach to analytical continuation of imaginary-time quantum Monte Carlo data
journal, June 2017
- Otsuki, Junya; Ohzeki, Masayuki; Shinaoka, Hiroshi
- Physical Review E, Vol. 95, Issue 6
Exponential convergence of the deep neural network approximation for analytic functions
journal, September 2018
- E., Weinan; Wang, Qingcan
- Science China Mathematics, Vol. 61, Issue 10
Machine Learning Approximation Algorithms for High-Dimensional Fully Nonlinear Partial Differential Equations and Second-order Backward Stochastic Differential Equations
journal, January 2019
- Beck, Christian; E., Weinan; Jentzen, Arnulf
- Journal of Nonlinear Science, Vol. 29, Issue 4
NETT: solving inverse problems with deep neural networks
journal, June 2020
- Li, Housen; Schwab, Johannes; Antholzer, Stephan
- Inverse Problems, Vol. 36, Issue 6
Artificial Neural Network Approach to the Analytic Continuation Problem
journal, February 2020
- Fournier, Romain; Wang, Lei; Yazyev, Oleg V.
- Physical Review Letters, Vol. 124, Issue 5
Bayesian inference and the analytic continuation of imaginary-time quantum Monte Carlo data
journal, May 1996
- Jarrell, Mark; Gubernatis, J. E.
- Physics Reports, Vol. 269, Issue 3
Forecasting with artificial neural networks: The state of the art
journal, March 1998
- Zhang, Guoqiang; Patuwo, B. Eddy; Hu, Michael Y.
- International Journal of Forecasting, Vol. 14, Issue 1
Stochastic gradient descent algorithm for stochastic optimization in solving analytic continuation problems
journal, January 2020
- Bao, Feng; Maier, Thomas
- Foundations of Data Science, Vol. 2, Issue 1
Diagrammatic quantum Monte Carlo study of the Fröhlich polaron
journal, September 2000
- Mishchenko, A.; Prokof’ev, N.; Sakamoto, A.
- Physical Review B, Vol. 62, Issue 10
Closure Learning for Nonlinear Model Reduction Using Deep Residual Neural Network
journal, March 2020
- Xie, Xuping; Webster, Clayton; Iliescu, Traian
- Fluids, Vol. 5, Issue 1
Maximum entropy method in image processing
journal, January 1984
- Gull, S. F.; Skilling, J.
- IEE Proceedings F Communications, Radar and Signal Processing, Vol. 131, Issue 6
Deep Residual Learning for Image Recognition
conference, June 2016
- He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing
- 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Non-Intrusive Inference Reduced Order Model for Fluids Using Deep Multistep Neural Network
journal, August 2019
- Xie, Xuping; Zhang, Guannan; Webster, Clayton G.
- Mathematics, Vol. 7, Issue 8
Computer Methods for Ordinary Differential Equations and Differential-Algebraic Equations
book, January 1998
- Ascher, Uri M.; Petzold, Linda R.
Spectral densities of the symmetric Anderson model
journal, July 1990
- Silver, R. N.; Gubernatis, J. E.; Sivia, D. S.
- Physical Review Letters, Vol. 65, Issue 4
Projected regression method for solving Fredholm integral equations arising in the analytic continuation problem of quantum physics
journal, October 2017
- Arsenault, Louis-François; Neuberg, Richard; Hannah, Lauren A.
- Inverse Problems, Vol. 33, Issue 11
Spectral analysis by the method of consistent constraints
journal, August 2013
- Prokof’ev, N. V.; Svistunov, B. V.
- JETP Letters, Vol. 97, Issue 11
Deep Convolutional Neural Network for Inverse Problems in Imaging
journal, September 2017
- Jin, Kyong Hwan; McCann, Michael T.; Froustey, Emmanuel
- IEEE Transactions on Image Processing, Vol. 26, Issue 9
Analytic continuation of quantum Monte Carlo data by stochastic analytical inference
journal, May 2010
- Fuchs, Sebastian; Pruschke, Thomas; Jarrell, Mark
- Physical Review E, Vol. 81, Issue 5
Model Reduction with Memory and the Machine Learning of Dynamical Systems
journal, January 2019
- Ma, Chao; Wang, Jianchun; E., Weinan
- Communications in Computational Physics, Vol. 25, Issue 4
Stochastic method for analytic continuation of quantum Monte Carlo data
journal, May 1998
- Sandvik, Anders W.
- Physical Review B, Vol. 57, Issue 17
Deep Neural Networks for Acoustic Modeling in Speech Recognition: The Shared Views of Four Research Groups
journal, November 2012
- Hinton, Geoffrey; Deng, Li; Yu, Dong
- IEEE Signal Processing Magazine, Vol. 29, Issue 6
Statistical and computational intelligence approach to analytic continuation in Quantum Monte Carlo
journal, January 2017
- Bertaina, Gianluca; Galli, Davide Emilio; Vitali, Ettore
- Advances in Physics: X, Vol. 2, Issue 2
Analytic continuation via domain knowledge free machine learning
journal, December 2018
- Yoon, Hongkee; Sim, Jae-Hoon; Han, Myung Joon
- Physical Review B, Vol. 98, Issue 24