Adapting Multigrid-in-Time to Train Deep Neural Networks [Slides]
- Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
- Lawrence Livermore National Laboratory (LLNL), Livermore, CA (United States)
- Emory Univ., Atlanta, GA (United States)
- Univ. of New Mexico, Albuquerque, NM (United States)
- Technical Univ. of Kaiserslautern (Germany)
- Korea Aerospace University (KAU), Goyang (South Korea)
- Mathworks, Natick, MA (United States)
- Rice Univ., Houston, TX (United States)
Abstract not provided.
- Research Organization:
- Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)
- Sponsoring Organization:
- USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR); USDOE National Nuclear Security Administration (NNSA)
- DOE Contract Number:
- NA0003525
- OSTI ID:
- 2004439
- Report Number(s):
- SAND2022-11168C; 709293
- Country of Publication:
- United States
- Language:
- English
Similar Records
- Parallel-In-Time Training of Recurrent Neural Networks. Conference · April 1, 2021 · OSTI ID: 1866561
- Robust architectures, initialization, and training for deep neural networks via the adaptive basis interpretation. Conference · September 1, 2021 · OSTI ID: 1889595
- Parallel-In-Time Training of Recurrent Neural Networks. Conference · April 1, 2022 · OSTI ID: 2002170