A survey of MPI usage in the US exascale computing project
Journal Article
·
Concurrency and Computation: Practice and Experience
- Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
- Univ. of Tennessee, Knoxville, TN (United States)
- Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
- Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
- Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Technical Univ. of Munich, Munich (Germany)
The Exascale Computing Project (ECP) is currently the primary effort in the United States focused on developing “exascale” levels of computing capabilities, including hardware, software, and applications. To obtain a more thorough understanding of how the software projects under the ECP are using, and planning to use, the Message Passing Interface (MPI), and to help guide the work of our own project within the ECP, we created a survey. Of the 97 ECP projects active at the time the survey was distributed, we received 77 responses, 56 of which reported that their projects were using MPI. This paper reports the results of that survey for the benefit of the broader community of MPI developers.
- Research Organization:
- Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
- Sponsoring Organization:
- USDOE National Nuclear Security Administration (NNSA)
- Grant/Contract Number:
- AC04-94AL85000; AC05-00OR22725; NA0003525; FC02-06ER25750; AC52-07NA27344
- OSTI ID:
- 1477440
- Alternate ID(s):
- OSTI ID: 1474751; OSTI ID: 1560504
- Report Number(s):
- SAND-2018-6513J; 664505
- Journal Information:
- Concurrency and Computation: Practice and Experience, Vol. 32, Issue 3; ISSN 1532-0626
- Publisher:
- Wiley
- Country of Publication:
- United States
- Language:
- English
Cited by: 36 works
Citation information provided by Web of Science
Application health monitoring for extreme-scale resiliency using cooperative fault management
· journal · July 2019
Foreword to the Special Issue of the Workshop on Exascale MPI (ExaMPI 2017)
· journal · July 2019
Similar Records
A Survey of MPI Usage in the U.S. Exascale Computing Project
Technical Report
·
June 1, 2018
Understanding the use of message passing interface in exascale proxy applications
Journal Article
·
August 17, 2020
·
Concurrency and Computation: Practice and Experience
A survey of software implementations used by application codes in the Exascale Computing Project
Journal Article
·
June 25, 2021
·
International Journal of High Performance Computing Applications