A survey of MPI usage in the US exascale computing project
Abstract
The Exascale Computing Project (ECP) is currently the primary effort in the United States focused on developing “exascale” levels of computing capabilities, including hardware, software, and applications. To gain a more thorough understanding of how the software projects under the ECP are using, and planning to use, the Message Passing Interface (MPI), and to help guide the work of our own project within the ECP, we created a survey. Of the 97 ECP projects active at the time the survey was distributed, we received 77 responses, 56 of which reported that their projects were using MPI. This paper reports the results of that survey for the benefit of the broader community of MPI developers.
- Authors:
- Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
- Univ. of Tennessee, Knoxville, TN (United States)
- Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
- Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
- Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Technical Univ. of Munich, Munich (Germany)
- Publication Date:
- September 27, 2018
- Research Org.:
- Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
- Sponsoring Org.:
- USDOE National Nuclear Security Administration (NNSA)
- OSTI Identifier:
- 1477440
- Alternate Identifier(s):
- OSTI ID: 1474751; OSTI ID: 1560504
- Report Number(s):
- SAND-2018-6513J
Journal ID: ISSN 1532-0626; 664505
- Grant/Contract Number:
- AC04-94AL85000; AC05-00OR22725; NA0003525; FC02-06ER25750; AC52-07NA27344
- Resource Type:
- Accepted Manuscript
- Journal Name:
- Concurrency and Computation: Practice and Experience
- Additional Journal Information:
- Journal Volume: 32; Journal Issue: 3; Journal ID: ISSN 1532-0626
- Publisher:
- Wiley
- Country of Publication:
- United States
- Language:
- English
- Subject:
- 97 MATHEMATICS AND COMPUTING; exascale; MPI
Citation Formats
Bernholdt, David E., Boehm, Swen, Bosilca, George, Venkata, Manjunath Gorentla, Grant, Ryan E., Naughton, Thomas, Pritchard, Howard P., Schulz, Martin, and Vallee, Geoffroy R. A survey of MPI usage in the US exascale computing project. United States: N. p., 2018.
Web. doi:10.1002/cpe.4851.
@article{osti_1477440,
title = {A survey of MPI usage in the US exascale computing project},
author = {Bernholdt, David E. and Boehm, Swen and Bosilca, George and Venkata, Manjunath Gorentla and Grant, Ryan E. and Naughton, Thomas and Pritchard, Howard P. and Schulz, Martin and Vallee, Geoffroy R.},
abstractNote = {The Exascale Computing Project (ECP) is currently the primary effort in the United States focused on developing “exascale” levels of computing capabilities, including hardware, software, and applications. To gain a more thorough understanding of how the software projects under the ECP are using, and planning to use, the Message Passing Interface (MPI), and to help guide the work of our own project within the ECP, we created a survey. Of the 97 ECP projects active at the time the survey was distributed, we received 77 responses, 56 of which reported that their projects were using MPI. This paper reports the results of that survey for the benefit of the broader community of MPI developers.},
doi = {10.1002/cpe.4851},
journal = {Concurrency and Computation: Practice and Experience},
number = 3,
volume = 32,
place = {United States},
year = {2018},
month = {sep}
}