Building MPI for Multi-Programming Systems using Implicit Information

Summary: Building MPI for Multi-Programming Systems using Implicit Information
Frederick C. Wong 1, Andrea C. Arpaci-Dusseau 2, and David E. Culler 1
1 Computer Science Division, University of California, Berkeley
{fredwong, culler}@CS.Berkeley.EDU
2 Computer Systems Laboratory, Stanford University
dusseau@CS.Stanford.EDU
Abstract. With the growing importance of fast system area networks in the parallel community, it is becoming common for message passing programs to run in multi-programming environments. Competing sequential and parallel jobs can distort the global coordination of communicating processes. In this paper, we describe our implementation of MPI using implicit information for global co-scheduling. Our results show that MPI program performance is, indeed, sensitive to local scheduling variations. Further, the integration of implicit co-scheduling with the MPI runtime system achieves robust performance in a multi-programming environment, without compromising performance in dedicated use.
1 Introduction
With the emergence of fast system area networks and low-overhead communication interfaces [6], it is becoming common for parallel MPI programs to run in cluster environments that offer both high performance communication and multi-programming.
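
The abstract's key mechanism, implicit co-scheduling, relies on each process inferring the scheduling state of its communication partners from locally observable events such as message round-trip times. A canonical ingredient of this approach, from the prior work of Arpaci-Dusseau et al., is two-phase spin-block waiting: a process spins for roughly a round-trip time on the assumption that its partner is currently scheduled, and yields the processor if no reply arrives. The C sketch below is an illustration of that idea only, not code from this paper; the threshold constant and function names are hypothetical.

    /* Illustrative two-phase spin-block wait (hypothetical names, not
     * from the paper).  Spin briefly on the assumption that the
     * communication partner is scheduled; if no reply arrives within
     * the spin threshold, yield the CPU so competing jobs can run. */
    #include <sched.h>
    #include <stdatomic.h>
    #include <time.h>

    #define SPIN_THRESHOLD_NS 50000L   /* assumed: a few round-trip times */

    static long elapsed_ns(const struct timespec *start)
    {
        struct timespec now;
        clock_gettime(CLOCK_MONOTONIC, &now);
        return (now.tv_sec - start->tv_sec) * 1000000000L
             + (now.tv_nsec - start->tv_nsec);
    }

    /* Wait until *flag becomes nonzero (set when a message arrives). */
    void wait_spin_block(atomic_int *flag)
    {
        struct timespec start;
        clock_gettime(CLOCK_MONOTONIC, &start);
        for (;;) {
            if (atomic_load(flag))
                return;                    /* reply arrived while spinning */
            if (elapsed_ns(&start) < SPIN_THRESHOLD_NS)
                continue;                  /* partner likely running: keep spinning */
            sched_yield();                 /* partner likely descheduled: give up CPU */
            clock_gettime(CLOCK_MONOTONIC, &start);  /* restart the spin window */
        }
    }

The design intuition is that spinning preserves low latency when all processes happen to be co-scheduled (the dedicated case), while yielding releases the processor when they are not, which is how the technique can avoid penalizing competing jobs under multi-programming.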

Source: Arpaci-Dusseau, Andrea - Department of Computer Sciences, University of Wisconsin at Madison

Collections: Computer Technologies and Information Sciences