Building MPI for Multi-Programming Systems using Implicit Information
 

Summary:
Frederick C. Wong¹, Andrea C. Arpaci-Dusseau², and David E. Culler¹
¹ Computer Science Division, University of California, Berkeley
  {fredwong, culler}@CS.Berkeley.EDU
² Computer Systems Laboratory, Stanford University
  dusseau@CS.Stanford.EDU
Abstract. With the growing importance of fast system area networks in the parallel community, it is becoming common for message passing programs to run in multi-programming environments. Competing sequential and parallel jobs can distort the global coordination of communicating processes. In this paper, we describe our implementation of MPI using implicit information for global co-scheduling. Our results show that MPI program performance is, indeed, sensitive to local scheduling variations. Further, the integration of implicit co-scheduling with the MPI runtime system achieves robust performance in a multi-programming environment, without compromising performance in dedicated use.
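The implicit co-scheduling the abstract refers to is commonly realized as a two-phase wait: a process spins briefly on an outstanding receive and, if the message has not arrived, infers that its communication partner is probably descheduled and yields the processor rather than burning its timeslice. The sketch below is only a hypothetical illustration of that pattern layered over standard MPI calls; the function name coscheduled_wait and the SPIN_ITERATIONS threshold are invented for the example and are not taken from the paper. A caller would post a nonblocking receive with MPI_Irecv and then call coscheduled_wait in place of MPI_Wait.

/* Hypothetical sketch of a spin-then-yield wait (not the authors' code). */
#include <mpi.h>
#include <sched.h>

#define SPIN_ITERATIONS 10000   /* assumed tuning knob, not from the paper */

static void coscheduled_wait(MPI_Request *req, MPI_Status *status)
{
    int done = 0;

    /* Phase 1: spin, betting that the matching sender is co-scheduled
     * and the message will arrive within roughly a network round-trip. */
    for (long i = 0; i < SPIN_ITERATIONS && !done; i++)
        MPI_Test(req, &done, status);

    /* Phase 2: the bet failed; yield the CPU between probes so competing
     * jobs can run, and resume as soon as the message completes. */
    while (!done) {
        sched_yield();
        MPI_Test(req, &done, status);
    }
}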

  

Source: Arpaci-Dusseau, Andrea - Department of Computer Sciences, University of Wisconsin-Madison

 

Collections: Computer Technologies and Information Sciences