OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Task parallelism and high-performance languages

Abstract

The definition of High Performance Fortran (HPF) is a significant event in the maturation of parallel computing: it represents the first parallel language that has gained widespread support from vendors and users. This paper addresses how to incorporate support for task parallelism. The term task parallelism refers to the explicit creation of multiple threads of control, or tasks, which synchronize and communicate under programmer control. Task and data parallelism are complementary rather than competing programming models. While task parallelism is more general and can be used to implement algorithms that are not amenable to data-parallel solutions, many problems can benefit from a mixed approach, with, for example, a task-parallel coordination layer integrating multiple data-parallel computations. Other problems admit both data- and task-parallel solutions, with the better solution depending on machine characteristics, compiler performance, or personal taste. For these reasons, we believe that a general-purpose high-performance language should integrate both task- and data-parallel constructs. The challenge is to do so in a way that provides the expressivity needed for applications while preserving the flexibility and portability of a high-level language. In this paper, we examine and illustrate the considerations that motivate the use of task parallelism. We also describe one particular approach to task parallelism in Fortran, namely the Fortran M extensions. Finally, we contrast Fortran M with other proposed approaches and discuss the implications of this work for task parallelism and high-performance languages.
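
As a rough sketch of the task-parallel model described above (explicit tasks that synchronize and communicate under programmer control), the following Go program uses goroutines and channels, which loosely mirror the processes and channels of Fortran M; the worker/coordinator structure and all names are illustrative assumptions, not constructs taken from the paper.

// Minimal sketch of a task-parallel coordination layer: two concurrent
// worker tasks each perform an independent (conceptually data-parallel)
// computation and report their results to a coordinator over explicit
// channels. Names and the problem itself are illustrative only.
package main

import "fmt"

// worker sums its slice and sends the result on the supplied channel.
func worker(id int, data []float64, out chan<- float64) {
	sum := 0.0
	for _, v := range data {
		sum += v
	}
	out <- sum // explicit communication under programmer control
}

func main() {
	a := []float64{1, 2, 3, 4}
	b := []float64{5, 6, 7, 8}

	// One channel per task plays the role of a point-to-point channel
	// connecting a worker task to the coordinating task.
	ca := make(chan float64)
	cb := make(chan float64)

	go worker(1, a, ca) // task 1: independent thread of control
	go worker(2, b, cb) // task 2: independent thread of control

	// The coordinator synchronizes by receiving one result from each task.
	total := <-ca + <-cb
	fmt.Println("combined result:", total)
}

Here the main goroutine acts as a task-parallel coordination layer integrating two independent computations, the kind of mixed task- and data-parallel structure the abstract describes.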

Authors:
Foster, I.
Publication Date:
1996
Research Org.:
Argonne National Lab. (ANL), Argonne, IL (United States)
Sponsoring Org.:
USDOE, Washington, DC (United States); National Science Foundation, Washington, DC (United States)
OSTI Identifier:
205045
Report Number(s):
MCS-P440-0594
ON: DE96007629; CNN: Contract CCR-8809615
DOE Contract Number:  
W-31109-ENG-38
Resource Type:
Technical Report
Resource Relation:
Other Information: PBD: [1996]
Country of Publication:
United States
Language:
English
Subject:
99 MATHEMATICS, COMPUTERS, INFORMATION SCIENCE, MANAGEMENT, LAW, MISCELLANEOUS; PARALLEL PROCESSING; TASK SCHEDULING; FORTRAN; ALGORITHMS; SCHEDULES

Citation Formats

Foster, I. Task parallelism and high-performance languages. United States: N. p., 1996. Web. doi:10.2172/205045.
Foster, I. Task parallelism and high-performance languages. United States. https://doi.org/10.2172/205045
Foster, I. 1996. "Task parallelism and high-performance languages". United States. https://doi.org/10.2172/205045. https://www.osti.gov/servlets/purl/205045.
@article{osti_205045,
title = {Task parallelism and high-performance languages},
author = {Foster, I},
abstractNote = {The definition of High Performance Fortran (HPF) is a significant event in the maturation of parallel computing: it represents the first parallel language that has gained widespread support from vendors and users. This paper addresses how to incorporate support for task parallelism. The term task parallelism refers to the explicit creation of multiple threads of control, or tasks, which synchronize and communicate under programmer control. Task and data parallelism are complementary rather than competing programming models. While task parallelism is more general and can be used to implement algorithms that are not amenable to data-parallel solutions, many problems can benefit from a mixed approach, with, for example, a task-parallel coordination layer integrating multiple data-parallel computations. Other problems admit both data- and task-parallel solutions, with the better solution depending on machine characteristics, compiler performance, or personal taste. For these reasons, we believe that a general-purpose high-performance language should integrate both task- and data-parallel constructs. The challenge is to do so in a way that provides the expressivity needed for applications while preserving the flexibility and portability of a high-level language. In this paper, we examine and illustrate the considerations that motivate the use of task parallelism. We also describe one particular approach to task parallelism in Fortran, namely the Fortran M extensions. Finally, we contrast Fortran M with other proposed approaches and discuss the implications of this work for task parallelism and high-performance languages.},
doi = {10.2172/205045},
url = {https://www.osti.gov/biblio/205045},
place = {United States},
year = {1996},
month = {3}
}