
LM4HPC: Towards Effective Language Model Application in High-Performance Computing

Conference
Authors: [1]; [1]; [1]; [1]; [2]; [1]
  1. Lawrence Livermore National Laboratory
  2. Argonne National Laboratory

Research Organization:
Lawrence Livermore National Laboratory (LLNL), Livermore, CA (United States)
Sponsoring Organization:
USDOE National Nuclear Security Administration (NNSA)
DOE Contract Number:
AC52-07NA27344
OSTI ID:
2329383
Report Number(s):
LLNL-CONF-849438; 1075118
Resource Relation:
Journal Volume: 14114; Conference: Bristol, United Kingdom
Country of Publication:
United States
Language:
English

References (12)

Multi-View Learning for Parallelism Discovery of Sequential Programs (May 2022)
Finding Reusable Machine Learning Components to Build Programming Language Processing Pipelines (January 2023)
CodeBERT: A Pre-Trained Model for Programming and Natural Languages (January 2020)
CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation (January 2021)
Convolutional Neural Networks over Tree Structures for Programming Language Processing (February 2016)
DRB-ML-Dataset (January 2022)
DataRaceBench (November 2017)
Rodinia: A benchmark suite for heterogeneous computing (October 2009)
Early Experience with Transformer-Based Similarity Analysis for DataRaceBench (November 2022)
Learning to Parallelize in a Shared-Memory Environment with Transformers (February 2023)
HPCFAIR: Enabling FAIR AI for HPC Applications (November 2021)
ExeBench: an ML-scale dataset of executable C functions (June 2022)