Compressing unstructured mesh data from simulations using machine learning
- Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
The volume of data output by computer simulations has grown to terabytes and petabytes as increasingly complex simulations are run on massively parallel systems. As we approach exaflop computing in the next decade, the I/O subsystem is not expected to keep pace with writing out these large volumes of data. In this paper, we explore the use of machine learning to compress the data before it is written out. Despite computational constraints that limit us to very simple learning algorithms, our results show that machine learning is a viable option for compressing unstructured mesh data. Furthermore, we demonstrate that simply using a better sampling algorithm to generate the training set yields more accurate reconstructions than random sampling, at no extra cost. Finally, by carefully selecting and incorporating points with high prediction error, we can further improve reconstruction accuracy without sacrificing the compression rate.
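The abstract's approach can be illustrated with a minimal sketch: fit a cheap surrogate model to a sampled subset of scattered mesh points (the sample plus the model coefficients are what would be written to disk), then augment the sample with the points the model predicts worst, keeping the total sample size (and hence the compression rate) fixed. The specific model, sampling scheme, and data below are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unstructured "mesh": scattered 2-D points carrying a
# smooth scalar field (stand-in for real simulation output).
pts = rng.uniform(0.0, 1.0, size=(2000, 2))
vals = np.sin(3.0 * pts[:, 0]) * np.cos(2.0 * pts[:, 1])

def design(p):
    # Quadratic polynomial features in (x, y): 6 terms.
    return np.column_stack([np.ones(len(p)), p[:, 0], p[:, 1],
                            p[:, 0] ** 2, p[:, 1] ** 2, p[:, 0] * p[:, 1]])

def fit(p, v):
    # Least-squares fit; the "compressed" form is the sampled points
    # plus these 6 coefficients.
    coef, *_ = np.linalg.lstsq(design(p), v, rcond=None)
    return coef

# 1) Train a simple model on a small random sample.
idx = rng.choice(len(pts), size=50, replace=False)
coef = fit(pts[idx], vals[idx])
err = np.abs(design(pts) @ coef - vals)          # per-point prediction error

# 2) Refit after swapping half the sample for the worst-predicted
#    points, so the sample size (compression rate) is unchanged.
worst = np.argsort(err)[-25:]
idx2 = np.union1d(idx[:25], worst)
coef2 = fit(pts[idx2], vals[idx2])
err2 = np.abs(design(pts) @ coef2 - vals)

print(f"mean error, random sample:      {err.mean():.4f}")
print(f"mean error, error-guided refit: {err2.mean():.4f}")
```

In a real pipeline the reconstruction step would run at read time, evaluating the stored model at every mesh point; only the sampled values and the coefficients cross the I/O subsystem.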
- Research Organization:
- Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
- Sponsoring Organization:
- USDOE National Nuclear Security Administration (NNSA)
- Grant/Contract Number:
- AC52-07NA27344
- OSTI ID:
- 1738887
- Report Number(s):
- LLNL-JRNL-750460; 935302
- Journal Information:
- International Journal of Data Science and Analytics, Vol. 9, Issue 1; ISSN 2364-415X
- Publisher:
- Springer
- Country of Publication:
- United States
- Language:
- English
Similar Records
Optimal Compressed Sensing and Reconstruction of Unstructured Mesh Datasets
Machine Learning Algorithms for Matching Theories, Simulations, and Observations in Cosmology (Final Project)