Exploring Deep Learning and Sparse Matrix Format Selection
- Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
- North Carolina State Univ., Raleigh, NC (United States)
We proposed to explore the use of Deep Neural Networks (DNNs) to address the longstanding barriers. The recent rapid progress of DNN technology has had a large impact on many fields, significantly improving prediction accuracy over traditional machine learning techniques in image classification, speech recognition, machine translation, and so on. To some degree, these tasks resemble the decision-making in many HPC tasks, including the aforementioned format selection for SpMV and linear solver selection. For instance, sparse matrix format selection is akin to image classification (e.g., telling whether an image contains a dog or a cat): in both problems, the right decision is primarily determined by the spatial patterns of the elements in the input. For image classification, the patterns are of pixels; for sparse matrix format selection, they are of non-zero elements. DNNs could therefore be naturally applied by regarding a sparse matrix as an image and treating format selection or solver selection as a classification problem.
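The matrix-as-image idea can be sketched as follows. This is a minimal, hypothetical illustration (not the report's actual pipeline): the helper `matrix_to_density_image` downsamples a sparse matrix's non-zero pattern into a fixed-size grid of normalized block densities, which could then be fed to a CNN classifier over candidate formats (e.g., COO, CSR, DIA, ELL).

```python
import numpy as np
from scipy import sparse

def matrix_to_density_image(mat, size=32):
    """Downsample a sparse matrix's non-zero pattern into a size-by-size
    "image" whose pixels are per-block non-zero counts, normalized to [0, 1].

    Hypothetical helper illustrating the matrix-as-image idea; the resulting
    array would serve as input to a CNN that classifies the best format.
    """
    mat = sparse.coo_matrix(mat)
    # Map each non-zero's (row, col) coordinate to a pixel in the size x size grid.
    rows = (mat.row.astype(np.int64) * size) // mat.shape[0]
    cols = (mat.col.astype(np.int64) * size) // mat.shape[1]
    img = np.zeros((size, size), dtype=np.float64)
    # Accumulate one count per non-zero into its pixel.
    np.add.at(img, (rows, cols), 1.0)
    if img.max() > 0:
        img /= img.max()  # normalize densities to [0, 1]
    return img
```

For example, a purely diagonal matrix yields an image with a bright main diagonal and zeros elsewhere, the kind of spatial signature that would favor a DIA-style format, just as pixel patterns distinguish dogs from cats.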
- Research Organization: Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
- Sponsoring Organization: USDOE
- DOE Contract Number: AC52-07NA27344
- OSTI ID: 1426119
- Report Number(s): LLNL--SR-747311
- Country of Publication: United States
- Language: English