Abstract
DJINN is an easy-to-use deep neural network model that requires fewer user-specified hyper-parameters than traditional neural networks. The algorithm leverages decision trees trained on the data to determine an appropriate deep neural network architecture and weight initialization. Optional functions also select the learning rate, batch size, and number of training iterations necessary to create an accurate model.
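The core idea above — reading a network architecture off a decision tree fit to the data — can be sketched as follows. This is an illustrative toy, not DJINN's actual implementation; the function name `tree_informed_widths` and the specific mapping (one hidden layer per tree level, width equal to the number of split nodes at that level) are assumptions chosen to mirror the description in the abstract.

```python
# Sketch: derive hidden-layer widths for a neural network from the shape
# of a decision tree fit to the data (in the spirit of DJINN's
# tree-informed architecture selection; not the library's actual code).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def tree_informed_widths(X, y, max_depth=4):
    """Fit a decision tree, then map its structure to layer widths:
    one hidden layer per tree level, sized by the split nodes there."""
    tree = DecisionTreeRegressor(max_depth=max_depth, random_state=0).fit(X, y)
    t = tree.tree_
    # Compute each node's depth; sklearn stores children after parents,
    # so a single forward pass over node indices suffices.
    depth = np.zeros(t.node_count, dtype=int)
    for node in range(t.node_count):
        for child in (t.children_left[node], t.children_right[node]):
            if child != -1:  # -1 marks a leaf's missing child
                depth[child] = depth[node] + 1
    # Count internal (split) nodes at each level; these set the widths.
    widths = []
    for level in range(t.max_depth):
        n = sum(1 for i in range(t.node_count)
                if depth[i] == level and t.children_left[i] != -1)
        if n > 0:
            widths.append(n)
    return widths

rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = X[:, 0] + 2.0 * X[:, 1] ** 2
widths = tree_informed_widths(X, y)
print(widths)  # hidden-layer widths, one entry per tree level with splits
```

A deeper or bushier tree (reflecting more structure in the data) yields a deeper, wider network, which is how fewer hyper-parameters need to be hand-specified. DJINN itself additionally initializes the network weights from the tree's split structure; see the repository at https://github.com/LLNL/DJINN for the real interface.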
- Developers: Humbird, Kelli [1]; Peterson, Luc [1]
- [1] Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
- Release Date:
- 2018-01-11
- Project Type:
- Open Source, Publicly Available Repository
- Software Type:
- Scientific
- Licenses: BSD 3-clause "New" or "Revised" License
- Sponsoring Org.: USDOE National Nuclear Security Administration (NNSA)
- Primary Award/Contract Number: AC52-07NA27344
- Code ID:
- 15238
- Site Accession Number:
- LLNL-CODE-754815
- Research Org.:
- Lawrence Livermore National Laboratory (LLNL), Livermore, CA (United States)
- Country of Origin:
- United States
Citation Formats
Humbird, Kelli, and Peterson, Luc. Deep Jointly-Informed neural networks. Computer Software. https://github.com/LLNL/DJINN. USDOE National Nuclear Security Administration (NNSA). 11 Jan. 2018. Web. doi:10.11578/dc.20180724.1.
Humbird, Kelli, & Peterson, Luc. (2018, January 11). Deep Jointly-Informed neural networks. [Computer software]. https://github.com/LLNL/DJINN. https://doi.org/10.11578/dc.20180724.1.
Humbird, Kelli, and Peterson, Luc. "Deep Jointly-Informed neural networks." Computer software. January 11, 2018. https://github.com/LLNL/DJINN. https://doi.org/10.11578/dc.20180724.1.
@misc{
doecode_15238,
title = {Deep Jointly-Informed neural networks},
author = {Humbird, Kelli and Peterson, Luc},
abstractNote = {DJINN is an easy-to-use deep neural network model that requires fewer user-specified hyper-parameters than traditional neural networks. The algorithm leverages decision trees trained on the data to determine an appropriate deep neural network architecture and weight initialization. Optional functions also select the learning rate, batch size, and number of training iterations necessary to create an accurate model.},
doi = {10.11578/dc.20180724.1},
url = {https://doi.org/10.11578/dc.20180724.1},
howpublished = {[Computer Software] \url{https://doi.org/10.11578/dc.20180724.1}},
year = {2018},
month = {jan}
}