Distributed Workflows for Modeling Experimental Data
Abstract
Modeling helps explain the fundamental physics hidden behind experimental data. In the case of material modeling, a single simulation rarely produces output that reproduces the experimental data. Often one or more of the force field parameters are not precisely known and must be optimized for the output to match that of the experiment. Since the simulations require high performance computing (HPC) resources and there are usually many simulations to run, a workflow is very useful to prevent errors and to ensure that the simulations are identical except for the parameters that need to be varied. These workflows are usually distributed because the simulations require HPC, while the optimization and the comparison of simulation results with experimental data are done on a local workstation. We present results from force field refinement of data collected at the Spallation Neutron Source using the Kepler, Pegasus, and Beam workflows and discuss what we have learned from using them.
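The refinement loop the abstract describes can be sketched in a few lines. This is a hypothetical, runnable stand-in, not the authors' code: `run_simulation`, the mock `experimental_data`, and the two force-field parameters (`epsilon`, `sigma`) are all illustrative placeholders for what a real distributed workflow (Kepler, Pegasus, or Beam) would dispatch to HPC resources.

```python
# Hypothetical sketch of the force-field refinement loop from the abstract.
# In a real workflow each candidate parameter set would launch an HPC
# simulation; a cheap stand-in function plays that role here so the loop
# runs locally.
import numpy as np
from scipy.optimize import minimize

# Mock "experimental" observable (e.g. a measured decay curve).
experimental_data = np.array([1.0, 0.8, 0.5, 0.3])

def run_simulation(params):
    # Stand-in for submitting a simulation job and collecting its output;
    # the workflow engine would manage this step on the remote cluster.
    epsilon, sigma = params
    x = np.arange(len(experimental_data))
    return epsilon * np.exp(-sigma * x)

def chi_squared(params):
    # Objective: discrepancy between simulated and experimental data.
    simulated = run_simulation(params)
    return float(np.sum((simulated - experimental_data) ** 2))

# Local optimization step of the workflow: vary only the uncertain
# parameters, keeping everything else about the simulations identical.
result = minimize(chi_squared, x0=[0.5, 0.5], method="Nelder-Mead")
print("refined parameters:", result.x)
```

The local workstation only evaluates `chi_squared` and decides the next parameter set; the expensive part, `run_simulation`, is what the distributed workflow farms out.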
- Authors: Deelman, Ewa; Ferreira Da Silva, Rafael; Lynch, Vickie E.; Lingerfelt, Eric J.; Vetter, Jeffrey S.; Goswami, Monojoy; Hui, Yawei; Borreguero Calvo, Jose M.
- Author Affiliations: ISI; ORNL
- Publication Date: November 2017
- Research Org.: Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
- Sponsoring Org.: USDOE
- OSTI Identifier: 1410941
- DOE Contract Number: AC05-00OR22725
- Resource Type: Conference
- Resource Relation: Conference: 2017 IEEE High Performance Extreme Computing Conference, Waltham, Massachusetts, United States of America, September 12, 2017
- Country of Publication: United States
- Language: English
Citation Formats
Deelman, Ewa, Ferreira Da Silva, Rafael, Lynch, Vickie E., Lingerfelt, Eric J., Vetter, Jeffrey S., Goswami, Monojoy, Hui, Yawei, and Borreguero Calvo, Jose M. Distributed Workflows for Modeling Experimental Data. United States: N. p., 2017. Web.
Deelman, Ewa, Ferreira Da Silva, Rafael, Lynch, Vickie E., Lingerfelt, Eric J., Vetter, Jeffrey S., Goswami, Monojoy, Hui, Yawei, & Borreguero Calvo, Jose M. Distributed Workflows for Modeling Experimental Data. United States.
Deelman, Ewa, Ferreira Da Silva, Rafael, Lynch, Vickie E., Lingerfelt, Eric J., Vetter, Jeffrey S., Goswami, Monojoy, Hui, Yawei, and Borreguero Calvo, Jose M. 2017. "Distributed Workflows for Modeling Experimental Data". United States. https://www.osti.gov/servlets/purl/1410941.
@article{osti_1410941,
title = {Distributed Workflows for Modeling Experimental Data},
author = {Deelman, Ewa and Ferreira Da Silva, Rafael and Lynch, Vickie E. and Lingerfelt, Eric J. and Vetter, Jeffrey S. and Goswami, Monojoy and Hui, Yawei and Borreguero Calvo, Jose M.},
abstractNote = {Modeling helps explain the fundamental physics hidden behind experimental data. In the case of material modeling, a single simulation rarely produces output that reproduces the experimental data. Often one or more of the force field parameters are not precisely known and must be optimized for the output to match that of the experiment. Since the simulations require high performance computing (HPC) resources and there are usually many simulations to run, a workflow is very useful to prevent errors and to ensure that the simulations are identical except for the parameters that need to be varied. These workflows are usually distributed because the simulations require HPC, while the optimization and the comparison of simulation results with experimental data are done on a local workstation. We present results from force field refinement of data collected at the Spallation Neutron Source using the Kepler, Pegasus, and Beam workflows and discuss what we have learned from using them.},
url = {https://www.osti.gov/biblio/1410941},
place = {United States},
year = {2017},
month = {11}
}