OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Final Report LDRD 04-ERD-021

Abstract

In this project, we performed experiments and simulations to establish constitutive models for plastic behavior and to determine the deformation mechanisms of nanocrystalline materials at grain sizes below 100 nm and strain rates above 10^6/s. The experiments used both laser-induced shocks and isentropic compression to investigate, for the first time, the high-strain-rate deformation of nanocrystalline Ni. Samples were characterized using transmission electron microscopy, nanoindentation, profilometry, and x-ray diffraction before and after loading. We validated the constitutive models using both atomistic molecular dynamics and continuum simulations, each pushed to the limits of current computational capability in order to approach experimental scales.
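To make the modeling target concrete: a constitutive model of this kind must tie flow stress to both grain size and strain rate. The Python sketch below is a minimal illustration combining Hall-Petch grain-size strengthening with a power-law strain-rate sensitivity. The functional form and every parameter value are assumptions for illustration only; the report does not reproduce its model here.

import math

def flow_stress(d_m, rate_per_s,
                sigma0=2.0e8,     # friction stress, Pa (hypothetical)
                k_hp=2.0e5,       # Hall-Petch coefficient, Pa*m^0.5 (hypothetical)
                rate_ref=1.0e-3,  # quasi-static reference strain rate, 1/s
                m=0.02):          # strain-rate sensitivity exponent (hypothetical)
    """Illustrative flow stress (Pa) vs. grain size d_m (m) and strain rate (1/s).

    Ignores the inverse Hall-Petch softening reported for the very smallest
    grain sizes; real models for nanocrystalline Ni are considerably richer.
    """
    hall_petch = sigma0 + k_hp / math.sqrt(d_m)          # grain-size strengthening
    return hall_petch * (rate_per_s / rate_ref) ** m     # rate hardening

# Example: 50 nm grains at a laser-driven strain rate of 1e7 / s.
print(round(flow_stress(50e-9, 1e7) / 1e9, 2), "GPa")

With these invented parameters the sketch returns roughly 1.7 GPa at 50 nm and 10^7/s, in the general strength range reported for nanocrystalline Ni; the experiments and simulations in the project are what pin down the actual form and coefficients.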

Authors:
Bringa, E.
Publication Date:
2007-02-23
Research Org.:
Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
Sponsoring Org.:
USDOE
OSTI Identifier:
902248
Report Number(s):
UCRL-TR-228484
TRN: US200717%%502
DOE Contract Number:
W-7405-ENG-48
Resource Type:
Technical Report
Country of Publication:
United States
Language:
English
Subject:
36 MATERIALS SCIENCE; COMPRESSION; DEFORMATION; GRAIN SIZE; PLASTICS; STRAIN RATE; TRANSMISSION ELECTRON MICROSCOPY; X-RAY DIFFRACTION

Citation Formats

Bringa, E. Final Report LDRD 04-ERD-021. United States: N. p., 2007. Web. doi:10.2172/902248.
Bringa, E. Final Report LDRD 04-ERD-021. United States. doi:10.2172/902248.
Bringa, E. 2007. "Final Report LDRD 04-ERD-021". United States. doi:10.2172/902248. https://www.osti.gov/servlets/purl/902248.
@techreport{osti_902248,
  title       = {Final Report LDRD 04-ERD-021},
  author      = {Bringa, E.},
  institution = {Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)},
  number      = {UCRL-TR-228484},
  doi         = {10.2172/902248},
  address     = {United States},
  year        = {2007},
  month       = feb
}

Similar Records:
  • The primary purposes of this project were to (1) improve and validate the LLNL/IMPACT atmospheric chemistry and aerosol transport model, (2) experimentally analyze size- and time-resolved aerosol measurements taken during spring 2001 in Northern California, and (3) understand the origin of dust impacting Northern California. Under this project, we (1) more than doubled the resolution of the LLNL-IMPACT global atmospheric chemistry and aerosol model (to 1 x 1 degree), (2) added an interactive dust emission algorithm to the IMPACT model in order to simulate observed events, (3) added detailed microphysics to the IMPACT model to calculate the size distribution of aerosols in terms of mass, (4) analyzed the aerosol mass and elemental composition of the size- and time-resolved aerosol measurements made by our UC Davis collaborators, and (5) determined that the majority of the observed soil dust comes from intercontinental transport across the Pacific. A detailed report on this project is in the attached document "Impact of Long-Range Dust Transport on Northern California in Spring 2002" (UCRL-TR-209597), except for the addition of aerosol microphysics, which is covered in the attached document "Implementation of the Missing Aerosol Physics into LLNL IMPACT" (UCRL-TR-209568). In addition to the technical results, this project has (1) produced a journal article presenting our results that will be submitted shortly, (2) enabled collaborations with UC Davis and the California Air Resources Board, (3) generated a direct DOE request and a large computer allocation to simulate the radiative impact of sulfate aerosols at high resolution over the last 50 years, and (4) contributed to successful LLNL responses to requests for proposals from (a) the DOE Atmospheric Science Program ($780k), (b) the DOE Atmospheric Radiation Measurement Program ($720k), and (c) the NASA Global Modeling and Analysis Program ($525k). The journal article will be based on the report listed above ("Impact of Long-Range Dust Transport on Northern California in Spring 2002") and will be submitted to the Journal of Geophysical Research.
  • In the event of a nuclear or radiological accident or terrorist event, it is important to identify individuals who can benefit from prompt medical care and to reassure those who do not need it. Achieving these goals will maximize the ability to manage the medical consequences of radiation exposure, which unfold over a period of hours, days, weeks, or years, depending on dose. Medical interventions that reduce near-term morbidity and mortality from high but non-lethal exposures require advanced medical support and must be focused on those in need as soon as possible. There are two traditional approaches to radiation dosimetry, physical and biological. Each as currently practiced has strengths and limitations. Physical dosimetry for radiation exposure is routine for selected sites and for individual nuclear workers in certain industries, medical centers, and research institutions. No monitoring of individuals in the general population is currently performed. When physical dosimetry is available at the time of an accident/event or soon thereafter, it can provide valuable information in support of accident/event triage. Lack of data for most individuals is a major limitation, as differences in exposure can be significant due to shielding, atmospherics, etc. A smaller issue in terms of the number of people affected is that the same dose may have more or less biological effect on subsets of the population. Biological dosimetry is the estimation of exposure based on physiological or cellular alterations induced in an individual by radiation. The best-established and most precise biodosimetric methods are measurement of the decline of blood cells over time and measurement of the frequency of chromosome aberrations. In accidents or events affecting small numbers of people, it is practical to allocate the resources and time (days of clinical follow-up or specialists' laboratory time) to conduct these studies. However, if large numbers of people have been exposed, or fear they may have been, these methods are not suitable. The best current option for triage radiation biodosimetry is self-report of time to onset of emesis after the event, a biomarker that is subject to many false positives. The premise of this project is that greatly improved radiation dosimetry can be achieved by research and development directed toward detection of molecular changes induced by radiation in cells or other biological materials. Basic research on the responses of cells to radiation at the molecular level, particularly of messenger RNA and proteins, has identified biomolecules whose levels increase (or decrease) as part of cellular responses to radiation. Concerted efforts to identify markers useful for triage and clinical applications have not yet been reported. Such studies would scan responses over a broad range of doses, below, at, and above the threshold of clinical significance in the first weeks after exposure, and would collect global proteome and/or transcriptome information on all tissue samples accessible to either first responders or clinicians. For triage, the goal is to identify those needing medical treatment. Treatment will be guided by refined dosimetry. Achieving this goal entails determining whether radiation exposure was below or above the threshold of concern, using one sample collected within days of an event, with simple devices that first responders either use or distribute for self-testing.
For the clinic, better resolution of dose and tissue damage is needed to determine the nature and time sensitivity of therapy, but multiple sampling times may be acceptable, and clinical staff and equipment can be utilized. Two complementary areas of research and development are needed once candidate biomarkers are identified: validation of the biomarker responses and validation of devices/instrumentation for detection of those responses. Validation of biomarkers per se is confirmation that the dose-, time-, and tissue-specific responses meet the reporting requirements in a high proportion of the population, and that variation among nonexposed people due to age, lifestyle factors, common medical conditions, and other variables not related to radiation does not lead to unacceptable frequencies of false negatives or false positives. Validation of detection requires testing of devices/instruments for accuracy and reproducibility of results with the intended reagents, sampling protocols, and users. Different technologies, each with intrinsic virtues and liabilities, will be appropriate for RNA and protein biomarkers. Fortunately, device and instrumentation development for other clinical applications is a major industry. Hence the major challenges for radiation biodosimetry are identification of potential radiation exposure biomarkers and development of model systems that enable validation of biomarker responses and detection systems.
  • Research carried out in the framework of the LDRD project "Surrogate Nuclear Reactions and the Origin of the Heavy Elements" (04-ERD-057) is summarized. The project was designed to address the challenge of determining cross sections for nuclear reactions involving unstable targets, with a particular emphasis on reactions that play a key role in the production of the elements between iron and uranium. This report reviews the motivation for the research, introduces the approach employed to address the problem, and summarizes the resulting scientific insights, technical findings, and related accomplishments.
  • Probabilistic inverse techniques, like the Markov Chain Monte Carlo (MCMC) algorithm, have had recent success in combining disparate data types into a consistent model. The Stochastic Engine (SE) initiative developed this method and applied it to a number of earth science and national security applications. For instance, while the method was originally developed to solve groundwater flow problems (Aines et al.), it has also been applied to atmospheric modeling and engineering problems. The investigators of this proposal have applied the SE to regional-scale lithospheric earth models, which have applications to hazard analysis and nuclear explosion monitoring. While this broad applicability is appealing, tailoring the method for each application is inefficient and time-consuming. Stochastic methods invert data by probabilistically sampling the model space, comparing observations predicted by each proposed model to the observed data, and preferentially accepting models that produce a good fit, thereby generating a posterior distribution; a minimal sketch of this sample-predict-accept loop appears after this list. In other words, the method "inverts" for a model, or, more precisely, a distribution of models, by a series of forward calculations. While powerful, the technique is often challenging to implement, as the mapping from model space to data needs to be "customized" for each data type. For example, all proposed models might need to be transformed through sensitivity kernels from 3-D models to 2-D models in one step in order to compute path integrals, and transformed in a completely different manner in the next step. We seek technical enhancements that widen the applicability of the Stochastic Engine by generalizing some aspects of the method (i.e., model-to-data transformation types, configuration, model representation). Initially, we wish to generalize the transformations that are necessary to match the observations to proposed models. These transformations are general enough not to be tied to any single application. This is a new and innovative approach to the problem, providing a framework to increase the efficiency of its implementation. The overall goal is to reduce response time and make the approach as "plug-and-play" as possible, which will result in the rapid accumulation of new data types for a host of both earth science and non-earth science problems.
  • The LDRD project "A New Method for Wave Propagation in Elastic Media" developed several improvements to the traditional finite difference technique for seismic wave propagation, including a summation-by-parts discretization that is provably stable for arbitrary heterogeneous materials, an accurate treatment of non-planar topography, local mesh refinement, and stable outflow boundary conditions; a small worked example of a summation-by-parts operator follows this list. This project also implemented these techniques in a parallel open-source computer code called WPP and participated in several seismic modeling efforts to simulate ground motion due to earthquakes in Northern California. This research has been documented in six individual publications, which are summarized in this report. Of these publications, four are published refereed journal articles, one is an accepted refereed journal article that has not yet been published, and one is a non-refereed software manual. The report concludes with a discussion of future research directions and an exit plan.
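As a concrete illustration of the sample-predict-accept loop described in the Stochastic Engine item above, here is a minimal Metropolis-Hastings sketch in Python. It is a generic toy, not code from the SE: the linear forward model, the three data points, the noise level, and the proposal step size are all invented for illustration.

import math
import random

# Toy observations and the locations where they were made (all invented).
observed = [2.1, 3.9, 6.2]
xs = [1.0, 2.0, 3.0]
sigma = 0.5  # assumed observation-noise standard deviation

def forward(model, x):
    # Hypothetical forward model: predict a datum from a line y = a*x + b.
    a, b = model
    return a * x + b

def log_likelihood(model):
    # Gaussian misfit between predicted and observed data.
    return -0.5 * sum(((forward(model, x) - y) / sigma) ** 2
                      for x, y in zip(xs, observed))

def mcmc(n_steps=20000, step=0.1):
    model = [0.0, 0.0]  # starting model (slope, intercept)
    ll = log_likelihood(model)
    posterior = []      # the accepted chain approximates the posterior
    for _ in range(n_steps):
        # Propose a random perturbation of the current model.
        proposal = [m + random.gauss(0.0, step) for m in model]
        ll_new = log_likelihood(proposal)
        # Metropolis rule: always accept improvements; accept worse
        # models with probability exp(ll_new - ll).
        if random.random() < math.exp(min(0.0, ll_new - ll)):
            model, ll = proposal, ll_new
        posterior.append(list(model))
    return posterior

samples = mcmc()
mean_a = sum(s[0] for s in samples) / len(samples)
mean_b = sum(s[1] for s in samples) / len(samples)
print("posterior mean (slope, intercept):", round(mean_a, 2), round(mean_b, 2))

The generalization that item calls for would amount to making forward() a pluggable model-to-data transformation, so that the same accept/reject machinery can serve many data types without per-application rewrites.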
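To make the summation-by-parts (SBP) idea in the last item concrete, the following Python/NumPy sketch builds the standard second-order SBP first-derivative operator and checks the discrete integration-by-parts identity that underlies the provable-stability claim. These are textbook operators, not the ones implemented in WPP.

import numpy as np

def sbp_operators(n, h):
    """Return (D, H) with D = H^{-1} Q, where Q + Q^T = B = diag(-1, 0, ..., 0, 1)."""
    H = h * np.eye(n)
    H[0, 0] = H[-1, -1] = h / 2.0        # boundary-modified norm (quadrature weights)
    Q = np.zeros((n, n))
    for i in range(n - 1):               # antisymmetric interior stencil
        Q[i, i + 1] = 0.5
        Q[i + 1, i] = -0.5
    Q[0, 0] = -0.5                       # boundary closures
    Q[-1, -1] = 0.5
    D = np.linalg.solve(H, Q)            # first-derivative approximation
    return D, H

n, h = 11, 0.1
D, H = sbp_operators(n, h)

# SBP property: H D + (H D)^T equals the boundary matrix B exactly,
# the discrete analogue of integration by parts.
B = np.zeros((n, n))
B[0, 0], B[-1, -1] = -1.0, 1.0
print(np.allclose(H @ D + (H @ D).T, B))   # True

# D differentiates linear functions exactly, including at the boundaries.
x = np.linspace(0.0, 1.0, n)
print(np.allclose(D @ x, np.ones(n)))      # True

Stability follows from this identity: for the model equation u_t = u_x discretized as u_t = D u, the discrete energy u^T H u changes only through the boundary terms u_n^2 - u_0^2, so well-posed boundary conditions give a provably non-growing energy regardless of material heterogeneity.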