National Library of Energy BETA

Sample records for advanced scientific computing

  1. Advanced Scientific Computing Research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Advanced Scientific Computing Research Advanced Scientific Computing Research Discovering, ... The DOE Office of Science's Advanced Scientific Computing Research (ASCR) program ...

  2. Advanced Scientific Computing Research

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Advanced Scientific Computing Research Advanced Scientific Computing Research Discovering, developing, and deploying computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to the Department of Energy. Get Expertise Pieter Swart (505) 665 9437 Email Pat McCormick (505) 665-0201 Email Dave Higdon (505) 667-2091 Email Fulfilling the potential of emerging computing systems and architectures beyond today's tools and techniques to deliver

  3. Advanced Scientific Computing Research (ASCR)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... ASCR's programs have helped establish computation as a third pillar of science along with theory and physical experiments. Sandia has extensive ASCR programs in Computer Science ...

  4. Large Scale Computing and Storage Requirements for Advanced Scientific...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research: Target 2014 Large Scale Computing and Storage Requirements for ...

  5. Large Scale Computing and Storage Requirements for Advanced Scientific

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing Research: Target 2014 Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research: Target 2014 Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research An ASCR / NERSC Review January 5-6, 2011 Final Report Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research, Report of the Joint ASCR / NERSC Workshop conducted January 5-6, 2011 Goals This workshop is being

  6. Advanced Scientific Computing Research Network Requirements

    SciTech Connect (OSTI)

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  7. Energy Department Requests Proposals for Advanced Scientific Computing

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Research | Department of Energy Advanced Scientific Computing Research Energy Department Requests Proposals for Advanced Scientific Computing Research December 27, 2005 - 4:55pm WASHINGTON, DC - The Department of Energy's Office of Science and the National Nuclear Security Administration (NNSA) have issued a joint Request for Proposals for advanced scientific computing research. DOE expects to fund $67 million annually for three to five years under its Scientific Discovery

  8. NERSC Role in Advanced Scientific Computing Research Katherine Yelick

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Advanced Scientific Computing Research Katherine Yelick NERSC Director Requirements Workshop NERSC Mission The mission of the National Energy Research Scientific Computing Center (NERSC) is to accelerate the pace of scientific discovery by providing high performance computing, information, data, and communications services for all DOE Office of Science (SC) research. Sample Scientific Accomplishments at NERSC 3 Award-winning software uses massively-parallel supercomputing to map hydrocarbon

  9. Advanced Scientific Computing Advisory Committee (ASCAC) Homepage | U.S.

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    DOE Office of Science (SC) ASCAC Home Advanced Scientific Computing Advisory Committee (ASCAC) ASCAC Home Meetings Members Charges/Reports ASCAC Charter 2015 - signed .pdf file (134KB) ASCR Committees of Visitors Federal Advisory Committees ASCR Home Exascale Advisory Committee Report .pdf file (2.1MB) The Opportunities and Challenges of Exascale Computing The Exascale initiative will be significant and transformative for Department of Energy missions. The ASCAC Subcommittee report is

  10. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Subcommittee Report on Scientific and Technical Information

    SciTech Connect (OSTI)

    Hey, Tony; Agarwal, Deborah; Borgman, Christine; Cartaro, Concetta; Crivelli, Silvia; Van Dam, Kerstin Kleese; Luce, Richard; Arjun, Shankar; Trefethen, Anne; Wade, Alex; Williams, Dean

    2015-09-04

    The Advanced Scientific Computing Advisory Committee (ASCAC) was charged to form a standing subcommittee to review the Department of Energy’s Office of Scientific and Technical Information (OSTI) and to begin by assessing the quality and effectiveness of OSTI’s recent and current products and services and to comment on its mission and future directions in the rapidly changing environment for scientific publication and data. The Committee met with OSTI staff and reviewed available products, services and other materials. This report summarizes their initial findings and recommendations.

  11. Advanced Scientific Computing Research (ASCR) Homepage | U.S...

    Office of Science (SC) Website

    Edison Dedication External link Users are invited to make heavy use of new computer as ... computing, including the need for a new scientific workflow. Read More .pdf file ...

  12. DOE Advanced Scientific Computing Advisory Committee (ASCAC) Report: Exascale Computing Initiative Review

    SciTech Connect (OSTI)

    Reed, Daniel; Berzins, Martin; Pennington, Robert; Sarkar, Vivek; Taylor, Valerie

    2015-08-01

    On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.

  13. National facility for advanced computational science: A sustainable path to scientific discovery

    SciTech Connect (OSTI)

    Simon, Horst; Kramer, William; Saphir, William; Shalf, John; Bailey, David; Oliker, Leonid; Banda, Michael; McCurdy, C. William; Hules, John; Canning, Andrew; Day, Marc; Colella, Philip; Serafini, David; Wehner, Michael; Nugent, Peter

    2004-04-02

    Lawrence Berkeley National Laboratory (Berkeley Lab) proposes to create a National Facility for Advanced Computational Science (NFACS) and to establish a new partnership between the American computer industry and a national consortium of laboratories, universities, and computing facilities. NFACS will provide leadership-class scientific computing capability to scientists and engineers nationwide, independent of their institutional affiliation or source of funding. This partnership will bring into existence a new class of computational capability in the United States that is optimal for science and will create a sustainable path towards petaflops performance.

  14. ADVANCED SCIENTIFIC COMPUTING ADVISORY COMMITTEE April 4, 2016 | U.S. DOE

    Office of Science (SC) Website

    Office of Science (SC) 16 Advanced Scientific Computing Advisory Committee (ASCAC) ASCAC Home Meetings September 2016 April 2016 December 2015 July 2015 March 2015 November 2014 March 2014 November 2013 March 2013 October 2012 August 2012 March 2012 November 2011 August 2011 March 2011 November 2010 August 2010 March 2010 November 2009 August 2009 March 2009 October 2008 August 2008 February 2008 November 2007 August 2007 February 2007 November 2006 August 2006 March 2006 April 2004 March

  15. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC) Report: Top Ten Exascale Research Challenges

    SciTech Connect (OSTI)

    Lucas, Robert; Ang, James; Bergman, Keren; Borkar, Shekhar; Carlson, William; Carrington, Laura; Chiu, George; Colwell, Robert; Dally, William; Dongarra, Jack; Geist, Al; Haring, Rud; Hittinger, Jeffrey; Hoisie, Adolfy; Klein, Dean Micron; Kogge, Peter; Lethin, Richard; Sarkar, Vivek; Schreiber, Robert; Shalf, John; Sterling, Thomas; Stevens, Rick; Bashor, Jon; Brightwell, Ron; Coteus, Paul; Debenedictus, Erik; Hiller, Jon; Kim, K. H.; Langston, Harper; Murphy, Richard Micron; Webster, Clayton; Wild, Stefan; Grider, Gary; Ross, Rob; Leyffer, Sven; Laros III, James

    2014-02-10

    Exascale computing systems are essential for the scientific fields that will transform the 21st century global economy, including energy, biotechnology, nanotechnology, and materials science. Progress in these fields is predicated on the ability to perform advanced scientific and engineering simulations, and analyze the deluge of data. On July 29, 2013, ASCAC was charged by Patricia Dehmer, the Acting Director of the Office of Science, to assemble a subcommittee to provide advice on exascale computing. This subcommittee was directed to return a list of no more than ten technical approaches (hardware and software) that will enable the development of a system that achieves the Department's goals for exascale computing. Numerous reports over the past few years have documented the technical challenges and the non-viability of simply scaling existing computer designs to reach exascale. The technical challenges revolve around energy consumption, memory performance, resilience, extreme concurrency, and big data. Drawing from these reports and more recent experience, this ASCAC subcommittee has identified the top ten computing technology advancements that are critical to making a capable, economically viable exascale system.

  16. Edison Electrifies Scientific Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Edison Electrifies Scientific Computing Edison Electrifies Scientific Computing NERSC Flips Switch on New Flagship Supercomputer January 31, 2014 Contact: Margie Wylie, mwylie@lbl.gov, +1 510 486 7421 The National Energy Research Scientific Computing (NERSC) Center recently accepted "Edison," a new flagship supercomputer designed for scientific productivity. Named in honor of American inventor Thomas Alva Edison, the Cray XC30 will be dedicated in a ceremony held at the Department of

  17. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    SciTech Connect (OSTI)

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..; Easter, Richard C; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; Lowrie, Robert B.; Lucas, Donald D.; Ma, Po-lun; Sacks, William J.; Shrivastava, Manish; Singh, Balwinder; Tautges, Timothy J.; Taylor, Mark A.; Vertenstein, Mariana; Worley, Patrick H.

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) Project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.
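    To make "tracer advection algorithm" concrete for readers outside the field, the sketch below is a generic first-order upwind update for a single 1D tracer on a periodic grid. It is offered only as an illustration of the kind of transport kernel the project optimizes; it is not the ACES4BGC scheme, and the grid, velocity, and time step are made-up values.

        import numpy as np

        def upwind_step(q, u, dx, dt):
            # One explicit first-order upwind step for dq/dt + u dq/dx = 0 (u > 0)
            # on a periodic 1D grid. Stability requires the CFL number u*dt/dx <= 1.
            assert u * dt / dx <= 1.0, "CFL condition violated"
            q_left = np.roll(q, 1)                 # upwind neighbor value
            return q - (u * dt / dx) * (q - q_left)

        x = np.linspace(0.0, 1.0, 100, endpoint=False)
        q = np.exp(-((x - 0.3) ** 2) / 0.01)       # a Gaussian blob of tracer
        for _ in range(50):                        # advect the blob to the right
            q = upwind_step(q, u=1.0, dx=x[1] - x[0], dt=0.005)

    Production ESM advection schemes are higher-order and conservative in multiple dimensions; this sketch only shows the basic update structure.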

  18. Scientific Cloud Computing Misconceptions

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Cloud Computing Misconceptions Scientific Cloud Computing Misconceptions July 1, 2011 Part of the Magellan project was to understand both the possibilities and the limitations of cloud computing in the pursuit of science. At a recent conference, Magellan investigator Shane Canon outlined some persistent misconceptions about doing science in the cloud - and what Magellan has taught us about them. » Read the ISGTW story. » Download the slides (PDF, 4.1MB

  19. Helping Advance the Scientific Foundation that Enables Major...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... Quantum Optics Polariton Lasing Unconventional Lasing Enabling Energy Efficiency ... Fusion Energy Sciences Advanced Scientific Computing Research (ASCR) Biological and ...

  20. National Energy Research Scientific Computing Center

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Computing Center 2004 annual report Cover image: Visualization based on a simulation of the density of a fuel pellet after it is injected into a tokamak fusion reactor. See page 40 for more information. National Energy Research Scientific Computing Center 2004 annual report Ernest Orlando Lawrence Berkeley National Laboratory * University of California * Berkeley, California 94720 This work was supported by the Director, Office of Science, Office of Advanced Scientific Computing

  1. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

    SciTech Connect (OSTI)

    Saffer, Shelley I.

    2014-12-01

    This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievement of the project's goals and the resulting research made possible by this award.

  2. Advanced Computing Tech Team | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Advanced Computing Tech Team The Advanced Computing Tech Team is working with the DOE Energy Technology Offices, the Office of Science, and the National Nuclear Security Administration to deliver technologies that will be used to create new scientific insights into complex physical systems. Advanced computing technologies have been used for decades to provide better understanding of the performance and reliability of the nuclear stockpile

  3. Edison Electrifies Scientific Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... Deployment of Edison was made possible in part by funding from DOE's Office of Science and the DARPA High Productivity Computing Systems program. DOE's Office of Science is the ...

  4. Energy Department Requests Proposals for Advanced Scientific...

    Broader source: Energy.gov (indexed) [DOE]

    integrates applied mathematics, computer science and computational science in the physical, biological and environmental sciences for scientific discovery on petascale computers. ...

  5. About the Advanced Computing Tech Team | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    About the Advanced Computing Tech Team About the Advanced Computing Tech Team The Advanced Computing Tech Team is made up of representatives from DOE and its national laboratories who are involved with developing and using advanced computing tools. The following is a list of some of those programs and how they are currently using advanced computing in pursuit of their respective missions. Advanced Scientific Computing Research (ASCR) The mission of the Advanced Scientific Computing Research

  6. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect (OSTI)

    Damevski, Kostadin

    2009-03-30

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  7. DOE Supercomputing Resources Available for Advancing Scientific

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Breakthroughs | Department of Energy Supercomputing Resources Available for Advancing Scientific Breakthroughs DOE Supercomputing Resources Available for Advancing Scientific Breakthroughs April 15, 2009 - 12:00am Washington, DC - The U.S. Department of Energy (DOE) announced today it is accepting proposals for a program to support high-impact scientific advances through the use of some of the world's most powerful supercomputers located at DOE national laboratories. Approximately

  8. Advanced Simulation and Computing

    National Nuclear Security Administration (NNSA)

    NA-ASC-117R-09-Vol.1-Rev.0 Advanced Simulation and Computing PROGRAM PLAN FY09 October 2008 ASC Focal Point Robert Meisner, Director DOE/NNSA NA-121.2 202-586-0908 Program Plan Focal Point for NA-121.2 Njema Frazier DOE/NNSA NA-121.2 202-586-5789 A Publication of the Office of Advanced Simulation & Computing, NNSA Defense Programs. Contents: Executive Summary; I. Introduction

  9. NERSC National Energy Research Scientific Computing Center

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    National Energy Research Scientific Computing Center 2007 Annual Report National Energy Research Scientific Computing Center 2007 Annual Report Ernest Orlando Lawrence Berkeley National Laboratory 1 Cyclotron Road, Berkeley, CA 94720-8148 This work was supported by the Director, Office of Science, Office of Advanced Scientific Computing Research of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. LBNL-1143E, October 2008 National Energy Research Scientific Computing

  10. What Are the Computational Keys to Future Scientific Discoveries?

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    What are the Computational Keys to Future Scientific Discoveries? What Are the Computational Keys to Future Scientific Discoveries? NERSC Develops a Data Intensive Pilot Program to Help Scientists Find Out August 23, 2012 Linda Vu, lvu@lbl.gov, +1 510 495 2402 Advanced Light Source at the Lawrence Berkeley National Laboratory. (Photo by: Roy Kaltschmidt, Berkeley Lab) A new camera at the hard x-ray tomography beamline of Lawrence Berkeley National Laboratory's (Berkeley Lab's) Advanced

  11. Sandia National Laboratories: Advanced Simulation and Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Advanced Simulation and Computing Advanced Simulation and Computing Taking on the World's Complex Challenges Advancing Science Frontiers Our research is producing new scientific insights about the world in which we live and assists in certifying the safety and reliability of the nation's nuclear weapons stockpile. Technology Provides the Tools Growth in data and the software and hardware demands needed for physics-based answers and predictive capabilities are

  12. Large Scale Production Computing and Storage Requirements for Advanced

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Computing Research: Target 2017 Large Scale Production Computing and Storage Requirements for Advanced Scientific Computing Research: Target 2017 This is an invitation-only review organized by the Department of Energy's Office of Advanced Scientific Computing Research (ASCR) and NERSC. The general goal is to determine production high-performance computing, storage, and services that will be needed for ASCR to achieve its science goals through 2017. A specific focus

  13. Advanced Scientific Computing Research Jobs

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  14. PIA - Advanced Test Reactor National Scientific User Facility...

    Broader source: Energy.gov (indexed) [DOE]

    Advanced Test Reactor National Scientific User Facility Users Week 2009 PIA - Advanced Test Reactor National Scientific User Facility Users Week 2009 (316.78 KB) More Documents & ...

  15. Fermilab | Science | Particle Physics | Scientific Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Computing Feynman Computing Center State-of-the-art computing facilities and expertise drive successful research in experimental and theoretical particle physics. Fermilab is a pioneer in managing "big data" and counts scientific computing as one of its core competencies. For scientists to understand the huge amounts of raw information coming from particle physics experiments, they must process, analyze and compare the information to simulations. To accomplish these feats,

  16. Advances and Challenges in Computational Plasma Science

    SciTech Connect (OSTI)

    W.M. Tang; V.S. Chan

    2005-01-03

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behavior. Recent advances in simulations of magnetically-confined plasmas are reviewed in this paper with illustrative examples chosen from associated research areas such as microturbulence, magnetohydrodynamics, and other topics. Progress has been stimulated in particular by the exponential growth of computer speed along with significant improvements in computer technology.

  17. advanced simulation and computing

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  18. Scientific Computing at Los Alamos National Laboratory (Conference...

    Office of Scientific and Technical Information (OSTI)

    Scientific Computing at Los Alamos National Laboratory Citation Details In-Document Search Title: Scientific Computing at Los Alamos National Laboratory You are accessing a ...

  19. Recap: Advancing Scientific Innovation at the National Labs | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Energy Advancing Scientific Innovation at the National Labs Recap: Advancing Scientific Innovation at the National Labs April 3, 2014 - 1:00pm Ben Dotson, Former Project Coordinator for Digital Reform, Office of Public Affairs Advancing Scientific Innovation at the National Labs During the month of March, we featured the Energy Department's National Labs and how they are advancing scientific innovation through user facilities and industry partnerships. Storified by Energy

  20. PIA - Advanced Test Reactor National Scientific User Facility Users Week

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2009 | Department of Energy PIA - Advanced Test Reactor National Scientific User Facility Users Week 2009 (316.78 KB) More Documents & Publications PIA - INL SECURITY INFORMATION MANAGEMENT SYSTEM BUSINESS ENCLAVE PIA - INL Education Programs Business Enclave

  1. Can Cloud Computing Address the Scientific Computing Requirements...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    the ever-increasing computational needs of scientists, Department of Energy ... and as the largest funder of basic scientific research in the U.S., DOE was interested in ...

  2. Exploring HPCS Languages in Scientific Computing

    SciTech Connect (OSTI)

    Barrett, Richard F; Alam, Sadaf R; de Almeida, Valmor F; Bernholdt, David E; Elwasif, Wael R; Kuehn, Jeffery A; Poole, Stephen W; Shet, Aniruddha G

    2008-01-01

    As computers scale up dramatically to tens and hundreds of thousands of cores, develop deeper computational and memory hierarchies, and become increasingly heterogeneous, developers of scientific software are increasingly challenged to express complex parallel simulations effectively and efficiently. In this paper, we explore the three languages developed under the DARPA High-Productivity Computing Systems (HPCS) program to help address these concerns: Chapel, Fortress, and X10. These languages provide a variety of features not found in currently popular HPC programming environments and make it easier to express powerful computational constructs, leading to new ways of thinking about parallel programming. Though the languages and their implementations are not yet mature enough for a comprehensive evaluation, we discuss some of the important features, and provide examples of how they can be used in scientific computing. We believe that these characteristics will be important to the future of high-performance scientific computing, whether the ultimate language of choice is one of the HPCS languages or something else.
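    The record mentions parallel constructs in Chapel, Fortress, and X10 but, being an abstract, shows none of them. Purely as a stand-in, the sketch below expresses the same parallel-map-plus-reduction pattern those languages provide natively, written here in plain Python with the standard library; the midpoint-rule pi integration is an arbitrary example workload, not taken from the paper.

        from concurrent.futures import ProcessPoolExecutor
        import math

        def partial_integral(chunk):
            # Midpoint-rule integral of 4/(1+x^2) over [a, b] with n points;
            # summing the chunks over [0, 1] approximates pi.
            a, b, n = chunk
            h = (b - a) / n
            return h * sum(4.0 / (1.0 + (a + (i + 0.5) * h) ** 2) for i in range(n))

        if __name__ == "__main__":
            chunks = [(i / 4.0, (i + 1) / 4.0, 250_000) for i in range(4)]
            with ProcessPoolExecutor() as pool:
                # parallel map over chunks, followed by a global sum reduction
                pi_estimate = sum(pool.map(partial_integral, chunks))
            print(pi_estimate, math.pi)

    In an HPCS language this map-and-reduce would be a single built-in parallel construct rather than explicit process-pool plumbing, which is the point the paper's examples make.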

  3. (Sparsity in large scale scientific computation)

    SciTech Connect (OSTI)

    Ng, E.G.

    1990-08-20

    The traveler attended a conference organized by the 1990 IBM Europe Institute at Oberlech, Austria. The theme of the conference was on sparsity in large scale scientific computation. The conference featured many presentations and other activities of direct interest to ORNL research programs on sparse matrix computations and parallel computing, which are funded by the Applied Mathematical Sciences Subprogram of the DOE Office of Energy Research. The traveler presented a talk on his work at ORNL on the development of efficient algorithms for solving sparse nonsymmetric systems of linear equations. The traveler held numerous technical discussions on issues having direct relevance to the research programs on sparse matrix computations and parallel computing at ORNL.
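    For readers unfamiliar with the problem class named here, the sketch below solves a tiny sparse nonsymmetric linear system with off-the-shelf SciPy routines. It only illustrates what "sparse nonsymmetric systems of linear equations" means; it has no connection to the ORNL algorithms discussed at the conference, and the matrix values are arbitrary.

        import numpy as np
        from scipy.sparse import csr_matrix
        from scipy.sparse.linalg import spsolve, gmres

        # A small sparse matrix that is nonsymmetric (A[0,2] != A[2,0]).
        A = csr_matrix(np.array([[ 4.0, -1.0,  0.5],
                                 [-1.0,  4.0, -1.0],
                                 [ 0.0, -1.0,  4.0]]))
        b = np.array([1.0, 2.0, 3.0])

        x_direct = spsolve(A, b)         # sparse direct factorization
        x_krylov, info = gmres(A, b)     # GMRES, a Krylov method suited to nonsymmetric A
        print(x_direct, x_krylov, info)  # info == 0 indicates GMRES converged

    Research codes of the kind described in the record differ mainly in how they order, factor, and parallelize much larger systems, not in the basic problem statement shown here.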

  4. Center for Technology for Advanced Scientific Component Software (TASCS) Consolidated Progress Report July 2006 - March 2009

    SciTech Connect (OSTI)

    Bernholdt, D E; McInnes, L C; Govindaraju, M; Bramley, R; Epperly, T; Kohl, J A; Nieplocha, J; Armstrong, R; Shasharina, S; Sussman, A L; Sottile, M; Damevski, K

    2009-04-14

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  5. Magellan Explores Cloud Computing for DOE's Scientific Mission

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Explores Cloud Computing for DOE's Scientific Mission Magellan Explores Cloud Computing for DOE's Scientific Mission March 30, 2011 Cloud Control -This is a picture of the Magellan management and network control racks at NERSC. To test cloud computing for scientific capability, NERSC and the Argonne Leadership Computing Facility (ALCF) installed purpose-built testbeds for running scientific applications on the IBM iDataPlex cluster. (Photo Credit: Roy Kaltschmidt) Cloud computing is gaining

  6. Sandia National Laboratories: Advanced Simulation and Computing:

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computational Systems & Software Environment Computational Systems & Software Environment Advanced Simulation and Computing Computational Systems & Software Environment Integrated Codes Physics & Engineering Models Verification & Validation Facilities Operation & User Support Research & Collaboration Contact ASC Advanced Simulation and Computing Computational Systems & Software Environment Crack Modeling The Computational Systems & Software Environment

  7. Final Report for "Center for Technology for Advanced Scientific Component Software"

    SciTech Connect (OSTI)

    Svetlana Shasharina

    2010-12-01

    The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to be more usable.

  8. Advanced Simulation and Computing Program

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Advanced Simulation and Computing (ASC) Program Unstable intermixing of heavy (sulfur hexafluoride) and light fluid (air). Show Caption Turbulence generated by unstable fluid flow. Show Caption Examining the effects of a one-megaton nuclear energy source detonated on the surface of an asteroid. Show Caption Los Alamos National Laboratory is home to two of the world's most powerful supercomputers, each capable of performing more than 1,000 trillion operations per second. The newer one, Cielo, was

  9. Institute for Scientific Computing Research Fiscal Year 2002 Annual Report

    SciTech Connect (OSTI)

    Keyes, D E; McGraw, J R; Bodtker, L K

    2003-03-11

    The Institute for Scientific Computing Research (ISCR) at Lawrence Livermore National Laboratory is jointly administered by the Computing Applications and Research Department (CAR) and the University Relations Program (URP), and this joint relationship expresses its mission. An extensively externally networked ISCR cost-effectively expands the level and scope of national computational science expertise available to the Laboratory through CAR. The URP, with its infrastructure for managing six institutes and numerous educational programs at LLNL, assumes much of the logistical burden that is unavoidable in bridging the Laboratory's internal computational research environment with that of the academic community. As large-scale simulations on the parallel platforms of DOE's Advanced Simulation and Computing (ASCI) become increasingly important to the overall mission of LLNL, the role of the ISCR expands in importance accordingly. Relying primarily on non-permanent staffing, the ISCR complements Laboratory research in areas of the computer and information sciences that are needed at the frontier of Laboratory missions. The ISCR strives to be the "eyes and ears" of the Laboratory in the computer and information sciences, in keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands," carrying those advances into the Laboratory and incorporating them into practice. In addition to conducting research, the ISCR provides continuing education opportunities to Laboratory personnel, in the form of on-site workshops taught by experts on novel software or hardware technologies. The ISCR also seeks to influence the research community external to the Laboratory to pursue Laboratory-related interests and to train the workforce that will be required by the Laboratory. Part of the performance of this function is interpreting to the external community appropriate (unclassified) aspects of the Laboratory's own contributions

  10. ASCR Cybersecurity for Scientific Computing Integrity

    SciTech Connect (OSTI)

    Peisert, Sean

    2015-02-27

    The Department of Energy (DOE) has the responsibility to address the energy, environmental, and nuclear security challenges that face our nation. Much of DOE’s enterprise involves distributed, collaborative teams; a significant fraction involves “open science,” which depends on multi-institutional, often international collaborations that must access or share significant amounts of information between institutions and over networks around the world. The mission of the Office of Science is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security of the United States. The ability of DOE to execute its responsibilities depends critically on its ability to assure the integrity and availability of scientific facilities and computer systems, and of the scientific, engineering, and operational software and data that support its mission.
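    One small, generic building block behind "assuring the integrity of scientific data" is checksum verification. The sketch below is an illustration only, not the ASCR program's mechanism: it recomputes a file's SHA-256 digest and compares it with a previously recorded value. The file name and digest in the usage comment are hypothetical.

        import hashlib

        def sha256_of(path, block_size=1 << 20):
            # Stream the file in 1 MiB blocks so large datasets need not fit in memory.
            digest = hashlib.sha256()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(block_size), b""):
                    digest.update(block)
            return digest.hexdigest()

        def verify(path, expected_hex):
            # True only if the file on disk still matches the recorded digest.
            return sha256_of(path) == expected_hex

        # Example (hypothetical file and digest):
        # verify("simulation_output.h5", "9f86d081884c7d65...")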

  11. Barbara Helland Advanced Scientific Computing Research NERSC...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... drive discussions. Program Requirements Reviews: program offices evaluated every two-three years; participants include program managers, PI scientists, ESnet/NERSC staff...

  12. Supporting Advanced Scientific Computing Research * Basic Energy...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Peering upgrades: * EQX-SJ: installed MX480 on Oct 15th * EQX-ASH: installed MX480 on Nov 30th * EQX-CHI: pending MX480 ins...

  13. Supporting Advanced Scientific Computing Research * Basic Energy...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    * Human interface * Machine interfaces: REST/JSON Net Almanac: Example You got your ... - http://graphite.wikidot.com * REST - http://www.infoq.com/articles...

  14. DOE Advanced Scientific Computing Advisory Committee (ASCAC)...

    Office of Scientific and Technical Information (OSTI)

    Kerstin Kleese 5; Luce, Richard 6; Arjun, Shankar 7; Trefethen, Anne 8; Wade, Alex 9; Williams, Dean 10; eScience Institute, ...

  15. ADVANCED SCIENTIFIC COMPUTING ADVISORY COMMITTEEMonday, July...

    Office of Science (SC) Website

    All times listed are given in Eastern Standard Time. We request that members of the public notify the DFO, Christine Chalk, via email if they intend to call into the meeting ...

  16. DOE Advanced Scientific Computing Advisory Subcommittee (ASCAC...

    Office of Scientific and Technical Information (OSTI)

    Intel Institute for Defense Analyses University of California, San Diego IBM DARPA NVIDIA University of Tennessee Oak Ridge National Laboratory Lawrence Livermore ...

  17. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    SciTech Connect (OSTI)

    William M. Tang

    2011-02-09

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  18. National Energy Research Scientific Computing Center | U.S. DOE...

    Office of Science (SC) Website

    National Labs, Profiles, and Contacts National Energy Research Scientific Computing ... Technology Transfer U.S. Department of Energy SC-29/Germantown Building 1000 ...

  19. National Energy Research Scientific Computing Center NERSC Exceeds...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Computing Center NERSC Exceeds Reliability Standards With Tape-Based Active ... on the archive, NERSC's storage capacity and reliability requirements are significant. ...

  20. The National Energy Research Scientific Computing Center: Forty...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The National Energy Research Scientific Computing Center: Forty Years of Supercomputing ... discovery has been evident in both simulation and data analysis for many years. ...

  1. Energy Department Seeks Proposals to Use Scientific Computing Resources at

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Lawrence Berkeley, Oak Ridge National Laboratories | Department of Energy Proposals to Use Scientific Computing Resources at Lawrence Berkeley, Oak Ridge National Laboratories Energy Department Seeks Proposals to Use Scientific Computing Resources at Lawrence Berkeley, Oak Ridge National Laboratories June 29, 2005 - 1:50pm WASHINGTON, DC -- Secretary of Energy Samuel W. Bodman announced today that DOE's Office of Science is seeking proposals to support computational science projects

  2. National Energy Research Scientific Computing Center

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... use include on-demand computing functionality for ... mega-electron volts per meter before the metal breaks down. ... been collaborating with earth scientists at Berkeley Lab ...

  3. Initial explorations of ARM processors for scientific computing...

    Office of Scientific and Technical Information (OSTI)

    DOE Contract Number: AC02-07CH11359 Resource Type: Conference Resource Relation: Conference: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics ...

  4. NERSC, Cray Move Forward With Next-Generation Scientific Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    NERSC, Cray Move Forward With Next-Generation Scientific Computing NERSC, Cray Move Forward With Next-Generation Scientific Computing New Cray XC40 will be first supercomputer in Berkeley Lab's new Computational Research and Theory facility April 22, 2015 Contact: Jon Bashor, jbashor@lbl.gov, 510-486-5849 The Cori Phase 1 system will be the first supercomputer installed in the new Computational Research and Theory Facility now in the final stages of construction at Lawrence Berkeley

  5. Scientific computations section monthly report, November 1993

    SciTech Connect (OSTI)

    Buckner, M.R.

    1993-12-30

    This progress report from the Savannah River Technology Center contains abstracts from papers from the computational modeling, applied statistics, applied physics, experimental thermal hydraulics, and packaging and transportation groups. Specific topics covered include: engineering modeling and process simulation, criticality methods and analysis, plutonium disposition.

  6. Helping Advance the Scientific Foundation that Enables Major Efficiency

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Improvements Helping Advance the Scientific Foundation that Enables Major Efficiency Improvements - Sandia Energy Energy Search Icon Sandia Home Locations Contact Us Employee Locator Energy & Climate Secure & Sustainable Energy Future Stationary Power Energy Conversion Efficiency Solar Energy Wind Energy Water Power Supercritical CO2 Geothermal Natural Gas Safety, Security & Resilience of the Energy Infrastructure Energy Storage Nuclear Power & Engineering Grid Modernization

  7. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect (OSTI)

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  8. Collaboration to advance high-performance computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Collaboration to advance high-performance computing Collaboration to advance high-performance computing LANL and EMC will enhance, design, build, test, and deploy new cutting-edge technologies to meet some of the most difficult information technology challenges. December 21, 2011 Los Alamos National Laboratory sits on top of a once-remote mesa in northern New Mexico with the Jemez mountains as a backdrop to research and innovation covering multi-disciplines from bioscience, sustainable energy

  9. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect (OSTI)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  10. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    SciTech Connect (OSTI)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address these challenges and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight previously held Scientific Grand Challenges Workshop Series.

  11. Scientific and technological advancements in inertial fusion energy

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Hinkel, D. E.

    2013-09-26

    Scientific advancements in inertial fusion energy (IFE) were reported on at the IAEA Fusion Energy Conference, October 2012. Results presented transect the different ways to assemble the fuel, different scenarios for igniting the fuel, and progress in IFE technologies. The achievements of the National Ignition Campaign within the USA, using the National Ignition Facility (NIF) to indirectly drive laser fusion, have found beneficial the achievements in other IFE arenas such as directly driven laser fusion and target fabrication. Moreover, the successes at NIF have pay-off to alternative scenarios such as fast ignition, shock ignition, and heavy-ion fusion as well as to directly driven laser fusion. As a result, this synergy is summarized here, and future scientific studies are detailed.

  12. Scientific and technological advancements in inertial fusion energy

    SciTech Connect (OSTI)

    Hinkel, D. E.

    2013-09-26

    Scientific advancements in inertial fusion energy (IFE) were reported on at the IAEA Fusion Energy Conference, October 2012. Results presented transect the different ways to assemble the fuel, different scenarios for igniting the fuel, and progress in IFE technologies. The achievements of the National Ignition Campaign within the USA, using the National Ignition Facility (NIF) to indirectly drive laser fusion, have found beneficial the achievements in other IFE arenas such as directly driven laser fusion and target fabrication. Moreover, the successes at NIF have pay-off to alternative scenarios such as fast ignition, shock ignition, and heavy-ion fusion as well as to directly driven laser fusion. As a result, this synergy is summarized here, and future scientific studies are detailed.

  13. A Component Architecture for High-Performance Scientific Computing

    SciTech Connect (OSTI)

    Bernholdt, D E; Allan, B A; Armstrong, R; Bertrand, F; Chiu, K; Dahlgren, T L; Damevski, K; Elwasif, W R; Epperly, T W; Govindaraju, M; Katz, D S; Kohl, J A; Krishnan, M; Kumfert, G; Larson, J W; Lefantzi, S; Lewis, M J; Malony, A D; McInnes, L C; Nieplocha, J; Norris, B; Parker, S G; Ray, J; Shende, S; Windus, T L; Zhou, S

    2004-12-14

    The Common Component Architecture (CCA) provides a means for software developers to manage the complexity of large-scale scientific simulations and to move toward a plug-and-play environment for high-performance computing. In the scientific computing context, component models also promote collaboration using independently developed software, thereby allowing particular individuals or groups to focus on the aspects of greatest interest to them. The CCA supports parallel and distributed computing as well as local high-performance connections between components in a language-independent manner. The design places minimal requirements on components and thus facilitates the integration of existing code into the CCA environment. The CCA model imposes minimal overhead to minimize the impact on application performance. The focus on high performance distinguishes the CCA from most other component models. The CCA is being applied within an increasing range of disciplines, including combustion research, global climate simulation, and computational chemistry.
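    To make the component-and-port idea concrete, here is a schematic Python sketch of a "provides/uses port" pattern. The real CCA defines its interfaces through SIDL/Babel and a framework Services API, so every name below (SolverPort, register_provides, get_port) is illustrative rather than the actual CCA API.

        class SolverPort:
            # An abstract "port": the contract a providing component implements
            # and a using component calls through.
            def solve(self, rhs):
                raise NotImplementedError

        class ToySolver(SolverPort):
            # A component that *provides* SolverPort (stand-in for a real solver).
            def solve(self, rhs):
                return [0.25 * v for v in rhs]

        class ToyFramework:
            # A toy stand-in for the framework that wires provided and used ports.
            def __init__(self):
                self._ports = {}
            def register_provides(self, name, port):
                self._ports[name] = port
            def get_port(self, name):
                return self._ports[name]

        fw = ToyFramework()
        fw.register_provides("solver", ToySolver())
        result = fw.get_port("solver").solve([1.0, 2.0, 3.0])  # the "using" side
        print(result)

    The value of the pattern, as the record notes, is that the solver could be swapped for an independently developed implementation without changing the using code, and a real framework adds language interoperability and parallel/distributed connections on top of this wiring.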

  14. A Computing Environment to Support Repeatable Scientific Big Data Experimentation of World-Wide Scientific Literature

    SciTech Connect (OSTI)

    Schlicher, Bob G; Kulesz, James J; Abercrombie, Robert K; Kruse, Kara L

    2015-01-01

    A principal tenet of the scientific method is that experiments must be repeatable and relies on ceteris paribus (i.e., all other things being equal). As a scientific community involved in data sciences, we must investigate ways to establish an environment where experiments can be repeated. We can no longer merely allude to where the data comes from; we must add rigor to the data collection and management process from which our analysis is conducted. This paper describes a computing environment to support repeatable scientific big data experimentation of world-wide scientific literature, and recommends a system that is housed at the Oak Ridge National Laboratory in order to provide value to investigators from government agencies, academic institutions, and industry entities. The described computing environment also adheres to the recently instituted digital data management plan mandated by multiple US government agencies, which involves all stages of the digital data life cycle including capture, analysis, sharing, and preservation. It particularly focuses on the sharing and preservation of digital research data. The details of this computing environment are explained within the context of cloud services by the three layer classification of Software as a Service, Platform as a Service, and Infrastructure as a Service.

  15. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    SciTech Connect (OSTI)

    Peisert, Sean; Potok, Thomas E.; Jones, Todd

    2015-06-03

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long term (10 to +20 year) cybersecurity fundamental basic research and development challenges, strategies and roadmap facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three

  16. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect (OSTI)

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short in providing real-time information to be predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.
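    As a toy illustration only of screening high-speed phasor measurements (not the algorithms or computing platforms of this paper), the sketch below flags frequency samples that leave an assumed operating band; the nominal frequency band width and the sample values are invented for the example.

        NOMINAL_HZ = 60.0     # assumed nominal grid frequency
        BAND_HZ = 0.05        # assumed alarm band, not a standard or a value from the paper

        def flag_excursions(freq_samples):
            # Return (sample index, frequency) for every reading outside the band.
            return [(i, f) for i, f in enumerate(freq_samples)
                    if abs(f - NOMINAL_HZ) > BAND_HZ]

        stream = [60.01, 59.98, 60.07, 59.93, 60.00]  # synthetic PMU frequency readings
        print(flag_excursions(stream))                # -> [(2, 60.07), (3, 59.93)]

    Real predictive assessment goes well beyond threshold checks, running faster-than-real-time dynamic simulations seeded from such measurements; the sketch only shows where measurement streams enter the picture.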

  17. PNNL pushing scientific discovery through data intensive computing breakthroughs

    ScienceCinema (OSTI)

    Deborah Gracio; David Koppenaal; Ruby Leung

    2012-12-31

    The Pacific Northwest National Laboratory's approach to data intensive computing (DIC) is focused on three key research areas: hybrid hardware architectures, software architectures, and analytic algorithms. Advancements in these areas will help to address, and solve, DIC issues associated with capturing, managing, analyzing and understanding, in near real time, data at volumes and rates that push the frontiers of current technologies.

  18. Scientific Grand Challenges: Forefront Questions in Nuclear Science and the Role of High Performance Computing

    SciTech Connect (OSTI)

    Khaleel, Mohammad A.

    2009-10-01

    This report is an account of the deliberations and conclusions of the workshop on "Forefront Questions in Nuclear Science and the Role of High Performance Computing" held January 26-28, 2009, co-sponsored by the U.S. Department of Energy (DOE) Office of Nuclear Physics (ONP) and the DOE Office of Advanced Scientific Computing Research (ASCR). Representatives from the national and international nuclear physics communities, as well as from the high performance computing community, participated. The purpose of this workshop was to 1) identify forefront scientific challenges in nuclear physics and then determine which, if any, of these could be aided by high performance computing at the extreme scale; 2) establish how and why new high performance computing capabilities could address issues at the frontiers of nuclear science; 3) provide nuclear physicists the opportunity to influence the development of high performance computing; and 4) provide the nuclear physics community with plans for development of future high performance computing capability by DOE ASCR.

  19. Computational Design of Advanced Nuclear Fuels

    SciTech Connect (OSTI)

    Savrasov, Sergey; Kotliar, Gabriel; Haule, Kristjan

    2014-06-03

    The objective of the project was to develop a method for theoretical understanding of nuclear fuel materials whose physical and thermophysical properties can be predicted from first principles using a novel dynamical mean field method for electronic structure calculations. We concentrated our study on uranium, plutonium, and their oxides, nitrides, and carbides, as well as some rare earth materials whose 4f electrons provide a simplified framework for understanding the complex behavior of the f electrons. We addressed issues connected to the electronic structure, lattice instabilities, phonon and magnon dynamics, as well as thermal conductivity. This allowed us to evaluate characteristics of advanced nuclear fuel systems using computer-based simulations and avoid costly experiments.

  20. BUSINESS PLAN ADVANCED SIMULATION AND COMPUTING

    National Nuclear Security Administration (NNSA)

    BUSINESS PLAN: ADVANCED SIMULATION AND COMPUTING, 2015, NA-ASC-104R-15-Vol.1-Rev.0. Prepared by LLNL under Contract DE-AC52-07NA27344. This document was prepared as an account of work sponsored by an agency of the United States government. Neither the United States government nor Lawrence Livermore National Security, LLC, nor any of their employees makes any warranty, expressed or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any...

  1. Advanced Test Reactor - A National Scientific User Facility

    SciTech Connect (OSTI)

    Clifford J. Stanley

    2008-05-01

    The ATR is a pressurized, light-water moderated and cooled, beryllium-reflected nuclear research reactor with a maximum operating power of 250 MWth. The unique serpentine configuration of the fuel elements creates five main reactor power lobes (regions) and nine flux traps. In addition to these nine flux traps there are 68 additional irradiation positions in the reactor core reflector tank. There are also 34 low-flux irradiation positions in the irradiation tanks outside the core reflector tank. The ATR is designed to provide a test environment for the evaluation of the effects of intense radiation (neutron and gamma). Due to the unique serpentine core design, each of the five lobes can be operated at different powers and controlled independently. Options exist for the individual test trains and assemblies to be either cooled by the ATR coolant (i.e., exposed to ATR coolant flow rates, pressures, temperatures, and neutron flux) or to be installed in their own independent test loops where such parameters as temperature, pressure, flow rate, neutron flux, and energy can be controlled per experimenter specifications. The full-power maximum thermal neutron flux is ~1.0 x 10^15 n/cm^2-sec with a maximum fast flux of ~5.0 x 10^14 n/cm^2-sec. The Advanced Test Reactor, now a National Scientific User Facility, is a versatile tool in which a variety of nuclear reactor, nuclear physics, reactor fuel, and structural material irradiation experiments can be conducted. The cumulative effects of years of irradiation in a normal power reactor can be duplicated in a few weeks or months in the ATR due to its unique design, power density, and operating flexibility.
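
    As a rough worked example (not from the record) of why years of exposure can be compressed into months: neutron fluence is flux multiplied by time, so the acceleration factor is simply the ratio of the ATR fast flux quoted above to the fast flux seen in a commercial power reactor. The LWR flux value below is an illustrative assumption, not a number from the source.

        # Illustrative fluence comparison; the LWR flux is an assumed order-of-magnitude
        # value, not a figure from the record.
        ATR_FAST_FLUX = 5.0e14        # n/cm^2-s, maximum fast flux quoted for ATR
        LWR_FAST_FLUX = 1.0e13        # n/cm^2-s, assumed typical power-reactor fast flux

        SECONDS_PER_YEAR = 3.156e7
        years_in_lwr = 10.0
        target_fluence = LWR_FAST_FLUX * years_in_lwr * SECONDS_PER_YEAR   # n/cm^2

        time_in_atr_s = target_fluence / ATR_FAST_FLUX
        print(f"Target fluence: {target_fluence:.2e} n/cm^2")
        print(f"Equivalent ATR time: {time_in_atr_s / 86400:.0f} days")

    With these assumed numbers, roughly a decade of power-reactor exposure corresponds to on the order of a few months in the ATR, consistent with the claim in the abstract.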

  2. SciDAC Advances and Applications in Computational Beam Dynamics

    SciTech Connect (OSTI)

    Ryne, R.; Abell, D.; Adelmann, A.; Amundson, J.; Bohn, C.; Cary, J.; Colella, P.; Dechow, D.; Decyk, V.; Dragt, A.; Gerber, R.; Habib, S.; Higdon, D.; Katsouleas, T.; Ma, K.-L.; McCorquodale, P.; Mihalcea, D.; Mitchell, C.; Mori, W.; Mottershead, C.T.; Neri, F.; Pogorelov, I.; Qiang, J.; Samulyak, R.; Serafini, D.; Shalf, J.; Siegerist, C.; Spentzouris, P.; Stoltz, P.; Terzic, B.; Venturini, M.; Walstrom, P.

    2005-06-26

    SciDAC has had a major impact on computational beam dynamics and the design of particle accelerators. Particle accelerators--which account for half of the facilities in the DOE Office of Science Facilities for the Future of Science 20 Year Outlook--are crucial for US scientific, industrial, and economic competitiveness. Thanks to SciDAC, accelerator design calculations that were once thought impossible are now carried out routinely, and new challenging and important calculations are within reach. SciDAC accelerator modeling codes are being used to get the most science out of existing facilities, to produce optimal designs for future facilities, and to explore advanced accelerator concepts that may hold the key to qualitatively new ways of accelerating charged particle beams. In this poster we present highlights from the SciDAC Accelerator Science and Technology (AST) project Beam Dynamics focus area in regard to algorithm development, software development, and applications.

  3. DOE Issues Funding Opportunity for Advanced Computational and Modeling Research for the Electric Power System

    Broader source: Energy.gov [DOE]

    The objective of this Funding Opportunity Announcement (FOA) is to leverage scientific advancements in mathematics and computation for application to power system models and software tools, with the long-term goal of enabling real-time protection and control based on wide-area sensor measurements.

  4. Advanced Test Reactor National Scientific User Facility 2010 Annual Report

    SciTech Connect (OSTI)

    Mary Catherine Thelen; Todd R. Allen

    2011-05-01

    This is the 2010 ATR National Scientific User Facility Annual Report. This report provides an overview of the program for 2010, along with individual project reports from each of the university principal investigators. The report also describes the capabilities offered to university researchers here at INL and at the ATR NSUF partner facilities.

  5. High Performance Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    INL's high-performance computing center provides general-use scientific computing capabilities to support the lab's efforts in advanced...

  6. Final Scientific Report - Wireless and Sensing Solutions Advancing Industrial Efficiency

    SciTech Connect (OSTI)

    Budampati, Rama; McBrady, Adam; Nusseibeh, Fouad

    2009-09-28

    The project team's goal for the Wireless and Sensing Solution Advancing Industrial Efficiency award (DE-FC36-04GO14002) was to develop, demonstrate, and test a number of leading edge technologies that could enable the emergence of wireless sensor and sampling systems for the industrial market space. This effort combined initiatives in advanced sensor development, configurable sampling and deployment platforms, and robust wireless communications to address critical obstacles in enabling enhanced industrial efficiency.

  7. Final Technical Report - Center for Technology for Advanced Scientific...

    Office of Scientific and Technical Information (OSTI)

    that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes. ...

  8. High-Performance Computing for Advanced Smart Grid Applications...

    Office of Scientific and Technical Information (OSTI)

    Title: High-Performance Computing for Advanced Smart Grid Applications The power grid is becoming far more complex as a result of the grid evolution meeting an information ...

  9. Ames Lab 101: Improving Materials with Advanced Computing

    ScienceCinema (OSTI)

    Johnson, Duane

    2014-06-04

    Ames Laboratory's Chief Research Officer Duane Johnson talks about using advanced computing to develop new materials and predict what types of properties those materials will have.

  10. New partnership uses advanced computer science modeling to address...

    National Nuclear Security Administration (NNSA)

    New partnership uses advanced computer science modeling to address climate change Friday, August 29, 2014 - 10:26am Several national laboratories and institutions have joined ...

  11. Advanced Computing Tech Team | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    the Office of Science, and the National Nuclear Security Administration to deliver technologies that will be used to create new scientific insights into complex physical systems. ...

  12. #WomenInSTEM: A Physicist Focuses on Scientific Advancement | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    How Angela Capece got her start as a physicist at the Princeton Plasma Physics Laboratory. By Ben Dotson, Office of Public Affairs; video by Matty Greene. July 17, 2014. Watch how scientists at the Princeton Plasma Physics Lab are creating a star on Earth.

  13. advanced simulation and computing | National Nuclear Security...

    National Nuclear Security Administration (NNSA)

    NNSA's missions get a boost from brain-inspired, radically different computer design The first computers to contribute to the nation's nuclear security work used thousands of ...

  14. DOE Advanced Scientific Computing Advisory Committee (ASCAC): Workforce Subcommittee Letter

    SciTech Connect (OSTI)

    Chapman, Barbara; Calandra, Henri; Crivelli, Silvia; Dongarra, Jack; Hittinger, Jeffrey; Lathrop, Scott A.; Sarkar, Vivek; Stahlberg, Eric; Vetter, Jeffrey S.; Williams, Dean

    2014-07-23

    Simulation and computing are essential to much of the research conducted at the DOE national laboratories. Experts in the ASCR-relevant Computing Sciences, which encompass a range of disciplines including Computer Science, Applied Mathematics, Statistics, and domain Computational Sciences, are an essential element of the workforce in nearly all of the DOE national laboratories. This report seeks to identify the gaps and challenges facing DOE with respect to this workforce. This letter is ASCAC's response to the charge of February 19, 2014 to identify disciplines in which significantly greater emphasis in workforce training at the graduate or postdoctoral levels is necessary to address workforce gaps in current and future Office of Science mission needs.

  15. Scientific Application Requirements for Leadership Computing at the Exascale

    SciTech Connect (OSTI)

    Ahern, Sean; Alam, Sadaf R; Fahey, Mark R; Hartman-Baker, Rebecca J; Barrett, Richard F; Kendall, Ricky A; Kothe, Douglas B; Mills, Richard T; Sankaran, Ramanan; Tharrington, Arnold N; White III, James B

    2007-12-01

    The Department of Energy's Leadership Computing Facility, located at Oak Ridge National Laboratory's National Center for Computational Sciences, recently polled scientific teams that had large allocations at the center in 2007, asking them to identify computational science requirements for future exascale systems (capable of an exaflop, or 10^18 floating point operations per second). These requirements are necessarily speculative, since an exascale system will not be realized until the 2015-2020 timeframe, and are expressed where possible relative to a recent petascale requirements analysis of similar science applications [1]. Our initial findings, which beg further data collection, validation, and analysis, did in fact align with many of our expectations and existing petascale requirements, yet they also contained some surprises, complete with new challenges and opportunities. First and foremost, the breadth and depth of science prospects and benefits on an exascale computing system are striking. Without a doubt, they justify a large investment, even with its inherent risks. The possibilities for return on investment (by any measure) are too large to let us ignore this opportunity. The software opportunities and challenges are enormous. In fact, as one notable computational scientist put it, the scale of questions being asked at the exascale is tremendous and the hardware has gotten way ahead of the software. We are in grave danger of failing because of a software crisis unless concerted investments and coordinating activities are undertaken to reduce and close this hardware-software gap over the next decade. Key to success will be a rigorous requirement for natural mapping of algorithms to hardware in a way that complements (rather than competes with) compilers and runtime systems. The level of abstraction must be raised, and more attention must be paid to functionalities and capabilities that incorporate intent into data structures, are aware of memory hierarchy...

  16. Advanced Test Reactor National Scientific User Facility: Addressing advanced nuclear materials research

    SciTech Connect (OSTI)

    John Jackson; Todd Allen; Frances Marshall; Jim Cole

    2013-03-01

    The Advanced Test Reactor National Scientific User Facility (ATR NSUF), based at the Idaho National Laboratory in the United States, is supporting Department of Energy and industry research efforts to ensure the properties of materials in light water reactors are well understood. The ATR NSUF is providing this support through three main efforts: establishing unique infrastructure necessary to conduct research on highly radioactive materials, conducting research in conjunction with industry partners on life extension relevant topics, and providing training courses to encourage more U.S. researchers to understand and address LWR materials issues. In 2010 and 2011, several advanced instruments with capability focused on resolving nuclear material performance issues through analysis on the micro (10^-6 m) to atomic (10^-10 m) scales were installed primarily at the Center for Advanced Energy Studies (CAES) in Idaho Falls, Idaho. These instruments included a local electrode atom probe (LEAP), a field-emission gun scanning transmission electron microscope (FEG-STEM), a focused ion beam (FIB) system, a Raman spectrometer, and a nanoindenter/atomic force microscope. Ongoing capability enhancements intended to support industry efforts include completion of two shielded, irradiation assisted stress corrosion cracking (IASCC) test loops, the first of which will come online in early calendar year 2013, a pressurized and controlled chemistry water loop for the ATR center flux trap, and a dedicated facility intended to house post irradiation examination equipment. In addition to capability enhancements at the main site in Idaho, the ATR NSUF also welcomed two new partner facilities in 2011 and two new partner facilities in 2012: the Oak Ridge National Laboratory High Flux Isotope Reactor (HFIR) and associated hot cells and the University of California, Berkeley capabilities in irradiated materials analysis were added in 2011. In 2012, Purdue University's Interaction of Materials...

  17. Computational Advances in Applied Energy | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Computational Advances in Applied Energy: Friedmann-LLNL-SEAB.10.11.pdf (19.92 MB). Related documents: Director's Perspective by George Miller; Fact Sheet: Collaboration of Oak Ridge, Argonne, and Livermore (CORAL); QER - Comment of Canadian Hydropower Association.

  18. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership...

    Office of Scientific and Technical Information (OSTI)

    Authors: Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.; Easter, Richard C.; Elliott, Scott M.; Ghan, Steven J.; Liu, Xiaohong; ...

  19. Supporting Advanced Scientific Computing Research * Basic Energy Sciences * Biological

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    DNSSEC Implementation at ESnet. R. Kevin Oberman, Sr. Network Engineer, February 2, 2010. Why ESnet is signing: while not covered by the OMB mandate, ESnet supports several organizations which are required to sign; ESnet needs experience with DNSSEC to support these organizations effectively; and future mandates may cover ESnet. How ESnet is signing: a Secure64 Secure Signer appliance transfers zones from the existing master, public DNS servers transfer data from the appliance, and the deployment is compliant with all...

  20. Supporting Advanced Scientific Computing Research * Basic Energy Sciences * Biological

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Energy Sciences Network: Enabling Virtual Science. June 9, 2009. Steve Cotter (steve@es.net), Dept. Head, Energy Sciences Network, Lawrence Berkeley National Lab. The Department of Energy's Office of Science is one of the largest supporters of basic research in the physical sciences in the U.S. It directly supports the research of some 15,000 scientists, postdocs and graduate students at DOE laboratories, universities, and other...

  1. Supporting Advanced Scientific Computing Research * Basic Energy Sciences * Biological

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ESCC, Salt Lake City. Steve Cotter, Dept. Head (steve@es.net), Lawrence Berkeley National Lab. Outline: staff updates; network update; Advanced Networking Initiative (ANI); ESnet projects; infrastructure projects; staff projects. Staff update, new hires: Hing Chow, Project Manager (ANI); Chris Tracy, Network/Software Engineer (ANI); Andy Lake, Software Engineer (ANI); ...

  2. Energy Department Requests Proposals for Advanced Scientific Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Energy Department Report Finds Major Potential to Increase Clean Hydroelectric Power. April 17, 2012. Washington, D.C. -- As part of President Obama's all-out, all-of-the-above energy strategy, the Energy Department today released a renewable energy resource assessment detailing the potential to develop electric power generation at existing dams across the United...

  3. OSTIblog Articles in the Advanced Scientific Computing Research...

    Office of Scientific and Technical Information (OSTI)

    The models created will be used to simulate changes in the hydrological cycle, with a specific focus on precipitation and surface water in orographically complex regions such as ...

  4. Scientific Discovery through Advanced Computing (SciDAC) | U...

    Office of Science (SC) Website

    Historical information on the previous portfolios can be found on the SciDAC web site. ...

  5. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect (OSTI)

    Bramley, Randall B.

    2012-08-02

    Indiana University's SWIM activities have primarily been in three areas. All are completed, but we are continuing to work on two of them because refinements are useful to both DOE laboratories and the high performance computing community.

  6. NERSC, Cray Move Forward With Next-Generation Scientific Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... (a Cray XC30 system) and will include a number of advanced features designed to accelerate data-intensive applications: a large number of login/interactive nodes to ...

  7. Sandia Energy - High Performance Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  8. The implications of spatial locality on scientific computing...

    Office of Scientific and Technical Information (OSTI)

    Research Org: Sandia National Laboratories Sponsoring Org: USDOE Country of Publication: United States Language: English Subject: 97 MATHEMATICAL METHODS AND COMPUTING; BENCHMARKS; ...

  9. Scientific Computing at Los Alamos National Laboratory (Conference...

    Office of Scientific and Technical Information (OSTI)

    ... Research Org: Los Alamos National Laboratory (LANL). Sponsoring Org: DOE/LANL. Country of Publication: United States. Language: English. Subject: Mathematics & Computing (97)...

  10. Supercomputing and Advanced Computing at the National Labs | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Supercomputing and Advanced Computing at the National Labs. September 30, 2013: Lab Breakthrough: Supercomputing Power to Accelerate Fossil Energy Research. Learn how a new supercomputer at the National Energy Technology Laboratory will accelerate research into the next generation of fossil fuel systems. September 26, 2013: Infographic by Sarah Gerrity, Energy Department.

  11. Multicore Challenges and Benefits for High Performance Scientific Computing

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Nielsen, Ida M.B.; Janssen, Curtis L.

    2008-01-01

    Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction-level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. Here we discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
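
    A minimal sketch of the hybrid message-passing/multi-threading idea mentioned above, using mpi4py for the distributed-memory layer and NumPy (whose BLAS backend is typically multithreaded) for the on-node layer. The block decomposition and sizes are illustrative assumptions, not the paper's implementation, which targeted compiled MPI plus threading.

        # Hybrid distributed-memory matrix multiply sketch: MPI ranks own row blocks of A,
        # B is replicated, and each rank multiplies its block with NumPy (threaded BLAS on-node).
        # Run with e.g.:  mpiexec -n 4 python hybrid_matmul.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n = 1024                         # illustrative dimension, assumed divisible by `size`
        rows = n // size

        if rank == 0:
            A = np.random.rand(n, n)
            B = np.random.rand(n, n)
        else:
            A = None
            B = np.empty((n, n))

        # Distribute row blocks of A; replicate B on every rank.
        A_block = np.empty((rows, n))
        comm.Scatter(A, A_block, root=0)
        comm.Bcast(B, root=0)

        # On-node multithreading comes from the BLAS library behind the matmul.
        C_block = A_block @ B

        # Reassemble the full result on rank 0.
        C = np.empty((n, n)) if rank == 0 else None
        comm.Gather(C_block, C, root=0)

        if rank == 0:
            print("||C|| =", np.linalg.norm(C))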

  12. Energy Department Seeks Proposals to Use Scientific Computing...

    Office of Environmental Management (EM)

    ... machines, as well as five percent of the computer time at DOE's Argonne and Pacific ... DOE's Office of Science is the single largest supporter of basic research in the physical ...

  13. Data-aware distributed scientific computing for big-data problems...

    Office of Scientific and Technical Information (OSTI)

    Title: Data-aware distributed scientific computing for big-data problems in bio-surveillance.

  14. Laboratory Directed Research & Development Page National Energy Research Scientific Computing Center

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    National Energy Research Scientific Computing Center: T3E Individual Node Optimization. Michael Stewart, SGI/Cray, 4/9/98. Outline: Introduction; T3E Processor; T3E Local Memory; Cache Structure; Optimizing Codes for Cache Usage; Loop Unrolling; Other Useful Optimization Options; References. Introduction: the primary topic will be single processor...

  15. Sandia National Laboratories: Advanced Simulation and Computing: Facilities

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The Facilities, Operations and User Support (FOUS) program is responsible for operating and maintaining the computing systems procured by the Advanced Simulation and Computing (ASC) program, and for delivering additional computing related services to Defense Program customers located across the Nuclear Weapons Complex. Sandia has developed a robust User Support capability which provides various services to analysts,

  16. 2012 Scientific Collaborations at Extreme-Scale | U.S. DOE Office of

    Office of Science (SC) Website


  17. Scientific Grand Challenges: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale

    SciTech Connect (OSTI)

    Khaleel, Mohammad A.; Johnson, Gary M.; Washington, Warren M.

    2009-07-02

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) in partnership with the Office of Advanced Scientific Computing Research (ASCR) held a workshop on the challenges in climate change science and the role of computing at the extreme scale, November 6-7, 2008, in Bethesda, Maryland. At the workshop, participants identified the scientific challenges facing the field of climate science and outlined the research directions of highest priority that should be pursued to meet these challenges. Representatives from the national and international climate change research community as well as representatives from the high-performance computing community attended the workshop. This group represented a broad mix of expertise. Of the 99 participants, 6 were from international institutions. Before the workshop, each of the four panels prepared a white paper, which provided the starting place for the workshop discussions. These four panels of workshop attendees devoted their efforts to the following themes: Model Development and Integrated Assessment; Algorithms and Computational Environment; Decadal Predictability and Prediction; and Data, Visualization, and Computing Productivity. The recommendations of the panels are summarized in the body of this report.

  18. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    SciTech Connect (OSTI)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.
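
    A minimal illustration (not from the report) of what a verification test for scientific software can look like in practice: check a numerical routine against a known analytic answer and assert that the error shrinks at the expected rate as resolution increases. The trapezoid-rule example and tolerance are assumptions chosen only to make the idea concrete.

        import math

        def trapezoid(f, a, b, n):
            # Composite trapezoid rule on n subintervals.
            h = (b - a) / n
            total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
            return h * total

        def test_convergence_order():
            # Verification: the trapezoid rule should converge at 2nd order for smooth integrands.
            exact = 2.0                                    # integral of sin(x) on [0, pi]
            err_coarse = abs(trapezoid(math.sin, 0.0, math.pi, 50) - exact)
            err_fine = abs(trapezoid(math.sin, 0.0, math.pi, 100) - exact)
            observed_order = math.log2(err_coarse / err_fine)
            assert abs(observed_order - 2.0) < 0.1, observed_order

        if __name__ == "__main__":
            test_convergence_order()
            print("verification test passed")

    Validation, by contrast, compares code output against experimental data rather than analytic expectations; a formal program as recommended in the report institutionalizes both kinds of checks.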

  19. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Supported by the Office of Advanced Scientific Computing Research in the Department of Energy Office of Science under contract number DE-AC02-05CH11231. Application and System Memory Use, Configuration, and Problems on Bassi. Richard Gerber, Lawrence Berkeley National Laboratory, NERSC User Services. ScicomP 13, Garching bei München, Germany, July 17, 2007. Overview: About Bassi; Memory on Bassi; Large Page Memory (It's Great!); System Configuration; Large Page...

  20. Advanced computational tools for 3-D seismic analysis

    SciTech Connect (OSTI)

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations and techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  1. Scientific Process Automation Improves Data Interaction

    SciTech Connect (OSTI)

    Critchlow, Terence J.

    2009-09-30

    This is an article written for the September 2009 issue of Scientific Computing magazine about the work of the Scientific Process Automation (SPA) team of the U.S. Department of Energy (DOE) Scientific Discovery through Advanced Computing (SciDAC) program. The SPA team is focused on developing and deploying automated workflows for a variety of computational science domains. Scientific workflows are the formalization of a scientific process that is frequently and repetitively performed.
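
    A toy sketch (not the SPA team's software) of what formalizing a repetitive process as a workflow means: tasks declare their dependencies and a tiny engine runs them in dependency order, so the same pipeline can be re-executed automatically. The task names and the use of Python's standard-library graphlib are illustrative assumptions.

        # Toy workflow engine: tasks are functions with declared dependencies,
        # executed in topological order so a repetitive analysis can be automated.
        from graphlib import TopologicalSorter   # Python 3.9+ standard library

        def fetch():     print("fetching raw data")
        def clean():     print("cleaning data")
        def simulate():  print("running simulation")
        def analyze():   print("analyzing results")

        workflow = {            # task -> set of tasks it depends on
            "clean":    {"fetch"},
            "simulate": {"clean"},
            "analyze":  {"simulate", "clean"},
        }
        tasks = {"fetch": fetch, "clean": clean, "simulate": simulate, "analyze": analyze}

        for name in TopologicalSorter(workflow).static_order():
            tasks[name]()

    Production workflow systems add provenance tracking, fault recovery, and distributed execution on top of this basic dependency-ordering idea.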

  2. Heterogeneous high throughput scientific computing with APM X-Gene and Intel Xeon Phi

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Abdurachmanov, David; Bockelman, Brian; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Muzaffar, Shahzad

    2015-01-01

    Electrical power requirements will be a constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics. Performance-per-watt is a critical metric for the evaluation of computer architectures for cost-efficient computing. Additionally, future performance growth will come from heterogeneous, many-core, and high computing density platforms with specialized processors. In this paper, we examine the Intel Xeon Phi Many Integrated Cores (MIC) co-processor and Applied Micro X-Gene ARMv8 64-bit low-power server system-on-a-chip (SoC) solutions for scientific computing applications. As a result, we report our experience on software porting, performance and energy efficiency and evaluate the potential for use of such technologies in the context of distributed computing systems such as the Worldwide LHC Computing Grid (WLCG).
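
    A small sketch of the performance-per-watt bookkeeping behind such evaluations: the metric is simply useful work per second divided by average power draw, which makes heterogeneous platforms directly comparable. The event rates and wattages below are placeholders for illustration, not measurements from the paper.

        # Placeholder numbers for illustration only; real evaluations use measured
        # wall-clock throughput and metered power for each platform.
        platforms = {
            # name:              (events processed per second, average watts)
            "Xeon (reference)":  (120.0, 190.0),
            "Xeon Phi (MIC)":    (260.0, 245.0),
            "X-Gene ARMv8 SoC":  ( 45.0,  42.0),
        }

        for name, (events_per_s, watts) in platforms.items():
            perf_per_watt = events_per_s / watts
            print(f"{name:18s} {perf_per_watt:6.3f} events/s/W")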

  3. Department of Energy Designates the Idaho National Laboratory Advanced Test Reactor as a National Scientific User Facility

    Broader source: Energy.gov [DOE]

    WASHINGTON, DC - The U.S. Department of Energy (DOE) today designated the Idaho National Laboratory's (INL) Advanced Test Reactor (ATR) as a National Scientific User Facility.  Establishing the ATR...

  4. New classes of magnetoelectric materials promise advances in computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    New classes of magnetoelectric materials promise advances in computing technology. By Jared Sagoff, February 7, 2013. ARGONNE, Ill. - Although scientists have been aware that magnetism and electricity are two sides of the same proverbial coin for almost 150 years, researchers are still trying to find new ways to use a material's electric behavior to influence its magnetic behavior, or vice versa. Thanks to new research by an...

  5. Sandia National Laboratories: Advanced Simulation Computing: Verification &

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The Verification and Validation (V&V) program conducts two major activities at Sandia in support of high-fidelity simulations. The first is to perform assessments and studies that quantify confidence in Advanced Simulation and Computing (ASC) calculation results. The second activity develops and improves V&V and uncertainty quantification methods, metrics, and standards. Assessments: this project area conducts studies and assessments for Sandia's engineering...

  6. Sandia National Laboratories: Advanced Simulation and Computing: Integrated

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Within the Advanced Simulation and Computing (ASC) program, the Integrated Codes area (which includes codes such as Fuego) develops and improves predictive simulation tools to support U.S. stockpile stewardship. These large-scale codes incorporate physics and engineering models and specialized codes to predict, with reduced uncertainty, the behavior of weapons and their components in a variety of environments. In addition to supporting the stockpile, a number of other national security missions use...

  7. New partnership uses advanced computer science modeling to address climate

    National Nuclear Security Administration (NNSA)

    Friday, August 29, 2014. Several national laboratories and institutions have joined forces to develop and apply the most complete climate and Earth system model to address the most challenging and demanding climate change issues. Accelerated Climate Modeling for Energy, or ACME, is designed to accelerate the development and application of fully...

  8. Advanced Reactor Thermal Hydraulic Modeling | Argonne Leadership Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Temperature distribution illustrating thermal striping in a T-junction, computed on Intrepid with Nek5000 and visualized on Eureka with VisIt at the ALCF. Paul Fischer (ANL), Aleks Obabko (ANL), and Hank Childs (LBNL). Advanced Reactor Thermal Hydraulic Modeling. PI Name: Paul Fischer. PI Email: fischer@mcs.anl.gov. Institution: Argonne National Laboratory. Allocation Program: INCITE. Allocation Hours at ALCF: 25 Million. Year: 2012. Research Domain: Energy Technologies. The DOE Nuclear...

  9. Scientific Grand Challenges: Discovery In Basic Energy Sciences: The Role of Computing at the Extreme Scale - August 13-15, 2009, Washington, D.C.

    SciTech Connect (OSTI)

    Galli, Giulia; Dunning, Thom

    2009-08-13

    The U.S. Department of Energy’s (DOE) Office of Basic Energy Sciences (BES) and Office of Advanced Scientific Computing Research (ASCR) workshop in August 2009 on extreme-scale computing provided a forum for more than 130 researchers to explore the needs and opportunities that will arise due to expected dramatic advances in computing power over the next decade. This scientific community firmly believes that the development of advanced theoretical tools within chemistry, physics, and materials science—combined with the development of efficient computational techniques and algorithms—has the potential to revolutionize the discovery process for materials and molecules with desirable properties. Doing so is necessary to meet the energy and environmental challenges of the 21st century as described in various DOE BES Basic Research Needs reports. Furthermore, computational modeling and simulation are a crucial complement to experimental studies, particularly when quantum mechanical processes controlling energy production, transformations, and storage are not directly observable and/or controllable. Many processes related to the Earth’s climate and subsurface need better modeling capabilities at the molecular level, which will be enabled by extreme-scale computing.

  10. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    SciTech Connect (OSTI)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-01-01

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible to achieve several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows to effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by 'Big Data' will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
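
    A schematic sketch of the cloud-bursting idea, not GlideinWMS code: when the queue of idle jobs exceeds what the existing pilots can absorb, request additional cloud instances; when the queue drains, release them. The thresholds and the provisioning functions are hypothetical placeholders for whatever provider API a real frontend would call.

        # Schematic elasticity loop for bursting a pilot-based batch system into a cloud.
        # `provision_cloud_pilots` / `release_cloud_pilots` are hypothetical stand-ins.

        def provision_cloud_pilots(n):
            print(f"requesting {n} cloud pilot instances")

        def release_cloud_pilots(n):
            print(f"releasing {n} cloud pilot instances")

        def burst_decision(idle_jobs, running_pilots, cloud_pilots,
                           jobs_per_pilot=8, max_cloud=100):
            # Return how many cloud pilots to add (positive) or remove (negative).
            needed = -(-idle_jobs // jobs_per_pilot)        # ceiling division
            shortfall = needed - running_pilots
            if shortfall > 0:
                return min(shortfall, max_cloud - cloud_pilots)
            if idle_jobs == 0 and cloud_pilots > 0:
                return -cloud_pilots                        # drain the cloud when idle
            return 0

        if __name__ == "__main__":
            delta = burst_decision(idle_jobs=500, running_pilots=20, cloud_pilots=0)
            if delta > 0:
                provision_cloud_pilots(delta)
            elif delta < 0:
                release_cloud_pilots(-delta)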

  11. High-Performance Computing for Advanced Smart Grid Applications

    SciTech Connect (OSTI)

    Huang, Zhenyu; Chen, Yousu

    2012-07-06

    The power grid is becoming far more complex as a result of the grid evolution meeting an information revolution. Due to the penetration of smart grid technologies, the grid is evolving at an unprecedented speed and the information infrastructure is fundamentally improved with a large number of smart meters and sensors that produce several orders of magnitude more data. How to pull data in, perform analysis, and put information out in a real-time manner is a fundamental challenge in smart grid operation and planning. The future power grid requires high performance computing to be one of the foundational technologies in developing the algorithms and tools for the significantly increased complexity. New techniques and computational capabilities are required to meet the demands for higher reliability and better asset utilization, including advanced algorithms and computing hardware for large-scale modeling, simulation, and analysis. This chapter summarizes the computational challenges in the smart grid and the need for high performance computing, and presents examples of how high performance computing might be used for future smart grid operation and planning.
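
    A minimal sketch of the kind of embarrassingly parallel grid analysis such chapters typically have in mind (illustrative only, not the authors' code): contingency cases are independent steady-state evaluations, so they map cleanly onto many cores. Here Python's multiprocessing stands in for an MPI-based production tool, and the "power flow" is a fake placeholder calculation.

        # Parallel screening of line-outage contingencies on a toy grid.
        from multiprocessing import Pool

        LINES = [f"line-{i}" for i in range(200)]   # hypothetical branch list

        def evaluate_contingency(line):
            # Stand-in for a power-flow solve with `line` out of service.
            # Returns (line, worst post-contingency loading in percent).
            idx = int(line.split("-")[1])
            loading = 80.0 + (idx * 37) % 40        # fake, deterministic result
            return line, loading

        if __name__ == "__main__":
            with Pool() as pool:
                results = pool.map(evaluate_contingency, LINES)
            violations = [(l, x) for l, x in results if x > 100.0]
            print(f"{len(violations)} contingencies exceed 100% loading")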

  12. Previous Computer Science Award Announcements | U.S. DOE Office...

    Office of Science (SC) Website


  13. Advances in Computational Methods for X-Ray Optics III (Conference...

    Office of Scientific and Technical Information (OSTI)

    Conference: Advances in Computational Methods for X-Ray Optics III. Authors: ...

  14. About the ASCR Computer Science Program | U.S. DOE Office of Science (SC)

    Office of Science (SC) Website


  15. Previous Computer Science Award Announcements | U.S. DOE Office of Science

    Office of Science (SC) Website


  16. Computer Science Program | U.S. DOE Office of Science (SC)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  17. National Energy Research Scientific Computing Center | U.S. DOE Office of

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  18. Operational Philosophy for the Advanced Test Reactor National Scientific User Facility

    SciTech Connect (OSTI)

    J. Benson; J. Cole; J. Jackson; F. Marshall; D. Ogden; J. Rempe; M. C. Thelen

    2013-02-01

    In 2007, the Department of Energy (DOE) designated the Advanced Test Reactor (ATR) as a National Scientific User Facility (NSUF). At its core, the ATR NSUF Program combines access to a portion of the available ATR radiation capability, the associated required examination and analysis facilities at the Idaho National Laboratory (INL), and INL staff expertise with novel ideas provided by external contributors (universities, laboratories, and industry). These collaborations define the cutting edge of nuclear technology research in high-temperature and radiation environments, contribute to improved industry performance of current and future light-water reactors (LWRs), and stimulate cooperative research between user groups conducting basic and applied research. To make possible the broadest access to key national capability, the ATR NSUF formed a partnership program that also makes available access to critical facilities outside of the INL. Finally, the ATR NSUF has established a sample library that allows access to pre-irradiated samples as needed by national research teams.

  19. Scientific Grand Challenges Workshop Series | U.S. DOE Office of Science

    Office of Science (SC) Website


  20. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    SciTech Connect (OSTI)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank; Ma, Kwan-Liu; Geveci, Berk; Meredith, Jeremy

    2015-12-01

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.
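
    A stripped-down illustration of the in situ integration idea named above (an assumed toy, not XVis itself): instead of writing every time step to disk for later visualization, the simulation loop hands data to an analysis callback while it is still in memory, and only a small summary leaves the node.

        # Toy in situ coupling: the solver calls an analysis hook every few steps
        # instead of dumping full-resolution output for post hoc visualization.
        import numpy as np

        def analysis_hook(step, field):
            # In situ summary: runs inside the simulation, touches data in memory.
            print(f"step {step:4d}  min={field.min():.3f}  max={field.max():.3f}  "
                  f"mean={field.mean():.3f}")

        def run_simulation(n=256, steps=100, analyze_every=20):
            field = np.random.rand(n, n)
            for step in range(steps):
                # stand-in for a real stencil update
                field = 0.25 * (np.roll(field, 1, 0) + np.roll(field, -1, 0) +
                                np.roll(field, 1, 1) + np.roll(field, -1, 1))
                if step % analyze_every == 0:
                    analysis_hook(step, field)       # in situ: no full dump to disk

        if __name__ == "__main__":
            run_simulation()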

  1. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    SciTech Connect (OSTI)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Cook, A.; Deppe, J.; Edel, M.; Fischler, M.; Gaines, I.; Hance, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction. 10 refs., 7 figs.
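
    As a small aside on the hypercube interconnect mentioned above: in a d-dimensional hypercube of 2^d positions, two positions are linked exactly when their binary labels differ in one bit, so each has d neighbors and any two are at most d hops apart. The sketch below uses an assumed labeling for illustration, not ACP's actual wiring.

        def hypercube_neighbors(node, dim):
            # Neighbors of `node` in a dim-dimensional hypercube: flip one bit at a time.
            return [node ^ (1 << k) for k in range(dim)]

        def hop_distance(a, b):
            # Minimum hops between two positions = Hamming distance of their labels.
            return bin(a ^ b).count("1")

        if __name__ == "__main__":
            dim = 8                                   # 2**8 = 256 positions (illustrative)
            print(hypercube_neighbors(0b00000101, dim))
            print(hop_distance(0, 255))               # farthest apart: 8 hops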

  2. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect (OSTI)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  3. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    SciTech Connect (OSTI)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  4. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    SciTech Connect (OSTI)

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  5. Advanced Simulation & Computing FY09-FY10 Implementation Plan Volume 2, Rev. 0

    SciTech Connect (OSTI)

    Meisner, R; Perry, J; McCoy, M; Hopson, J

    2008-04-30

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC)1 is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  6. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    SciTech Connect (OSTI)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  7. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    SciTech Connect (OSTI)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  8. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    SciTech Connect (OSTI)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one that

  9. STATEMENT OF CONSIDERATIONS CLASS ADVANCE WAIVER OF THE GOVERNMENT...

    Broader source: Energy.gov (indexed) [DOE]

    Within DOE's Office of Science (SC), the mission of the Advanced Scientific Computing Research (ASCR) program is to discover, develop, and deploy computational and networking ...

  10. Fortran Transformational Tools in Support of Scientific Application Development for Petascale Computer Architectures

    SciTech Connect (OSTI)

    Sottille, Matthew

    2013-09-12

    This document is the final report for a multi-year effort building infrastructure to support tool development for Fortran programs. We also investigated static analysis and code transformation methods relevant to scientific programmers who are writing Fortran programs for petascale-class high performance computing systems. This report details our accomplishments and technical approaches, and provides information on where the research results and code may be obtained from an open source software repository. The report for the first year of the project that was performed at the University of Oregon prior to the PI moving to Galois, Inc. is included as an appendix.
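
    As a toy illustration of the kind of static analysis such infrastructure enables (this is not the project's actual tooling, and the check shown is generic), a few lines of Python can scan Fortran source for program units that omit IMPLICIT NONE:

    ```python
    # Trivial static-analysis sketch: flag Fortran program units that lack
    # IMPLICIT NONE. Illustrative only; not the infrastructure described above.
    import re

    UNIT_START = re.compile(r"^\s*(program|subroutine|function)\s+(\w+)", re.I)
    IMPLICIT_NONE = re.compile(r"^\s*implicit\s+none\b", re.I)

    def check_implicit_none(source: str):
        """Yield names of program units missing IMPLICIT NONE."""
        current, has_implicit = None, False
        for line in source.splitlines():
            m = UNIT_START.match(line)
            if m:
                if current and not has_implicit:
                    yield current
                current, has_implicit = m.group(2), False
            elif IMPLICIT_NONE.match(line):
                has_implicit = True
        if current and not has_implicit:
            yield current

    example = "subroutine demo(n)\ninteger n\nend subroutine\n"
    print(list(check_implicit_none(example)))   # ['demo']
    ```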

  11. Challenges and Opportunities in Using Automatic Differentiation with Object-Oriented Toolkits for Scientific Computing

    SciTech Connect (OSTI)

    Hovland, P; Lee, S; McInnes, L; Norris, B; Smith, B

    2001-04-17

    The increased use of object-oriented toolkits in large-scale scientific simulation presents new opportunities and challenges for the use of automatic (or algorithmic) differentiation (AD) techniques, especially in the context of optimization. Because object-oriented toolkits use well-defined interfaces and data structures, there is potential for simplifying the AD process. Furthermore, derivative computation can be improved by exploiting high-level information about numerical and computational abstractions. However, challenges to the successful use of AD with these toolkits also exist. Among the greatest challenges is balancing the desire to limit the scope of the AD process with the desire to minimize the work required of a user. They discuss their experiences in integrating AD with the PETSc, PVODE, and TAO toolkits and the plans for future research and development in this area.
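
    To make the idea of automatic differentiation concrete (independently of the PETSc/PVODE/TAO integration the abstract describes), a minimal forward-mode sketch can propagate dual numbers through ordinary Python arithmetic; the function and values below are hypothetical:

    ```python
    # Minimal forward-mode AD via dual numbers; an illustration of the concept,
    # not the toolkit integration discussed in the abstract.
    import math

    class Dual:
        """Carries a value and its derivative through arithmetic."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # Product rule: (uv)' = u'v + uv'
            return Dual(self.val * other.val,
                        self.dot * other.val + self.val * other.dot)
        __rmul__ = __mul__

    def dsin(x):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

    def f(x):
        return x * x * dsin(x)      # f(x) = x^2 sin(x)

    x = Dual(1.5, 1.0)              # seed dx/dx = 1
    y = f(x)
    print(y.val, y.dot)             # value and exact derivative at x = 1.5
    ```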

  12. Large Scale Production Computing and Storage Requirements for...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Requirements for Advanced Scientific Computing Research: Target 2017 ASCRLogo.png This is an invitation-only review organized by the Department of Energy's Office of Advanced ...

  13. Advanced Test Reactor National Scientific User Facility (ATR NSUF) Monthly Report October 2014

    SciTech Connect (OSTI)

    Dan Ogden

    2014-10-01

    Advanced Test Reactor National Scientific User Facility (ATR NSUF) Monthly Report October 2014 Highlights • Rory Kennedy, Dan Ogden and Brenden Heidrich traveled to Germantown October 6-7, for a review of the Infrastructure Management mission with Shane Johnson, Mike Worley, Bradley Williams and Alison Hahn from NE-4 and Mary McCune from NE-3. Heidrich briefed the group on the project progress from July to October 2014 as well as the planned path forward for FY15. • Jim Cole gave two invited university seminars at Ohio State University and University of Florida, providing an overview of NSUF including available capabilities and the process for accessing facilities through the peer reviewed proposal process. • Jim Cole and Rory Kennedy co-chaired the NuMat meeting with Todd Allen. The meeting, sponsored by Elsevier publishing, was held in Clearwater, Florida, and is considered one of the premier nuclear fuels and materials conferences. Over 340 delegates attended with 160 oral and over 200 posters presented over 4 days. • Thirty-one pre-applications were submitted for NSUF access through the NE-4 Combined Innovative Nuclear Research Funding Opportunity Announcement. • Fourteen proposals were received for the NSUF Rapid Turnaround Experiment Summer 2014 call. Proposal evaluations are underway. • John Jackson and Rory Kennedy attended the Nuclear Fuels Industry Research meeting. Jackson presented an overview of ongoing NSUF industry research.

  14. DOE Announces $60 Million in Projects to Accelerate Scientific Discovery

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    DOE Announces $60 Million in Projects to Accelerate Scientific Discovery through Advanced Computing September 7, 2006 - 8:53am WASHINGTON, D.C. - The U.S. Department of Energy's (DOE) Office of Science today announced approximately $60 million in new awards annually for 30 computational science projects over the next three to five years. The projects

  15. Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing

    SciTech Connect (OSTI)

    Fletcher, James H.; Cox, Philip; Harrington, William J; Campbell, Joseph L

    2013-09-03

    ABSTRACT Project Title: Recovery Act: Advanced Direct Methanol Fuel Cell for Mobile Computing PROJECT OBJECTIVE The objective of the project was to advance portable fuel cell system technology towards the commercial targets of power density, energy density and lifetime. These targets were laid out in the DOE’s R&D roadmap to develop an advanced direct methanol fuel cell power supply that meets commercial entry requirements. Such a power supply will enable mobile computers to operate non-stop, unplugged from the wall power outlet, by using the high energy density of methanol fuel contained in a replaceable fuel cartridge. Specifically this project focused on balance-of-plant component integration and miniaturization, as well as extensive component, subassembly and integrated system durability and validation testing. This design has resulted in a pre-production power supply design and a prototype that meet the rigorous demands of consumer electronic applications. PROJECT TASKS The proposed work plan was designed to meet the project objectives, which corresponded directly with the objectives outlined in the Funding Opportunity Announcement: To engineer the fuel cell balance-of-plant and packaging to meet the needs of consumer electronic systems, specifically at power levels required for mobile computing. UNF used existing balance-of-plant component technologies developed under its current US Army CERDEC project, as well as a previous DOE project completed by PolyFuel, to further refine them to both miniaturize and integrate their functionality to increase the system power density and energy density. Benefits of UNF’s novel passive water recycling MEA (membrane electrode assembly) and the simplified system architecture it enabled formed the foundation of the design approach. The package design was hardened to address orientation independence, shock, vibration, and environmental requirements. Fuel cartridge and fuel subsystems were improved to ensure effective fuel

  16. Advanced Test Reactor National Scientific User Facility (ATR NSUF) Monthly Report November 2014

    SciTech Connect (OSTI)

    Soelberg, Renae

    2014-11-01

    Advanced Test Reactor National Scientific User Facility (ATR NSUF) Monthly Report November 2014 Highlights Rory Kennedy and Sarah Robertson attended the American Nuclear Society Winter Meeting and Nuclear Technology Expo in Anaheim, California, Nov. 10-13. ATR NSUF exhibited at the technology expo where hundreds of meeting participants had an opportunity to learn more about ATR NSUF. Dr. Kennedy briefed the Nuclear Engineering Department Heads Organization (NEDHO) on the workings of the ATR NSUF. • Rory Kennedy, James Cole and Dan Ogden participated in a reactor instrumentation discussion with Jean-Francois Villard and Christopher Destouches of CEA and several members of the INL staff. • ATR NSUF received approval from the NE-20 office to start planning the annual Users Meeting. The meeting will be held at INL, June 22-25. • Mike Worley, director of the Office of Innovative Nuclear Research (NE-42), visited INL Nov. 4-5. Milestones Completed • Recommendations for the Summer Rapid Turnaround Experiment awards were submitted to DOE-HQ Nov. 12 (Level 2 milestone due Nov. 30). Major Accomplishments/Activities • The University of California, Santa Barbara 2 experiment was unloaded from the GE-2000 at HFEF. The experiment specimen packs will be removed and shipped to ORNL for PIE. • The Terrani experiment, one of three FY 2014 new awards, was completed utilizing the Advanced Photon Source MRCAT beamline. The experiment investigated the chemical state of Ag and Pd in SiC shell of irradiated TRISO particles via X-ray Absorption Fine Structure (XAFS) spectroscopy. Upcoming Meetings/Events • The ATR NSUF program review meeting will be held Dec. 9-10 at L’Enfant Plaza. In addition to NSUF staff and users, NE-4, NE-5 and NE-7 representatives will attend the meeting. Awarded Research Projects Boise State University Rapid Turnaround Experiments (14-485 and 14-486) Nanoindentation and TEM work on the T91, HT9, HCM12A and 9Cr ODS specimens has been completed at

  17. Unsolicited Projects in 2012: Research in Computer Architecture...

    Office of Science (SC) Website

  18. DOE High Performance Computing Operational Review (HPCOR): Enabling Data-Driven Scientific Discovery at HPC Facilities

    SciTech Connect (OSTI)

    Gerber, Richard; Allcock, William; Beggio, Chris; Campbell, Stuart; Cherry, Andrew; Cholia, Shreyas; Dart, Eli; England, Clay; Fahey, Tim; Foertter, Fernanda; Goldstone, Robin; Hick, Jason; Karelitz, David; Kelly, Kaki; Monroe, Laura; Prabhat,; Skinner, David; White, Julia

    2014-10-17

    U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014 representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.

  19. ComputerIntegration.jpg | OSTI, US Dept of Energy Office of Scientific and

    Office of Scientific and Technical Information (OSTI)

    Technical Information ComputerIntegration

  20. About the Advanced Computing Tech Team | Department of Energy

    Energy Savers [EERE]

    ... for wide area visibility and advanced meter infrastructure (AMI) for dynamic pricing and demand response, can be a great benefit for electric system reliability and flexibility. ...

  1. September is Scientific Supercomputing Month

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    September is Scientific Supercomputing Month DOE celebrates the science and technology that drive modern discovery September 3, 2013 NERSC's flagship Cray XE6 system is called "Hopper" in honor of American computer scientist Grace Murray Hopper. Whether it's building a car battery that will take you 500 miles on a single charge or understanding the impact of Earth's changing climate on agriculture, advanced computing is a

  2. Secretary Bodman in Illinois Highlights Scientific Research Investments to Advance America's Innovation

    Broader source: Energy.gov [DOE]

    ROMEOVILLE, IL - U.S. Secretary of Energy Samuel Bodman today joined Rep. Judy Biggert (IL-13th) at a technology firm in Illinois to highlight scientific research investments that have led to...

  3. Advanced Simulation and Computing and Institutional R&D Programs | National

    National Nuclear Security Administration (NNSA)

    Nuclear Security Administration | (NNSA) Programs Advanced Simulation and Computing and Institutional R&D Programs The Advanced Simulation and Computing (ASC) Program supports the Department of Energy's National Nuclear Security Administration (DOE/NNSA) Defense Programs' use of simulation-based evaluation of the nation's nuclear weapons stockpile. The ASC Program is responsible for providing the simulation tools and computing environments required to qualify and certify the nation's

  4. Nationwide Buildings Energy Research enabled through an integrated Data Intensive Scientific Workflow and Advanced Analysis Environment

    SciTech Connect (OSTI)

    Kleese van Dam, Kerstin; Lansing, Carina S.; Elsethagen, Todd O.; Hathaway, John E.; Guillen, Zoe C.; Dirks, James A.; Skorski, Daniel C.; Stephan, Eric G.; Gorrissen, Willy J.; Gorton, Ian; Liu, Yan

    2014-01-28

    Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However as a result of these new capabilities the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we will describe the ongoing development work to create an integrated data intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is hereby not only a question of ease of use, but supports fundamental functions in the correlated analysis of simulation input, execution details and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data intensive workflow system and RHIPE the R for Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for the result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focusses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local scale simulations to a nationwide level by utilizing data intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results. As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3 year study of building energy demands for the US Eastern

  5. Sandia National Laboratories: Advanced Simulation Computing: Research &

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Collaboration Research & Collaboration Partnerships among the national laboratories, industry, and academia leverage a broad spectrum of talent and multiply the effectiveness of our research efforts. These collaborations help solve the challenges of developing computing platforms and simulation tools across a number of disciplines. Computer Science Research Institute The Computer Science Research Institute brings university faculty and students to Sandia for focused collaborative

  6. Computing Events

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Events Computing Events Spotlighting the most advanced scientific and technical applications in the world! Featuring exhibits of the latest and greatest technologies from industry, academia and government research organizations; many of these technologies will be seen for the first time in Denver. Supercomputing Conference 13 Denver, Colorado November 17-22, 2013 Spotlighting the most advanced scientific and technical applications in the world, SC13 will bring together the international

  7. NERSC seeks Computational Systems Group Lead

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    and advanced development for the supercomputer systems at NERSC (National Energy Research Scientific Computing ... workload demands within hiring and budget constraints. ...

  8. 2015 Annual Report - Argonne Leadership Computing Facility

    SciTech Connect (OSTI)

    Collins, James R.; Papka, Michael E.; Cerny, Beth A.; Coffey, Richard M.

    2015-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  9. 2014 Annual Report - Argonne Leadership Computing Facility

    SciTech Connect (OSTI)

    Collins, James R.; Papka, Michael E.; Cerny, Beth A.; Coffey, Richard M.

    2014-01-01

    The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines.

  10. New DOE-Sponsored Study Helps Advance Scientific Understanding of Potential CO2 Storage Impacts

    Broader source: Energy.gov [DOE]

    In another step forward toward improved scientific understanding of potential geologic carbon dioxide storage impacts, a new U.S. Department of Energy sponsored study has confirmed earlier research showing that proper site selection and monitoring is essential for helping anticipate and mitigate possible risks.

  11. Final Week of National Energy Action Month Features Technological Advances in Clean Energy and DOE Support of Scientific Research

    Broader source: Energy.gov [DOE]

    WASHINGTON—Department of Energy officials will attend events across the country next week to highlight the clean energy technological advances and scientific initiatives supported by DOE. During the final week of National Energy Action Month, senior DOE officials will participate in events from San Francisco to North Carolina to Washington. Throughout October, Secretary of Energy Ernest Moniz and other Department officials are participating in events to emphasize the important role that the Administration’s all-of-the-above energy strategy plays in strengthening America’s economic, environmental and national security future.

  12. A Computationally Based Approach to Homogenizing Advanced Alloys

    SciTech Connect (OSTI)

    Jablonski, P D; Cowen, C J

    2011-02-27

    We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We will discuss this approach as it is applied both to Ni based superalloys and to the computationally more complex case of alloys that solidify with more than one matrix phase as a result of segregation. Such is the case typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatment and its subsequent verification in real castings are presented.
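
    For orientation only, the decay of a sinusoidal segregation profile gives a quick order-of-magnitude homogenization-time estimate; the numbers below are hypothetical, and this simplified model is not a substitute for the Scheil/DICTRA calculations described above:

    ```python
    # Back-of-the-envelope homogenization estimate. A sinusoidal segregation
    # profile of wavelength L (dendrite arm spacing) decays roughly as
    # exp(-4*pi^2*D*t/L^2), so t ~ -ln(residual) * L^2 / (4*pi^2*D).
    import math

    def arrhenius_D(D0, Q, T):
        """Diffusivity D = D0 * exp(-Q / (R*T)); D0 in m^2/s, Q in J/mol, T in K."""
        R = 8.314
        return D0 * math.exp(-Q / (R * T))

    def homogenization_time(L, D, residual=0.01):
        """Time for the segregation amplitude to fall to `residual` of its start."""
        return -math.log(residual) * L**2 / (4 * math.pi**2 * D)

    # Hypothetical values for a slow substitutional solute in a Ni-base alloy:
    D = arrhenius_D(D0=1.0e-4, Q=280e3, T=1473.0)   # about 1200 C
    t = homogenization_time(L=100e-6, D=D)          # 100 micron arm spacing
    print(f"D = {D:.2e} m^2/s, t ~ {t / 3600:.0f} h")
    ```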

  13. Data-aware distributed scientific computing for big-data problems...

    Office of Scientific and Technical Information (OSTI)

    Country of Publication: United States Language: English Subject: Mathematics & Computing (97); Computer Science

  14. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect (OSTI)

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  15. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect (OSTI)

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  16. Final Scientific Report: A Scalable Development Environment for Peta-Scale Computing

    SciTech Connect (OSTI)

    Karbach, Carsten; Frings, Wolfgang

    2013-02-20

    This document is the final scientific report of the project DE-SC000120 (A scalable Development Environment for Peta-Scale Computing). The objective of this project is the extension of the Parallel Tools Platform (PTP) for applying it to peta-scale systems. PTP is an integrated development environment for parallel applications. It comprises code analysis, performance tuning, parallel debugging and system monitoring. The contribution of the Juelich Supercomputing Centre (JSC) aims to provide a scalable solution for system monitoring of supercomputers. This includes the development of a new communication protocol for exchanging status data between the target remote system and the client running PTP. The communication has to work for high latency. PTP needs to be implemented robustly and should hide the complexity of the supercomputer's architecture in order to provide a transparent access to various remote systems via a uniform user interface. This simplifies the porting of applications to different systems, because PTP functions as abstraction layer between parallel application developer and compute resources. The common requirement for all PTP components is that they have to interact with the remote supercomputer. E.g. applications are built remotely and performance tools are attached to job submissions and their output data resides on the remote system. Status data has to be collected by evaluating outputs of the remote job scheduler and the parallel debugger needs to control an application executed on the supercomputer. The challenge is to provide this functionality for peta-scale systems in real-time. The client server architecture of the established monitoring application LLview, developed by the JSC, can be applied to PTP's system monitoring. LLview provides a well-arranged overview of the supercomputer's current status. A set of statistics, a list of running and queued jobs as well as a node display mapping running jobs to their compute resources form
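
    The shape of such a monitoring exchange can be sketched as a client polling a status endpoint and parsing a compact document; this is a hypothetical stand-in, not the actual PTP/LLview protocol, and the host, port, and fields are invented:

    ```python
    # Toy status-polling client: request a status document from a (hypothetical)
    # monitoring daemon and tolerate high latency with a generous timeout.
    import json
    import socket

    REQUEST = b"GET /status\n"

    def fetch_status(host, port, timeout=30.0):
        """Return the parsed status document from the hypothetical monitor."""
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(REQUEST)
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:            # server closes the connection when done
                    break
                chunks.append(data)
        return json.loads(b"".join(chunks))

    # Usage (hypothetical endpoint and fields):
    # status = fetch_status("login.example-hpc.gov", 5555)
    # print(status["running_jobs"], status["queued_jobs"])
    ```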

  17. History | Argonne Leadership Computing Facility

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Leadership Computing The Argonne Leadership Computing Facility (ALCF) was established at Argonne National Laboratory in 2004 as part of a U.S. Department of Energy (DOE) initiative dedicated to enabling leading-edge computational capabilities to advance fundamental discovery and understanding in a broad range of scientific and engineering disciplines. Supported by the Advanced Scientific Computing Research (ASCR) program within DOE's Office of Science, the ALCF is one half of the DOE Leadership

  18. Advanced Computational Thermal Studies and their Assessment for Supercritical-Pressure Reactors (SCRs)

    SciTech Connect (OSTI)

    D. M. McEligot; J. Y. Yoo; J. S. Lee; S. T. Ro; E. Lurien; S. O. Park; R. H. Pletcher; B. L. Smith; P. Vukoslavcevic; J. M. Wallace

    2009-04-01

    The goal of this laboratory/university collaboration of coupled computational and experimental studies is to improve predictive methods for supercritical-pressure reactors. The general objective is to develop the supporting knowledge of advanced computational techniques needed for technology development of the concepts and their safety systems.

  19. Sandia National Laboratories: Advanced Simulation and Computing: Contact

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ASC Contact ASC Sandia ASC Program Contacts Program Director Bruce Hendrickson bahendr@sandia.gov Program Manager David Womble dewombl@sandia.gov Integrated Codes Lead Scott Hutchinson sahutch@sandia.gov Physics & Engineering Modeling Lead Jim Redmond jmredmo@sandia.gov Verification & Validation Lead Curt Nilsen canilse@sandia.gov Computational Systems & Software Engineering Lead Ken Alvin kfalvin@sandia.gov Facilities Operations & User Support Lead Tom Klitsner

  20. Advanced Scientific Computing Research Network Requirements Review

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Computing Research Network Requirements Review Final Report April 22-23, 2015 Disclaimer This document was prepared as an account of work sponsored by the United States Government. While this document is believed to contain correct information, neither the United States Government nor any agency thereof, nor The Regents of the University of California, nor any of their employees, makes any warranty, express or implied, or assumes any legal responsibility for the accuracy,

  1. Advanced Computational Methods for Security Constrained Financial Transmission Rights

    SciTech Connect (OSTI)

    Kalsi, Karanjit; Elbert, Stephen T.; Vlachopoulou, Maria; Zhou, Ning; Huang, Zhenyu

    2012-07-26

    Financial Transmission Rights (FTRs) are financial insurance tools to help power market participants reduce price risks associated with transmission congestion. FTRs are issued based on a process of solving a constrained optimization problem with the objective to maximize the FTR social welfare under power flow security constraints. Security constraints for different FTR categories (monthly, seasonal or annual) are usually coupled and the number of constraints increases exponentially with the number of categories. Commercial software for FTR calculation can only provide limited categories of FTRs due to the inherent computational challenges mentioned above. In this paper, first an innovative mathematical reformulation of the FTR problem is presented which dramatically improves the computational efficiency of the optimization problem. After having re-formulated the problem, a novel non-linear dynamic system (NDS) approach is proposed to solve the optimization problem. The new formulation and performance of the NDS solver is benchmarked against widely used linear programming (LP) solvers like CPLEX™ and tested on both standard IEEE test systems and large-scale systems using data from the Western Electricity Coordinating Council (WECC). The performance of the NDS is demonstrated to be comparable to, and in some cases better than, that of the widely used CPLEX algorithms. The proposed formulation and NDS-based solver are also easily parallelizable, enabling further computational improvement.
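
    The underlying problem class can be illustrated with a toy linear program that maximizes bid-weighted FTR awards subject to line-flow limits; the bids, shift factors, and limits below are invented, and this LP stands in for the conventional formulation rather than the paper's reformulation or its nonlinear-dynamic-system solver:

    ```python
    # Toy FTR auction as a linear program (illustration of the problem class
    # only). Maximize bid-weighted awards subject to line limits expressed
    # through hypothetical shift factors.
    import numpy as np
    from scipy.optimize import linprog

    bids = np.array([30.0, 22.0, 18.0])        # $/MW bid for each FTR request
    ptdf = np.array([[0.6, -0.3, 0.2],         # shift factors: line x request
                     [0.4,  0.5, -0.1]])
    limits = np.array([100.0, 80.0])           # MW limits per monitored line

    # linprog minimizes, so negate the bids; flows must stay within +/- limits.
    res = linprog(
        c=-bids,
        A_ub=np.vstack([ptdf, -ptdf]),
        b_ub=np.concatenate([limits, limits]),
        bounds=[(0, 200.0)] * len(bids),       # requested MW cap per bid
        method="highs",
    )
    print("awarded MW:", res.x, "objective ($):", -res.fun)
    ```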

  2. FY 2012 Budget Request Advanced Research Projects Agency - Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ... risk analyses * Advanced Modeling Grid Research - Continues development of computational, mathematical, and scientific ... needed to transform the tools and algorithms that ...

  3. Computing and Computational Sciences Directorate - Computer Science and

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Mathematics Division Computer Science and Mathematics Division The Computer Science and Mathematics Division (CSMD) is ORNL's premier source of basic and applied research in high-performance computing, applied mathematics, and intelligent systems. Our mission includes basic research in computational sciences and application of advanced computing systems, computational, mathematical and analysis techniques to the solution of scientific problems of national importance. We seek to work

  4. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect (OSTI)

    Moore, Kevin L.; Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  5. Computational Science and Engineering

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computational Science and Engineering NETL's Computational Science and Engineering competency consists of conducting applied scientific research and developing physics-based simulation models, methods, and tools to support the development and deployment of novel process and equipment designs. Research includes advanced computations to generate information beyond the reach of experiments alone by integrating experimental and computational sciences across different length and time scales. Specific

  6. Advances in x-ray computed microtomography at the NSLS

    SciTech Connect (OSTI)

    Dowd, B.A.; Andrews, A.B.; Marr, R.B.; Siddons, D.P.; Jones, K.W.; Peskin, A.M.

    1998-08-01

    The X-Ray Computed Microtomography workstation at beamline X27A at the NSLS has been utilized by scientists from a broad range of disciplines from industrial materials processing to environmental science. The most recent applications are presented here as well as a description of the facility that has evolved to accommodate a wide variety of materials and sample sizes. One of the most exciting new developments reported here resulted from a pursuit of faster reconstruction techniques. A Fast Filtered Back Transform (FFBT) reconstruction program has been developed and implemented that is based on a refinement of the gridding algorithm first developed for use with radio astronomical data. This program has reduced the reconstruction time to 8.5 sec for a 929 x 929 pixel² slice on an R10,000 CPU, more than an 8x reduction compared with the Filtered Back-Projection method.
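
    For readers who want to experiment with the conventional approach that the FFBT is benchmarked against, standard filtered back-projection is available in scikit-image (assuming a recent release with the filter_name argument); this is not the gridding-based FFBT code itself:

    ```python
    # Conventional filtered back-projection with scikit-image, for comparison
    # only; the FFBT/gridding program described above is a different, faster
    # algorithm. Assumes a recent scikit-image (filter_name, shepp_logan_phantom).
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import iradon, radon, resize

    image = resize(shepp_logan_phantom(), (256, 256))
    angles = np.linspace(0.0, 180.0, 360, endpoint=False)

    sinogram = radon(image, theta=angles)                      # simulated projections
    recon = iradon(sinogram, theta=angles, filter_name="ramp")

    print("mean absolute reconstruction error:", float(np.abs(recon - image).mean()))
    ```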

  7. The National Center for Biomedical Ontology: Advancing Biomedicine through Structured Organization of Scientific Knowledge

    SciTech Connect (OSTI)

    Rubin, Daniel L.; Lewis, Suzanna E.; Mungall, Chris J.; Misra, Sima; Westerfield, Monte; Ashburner, Michael; Sim, Ida; Chute, Christopher G.; Solbrig, Harold; Storey, Margaret-Anne; Smith, Barry; Day-Richter, John; Noy, Natalya F.; Musen, Mark A.

    2006-01-23

    The National Center for Biomedical Ontology (http://bioontology.org) is a consortium that comprises leading informaticians, biologists, clinicians, and ontologists funded by the NIH Roadmap to develop innovative technology and methods that allow scientists to record, manage, and disseminate biomedical information and knowledge in machine-processable form. The goals of the Center are: (1) to help unify the divergent and isolated efforts in ontology development by promoting high quality open-source, standards-based tools to create, manage, and use ontologies, (2) to create new software tools so that scientists can use ontologies to annotate and analyze biomedical data, (3) to provide a national resource for the ongoing evaluation, integration, and evolution of biomedical ontologies and associated tools and theories in the context of driving biomedical projects (DBPs), and (4) to disseminate the tools and resources of the Center and to identify, evaluate, and communicate best practices of ontology development to the biomedical community. The Center is working toward these objectives by providing tools to develop ontologies and to annotate experimental data, and by developing resources to integrate and relate existing ontologies as well as by creating repositories of biomedical data that are annotated using those ontologies. The Center is providing training workshops in ontology design, development, and usage, and is also pursuing research in ontology evaluation, quality, and use of ontologies to promote scientific discovery. Through the research activities within the Center, collaborations with the DBPs, and interactions with the biomedical community, our goal is to help scientists to work more effectively in the e-science paradigm, enhancing experiment design, experiment execution, data analysis, information synthesis, hypothesis generation and testing, and understand human disease.

  8. ADVANCED METHODS FOR THE COMPUTATION OF PARTICLE BEAM TRANSPORT AND THE COMPUTATION OF ELECTROMAGNETIC FIELDS AND MULTIPARTICLE PHENOMENA

    SciTech Connect (OSTI)

    Alex J. Dragt

    2012-08-31

    Since 1980, under the grant DEFG02-96ER40949, the Department of Energy has supported the educational and research work of the University of Maryland Dynamical Systems and Accelerator Theory (DSAT) Group. The primary focus of this educational/research group has been on the computation and analysis of charged-particle beam transport using Lie algebraic methods, and on advanced methods for the computation of electromagnetic fields and multiparticle phenomena. This Final Report summarizes the accomplishments of the DSAT Group from its inception in 1980 through its end in 2011.
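
    At lowest order, the Lie-algebraic maps studied by the DSAT group reduce to familiar linear transfer matrices; the sketch below shows only that linear layer (a drift and a thin quadrupole with hypothetical parameters) and none of the nonlinear Lie-algebraic machinery:

    ```python
    # Linear (first-order) beam transport in one transverse plane: transfer
    # matrices acting on (x, x'). Illustrative only; the Lie-algebraic methods
    # in the report extend this to nonlinear maps.
    import numpy as np

    def drift(L):
        """Field-free drift of length L (meters)."""
        return np.array([[1.0, L],
                         [0.0, 1.0]])

    def thin_quad(f):
        """Thin-lens quadrupole with focal length f (focusing for f > 0)."""
        return np.array([[1.0, 0.0],
                         [-1.0 / f, 1.0]])

    # One FODO-like cell, applied right to left: focus, drift, defocus, drift.
    M = drift(2.0) @ thin_quad(-5.0) @ drift(2.0) @ thin_quad(5.0)

    x0 = np.array([1e-3, 0.0])          # 1 mm offset, zero slope
    print("one-cell matrix:\n", M)
    print("particle after one cell:", M @ x0)
    ```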

  9. Large Scale Production Computing and Storage Requirements for...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    This is an invitation-only review organized by the Department of Energy's Office of Basic Energy Sciences (BES), Office of Advanced Scientific Computing Research (ASCR), and the ...

  10. Large Scale Production Computing and Storage Requirements for...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Large Scale Production Computing and Storage Requirements for High Energy Physics: Target 2017 ... Energy's Office of High Energy Physics (HEP), Office of Advanced Scientific ...

  11. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    SciTech Connect (OSTI)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  12. DOE's Office of Science Seeks Proposals for Expanded Large-Scale Scientific Computing

    Broader source: Energy.gov [DOE]

    WASHINGTON, D.C. -- Secretary of Energy Samuel W. Bodman announced today that DOE’s Office of Science is seeking proposals to support innovative, large-scale computational science projects to...

  13. High performance computing and communications: Advancing the frontiers of information technology

    SciTech Connect (OSTI)

    1997-12-31

    This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.

  14. Advanced Communication and Control for Distributed Energy Resource Integration: Phase 2 Scientific Report

    SciTech Connect (OSTI)

    BPL Global

    2008-09-30

    The objective of this research project is to demonstrate sensing, communication, information and control technologies to achieve a seamless integration of multivendor distributed energy resource (DER) units at aggregation levels that meet individual user requirements for facility operations (residential, commercial, industrial, manufacturing, etc.) and further serve as resource options for electric and natural gas utilities. The fully demonstrated DER aggregation system with embodiment of communication and control technologies will lead to real-time, interactive, customer-managed service networks to achieve greater customer value. Work on this Advanced Communication and Control Project (ACCP) consists of a two-phase approach for an integrated demonstration of communication and control technologies to achieve a seamless integration of DER units to reach progressive levels of aggregated power output. Phase I involved design and proof-of-design, and Phase II involves real-world demonstration of the Phase I design architecture. The scope of work for Phase II of this ACCP involves demonstrating the Phase I design architecture in large scale real-world settings while integrating with the operations of one or more electricity supplier feeder lines. The communication and control architectures for integrated demonstration shall encompass combinations of software and hardware components, including: sensors, data acquisition and communication systems, remote monitoring systems, metering (interval revenue, real-time), local and wide area networks, Web-based systems, smart controls, energy management/information systems with control and automation of building energy loads, and demand-response management with integration of real-time market pricing. For Phase II, BPL Global shall demonstrate the Phase I design for integrating and controlling the operation of more than 10 DER units, dispersed at various locations in one or more Independent System Operator (ISO) Control Areas, at

  15. FY05-FY06 Advanced Simulation and Computing Implementation Plan, Volume 2

    SciTech Connect (OSTI)

    Baron, A L

    2004-07-19

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program will require the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapon design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile life extension programs and the resolution of significant finding investigations (SFIs). This requires a balanced system of technical staff, hardware, simulation software, and computer science solutions.

  16. SciCADE 95: International conference on scientific computation and differential equations

    SciTech Connect (OSTI)

    1995-12-31

    This report consists of abstracts from the conference. Topics include algorithms, computer codes, and numerical solutions for differential equations. Linear and nonlinear as well as boundary-value and initial-value problems are covered. Various applications of these problems are also included.
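
    As a generic example of the initial-value problems covered by such proceedings (standard SciPy usage, not tied to any particular contribution), a mildly stiff nonlinear oscillator can be integrated as follows:

    ```python
    # Solve a standard nonlinear initial-value problem (Van der Pol oscillator)
    # with an adaptive stiff/non-stiff method. Generic illustration only.
    from scipy.integrate import solve_ivp

    def rhs(t, y):
        mu = 5.0
        return [y[1], mu * (1.0 - y[0] ** 2) * y[1] - y[0]]

    sol = solve_ivp(rhs, t_span=(0.0, 20.0), y0=[2.0, 0.0],
                    method="LSODA", rtol=1e-8, atol=1e-10)
    print("steps taken:", sol.t.size, "final state:", sol.y[:, -1])
    ```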

  17. Oak Ridge Leadership Computing Facility (OLCF) | U.S. DOE Office of Science

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  18. Eighth SIAM conference on parallel processing for scientific computing: Final program and abstracts

    SciTech Connect (OSTI)

    1997-12-31

    This SIAM conference is the premier forum for developments in parallel numerical algorithms, a field that has seen very lively and fruitful developments over the past decade, and whose health is still robust. Themes for this conference were: combinatorial optimization; data-parallel languages; large-scale parallel applications; message-passing; molecular modeling; parallel I/O; parallel libraries; parallel software tools; parallel compilers; particle simulations; problem-solving environments; and sparse matrix computations.

  19. Development of high performance scientific components for interoperability of computing packages

    SciTech Connect (OSTI)

    Gulabani, Teena Pratap

    2008-12-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard to develop as well as being a time consuming process; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach the Tuning and Analysis Utility (TAU) has been coupled with NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
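
    The plug-and-play idea behind such component frameworks can be sketched with a common "port" interface that hides package internals from the driver; this is a conceptual Python stand-in, not the actual CCA/Babel API, and the toy energy functions are invented:

    ```python
    # Conceptual sketch of component ports: QM and MM engines expose the same
    # interface so a driver can couple them without knowing their internals.
    # Not the actual CCA/Babel machinery used in the thesis.
    from abc import ABC, abstractmethod

    class EnergyPort(ABC):
        """Uniform interface a QM or MM engine would provide to the framework."""
        @abstractmethod
        def energy(self, coords):
            ...

    class ToyQM(EnergyPort):
        def energy(self, coords):
            return sum(x * x for x in coords)          # stand-in for a QM energy

    class ToyMM(EnergyPort):
        def energy(self, coords):
            return 0.5 * sum(abs(x) for x in coords)   # stand-in for an MM energy

    def qmmm_energy(qm: EnergyPort, mm: EnergyPort, qm_region, mm_region):
        # Additive QM/MM coupling expressed through the shared port only.
        return qm.energy(qm_region) + mm.energy(mm_region)

    print(qmmm_energy(ToyQM(), ToyMM(), [0.1, 0.2], [1.0, -0.5]))
    ```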

  20. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect (OSTI)

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. Our authors present Synergia's design principles and its performance on HPC platforms.

  1. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-11-01

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have been employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

  2. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. Our authors present Synergia's design principles and its performance on HPC platforms.

  3. High-Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    SciTech Connect (OSTI)

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-11-01

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing (HPC) are essential for accurately modeling them. In the past decade, the DOE SciDAC program has produced such accelerator-modeling tools, which have beem employed to tackle some of the most difficult accelerator science problems. In this article we discuss the Synergia beam-dynamics framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. We present the design principles, key physical and numerical models in Synergia and its performance on HPC platforms. Finally, we present the results of Synergia applications for the Fermilab proton source upgrade, known as the Proton Improvement Plan (PIP).

  4. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect (OSTI)

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.
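
    A minimal flavor of the signal-processing side can be given with a band-limited vibration feature that separates a clean signal from one carrying a hypothetical wear tone; this sketch is far simpler than the AE/ultrasonic processing and pattern recognition used in the project:

    ```python
    # Band-energy feature for vibration data (illustration only; the project's
    # noise reduction and pattern recognition were considerably more elaborate).
    import numpy as np

    def band_rms(signal, fs, f_lo, f_hi):
        """RMS spectral energy of `signal` in [f_lo, f_hi] Hz, sampled at fs Hz."""
        spec = np.fft.rfft(signal * np.hanning(len(signal)))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        band = (freqs >= f_lo) & (freqs <= f_hi)
        return np.sqrt(np.mean(np.abs(spec[band]) ** 2))

    fs = 10_000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    normal = np.sin(2 * np.pi * 60 * t) + 0.1 * np.random.randn(t.size)
    degraded = normal + 0.5 * np.sin(2 * np.pi * 1200 * t)   # hypothetical wear tone

    # A simple threshold on high-frequency band energy separates the two states.
    for name, sig in [("normal", normal), ("degraded", degraded)]:
        print(name, round(band_rms(sig, fs, 1000, 1500), 3))
    ```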

  5. Berkeley Lab Opens State-of-the-Art Facility for Computational...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Complementing NERSC and ESnet in the facility will be research programs in applied mathematics and computer science that develop new methods for advancing scientific discovery. ...

  6. Computing and Computational Sciences Directorate - Joint Institute for

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computational Sciences Joint Institute for Computational Sciences To help realize the full potential of new-generation computers for advancing scientific discovery, the University of Tennessee (UT) and Oak Ridge National Laboratory (ORNL) have created the Joint Institute for Computational Sciences (JICS). JICS combines the experience and expertise in theoretical and computational science and engineering, computer science, and mathematics in these two institutions and focuses these skills on

  7. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect (OSTI)

    Spentzouris, P.; /Fermilab; Cary, J.; /Tech-X, Boulder; McInnes, L.C.; /Argonne; Mori, W.; /UCLA; Ng, C.; /SLAC; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  8. DOE Science Showcase - High-Performance Computing | OSTI, US...

    Office of Scientific and Technical Information (OSTI)

    DOE Computing, Energy.gov DOE Office of Science Advanced Scientific Computing Research ... SciTech Connect National Library of EnergyBeta Science.gov Ciencia.Science.gov ...

  9. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    SciTech Connect (OSTI)

    McCoy, M.; Archer, B.; Hendrickson, B.

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this Implementation Plan (IP) is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised Work Authorization (WA) or subsequent IP.

  10. Construction of Blaze at the University of Illinois at Chicago: A Shared, High-Performance, Visual Computer for Next-Generation Cyberinfrastructure-Accelerated Scientific, Engineering, Medical and Public Policy Research

    SciTech Connect (OSTI)

    Brown, Maxine D.; Leigh, Jason

    2014-02-17

    The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation’s Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy’s Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for “Development of the Next-Generation CAVE Virtual Environment (NG-CAVE),” enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications enabled by the CAVE2/Blaze visual computing system are advancing scientific research and education in the U.S. and globally and helping to train the next-generation workforce.

  11. Unsolicited Projects in 2012: Research in Computer Architecture, Modeling,

    Office of Science (SC) Website

    and Evolving MPI for Exascale | U.S. DOE Office of Science (SC). Advanced Scientific Computing Research (ASCR)

  12. NERSC seeks Computational Systems Group Lead

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    seeks Computational Systems Group Lead NERSC seeks Computational Systems Group Lead January 6, 2011 by Katie Antypas Note: This position is now closed. The Computational Systems Group provides production support and advanced development for the supercomputer systems at NERSC. Manage the Computational Systems Group (CSG) which provides production support and advanced development for the supercomputer systems at NERSC (National Energy Research Scientific Computing Center). These systems, which

  13. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect (OSTI)

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line, non-invasive measurement technique based on gamma ray densitometry (i.e., nuclear gauge densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spouted diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design, and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains
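    When measurements of this kind serve as benchmark data for CFD validation, a typical first step is to quantify the discrepancy between measured and predicted profiles. A minimal sketch, using invented radial voidage values, might look like this:

    import numpy as np

    def profile_errors(measured, predicted):
        """RMSE and mean relative error between measured and CFD-predicted radial
        voidage profiles sampled at the same radial positions."""
        measured, predicted = np.asarray(measured), np.asarray(predicted)
        rmse = np.sqrt(np.mean((predicted - measured) ** 2))
        mre = np.mean(np.abs(predicted - measured) / np.abs(measured))
        return rmse, mre

    # Hypothetical voidage at five radial positions (r/R = 0 ... 1).
    measured = [0.82, 0.74, 0.61, 0.55, 0.52]   # e.g., from gamma ray CT
    predicted = [0.80, 0.76, 0.63, 0.52, 0.50]  # e.g., from a two-fluid CFD model
    rmse, mre = profile_errors(measured, predicted)
    print(f"RMSE = {rmse:.3f}, mean relative error = {mre:.1%}")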

  14. Amplify scientific discovery with artificial intelligence

    SciTech Connect (OSTI)

    Gil, Yolanda; Greaves, Mark T.; Hendler, James; Hirsch, Hyam

    2014-10-10

    Computing innovations have fundamentally changed many aspects of scientific inquiry. For example, advances in robotics, high-end computing, networking, and databases now underlie much of what we do in science such as gene sequencing, general number crunching, sharing information between scientists, and analyzing large amounts of data. As computing has evolved at a rapid pace, so too has its impact in science, with the most recent computing innovations repeatedly being brought to bear to facilitate new forms of inquiry. Recently, advances in Artificial Intelligence (AI) have deeply penetrated many consumer sectors, including for example Apple’s Siri™ speech recognition system, real-time automated language translation services, and a new generation of self-driving cars and self-navigating drones. However, AI has yet to achieve comparable levels of penetration in scientific inquiry, despite its tremendous potential in aiding computers to help scientists tackle tasks that require scientific reasoning. We contend that advances in AI will transform the practice of science as we are increasingly able to effectively and jointly harness human and machine intelligence in the pursuit of major scientific challenges.

  15. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computations - Sandia Energy

  16. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect (OSTI)

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.
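    FOQUS itself is not reproduced here; as a rough illustration of the underlying idea, the sketch below propagates uncertainty in a submodel parameter (a kinetic rate constant) through a toy process model by Monte Carlo sampling to obtain a distribution on a cost metric. The model form, parameter values, and uncertainty range are invented for illustration only, not drawn from CCSI.

    import numpy as np

    def capture_cost(rate_constant, solvent_flow):
        """Toy process model: cost per tonne of CO2 captured as a function of a kinetic
        rate constant and a solvent flow (purely illustrative, not a CCSI model)."""
        capture_fraction = 1.0 - np.exp(-rate_constant * solvent_flow)
        return 40.0 / capture_fraction + 0.8 * solvent_flow

    # Propagate a +/- 20% (1-sigma) uncertainty in the rate constant through the model.
    rng = np.random.default_rng(42)
    k_samples = np.clip(rng.normal(loc=1.0, scale=0.2, size=10_000), 0.1, None)
    costs = capture_cost(k_samples, solvent_flow=2.0)
    print(f"mean cost = {costs.mean():.1f}, 95th percentile = {np.percentile(costs, 95):.1f}")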

  17. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    SciTech Connect (OSTI)

    G. R. Odette; G. E. Lucas

    2005-11-15

    This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement ΔTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations, which can generally be accessed on the internet or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.
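    For context, the Master Curve relation referred to in item 3a above is conventionally written (e.g., in ASTM E1921) as the median fracture toughness of a 1T specimen,

        K_{Jc(\mathrm{med})}(T) = 30 + 70\,\exp\bigl[0.019\,(T - T_0)\bigr] \quad \mathrm{MPa\,\sqrt{m}}, \qquad T \text{ in } ^\circ\mathrm{C},

    so that irradiation embrittlement appears as a shift \Delta T_0 of the reference temperature, which is what the prediction model in item 3b addresses.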

  18. DOE's Office of Science Awards 18 Million Hours of Supercomputing Time to 15 Teams for Large-Scale Scientific Computing

    Office of Energy Efficiency and Renewable Energy (EERE)

    WASHINGTON, D.C. - Secretary of Energy Samuel W. Bodman announced today that DOE's Office of Science has awarded a total of 18.2 million hours of computing time on some of the world's most powerful...

  19. Large Scale Production Computing and Storage Requirements for Basic Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Sciences: Target 2017 Large Scale Production Computing and Storage Requirements for Basic Energy Sciences: Target 2017 BES-Montage.png This is an invitation-only review organized by the Department of Energy's Office of Basic Energy Sciences (BES), Office of Advanced Scientific Computing Research (ASCR), and the National Energy Research Scientific Computing Center (NERSC). The goal is to determine production high-performance computing, storage, and services that will be needed for BES to

  20. Large Scale Production Computing and Storage Requirements for Fusion Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Sciences: Target 2017 Large Scale Production Computing and Storage Requirements for Fusion Energy Sciences: Target 2017 The NERSC Program Requirements Review "Large Scale Production Computing and Storage Requirements for Fusion Energy Sciences" is organized by the Department of Energy's Office of Fusion Energy Sciences (FES), Office of Advanced Scientific Computing Research (ASCR), and the National Energy Research Scientific Computing Center (NERSC). The review's goal is to

  1. Large Scale Production Computing and Storage Requirements for High Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Physics: Target 2017 Large Scale Production Computing and Storage Requirements for High Energy Physics: Target 2017 HEPlogo.jpg The NERSC Program Requirements Review "Large Scale Computing and Storage Requirements for High Energy Physics" is organized by the Department of Energy's Office of High Energy Physics (HEP), Office of Advanced Scientific Computing Research (ASCR), and the National Energy Research Scientific Computing Center (NERSC). The review's goal is to characterize

  2. Computational Physics and Methods

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computational Physics and Methods: Performing innovative simulations of physics phenomena on tomorrow's scientific computing platforms. [Image captions: growth and emissivity of a young galaxy hosting a supermassive black hole, as calculated in the cosmological code ENZO and post-processed with the radiative transfer code AURORA; Rayleigh-Taylor turbulence imaging, the largest turbulence simulations to date.] Advanced multi-scale modeling, turbulence datasets, density iso-surfaces

  3. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    SciTech Connect (OSTI)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.; Crawford, Dona L.; Khaleel, Mohammad A.; John, Grosh; Stults, B. Ray; Lee, Steven L.; Hammond, Steven W.; Grover, Benjamin T.; Neely, Rob; Dudney, Lee Ann; Goldstein, Noah C.; Wells, Jack; Peltz, Jim

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation’s energy future.

  4. ASCR Leadership Computing Challenge (ALCC) | U.S. DOE Office of Science

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (SC) ASCR Leadership Computing Challenge (ALCC), Advanced Scientific Computing Research (ASCR)

  5. Nick Wright Named Advanced Technologies Group Lead

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Nick Wright Named Advanced Technologies Group Lead Nick Wright Named Advanced Technologies Group Lead February 4, 2013 Nick Wright has been named head of the National Energy Research Scientific Computing Center's (NERSC) Advanced Technologies Group (ATG), which focuses on understanding the requirements of current and emerging applications to make choices in hardware design and programming models that best serve the science needs of NERSC users. ATG specializes in benchmarking, system

  6. Computing at JLab

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    JLab --- Accelerator Controls CAD CDEV CODA Computer Center High Performance Computing Scientific Computing JLab Computer Silo

  7. Scientific Impact

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Impact Since its inception over twenty years ago, CAMS has achieved noteworthy scientific progress by developing new capabilities and by combining state-of-the-art tools and expertise to address important scientific challenges. Scientific Leadership CAMS scientists are recognized as scientific leaders in the field of AMS and the disciplines that it supports. Many CAMS staff participate on federal agency (NIH, NSF, NOAA and DOE) scientific review panels as well as giving a

  8. computers

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  9. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

    SciTech Connect (OSTI)

    Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

    2012-07-31

    This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools, development and test of several advanced control concepts and applications for Chemical Looping transport control and investigation of several sensor concepts and establishment of two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

  10. Throwback Thursdays Celebrate Scientific Supercomputing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Throwback Thursdays Celebrate Scientific Supercomputing A Cray-1 supercomputer arrives at the Magnetic Fusion Energy Computer Center in May 1978. ...

  11. Scientific and Organizational Awards | NREL

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific and Organizational Awards NREL's facility and staff are regularly recognized by scientific societies and community and government organizations. Find awards and honors by category below. Scientific and Technical Society Honors and Awards Scientific and technical society fellows are listed below, along with recent awards. American Association for the Advancement of Science 2015 Fellow -Brian Gregg 2014 Fellow - David S. Ginley 2013 Fellow - Martin Keller 2011 Fellow - Stanley Bull 2003

  12. Computing Frontier: Distributed Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing Frontier: Distributed Computing and Facility Infrastructures. Conveners: Kenneth Bloom (Department of Physics and Astronomy, University of Nebraska-Lincoln) and Richard Gerber (National Energy Research Scientific Computing Center (NERSC), Lawrence Berkeley National Laboratory). 1.1 Introduction: The field of particle physics has become increasingly reliant on large-scale computing resources to address the challenges of analyzing large datasets, completing specialized computations and

  13. Development of Computational Capabilities to Predict the Corrosion Wastage of Boiler Tubes in Advanced Combustion Systems

    SciTech Connect (OSTI)

    Kung, Steven; Rapp, Robert

    2014-08-31

    A comprehensive corrosion research project consisting of pilot-scale combustion testing and long-term laboratory corrosion study has been successfully performed. A pilot-scale combustion facility available at Brigham Young University was selected and modified to enable burning of pulverized coals under the operating conditions typical for advanced coal-fired utility boilers. Eight United States (U.S.) coals were selected for this investigation, with the test conditions for all coals set to have the same heat input to the combustor. In addition, the air/fuel stoichiometric ratio was controlled so that staged combustion was established, with the stoichiometric ratio maintained at 0.85 in the burner zone and 1.15 in the burnout zone. The burner zone represented the lower furnace of utility boilers, while the burnout zone mimicked the upper furnace areas adjacent to the superheaters and reheaters. From this staged combustion, approximately 3% excess oxygen was attained in the combustion gas at the furnace outlet. During each of the pilot-scale combustion tests, extensive online measurements of the flue gas compositions were performed. In addition, deposit samples were collected at the same location for chemical analyses. Such extensive gas and deposit analyses enabled detailed characterization of the actual combustion environments existing at the lower furnace walls under reducing conditions and those adjacent to the superheaters and reheaters under oxidizing conditions in advanced U.S. coal-fired utility boilers. The gas and deposit compositions were then carefully simulated in a series of 1000-hour laboratory corrosion tests, in which the corrosion performances of different commercial candidate alloys and weld overlays were evaluated at various temperatures for advanced boiler systems. Results of this laboratory study led to significant improvement in understanding of the corrosion mechanisms operating on the furnace walls as well as superheaters and reheaters in
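    The reported link between an overall stoichiometric ratio of 1.15 and roughly 3% excess oxygen at the furnace outlet can be sanity-checked with a common combustion rule of thumb relating excess air to dry flue-gas O2. Exact values depend on the coal composition, so the sketch below is an approximation only, not part of the study's methodology.

    def flue_gas_o2(stoichiometric_ratio):
        """Approximate dry flue-gas O2 (vol%) from the overall air/fuel stoichiometric ratio
        using the rule of thumb  excess_air ~ O2 / (20.9 - O2).  Rough check only."""
        excess_air = stoichiometric_ratio - 1.0
        return 20.9 * excess_air / (1.0 + excess_air)

    print(f"{flue_gas_o2(1.15):.1f}% O2")  # ~2.7%, consistent with the ~3% reported above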

  14. Computer Aided Design of Advanced Turbine Airfoil Alloys for Industrial Gas Turbines in Coal Fired Environments

    SciTech Connect (OSTI)

    G.E. Fuchs

    2007-12-31

    Recent initiatives for fuel flexibility, increased efficiency, and decreased emissions in power-generating industrial gas turbines (IGTs) have highlighted the need for the development of techniques to produce large single crystal or columnar grained, directionally solidified Ni-base superalloy turbine blades and vanes. In order to address the technical difficulties of producing large single crystal components, a program has been initiated to use computational materials science to better understand how alloy composition in potential IGT alloys and solidification conditions during processing affect castability, defect formation, and environmental resistance. This program will help to identify potential routes for the development of high strength, corrosion resistant airfoil/vane alloys, which would be a benefit to all IGTs, including small IGTs and even aerospace gas turbines. During the first year, collaboration with Siemens Power Corporation (SPC), Rolls-Royce, Howmet, and Solar Turbines identified and evaluated about 50 alloy compositions that are of interest for this potential application. In addition, alloy modifications to an existing alloy (CMSX-4) were also evaluated. Collaborating with SPC and using computational software at SPC to evaluate about 50 alloy compositions identified 5 candidate alloys for experimental evaluation. The results obtained from the experimentally determined phase transformation temperatures did not compare well to the calculated values in many cases. The effects of small additions of boundary strengtheners (i.e., C, B and N) to CMSX-4 were also examined. The calculated phase transformation temperatures were somewhat closer to the experimentally determined values than for the 5 candidate alloys discussed above. The calculated partitioning coefficients were similar for all of the CMSX-4 alloys, consistent with the experimentally determined segregation behavior. In general, it appears that computational materials science has become a
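    The partitioning coefficients mentioned above are conventionally defined, for each alloying element i, as the ratio of its concentration in the solid to that in the liquid during solidification,

        k_i = \frac{C_{S,i}}{C_{L,i}},

    with k_i < 1 indicating segregation of element i to the interdendritic liquid and k_i > 1 indicating segregation to the dendrite core.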

  15. Advanced Computational Thermal Fluid Physics (CTFP) and Its Assessment for Light Water Reactors and Supercritical Reactors

    SciTech Connect (OSTI)

    D.M. McEligot; K. G. Condie; G. E. McCreery; H. M. McIlroy; R. J. Pink; L.E. Hochreiter; J.D. Jackson; R.H. Pletcher; B.L. Smith; P. Vukoslavcevic; J.M. Wallace; J.Y. Yoo; J.S. Lee; S.T. Ro; S.O. Park

    2005-10-01

    Background: The ultimate goal of the study is the improvement of predictive methods for safety analyses and design of Generation IV reactor systems such as supercritical water reactors (SCWR) for higher efficiency, improved performance and operation, design simplification, enhanced safety and reduced waste and cost. The objective of this Korean / US / laboratory / university collaboration of coupled fundamental computational and experimental studies is to develop the supporting knowledge needed for improved predictive techniques for use in the technology development of Generation IV reactor concepts and their passive safety systems. The present study emphasizes SCWR concepts in the Generation IV program.

  16. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing and Storage Requirements for FES, J. Candy, General Atomics, San Diego, CA. Presented at the DOE Technical Program Review, Hilton Washington DC/Rockville, Rockville, MD, 19-20 March 2013. Drift waves and tokamak plasma turbulence, and their role in the context of fusion research: * Plasma performance: In tokamak plasmas, performance is limited by turbulent radial transport of both energy and particles. * Gradient-driven: This turbulent

  17. computers

    National Nuclear Security Administration (NNSA)

    California.

    Retired computers used for cybersecurity research at Sandia National...

  18. Accelerating scientific discovery : 2007 annual report.

    SciTech Connect (OSTI)

    Beckman, P.; Dave, P.; Drugan, C.

    2008-11-14

    As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego / SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance in using the Blue Gene/L and optimizing user applications. Both the Catalyst and the Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications

  19. Berkeley Lab Highlights HPC at Advanced Manufacturing Event

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Lab Highlights HPC at Advanced Manufacturing Event Berkeley Lab Highlights HPC at Advanced Manufacturing Event September 14, 2015 Peter Nugent, Division Deputy for Scientific Engagement in Berkeley Lab's Computational Research Division, and David Skinner, who leads NERSC's Strategic Partnerships effort, are participating this week in the third annual 2015 American Energy & Manufacturing Competitiveness Summit, where they will be discussing the increasing role of high performance computing in

  20. Throwback Thursdays Celebrate Scientific Supercomputing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Throwback Thursdays Celebrate Scientific Supercomputing A Cray-1 supercomputer arrives at the Magnetic Fusion Energy Computer Center in May 1978. The U.S. Department of Energy (DOE) was investing in scientific supercomputing long before the internet became the internet, and back when clouds only came in

    1. Scientific Bio

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Scientific Bio Director Deputy Director Leadership Team Advisory Board Directorate Staff Org Chart...

    2. Computational physics and applied mathematics capability review June 8-10, 2010 (Advance materials to committee members)

      SciTech Connect (OSTI)

      Lee, Stephen R

      2010-01-01

      Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability, evaluate the integration of this capability across the Laboratory and within the scientific community, examine the relevance of this capability to the Laboratory's programs, and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has a long and unique history at the Laboratory, starting from the inception of the Laboratory in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations. A vast array of technical disciplines and activities is included in this capability, from general numerical modeling, to coupled multi-physics simulations, to detailed domain science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into the following six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections): (1) Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions - which are key, historical, and fundamental strengths of the Laboratory; (2) Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution of partial differential equations

    3. Scientific and Technical Information Management

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2010-12-13

      The purpose of this directive is to ensure that STI is appropriately managed as part of the DOE mission to enable the advancement of scientific knowledge and technological innovation. Supersedes DOE O 241.1B.

    4. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

      SciTech Connect (OSTI)

      Scott, Bobby, R., Ph.D.

      2003-06-27

      OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based on
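      In its simplest form, the LNT model examined in this work assumes that excess stochastic risk rises linearly with dose D and has no threshold, in contrast to a threshold dose-response,

          R_{\mathrm{LNT}}(D) = R_0 + \alpha D, \qquad
          R_{\mathrm{thr}}(D) = \begin{cases} R_0, & D \le D_t \\ R_0 + \alpha\,(D - D_t), & D > D_t, \end{cases}

      where R_0 is the background risk, \alpha a risk coefficient, and D_t a hypothetical threshold dose; the biologically based models described above aim to determine which shape the cellular data actually support.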

    5. Hydrogen Materials Advanced Research Consortium

      Broader source: Energy.gov [DOE]

      An overview of the organization and scientific activities of the Hydrogen Materials—Advanced Research Consortium (HyMARC).

    6. Software and High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational physics, computer science, applied mathematics, statistics and the ... a fully operational supercomputing environment Providing Current Capability Scientific ...

    7. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

      SciTech Connect (OSTI)

      Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

      2009-01-01

      The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

    8. Scientific Visualization, Seeing the Unseeable

      ScienceCinema (OSTI)

      LBNL

      2009-09-01

      June 24, 2008 Berkeley Lab lecture: Scientific visualization transforms abstract data into readily comprehensible images, provides a vehicle for "seeing the unseeable," and plays a central role in both experimental and computational sciences. Wes Bethel, who heads the Scientific Visualization Group in the Computational Research Division, presents an overview of visualization and computer graphics, current research challenges, and future directions for the field.

    9. advanced simulation and computing

      National Nuclear Security Administration (NNSA)

      NIF, in particular the first Pu experiment on NIF, the return to operations of the TA-55 gas gun, a successful series of plutonium experiments on Joint Actinide Shock Physics...

    10. Large Scale Computing and Storage Requirements for Fusion Energy Sciences: Target 2017

      SciTech Connect (OSTI)

      Gerber, Richard

      2014-05-02

      The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,500 users working on some 650 projects that involve nearly 600 codes in a wide variety of scientific disciplines. In March 2013, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Fusion Energy Sciences (FES) held a review to characterize High Performance Computing (HPC) and storage requirements for FES research through 2017. This report is the result.