National Library of Energy BETA

Sample records for ascii comma delimited

  1. ascii2gdocs

    Energy Science and Technology Software Center (OSTI)

    2011-11-30

    Enables UNIX and Mac OS X command-line users to upload local ASCII files (individually or in batch mode) to Google Documents, where the ASCII text is converted to Google Document format using formatting the user can specify.

  2. Records Inventory Data Collection Software

    Energy Science and Technology Software Center (OSTI)

    1995-03-01

    DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited ASCII text file for data export into most records management software products.
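
    A minimal sketch of producing such a comma-delimited ASCII export in Python, assuming hypothetical index fields (record_id, title, date_filed, location); DATALINK's actual export layout is not documented here:

        import csv

        # Hypothetical record-index fields -- illustrative only, not DATALINK's real layout.
        FIELDS = ["record_id", "title", "date_filed", "location"]

        def export_records(records, path):
            """Write a list of dicts as a comma-delimited ASCII text file."""
            with open(path, "w", newline="", encoding="ascii") as f:
                writer = csv.DictWriter(f, fieldnames=FIELDS)
                writer.writeheader()
                for rec in records:
                    writer.writerow({k: rec.get(k, "") for k in FIELDS})

        export_records(
            [{"record_id": "0001", "title": "Site survey",
              "date_filed": "1995-03-01", "location": "Bldg 12"}],
            "records_export.csv",
        )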

  3. Convertor of MAD Programs to a Set of ASCII files to load into SYBASE

    Energy Science and Technology Software Center (OSTI)

    1995-07-12

    Used in Lattice support and maintenance; the current buffer in the Emacs editor is converted into a set of ASCII files (one for each MAD token type). These files are in a fixed format and are ready to be loaded into the database (Sybase).

  4. Find Energy Efficiency and Renewable Energy Incentives on OpenEI...

    Open Energy Info (EERE)

    browse more choices. To handle large sets of incentives, download a comma-delimited Microsoft Excel file. If you're looking for a particular incentive, try the Incentive Search...

  5. FIA-15-0007 - In the Matter of FOIA Group, Inc. | Department of Energy

    Broader source: Energy.gov (indexed) [DOE]

    25, 2015, OHA denied a FOIA Appeal filed by FOIA Group, Inc. (FGI). In its Appeal, FGI challenged a determination issued to it by the Office of Information Resources (OIR). In that determination, OIR concluded that the information sought by FGI could not be readily reproduced in the "COMMA DELIMITED TEXT FILE" (CDTF) or "TAB DELIMITED TEXT FILE" (TDTF) formats requested by FGI, and that converting the information sought into Excel spreadsheets would cost over $8,000. OIR

  6. Seismic analysis applied to the delimiting of a gas reservoir

    SciTech Connect (OSTI)

    Ronquillo, G.; Navarro, M.; Lozada, M.; Tafolla, C.

    1996-08-01

    We present the results of correlating seismic models with petrophysical parameters and well logs to mark the limits of a gas reservoir in sand lenses. To fulfill the objectives of the study, we used a data processing sequence that included wavelet manipulation, complex trace attributes and pseudovelocities inversion, along with several quality control schemes to insure proper amplitude preservation. Based on the analysis and interpretation of the seismic sections, several areas of interest were selected to apply additional signal treatment as preconditioning for petrophysical inversion. Signal classification was performed to control the amplitudes along the horizons of interest, and to be able to find an indirect interpretation of lithologies. Additionally, seismic modeling was done to support the results obtained and to help integrate the interpretation. The study proved to be a good auxiliary tool in the location of the probable extension of the gas reservoir in sand lenses.

  7. Updated Users' Guide for RSAP -- A Code for Display and Manipulation of Neutron Cross Section Data and SAMMY Fit Results

    SciTech Connect (OSTI)

    Sayer, R.O.

    2003-07-29

    RSAP [1] is a computer code for display and manipulation of neutron cross section data and selected SAMMY output. SAMMY [2] is a multilevel R-matrix code for fitting neutron time-of-flight cross-section data using Bayes' method. This users' guide provides documentation for the recently updated RSAP code (version 6). The code has been ported to the Linux platform, and several new features have been added, including the capability to read cross section data from ASCII pointwise ENDF files as well as double-precision PLT output from SAMMY. A number of bugs have been found and corrected, and the input formats have been improved. Input items are parsed so that items may be separated by spaces or commas.
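
    As a small illustration of accepting items separated by spaces or commas (a sketch of the general technique, not RSAP's actual parser):

        import re

        def parse_items(line):
            """Split an input line on spaces and/or commas, dropping empty tokens."""
            return [tok for tok in re.split(r"[,\s]+", line.strip()) if tok]

        # Comma-separated and space-separated input yield the same item list.
        assert parse_items("1.0, 2.5, 3e4") == parse_items("1.0 2.5   3e4") == ["1.0", "2.5", "3e4"]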

  8. U.S. Energy Information Administration (EIA) Indexed Site

    ELCOVR1 Bill coverage/Electricity $BLCOV23. NGCOVR1 Bill coverage/Natural gas $BLCOV23. STCOVR1 Bill coverage/Steam $BLCOV23. FKCOVR1 Bill coverage/Fuel oil-kerosene $BLCOV23. PRCOVR1 Bill coverage/Propane $BLCOV23. ELBTU1 Consumption (MBtu)/Electricity COMMA20. ELEXP1 Expenditure/Electricity COMMA12. ELCNS1 Consumption (Kwh) Electricity COMMA17. NGBTU1 Consumption (MBtu)/Natural gas COMMA20. NGEXP1 Expenditure/Natural gas COMMA12. NGCNS1 Consumption (ccf)/Natural gas COMMA17. PRBTU1 Consumption

  9. A SOAP Web Service for accessing MODIS land product subsets

    SciTech Connect (OSTI)

    SanthanaVannan, Suresh K; Cook, Robert B; Pan, Jerry Yun; Wilson, Bruce E

    2011-01-01

    Remote sensing data from satellites have provided valuable information on the state of the earth for several decades. Since March 2000, the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor on board NASA's Terra and Aqua satellites has been providing estimates of several land parameters useful in understanding earth system processes at global, continental, and regional scales. However, the HDF-EOS file format, specialized software needed to process the HDF-EOS files, data volume, and the high spatial and temporal resolution of MODIS data make it difficult for users wanting to extract small but valuable amounts of information from the MODIS record. To overcome this usability issue, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) developed a Web service that provides subsets of MODIS land products using Simple Object Access Protocol (SOAP). The ORNL DAAC MODIS subsetting Web service is a unique way of serving satellite data that exploits a fairly established and popular Internet protocol to allow users access to massive amounts of remote sensing data. The Web service provides MODIS land product subsets up to 201 x 201 km in a non-proprietary comma delimited text file format. Users can programmatically query the Web service to extract MODIS land parameters for real-time data integration into models and decision support tools, or to connect to workflow software. Information regarding the MODIS SOAP subsetting Web service is available on the World Wide Web (WWW) at http://daac.ornl.gov/modiswebservice.
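
    A hedged sketch of querying such a SOAP service with the Python zeep library and splitting the comma-delimited response; the WSDL URL, operation name, and parameters below are placeholders, not the documented ORNL DAAC interface:

        from zeep import Client

        # Placeholder WSDL location and operation -- consult the service documentation
        # at the URL given in the record above for the real interface.
        client = Client("https://example.org/modis_subset_service.wsdl")
        response = client.service.getSubset(     # hypothetical operation name
            latitude=36.0, longitude=-84.3, product="MOD13Q1",
            band="250m_16_days_NDVI", start="A2010001", end="A2010032",
        )

        # The subsets are described as comma-delimited text; split each line into fields.
        for line in str(response).splitlines():
            fields = line.split(",")
            print(fields)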

  10. Template:Define | Open Energy Information

    Open Energy Info (EERE)

    - OpenEI's definition of the term. This should be unique. (required) Aliases - Synonyms of the term, or phrases which have the same meaning. (comma delimited) Related -...

  11. Your Data Analysis & Visualization Needs Mira Performance Boot...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    .inp) CMAT (.cmat) CML (.cml) CTRL (.ctrl) Chombo (.hdf5, .h5) Claw (.claw) Comma Separated Values (.csv) Cosmology Files (.cosmo,...

  12. Clarian Power | Open Energy Information

    Open Energy Info (EERE)

    Power Place: Seattle, Washington Sector: Solar Website: www.clariantechnologies.comma Coordinates: 47.6062095, -122.3320708 Show Map Loading map... "minzoom":false,"mapp...

  13. Energy blogs | OpenEI Community

    Open Energy Info (EERE)

    Energy blogs Home > Community Filter Search Author Enter a comma separated list of user names. Tags My groups True False Apply Kmorales City of McPherson, Kansas, Board of...

  14. MHK ISDB/Instruments/Nortek Signature 1000/500 | Open Energy...

    Open Energy Info (EERE)

    Notes, Considerations & Recommendations Data conversion to ASCII and MatLab format. Call for quote and additional information. User Experience No user experiences...

  15. Rainfall Manipulation Plot Study (RaMPS)

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Blair, John [Kansas State University]; Fay, Phillip [USDA-ARS]; Knapp, Alan [Colorado State University]; Collins, Scott [University of New Mexico]; Smith, Melinda [Yale University]

    Data sets are available as ASCII files, in Excel spreadsheets, and in SAS format. (Taken from http://www.konza.ksu.edu/ramps/backgrnd.html)

  16. Monthly Energy Review - October 1999

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    October 26, 1999 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1), and...

  17. Monthly Energy Review - June 2000

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    June 27, 2000 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1), and...

  18. Monthly Energy Review - April 2000

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    April 26, 2000 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1), and...

  19. Monthly Energy Review - September 1999

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    September 27, 1999 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1),...

  20. Monthly Energy Review - December 1999

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    December 22, 1999 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1),...

  1. Monthly Energy Review - February 2000

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    February 24, 2000 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1),...

  2. Monthly Energy Review, January 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    January 27, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  3. Monthly Energy Review - March 2000

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    March 28, 2000 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1), and...

  4. Monthly Energy Review

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    December 23, 1997 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  5. Monthly Energy Review - January 2000

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    January 28, 2000 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1), and...

  6. Monthly Energy Review - May 2000

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    May 26, 2000 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1), and...

  7. Monthly Energy Review - July 2000

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    July 26, 2000 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1), and...

  8. Monthly Energy Review, October 1997

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    October 27, 1997 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  9. Monthly Energy Review, September 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    September 25, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of...

  10. Monthly Energy Review, November 1997

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    November 24, 1997 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  11. Monthly Energy Review - November 1999

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    November 23, 1999 Electronic Access The Monthly Energy Review is available on the Energy Information Administration's website in a variety of formats: * ASCII text, Lotus (wk1),...

  12. Los Alamos National Lab: Credit Card Payments

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Please enter all information. All fields required. This payment is for ... [select one]: Employee receivable Event registration Event sponsorship Insurance License and Royalty MCNP training Miscellaneous Parking citation Travel payment on account Email address: Enter the payment amount in the following format: 9999.99 - do not use comma (,) or $ Review Payment
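
    A minimal sketch of the kind of amount check the form describes (digits with two decimals, no comma or dollar sign); this is illustrative, not LANL's actual validation code:

        import re

        AMOUNT_RE = re.compile(r"\d+\.\d{2}")   # e.g. 9999.99; no comma (,) or $

        def valid_amount(text):
            return AMOUNT_RE.fullmatch(text.strip()) is not None

        assert valid_amount("9999.99")
        assert not valid_amount("$1,234.56")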

  13. --No Title--

    U.S. Energy Information Administration (EIA) Indexed Site

    PORDP3 57- 58 B-9Bg Percent out-patient health care HCOUTP3 60- 61 B-9Bh Percent ... 90- 93 COMMA9. B-9Bo Percent in-patient health care HCINP3 95- 96 B-10o Licensed bed ...

  14. Slide 1

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Figure: velocity u vs. Time (s) at z = 0.125 m and z = 0.425 m] * Developed in Matlab * Input: instantaneous velocity timeseries in ASCII format * Noise filtering methods:...

  15. Sandia Network Intrusion Detection Q.2931 Sensor Version 1.0

    Energy Science and Technology Software Center (OSTI)

    2002-09-20

    This program was developed to read, plot and translate data taken by another program (Monogrow) and make the data readable for processing by other programs like "EXCEL", or any program which requires ASCII data files.

  16. Monthly Energy Review, October 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    October 27, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  17. Monthly Energy Review, November 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    November 24, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  18. Monthly Energy Review, March 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    March 27, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  19. Monthly Energy Review, June 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    June 25, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the MER...

  20. Monthly Energy Review, August 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    August 25, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  1. Monthly Energy Review

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    May 26, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the MER...

  2. Monthly Energy Review - December 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    December 22, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the...

  3. Monthly Energy Review, July 1998

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    July 28, 1998 Electronic Access Monthly Energy Review (MER) data are also available through these electronic means: * ASCII text, Lotus (wk1), and Excel (xls) versions of the MER...

  4. --No Title--

    U.S. Energy Information Administration (EIA) Indexed Site

    Survey Public Use Data The data from the 1991 RTECS is distributed in dBase and ASCII format on three diskettes each. The data on the six diskettes have been compressed...

  5. SEDS CSV File Documentation: Price and Expenditure

    Gasoline and Diesel Fuel Update (EIA)

    Price and Expenditure Estimates The State Energy Data System (SEDS) comma-separated value (CSV) files contain the price and expenditure estimates shown in the tables located on the SEDS website. There are three files that contain estimates for all states and years. Prices contains the price estimates for all states and Expenditures contains the expenditure estimates for all states. The third file, Adjusted Consumption for Expenditure Calculations contains adjusted consumption estimates used in

  6. 1990 RECS Public Use Microdata Files

    Gasoline and Diesel Fuel Update (EIA)

    90 Microdata 1990 RECS Public Use Microdata Files Data for: 1990 Released: September 2008 The 1990 Residential Energy Consumption Survey (RECS) was designed by the Energy Information Administration (EIA) to provide information on how households in the United States and District of Columbia use energy within the home. The RECS Public Use Files are comma separated value (.txt) files. Each record corresponds to a single responding household. The smallest level of geographic detail available is the

  7. CSV File Documentation: Consumption

    Gasoline and Diesel Fuel Update (EIA)

    Consumption Estimates The State Energy Data System (SEDS) comma-separated value (CSV) files contain consumption estimates shown in the tables located on the SEDS website. There are four files that contain estimates for all states and years. Consumption in Physical Units contains the consumption estimates in physical units for all states; Consumption in Btu contains the consumption estimates in billion British thermal units (Btu) for all states. There are two data files for thermal conversion

  8. 1987 RECS Public Use Microdata Files

    Gasoline and Diesel Fuel Update (EIA)

    87 Microdata 1987 RECS Public Use Microdata Files Data for: 1987 Released: September 2008 The 1987 Residential Energy Consumption Survey (RECS) was designed by the Energy Information Administration (EIA) to provide information on how households in the United States and District of Columbia use energy within the home. The RECS Public Use Files are comma separated value (.txt) files. Each record corresponds to a single responding household. The smallest level of geographic detail available is the

  9. Energy Information Administration (EIA)- Commercial Buildings Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Consumption Survey (CBECS) Data. Microdata Released: January 2009. The 1989 CBECS Public Use Files are comma separated value (.csv) files that each contain 5,876 records. They represent commercial buildings from the 50 States and the District of Columbia. Each record corresponds to a single responding, in-scope sampled building, and contains

  10. T:\ClearanceEMEUConsumption\cbecs\pubuse89\txt\layouts&formats.txt

    U.S. Energy Information Administration (EIA) Indexed Site

    December, 2008 1989 CBECS Building Characteristics and Consumption and Expenditures for All Buildings Public Use Files This document contains all the file layouts and format codes for the 1989 Commercial Buildings Energy Consumption Survey (CBECS) building characteristics and consumption and expenditures public use files. The files themselves can be downloaded in CSV (comma separated values) files from the CBECS web site:

  11. Variable Definitions

    Gasoline and Diesel Fuel Update (EIA)

    CSV File Documentation Energy Source Tables This document explains the contents of the comma-separated value (CSV) files by energy source located on the State Energy Data System (SEDS) Update web page at http://www.eia.gov/state/seds/seds-data-fuel.cfm. There is a CSV file for the most recent year's data for each energy source as shown in the HTML and PDF data tables. In some cases there is one data file for two tables. The first record in each file contains the column headings. The first data
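
    A minimal sketch of reading such a file, assuming only what the documentation states (comma-separated values with the column headings in the first record); the file name is illustrative:

        import csv

        with open("seds_fuel.csv", newline="") as f:
            reader = csv.DictReader(f)   # first record supplies the column headings
            for row in reader:
                print(row)               # each data record becomes a dict keyed by heading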

  12. Search for: All records | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Excel (limit 2000) CSV (limit 5000) XML (limit ... on Web pages that can be downloaded to Excel or in delimited text formats that can be ...

  13. Cathodoluminescence Spectrum Imaging Software

    Energy Science and Technology Software Center (OSTI)

    2011-04-07

    The software developed for spectrum imaging is applied to the analysis of the spectrum series generated by our cathodoluminescence instrumentation. This software provides advanced processing capabilities such as: reconstruction of photon intensity (resolved in energy) and photon energy maps, extraction of the spectrum from selected areas, quantitative imaging mode, pixel-to-pixel correlation, spectrum line scans, ASCII output, filling routines, drift correction, etc.

  14. Circumsolar Radiation Data: The Lawrence Berkeley Laboratory Reduced Data Base

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Note that each data set is composed of 20 lines of information with each line consisting of 77 characters. These are archived ASCII files. [Information on sites, number of data sets, etc. taken from the online publication (out of print) at http://rredc.nrel.gov/solar/pubs/circumsolar/index.html]

  15. Geographical Distribution of Biomass Carbon in Tropical Southeast Asian Forests: A Database

    SciTech Connect (OSTI)

    Brown, S.

    2002-02-07

    A database was generated of estimates of geographically referenced carbon densities of forest vegetation in tropical Southeast Asia for 1980. A geographic information system (GIS) was used to incorporate spatial databases of climatic, edaphic, and geomorphological indices and vegetation to estimate potential (i.e., in the absence of human intervention and natural disturbance) carbon densities of forests. The resulting map was then modified to estimate actual 1980 carbon density as a function of population density and climatic zone. The database covers the following 13 countries: Bangladesh, Brunei, Cambodia (Campuchea), India, Indonesia, Laos, Malaysia, Myanmar (Burma), Nepal, the Philippines, Sri Lanka, Thailand, and Vietnam. The data sets within this database are provided in three file formats: ARC/INFO{trademark} exported integer grids, ASCII (American Standard Code for Information Interchange) files formatted for raster-based GIS software packages, and generic ASCII files with x, y coordinates for use with non-GIS software packages. This database includes ten ARC/INFO exported integer grid files (five with the pixel size 3.75 km x 3.75 km and five with the pixel size 0.25 degree longitude x 0.25 degree latitude) and 27 ASCII files. The first ASCII file contains the documentation associated with this database. Twenty-four of the ASCII files were generated by means of the ARC/INFO GRIDASCII command and can be used by most raster-based GIS software packages. The 24 files can be subdivided into two groups of 12 files each. These files contain real data values representing actual carbon and potential carbon density in Mg C/ha (1 megagram = 10{sup 6} grams) and integer- coded values for country name, Weck's Climatic Index, ecofloristic zone, elevation, forest or non-forest designation, population density, mean annual precipitation, slope, soil texture, and vegetation classification. One set of 12 files contains these data at a spatial resolution of 3.75 km, whereas the other set of 12 files has a spatial resolution of 0.25 degree. The remaining two ASCII data files combine all of the data from the 24 ASCII data files into 2 single generic data files. The first file has a spatial resolution of 3.75 km, and the second has a resolution of 0.25 degree. Both files also provide a grid-cell identification number and the longitude and latitude of the centerpoint of each grid cell. The 3.75-km data in this numeric data package yield an actual total carbon estimate of 42.1 Pg (1 petagram = 10{sup 15} grams) and a potential carbon estimate of 73.6 Pg; whereas the 0.25-degree data produced an actual total carbon estimate of 41.8 Pg and a total potential carbon estimate of 73.9 Pg. Fortran and SASTM access codes are provided to read the ASCII data files, and ARC/INFO and ARCVIEW command syntax are provided to import the ARC/INFO exported integer grid files. The data files and this documentation are available without charge on a variety of media and via the Internet from the Carbon Dioxide Information Analysis Center (CDIAC).

  16. Geographical Distribution of Biomass Carbon in Tropical Southeast Asian Forests: A Database

    SciTech Connect (OSTI)

    Brown, S

    2001-05-22

    A database was generated of estimates of geographically referenced carbon densities of forest vegetation in tropical Southeast Asia for 1980. A geographic information system (GIS) was used to incorporate spatial databases of climatic, edaphic, and geomorphological indices and vegetation to estimate potential (i.e., in the absence of human intervention and natural disturbance) carbon densities of forests. The resulting map was then modified to estimate actual 1980 carbon density as a function of population density and climatic zone. The database covers the following 13 countries: Bangladesh, Brunei, Cambodia (Campuchea), India, Indonesia, Laos, Malaysia, Myanmar (Burma), Nepal, the Philippines, Sri Lanka, Thailand, and Vietnam. The data sets within this database are provided in three file formats: ARC/INFOTM exported integer grids, ASCII (American Standard Code for Information Interchange) files formatted for raster-based GIS software packages, and generic ASCII files with x, y coordinates for use with non-GIS software packages. This database includes ten ARC/INFO exported integer grid files (five with the pixel size 3.75 km x 3.75 km and five with the pixel size 0.25 degree longitude x 0.25 degree latitude) and 27 ASCII files. The first ASCII file contains the documentation associated with this database. Twenty-four of the ASCII files were generated by means of the ARC/INFO GRIDASCII command and can be used by most raster-based GIS software packages. The 24 files can be subdivided into two groups of 12 files each. These files contain real data values representing actual carbon and potential carbon density in Mg C/ha (1 megagram = 10{sup 6} grams) and integer-coded values for country name, Weck's Climatic Index, ecofloristic zone, elevation, forest or non-forest designation, population density, mean annual precipitation, slope, soil texture, and vegetation classification. One set of 12 files contains these data at a spatial resolution of 3.75 km, whereas the other set of 12 files has a spatial resolution of 0.25 degree. The remaining two ASCII data files combine all of the data from the 24 ASCII data files into 2 single generic data files. The first file has a spatial resolution of 3.75 km, and the second has a resolution of 0.25 degree. Both files also provide a grid-cell identification number and the longitude and latitude of the center-point of each grid cell. The 3.75-km data in this numeric data package yield an actual total carbon estimate of 42.1 Pg (1 petagram = 10{sup 15} grams) and a potential carbon estimate of 73.6 Pg; whereas the 0.25-degree data produced an actual total carbon estimate of 41.8 Pg and a total potential carbon estimate of 73.9 Pg. Fortran and SAS{trademark} access codes are provided to read the ASCII data files, and ARC/INFO and ARCVIEW command syntax are provided to import the ARC/INFO exported integer grid files. The data files and this documentation are available without charge on a variety of media and via the Internet from the Carbon Dioxide Information Analysis Center (CDIAC).

  17. GSOD Based Daily Global Mean Surface Temperature and Mean Sea Level Air Pressure (1982-2011)

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Xuan Shi, Dali Wang

    This data product contains all the gridded data sets at 1/4 degree resolution in ASCII format. Both mean temperature and mean sea level air pressure data are available. It also contains the GSOD data (1982-2011) from the NOAA site, which includes station number, location, temperature, and pressures (sea level and station level). The data package also contains information related to the data processing methods.

  18. GAMMAV1.2

    Energy Science and Technology Software Center (OSTI)

    2001-06-26

    This program was developed to read, plot and translate data taken by Sandia's Thermogrow program. It also allows one to determine the "gamma factor" for correcting small systematic errors in reflectance-correcting pyrometry. The computed gamma factor is used in Sandia's Thermogrow data collection and reflectance-correcting pyrometer software. Data may be saved for processing by other programs like "Excel", "PowerPoint", "Origin", or any other program which requires ASCII data files.

  19. GSOD Based Daily Global Mean Surface Temperature and Mean Sea Level Air Pressure (1982-2011)

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Xuan Shi, Dali Wang

    2014-05-05

    This data product contains all the gridded data sets at 1/4 degree resolution in ASCII format. Both mean temperature and mean sea level air pressure data are available. It also contains the GSOD data (1982-2011) from the NOAA site, which includes station number, location, temperature, and pressures (sea level and station level). The data package also contains information related to the data processing methods.

  20. ER2 Instrumentation and Measurements for CLASIC (Cloud Land Surface Interaction Campaign) June-2007 SGP {Author-Jimmy Voyles}

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ER2 Desired Measurements for CLASIC June 2007 SGP May 31, 2007 1 MEASUREMENT SOURCE DESIRED MEASUREMENTS AND PRODUCTS INSTRUMENT SYSTEMS Cloud Radar System (CRS), W-Band (95 GHz) 1) Vertical profiles of calibrated radar reflectivity 2) Vertical profiles of Doppler velocity 3) Vertical profiles of estimated IWC 4) Vertical profiles of linear depolarization ratio 5) (ASCII files and quick look plots for all of the above quantities) Resolution: 5 sec resolution? Cloud Physics Lidar (CPL) (1064,

  1. Report on Matters Identified at Strategic Petroleum Reserve During Audit of Statement of Financial Position, CR-FS-96-03

    Energy Savers [EERE]

    is an ASCII-formatted version of a printed document. The page numbers in this electronic version may not be in the same order as those in the printed document. The printed document may also contain charts and photographs which are not reproduced in this electronic version. If you require the printed version of this document, contact the Office of Inspector General (IG-1), Department of Energy, 1000 Independence Avenue, SW, Washington, DC, 20585, or call the Office of Inspector General Reports

  2. SPOCS User Guide

    SciTech Connect (OSTI)

    Curtis, Darren S.; Phillips, Aaron R.; McCue, Lee Ann

    2013-04-15

    SPOCS implements a graph-based ortholog prediction method to generate a simple tab-delimited table of orthologs, and in addition, html files that provide a visualization of the ortholog/paralog relationships to which gene/protein expression metadata may be overlaid.

  3. Evaluation Data of a High Temperature COTS Flash Memory Module (TI SM28VLT32) for Use in Geothermal Electronics Packages

    SciTech Connect (OSTI)

    Cashion, Avery

    2014-08-29

    The accompanying raw data is provided as a collection of files. Each file is 3 columns and tab-delimited, with the first column being the data address, the second column being the first byte of the data, and the third column being the second byte of the data.
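
    A minimal sketch of parsing the 3-column, tab-delimited layout described above (address, first data byte, second data byte). Treating the values as hexadecimal is an assumption; adjust the base if the files use decimal:

        def read_flash_dump(path):
            """Return (address, byte1, byte2) tuples from a tab-delimited dump file."""
            records = []
            with open(path) as f:
                for line in f:
                    if not line.strip():
                        continue                       # skip blank lines
                    addr, byte1, byte2 = line.rstrip("\n").split("\t")
                    records.append((int(addr, 16), int(byte1, 16), int(byte2, 16)))
            return records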

  4. RPF: An Extensible, Cross-Platform, Binary File Format for Radiation Physics Data

    SciTech Connect (OSTI)

    Ham, C L

    2002-09-10

    Lawrence Livermore National Laboratory's Radiation Technology Group (RTG) uses a number of computer codes for simulation and analysis of radiation data. The number of incompatible data formats in which these data are presented has continued to multiply. In the 1980s a Common Data Format (CDF, see Appendix A) was devised for internal use by the RTG. This format represented a single gamma-ray spectrum as ASCII energy/count pairs preceded by an ASCII header. The ASCII representation of the data assured that it was compatible on any computing platform and this format is still in use. In the mid-1990s it became apparent that instrument systems of greater complexity would demand a file format of larger capacity to support systems then on the drawing board, including networks of sensors collecting time series of gamma-ray spectra. These systems were in the planning stage and defined data structures were not available. It became apparent that a new storage format for nuclear measurements data would be needed and it would have to be flexible and extensible to accommodate the requirements of systems of the future. As part of an LDRD, we began to investigate what others were doing, especially in the high-energy physics community, to deal with the large volumes of data being generated. Of particular interest was the very general Hierarchical Data Format (HDF), developed and maintained by the National Center for Supercomputing Applications (NCSA), that we ultimately used to develop the Radiation Physics Format (RPF). The HDF subroutine library provides users with the ability to customize a data file format based on standard calls to the HDF subroutine library. The RPF was developed and deployed on Sun and Hewlett-Packard workstations running their proprietary versions of UNIX.

  5. ARM - Publications: Science Team Meeting Documents

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    NCDX: NetCDF Data eXtraction utility for Examination and Visualization of NetCDF Data Flynn, C.J. and Ermold, B., Pacific Northwest National Laboratory Eleventh Atmospheric Radiation Measurement (ARM) Science Team Meeting NCDX is a command-line utility designed for routine examination and extraction of data from netcdf files. Data can be displayed graphically (line-plot, scatter-plot, overlay, color-intensity, etc.) or extracted as ASCII data. In either case, results can be saved to disk or

  6. Laser goniometer

    DOE Patents [OSTI]

    Fairer, George M.; Boernge, James M.; Harris, David W.; Campbell, DeWayne A.; Tuttle, Gene E.; McKeown, Mark H.; Beason, Steven C.

    1993-01-01

    The laser goniometer is an apparatus which permits an operator to sight along a geologic feature and orient a collimated laser beam to match the attitude of the feature directly. The horizontal orientation (strike) and the angle from horizontal (dip) are detected by rotary incremental encoders attached to the laser goniometer which provide a digital readout of the azimuth and tilt of the collimated laser beam. A microprocessor then translates the square wave signal encoder outputs into an ASCII signal for use by data recording equipment.

  7. Microsoft PowerPoint - G-Allen-TWPICE-ACTIVE-data-provision-Nov06.ppt

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ACTIVE / TWP-ICE Science Meeting, Nov 06 The ACTIVE Dataset: Status and Availability Grant Allen (grant.allen@man.ac.uk) SEAES, University of Manchester For: TWP-ICE/ACTIVE Science meeting, New York, Nov 06 ACTIVE / TWP-ICE Science Meeting, Nov 06 The ACTIVE database * All data is available via FTP access to: mobile4.phy.umist.ac.uk * User account requests to grant.allen@man.ac.uk * Geophysical processed data in ASCII format is in directory: /Database3/ACTIVE/PROCESSED/current/ * Users can

  8. MOSS2D V1

    Energy Science and Technology Software Center (OSTI)

    2001-01-31

    This software reduces the data from the two-dimensional kSA MOS program, k-Space Associates, Ann Arbor, MI. Initial MOS data is recorded without headers in 38 columns, with one row of data per acquisition per laser beam tracked. The final MOSS 2d data file is reduced, graphed, and saved in a tab-delimited column format with headers that can be plotted in any graphing software.
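
    A rough sketch of that reduction step in Python with pandas: read headerless 38-column raw data and re-save it tab-delimited with headers. The column names and the whitespace delimiter of the raw file are assumptions, not the actual kSA MOS layout:

        import pandas as pd

        N_COLS = 38
        names = [f"col_{i:02d}" for i in range(N_COLS)]   # placeholder header names

        raw = pd.read_csv("mos_raw.dat", header=None, names=names, sep=r"\s+")
        raw.to_csv("moss2d_reduced.txt", sep="\t", index=False)   # tab-delimited, with headers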

  9. NDMAS System and Process Description (Technical Report) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Technical Report: NDMAS System and Process Description Citation Details In-Document Search Title: NDMAS System and Process Description Experimental data generated by the Very High Temperature Reactor Program need to be more available to users in the form of data tables on Web pages that can be downloaded to Excel or in delimited text formats that can be used directly for input to analysis and simulation codes, statistical packages, and graphics software. One solution that can provide current and

  10. H:\DOD\OUTREACH\PUBS\DRAFTS\Mobile 6 users guide\420r03010.wpd

    National Nuclear Security Administration (NNSA)

    filename suffix indicates that the output is a tab-delimited output file. This type of output file generally can be read into Excel or Lotus123 directly. The information in the data file (spreadsheet output) is organized into spreadsheet columns and spreadsheet rows. They have the following definitions. Columns The spreadsheet output consists of 60 individual columns of information. These columns contain most of the information that is found in the descriptive output. The first row of each

  11. DAQMAN - A flexible configurable data acquisition system

    Energy Science and Technology Software Center (OSTI)

    2012-08-01

    DAQMAN is a flexible configurable interface that allows the user to build and operate a VME-based data acquisition system on a Linux workstation. It consists of two parts: a Java-based Graphical User Interface to configure the system, and a C-based utility that reads out the data and creates the output ASCII data file, with two levels of diagnostic tools. The data acquisition system requires a CAEN CONET-VME Bridge to communicate between the hardware in the VME crate and the Linux workstation. Data acquisition modules, such as ADCs, TDC, Scalers, can be loaded into the system, or removed easily. The GUI allows users to activate modules, and channels within modules by clicking on icons. Running configurations are stored; data are collected and can be viewed either as raw numbers, or by charts and histograms that update as the data are accumulated. Data files are written to disk in ASCII format, with a date and time stamp.

  12. Physical database port to workstations project plan. Version 2.6

    SciTech Connect (OSTI)

    Rhoades, C.E. Jr.

    1993-03-01

    The project goal is to port those physical databases used on the Cray by our important production codes to high-performance Unix workstations while maintaining the current computational capabilities and accuracies, and achieving reasonably efficient execution on the workstations. The port must strike a judicious balance between (a) not changing the current N/LTSS databases, accessing libraries, generating codes and using codes, and (b) adversely impacting the maintenance or performance of the various codes that create or use the databases on the Cray. (Because of its forthcoming delivery, the Sun Sparcstation 2, using SunOS 4.0.3 or later, is the initial hardware platform selected for the first workstation port.) The purpose in undertaking this project is to enable the production codes, Tart, Lasnex, Meg, Xraser, Sandyl (and its planned successor), Nike3d and Dyna3d to get up and running on the Unix platforms as soon as possible. Since most Cray file formats are not available on the workstations, the workstation databases and their libraries may have to use a variety of techniques to provide the same capabilities. The project's primary approach will be to support either an ASCII portable format (where this is readily feasible) or a bit-for-bit Cray identical absolute binary format (where ASCII is not available or suitable). The physical databases are identified.

  13. SOLDESIGN user's manual copyright

    SciTech Connect (OSTI)

    Pillsbury, R.D. Jr.

    1991-02-01

    SOLDESIGN is a general purpose program for calculating and plotting magnetic fields, Lorentz body forces, resistances and inductances for a system of coaxial uniform current density solenoidal elements. The program was originally written in 1980 and has been evolving ever since. SOLDESIGN can be used with either interactive (terminal) or file input. Output can be to the terminal or to a file. All input is free-field with comma or space separators. SOLDESIGN contains an interactive help feature that allows the user to examine documentation while executing the program. Input to the program consists of a sequence of word commands and numeric data. Initially, the geometry of the elements or coils is defined by specifying either the coordinates of one corner of the coil or the coil centroid, a symmetry parameter to allow certain reflections of the coil (e.g., a split pair), the radial and axial builds, and either the overall current density or the total ampere-turns (NI). A more general quadrilateral element is also available. If inductances or resistances are desired, the number of turns must be specified. Field, force, and inductance calculations also require the number of radial current sheets (or integration points). Work is underway to extend the field, force, and, possibly, inductances to non-coaxial solenoidal elements.

  14. National Spill Test Technology Database

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sheesley, David [Western Research Institute]

    Western Research Institute established, and ACRC continues to maintain, the National Spill Technology database to provide support to the Liquified Gaseous Fuels Spill Test Facility (now called the National HAZMAT Spill Center) as directed by Congress in Section 118(n) of the Superfund Amendments and Reauthorization Act of 1986 (SARA). The Albany County Research Corporation (ACRC) was established to make publicly funded data developed from research projects available to benefit public safety. The founders since 1987 have been investigating the behavior of toxic chemicals that are deliberately or accidentally spilled, educating emergency response organizations, and maintaining funding to conduct the research at the DOE's HAZMAT Spill Center (HSC) located on the Nevada Test Site. ACRC also supports DOE in collaborative research and development efforts mandated by Congress in the Clean Air Act Amendments. The data files are results of spill tests conducted at various times by the Silicones Environmental Health and Safety Council (SEHSC) and DOE, ANSUL, Dow Chemical, the Center for Chemical Process Safety (CCPS) and DOE, Lawrence Livermore National Laboratory (LLNL), OSHA, and DOT; DuPont, and the Western Research Institute (WRI), Desert Research Institute (DRI), and EPA. Each test data page contains one executable file for each test in the test series as well as a file named DOC.EXE that contains information documenting the test series. These executable files are actually self-extracting zip files that, when executed, create one or more comma separated value (CSV) text files containing the actual test data or other test information.

  15. National Spill Test Technology Database

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sheesley, David [Western Research Institute]

    Western Research Institute established, and ACRC continues to maintain, the National Spill Technology database to provide support to the Liquified Gaseous Fuels Spill Test Facility (now called the National HAZMAT Spill Center) as directed by Congress in Section 118(n) of the Superfund Amendments and Reauthorization Act of 1986 (SARA). The Albany County Research Corporation (ACRC) was established to make publicly funded data developed from research projects available to benefit public safety. The founders since 1987 have been investigating the behavior of toxic chemicals that are deliberately or accidentally spilled, educating emergency response organizations, and maintaining funding to conduct the research at the DOE's HAZMAT Spill Center (HSC) located on the Nevada Test Site. ACRC also supports DOE in collaborative research and development efforts mandated by Congress in the Clean Air Act Amendments. The data files are results of spill tests conducted at various times by the Silicones Environmental Health and Safety Council (SEHSC) and DOE, ANSUL, Dow Chemical, the Center for Chemical Process Safety (CCPS) and DOE, Lawrence Livermore National Laboratory (LLNL), OSHA, and DOT; DuPont, and the Western Research Institute (WRI), Desert Research Institute (DRI), and EPA. Each test data page contains one executable file for each test in the test series as well as a file named DOC.EXE that contains information documenting the test series. These executable files are actually self-extracting zip files that, when executed, create one or more comma separated value (CSV) text files containing the actual test data or other test information.

  16. SREL Reprint #3345

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Delimiting road-effect zones for threatened species: implications for mitigation fencing J. Mark Peaden1, Tracey D. Tuberville2, Kurt A. Buhlmann2, Melia G. Nafus1,3, and Brian D. Todd1 1Department of Wildlife, Fish, and Conservation Biology, University of California, Davis, One Shields Ave, Davis, CA 95616, USA. 2University of Georgia’s Savannah River Ecology Lab, Drawer E, Aiken, SC 29802, USA. 3San Diego Zoo Institute for Conservation Research, 15600 San Pasqual Valley Rd, Escondido, CA

  17. MOSSPATCH V1

    Energy Science and Technology Software Center (OSTI)

    2001-01-31

    This program reduces the data from one-dimensional laser beam arrays from the kSA MOS program, k-Space Associates, Ann Arbor, MI. Initial data is recorded without headers in 38 columns, with one row of data per acquisition per laser beam tracked. MOSS Patch can merge several data files together, filter out beam overlaps, and reduce the data. The final data is graphed and saved in a tab-delimited column format with headers that can be plotted in any graphing software.

  18. Rapid automatic keyword extraction for information retrieval and analysis

    DOE Patents [OSTI]

    Rose, Stuart J (Richland, WA); Cowley, Wendy E (Richland, WA); Crow, Vernon L (Richland, WA); Cramer, Nicholas O (Richland, WA)

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
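
    A minimal sketch of the scoring idea summarized above: split text into candidate keywords at delimiters and stop words, score words by co-occurrence degree and frequency, and score each candidate as the sum of its word scores. The tiny stop-word list is illustrative, and this is an approximation of the described method, not the patented implementation:

        import re
        from collections import defaultdict

        STOP_WORDS = {"and", "of", "the", "for", "in", "a", "is"}   # illustrative only

        def candidates(text):
            """Candidate keywords: maximal runs of non-stop words between delimiters."""
            out = []
            for phrase in re.split(r"[.,;:()\n]+", text.lower()):
                current = []
                for w in phrase.split():
                    if w in STOP_WORDS:
                        if current:
                            out.append(current)
                        current = []
                    else:
                        current.append(w)
                if current:
                    out.append(current)
            return out

        def keyword_scores(text):
            freq, degree = defaultdict(int), defaultdict(int)
            cands = candidates(text)
            for cand in cands:
                for w in cand:
                    freq[w] += 1
                    degree[w] += len(cand) - 1          # co-occurrence degree
            word_score = {w: (degree[w] + freq[w]) / freq[w] for w in freq}
            return {" ".join(c): sum(word_score[w] for w in c) for c in cands}

        print(keyword_scores("rapid automatic keyword extraction for information retrieval and analysis"))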

  19. Electrode With Porous Three-Dimensional Support

    DOE Patents [OSTI]

    Bernard, Patrick (Massy, FR); Dauchier, Jean-Michel (Martignas, FR); Simonneau, Olivier (Dourdan, FR)

    1999-07-27

    Electrode including a paste containing particles of electrochemically active material and a conductive support consisting of a three-dimensional porous material comprising strands delimiting contiguous pores communicating via passages, characterized in that the average width L in µm of said passages is related to the average diameter Ø in µm of said particles by the following equation, in which W and Y are dimensionless coefficients: wherein W=0.16, Y=1.69, X=202.4 µm, and Z=80 µm

  20. Flexible collapse-resistant and length-stable vacuum hose

    DOE Patents [OSTI]

    Kashy, David H.

    2003-08-19

    A hose for containing a vacuum, which hose has an impermeable flexible tube capable of holding a vacuum and a braided or interwoven flexible interior wall, said wall providing support to said interior wall of said impermeable flexible tube. Optionally, an exterior braided or woven wall may be provided to the hose for protection or to allow the hose to be used as a pressure hose. The hose may delimit a vacuum space through which may travel a thermal transfer line containing, for example, cryogenic fluid.

  1. Method for guessing the response of a physical system to an arbitrary input

    DOE Patents [OSTI]

    Wolpert, David H.

    1996-01-01

    Stacked generalization is used to minimize the generalization errors of one or more generalizers acting on a known set of input values and output values representing a physical manifestation and a transformation of that manifestation, e.g., hand-written characters to ASCII characters, spoken speech to computer command, etc. Stacked generalization acts to deduce the biases of the generalizer(s) with respect to a known learning set and then correct for those biases. This deduction proceeds by generalizing in a second space whose inputs are the guesses of the original generalizers when taught with part of the learning set and trying to guess the rest of it, and whose output is the correct guess. Stacked generalization can be used to combine multiple generalizers or to provide a correction to a guess from a single generalizer.
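
    A minimal sketch of stacked generalization on a generic labeled learning set, using scikit-learn: level-0 generalizers are taught on parts of the set and asked to guess the held-out parts, and a level-1 generalizer then learns from those guesses. This illustrates the idea only; it is not the patented method or its original application domains:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)                    # synthetic learning set
        X = rng.normal(size=(200, 5))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

        level0 = [DecisionTreeClassifier(max_depth=3), LogisticRegression()]

        # Out-of-fold guesses of each level-0 generalizer become the level-1 inputs.
        meta_inputs = np.column_stack(
            [cross_val_predict(m, X, y, cv=5, method="predict_proba")[:, 1] for m in level0]
        )
        for m in level0:
            m.fit(X, y)                                   # refit on the full learning set

        level1 = LogisticRegression().fit(meta_inputs, y)

        def predict(X_new):
            z = np.column_stack([m.predict_proba(X_new)[:, 1] for m in level0])
            return level1.predict(z)

        print(predict(X[:5]), y[:5])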

  2. Graduate student theses supported by DOE's Environmental Sciences Division

    SciTech Connect (OSTI)

    Cushman, R.M.; Parra, B.M.

    1995-07-01

    This report provides complete bibliographic citations, abstracts, and keywords for 212 doctoral and master's theses supported fully or partly by the U.S. Department of Energy's Environmental Sciences Division (and its predecessors) in the following areas: Atmospheric Sciences; Marine Transport; Terrestrial Transport; Ecosystems Function and Response; Carbon, Climate, and Vegetation; Information; Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP); Atmospheric Radiation Measurement (ARM); Oceans; National Institute for Global Environmental Change (NIGEC); Unmanned Aerial Vehicles (UAV); Integrated Assessment; Graduate Fellowships for Global Change; and Quantitative Links. Information on the major professor, department, principal investigator, and program area is given for each abstract. Indexes are provided for major professor, university, principal investigator, program area, and keywords. This bibliography is also available in various machine-readable formats (ASCII text file, WordPerfect{reg_sign} files, and PAPYRUS{trademark} files).

  3. Work Order Generation Macros for Word Perfect 6.X for Windows

    Energy Science and Technology Software Center (OSTI)

    1997-09-02

    Included are three general WP macros (two independent and one multiple) and a template used at the Test Reactor Area (TRA) for the generation of the Work Orders (WO's) used to perform corrective and preventative maintenance, as well as modifications of existing systems and installation of new systems. They incorporate facility specific requirements as well as selected federal/state orders. These macros are used to generate a WP document which is then converted into ASCII text for import to the maintenance software. Currently we are using MCRS but should be compatible with other platforms such as Passport. Reference the included file Wogen.txt for installation and usage instructions.

  4. Seismic Waves, 4th order accurate

    Energy Science and Technology Software Center (OSTI)

    2013-08-16

    SW4 is a program for simulating seismic wave propagation on parallel computers. SW4 solves the seismic wave equations in Cartesian coordinates. It is therefore appropriate for regional simulations, where the curvature of the earth can be neglected. SW4 implements a free surface boundary condition on a realistic topography, absorbing super-grid conditions on the far-field boundaries, and a kinematic source model consisting of point force and/or point moment tensor source terms. SW4 supports a fully 3-D heterogeneous material model that can be specified in several formats. SW4 can output synthetic seismograms in an ASCII text format, or in the SAC binary format. It can also present simulation information as GMT scripts, which can be used to create annotated maps. Furthermore, SW4 can output the solution as well as the material model along 2-D grid planes.

  5. Simple Electric Vehicle Simulation

    Energy Science and Technology Software Center (OSTI)

    1993-07-29

    SIMPLEV2.0 is an electric vehicle simulation code which can be used with any IBM compatible personal computer. This general purpose simulation program is useful for performing parametric studies of electric and series hybrid electric vehicle performance on user input driving cycles. The program is run interactively and guides the user through all of the necessary inputs. Driveline components and the traction battery are described and defined by ASCII files which may be customized by the user. Scaling of these components is also possible. Detailed simulation results are plotted on the PC monitor and may also be printed on a printer attached to the PC.

  6. Flow and Containment Transport Code for Modeling Variably Saturated Porous Media

    Energy Science and Technology Software Center (OSTI)

    1998-05-14

    FACT is a finite element based code designed to model subsurface flow and contaminant transport. It was designed to perform transient three-dimensional calculations that simulate isothermal groundwater flow, moisture movement, and solute transport in variably saturated and fully saturated subsurface porous media. The code is designed specifically to handle complex multi-layer and/or heterogeneous aquifer systems in an efficient manner and accommodates a wide range of boundary conditions. Additionally, 1-D and 2-D (in Cartesian coordinates) problems are handled in FACT by simply limiting the number of elements in a particular direction(s) to one. The governing equations in FACT are formulated only in Cartesian coordinates. FACT writes out both ASCII and graphical binary files that are TECPLOT-ready. Special features are also available within FACT for handling the typical groundwater modeling needs for remediation efforts at the Savannah River Site.

  7. Gauge Configurations for Lattice QCD from The Gauge Connection

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    The Gauge Connection is an experimental archive for lattice QCD and a repository of gauge configurations made freely available to the community. Contributors to the archive include the Columbia QCDSP collaboration, the MILC collaboration, and others. Configurations are stored in QCD archive format, consisting of an ASCII header which defines various parameters, followed by binary data. NERSC has also provided some utilities and examples that will aid users in handling the data. Users may browse the archive, but are required to register for a password in order to download data. Contents of the archive are organized under four broad headings: Quenched (more than 1200 configurations); Dynamical, Zero Temperature (more than 300 configurations); MILC Improved Staggered Asqtad Lattices (more than 7000 configurations); and Dynamical, Finite Temperature (more than 1200 configurations)

  8. CHARICE1.0

    Energy Science and Technology Software Center (OSTI)

    2007-10-25

    CHARICE analyzes velocity waveform data from ramp-wave experiments to determine a sample material's quasi-isentropic loading response in stress and density. A graphical interface handles all user interaction. CHARICE uses a generalized ASCII file format for input waveform data, obviating the need for pre-processing of these data. Capabilities include calculation of uncertainty bounds, correction for non-uniform baseplate thickness, and user-provided ramp-wave loading response for interferometer window materials. Output consists of particle velocity, Lagrangian wave speed, density, and stress along the loading quasi-isentrope, as well as in-situ time history for any of these variables at the front or back surface of each sample.

  9. A Pyrolysis and Primary Migration Model

    Energy Science and Technology Software Center (OSTI)

    1993-08-11

    PMOD-Version 1.6 is a copyrighted computer program for simulating oil generation, cracking, and other chemical reactions occurring during the pyrolysis of petroleum source rocks over a specified history of temperature and either depth or hydrostatic pressure. The chemical reaction mechanism is defined by the user and, within limits, can be as simple or complex as desired. The model also simulates compaction of the source rock and expulsion of a liquid water phase and a liquid hydrocarbon phase. The expulsion is done by either a simple, constant-fluid-density model or by a more rigorous model using a modified Redlich-Kwong-Soave equation of state. The latter model also calculates overpressuring. An auxiliary program, PLOTPMOD, permits graphical display and hardcopy of the results, as well as preparation of ASCII-file subsets of the results for use with a spreadsheet or other graphics program.

  10. SIMPLEV: A simple electric vehicle simulation program, Version 1.0

    SciTech Connect (OSTI)

    Cole, G.H.

    1991-06-01

    An electric vehicle simulation code which can be used with any IBM compatible personal computer was written. This general purpose simulation program is useful for performing parametric studies of electric vehicle performance on user input driving cycles. The program is run interactively and guides the user through all of the necessary inputs. Driveline components and the traction battery are described and defined by ASCII files which may be customized by the user. Scaling of these components is also possible. Detailed simulation results are plotted on the PC monitor and may also be printed on a printer attached to the PC. This report serves as a users' manual and documents the mathematical relationships used in the simulation.

  11. Flow and Contaminant Transport Code for Modeling Variably Saturated Porous Media

    Energy Science and Technology Software Center (OSTI)

    1998-05-14

    FACT is a finite element based code designed to model subsurface flow and contaminant transport. It was designed to perform transient three-dimensional calculations that simulate isothermal groundwater flow, moisture movement, and solute transport in variably saturated and fully saturated subsurface porous media. The code is designed specifically to handle complex multi-layer and/or heterogeneous aquifer systems in an efficient manner and accommodates a wide range of boundary conditions. Additionally, 1-D and 2-D (in Cartesian coordinates) problems are handled in FACT by simply limiting the number of elements in a particular direction(s) to one. The governing equations in FACT are formulated only in Cartesian coordinates. FACT writes out both ASCII and graphical binary files that are TECPLOT-ready. Special features are also available within FACT for handling the typical groundwater modeling needs for remediation efforts at the Savannah River Site.

  12. Hydroacoustic Evaluation of Fish Passage Through Bonneville Dam in 2005

    SciTech Connect (OSTI)

    Ploskey, Gene R.; Weiland, Mark A.; Zimmerman, Shon A.; Hughes, James S.; Bouchard, Kyle E.; Fischer, Eric S.; Schilt, Carl R.; Hanks, Michael E.; Kim, Jina; Skalski, John R.; Hedgepeth, J.; Nagy, William T.

    2006-12-04

    The Portland District of the U.S. Army Corps of Engineers requested that the Pacific Northwest National Laboratory (PNNL) conduct fish-passage studies at Bonneville Dam in 2005. These studies support the Portland District's goal of maximizing fish-passage efficiency (FPE) and obtaining 95% survival for juvenile salmon passing Bonneville Dam. Major passage routes include 10 turbines and a sluiceway at Powerhouse 1 (B1), an 18-bay spillway, and eight turbines and a sluiceway at Powerhouse 2 (B2). In this report, we present results of two studies related to juvenile salmonid passage at Bonneville Dam. The studies were conducted between April 16 and July 15, 2005, encompassing most of the spring and summer migrations. Studies included evaluations of (1) Project fish passage efficiency and other major passage metrics, and (2) smolt approach and fate at B1 Sluiceway Outlet 3C from the B1 forebay. Some of the large appendices are only presented on the compact disk (CD) that accompanies the final report. Examples include six large comma-separated-variable (.CSV) files of hourly fish passage, hourly variances, and Project operations for spring and summer from Appendix E, and large Audio Video Interleave (AVI) files with DIDSON-movie clips of the area upstream of B1 Sluiceway Outlet 3C (Appendix H). Those video clips show smolts approaching the outlet, predators feeding on smolts, and vortices that sometimes entrained approaching smolts into turbines. The CD also includes Adobe Acrobat Portable Document Files (PDF) of the entire report and appendices.

  13. Berkeley Quantitative Genome Browser

    Energy Science and Technology Software Center (OSTI)

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered, or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and has been built to run on Linux, OSX and MS Windows operating systems.
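
    To illustrate the kind of column-delimited quantitative input mentioned above, the sketch below reads SGR-style lines (sequence, position, value), filters on a threshold, and applies a log2 transform; the three-column layout and the particular filter and transform are illustrative assumptions, not the browser's internal behavior.

      import csv, math

      # Minimal sketch of reading column-delimited quantitative records
      # (e.g. SGR-style "sequence<TAB>position<TAB>value" lines), filtering
      # them, and applying a simple mathematical transformation.
      def load_sgr(path, min_value=1.0):
          records = []
          with open(path, newline="") as handle:
              for seq, pos, value in csv.reader(handle, delimiter="\t"):
                  value = float(value)
                  if value >= min_value:                      # simple filter
                      records.append((seq, int(pos), math.log2(value)))
          return records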

  14. SPOCS: Software for Predicting and Visualizing Orthology/Paralogy Relationships Among Genomes

    SciTech Connect (OSTI)

    Curtis, Darren S.; Phillips, Aaron R.; Callister, Stephen J.; Conlan, Sean; McCue, Lee Ann

    2013-10-15

    At the rate that prokaryotic genomes can now be generated, comparative genomics studies require a flexible method for quickly and accurately predicting orthologs among the rapidly changing set of genomes available. SPOCS implements a graph-based ortholog prediction method to generate a simple tab-delimited table of orthologs and, in addition, HTML files that provide a visualization of the predicted ortholog/paralog relationships on which gene/protein expression metadata may be overlaid. AVAILABILITY AND IMPLEMENTATION: A SPOCS web application is freely available at http://cbb.pnnl.gov/portal/tools/spocs.html. Source code for Linux systems is also freely available under an open source license at http://cbb.pnnl.gov/portal/software/spocs.html; the Boost C++ libraries and BLAST are required.

  15. One-Piece Battery Incorporating A Circulating Fluid Type Heat Exchanger

    DOE Patents [OSTI]

    Verhoog, Roelof (Bordeaux, FR)

    2001-10-02

    A one-piece battery comprises a tank divided into cells each receiving an electrode assembly, closure means for the tank and a circulating fluid type heat exchanger facing the relatively larger faces of the electrode assembly. The fluid flows in a compartment defined by two flanges which incorporate a fluid inlet orifice communicating with a common inlet manifold and a fluid outlet orifice communicating with a common outlet manifold. The tank comprises at least two units and each unit comprises at least one cell delimited by walls. The wall facing a relatively larger face of the electrode assembly constitutes one of the flanges. Each unit further incorporates a portion of an inlet and outlet manifold. The units are fastened together so that the flanges when placed face-to-face form a sealed circulation compartment and the portions of the same manifold are aligned with each other.

  16. Automated D/3 to Visio Analog Diagrams

    Energy Science and Technology Software Center (OSTI)

    2000-08-10

    ADVAD1 reads an ASCII file containing the D/3 DCS MDL input for analog points for a D/3 continuous database. It uses the information in the files to create a series of Visio files representing the structure of each analog chain, one drawing per Visio file. The actual drawing function is performed by Visio (requires Visio version 4.5+). The user can configure the program to select which fields in the database are shown on the diagram and how the information is to be presented. This gives a visual representation of the structure of the analog chains, showing selected fields in a consistent manner. Updating documentation can be done easily, and the automated approach eliminates human error in the drafting process. The program can also create the drawings far faster than a human operator, producing approximately 270 typical diagrams in about 8 minutes on a Pentium II 400 MHz PC. The program allows for multiple option sets to be saved to provide different settings (i.e., different fields, different field presentations, and/or different diagram layouts) for various scenarios or facilities on one workstation. Option sets may be exported from the Windows registry to allow duplication of settings on another workstation.

  17. Source Catalog Data from FIRST (Faint Images of the Radio Sky at Twenty-Centimeters)

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Becker, Robert H.; Helfand, David J.; White, Richard L.; Gregg, Michael D.; Laurent-Muehleisen, Sally A.

    FIRST, Faint Images of the Radio Sky at Twenty-Centimeters, is a project designed to produce the radio equivalent of the Palomar Observatory Sky Survey over 10,000 square degrees of the North Galactic Cap. Using the National Radio Astronomy Observatory's (NRAO) Very Large Array (VLA) in its B-configuration, the Survey acquired 3-minute snapshots covering a hexagonal grid using 2 x 7 3-MHz frequency channels centered at 1365 and 1435 MHz. The data were edited, self-calibrated, mapped, and CLEANed using an automated pipeline based largely on routines in the Astronomical Image Processing System (AIPS). A final atlas of maps is produced by coadding the twelve images adjacent to each pointing center. Source catalogs with flux densities and size information are generated from the coadded images also. The 2011 catalog is the latest version and has been tested to ensure reliability and completeness. The catalog, generated from the 1993 through 2004 images, contains 816,000 sources and covers more than 9000 square degrees. A specialized search interface for the catalog resides at this website, and the catalog is also available as a compressed ASCII file. The user may also view earlier versions of the source catalog. The FIRST survey area was chosen to coincide with that of the Sloan Digital Sky Survey (SDSS); at the m(v)~24 limit of SDSS, ~50% of the optical counterparts to FIRST sources will be detected.
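
    A compressed ASCII catalog like the one described above can be scanned with a few lines of Python; the column order assumed below (RA, Dec, peak flux density in mJy) is a placeholder and should be checked against the catalog's actual documentation.

      import gzip

      # Hypothetical reader for a gzip-compressed ASCII source catalog:
      # one source per line, whitespace-delimited columns, "#" comments.
      def read_catalog(path, min_flux_mjy=1.0):
          sources = []
          with gzip.open(path, "rt") as handle:
              for line in handle:
                  if line.startswith("#") or not line.strip():
                      continue
                  ra, dec, flux = (float(x) for x in line.split()[:3])
                  if flux >= min_flux_mjy:
                      sources.append((ra, dec, flux))
          return sources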

  18. Solving iTOUGH2 simulation and optimization problems using the PEST protocol

    SciTech Connect (OSTI)

    Finsterle, S.A.; Zhang, Y.

    2011-02-01

    The PEST protocol has been implemented into the iTOUGH2 code, allowing the user to link any simulation program (with ASCII-based inputs and outputs) to iTOUGH2's sensitivity analysis, inverse modeling, and uncertainty quantification capabilities. These application models can be pre- or post-processors of the TOUGH2 non-isothermal multiphase flow and transport simulator, or programs that are unrelated to the TOUGH suite of codes. PEST-style template and instruction files are used, respectively, to pass input parameters updated by the iTOUGH2 optimization routines to the model, and to retrieve the model-calculated values that correspond to observable variables. We summarize the iTOUGH2 capabilities and demonstrate the flexibility added by the PEST protocol for the solution of a variety of simulation-optimization problems. In particular, the combination of loosely coupled and tightly integrated simulation and optimization routines provides both the flexibility and control needed to solve challenging inversion problems for the analysis of multiphase subsurface flow and transport systems.
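
    To make the template-file idea concrete, the sketch below substitutes current parameter values into a model input template before a forward run; the "#name#" marker convention and the fixed-width numeric formatting are assumptions for illustration, not the exact PEST syntax.

      import re

      # Sketch of the template-substitution step of a PEST-style workflow:
      # markers in a model input template are replaced by current parameter
      # values before each forward run of the application model.
      def fill_template(template_text, parameters):
          def substitute(match):
              name = match.group(1).strip()
              return "%14.7e" % parameters[name]
          return re.sub(r"#([^#]+)#", substitute, template_text)

      # Example: fill_template("PERMEABILITY  #perm#\n", {"perm": 1.2e-14})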

  19. Express Primer Tool for high-throughput gene cloning and expression

    Energy Science and Technology Software Center (OSTI)

    2002-12-01

    A tool to assist in the design of primers for DNA amplification. The Express Primer web-based tool generates primer sequences specifically for the generation of expression clones for both lab scale and high-throughput projects. The application is designed not only to allow the user complete flexibility to specify primer design parameters but also to minimize the amount of manual intervention needed to generate a large number of primers for simultaneous amplification of multiple target genes. The Express Primer Tool enables the user to specify various experimental parameters (e.g. optimal Tm, Tm range, maximum Tm difference) for single or multiple candidate sequence(s) in FASTA format input as a flat text (ASCII) file. The application generates candidate primers, selects optimal primer pairs, and writes the forward and reverse primer pairs to an Excel file that is suitable for electronic submission to a synthesis facility. The program parameters emphasize high throughput but allow for target attrition at various stages of the project.
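
    For a sense of how a Tm constraint like those above might be screened, here is a minimal sketch using the simple Wallace rule; the actual tool presumably uses a more rigorous thermodynamic model, so this is illustrative only and the 3 °C window is an arbitrary example value.

      # Illustrative melting-temperature estimate for a candidate primer using
      # the Wallace rule (2 degrees C per A/T base, 4 degrees C per G/C base).
      def wallace_tm(primer):
          primer = primer.upper()
          return 2 * (primer.count("A") + primer.count("T")) + \
                 4 * (primer.count("G") + primer.count("C"))

      # Check a "maximum Tm difference" style constraint for a primer pair.
      def within_tm_window(forward, reverse, max_difference=3.0):
          return abs(wallace_tm(forward) - wallace_tm(reverse)) <= max_difference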

  20. PFIDL Version 2.0

    Energy Science and Technology Software Center (OSTI)

    2007-05-15

    PFIDL is an analysis and visualization package written to assist scientists in obtaining a better understanding of experimental and theoretical data and for the graphical generation of theoretical code input. PFIDL is written in Fortran, C, and the IDL procedural language. It extends the functionality and provides a more user-friendly interface for the commercial software IDL from Research Systems, Inc. In addition to several standard ASCII data set formats, PFIDL provides a convenient interface to the PFF, Exodus-II, SAF, PDS, PDB and ACIS data file formats, and has convenient output routines for standard image formats. Full IDL functionality is maintained to facilitate redundant analysis of multiple data sets. Special purpose analysis routines, the rich selection of input routines, and extensive comparison, mathematical, and display routines, facilitate the rapid understanding of experimental and theoretical data sets. PFIDL also includes a Fortran library and graphical user interface for IDL that facilitates the use of PFIDL graphics and analysis procedures in Fortran programs while maintaining the command recall and command editing functionality of IDL. On-line documentation is included.

  1. SAPHIRE 8 Volume 7 - Data Loading

    SciTech Connect (OSTI)

    K. J. Kvarfordt; S. T. Wood; C. L. Smith; S. R. Prescott

    2011-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission and developed by the Idaho National Laboratory. This report is intended to assist the user to enter PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Towards this end, a small sample database is constructed and utilized for demonstration. Where applicable, the discussion includes how the data processes for loading the sample database relate to the actual processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 8. The guidance specified in this document will allow a user to have sufficient knowledge to both understand the data format used by SAPHIRE and to carry out the transfer of data between different PRA projects.

  2. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Data Loading Manual

    SciTech Connect (OSTI)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory. This report is intended to assist the user to enter PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Towards this end, a small sample database is constructed and utilized for demonstration. Where applicable, the discussion includes how the data processes for loading the sample database relate to the actual processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 6.0 and Version 7.0. In general, the data transfer procedures for version 6 and 7 are the same, but where deviations exist, the differences are noted. The guidance specified in this document will allow a user to have sufficient knowledge to both understand the data format used by SAPHIRE and to carry out the transfer of data between different PRA projects.

  3. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Data Loading Manual

    SciTech Connect (OSTI)

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory. This report is intended to assist the user to enter PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Towards this end, a small sample database is constructed and utilized for demonstration. Where applicable, the discussion includes how the data processes for loading the sample database relate to the actual processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 6.0 and Version 7.0. In general, the data transfer procedures for version 6 and 7 are the same, but where deviations exist, the differences are noted. The guidance specified in this document will allow a user to have sufficient knowledge to both understand the data format used by SAPHIRE and to carry out the transfer of data between different PRA projects.

  4. System and method for simultaneously collecting serial number information from numerous identity tags

    DOE Patents [OSTI]

    Doty, Michael A. (Manteca, CA)

    1997-01-01

    A system and method for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available.
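
    The digit-timed anti-collision scheme described above can be pictured with the toy simulation below, in which each tag chooses its reply slot from the leading digits of its serial number and extends to more digits after a collision; reader timing, verification, and all RF details are omitted, so this is only a schematic of the idea, not the patented method.

      # Toy simulation: serials are strings of digits; tags that collide in a
      # slot retry in the next round using one more leading digit.
      def read_all_tags(serials, max_rounds=10):
          identified, pending = set(), list(serials)
          for digits in range(1, max_rounds + 1):
              slots = {}
              for tag in pending:
                  slot = int(tag[:digits])          # reply time from leading digits
                  slots.setdefault(slot, []).append(tag)
              pending = []
              for tags in slots.values():
                  if len(tags) == 1:
                      identified.add(tags[0])       # clear, verifiable reply
                  else:
                      pending.extend(tags)          # collision: retry, more digits
              if not pending:
                  break
          return identified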

  5. System and method for simultaneously collecting serial number information from numerous identity tags

    DOE Patents [OSTI]

    Doty, M.A.

    1997-01-07

    A system and method are disclosed for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available. 5 figs.

  6. Groundwater Screen

    Energy Science and Technology Software Center (OSTI)

    1993-11-09

    GWSCREEN was developed for assessment of the groundwater pathway from leaching of radioactive and nonradioactive substances from surface or buried sources and release to percolation ponds. The code calculates the limiting soil concentration or effluent release concentration such that, after leaching and transport to the aquifer, regulatory contaminant levels in groundwater are not exceeded. The code uses a mass conservation approach to model three processes: contaminant release from a source volume, contaminant transport in the unsaturated zone, and contaminant transport in the saturated zone. The source model considers the sorptive properties and solubility of the contaminant. Transport in the unsaturated zone is described by a plug flow model. Transport in the saturated zone is calculated with a semi-analytical solution to the advection-dispersion equation in groundwater. Concentration as a function of time at a user-specified receptor point and maximum concentration averaged over the exposure interval are also calculated. In addition, the code calculates transport and impacts of radioactive progeny. Input to GWSCREEN is through one free-format ASCII file. This code was designed for assessment and screening of the groundwater pathway when field data are limited. It was not intended to be a predictive tool.
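
    As a rough illustration of the plug-flow idea mentioned above, the sketch below computes a retarded travel time through the unsaturated zone under linear sorption; the equation and parameter values are a textbook simplification, not GWSCREEN's actual implementation.

      # Illustrative plug-flow travel time through the unsaturated zone with a
      # standard linear-sorption retardation factor R = 1 + rho_b * Kd / theta.
      def unsaturated_travel_time(thickness_m, moisture_content, percolation_m_per_yr,
                                  bulk_density_kg_L=1.5, kd_L_per_kg=0.0):
          retardation = 1.0 + bulk_density_kg_L * kd_L_per_kg / moisture_content
          pore_velocity = percolation_m_per_yr / moisture_content
          return retardation * thickness_m / pore_velocity      # years

      # Example: unsaturated_travel_time(10.0, 0.25, 0.2, kd_L_per_kg=1.0)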

  7. PFIDL Version 2.0

    SciTech Connect (OSTI)

    2007-05-15

    PFIDL is an analysis and visualization package written to assist scientists in obtaining a better understanding of experimental and theoretical data and for the graphical generation of theoretical code input. PFIDL is written in Fortran, C, and the IDL procedural language. It extends the functionality and provides a more user-friendly interface for the commercial software IDL from Research Systems, Inc. In addition to several standard ASCII data set formats, PFIDL provides a convenient interface to the PFF, Exodus-II, SAF, PDS, PDB and ACIS data file formats, and has convenient output routines for standard image formats. Full IDL functionality is maintained to facilitate redundant analysis of multiple data sets. Special purpose analysis routines, the rich selection of input routines, and extensive comparison, mathematical, and display routines, facilitate the rapid understanding of experimental and theoretical data sets. PFIDL also includes a Fortran library and graphical user interface for IDL that facilitates the use of PFIDL graphics and analysis procedures in Fortran programs while maintaining the command recall and command editing functionality of IDL. On-line documentation is included.

  8. Coal Preparation Plant Simulation

    Energy Science and Technology Software Center (OSTI)

    1992-02-25

    COALPREP assesses the degree of cleaning obtained with different coal feeds for a given plant configuration and mode of operation. It allows the user to simulate coal preparation plants to determine an optimum plant configuration for a given degree of cleaning. The user can compare the performance of alternative plant configurations as well as determine the impact of various modes of operation for a proposed configuration. The devices that can be modelled include froth flotation devices, washers, dewatering equipment, thermal dryers, rotary breakers, roll crushers, classifiers, screens, blenders and splitters, and gravity thickeners. The user must specify the plant configuration and operating conditions and a description of the coal feed. COALPREP then determines the flowrates within the plant and a description of each flow stream (i.e. the weight distribution, percent ash, pyritic sulfur and total sulfur, moisture, BTU content, recoveries, and specific gravity of separation). COALPREP also includes a capability for calculating the cleaning cost per ton of coal. The IBM PC version contains two auxiliary programs, DATAPREP and FORLIST. DATAPREP is an interactive preprocessor for creating and editing COALPREP input data. FORLIST converts carriage-control characters in FORTRAN output data to ASCII line-feed (X'0A') characters.
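
    The FORLIST conversion mentioned above can be pictured with a small filter like the one below, which interprets the Fortran carriage-control character in column 1 and emits ordinary text lines; overprint handling is simplified, so this only approximates what FORLIST does.

      # Interpret Fortran carriage control: '1' new page, '0' extra blank line,
      # ' ' normal line, '+' overprint (simplified here to replacing the
      # previous line).  Returns plain lines suitable for line-feed output.
      def strip_carriage_control(lines):
          out = []
          for line in lines:
              control, text = (line[:1], line[1:]) if line else (" ", "")
              if control == "1":
                  out.append("\f")        # form feed for a new page
              elif control == "0":
                  out.append("")          # extra blank line
              if control == "+" and out:
                  out[-1] = text          # simplified overprint
              else:
                  out.append(text)
          return out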

  9. Designs for Risk Evaluation and Management

    SciTech Connect (OSTI)

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy's National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool is comprised of three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user's manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
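
    The ASCII-to-HDF5 conversion step described above might look roughly like the sketch below; the group/dataset name and the whitespace-delimited input layout are placeholders rather than DREAM's actual hierarchy, which the converter utility documents.

      import numpy as np
      import h5py

      # Hypothetical sketch: read a whitespace-delimited table of leakage-
      # simulation output and store it as a compressed HDF5 dataset.
      def ascii_to_hdf5(ascii_path, hdf5_path, dataset="scenario_001/pressure"):
          data = np.loadtxt(ascii_path)               # rows of numeric output
          with h5py.File(hdf5_path, "w") as out:
              out.create_dataset(dataset, data=data, compression="gzip")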

  10. High-Energy Cosmic Ray Event Data from the Pierre Auger Cosmic Ray Observatory

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    The Pierre Auger Cosmic Ray Observatory in Mendoza, Argentina is the result of an international collaboration funded by 15 countries and many different organizations. Its mission is to capture high-energy cosmic ray events or air showers for research into their origin and nature. The Pierre Auger Collaboration agreed to make 1% of its data available to the public. The Public Event Explorer is a search tool that allows users to browse or search for and display figures and data plots of events collected since 2004. The repository is updated daily, and, as of June 2014, makes more than 35,000 events publicly available. The energy of a cosmic ray is measured in exa-electron volts (EeV). These event displays can be browsed in order of their energy level from 0.1 to 41.1 EeV. Each event has an individual identification number.

    The event displays provide station data, cosmic ray incoming direction, various energy measurements, plots, vector-based images, and an ASCII data file.

  11. Task-parallel message passing interface implementation of Autodock4 for docking of very large databases of compounds using high-performance super-computers

    SciTech Connect (OSTI)

    Collignon, Barbara C; Schultz, Roland; Smith, Jeremy C; Baudry, Jerome Y

    2011-01-01

    A message passing interface (MPI)-based implementation (Autodock4.lga.MPI) of the grid-based docking program Autodock4 has been developed to allow simultaneous and independent docking of multiple compounds on up to thousands of central processing units (CPUs) using the Lamarckian genetic algorithm. The MPI version reads a single binary file containing precalculated grids that represent the protein-ligand interactions, i.e., van der Waals, electrostatic, and desolvation potentials, and needs only two input parameter files for the entire docking run. In comparison, the serial version of Autodock4 reads ASCII grid files and requires one parameter file per compound. The modifications performed result in significantly reduced input/output activity compared with the serial version. Autodock4.lga.MPI scales up to 8192 CPUs with a maximal overhead of 16.3%, of which two thirds is due to input/output operations and one third originates from MPI operations. The optimal docking strategy, which minimizes docking CPU time without lowering the quality of the database enrichments, comprises the docking of ligands preordered from the most to the least flexible and the assignment of the number of energy evaluations as a function of the number of rotatable bonds. In 24 h, on 8192 high-performance computing CPUs, the present MPI version would allow docking to a rigid protein of about 300K small flexible compounds or 11 million rigid compounds.

  12. Brain-Emulating Cognition and Control Architecture (BECCA) V1.0 beta

    Energy Science and Technology Software Center (OSTI)

    2007-09-30

    BECCA is a learning and control method based on the function of the human brain. The goal behind its creation is to learn to control robots in unfamiliar environments in a way that is very robust, similar to the way that an infant learns to interact with her environment by trial and error. As of this release, this software contains two simulations of BECCA controlling robots: one is a one degree-of-freedom spinner robot and the other is a 7 degree-of-freedom serial link arm with a terminal gripper. In addition, the software contains code that identifies synonyms in an untagged corpus of ASCII words. This last is a demonstration of BECCA's ability to generate abstract concepts from concrete experience. The BECCA simulation is coded so as to make it extensible to new applications. It is modular, object-oriented code in which the portions of the code that are specific to one simulation are easily separable from those portions that are constant between implementations. BECCA makes very few assumptions about the robot and environment it is learning, and so is applicable to a wide range of learning and control problems.

  13. iTOUGH2 Universal Optimization Using the PEST Protocol

    SciTech Connect (OSTI)

    Finsterle, S.A.

    2010-07-01

    iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2 capabilities accessible to many application models, the PEST protocol [Doherty, 2007] has been implemented into iTOUGH2. This protocol enables communication between the application (which can be a single 'black-box' executable or a script or batch file that calls multiple codes) and iTOUGH2. The concept requires that for the application model: (1) Input is provided on one or more ASCII text input files; (2) Output is returned to one or more ASCII text output files; (3) The model is run using a system command (executable or script/batch file); and (4) The model runs to completion without any user intervention. For each forward run invoked by iTOUGH2, select parameters cited within the application model input files are then overwritten with values provided by iTOUGH2, and select variables cited within the output files are extracted and returned to iTOUGH2. It should be noted that the core of iTOUGH2, i.e., its optimization routines and related analysis tools, remains unchanged; it is only the communication format between input parameters, the application model, and output variables that is borrowed from PEST. The interface routines have been provided by Doherty [2007]. The iTOUGH2-PEST architecture is shown in Figure 1. This manual contains installation instructions for the iTOUGH2-PEST module, and describes the PEST protocol as well as the input formats needed in iTOUGH2. Examples are provided that demonstrate the use of model-independent optimization and analysis using iTOUGH2.
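
    The four requirements listed above amount to a simple wrap-and-run loop; the sketch below shows one forward-model evaluation in that style, with the executable name, the file names and formats, and the output parsing all hypothetical placeholders rather than the actual iTOUGH2-PEST interface.

      import subprocess

      # Schematic forward-model evaluation under a PEST-style protocol:
      # write current parameters to an ASCII input file, run the application
      # model as a system command, read observable values back from its ASCII
      # output, and return a sum-of-squares objective function.
      def evaluate_model(parameters, observations):
          with open("model.in", "w") as f:
              for name, value in parameters.items():
                  f.write(f"{name} {value:.6e}\n")
          subprocess.run(["./application_model", "model.in", "model.out"], check=True)
          with open("model.out") as f:
              simulated = [float(line.split()[-1]) for line in f if line.strip()]
          return sum((s - o) ** 2 for s, o in zip(simulated, observations))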

  14. Testing actinide fission yield treatment in CINDER90 for use in MCNP6 burnup calculations

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Fensin, Michael Lorne; Umbel, Marissa

    2015-09-18

    Most of the development of the MCNPX/6 burnup capability focused on features that were applied to the Boltzmann transport or used to prepare coefficients for use in CINDER90, with little change to CINDER90 or the CINDER90 data. Though a scheme exists for best solving the coupled Boltzmann and Bateman equations, the most significant approximation is that the employed nuclear data are correct and complete. Thus, the CINDER90 library file contains 60 different actinide fission yields encompassing 36 fissionable actinides (thermal, fast, high energy and spontaneous fission). Fission reaction data exist for more than 60 actinides and, as a result, fission yield data must be approximated for actinides that do not possess fission yield information. Several types of approximations are used for estimating fission yields for actinides which do not possess explicit fission yield data. The objective of this study is to test whether or not certain approximations of fission yield selection have any impact on predictability of major actinides and fission products. Further, we assess which other fission products, available in MCNP6 Tier 3, result in the largest difference in production. Because the CINDER90 library file is in ASCII format and therefore easily amendable, we assess reasons for choosing, as well as compare actinide and major fission product prediction for the H. B. Robinson benchmark for, three separate fission yield selection methods: (1) the current CINDER90 library file method (Base); (2) the element method (Element); and (3) the isobar method (Isobar). Results show that the three methods tested result in similar prediction of major actinides, Tc-99 and Cs-137; however, certain fission products resulted in significantly different production depending on the method of choice.

  15. CA_OPPUSST - Cantera OPUS Steady State

    Energy Science and Technology Software Center (OSTI)

    2005-03-01

    The Cantera Opus Steady State (ca-opusst) application solves steady reacting flow problems in opposed-flow geometries. It is a 1-D application that represents axisymmetric 3-D physical systems that can be reduced via a similarity transformation to a 1-D mathematical representation. The code contains solutions of the general dynamic equations for the particle distribution functions using a sectional model to describe the particle distribution function. Operators for particle nucleation, coagulation, condensation (i.e., growth/etching via reactions with the gas ambient), internal particle reactions, and particle transport due to convection and due to molecular transport are included in the particle general dynamics equation. Heat transport due to radiation exchange of the environment with particles in local thermal equilibrium with the surrounding gas will be included in the enthalpy conservation equation that is solved for the coupled gas/particle system in an upcoming version of the code due in June 2005. The code uses Cantera, a C++ Caltech code, for determination of gas phase species transport, reaction, and thermodynamics physical properties and source terms. The code uses the Cantera Aerosol Dynamics Simulator (CADS) package, a general library for aerosol modeling, to calculate properties and source terms for the aerosol general dynamics equation, including particle formation from gas phase reactions, particle surface chemistry (growth and oxidation), bulk particle chemistry, particle transport by Brownian diffusion, thermophoresis, and diffusiophoresis, and thermal radiative transport involving particles. Also included are post-processing programs, cajost and cajrof, to extract ASCII data from binary output files to produce plots.

  16. Designs for Risk Evaluation and Management

    Energy Science and Technology Software Center (OSTI)

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy’s National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool is comprised of three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user’s manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.

  17. Testing actinide fission yield treatment in CINDER90 for use in MCNP6 burnup calculations

    SciTech Connect (OSTI)

    Fensin, Michael Lorne; Umbel, Marissa

    2015-09-18

    Most of the development of the MCNPX/6 burnup capability focused on features that were applied to the Boltzmann transport or used to prepare coefficients for use in CINDER90, with little change to CINDER90 or the CINDER90 data. Though a scheme exists for best solving the coupled Boltzmann and Bateman equations, the most significant approximation is that the employed nuclear data are correct and complete. Thus, the CINDER90 library file contains 60 different actinide fission yields encompassing 36 fissionable actinides (thermal, fast, high energy and spontaneous fission). Fission reaction data exist for more than 60 actinides and, as a result, fission yield data must be approximated for actinides that do not possess fission yield information. Several types of approximations are used for estimating fission yields for actinides which do not possess explicit fission yield data. The objective of this study is to test whether or not certain approximations of fission yield selection have any impact on predictability of major actinides and fission products. Further, we assess which other fission products, available in MCNP6 Tier 3, result in the largest difference in production. Because the CINDER90 library file is in ASCII format and therefore easily amendable, we assess reasons for choosing, as well as compare actinide and major fission product prediction for the H. B. Robinson benchmark for, three separate fission yield selection methods: (1) the current CINDER90 library file method (Base); (2) the element method (Element); and (3) the isobar method (Isobar). Results show that the three methods tested result in similar prediction of major actinides, Tc-99 and Cs-137; however, certain fission products resulted in significantly different production depending on the method of choice.
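
    To make the Element and Isobar selection ideas concrete, the toy sketch below picks a surrogate fission-yield nuclide for an actinide lacking explicit data by matching either the element (Z) or the isobar (A); this is a simplified illustration of the two approximations being compared, not the logic actually used in the CINDER90 library or MCNP6.

      # yield_library: iterable of (Z, A) pairs of actinides that have explicit
      # fission-yield data.  Returns the nearest such actinide under the chosen
      # surrogate rule, or None if no candidate matches.
      def pick_surrogate(z, a, yield_library, method="element"):
          candidates = []
          for (cz, ca) in yield_library:
              if method == "element" and cz == z:
                  candidates.append((abs(ca - a), (cz, ca)))
              elif method == "isobar" and ca == a:
                  candidates.append((abs(cz - z), (cz, ca)))
          return min(candidates)[1] if candidates else None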

  18. Underground Coal Gasification Program

    Energy Science and Technology Software Center (OSTI)

    1994-12-01

    CAVSIM is a three-dimensional, axisymmetric model for resource recovery and cavity growth during underground coal gasification (UCG). CAVSIM is capable of following the evolution of the cavity from near startup to exhaustion, and couples explicitly wall and roof surface growth to material and energy balances in the underlying rubble zones. Growth mechanisms are allowed to change smoothly as the system evolves from a small, relatively empty cavity low in the coal seam to a large, almost completely rubble-filled cavity extending high into the overburden rock. The model is applicable to nonswelling coals of arbitrary seam thickness and can handle a variety of gas injection flow schedules or compositions. Water influx from the coal aquifer is calculated by a gravity drainage-permeation submodel which is integrated into the general solution. The cavity is considered to consist of up to three distinct rubble zones and a void space at the top. Resistance to gas flow injected from a stationary source at the cavity floor is assumed to be concentrated in the ash pile, which builds up around the source, and also the overburden rubble which accumulates on top of this ash once overburden rock is exposed at the cavity top. Char rubble zones at the cavity side and edges are assumed to be highly permeable. Flow of injected gas through the ash to char rubble piles and the void space is coupled by material and energy balances to cavity growth at the rubble/coal, void/coal and void/rock interfaces. One preprocessor and two postprocessor programs are included: SPALL calculates one-dimensional mean spalling rates of coal or rock surfaces exposed to high temperatures and generates CAVSIM input; TAB reads CAVSIM binary output files and generates ASCII tables of selected data for display; and PLOT produces dot matrix printer or HP printer plots from TAB output.

  19. Slip length crossover on a graphene surface

    SciTech Connect (OSTI)

    Liang, Zhi; Keblinski, Pawel

    2015-04-07

    Using equilibrium and non-equilibrium molecular dynamics simulations, we study the flow of argon fluid above the critical temperature in a planar nanochannel delimited by graphene walls. We observe that, as a function of pressure, the slip length first decreases due to the decreasing mean free path of gas molecules, reaches the minimum value when the pressure is close to the critical pressure, and then increases with further increase in pressure. We demonstrate that the slip length increase at high pressures is due to the fact that the viscosity of fluid increases much faster with pressure than the friction coefficient between the fluid and the graphene. This behavior is clearly exhibited in the case of graphene due to a very smooth potential landscape originating from a very high atomic density of graphene planes. By contrast, on surfaces with lower atomic density, such as an (100) Au surface, the slip length for high fluid pressures is essentially zero, regardless of the nature of interaction between fluid and the solid wall.

  20. Method and apparatus for optimized processing of sparse matrices

    DOE Patents [OSTI]

    Taylor, Valerie E. (Evanston, IL)

    1993-01-01

    A computer architecture for processing a sparse matrix is disclosed. The apparatus stores a value-row vector corresponding to nonzero values of a sparse matrix. Each of the nonzero values is located at a defined row and column position in the matrix. The value-row vector includes a first vector including nonzero values and delimiting characters indicating a transition from one column to another. The value-row vector also includes a second vector which defines row position values in the matrix corresponding to the nonzero values in the first vector and column position values in the matrix corresponding to the column position of the nonzero values in the first vector. The architecture also includes a circuit for detecting a special character within the value-row vector. Matrix-vector multiplication is executed on the value-row vector. This multiplication is performed by multiplying an index value of the first vector value by a column value from a second matrix to form a matrix-vector product which is added to a previous matrix-vector product.
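
    A simplified software analogue of the value-row representation described above is sketched here, with None standing in for the column-delimiting character; this is an interpretation for illustration only, not the patented hardware architecture.

      # "values" holds the nonzero entries column by column, with None used as
      # the column-delimiting character; "rows" holds the matching row indices.
      # Computes y = A @ x for the stored sparse matrix A.
      def value_row_matvec(values, rows, x, n_rows):
          y = [0.0] * n_rows
          col = 0
          row_iter = iter(rows)
          for v in values:
              if v is None:          # delimiter: advance to the next column
                  col += 1
              else:
                  y[next(row_iter)] += v * x[col]
          return y

      # Example: the 2x2 matrix [[5, 0], [0, 3]] stored as
      #   values = [5.0, None, 3.0], rows = [0, 1]
      # value_row_matvec(values, rows, [1.0, 2.0], 2)  ->  [5.0, 6.0]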

  1. The hydrological model of the Mahanagdong sector, Greater Tongonan Geothermal Field, Philippines

    SciTech Connect (OSTI)

    Herras, E.B.; Licup, A.C. Jr.; Vicedo, R.O.

    1996-12-31

    The Mahanagdong sector of the Greater Tongonan Geothermal Field is committed to supply 180 MWe of steam by mid-1997. An updated hydrological model was constructed based on available geoscientific and reservoir engineering data from a total of 34 wells drilled in the area. The Mahanagdong resource is derived from a fracture-controlled and volcano-hosted geothermal system characterized by neutral to slightly alkali-chloride fluids with reservoir temperatures exceeding 295°C. A major upflow region was identified in the vicinity of MG-3D, MG-14D and MG-5D. Isochemical contours indicate outflowing fluids with temperatures of 270-275°C to the south and west. Its southwesterly flow is restricted by the intersection of the impermeable Mahanagdong Claystone near MG-10D, which delimits the southern part of the resource. Low temperature (<200°C), shallow inflows are evident at the west near the MG-4D and MG-17D wells, which act as a cold recharge in this sector.

  2. Accelerator-based neutron source for boron neutron capture therapy (BNCT) and method

    DOE Patents [OSTI]

    Yoon, W.Y.; Jones, J.L.; Nigg, D.W.; Harker, Y.D.

    1999-05-11

    A source for boron neutron capture therapy (BNCT) comprises a body of photoneutron emitter that includes heavy water and is closely surrounded in heat-imparting relationship by target material; one or more electron linear accelerators for supplying electron radiation having energy of substantially 2 to 10 MeV and for impinging such radiation on the target material, whereby photoneutrons are produced and heat is absorbed from the target material by the body of photoneutron emitter. The heavy water is circulated through a cooling arrangement to remove heat. A tank, desirably cylindrical or spherical, contains the heavy water, and a desired number of the electron accelerators circumferentially surround the tank and the target material as preferably made up of thin plates of metallic tungsten. Neutrons generated within the tank are passed through a surrounding region containing neutron filtering and moderating materials and through neutron delimiting structure to produce a beam or beams of epithermal neutrons normally having a minimum flux intensity level of 1.0×10^9 neutrons per square centimeter per second. Such beam or beams of epithermal neutrons are passed through gamma ray attenuating material to provide the required epithermal neutrons for BNCT use. 3 figs.

  3. PDS SHRINK

    SciTech Connect (OSTI)

    Phillion, D.

    1991-12-15

    This code enables one to display, take line-outs on, and perform various transformations on an image created by an array of integer*2 data. Uncompressed eight-bit TIFF files created on either the Macintosh or the IBM PC may also be read in and converted to a 16 bit signed integer image. This code is designed to handle all the formats used for PDS (photo-densitometer) files at the Lawrence Livermore National Laboratory. These formats are all explained by the application code. The image may be zoomed infinitely and the gray scale mapping can be easily changed. Line-outs may be horizontal or vertical with arbitrary width, angled with arbitrary end points, or taken along any path. This code is usually used to examine spectrograph data. Spectral lines may be identified and a polynomial fit from position to wavelength may be found. The image array can be remapped so that the pixels all have the same change of lambda width. It is not necessary to do this, however. Lineouts may be printed, saved as Cricket tab-delimited files, or saved as PICT2 files. The plots may be linear, semilog, or logarithmic with nice values and proper scientific notation. Typically, spectral lines are curved.

  4. Current Density Scaling in Electrochemical Flow Capacitors

    SciTech Connect (OSTI)

    Hoyt, NC; Wainright, JS; Savinell, RF

    2015-02-18

    Electrochemical flow capacitors (EFCs) are a recently developed energy storage technology. One of the principal performance metrics of an EFC is the steady-state electrical current density that it can accept or deliver. Numerical models exist to predict this performance for specific cases, but here we present a study of how the current varies with respect to the applied cell voltage, flow rate, cell dimensions, and slurry properties using scaling laws. The scaling relationships are confirmed by numerical simulations and then subsequently by comparison to results from symmetric cell EFC experiments. This modeling approach permits the delimitation of three distinct operational regimes dependent on the values of two nondimensional combinations of the pertinent variables (specifically, a capacitive Graetz number and a conductivity ratio). Lastly, the models and nondimensional numbers are used to provide design guidance in terms of criteria for proper EFC operation. (C) The Author(s) 2015. Published by ECS. This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 License (CC BY, http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse of the work in any medium, provided the original work is properly cited. All rights reserved.

  5. Structural anisotropic properties of a-plane GaN epilayers grown on r-plane sapphire by molecular beam epitaxy

    SciTech Connect (OSTI)

    Lotsari, A.; Kehagias, Th.; Katsikini, M.; Arvanitidis, J.; Ves, S.; Komninou, Ph.; Dimitrakopulos, G. P.; Tsiakatouras, G.; Tsagaraki, K.; Georgakilas, A.; Christofilos, D.

    2014-06-07

    Heteroepitaxial non-polar III-Nitride layers may exhibit extensive anisotropy in the surface morphology and the epilayer microstructure along distinct in-plane directions. The structural anisotropy, evidenced by the M-shape dependence of the (11-20) x-ray rocking curve widths on the beam azimuth angle, was studied by combining transmission electron microscopy observations, Raman spectroscopy, high resolution x-ray diffraction, and atomic force microscopy in a-plane GaN epilayers grown on r-plane sapphire substrates by plasma-assisted molecular beam epitaxy (PAMBE). The structural anisotropic behavior was attributed quantitatively to the high dislocation densities, particularly the Frank-Shockley partial dislocations that delimit the I1 intrinsic basal stacking faults, and to the concomitant plastic strain relaxation. On the other hand, isotropic samples exhibited lower dislocation densities and a biaxial residual stress state. For PAMBE growth, the anisotropy was correlated to N-rich (or Ga-poor) conditions on the surface during growth, that result in formation of asymmetric a-plane GaN grains elongated along the c-axis. Such conditions enhance the anisotropy of gallium diffusion on the surface and reduce the GaN nucleation rate.

  6. Hydraulically operated mine prop with safety valve

    SciTech Connect (OSTI)

    Koppers, M.; Marr, P.

    1981-02-24

    A mine prop is disclosed that is provided with an overpressure valve of large flowthrough section which opens when excessive forces suddenly act on the prop to discharge pressure fluid from the interior of the prop. The overpressure valve comprises a hollow cylindrical housing forming at one end a valve seat communicating with the fluid-filled inner space of the prop and a valve member axially guided in the housing for movement between a closed position engaging the valve seat and an open position. The fluid-filled space of the prop communicates with channels leading to the outside of the prop. The valve member is normally maintained in the closed position by a gas pillow under high pretension confined in a pressure space delimited peripherally by the wall of the valve housing and at opposite ends respectively by the valve member and a plug fluid tightly mounted in and closing the other end of the housing. The valve member and the plug are both formed from metal and are both provided at facing ends with annular slender sealing lips of triangular cross-section tapering toward each other, the outer surfaces of which are pressed into engagement with the inner peripheral surface of the housing by the pressure of the gas pillow.

  7. Accelerator-based neutron source for boron neutron capture therapy (BNCT) and method

    DOE Patents [OSTI]

    Yoon, Woo Y. (Idaho Falls, ID); Jones, James L. (Idaho Falls, ID); Nigg, David W. (Idaho Falls, ID); Harker, Yale D. (Idaho Falls, ID)

    1999-01-01

    A source for boron neutron capture therapy (BNCT) comprises a body of photoneutron emitter that includes heavy water and is closely surrounded in heat-imparting relationship by target material; one or more electron linear accelerators for supplying electron radiation having energy of substantially 2 to 10 MeV and for impinging such radiation on the target material, whereby photoneutrons are produced and heat is absorbed from the target material by the body of photoneutron emitter. The heavy water is circulated through a cooling arrangement to remove heat. A tank, desirably cylindrical or spherical, contains the heavy water, and a desired number of the electron accelerators circumferentially surround the tank and the target material as preferably made up of thin plates of metallic tungsten. Neutrons generated within the tank are passed through a surrounding region containing neutron filtering and moderating materials and through neutron delimiting structure to produce a beam or beams of epithermal neutrons normally having a minimum flux intensity level of 1.0×10{sup 9} neutrons per square centimeter per second. Such beam or beams of epithermal neutrons are passed through gamma ray attenuating material to provide the required epithermal neutrons for BNCT use.

  8. Generalized Nuclear Data: A New Structure (with Supporting Infrastructure) for Handling Nuclear Data

    SciTech Connect (OSTI)

    Mattoon, C.M.; Beck, B.R.; Patel, N.R.; Summers, N.C.; Hedstrom, G.W.; Brown, D.A.

    2012-12-15

    The Evaluated Nuclear Data File (ENDF) format was designed in the 1960s to accommodate neutron reaction data to support nuclear engineering applications in power, national security and criticality safety. Over the years, the scope of the format has been extended to handle many other kinds of data including charged particle, decay, atomic, photo-nuclear and thermal neutron scattering. Although ENDF has wide acceptance and support for many data types, its limited support for correlated particle emission, limited numeric precision, and general lack of extensibility mean that the nuclear data community cannot take advantage of many emerging opportunities. More generally, the ENDF format provides an unfriendly environment that makes it difficult for new data evaluators and users to create and access nuclear data. The Cross Section Evaluation Working Group (CSEWG) has begun the design of a new Generalized Nuclear Data (or 'GND') structure, meant to replace older formats with a hierarchy that mirrors the underlying physics and is aligned with modern coding and database practices. In support of this new structure, Lawrence Livermore National Laboratory (LLNL) has updated its nuclear data/reactions management package Fudge to handle GND-structured nuclear data. Fudge provides tools for converting both the latest ENDF format (ENDF-6) and the LLNL Evaluated Nuclear Data Library (ENDL) format to and from GND, as well as for visualizing, modifying and processing (i.e., converting evaluated nuclear data into a form more suitable for transport codes) GND-structured nuclear data. GND defines the structure needed for storing nuclear data evaluations and the type of data that needs to be stored. But unlike ENDF and ENDL, GND does not define how the data are to be stored in a file. Currently, Fudge writes the structured GND data to a file using the eXtensible Markup Language (XML), as it is ASCII-based and can be viewed with any text editor. XML is a meta-language, meaning that it has a primitive set of definitions for representing hierarchical data/text in a file. Other meta-languages, like HDF5, which stores the data in binary form, can also be used to store GND in a file. In this paper, we will present an overview of the new GND data structures along with associated tools in Fudge.
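    Since GND evaluations are written by Fudge as hierarchical, ASCII-based XML, a minimal sketch of reading such a file with Python's standard library is shown below. The element and attribute names ("reactionSuite", "reaction", "crossSection") are hypothetical placeholders, not the actual GND schema.

        # Minimal sketch: parsing a small, GND-style hierarchical XML evaluation.
        # Element and attribute names are hypothetical, not the real GND schema.
        import xml.etree.ElementTree as ET

        sample = """<reactionSuite projectile="n" target="Fe56">
          <reaction label="elastic">
            <crossSection unit="b">1.0e-5 12.3  2.0e-5 11.8  1.0e+0 3.1</crossSection>
          </reaction>
        </reactionSuite>"""

        root = ET.fromstring(sample)
        for reaction in root.findall("reaction"):
            xs = reaction.find("crossSection")
            values = [float(v) for v in xs.text.split()]
            # Pointwise data stored as alternating (energy, cross section) pairs.
            pairs = list(zip(values[0::2], values[1::2]))
            print(reaction.get("label"), xs.get("unit"), pairs)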

  9. Review of LCA studies of solid waste management systems – Part II: Methodological guidance for a better practice

    SciTech Connect (OSTI)

    Laurent, Alexis; Clavreul, Julie; Bernstad, Anna; Bakas, Ioannis; Niero, Monia; Gentil, Emmanuel; Christensen, Thomas H.; Hauschild, Michael Z.

    2014-03-01

    Highlights: • We perform a critical review of 222 LCA studies of solid waste management systems. • We analyse the past LCA practice against the ISO standard and ILCD Handbook guidance. • Malpractices exist in many methodological aspects with large variations among studies. • Many of these aspects are important for the reliability of the results. • We provide detailed recommendations to practitioners of waste management LCAs. - Abstract: Life cycle assessment (LCA) is increasingly used in waste management to identify strategies that prevent or minimise negative impacts on ecosystems, human health or natural resources. However, the quality of the provided support to decision- and policy-makers is strongly dependent on a proper conduct of the LCA. How has LCA been applied until now? Are there any inconsistencies in the past practice? To answer these questions, we draw on a critical review of 222 published LCA studies of solid waste management systems. We analyse the past practice against the ISO standard requirements and the ILCD Handbook guidelines for each major step within the goal definition, scope definition, inventory analysis, impact assessment, and interpretation phases of the methodology. Results show that malpractices exist in several aspects of the LCA with large differences across studies. Examples are a frequent neglect of the goal definition, a frequent lack of transparency and precision in the definition of the scope of the study, e.g. an unclear delimitation of the system boundaries, a truncated impact coverage, difficulties in capturing influential local specificities such as representative waste compositions into the inventory, and a frequent lack of essential sensitivity and uncertainty analyses. Many of these aspects are important for the reliability of the results. For each of them, we therefore provide detailed recommendations to practitioners of waste management LCAs.

  10. Confidence building measures at sea: opportunities for India and Pakistan.

    SciTech Connect (OSTI)

    Vohra, Ravi Bhushan Rear Admiral; Ansari, Hasan Masood Rear Admiral

    2003-12-01

    The sea presents unique possibilities for implementing confidence building measures (CBMs) between India and Pakistan that are currently not available along the contentious land borders surrounding Jammu and Kashmir. This is due to the nature of maritime issues, the common military culture of naval forces, and a less contentious history of maritime interaction between the two nations. Maritime issues of mutual concern provide a strong foundation for more far-reaching future CBMs on land, while addressing pressing security, economic, and humanitarian needs at sea in the near-term. Although Indian and Pakistani maritime forces currently have stronger opportunities to cooperate with one another than their counterparts on land, reliable mechanisms to alleviate tension or promote operational coordination remain non-existent. Therefore, possible maritime CBMs, as well as pragmatic mechanisms to initiate and sustain cooperation, require serious examination. This report reflects the unique joint research undertaking of two retired Senior Naval Officers from both India and Pakistan, sponsored by the Cooperative Monitoring Center of the International Security Center at Sandia National Laboratories. Research focuses on technology as a valuable tool to facilitate confidence building between states having a low level of initial trust. Technical CBMs not only increase transparency, but also provide standardized, scientific means of interacting on politically difficult problems. Admirals Vohra and Ansari introduce technology as a mechanism to facilitate consistent forms of cooperation and initiate discussion in the maritime realm. They present technical CBMs capable of being acted upon as well as high-level political recommendations regarding the following issues: (1) Delimitation of the maritime boundary between India and Pakistan and its relationship to the Sir Creek dispute; (2) Restoration of full shipping links and the security of ports and cargos; (3) Fishing within disputed areas and resolution of issues relating to arrest and repatriation of fishermen from both sides; and (4) Naval and maritime agency interaction and possibilities for cooperation.

  11. Thermal stability of curved ray tomography for corrosion monitoring

    SciTech Connect (OSTI)

    Willey, C. L.; Simonetti, F.; Nagy, P. B.; Instanes, G.

    2014-02-18

    Guided wave tomography is being developed as an effective tool for continuous monitoring of corrosion and erosion depth in pipelines. A pair of transmit- and receive-ring arrays of ultrasonic transducers encircles the pipe and delimits the section to be monitored. In curved ray tomography (CRT), the depth profile is estimated from the time delay matrix, Δτ, whose ij-th entry is the phase traveltime difference between the current and baseline signals measured between transducers i and j of the transmit- and receive-ring arrays, respectively. Under perfectly stable experimental conditions, the non-zero entries of Δτ are due only to the occurrence of damage and provide a reliable input to CRT. However, during field operation, Δτ can develop non-zero entries due to a number of environmental changes, ranging from temperature variations to degradation of the transducer-pipe coupling and of the transducers' intrinsic performance. Here, we demonstrate that these sources of instability can be eliminated by exploiting the spatial diversity of array measurements in conjunction with EMAT transducer technology, which is intrinsically stable owing to its non-contact nature. The study is based on a full-scale experiment performed on a schedule 40, 8 in. diameter, 3 m long steel pipe monitored with two EMAT ring arrays. It is shown that, for an irregularly shaped defect, the proposed method yields maximum depth estimations that are as accurate as single-point ultrasonic thickness gaging measurements over a wide temperature range up to 175 °C. The results indicate that advanced inversion schemes in combination with EMAT transduction offer great potential for continuously monitoring the progression of corrosion or erosion damage in the oil and gas industry.
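    The time delay matrix Δτ described above can, in general, be estimated by comparing each current signal with its baseline. The sketch below is a generic cross-correlation estimate of Δτ for illustration; it is not the specific signal processing used in the study, and the array shapes and sampling period are hypothetical.

        # Generic sketch: estimating a time delay matrix from baseline and current
        # signals by cross-correlation (one common approach; not necessarily the
        # processing used in the paper).
        import numpy as np

        def delay_matrix(baseline, current, dt):
            """baseline, current: arrays of shape (n_tx, n_rx, n_samples); dt: sample period in s."""
            n_tx, n_rx, n = baseline.shape
            dtau = np.zeros((n_tx, n_rx))
            lags = np.arange(-(n - 1), n)
            for i in range(n_tx):
                for j in range(n_rx):
                    xc = np.correlate(current[i, j], baseline[i, j], mode="full")
                    dtau[i, j] = lags[np.argmax(xc)] * dt
            return dtau

        # Synthetic check: a 3-sample delay should be recovered as 3*dt.
        t = np.arange(512)
        base = np.sin(2 * np.pi * t / 64) * np.exp(-((t - 200) / 40) ** 2)
        curr = np.roll(base, 3)
        print(delay_matrix(base[None, None, :], curr[None, None, :], 1e-7)[0, 0])  # ~3e-7 s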

  12. Gamma Knife irradiation method based on dosimetric controls to target small areas in rat brains

    SciTech Connect (OSTI)

    Constanzo, Julie; Paquette, Benoit; Charest, Gabriel; Masson-Côté, Laurence; Guillot, Mathieu

    2015-05-15

    Purpose: Targeted and whole-brain irradiation in humans can result in significant side effects causing decreased patient quality of life. To adequately investigate structural and functional alterations after stereotactic radiosurgery, preclinical studies are needed. The purpose of this work is to establish a robust standardized method of targeted irradiation on small regions of the rat brain. Methods: Euthanized male Fischer rats were imaged in a stereotactic bed, by computed tomography (CT), to estimate positioning variations relative to the bregma skull reference point. Using a rat brain atlas and the stereotactic bregma coordinates obtained from CT images, different regions of the brain were delimited and a treatment plan was generated. A single isocenter treatment plan delivering ≥100 Gy in 100% of the target volume was produced by Leksell GammaPlan using the 4 mm diameter collimator of sectors 4, 5, 7, and 8 of the Gamma Knife unit. Impact of positioning deviations of the rat brain on dose deposition was simulated by GammaPlan and validated with dosimetric measurements. Results: The authors’ results showed that 90% of the target volume received 100 ± 8 Gy and the maximum of deposited dose was 125 ± 0.7 Gy, which corresponds to an excellent relative standard deviation of 0.6%. This dose deposition calculated with GammaPlan was validated with dosimetric films resulting in a dose-profile agreement within 5%, both in X- and Z-axes. Conclusions: The authors’ results demonstrate the feasibility of standardizing the irradiation procedure of a small volume in the rat brain using a Gamma Knife.
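    For illustration only, the coverage and relative standard deviation figures quoted above can be understood as simple statistics over the voxels of the target volume. The sketch below computes them for a hypothetical voxelized dose grid; it is not the Leksell GammaPlan calculation, and the dose values are synthetic.

        # Illustration: coverage and relative standard deviation of a dose distribution
        # over a hypothetical voxelized target (synthetic numbers, not GammaPlan output).
        import numpy as np

        rng = np.random.default_rng(0)
        target_dose_Gy = rng.normal(loc=112.0, scale=0.7, size=(20, 20, 20))  # hypothetical target voxels

        prescription_Gy = 100.0
        coverage = np.mean(target_dose_Gy >= prescription_Gy)               # fraction of volume at/above prescription
        rsd_percent = 100.0 * target_dose_Gy.std() / target_dose_Gy.mean()  # relative standard deviation

        print(f"coverage: {coverage:.1%}, max dose: {target_dose_Gy.max():.1f} Gy, RSD: {rsd_percent:.2f}%")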

  13. Exploring the Limits of Methane Storage and Delivery in Nanoporous Materials

    SciTech Connect (OSTI)

    Gomez-Gualdron, DA; Wilmer, CE; Farha, OK; Hupp, JT; Snurr, RQ

    2014-04-03

    The physical limits for methane storage and delivery in nanoporous materials were investigated, with a focus on whether it is possible to reach a methane deliverable capacity of 315 cm{sup 3}(STP)/cm{sup 3} in line with the adsorption target established by the ARPA-E agency. Our efforts focused on how both geometric and chemical properties, such as void fraction (V{sub f}), volumetric surface area (S{sub v}), and heat of adsorption (Q{sub st}), impact methane deliverable capacity, i.e., the amount of methane adsorbed at some storage pressure minus the amount adsorbed at the delivery pressure. With the aid of grand canonical Monte Carlo (GCMC) simulations, we studied methane adsorption and delivery properties in a population of 122 835 hypothetical pcu metal-organic frameworks (MOFs) and 39 idealized carbon-based porous materials. From the simulation results, we developed an analytical equation that helped us delimit the material properties necessary to reach specific methane deliverable capacity targets. The maximum deliverable capacity between 65 and 5.8 bar among the hypothetical MOFs was 206 cm{sup 3}(STP)/cm{sup 3} at 298 K. We found that artificially increasing the methane-MOF interaction strength by increasing the Lennard-Jones ε parameters of the MOF atoms by 2- and 4-fold only improved the maximum deliverable capacity to 223 and 228 cm{sup 3}(STP)/cm{sup 3}, respectively. However, the effect on the amount stored at 65 bar was more significant, which suggested another strategy: raising the temperature of the system by 100 K can recover 70% of the methane stranded at the delivery pressure. By increasing the delivery temperature to 398 K, the ARPA-E target was reached by a few hypothetical MOFs with quadrupled ε values. This work shows the difficulty of reaching the ARPA-E target, but it also suggests that combining a material with a large volumetric density of sites that interact strongly with methane with an elevated delivery temperature can greatly improve the performance of nanoporous materials for methane storage and delivery. The optimal heat of adsorption in an isothermal storage and delivery scenario is approximately 10.5-14.5 kJ/mol, whereas in the nonisothermal storage and delivery scenario the optimal heats of adsorption fell within a range of 11.8-19.8 kJ/mol.
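    The deliverable capacity discussed above is simply the loading at the storage pressure minus the loading at the delivery pressure, and the abstract notes that heating at delivery recovers methane otherwise stranded at 5.8 bar. The sketch below illustrates both the isothermal and the heated-delivery cases using an assumed single-site Langmuir isotherm with hypothetical parameters; the paper itself relies on GCMC simulations and its own analytical equation.

        # Illustrative deliverable-capacity calculation using an assumed Langmuir isotherm.
        # Parameters (q_sat, k0, Qst) are hypothetical and chosen only to contrast the
        # isothermal and heated-delivery scenarios described in the abstract.
        import math

        R = 8.314  # J/(mol K)

        def langmuir_loading(p_bar, T_K, q_sat=260.0, k0=5e-4, Qst_kJ_mol=14.0):
            """Loading in cm3(STP)/cm3 for an assumed single-site Langmuir isotherm."""
            K = k0 * math.exp(Qst_kJ_mol * 1000.0 / (R * T_K))  # temperature-dependent affinity, 1/bar
            return q_sat * K * p_bar / (1.0 + K * p_bar)

        stored = langmuir_loading(65.0, 298.0)
        isothermal = stored - langmuir_loading(5.8, 298.0)   # delivery at 298 K
        heated = stored - langmuir_loading(5.8, 398.0)       # delivery after heating to 398 K

        print(f"stored at 65 bar: {stored:.0f} cm3(STP)/cm3")
        print(f"deliverable (isothermal, 298 K): {isothermal:.0f} cm3(STP)/cm3")
        print(f"deliverable (delivery at 398 K): {heated:.0f} cm3(STP)/cm3")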

  14. Structure and magnetic properties of a new anion-deficient perovskite Pb{sub 2}Ba{sub 2}BiFe{sub 4}ScO{sub 13} with crystallographic shear structure

    SciTech Connect (OSTI)

    Batuk, Maria; Tyablikov, Oleg A.; Tsirlin, Alexander A.; Kazakov, Sergey M.; Rozova, Marina G.; Pokholok, Konstantin V.; Filimonov, Dmitry S.; Antipov, Evgeny V.; Abakumov, Artem M.; Hadermann, Joke

    2013-09-01

    Graphical abstract: - Highlights: • Pb{sub 2}Ba{sub 2}BiFe{sub 4}ScO{sub 13} was obtained by solid-state synthesis. • Its structure was refined from a combination of XPD and TEM. • It is a new member of the perovskite-related homologous series A{sub n}B{sub n}O{sub 3n−2} with n = 5. • Pb{sub 2}Ba{sub 2}BiFe{sub 4}ScO{sub 13} is antiferromagnetically ordered below T{sub N} ≈ 350 K. - Abstract: Pb{sub 2}Ba{sub 2}BiFe{sub 4}ScO{sub 13}, a new n = 5 member of the oxygen-deficient perovskite-based A{sub n}B{sub n}O{sub 3n−2} homologous series, was synthesized using a solid-state method. The crystal structure of Pb{sub 2}Ba{sub 2}BiFe{sub 4}ScO{sub 13} was investigated by a combination of synchrotron X-ray powder diffraction, electron diffraction, high-angle annular dark-field scanning transmission electron microscopy and Mössbauer spectroscopy. At 900 K, it crystallizes in the Ammm space group with the unit cell parameters a = 5.8459(1) Å, b = 4.0426(1) Å, and c = 27.3435(1) Å. In the Pb{sub 2}Ba{sub 2}BiFe{sub 4}ScO{sub 13} structure, quasi-two-dimensional perovskite blocks are periodically interleaved with [1 1 0] (1̄01){sub p} crystallographic shear (CS) planes. At the CS planes, the corner-sharing FeO{sub 6} octahedra are transformed into chains of edge-sharing FeO{sub 5} distorted tetragonal pyramids. B-positions of the perovskite blocks between the CS planes are jointly occupied by Fe{sup 3+} and Sc{sup 3+}. The chains of the FeO{sub 5} pyramids and (Fe,Sc)O{sub 6} octahedra delimit six-sided tunnels that are occupied by double columns of cations with a lone electron pair (Pb{sup 2+}). The remaining A-cations (Bi{sup 3+}, Ba{sup 2+}) occupy positions in the perovskite block. According to the magnetic susceptibility measurements, Pb{sub 2}Ba{sub 2}BiFe{sub 4}ScO{sub 13} is antiferromagnetically ordered below T{sub N} ≈ 350 K.

  15. SMART (Sandia's Modular Architecture for Robotics and Teleoperation) Ver. 1.0

    Energy Science and Technology Software Center (OSTI)

    2009-12-15

    "SMART Ver. 0.8 Beta" provides a system developer with software tools to create a telerobotic control system, i.e., a system whereby an end-user can interact with mechatronic equipment. It consists of three main components: the SMART Editor (tsmed), the SMART Real-time kernel (rtos), and the SMART Supervisor (gui). The SMART Editor is a graphical icon-based code generation tool for creating end-user systems, given descriptions of SMART modules. The SMART real-time kernel implements behaviors that combinemore » modules representing input devices, sensors, constraints, filters, and robotic devices. Included with this software release is a number of core modules, which can be combined with additional project and device specific modules to create a telerobotic controller. The SMART Supervisor is a graphical front-end for running a SMART system. It is an optional component of the SMART Environment and utilizes the TeVTk windowing and scripting environment. Although the code contained within this release is complete, and can be utilized for defining, running, and interfacing to a sample end-user SMART system, most systems will include additional project and hardware specific modules developed either by the system developer or obtained independently from a SMART module developer. SMART is a software system designed to integrate the different robots, input devices, sensors and dynamic elements required for advanced modes of telerobotic control. "SMART Ver. 0.8 Beta" defines and implements a telerobotic controller. A telerobotic system consists of combinations of modules that implement behaviors. Each real-time module represents an input device, robot device, sensor, constraint, connection or filter. The underlying theory utilizes non-linear discretized multidimensional network elements to model each individual module, and guarantees that upon a valid connection, the resulting system will perform in a stable fashion. Different combinations of modules implement different behaviors. Each module must have at a minimum an initialization routine, a parameter adjustment routine, and an update routine. The SMART runtime kernel runs continuously within a real-time embedded system. Each module is first set-up by the kernel, initialized, and then updated at a fixed rate whenever it is in context. The kernel responds to operator directed commands by changing the state of the system, changing parameters on individual modules, and switching behavioral modes. The SMART Editor is a tool used to define, verify, configure and generate source code for a SMART control system. It uses icon representations of the modules, code patches from valid configurations of the modules, and configuration files describing how a module can be connected into a system to lead the end-user in through the steps needed to create a final system. The SMART Supervisor serves as an interface to a SMART run-time system. It provides an interface on a host computer that connects to the embedded system via TCPIIP ASCII commands. It utilizes a scripting language (Tel) and a graphics windowing environment (Tk). This system can either be customized to fit an end-user's needs or completely replaced as needed.« less

  16. xdamp An IDL-based data and image manipulation program

    Energy Science and Technology Software Center (OSTI)

    2002-06-26

    xdamp is a graphical user interface (GUI) designed to allow the user to manipulate two-dimensional waveforms (data vs. time) and images (usually digitized radiographic film or digital camera outputs) that are typical of electrical engineering applications. A typical single data set from these applications will generate ~100 time-dependent waveforms and possibly a few images. xdamp can manipulate waveforms both in time and in amplitude. Typical operations are: time shifting, truncating before or after a specific time, adding, multiplying, integrating, and averaging. When manipulating images, the spatial dimensions are maintained as important data. Standard electrical engineering quantities (maximum, minimum, full-width-at-half-maximum, rise-time, mean, standard deviation) are calculated for each waveform and automatically displayed. Annotation can be added to each waveform and/or image and the overall file so that the data contains full documentation. PostScript printing is supported. xdamp supports full audit trail information on each waveform. Data are saved using the Hierarchical Data Format (HDF) from the National Center for Supercomputing Applications. xdamp uses the Interactive Data Language (IDL) from Research Systems, Inc., a Xerox company, as the processing engine. The entire program is written in the IDL macro language to enhance portability. IDL is currently supported on the Macintosh, Alpha computers, Windows-based computers, and on virtually all UNIX platforms. Portability to all of these platforms has been verified. xdamp has a full internal language for creating macros useful for repetitive data reduction and analysis. Some advanced features include: the ability to compare waveforms in time and amplitude, the ability to generate high-frequency cable compensators, both integration and differentiation of waveforms, Fourier transforms of waveforms, and automatic execution of macros. Annotation can be added to each waveform and the overall file so that the data contains full documentation. Audit trails are maintained on each waveform. In addition to most of the waveform capabilities, xdamp can manipulate images in both space and amplitude. A variety of image processing capabilities are available such as mirroring around various axes, rotating through arbitrary angles, histogram equalization, Lee filtering, Roberts and Sobel edge enhancement algorithms and creating waveforms consisting of line-out profiles of images. Flexible printing to PostScript printers is supported as are both local and networked printers. Data are saved using the Hierarchical Data Format (HDF) from the National Center for Supercomputing Applications. This format is highly compressed, self-encoded, and easily transportable across the Internet. There is also the capability of writing ASCII data files suitable for use by other plotting programs. Automatic generation of spreadsheet-compatible files containing summary information for waveforms is also supported.
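    Two of the automatically displayed waveform quantities mentioned above, full-width-at-half-maximum and rise time, are illustrated below with a conventional threshold-crossing calculation on a sampled waveform. This is a generic sketch in Python, not xdamp's IDL implementation.

        # Generic illustration of two waveform metrics xdamp reports automatically
        # (full-width-at-half-maximum and 10-90% rise time); not xdamp's IDL code.
        import numpy as np

        def fwhm(t, y):
            """Width of the region where y is at or above half of its peak value."""
            half = y.max() / 2.0
            above = np.where(y >= half)[0]
            return t[above[-1]] - t[above[0]]

        def rise_time(t, y, lo=0.1, hi=0.9):
            """Time for the leading edge to go from lo*peak to hi*peak."""
            peak = y.max()
            t_lo = t[np.argmax(y >= lo * peak)]
            t_hi = t[np.argmax(y >= hi * peak)]
            return t_hi - t_lo

        t = np.linspace(0.0, 100e-9, 2001)            # 100 ns record
        y = np.exp(-((t - 40e-9) / 10e-9) ** 2)       # Gaussian test pulse
        print(fwhm(t, y), rise_time(t, y))            # ~16.7 ns and ~11.9 ns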

  17. Sandia Infrared Analysis Program

    Energy Science and Technology Software Center (OSTI)

    2004-05-11

    SandIR is a sophisticated Windows 2000/Windows XP program for the capture and analysis of thermal images in real time. It is a 32-bit, 5-thread C++ OOP application that rests on Microsoft’s MFC and DirectDraw libraries, the DT3152LS driver functions and the LabEngine link libraries of Origin 4.1 for full functionality. Images may be loaded in from saved files or viewed live by connection to a FLIR (Inframetrics) 600 or 760 IR camera or a video cassette recorder playing tapes recorded from a FLIR (Inframetrics) 600 or 760 IR camera. At this time, no other IR camera formats are supported. The raw radiosity data used by SandIR is derived from the 8-bit, 256-level, RS-170 (grayscale) NTSC camera signal. The FLIR camera images contain 175x131 pixels of real IR data. SandIR displays these data in a 604x410 image. The maximum matrix size is 640x452 including VIR and grayscale. Live IR images can be frozen and then stored to computer disk. An incrementing save command makes it easy to save a sequence of images with a series of related file names. These files can then be loaded into SandIR at a later time for analysis by a number of predefined tools or data probes. Multiple pseudo-color palettes containing 64 colors are available as well as a 256-level grayscale palette for image colorizing. SandIR always processes all the data in the ROI for each acquired image, so a complete temperature matrix is always available for any frozen image. SandIR performs nearly 7 million temperature calculations per second and updates the image display through DirectDraw over the PCI bus at frequencies of 30 Hz. 3-d surface plots, projections, wire maps or contour plots of absolute temperatures are also updated at 20 to 30 Hz, which approaches the real-time acquisition rate of the camera. These plots may be viewed full-screen or frozen in separate windows for comparison to later images. The full set of both 2-d and 3-d Origin plotting tools can be used to manipulate the attached plots. All plots can be manually rescaled. An Origin script window can be used to send text commands directly to the Origin graphics server. This allows any of the SandIR plots to be formatted using the full power and capabilities of Origin. A section tool is provided to dissect or cleave the surface plots. The options button in the Origin 3-d toolbar can be used to enhance the plotting resolution, which defaults to 24x16 in surveillance mode. Note that higher resolutions slow the plot update rate and are best used for frozen images. The program provides an extensive range of tools, which can be applied to static images loaded from disk as well as live images. Tools used on live images display and update in real time with temperature data overlaid on the image itself. Active spot and area tools may be selected for temporal plotting using acquisition time intervals in the range of 33 ms to 1 hour. Temporal data are plotted on a scrolling stripchart and can be recorded to an ASCII or Excel file.
Analysis tools include: Spot temperature tools to display the temperature at a set point in the image; Cursor probe to display the temperature under the tip of the mouse; Line tools to be used as graphic objects or rulers, or to display the temperature profile along the line in a separate xy graphic window; On-screen profile tools, which superimpose the display of a temperature profile over the image itself; Rectangle, ellipse and polygon tools to be used as graphic objects, or to display the maximum, minimum and mean temperatures within the region; Annotation tool to identify points in the image and make notations on the image; Selector tool used to move or delete one of the above analysis tools.
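    As an illustration of the spot and area statistics described above, the sketch below maps 8-bit radiosity counts onto temperatures and reports region-of-interest statistics. The linear counts-to-temperature mapping and the frame contents are assumptions for demonstration; SandIR's actual calibration is camera- and range-specific.

        # Illustration of ROI temperature statistics on a frame derived from 8-bit
        # radiosity data. The linear counts-to-temperature mapping and the synthetic
        # frame are assumptions; SandIR's calibration depends on the camera settings.
        import numpy as np

        def counts_to_temperature(counts, t_min_C=20.0, t_max_C=120.0):
            """Map 8-bit (0-255) radiosity counts onto an assumed linear temperature span."""
            return t_min_C + (counts.astype(float) / 255.0) * (t_max_C - t_min_C)

        rng = np.random.default_rng(1)
        frame_counts = rng.integers(0, 256, size=(131, 175), dtype=np.uint8)  # 175x131 IR pixels per frame
        temps = counts_to_temperature(frame_counts)

        roi = temps[40:80, 60:120]                       # rectangle tool: statistics within a region
        print(f"ROI max {roi.max():.1f} C, min {roi.min():.1f} C, mean {roi.mean():.1f} C")
        print(f"spot (65, 90): {temps[65, 90]:.1f} C")   # spot tool: temperature at one pixel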