Sample records for ASCII comma-delimited CSV

  1. How do I display the Map of Wind Farms csv coordinates in ArcMap...

    Open Energy Info (EERE)

    display the Map of Wind Farms csv coordinates in ArcMap software? I downloaded the Map of Wind Farms data as a .csv from http://en.openei.org/wiki...
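
    The record above asks how to put CSV point coordinates on a map. Outside of ArcMap, a minimal Python sketch of the same idea (the file and column names are assumptions, not taken from the OpenEI download) could look like this:

      # Plot point coordinates from a wind-farm CSV; names below are hypothetical.
      import pandas as pd
      import matplotlib.pyplot as plt

      df = pd.read_csv("map_of_wind_farms.csv")          # assumed file name
      df.plot.scatter(x="Longitude", y="Latitude", s=5)  # assumed column names
      plt.title("Wind farm locations")
      plt.show()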

  2. How do I display the Map of Wind Farms csv coordinates in ArcMap...

    Open Energy Info (EERE)

    do I display the Map of Wind Farms csv coordinates in ArcMap software? I downloaded the Map of Wind Farms data as a .csv from http://en.openei.org/wiki...

  3. Notes on the PMC Journal List CSV file

    E-Print Network [OSTI]

    Levin, Judith G.

    Notes on the PMC Journal List CSV file. This CSV file, available from the Journal List page on the PMC site, http://www.ncbi.nlm.nih.gov/pmc/journals/, is a list of journals that currently deposit in PMC. It includes journals that are no longer in publication or no longer deposit material in PMC.

  4. CSV File Documentation: Consumption

    Gasoline and Diesel Fuel Update (EIA)

    CSV file documentation for consumption data from the State Energy Data System (SEDS).

  5. SEDS CSV File Documentation: Price and Expenditure

    Gasoline and Diesel Fuel Update (EIA)

    CSV file documentation for price and expenditure data from the State Energy Data System (SEDS).

  6. ascii text documents: Topics by E-print Network

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    users, notably children. Additionally, for adults, some content included in abnormal porn sites can harm ordinary people's mental health. In this paper, we propose an...

  7. Gridded Population of World and Global Rural-Urban Mapping Project Data

    E-Print Network [OSTI]

    Title: Gridded Population of World and Global Rural-Urban Mapping Project Data. Creator / Copyright: ... Data Format: BIL, ASCII, Grid, Shapefile, CSV, XLS, E00. Datum / Map Projection: N/A. Resolution: ... Center for International Earth Science Information Network (CIESIN). "Gridded Population of World and Global Rural-Urban Mapping Project...

  8. Constitutive model effects on finite element modeling of elastomer behavior in radial interference seal configurations

    E-Print Network [OSTI]

    Jackson, Jason R.

    1996-01-01

    described in the ASTM procedure 412. The specimens were then loaded into an Instron Model 4202 with the series IX Version 4.02 software. Testing was done on an Instron 200 pound load cell. The two data files from Federal Mogul Corporation contained... stress values. The two columns containing the stress and strain data points were saved to a text file in a comma delimited format for subsequent insertion into the ABAQUS input file. Two options are available in ABAQUS for specifying the elastomer...
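
    The snippet above describes saving two columns of stress and strain data to a comma-delimited text file for later insertion into an ABAQUS input file. A minimal Python sketch of that step (the values and file name are placeholders, not data from the thesis):

      # Write paired stress/strain points to a comma-delimited text file.
      import csv

      stress = [0.0, 0.35, 0.72, 1.10]   # hypothetical stress values
      strain = [0.0, 0.05, 0.10, 0.15]   # hypothetical nominal strains

      with open("uniaxial_test_data.csv", "w", newline="") as f:
          writer = csv.writer(f)
          for s, e in zip(stress, strain):
              writer.writerow([s, e])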

  9. A representation of changing heading direction in human cortical areas pVIP and CSv

    E-Print Network [OSTI]

    Royal Holloway, University of London

    Running title: Changing heading direction in the human brain. Keywords: egomotion. A representation of changing heading direction in human cortical... in the environment, we continually change direction. Much work has examined how the brain...

  10. How do I display the Map of Wind Farms csv coordinates in ArcMap software?

    Open Energy Info (EERE)


  11. File: NPPFIELD.DOC; File: NPPFIELD.PRO ASCII

    E-Print Network [OSTI]

    Protocol for Reading NPP. Compiled by Cathie Sandell; edited by John Anderson, Cathie Sandell, Nancy Stotz. We read the NPP sites in spring (May), when shrubs and spring annuals have reached peak biomass. A regression of biomass against volume is constructed by harvesting plants near but outside the grid...

  12. mpimemu

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    mpimemu-mkstats -i mpimemu-username-01112013 creates two files, 01112013-node-mem-usage-xxx.csv and 01112013-proc-mem-usage-xxx.csv. For Trinity and NERSC-8, the value ((Post-Init...
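
    Assuming the two CSV files named above follow an ordinary header-plus-rows layout, a hedged Python sketch for summarizing one of them might look like this (the "MemUsed" column name is an assumption; check the actual header first):

      # Summarize the per-node memory-usage CSV written by mpimemu-mkstats.
      import csv

      with open("01112013-node-mem-usage-xxx.csv") as f:
          rows = list(csv.DictReader(f))

      usage = [float(r["MemUsed"]) for r in rows]   # assumed column name
      print(len(usage), "samples, peak =", max(usage))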

  13. Petroleum Supply Annual 2005, Volume 1

    Annual Energy Outlook 2013 [U.S. Energy Information Administration (EIA)]

    Petroleum Products by PAD and Refining Districts PDF CSV XLS 19 Motor Gasoline Terminal Blenders Net Input and Net Production PDF CSV XLS 20 Refinery Stocks of Crude Oil and...

  14. US Army Corps of Engineers BUILDING STRONG

    E-Print Network [OSTI]

    US Army Corps of Engineers

    Introduction to working with bathymetric datasets. Importing datasets (xyz, points, shapefiles, other ascii...

  15. active network analysis: Topics by E-print Network

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Summary: :www.iai.uni-bonn.dejzgenemodules2.csv 3. Perform a hierarchical cluster analysis and plot the dendrograms. What is the number these correlation diagrams? Are...

  16. ATF Video Frame Grabber Subsystems - Frequently Asked Questions

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    in the captured data files from a comma to something else? (A2) Most statistics, graphics, and mathematical analysis tools support reading data in comma separated value (CSV)...
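
    The FAQ entry concerns changing the delimiter in captured data files from a comma to something else. As a generic illustration (the file names are placeholders, not part of the ATF software), a comma-separated file can be rewritten with another delimiter in a few lines of Python:

      # Rewrite a comma-separated capture file using a tab delimiter instead.
      import csv

      with open("capture.csv", newline="") as src, \
           open("capture.tsv", "w", newline="") as dst:
          writer = csv.writer(dst, delimiter="\t")
          for row in csv.reader(src):
              writer.writerow(row)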

  17. Alexander Hinneburg, Aufgabe 5.1.1 (Exercise 5.1.1)

    E-Print Network [OSTI]

    Hinneburg, Alexander

    Exercise 5, Alexander Hinneburg. Exercise 5.1.1 # the files ALL_AML_gcol.test.join.csv and ALL_AML_gcol.train.join.csv from the # last exercise are in the current directory, as is transpose.awk rm all_t.txt aml_t.txt gene_names.txt all_s2n.txt aml_s2n.txt cat ALL_AML_gcol.train.join.csv | gawk -f s2n_t.awk # all_t.txt all_t.txt all_s...

  18. Auswahl der Prüfbereiche (selection of test areas): Topics by E-print Network

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    react. Suggested solution: ...ex or (A)SCII > "); eingabe getchar(); printf(" ... Hoffmann, Rolf. References (selection): Antoine, Annette; v. Boetticher, Annette: Leibniz quotations...

  19. Rainfall Manipulation Plot Study (RaMPS)

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Blair, John (Kansas State University); Fay, Phillip (USDA-ARS); Knapp, Alan (Colorado State University); Collins, Scott (University of New Mexico); Smith, Melinda (Yale University)

    Data sets are available as ASCII files, in Excel spreadsheets, and in SAS format. (Taken from http://www.konza.ksu.edu/ramps/backgrnd.html)

  20. Strategies and Technologies for Improving Air Quality Around Ports

    E-Print Network [OSTI]

    Khan, Mohammad Yusuf

    2013-01-01

    Particle Counter; CS, Cyclone Separator; CSV, Comma Separated... tunnel after a 2.5 µm cyclone separator, collected on filter... and without a 2.5 µm cyclone separator. This measurement...

  1. SciTech Connect:

    Office of Scientific and Technical Information (OSTI)


  2. http://www.dkd.de d dkdesign

    E-Print Network [OSTI]

    Bongartz, Klaus

    based at its core on Apache Lucene. REST-like HTTP interface; processes XML, JSON, CSV and... can be used for indexing external web pages. EXT:solr provides an API that a custom-developed...

  3. BIOL 471.02 Comp. Methods in Systematics Spring 2009

    E-Print Network [OSTI]

    Hardy, Christopher R.

    and view data. 1) Open DIVA-GIS. 2) Add data layers (here, shapefiles, .shp) for the following: North... the .csv file you used for the MAXENT procedure. 4) Open this file using DATA > Import points to shapefile. Double...
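
    The lab imports CSV points into a shapefile inside DIVA-GIS. As a rough stand-in for that step outside DIVA-GIS (file and column names are assumptions), the same conversion can be sketched with geopandas:

      # Convert a CSV of point records to a shapefile.
      import pandas as pd
      import geopandas as gpd

      df = pd.read_csv("maxent_points.csv")                    # hypothetical file
      gdf = gpd.GeoDataFrame(
          df,
          geometry=gpd.points_from_xy(df["lon"], df["lat"]),   # assumed columns
          crs="EPSG:4326",
      )
      gdf.to_file("maxent_points.shp")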

  4. ASSOCIATIONS BETWEEN CLINICAL AND LABORATORY PARAMETERS IN PNEUMOCOCCAL PNEUMONIA FROM THE UNIVERSITY OF KANSAS HOSPITAL DURING THE YEARS 1996 TO 2005

    E-Print Network [OSTI]

    Reed, Natalie Ann

    2011-08-31

    program network, Streptococcus pneumoniae, 2005. Croucher et al. Rapid pneumococcal evolution in response to clinical interventions. Science. 2011;331:430-434. Esperatti M, Ferrer M, Theessen A, Liapikou A, Valencia M, Saucedo LM, Zavala E, Welte T... Thesis.csv" DBMS=CSV REPLACE; GETNAMES=YES; DATAROW=2; RUN; data nat; length locacq2 $15 Location_of_Spn_Aquisition $4 pms $3.; format Location_of_Spn_Aquisition $4.; set nat; where collection_date ^= . and box...

  5. Estensioni MIME (MIME Extensions): As mentioned, the body of mail messages

    E-Print Network [OSTI]

    Prencipe, Giuseppe

    MIME extensions! As mentioned, the body of mail messages must be in 7-bit ASCII format! Answer: you have to encode! So, if for example you want to send a JPG image... The mail client...

  6. Computer News, Volume 30

    E-Print Network [OSTI]

    MATH DEPT Computer News, Volume 30. How to create an ASCII version of the Purdue Logo and other matters of e-mail etiquette. The Purdue logo... ... ah, yes ...

  7. COMPUTER APPLICATIONS IN THE GEOSCIENCES In this lab, you will learn how to import topography data, display it in 3-D, and add several

    E-Print Network [OSTI]

    Smith-Konter, Bridget

    to import topography data, display it in 3-D, and add several layers of other important... for your final project.) Fledermaus is a powerful, interactive 3-D... data, vertical imagery, ASCII points and lines, Electronic Nautical Charts (ENCs), 3-D...

  8. SOFTWARE CATALOG

    E-Print Network [OSTI]

    Beyers, Evelyn

    2013-01-01

    ASCII EBCDIC Computer CDC VAX IBM (Other) Programs Desired... EBCDIC Density Computer CDC VAX IBM (Other) Programs Desired... ASSMBLY/CDC RANNUM/ASSMBLY/VAX CFS/DOC CFS/SOURCE VELCOR/...

  9. JOURNAL DE PHYSIQUE CoZZoque 68, suppZ6rnent au n012, Tome 42, de'cembre 1981 page C8-233

    E-Print Network [OSTI]

    Paris-Sud XI, Université de

    change in this estimate. In addition, comparisons with the cesium standard of the PTB, CsI, made during... Prior to 1970, when CsV, the present NRC primary clock, was designed, laboratory cesium standards were... of contamination of the hot wire detector to be used for the new beam direction. The time required for the Cs...

  10. 12/11/13 EURES WS -Job vacancydetails https://ec.europa.eu/eures/eures-searchengine/servlet/ShowJvServlet?lg=IT&pesId=40&uniqueJvId=52593328&nnImport=false 1/2

    E-Print Network [OSTI]

    Malerba, Donato

    in software validation of ERP systems (SAP) is a big plus. Support and consulting during projects handling Gx... the QA part in the sub-projects in an overall remediation project for GxP systems. Having experience... Reference 078-064. Job Description: Working in a remediation project as CSV Validation Engineer. Represent...

  11. Statistics 36-315: Statistical Graphics and Visualization Lab 2: Histograms and Alternatives

    E-Print Network [OSTI]

    Fienberg, Stephen E.

    to My Documents: File -> Change dir... 5. Load the special functions for this lab by typing... Windows -> Tile. Load the data. 7. Read the csv file into an R data frame by typing into the command... the handout on the density.curve function, plot a density curve for x. You do not need to create a new window...
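
    The lab's step 7 reads a csv file into an R data frame and later plots a density curve. For comparison only, the analogous steps in Python (file and column names are placeholders, not the lab's actual data) are:

      # Read a csv file into a data frame and plot a density curve for one column.
      import pandas as pd
      import matplotlib.pyplot as plt

      df = pd.read_csv("lab2_data.csv")    # hypothetical file name
      df["x"].plot(kind="density")         # density curve for an assumed column "x"
      plt.show()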

  12. Chemical Safety Vulnerability Working Group report. Volume 2

    SciTech Connect (OSTI)

    Not Available

    1994-09-01

    The Chemical Safety Vulnerability (CSV) Working Group was established to identify adverse conditions involving hazardous chemicals at DOE facilities that might result in fires or explosions, release of hazardous chemicals to the environment, or exposure of workers or the public to chemicals. A CSV Review was conducted in 148 facilities at 29 sites. Eight generic vulnerabilities were documented related to: abandoned chemicals and chemical residuals; past chemical spills and ground releases; characterization of legacy chemicals and wastes; disposition of legacy chemicals; storage facilities and conditions; condition of facilities and support systems; unanalyzed and unaddressed hazards; and inventory control and tracking. Weaknesses in five programmatic areas were also identified related to: management commitment and planning; chemical safety management programs; aging facilities that continue to operate; nonoperating facilities awaiting deactivation; and resource allocations. Volume 2 consists of seven appendices containing the following: Tasking memorandums; Project plan for the CSV Review; Field verification guide for the CSV Review; Field verification report, Lawrence Livermore National Lab.; Field verification report, Oak Ridge Reservation; Field verification report, Savannah River Site; and the Field verification report, Hanford Site.

  13. The Online Laboratory Taking Experimental Social Science

    E-Print Network [OSTI]

    Rand, David G.

    on websites, surveys... Base rate + possible bonus payment. How does... file with survey file, calculate bonus; upload csv with approve/reject & bonus info. How much does... .40 flat rate plus bonus of up to $1.00 for ~5 minute task.

  14. Discrete Mathematics and Theoretical Computer Science 6, 2004, 253264 On Cheating Immune Secret Sharing

    E-Print Network [OSTI]

    Boyer, Edmond

    the valid secret (see Car95; CSV93; RBO89), discouraging cheaters from sending invalid shares... Josef Pieprzyk and Xian-Mo Zhang, Centre for Advanced Computing - Algorithms and Cryptography, xianmo@ics.mq.edu.au; received Nov 14, 2003, accepted Mar 23, 2004. The paper addresses cheating prevention in secret sharing...

  15. Page 1 of 3 OUA results procedures

    E-Print Network [OSTI]

    Offices. Review Date: June 2014. Background: As a result of the creation of Badged and CSP courses, there has... they are: Generic, Badged, CSP (standard) and CSP (Not For Degree [NFD]). In addition... according to internal procedure, then hold a BOE and ratify results. CSP Standard: FLS create a .csv version...

  16. Day 2 MTH 490.433 Dying Flies

    E-Print Network [OSTI]

    [-0.1 i], {i, 0, 32}, PlotRange -> {0, 1}]; [plot axes omitted] Note: for more analysis of this data, download this data set at http://www.msu.edu/~brassilc/ELME/flies.csv. Day 0 1 2 3 4 5 6 7 8 9 10 11 12...

  17. Quick Start The various sample data files after expansion (use Zip)

    E-Print Network [OSTI]

    library (49 signature files and 1 library list file, all in ASCII, 300 KB). Duncan Knob.sdf Lidar full wave form SDF file (60 MB). Duncan Knob.idx Required index file for Duncan Knob.sdf (4.5 MB). sbet_mission 1.out Smoothed Best Estimate of Trajectory file. Needed for Duncan Knob.sdf (98 MB). Immediate

  18. Inventory Routing and On-line Inventory Routing File Format M. Sevaux1,2

    E-Print Network [OSTI]

    Brest, Université de

    Inventory Routing and On-line Inventory Routing File Format. M. Sevaux, M. J. Geiger (Helmut...)... needs in the Inventory Routing Problem types. Instead of creating a new file format or putting ASCII... is an extension of the TSPLIB file format description proposed in [1] to be used for the Inventory Routing Problem...

  19. The "FISH" Quad Hand Sensor Physics and Media Group

    E-Print Network [OSTI]

    The "FISH" Quad Hand Sensor. Physics and Media Group, MIT Media Laboratory, 20 Ames Street, E15. TABLE OF CONTENTS: 1. ASCII SERIAL FISH PROTOCOL 2. HOW TO MAKE FISH ANTENNA 3. CALIBRATION SOFTWARE INSTALLATION 4. HOW TO CALIBRATE A FISH 5. COMPONENT PLACEMENT 6. SCHEMATICS 7. PARTS LIST. HOW...

  20. communications (physically

    E-Print Network [OSTI]

    Fulp, Errin W.

    Start/End Character Framing: use special characters to delineate frames. Each frame starts with the ASCII sequence DLE (Data Link Escape) STX (Start of TeXt). Each frame ends with the sequence DLE ETX (End of TeXt). Example: DLE STX P L U F L I V E S DLE ETX. If the destination loses track of the boundary, it just needs to look...

  1. Circumsolar Radiation Data: The Lawrence Berkeley Laboratory Reduced Data Base

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Note that each data set is composed of 20 lines of information, with each line consisting of 77 characters. These are archived ASCII files. [Information on sites, number of data sets, etc. taken from the online publication (out of print) at http://rredc.nrel.gov/solar/pubs/circumsolar/index.html]
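
    Given the stated layout (each data set is 20 lines of 77 characters), one way to walk such an archived ASCII file is to chunk it into 20-line records; the field layout within a record is not documented here, so the sketch keeps records as raw lines and the file name is a placeholder:

      # Split an archived circumsolar ASCII file into its 20-line data sets.
      with open("circumsolar_archive.txt") as f:
          lines = [ln.rstrip("\n") for ln in f]

      records = [lines[i:i + 20] for i in range(0, len(lines), 20)]
      print(len(records), "data sets read")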

  2. PaleoMac: A Macintosh application for treating

    E-Print Network [OSTI]

    Cogne, Jean-Pascal

    PaleoMac: A Macintosh application for treating paleomagnetic data and making plate... as ASCII files. Beyond its usefulness in treating paleomagnetic data, its ability to handle plate motion... "PaleoMac: A Macintosh application for treating paleomagnetic data and making plate reconstructions," Geochem. Geophys. Geosyst., 4...

  3. An Introduction Phil Spector

    E-Print Network [OSTI]

    Spector, Phil

    at the command line. Output is displayed unless the input line is terminated by a semi-colon (;). If you forget... num.txt contains 20 lines, each with 10 space-delimited values. Then the command load -ascii num... Accessing Matlab: By default, matlab will open a console with a file browser and command history window along...
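
    The notes load a 20 x 10 space-delimited file with "load -ascii num" in Matlab. The equivalent read in Python, for comparison, is a one-liner with NumPy (the file name follows the notes; the shape check is just a sanity test):

      # Load a space-delimited numeric file, as "load -ascii num" would in Matlab.
      import numpy as np

      num = np.loadtxt("num.txt")   # whitespace-delimited by default
      print(num.shape)              # expected (20, 10) per the description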

  4. Library & Information Science Research (LISR) Reviews Instructions to Reviewers

    E-Print Network [OSTI]

    White, Marilyn Domas

    Example: Swisher, Robert, & McClure, Charles R. Research for decision making: Methods for librarians. Chicago, IL: American Library Association. Chapter in a book: ... no longer than 65 characters) or on disk, in plain ASCII text or Word/WordPerfect format. E-mail certainly...

  5. Occupational radiation exposure at commercial nuclear power reactors and other facilities 1994. Twenty-seventh annual report

    SciTech Connect (OSTI)

    Thomas, M.L.; Hagemeyer, D. [Science Applications International Corporation, Oak Ridge, TN (United States)

    1996-01-01

    This report summarizes the occupational exposure data that are maintained in the U.S. Nuclear Regulatory Commission's (NRC) Radiation Exposure Information and Reporting System (REIRS). Annual reports for 1994 were received from a total of 303 NRC licensees, of which 109 were operators of nuclear power reactors in commercial operation. Compilations of the reports submitted by the 303 licensees indicated that 152,028 individuals were monitored, 79,780 of whom received a measurable dose. The collective dose incurred by these individuals was 24,740 person-cSv (person-rem){sup 2} which represents a 15% decrease from the 1993 value. The number of workers receiving a measurable dose also decreased, resulting in the average measurable dose of 0.31 cSv (rem) for 1994. The average measurable dose is defined to be the total collective dose (TEDE) divided by the number of workers receiving a measurable dose. These figures have been adjusted to account for transient reactor workers. In 1994, the annual collective dose per reactor for light water reactor licensees (LWRs) was 198 person-cSv (person-rem). This represents an 18% decrease from the 1993 value of 242 person-cSv (person-rem). The annual collective dose per reactor for boiling water reactors (BWRs) was 327 person-cSv (person-rem) and, for pressurized water reactors (PWRs), it was 131 person-cSv (person-rem). Analyses of transient worker data indicate that 18,178 individuals completed work assignments at two or more licensees during the monitoring year. The dose distributions are adjusted each year to account for the duplicate reporting of transient workers by multiple licensees. In 1994, the average measurable dose calculated from reported data was 0.28 cSv (rem). The corrected dose distribution resulted in an average measurable dose of 0.31 cSv (rem).

  6. Variable Definitions

    Gasoline and Diesel Fuel Update (EIA)


  7. DOE Facility Database - Datasets - OpenEI Datasets

    Open Energy Info (EERE)

    DOE Facility Database: Data and Resources (CSV preview).

  8. DOE Report Describes Progress in the Deployment of Synchrophasor

    Open Energy Info (EERE)


  9. Geographical Distribution of Biomass Carbon in Tropical Southeast Asian Forests: A Database

    SciTech Connect (OSTI)

    Brown, S

    2001-05-22

    A database was generated of estimates of geographically referenced carbon densities of forest vegetation in tropical Southeast Asia for 1980. A geographic information system (GIS) was used to incorporate spatial databases of climatic, edaphic, and geomorphological indices and vegetation to estimate potential (i.e., in the absence of human intervention and natural disturbance) carbon densities of forests. The resulting map was then modified to estimate actual 1980 carbon density as a function of population density and climatic zone. The database covers the following 13 countries: Bangladesh, Brunei, Cambodia (Campuchea), India, Indonesia, Laos, Malaysia, Myanmar (Burma), Nepal, the Philippines, Sri Lanka, Thailand, and Vietnam. The data sets within this database are provided in three file formats: ARC/INFOTM exported integer grids, ASCII (American Standard Code for Information Interchange) files formatted for raster-based GIS software packages, and generic ASCII files with x, y coordinates for use with non-GIS software packages. This database includes ten ARC/INFO exported integer grid files (five with the pixel size 3.75 km x 3.75 km and five with the pixel size 0.25 degree longitude x 0.25 degree latitude) and 27 ASCII files. The first ASCII file contains the documentation associated with this database. Twenty-four of the ASCII files were generated by means of the ARC/INFO GRIDASCII command and can be used by most raster-based GIS software packages. The 24 files can be subdivided into two groups of 12 files each. These files contain real data values representing actual carbon and potential carbon density in Mg C/ha (1 megagram = 10{sup 6} grams) and integer-coded values for country name, Weck's Climatic Index, ecofloristic zone, elevation, forest or non-forest designation, population density, mean annual precipitation, slope, soil texture, and vegetation classification. One set of 12 files contains these data at a spatial resolution of 3.75 km, whereas the other set of 12 files has a spatial resolution of 0.25 degree. The remaining two ASCII data files combine all of the data from the 24 ASCII data files into 2 single generic data files. The first file has a spatial resolution of 3.75 km, and the second has a resolution of 0.25 degree. Both files also provide a grid-cell identification number and the longitude and latitude of the center-point of each grid cell. The 3.75-km data in this numeric data package yield an actual total carbon estimate of 42.1 Pg (1 petagram = 10{sup 15} grams) and a potential carbon estimate of 73.6 Pg; whereas the 0.25-degree data produced an actual total carbon estimate of 41.8 Pg and a total potential carbon estimate of 73.9 Pg. Fortran and SAS{trademark} access codes are provided to read the ASCII data files, and ARC/INFO and ARCVIEW command syntax are provided to import the ARC/INFO exported integer grid files. The data files and this documentation are available without charge on a variety of media and via the Internet from the Carbon Dioxide Information Analysis Center (CDIAC).

  10. Geographical Distribution of Biomass Carbon in Tropical Southeast Asian Forests: A Database

    SciTech Connect (OSTI)

    Brown, S.

    2002-02-07

    A database was generated of estimates of geographically referenced carbon densities of forest vegetation in tropical Southeast Asia for 1980. A geographic information system (GIS) was used to incorporate spatial databases of climatic, edaphic, and geomorphological indices and vegetation to estimate potential (i.e., in the absence of human intervention and natural disturbance) carbon densities of forests. The resulting map was then modified to estimate actual 1980 carbon density as a function of population density and climatic zone. The database covers the following 13 countries: Bangladesh, Brunei, Cambodia (Campuchea), India, Indonesia, Laos, Malaysia, Myanmar (Burma), Nepal, the Philippines, Sri Lanka, Thailand, and Vietnam. The data sets within this database are provided in three file formats: ARC/INFO{trademark} exported integer grids, ASCII (American Standard Code for Information Interchange) files formatted for raster-based GIS software packages, and generic ASCII files with x, y coordinates for use with non-GIS software packages. This database includes ten ARC/INFO exported integer grid files (five with the pixel size 3.75 km x 3.75 km and five with the pixel size 0.25 degree longitude x 0.25 degree latitude) and 27 ASCII files. The first ASCII file contains the documentation associated with this database. Twenty-four of the ASCII files were generated by means of the ARC/INFO GRIDASCII command and can be used by most raster-based GIS software packages. The 24 files can be subdivided into two groups of 12 files each. These files contain real data values representing actual carbon and potential carbon density in Mg C/ha (1 megagram = 10{sup 6} grams) and integer-coded values for country name, Weck's Climatic Index, ecofloristic zone, elevation, forest or non-forest designation, population density, mean annual precipitation, slope, soil texture, and vegetation classification. One set of 12 files contains these data at a spatial resolution of 3.75 km, whereas the other set of 12 files has a spatial resolution of 0.25 degree. The remaining two ASCII data files combine all of the data from the 24 ASCII data files into 2 single generic data files. The first file has a spatial resolution of 3.75 km, and the second has a resolution of 0.25 degree. Both files also provide a grid-cell identification number and the longitude and latitude of the centerpoint of each grid cell. The 3.75-km data in this numeric data package yield an actual total carbon estimate of 42.1 Pg (1 petagram = 10{sup 15} grams) and a potential carbon estimate of 73.6 Pg; whereas the 0.25-degree data produced an actual total carbon estimate of 41.8 Pg and a total potential carbon estimate of 73.9 Pg. Fortran and SAS{trademark} access codes are provided to read the ASCII data files, and ARC/INFO and ARCVIEW command syntax are provided to import the ARC/INFO exported integer grid files. The data files and this documentation are available without charge on a variety of media and via the Internet from the Carbon Dioxide Information Analysis Center (CDIAC).
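
    The generic (non-GIS) ASCII files described above carry a grid-cell identification number, longitude, latitude, and the data values for each cell. A hedged Python sketch for reading one of them, assuming whitespace-delimited columns in that order (the file name and exact layout are assumptions to be checked against the database documentation):

      # Read one of the generic ASCII files: id, lon, lat, then data columns (assumed order).
      import numpy as np

      data = np.loadtxt("biomass_carbon_generic_0_25deg.txt")   # hypothetical name
      cell_id, lon, lat = data[:, 0], data[:, 1], data[:, 2]
      values = data[:, 3:]
      print(len(cell_id), "grid cells,", values.shape[1], "attribute columns")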

  11. Data Link Layer, Part 1 CSC 343643

    E-Print Network [OSTI]

    Fulp, Errin W.

    starts with the ASCII sequence DLE (Data Link Escape) STX (Start of TeXt). Each frame ends with the sequence DLE ETX (End of TeXt). Example: DLE STX P L U F L I V E S DLE ETX. If the destination loses track... in the actual data stream (consider binary data). One solution is character stuffing: insert an extra DLE...
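
    The lecture fragments above describe DLE STX / DLE ETX framing with character stuffing. A minimal, purely illustrative Python sketch of that scheme (not tied to any particular course code) doubles any DLE byte inside the payload so the receiver cannot mistake it for a frame boundary:

      # DLE STX ... DLE ETX framing with character stuffing.
      DLE, STX, ETX = b"\x10", b"\x02", b"\x03"

      def frame(payload: bytes) -> bytes:
          stuffed = payload.replace(DLE, DLE + DLE)   # character stuffing
          return DLE + STX + stuffed + DLE + ETX

      def unframe(data: bytes) -> bytes:
          body = data[2:-2]                           # strip DLE STX ... DLE ETX
          return body.replace(DLE + DLE, DLE)         # undo the stuffing

      print(frame(b"PLUFLIVES"))                      # payload from the example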

  12. Robotics research projects report

    SciTech Connect (OSTI)

    Hsia, T.C. (ed.)

    1983-06-01

    The research results of the Robotics Research Laboratory are summarized. Areas of research include robotic control, a stand-alone vision system for industrial robots, and sensors other than vision that would be useful for image ranging, including ultrasonic and infra-red devices. One particular project involves RHINO, a 6-axis robotic arm that can be manipulated by serial transmission of ASCII command strings to its interfaced controller. (LEW)

  13. Metering and Calibration in LoanSTAR Buildings

    E-Print Network [OSTI]

    O'Neal, D. L.; Bryant, J. A.; Turner, W. D.; Glass, M. G.

    1990-01-01

    installations (such as schools and local government buildings) to "calibrate" the simpler levels (i.e., daily or monthly manual watt-hour readings) of monitoring for different building types in Texas. The feasibility of using an agency's existing EMCS... to gather some or all of the required data is also being explored. Some vendors' EMCS allow for data collection, writing data to ASCII files, and remote interrogation. The EMCS would potentially offer a significant cost reduction over the installation...

  14. Extensible Software Architecture for a Distributed Engineering Simulation Facility

    E-Print Network [OSTI]

    May, James F

    2013-03-18

    DEDICATION: To Anne, Jim, Liz, Cali, and Scout. NOMENCLATURE: A/P Autopilot; API Application Programming Interface; ASCII American Standard Code for Information Interchange; CFG Configuration; COTS Commercial Off-The-Shelf; EFS Engineering Flight Simulator..., although a custom API to the HLA could be written to ease federate program integration, the HLA standards are excessive compared to the requirements of a typical university research laboratory and Vehicle Systems & Control Laboratory simulations [23...

  15. DCHAIN

    SciTech Connect (OSTI)

    East, L.V. (EG and G Idaho, Inc., Idaho Falls, ID (United States))

    1994-04-01

    DCHAIN calculates the time-dependent daughter populations in radioactive decay and nuclear reaction chains. Chain members can have non-zero initial populations and be produced from the preceding chain member as the result of radioactive decay, a nuclear reaction, or both. Parent-daughter equilibrium times and relative activities at equilibrium can also be calculated. Program input can be supplied interactively or read from ASCII files.

  16. DCHAIN V1.3. Radioactive Decay and Reaction Chain Calculations

    SciTech Connect (OSTI)

    East, L.V. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

    1994-04-01

    DCHAIN calculates the time-dependent daughter populations in radioactive decay and nuclear reaction chains. Chain members can have non-zero initial populations and be produced from the preceding chain member as the result of radioactive decay, a nuclear reaction, or both. Parent-daughter equilibrium times and relative activities at equilibrium can also be calculated. Program input can be supplied interactively or read from ASCII files.
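
    As a worked illustration of the quantity DCHAIN computes (this is not DCHAIN's own code), the two-member Bateman solution gives the time-dependent daughter population for a simple parent-to-daughter chain; the half-lives and initial population below are arbitrary example numbers, not DCHAIN input:

      # Two-member decay chain: daughter population N2(t) from the Bateman solution.
      import math

      def daughter_population(t, n1_0, lam1, lam2, n2_0=0.0):
          grow = lam1 / (lam2 - lam1) * n1_0 * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
          return grow + n2_0 * math.exp(-lam2 * t)

      lam1 = math.log(2) / 10.0   # parent half-life: 10 time units
      lam2 = math.log(2) / 2.0    # daughter half-life: 2 time units
      for t in (0, 5, 10, 20):
          print(t, round(daughter_population(t, 1.0e6, lam1, lam2), 1))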

  17. pSET2TC6: A Translation Tool to Standardize the Output Format

    E-Print Network [OSTI]

    pSET2TC6 ({ldalove,jbyoo}@konkuk.ac.kr): a translation tool connecting pSET (POSAFE-Q Software Engineering Tool) ASCII `*.ld' files and PLCopen TC6 (Technical Committee 6) XML [8]; used with PLCVerifier. 2011, Vol. 38, No. 2(B).

  18. GSOD Based Daily Global Mean Surface Temperature and Mean Sea Level Air Pressure (1982-2011)

    SciTech Connect (OSTI)

    Xuan Shi, Dali Wang

    2014-05-05

    This data product contains all the gridded data sets at 1/4 degree resolution in ASCII format. Both mean temperature and mean sea level air pressure data are available. It also contains the GSOD data (1982-2011) from the NOAA site, including station number, location, temperature, and pressures (sea level and station level). The data package also contains information related to the data processing methods.

  19. GSOD Based Daily Global Mean Surface Temperature and Mean Sea Level Air Pressure (1982-2011)

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Xuan Shi, Dali Wang

    This data product contains all the gridded data sets at 1/4 degree resolution in ASCII format. Both mean temperature and mean sea level air pressure data are available. It also contains the GSOD data (1982-2011) from the NOAA site, including station number, location, temperature, and pressures (sea level and station level). The data package also contains information related to the data processing methods.
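
    Assuming the gridded product is stored as plain whitespace-delimited ASCII at 1/4 degree (720 x 1440 values for a global field), a hedged sketch for loading one field looks like this; the file name and layout are assumptions to be checked against the package documentation:

      # Load one day's 1/4-degree gridded field from the ASCII product.
      import numpy as np

      grid = np.loadtxt("gsod_mean_temperature_19820101.txt")   # hypothetical name
      print(grid.shape)              # expected (720, 1440) for a global 1/4-degree grid
      print("mean:", float(np.nanmean(grid)))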

  20. Management response plan for the Chemical Safety Vulnerability Working Group report. Volume 1

    SciTech Connect (OSTI)

    Not Available

    1994-09-01

    The Chemical Safety Vulnerability (CSV) Working Group was established to identify adverse conditions involving hazardous chemicals at DOE facilities that might result in fires or explosions, release of hazardous chemicals to the environment, or exposure of workers or the public to chemicals. A CSV Review was conducted in 146 facilities at 29 sites. Eight generic vulnerabilities were documented related to: abandoned chemicals and chemical residuals; past chemical spills and ground releases; characterization of legacy chemicals and wastes; disposition of legacy chemicals; storage facilities and conditions; condition of facilities and support systems; unanalyzed and unaddressed hazards; and inventory control and tracking. Weaknesses in five programmatic areas were also identified related to: management commitment and planning; chemical safety management programs; aging facilities that continue to operate; nonoperating facilities awaiting deactivation; and resource allocations. Volume 1 contains a discussion of the chemical safety improvements planned or already underway at DOE sites to correct facility or site-specific vulnerabilities. The main part of the report is a discussion of each of the programmatic deficiencies; a description of the tasks to be accomplished; the specific actions to be taken; and the organizational responsibilities for implementation.

  1. Consistency between renormalization group running of chiral operator and counting rule -- Case of chiral pion production operator --

    E-Print Network [OSTI]

    Satoshi X. Nakamura; Anders Gardestig

    2009-11-10

    In nuclear chiral perturbation theory (ChPT), an operator is defined in a space with a cutoff which may be varied within a certain range. The operator runs as a result of the variation of the cutoff [renormalization group (RG) running]. In order for ChPT to be useful, the operator should run in a way consistent with the counting rule; that is, the running of chiral counter terms has to be of natural size. We vary the cutoff using the Wilsonian renormalization group (WRG) equation, and examine this consistency. As an example, we study the s-wave pion production operator for NN → dπ, derived in ChPT. We demonstrate that the WRG running does not generate any chiral-symmetry-violating (CSV) interaction, provided that we start with an operator which does not contain a CSV term. We analytically show how the counter terms are generated in the WRG running in the case of an infinitesimal cutoff reduction. Based on the analytic result, we argue for a range of the cutoff variation for which the running of the counter terms is of natural size. Then, we numerically confirm this.

  2. Chemical Safety Vulnerability Working Group report. Volume 3

    SciTech Connect (OSTI)

    Not Available

    1994-09-01

    The Chemical Safety Vulnerability (CSV) Working Group was established to identify adverse conditions involving hazardous chemicals at DOE facilities that might result in fires or explosions, release of hazardous chemicals to the environment, or exposure of workers or the public to chemicals. A CSV Review was conducted in 148 facilities at 29 sites. Eight generic vulnerabilities were documented related to: abandoned chemicals and chemical residuals; past chemical spills and ground releases; characterization of legacy chemicals and wastes; disposition of legacy chemicals; storage facilities and conditions; condition of facilities and support systems; unanalyzed and unaddressed hazards; and inventory control and tracking. Weaknesses in five programmatic areas were also identified related to: management commitment and planning; chemical safety management programs; aging facilities that continue to operate; nonoperating facilities awaiting deactivation; and resource allocations. Volume 3 consists of eleven appendices containing the following: Field verification reports for Idaho National Engineering Lab., Rocky Flats Plant, Brookhaven National Lab., Los Alamos National Lab., and Sandia National Laboratories (NM); Mini-visits to small DOE sites; Working Group meeting, June 7--8, 1994; Commendable practices; Related chemical safety initiatives at DOE; Regulatory framework and industry initiatives related to chemical safety; and Chemical inventory data from field self-evaluation reports.

  3. Bioacoustic surveys of planktonic sound scatterers and of their diel and seasonal variability in the northwest Gulf of Mexico

    E-Print Network [OSTI]

    Zimmerman, Robert Allen

    1993-01-01

    of 12 and 153 kHz systems occurred during cruise 91G-02 of the R/V Gyre (March 1991). During this cruise, the 12 kHz precision depth recorder was run continuously with the gain fine-tuned from time to time to optimize visualization of scattering... at 5 minute intervals for the duration of both cruises through the serial ASCII interface (SAIL) system available onboard the R/V Gyre . As the temperature is recorded as a frequency, a calibration regression was performed to correlate the frequency...

  4. Laser goniometer

    DOE Patents [OSTI]

    Fairer, George M. (Boulder, CO); Boernge, James M. (Lakewood, CO); Harris, David W. (Lakewood, CO); Campbell, DeWayne A. (Littleton, CO); Tuttle, Gene E. (Littleton, CO); McKeown, Mark H. (Golden, CO); Beason, Steven C. (Lakewood, CO)

    1993-01-01

    The laser goniometer is an apparatus which permits an operator to sight along a geologic feature and orient a collimated laser beam to match the attitude of the feature directly. The horizontal orientation (strike) and the angle from horizontal (dip) are detected by rotary incremental encoders attached to the laser goniometer, which provide a digital readout of the azimuth and tilt of the collimated laser beam. A microprocessor then translates the square-wave encoder outputs into an ASCII signal for use by data recording equipment.

  5. A survey of selected aspects of health conditions and services in Texas, 1948 

    E-Print Network [OSTI]

    Haynes, Lemuel Lee

    1951-01-01

    the most recent year was listed first. The two types of graphs used were the curve and the bar. Curve graphs were used to show the trend in the birth and death rates in Texas for the years '4834 1848 and to compare the trend in the death rate and infant... of Uvalde County, Texas the population is between 3, 666 and 5, 6M and the nearest doctor is twenty-one miles away. The citizens of this area informed the author that they were able to finance a doctor but their efforts had been in vain in getting...

  6. Weather Forecast Data an Important Input into Building Management Systems 

    E-Print Network [OSTI]

    Poulin, L.

    2013-01-01

    GEPS: 21 members; provides probabilistic forecasts; can give useful outlooks for longer term weather forecasts. Scribe matrix from GDPS includes UMOS post-processed model data; variables like temperature and humidity are post-processed by UMOS. See ...://collaboration.cmc.ec.gc.ca/cmc/cmoi/cmc-prob-products/ and the link to the experimental 3-day outlook of REPS quilts: http://collaboration.cmc.ec.gc.ca/cmc/cmoi/cmc-prob-products.reps. Users can also make their own products from ensemble forecast data. A sample ascii matrix of 2m temperature could be fed...

  7. Lightweight performance data collectors 2.0 with Eiger support.

    SciTech Connect (OSTI)

    Allan, Benjamin A.

    2013-05-01

    We report on the use and design of a portable, extensible performance data collection tool motivated by modeling needs of the high performance computing systems co-design community. The lightweight performance data collectors with Eiger support is intended to be a tailorable tool, not a shrink-wrapped library product, as profiling needs vary widely. A single code markup scheme is reported which, based on compilation flags, can send performance data from parallel applications to CSV files, to an Eiger mysql database, or (in a non-database environment) to flat files for later merging and loading on a host with mysql available. The tool supports C, C++, and Fortran applications.

  8. AVTA: 2010 Ford Fusion HEV Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2010 Ford Fusion hybrid-electric vehicle. Baseline data, which provides a point of comparison for the other test results, was collected at two different research laboratories. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.transportation.anl.gov/D3/2010_fusion_hybrid.html). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  9. AVTA: 2013 Volkswagen Jetta TDI Diesel Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2013 Volkswagen Jetta TDI, which runs on diesel. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  10. AVTA: 2010 Volkswagen Golf Diesel Start-Stop Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2010 Volkswagen Golf Diesel vehicle with stop-start technology. Baseline data, which provides a point of comparison for the other test results, was collected at two different research laboratories. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  11. A gamma/neutron-discriminating, Cooled, Optically Stimulated Luminescence (COSL) dosemeter

    SciTech Connect (OSTI)

    Eschbach, P.A.; Miller, S.D.

    1992-07-01

    The Cooled Optically Stimulated Luminescence (COSL) of CaF{sub 2}:Mn (grain sizes from 0.1 to 100 microns) powder embedded in a hydrogenous matrix is reported as a function of fast-neutron dose. When all the CaF{sub 2}:Mn grains are interrogated at once, the COSL plastic dosemeters have a minimum detectable limit of 1 cSv fast neutrons; the gamma component from the bare {sup 252}cf exposure was determined with a separate dosemeter. We report here on a proton-recoil-based dosemeter that generates pulse height spectra, much like the scintillator of Hornyak, (2) to provide information on both the neutron and gamma dose.

  12. AVTA: 2010 Smart Fortwo Start-Stop Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. Baseline data, which provides a point of comparison for the other test results, was collected at two different research laboratories. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.transportation.anl.gov/D3/2010_smartcar_mhd.html). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  13. AVTA: 2010 Mazda 3 Hatchback Start-Stop Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2010 Mazda3 hatchback with stop-start technology. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.transportation.anl.gov/D3/2010_mazda_3istop.html). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  14. AVTA: 2010 Honda Insight HEV Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2010 Honda Insight hybrid-electric vehicle. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  15. AVTA: 2010 Toyota Prius Gen III HEV Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2010 Toyota Prius III hybrid-electric vehicle. Baseline data, which provides a point of comparison for the other test results, was collected at two different research laboratories. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  16. AVTA: 2011 Hyundai Sonata HEV Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2011 Hyundai Sonata hybrid electric vehicle. Baseline data, which provides a point of comparison for the other test results, was collected at two different research laboratories. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  17. AVTA: 2012 CNG Honda Civic Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2012 Compressed Natural Gas Honda Civic GX. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  18. AVTA: 2013 Chevrolet Volt Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a 2013 Chevrolet Volt. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). The reports for download here are based on research done at Idaho National Laboratory. Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  19. AVTA: 2013 Nissan Leaf All-Electric Vehicle Testing Reports

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe early results of testing done on an all-electric 2013 Nissan Leaf. Baseline data, which provides a point of comparison for the other test results, was collected at two different research laboratories. Baseline and other data collected at Idaho National Laboratory is in the attached documents. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  20. AVTA: 2013 Toyota Prius PHEV Testing Results

    Broader source: Energy.gov [DOE]

    The Vehicle Technologies Office's Advanced Vehicle Testing Activity carries out testing on a wide range of advanced vehicles and technologies on dynamometers, closed test tracks, and on-the-road. These results provide benchmark data that researchers can use to develop technology models and guide future research and development. The following reports describe results of testing done on a Toyota Prius PHEV 2013. Baseline and battery testing data collected at Argonne National Laboratory is available in summary and CSV form on the Argonne Downloadable Dynamometer Database site (http://www.anl.gov/energy-systems/group/downloadable-dynamometer-databas...). The reports for download here are based on research done at Idaho National Laboratory. Taken together, these reports give an overall view of how this vehicle functions under extensive testing.

  1. Rocky Mountain Basins Produced Water Database

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Historical records for produced water data were collected from multiple sources, including Amoco, British Petroleum, Anadarko Petroleum Corporation, United States Geological Survey (USGS), Wyoming Oil and Gas Commission (WOGC), Denver Earth Resources Library (DERL), Bill Barrett Corporation, Stone Energy, and other operators. In addition, 86 new samples were collected during the summers of 2003 and 2004 from the following areas: Waltman-Cave Gulch, Pinedale, Tablerock and Wild Rose. Samples were tested for standard seven component "Stiff analyses", and strontium and oxygen isotopes. 16,035 analyses were winnowed to 8028 unique records for 3276 wells after a data screening process was completed. [Copied from the Readme document in the zipped file available at http://www.netl.doe.gov/technologies/oil-gas/Software/database.html] Save the Zipped file to your PC. When opened, it will contain four versions of the database: ACCESS, EXCEL, DBF, and CSV formats. The information consists of detailed water analyses from basins in the Rocky Mountain region.

  2. Gap Assessment (FY 13 Update)

    SciTech Connect (OSTI)

    Getman, Dan

    2013-09-30T23:59:59.000Z

    To help guide its future data collection efforts, the DOE GTO funded a data gap analysis in FY2012 to identify high-potential hydrothermal areas where critical data are needed. This analysis was updated in FY2013, and the resulting datasets are represented by this metadata. The original process was published in FY2012 and is available here: https://pangea.stanford.edu/ERE/db/GeoConf/papers/SGW/2013/Esposito.pdf. Though there are many types of data that can be used for hydrothermal exploration, five types of exploration data were targeted for this analysis. These data types were selected for their regional reconnaissance potential and include many of the primary exploration techniques currently used by the geothermal industry. The data types include: (1) well data, (2) geologic maps, (3) fault maps, (4) geochemistry data, and (5) geophysical data. To determine data coverage, metadata for exploration data (including data type, data status, and coverage information) were collected and catalogued from nodes on the National Geothermal Data System (NGDS). It is the intention of this analysis that the data be updated from this source in a semi-automated fashion as new datasets are added to the NGDS nodes. In addition to this upload, an online tool was developed to allow all geothermal data providers to access this assessment, directly add metadata themselves, and view the results of the analysis via maps of data coverage in Geothermal Prospector (http://maps.nrel.gov/gt_prospector). A grid of the contiguous U.S. was created with 88,000 10-km by 10-km grid cells, and each cell was populated with the status of data availability corresponding to the five data types. Using these five data coverage maps and the USGS Resource Potential Map, sites were identified for future data collection efforts. These sites signify both that the USGS has indicated high favorability of occurrence of geothermal resources and that data gaps exist. The uploaded data are contained in two data files for each data category. The first file contains the grid and is in the SHP (shapefile) format. Each populated grid cell represents a 10-km by 10-km area within which data are known to exist. The second file is a CSV (comma-separated value) file that contains all of the individual layers that intersected with the grid. This CSV can be joined with the map to retrieve a list of datasets that are available at any given site, as sketched below. The attributes in the CSV include: 1. grid_id: the id of the grid cell that the data intersect with; 2. title: the name of the WFS service that intersected with this grid cell; 3. abstract: the description of the WFS service that intersected with this grid cell; 4. gap_type: the category of data availability that these data fall within (as the current processing pulls data from NGDS, this category universally represents data that are available in the NGDS and ready for acquisition for analytic purposes); 5. proprietary_type: whether the data are considered proprietary; 6. service_type: the type of service; 7. base_url: the service URL.
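
    A minimal sketch of that join, assuming geopandas/pandas and hypothetical file names (the actual layer and file names are defined by the upload):

        import geopandas as gpd
        import pandas as pd

        # Read the 10-km grid (SHP) and the intersected-layer attribute table (CSV).
        grid = gpd.read_file("gap_grid.shp")
        layers = pd.read_csv("gap_layers.csv")

        # Join on the shared grid_id attribute to list the datasets available in each cell.
        joined = grid.merge(layers, on="grid_id", how="left")

        # Datasets intersecting one example cell (the cell id is illustrative).
        print(joined.loc[joined["grid_id"] == 12345, ["title", "gap_type", "base_url"]])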

  3. Calculate thermal-expansion coefficients

    SciTech Connect (OSTI)

    Yaws, C.L. [Lamar Univ., Beaumont, TX (United States)

    1995-08-01T23:59:59.000Z

    To properly design and use process equipment, an engineer needs a sound knowledge of physical and thermodynamic property data. A lack of such knowledge can lead to design or operating mistakes that can be dangerous, costly, or even fatal. One useful type of property data is the thermal-expansion coefficient. This article presents equations and tables to find the thermal-expansion coefficients of many liquids that contain carbon. These data are useful in process-engineering applications, including the design of relief systems which are crucial to safeguarding process equipment. Data are provided for approximately 350 compounds. A computer software program, which contains the thermophysical property data for all of the compounds discussed in this article, is available for $43 prepaid from the author (Carl L. Yaws, Box 10053, Lamar University, Beaumont, TX 77710; Tel. 409-880-8787; fax 409-880-8404). The program is in ASCII format, which can be accessed by most other types of computer software.
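
    The article's own correlations are not reproduced here; as a generic, hedged sketch, the volumetric thermal-expansion coefficient can be estimated from tabulated liquid density versus temperature using beta = -(1/rho)(d rho/dT):

        import numpy as np

        # Tabulated liquid density (kg/m^3) versus temperature (K); values are illustrative only.
        T = np.array([280.0, 290.0, 300.0, 310.0, 320.0])
        rho = np.array([900.0, 892.0, 884.0, 875.0, 866.0])

        # Volumetric thermal-expansion coefficient: beta = -(1/rho) * d(rho)/dT  [1/K]
        beta = -np.gradient(rho, T) / rho
        for t, b in zip(T, beta):
            print(f"T = {t:.0f} K, beta = {b:.2e} 1/K")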

  4. Precision control of multiple quantum cascade lasers for calibration systems

    SciTech Connect (OSTI)

    Taubman, Matthew S., E-mail: Matthew.Taubman@pnnl.gov; Myers, Tanya L.; Pratt, Richard M.; Stahl, Robert D.; Cannon, Bret D. [Pacific Northwest National Laboratory, P.O. Box 999, Richland, Washington 99352 (United States)]

    2014-01-15T23:59:59.000Z

    We present a precision, 1-A, digitally interfaced current controller for quantum cascade lasers, with demonstrated temperature coefficients for continuous and 40-kHz full-depth square-wave modulated operation of 1–2 ppm/°C and 15 ppm/°C, respectively. High-precision digital-to-analog converters (DACs) together with an ultra-precision voltage reference produce highly stable, precision voltages, which are selected by a multiplexer (MUX) chip to set output currents via a linear current regulator. The controller is operated in conjunction with a power multiplexing unit, allowing one of three lasers to be driven by the controller, while ensuring protection of controller and all lasers during operation, standby, and switching. Simple ASCII commands sent over a USB connection to a microprocessor located in the current controller operate both the controller (via the DACs and MUX chip) and the power multiplexer.
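
    The controller's actual command set is not given in the abstract; the following is only a hedged sketch of sending ASCII commands over a USB virtual serial port with pyserial, with hypothetical command strings and port settings:

        import serial

        # Open the USB virtual COM port (port name, baud rate, and commands are assumptions).
        with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1.0) as port:
            port.write(b"LASER 2\r\n")      # hypothetical: select laser channel 2 via the power multiplexer
            port.write(b"CURR 450.0\r\n")   # hypothetical: set the output current in mA
            reply = port.readline().decode("ascii", errors="replace").strip()
            print("controller replied:", reply)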

  5. Updated Users' Guide for RSAP -- A Code for Display and Manipulation of Neutron Cross Section Data and SAMMY Fit Results

    SciTech Connect (OSTI)

    Sayer, R.O.

    2003-07-29T23:59:59.000Z

    RSAP [1] is a computer code for display and manipulation of neutron cross section data and selected SAMMY output. SAMMY [2] is a multilevel R-matrix code for fitting neutron time-of-flight cross-section data using Bayes' method. This users' guide provides documentation for the recently updated RSAP code (version 6). The code has been ported to the Linux platform, and several new features have been added, including the capability to read cross section data from ASCII pointwise ENDF files as well as double-precision PLT output from SAMMY. A number of bugs have been found and corrected, and the input formats have been improved. Input items are parsed so that items may be separated by spaces or commas.
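
    A minimal sketch of that kind of tolerant input parsing (not the RSAP code itself) in Python:

        import re

        def parse_items(line: str):
            """Split an input line on any run of spaces and/or commas, dropping empty fields."""
            return [tok for tok in re.split(r"[,\s]+", line.strip()) if tok]

        print(parse_items("1.0, 2.5  3.75,4"))   # ['1.0', '2.5', '3.75', '4']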

  6. Data analysis method for wind turbine dynamic response testing

    SciTech Connect (OSTI)

    Olsen, T.L.; Hock, S.M.

    1989-06-01T23:59:59.000Z

    The Wind Research Branch at the Solar Energy Research Institute (SERI) has developed an efficient data analysis package for personal computer use in response to growing needs of the wind turbine industry and SERI's Cooperative Field Test Program. This new software is used by field test engineers to examine wind turbine performance and loads during testing, as well as by data analysts for detailed post-processing. The Wind Data Analysis Tool Set, WINDATS, has been written as a collection of tools that fall into two general groups. First, the preparatory tools perform subsection, filtering, decimation, preaveraging, scaling, and derivation of new channels. Second, analysis tools are used for mean removal, linear detrending, azimuth averaging and removal, per-rev averaging, binning, and spectral analysis. The input data file can be a standard ASCII file as is generated by most data acquisition software. 9 refs., 10 figs.
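
    WINDATS itself is not reproduced here; the following is a hedged sketch of the analogous post-processing steps (mean removal, linear detrending, and spectral analysis of one channel read from an ASCII file) using numpy/scipy, with a hypothetical file name:

        import numpy as np
        from scipy import signal

        # Load a time column and one data channel from a whitespace-delimited ASCII file.
        t, y = np.loadtxt("turbine_channel.txt", unpack=True)

        y_detrended = signal.detrend(y, type="linear")   # mean removal plus linear detrend
        fs = 1.0 / np.median(np.diff(t))                 # sampling rate inferred from the time column

        # Spectral analysis via Welch's method.
        freqs, psd = signal.welch(y_detrended, fs=fs, nperseg=1024)
        print("dominant frequency: %.3f Hz" % freqs[np.argmax(psd)])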

  7. DEVICE CONTROL TOOL FOR CEBAF BEAM DIAGNOSTICS SOFTWARE

    SciTech Connect (OSTI)

    Pavel Chevtsov

    2008-02-11T23:59:59.000Z

    Continuously monitoring the beam quality in the CEBAF accelerator, a variety of beam diagnostics software created at Jefferson Lab makes a significant contribution to the very high availability of the machine for nuclear physics experiments. The interface between this software and beam instrumentation hardware components is provided by a device control tool, which is optimized for beam diagnostics tasks. As a part of the device/driver development framework at Jefferson Lab, this tool is very easy to support and extend to integrate new beam instrumentation components. All device control functions are based on configuration (ASCII text) files that completely define the hardware interface standards used (CAMAC, VME, RS-232, GPIB, etc.) and the communication protocols. The paper presents the main elements of the device control tool for beam diagnostics software at Jefferson Lab.
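
    The Jefferson Lab configuration file format is not documented in the abstract; as a hedged illustration only, an ASCII configuration file of this general kind could be read with Python's configparser (section and key names are hypothetical):

        import configparser
        import textwrap

        # Hypothetical ASCII configuration describing one instrument channel.
        example = textwrap.dedent("""\
            [bpm01]
            bus = VME
            protocol = register-mapped
            address = 0x4000
            """)

        config = configparser.ConfigParser()
        config.read_string(example)
        for section in config.sections():
            print(section, dict(config[section]))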

  8. Graduate student theses supported by DOE's Environmental Sciences Division

    SciTech Connect (OSTI)

    Cushman, R.M. [Oak Ridge National Lab., TN (United States)]; Parra, B.M. [Dept. of Energy, Germantown, MD (United States). Environmental Sciences Division] [comps.]

    1995-07-01T23:59:59.000Z

    This report provides complete bibliographic citations, abstracts, and keywords for 212 doctoral and master's theses supported fully or partly by the U.S. Department of Energy's Environmental Sciences Division (and its predecessors) in the following areas: Atmospheric Sciences; Marine Transport; Terrestrial Transport; Ecosystems Function and Response; Carbon, Climate, and Vegetation; Information; Computer Hardware, Advanced Mathematics, and Model Physics (CHAMMP); Atmospheric Radiation Measurement (ARM); Oceans; National Institute for Global Environmental Change (NIGEC); Unmanned Aerial Vehicles (UAV); Integrated Assessment; Graduate Fellowships for Global Change; and Quantitative Links. Information on the major professor, department, principal investigator, and program area is given for each abstract. Indexes are provided for major professor, university, principal investigator, program area, and keywords. This bibliography is also available in various machine-readable formats (ASCII text file, WordPerfect® files, and PAPYRUS™ files).

  9. Precision Control of Multiple Quantum Cascade Lasers for Calibration Systems

    SciTech Connect (OSTI)

    Taubman, Matthew S.; Myers, Tanya L.; Pratt, Richard M.; Stahl, Robert D.; Cannon, Bret D.

    2014-01-15T23:59:59.000Z

    We present a precision, digitally interfaced current controller for quantum cascade lasers, with demonstrated DC and modulated temperature coefficients of 1–2 ppm/°C and 15 ppm/°C, respectively. High linearity digital to analog converters (DACs) together with an ultra-precision voltage reference produce highly stable, precision voltages. These are in turn selected by a low charge-injection multiplexer (MUX) chip and used to set output currents via a linear current regulator. The controller is operated in conjunction with a power multiplexing unit, allowing one of three lasers to be driven by the controller while ensuring protection of controller and all lasers during operation, standby, and switching. Simple ASCII commands sent over a USB connection to a microprocessor located in the current controller operate both the controller (via the DACs and MUX chip) and the power multiplexer.

  10. Gauge Configurations for Lattice QCD from The Gauge Connection

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    The Gauge Connection is an experimental archive for lattice QCD and a repository of gauge configurations made freely available to the community. Contributors to the archive include the Columbia QCDSP collaboration, the MILC collaboration, and others. Configurations are stored in QCD archive format, consisting of an ASCII header which defines various parameters, followed by binary data. NERSC has also provided some utilities and examples that will aid users in handling the data. Users may browse the archive, but are required to register for a password in order to download data. Contents of the archive are organized under four broad headings: Quenched (more than 1200 configurations); Dynamical, Zero Temperature (more than 300 configurations); MILC Improved Staggered Asqtad Lattices (more than 7000 configurations); and Dynamical, Finite Temperature (more than 1200 configurations)
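
    As a hedged sketch only (the archive's exact header keys and delimiters are defined by the format specification and are not reproduced here), a file consisting of an ASCII key = value header followed by binary data could be split like this:

        # Assumes a header delimited by BEGIN_HEADER/END_HEADER lines; this is an assumption,
        # not the verified QCD archive specification.
        def read_qcd_header(path):
            header = {}
            with open(path, "rb") as f:
                first = f.readline().decode("ascii", errors="replace").strip()
                if first != "BEGIN_HEADER":
                    raise ValueError("unexpected first line: %r" % first)
                while True:
                    line = f.readline().decode("ascii", errors="replace").strip()
                    if line == "END_HEADER":
                        break
                    key, _, value = line.partition("=")
                    header[key.strip()] = value.strip()
                offset = f.tell()   # binary gauge-field data begins here
            return header, offset

        # header, offset = read_qcd_header("lattice.cfg")   # file name is hypothetical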

  11. User's Guide to Pre-Processing Data in Universal Translator 2 for the Energy Charting and Metrics Tool (ECAM)

    SciTech Connect (OSTI)

    Taasevigen, Danny J.

    2011-11-30T23:59:59.000Z

    This document is a user's guide for the Energy Charting and Metrics Tool to facilitate the examination of energy information from buildings, reducing the time spent analyzing trend and utility meter data. This user guide was generated to help pre-process data with the intention of utilizing the Energy Charting and Metrics (ECAM) tool to improve building operational efficiency. There are numerous occasions when the metered data received from the building automation system (BAS) isn't in a format acceptable to ECAM. This includes, but isn't limited to, cases such as inconsistent time-stamps for the trends (e.g., each trend has its own time-stamp), data with holes (e.g., some time-stamps have data and others are missing data), each point in the BAS being trended and exported into an individual .csv or .txt file, the time-stamp being unrecognizable by ECAM, etc. After reading through this user guide, the user should be able to pre-process all data files and be ready to use this data in ECAM to improve their building operational efficiency.
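
    A hedged sketch of that kind of pre-processing (merging per-point trend files onto one consistent time stamp) with pandas; the file layout, column names, and resampling interval are assumptions, not part of Universal Translator 2 or ECAM:

        import glob
        import pandas as pd

        # Assume each BAS point was exported to its own file with "timestamp" and "value" columns.
        frames = []
        for path in glob.glob("trend_*.csv"):
            df = pd.read_csv(path, parse_dates=["timestamp"]).set_index("timestamp")
            df.columns = [path]                  # label the single value column by its source file
            frames.append(df)

        # Outer-join onto one shared time index, then resample to a regular 15-minute interval
        # so every trend carries consistent time stamps (gaps become explicit NaNs).
        merged = pd.concat(frames, axis=1).resample("15min").mean()
        merged.to_csv("ecam_ready.csv")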

  12. National Spill Test Technology Database

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Sheesley, David [Western Research Institute

    Western Research Institute established, and ACRC continues to maintain, the National Spill Technology database to provide support to the Liquefied Gaseous Fuels Spill Test Facility (now called the National HAZMAT Spill Center) as directed by Congress in Section 118(n) of the Superfund Amendments and Reauthorization Act of 1986 (SARA). The Albany County Research Corporation (ACRC) was established to make publicly funded data developed from research projects available to benefit public safety. Since 1987, the founders have been investigating the behavior of toxic chemicals that are deliberately or accidentally spilled, educating emergency response organizations, and maintaining funding to conduct the research at the DOE's HAZMAT Spill Center (HSC) located on the Nevada Test Site. ACRC also supports DOE in collaborative research and development efforts mandated by Congress in the Clean Air Act Amendments. The data files are results of spill tests conducted at various times by the Silicones Environmental Health and Safety Council (SEHSC) and DOE, ANSUL, Dow Chemical, the Center for Chemical Process Safety (CCPS) and DOE, Lawrence Livermore National Laboratory (LLNL), OSHA, and DOT; DuPont, and the Western Research Institute (WRI), Desert Research Institute (DRI), and EPA. Each test data page contains one executable file for each test in the test series as well as a file named DOC.EXE that contains information documenting the test series. These executable files are actually self-extracting zip files that, when executed, create one or more comma-separated value (CSV) text files containing the actual test data or other test information.

  13. Carbon Dioxide, Hydrographic, and Chemical Data Obtained During the R/V Knorr Cruises in the North Atlantic Ocean on WOCE Sections AR24 (November 2-December 5, 1996) and A24, A20, and A22 (May 30-September 3, 1997)

    SciTech Connect (OSTI)

    Johnson, K.M.

    2003-10-23T23:59:59.000Z

    This documentation describes the procedures and methods used to measure total carbon dioxide (TCO2), total alkalinity (TALK), and partial pressure of CO2 (pCO2) at hydrographic stations on the North Atlantic Ocean sections AR24, A24, A20, and A22 during the R/V Knorr Cruises 147-2, 151-2, 151-3, and 151-4 in 1996 and 1997. Conducted as part of the World Ocean Circulation Experiment (WOCE), the expeditions began at Woods Hole, Massachusetts, on October 24, 1996, and ended at Woods Hole on September 3, 1997. Instructions for accessing the data are provided. A total of 5,614 water samples were analyzed for discrete TCO2 using two single-operator multiparameter metabolic analyzers (SOMMAs) coupled to a coulometer for extracting and detecting CO2. The overall accuracy of the TCO2 determination was ±1.59 µmol/kg. The TALK was determined in a total of 6,088 discrete samples on all sections by potentiometric titration using an automated titration system developed at the University of Miami. The accuracy of the TALK determination was ±3 µmol/kg. A total of 2,465 discrete water samples were collected for determination of pCO2 in seawater on sections A24, A20, and A22. The pCO2 was measured by means of an equilibrator-IR system by scientists from Lamont-Doherty Earth Observatory. The precision of the measurements was estimated to be about ±0.15%, based on the reproducibility of the replicate equilibrations on a single hydrographic station. The North Atlantic data set is available as a numeric data package (NDP) from the Carbon Dioxide Information Analysis Center. The NDP consists of 12 ASCII data files, one Ocean Data View-formatted data file, an NDP-082 ASCII text file, an NDP-082 PDF file, and this printed documentation, which describes the contents and format of all files, as well as the procedures and methods used to obtain the data.

  14. Hydroacoustic Evaluation of Fish Passage Through Bonneville Dam in 2005

    SciTech Connect (OSTI)

    Ploskey, Gene R.; Weiland, Mark A.; Zimmerman, Shon A.; Hughes, James S.; Bouchard, Kyle E.; Fischer, Eric S.; Schilt, Carl R.; Hanks, Michael E.; Kim, Jina; Skalski, John R.; Hedgepeth, J.; Nagy, William T.

    2006-12-04T23:59:59.000Z

    The Portland District of the U.S. Army Corps of Engineers requested that the Pacific Northwest National Laboratory (PNNL) conduct fish-passage studies at Bonneville Dam in 2005. These studies support the Portland District's goal of maximizing fish-passage efficiency (FPE) and obtaining 95% survival for juvenile salmon passing Bonneville Dam. Major passage routes include 10 turbines and a sluiceway at Powerhouse 1 (B1), an 18-bay spillway, and eight turbines and a sluiceway at Powerhouse 2 (B2). In this report, we present results of two studies related to juvenile salmonid passage at Bonneville Dam. The studies were conducted between April 16 and July 15, 2005, encompassing most of the spring and summer migrations. Studies included evaluations of (1) Project fish passage efficiency and other major passage metrics, and (2) smolt approach and fate at B1 Sluiceway Outlet 3C from the B1 forebay. Some of the large appendices are only presented on the compact disk (CD) that accompanies the final report. Examples include six large comma-separated value (.CSV) files of hourly fish passage, hourly variances, and Project operations for spring and summer from Appendix E, and large Audio Video Interleave (AVI) files with DIDSON-movie clips of the area upstream of B1 Sluiceway Outlet 3C (Appendix H). Those video clips show smolts approaching the outlet, predators feeding on smolts, and vortices that sometimes entrained approaching smolts into turbines. The CD also includes Adobe Acrobat Portable Document Files (PDF) of the entire report and appendices.

  15. System and method for simultaneously collecting serial number information from numerous identity tags

    DOE Patents [OSTI]

    Doty, Michael A. (Manteca, CA)

    1997-01-01T23:59:59.000Z

    A system and method for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII-coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available.
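
    A purely conceptual simulation of that timing scheme (not the patented hardware protocol): each tag replies in a slot chosen from a prefix of its serial number, and the prefix is lengthened until every tag occupies a clear slot.

        import random
        from collections import Counter

        def reply_slots(tags, digits_used):
            """Each tag picks a reply slot from the first `digits_used` digits of its serial number."""
            slots = Counter(int(serial[:digits_used]) for serial in tags)
            clear = sum(1 for n in slots.values() if n == 1)
            collided = sum(1 for n in slots.values() if n > 1)
            return clear, collided

        tags = ["%06d" % random.randrange(10**6) for _ in range(1000)]
        for digits in range(1, 7):
            clear, collided = reply_slots(tags, digits)
            print(f"{digits} digit(s): {clear} clear slots, {collided} collided slots")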

  16. PM-10 Open Fugitive-Dust-Source computer model (for microcomputers). Model-Simulation

    SciTech Connect (OSTI)

    Elmore, L.

    1990-04-01T23:59:59.000Z

    The computer programs in the package are based on the material presented in the document, Control of Open Fugitive Dust Sources, EPA-450/3-88-008. The programs on these diskettes serve two purposes. Their primary purpose is to facilitate the process of data entry, allowing the user not only to enter and verify the data which he/she possesses, but also to access additional data which might not be readily available. The second purpose is to calculate emission rates for the particular source category selected using the data previously entered and verified. Software Description: The program is written in the BASIC programming language for implementation on an IBM-PC/AT and compatible machines using the DOS 2.X or higher operating system. A hard disk with a 5 1/4 inch disk drive or two disk drives, and a wide-carriage (132-character) printer or a printer capable of printing text in condensed mode, are required. A text editor or word processing program capable of manipulating ASCII or DOS text files is optional.

  17. Solving iTOUGH2 simulation and optimization problems using the PEST protocol

    SciTech Connect (OSTI)

    Finsterle, S.A.; Zhang, Y.

    2011-02-01T23:59:59.000Z

    The PEST protocol has been implemented into the iTOUGH2 code, allowing the user to link any simulation program (with ASCII-based inputs and outputs) to iTOUGH2's sensitivity analysis, inverse modeling, and uncertainty quantification capabilities. These application models can be pre- or post-processors of the TOUGH2 non-isothermal multiphase flow and transport simulator, or programs that are unrelated to the TOUGH suite of codes. PEST-style template and instruction files are used, respectively, to pass input parameters updated by the iTOUGH2 optimization routines to the model, and to retrieve the model-calculated values that correspond to observable variables. We summarize the iTOUGH2 capabilities and demonstrate the flexibility added by the PEST protocol for the solution of a variety of simulation-optimization problems. In particular, the combination of loosely coupled and tightly integrated simulation and optimization routines provides both the flexibility and control needed to solve challenging inversion problems for the analysis of multiphase subsurface flow and transport systems.
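
    A hedged sketch of the template-file idea (markers in a copy of the model input are replaced with updated parameter values before each forward run); the "@" delimiter and parameter names below are simplifications, not the full PEST template syntax:

        # Replace markers such as @permeability@ in a template copy of the model input file.
        def fill_template(template_path, input_path, params):
            with open(template_path) as f:
                text = f.read()
            for name, value in params.items():
                text = text.replace("@%s@" % name, "%14.6e" % value)
            with open(input_path, "w") as f:
                f.write(text)

        # fill_template("flow.tpl", "flow.inp", {"permeability": 1.3e-14, "porosity": 0.11})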

  18. System and method for simultaneously collecting serial number information from numerous identity tags

    DOE Patents [OSTI]

    Doty, M.A.

    1997-01-07T23:59:59.000Z

    A system and method are disclosed for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII-coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available. 5 figs.

  19. Source Catalog Data from FIRST (Faint Images of the Radio Sky at Twenty-Centimeters)

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Becker, Robert H.; Helfand, David J.; White, Richard L.; Gregg, Michael D.; Laurent-Muehleisen, Sally A.

    FIRST, Faint Images of the Radio Sky at Twenty-Centimeters, is a project designed to produce the radio equivalent of the Palomar Observatory Sky Survey over 10,000 square degrees of the North Galactic Cap. Using the National Radio Astronomy Observatory's (NRAO) Very Large Array (VLA) in its B-configuration, the Survey acquired 3-minute snapshots covering a hexagonal grid using 2 × 7 3-MHz frequency channels centered at 1365 and 1435 MHz. The data were edited, self-calibrated, mapped, and CLEANed using an automated pipeline based largely on routines in the Astronomical Image Processing System (AIPS). A final atlas of maps is produced by coadding the twelve images adjacent to each pointing center. Source catalogs with flux densities and size information are also generated from the coadded images. The 2011 catalog is the latest version and has been tested to ensure reliability and completeness. The catalog, generated from the 1993 through 2004 images, contains 816,000 sources and covers more than 9000 square degrees. A specialized search interface for the catalog resides at this website, and the catalog is also available as a compressed ASCII file. The user may also view earlier versions of the source catalog. The FIRST survey area was chosen to coincide with that of the Sloan Digital Sky Survey (SDSS); at the m(v)~24 limit of SDSS, ~50% of the optical counterparts to FIRST sources will be detected.

  20. Solving seismological problems using SGRAPH program: I-source parameters and hypocentral location

    SciTech Connect (OSTI)

    Abdelwahed, Mohamed F. [Geological Hazards Research Unit, King Abdulaziz University (Saudi Arabia) and National Research Institute of Astronomy and Geophysics (NRIAG), Helwan (Egypt)

    2012-09-26T23:59:59.000Z

    The SGRAPH program is one of the seismological programs that maintain seismic data. SGRAPH is unique in its ability to read a wide range of data formats and to bring complementary tools for different seismological subjects together in a stand-alone Windows-based application. SGRAPH efficiently performs basic waveform analysis and solves advanced seismological problems. The graphical user interface (GUI) utilities and the Windows facilities, such as dialog boxes, menus, and toolbars, simplify user interaction with the data. SGRAPH supports common data formats such as SAC, SEED, GSE, ASCII, Nanometrics Y-format, and others. It provides the facilities to solve many seismological problems with its built-in inversion and modeling tools. In this paper, I discuss some of the inversion tools built into SGRAPH related to source parameters and hypocentral location estimation. First, a description of the SGRAPH program is given, discussing some of its features. Second, the inversion tools are applied to selected events of the Dahshour earthquakes as an example of estimating the spectral and source parameters of local earthquakes. In addition, the hypocentral locations of these events are estimated using the Hypoinverse 2000 program operated by SGRAPH.

  1. QTCM software documentation. Volume 2. User's manual. Final report

    SciTech Connect (OSTI)

    Not Available

    1990-10-01T23:59:59.000Z

    This Volume of the QTCM software manual is a user's manual for those who need to conduct QTCM analyses, but do not require an in-depth knowledge of the software model's structure and development. This manual describes the operating environment necessary to support QTCM and provides step-by-step instructions for operating the model. QTCM consists of three executable segments - the input file formatter, the network analysis segment, and the output file formatter. The input file formatter, INPFMT, allows the user to create or amend a formatted input file via a series of user menus and prompts. NETWORK reads in the input file produced by INPFMT and performs the network analysis function of QTCM, including traffic distribution and calculation of network statistics. The output formatter, OUTFMT, then takes the output from NETWORK and allows the user to selectively display and output statistical traffic data via a series of user menus and prompts. These output data may be written to an ASCII text file if the user desires. The operation of each of these executables is described in the following sections.

  2. Grid Logging: Best Practices Guide

    SciTech Connect (OSTI)

    Tierney, Brian L; Tierney, Brian L; Gunter, Dan

    2008-04-01T23:59:59.000Z

    The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers, and Grid middleware itself. Currently, most generated log files are only useful to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used. Other custom logging APIs or even printf could be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) Consistently structured, typed log events; (2) A standard high-resolution timestamp; (3) Use of logging levels and categories to separate logs by detail and purpose; (4) Consistent use of global and local identifiers; and (5) Use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
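
    A minimal sketch of those recommendations using Python's standard logging module (the field names and identifiers are illustrative, not a mandated schema): each event is one newline-delimited ASCII record with a high-resolution timestamp, a level, and consistently named key=value fields.

        import logging

        # One event per line: ISO-8601 timestamp with millisecond resolution, a level,
        # and key=value pairs including a global transfer identifier.
        logging.basicConfig(
            format="ts=%(asctime)s.%(msecs)03dZ level=%(levelname)s %(message)s",
            datefmt="%Y-%m-%dT%H:%M:%S",
            level=logging.INFO,
        )
        log = logging.getLogger("gridftp.transfer")

        log.info("event=transfer.start guid=tx-00042 host=dtn01 file=/data/run7.dat")
        log.info("event=transfer.end guid=tx-00042 status=ok bytes=1073741824")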

  3. SAPHIRE 8 Volume 7 - Data Loading

    SciTech Connect (OSTI)

    K. J. Kvarfordt; S. T. Wood; C. L. Smith; S. R. Prescott

    2011-03-01T23:59:59.000Z

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission and developed by the Idaho National Laboratory. This report is intended to assist the user in entering PRA data into the SAPHIRE program using the built-in MAR-D ASCII-text file data transfer process. Towards this end, a small sample database is constructed and utilized for demonstration. Where applicable, the discussion includes how the data processes for loading the sample database relate to the actual processes used to load larger PRA models. The procedures described herein were developed for use with SAPHIRE Version 8. The guidance specified in this document will give a user sufficient knowledge to both understand the data format used by SAPHIRE and carry out the transfer of data between different PRA projects.

  4. iTOUGH2 Universal Optimization Using the PEST Protocol

    SciTech Connect (OSTI)

    Finsterle, S.A.

    2010-07-01T23:59:59.000Z

    iTOUGH2 (http://www-esd.lbl.gov/iTOUGH2) is a computer program for parameter estimation, sensitivity analysis, and uncertainty propagation analysis [Finsterle, 2007a, b, c]. iTOUGH2 contains a number of local and global minimization algorithms for automatic calibration of a model against measured data, or for the solution of other, more general optimization problems (see, for example, Finsterle [2005]). A detailed residual and estimation uncertainty analysis is conducted to assess the inversion results. Moreover, iTOUGH2 can be used to perform a formal sensitivity analysis, or to conduct Monte Carlo simulations for the examination of prediction uncertainties. iTOUGH2's capabilities are continually enhanced. As the name implies, iTOUGH2 is developed for use in conjunction with the TOUGH2 forward simulator for nonisothermal multiphase flow in porous and fractured media [Pruess, 1991]. However, iTOUGH2 provides FORTRAN interfaces for the estimation of user-specified parameters (see subroutine USERPAR) based on user-specified observations (see subroutine USEROBS). These user interfaces can be invoked to add new parameter or observation types to the standard set provided in iTOUGH2. They can also be linked to non-TOUGH2 models, i.e., iTOUGH2 can be used as a universal optimization code, similar to other model-independent, nonlinear parameter estimation packages such as PEST [Doherty, 2008] or UCODE [Poeter and Hill, 1998]. However, to make iTOUGH2's optimization capabilities available for use with an external code, the user is required to write some FORTRAN code that provides the link between the iTOUGH2 parameter vector and the input parameters of the external code, and between the output variables of the external code and the iTOUGH2 observation vector. While allowing for maximum flexibility, the coding requirement of this approach limits its applicability to those users with FORTRAN coding knowledge. To make iTOUGH2 capabilities accessible to many application models, the PEST protocol [Doherty, 2007] has been implemented into iTOUGH2. This protocol enables communication between the application (which can be a single 'black-box' executable or a script or batch file that calls multiple codes) and iTOUGH2. The concept requires that for the application model: (1) Input is provided on one or more ASCII text input files; (2) Output is returned to one or more ASCII text output files; (3) The model is run using a system command (executable or script/batch file); and (4) The model runs to completion without any user intervention. For each forward run invoked by iTOUGH2, select parameters cited within the application model input files are then overwritten with values provided by iTOUGH2, and select variables cited within the output files are extracted and returned to iTOUGH2. It should be noted that the core of iTOUGH2, i.e., its optimization routines and related analysis tools, remains unchanged; it is only the communication format between input parameters, the application model, and output variables that is borrowed from PEST. The interface routines have been provided by Doherty [2007]. The iTOUGH2-PEST architecture is shown in Figure 1. This manual contains installation instructions for the iTOUGH2-PEST module, and describes the PEST protocol as well as the input formats needed in iTOUGH2. Examples are provided that demonstrate the use of model-independent optimization and analysis using iTOUGH2.
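
    A hedged sketch of the driver loop that the protocol implies (write an updated parameter into an ASCII input file, run the application model as a system command, and read an observable back from an ASCII output file); all file names, the wrapper script, and the output format are assumptions:

        import subprocess

        def run_forward_model(permeability):
            # 1. Write the current parameter value into the model's ASCII input file.
            with open("model.inp", "w") as f:
                f.write("PERMEABILITY  %12.5e\n" % permeability)
            # 2. Run the application model via a system command, to completion, without user intervention.
            subprocess.run(["./run_model.sh"], check=True)
            # 3. Retrieve the model-calculated observable from the ASCII output file.
            with open("model.out") as f:
                for line in f:
                    if line.startswith("PRESSURE_OBS_1"):
                        return float(line.split()[-1])
            raise RuntimeError("observable not found in model output")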

  5. The International Coal Statistics Data Base user's guide

    SciTech Connect (OSTI)

    Not Available

    1991-06-01T23:59:59.000Z

    The ICSD is a microcomputer-based system which presents four types of data: (1) the quantity of coal traded between importers and exporters, (2) the price of particular ranks of coal and the cost of shipping it in world trade, (3) a detailed look at coal shipments entering and leaving the United States, and (4) the context for world coal trade in the form of data on how coal and other primary energy sources are used now and are projected to be used in the future, especially by major industrial economies. The ICSD consists of more than 140 files organized into a rapid query system for coal data. It can operate on any IBM-compatible microcomputer with 640 kilobytes of memory and a hard disk drive with at least 8 megabytes of available space. The ICSD is: 1. A menu-driven, interactive data base using Dbase 3+ and Lotus 1-2-3. 2. Inputs include official and commercial statistics on international coal trade volumes and consumption. 3. Outputs include dozens of reports and color graphic displays. Output report types include Lotus worksheets, dBase data bases, ASCII text files, screen displays, and printed reports. 4. Flexible design permits the user to follow the structured query system or design his own queries using either Lotus or dBase procedures. 5. Includes maintenance programs to configure the system, correct indexing errors, back-up work, restore corrupted files, annotate user-created files and update system programs, use DOS shells, and much more. Forecasts and other information derived from the ICSD are published in EIA's Annual Prospects for World Coal Trade (DOE/EIA-0363).

  6. CHEMFORM user's guide

    SciTech Connect (OSTI)

    Sjoreen, A.; Toran, L.

    1996-01-01T23:59:59.000Z

    CHEMFORM is a DOS-based program which converts geochemical data files into the format read by the U.S. Geological Survey family of models: WATEQ4F, PHREEQE, or NETPATH. These geochemical models require data formatted in a particular order, which typically does not match data storage. CHEMFORM converts geochemical data that are stored in an ASCII file to input files that can be read by these models, without being re-entered by hand. The data may be in any order and format in the original file, as long as they are separated by blanks. The location of each data element in the input file is entered in CHEMFORM. Any required data that are not present in your file may also be entered. The positions of the data in the input file are saved to be used as defaults for the next run. CHEMFORM runs in two modes. In the first mode, it will read one input file and write one output file. The input file may contain data on multiple lines, and the user will specify both line number and position of each item in CHEMFORM. This mode facilitates the conversion of the input from one model to the format needed by another model. In the second mode, the CHEMFORM input file contains more than one water analysis. All the geochemical data for a given sample are stored on one line, and CHEMFORM writes an output file for each line. This mode is useful when many samples are available for a site in the same format (different monitoring points or samples taken at different times from one monitoring point).

  7. Visual display of reservoir parameters affecting enhanced oil recovery. Annual report, October 1, 1995--September 30, 1996

    SciTech Connect (OSTI)

    Wood, J.R.

    1997-04-01T23:59:59.000Z

    The Multimedia Database Management System (MDMS) has been developed in the commercial software package Toolbook. Design and implementation, which were carried out by C. Asiala, are now essentially complete. Regional location maps of southern San Joaquin Valley oil fields, structure contour maps of the Pioneer area, core photos, core data, thin-section and SEM photomicrographs of core materials, structural cross sections through Pioneer Anticline, an atlas of photomicrographs illustrating typical diagenetic features observed in San Joaquin Valley petroleum reservoirs, elemental and spectral data collected on Fourier Transform Infrared Spectroscopy (FTIR) standards, and all quarterly and annual reports submitted to DOE for this project were scanned into the MDMS. All data and information are accessible through dropdown menus and hotlinks in a Table of Contents. A tutorial is presented up front to guide users through the MDMS and instruct them on the various ways in which data can be viewed and retrieved. Version 1.0 of the MDMS was written to CD-ROM and distributed to participants in a Technology Transfer Workshop in Bakersfield, CA, in September 1996. Version 1.1, which contains additional information and has been reorganized for easier use, is nearing completion. All measured and computed log curves (computed curves represent parameters such as porosity, water saturation, and clay content, which were calculated from the measured log traces using specially developed algorithms) for the 45+ project wells on Pioneer Anticline are now in the MDMS in LAS (log ASCII) format, and can be exported to any commercial log evaluation program for manipulation and analysis. All log curves were written to the CD-ROM in digital format.

  8. PC/FRAM, Version 3.2 User Manual

    SciTech Connect (OSTI)

    Kelley, T.A.; Sampson, T.E.

    1999-02-23T23:59:59.000Z

    This manual describes the use of version 3.2 of the PC/FRAM plutonium isotopic analysis software developed in the Safeguards Science and Technology Group, NE-5, Nonproliferation and International Security Division, Los Alamos National Laboratory. The software analyzes the gamma ray spectrum from plutonium-bearing items and determines the isotopic distribution of the plutonium, the 241Am content, and the concentration of other isotopes in the item. The software can also determine the isotopic distribution of uranium isotopes in items containing only uranium. The body of this manual describes the generic version of the code. Special facility-specific enhancements, if they apply, will be described in the appendices. The information in this manual applies equally well to version 3.3, which has been licensed to ORTEC. The software can analyze data that is stored in a file on disk. It understands several storage formats including Canberra's S100 format, ORTEC's 'chn' and 'SPC' formats, and several ASCII text formats. The software can also control data acquisition using an MCA and then store the results in a file on disk for later analysis or analyze the spectrum directly after the acquisition. The software currently only supports the control of ORTEC MCBs. Support for Canberra's Genie-2000 Spectroscopy Systems will be added in the future. Support for reading and writing CAM files will also be forthcoming. A versatile parameter file database structure governs all facets of the data analysis. User editing of the parameter sets allows great flexibility in handling data with different isotopic distributions, interfering isotopes, and different acquisition parameters such as energy calibration and detector type. This manual is intended for the system supervisor or the local user who is to be the resident expert. Excerpts from this manual may also be appropriate for the system operator who will routinely use the instrument.

  9. Three-dimensional representations of salt-dome margins at four active strategic petroleum reserve sites.

    SciTech Connect (OSTI)

    Rautman, Christopher Arthur; Stein, Joshua S.

    2003-01-01T23:59:59.000Z

    Existing paper-based site characterization models of salt domes at the four active U.S. Strategic Petroleum Reserve sites have been converted to digital format and visualized using modern computer software. The four sites are the Bayou Choctaw dome in Iberville Parish, Louisiana; the Big Hill dome in Jefferson County, Texas; the Bryan Mound dome in Brazoria County, Texas; and the West Hackberry dome in Cameron Parish, Louisiana. A new modeling algorithm has been developed to overcome limitations of many standard geological modeling software packages in order to deal with structurally overhanging salt margins that are typical of many salt domes. This algorithm, and the implementing computer program, make use of the existing interpretive modeling conducted manually using professional geological judgement and presented in two dimensions in the original site characterization reports as structure contour maps on the top of salt. The algorithm makes use of concepts of finite-element meshes of general engineering usage. Although the specific implementation of the algorithm described in this report and the resulting output files are tailored to the modeling and visualization software used to construct the figures contained herein, the algorithm itself is generic and other implementations and output formats are possible. The graphical visualizations of the salt domes at the four Strategic Petroleum Reserve sites are believed to be major improvements over the previously available two-dimensional representations of the domes via conventional geologic drawings (cross sections and contour maps). Additionally, the numerical mesh files produced by this modeling activity are available for import into and display by other software routines. The mesh data are not explicitly tabulated in this report; however an electronic version in simple ASCII format is included on a PC-based compact disk.

  10. Mixing Cell Model: A One-Dimensional Numerical Model for Assessment of Water Flow and Contaminant Transport in the Unsaturated Zone

    SciTech Connect (OSTI)

    A. S. Rood

    2010-10-01T23:59:59.000Z

    This report describes the Mixing Cell Model code, a one-dimensional model for water flow and solute transport in the unsaturated zone under steady-state or transient flow conditions. The model is based on the principles and assumptions underlying mixing cell model formulations. The unsaturated zone is discretized into a series of independent mixing cells. Each cell may have unique hydrologic, lithologic, and sorptive properties. Ordinary differential equations describe the material (water and solute) balance within each cell. Water flow equations are derived from the continuity equation assuming that unit-gradient conditions exist at all times in each cell. Pressure gradients are considered implicitly through model discretization. Unsaturated hydraulic conductivity and moisture contents are determined by the material-specific moisture characteristic curves. Solute transport processes include explicit treatment of advective processes, first-order chain decay, and linear sorption reactions. Dispersion is addressed through implicit and explicit dispersion. Implicit dispersion is an inherent feature of all mixing cell models and originates from the formulation of the problem in terms of mass balance around fully mixed volume elements. Expressions are provided that relate implicit dispersion to the physical dispersion of the system. Two FORTRAN codes were developed to solve the water flow and solute transport equations: (1) the Mixing-Cell Model for Flow (MCMF) solves transient water flow problems and (2) the Mixing Cell Model for Transport (MCMT) solves the solute transport problem. The transient water flow problem is typically solved first by estimating the water flux through each cell in the model domain as a function of time using the MCMF code. These data are stored in either ASCII or binary files that are later read by the solute transport code (MCMT). Code output includes solute pore water concentrations, water and solute inventories in each cell and at each specified output time, and water and solute fluxes through each cell and specified output time. Computer run times for coupled transient water flow and solute transport were typically several seconds on a 2 GHz Intel Pentium IV desktop computer. The model was benchmarked against analytical solutions and finite-element approximations to the partial differential equations (PDE) describing unsaturated flow and transport. Differences between the maximum solute flux estimated by the mixing-cell model and the PDE models were typically less than two percent.
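
    A hedged numerical sketch of the mixing-cell idea (not the MCMF/MCMT codes themselves): a chain of fully mixed cells with a steady water flux, first-order decay, and linear sorption expressed as a retardation factor, stepped forward with explicit Euler. All parameter values are illustrative only.

        import numpy as np

        n_cells, dt, n_steps = 20, 0.1, 2000     # number of cells, time step (yr), steps
        q = 0.25                                 # steady water flux (m/yr)
        theta, dz = 0.15, 0.5                    # moisture content (-), cell thickness (m)
        decay, R = 0.01, 2.0                     # decay constant (1/yr), retardation factor
        c = np.zeros(n_cells)                    # pore-water concentration in each cell
        c_in = 1.0                               # constant source concentration at the top boundary

        for _ in range(n_steps):
            inflow = np.concatenate(([c_in], c[:-1]))          # advective inflow from the cell above
            dcdt = (q / (theta * dz * R)) * (inflow - c) - decay * c
            c = c + dt * dcdt                                  # explicit Euler update

        print("concentration profile:", np.round(c, 3))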

  11. Serial Input Output

    SciTech Connect (OSTI)

    Waite, Anthony; /SLAC

    2011-09-07T23:59:59.000Z

    Serial Input/Output (SIO) is designed to be a long-term storage format of a sophistication somewhere between simple ASCII files and the techniques provided by, inter alia, Objectivity and Root. The former tend to be low density, information-lossy (floating-point numbers lose precision), and inflexible. The latter require abstract descriptions of the data with all that that implies in terms of extra complexity. The basic building blocks of SIO are streams, records, and blocks. Streams provide the connections between the program and files. The user can define an arbitrary list of streams as required. A given stream must be opened for either reading or writing. SIO does not support read/write streams. If a stream is closed during the execution of a program, it can be reopened in either read or write mode to the same or a different file. Records represent a coherent grouping of data. Records consist of a collection of blocks (see next paragraph). The user can define a variety of records (headers, events, error logs, etc.) and request that any of them be written to any stream. When SIO reads a file, it first decodes the record name and, if that record has been defined and unpacking has been requested for it, SIO proceeds to unpack the blocks. Blocks are user-provided objects which do the real work of reading/writing the data. The user is responsible for writing the code for these blocks and for identifying these blocks to SIO at run time. To write a collection of blocks, the user must first connect them to a record. The record can then be written to a stream as described above. Note that the same block can be connected to many different records. When SIO reads a record, it scans through the blocks written and calls the corresponding block object (if it has been defined) to decode it. Undefined blocks are skipped. Each of these categories (streams, records, and blocks) has some characteristics in common. Every stream, record, and block has a name, with the condition that each stream, record, or block name must be unique in its category (i.e., all streams must have different names, but a stream can have the same name as a record). Each category is an arbitrary-length list which is handled by a 'manager', and there is one manager for each category.

  12. AGR-2 Final Data Qualification Report for U.S. Capsules - ATR Cycles 147A Through 154B

    SciTech Connect (OSTI)

    Pham, Binh T; Einerson, Jeffrey J

    2014-07-01T23:59:59.000Z

    This report provides the data qualification status of AGR-2 fuel irradiation experimental data in four U.S. capsules from all 15 Advanced Test Reactor (ATR) Cycles 147A, 148A, 148B, 149A, 149B, 150A, 150B, 151A, 151B, 152A, 152B, 153A, 153B, 154A, and 154B, as recorded in the Nuclear Data Management and Analysis System (NDMAS). Thus, this report covers data qualification status for the entire AGR-2 irradiation and will replace four previously issued AGR-2 data qualification reports (e.g., INL/EXT-11-22798, INL/EXT-12-26184, INL/EXT-13-29701, and INL/EXT-13-30750). During AGR-2 irradiation, two cycles, 152A and 153A, occurred when the ATR core was briefly at low power, so AGR-2 irradiation data from these cycles are not used for physics and thermal calculations. Also, two cycles, 150A and 153B, are Power Axial Locator Mechanism (PALM) cycles when the ATR power is higher than during normal cycles. During the first PALM cycle, 150A, the experiment was temporarily moved from the B-12 location to the ATR water canal, and during the second PALM cycle, 153B, the experiment was temporarily moved from the B-12 location to the I-24 location to avoid being overheated. During the “Outage” cycle, 153A, seven flow meters were installed downstream from seven Fission Product Monitoring System (FPMS) monitors to measure flows from the monitors, and these data are included in the NDMAS database. The AGR-2 data streams addressed in this report include thermocouple (TC) temperatures, sweep gas data (flow rates including new FPM downstream flows, pressure, and moisture content), and FPMS data (release rates and release-to-birth rate ratios [R/Bs]) for each of the four U.S. capsules in the AGR-2 experiment (Capsules 2, 3, 5, and 6). The final data qualification status for these data streams is determined by a Data Review Committee composed of AGR technical leads, Very High Temperature Reactor (VHTR) Program Quality Assurance (QA), and NDMAS analysts. The Data Review Committee, which convened just before each data qualification report was issued, reviewed the data acquisition process, considered whether the data met the requirements for data collection as specified in QA-approved VHTR data collection plans, examined the results of NDMAS data testing and statistical analyses, and confirmed the qualification status of the data as given in each report. This report performs the following tasks: (1) combines the existing qualification status of all AGR-2 data, (2) provides an FPMS data qualification update and new release-to-birth ratio (R/B) data calculated using daily calculated birthrates, and (3) revises the data qualification status of TC readings for some TCs in Capsule 6 based on their differences relative to calculated temperatures at TC locations. A total of 17,001,695 TC temperature and sweep gas data records were received and processed by NDMAS for four U.S. capsules during AGR-2 irradiation. Of these records, 9,655,474 (56.8% of the total) were determined to be Qualified; 5,792,052 (34.1% of the total) were determined to be Failed; and 1,554,169 (9.1% of the total) were determined to be Trend. For the first nine cycles, ATR Cycle 147A through 151B, data records are 5-minute or 10-minute averaged values provided on a weekly basis in EXCEL spreadsheets. For the last six cycles, ATR Cycle 152A through 154B, data records are instantaneous measurements recorded every minute and provided in .csv text files automatically every 2 hours. Therefore, the number of processed irradiation data records increased substantially beginning with ATR Cycle 152A.
For TC temperature data, there were 6,857,675 records; of these, 5,288,249 records (77.1% of the total TC data) were Failed due to TC instrument failures and 418,569 records (6.1% of the total TC data) were Trend due to large differences between TC readings and calculated values. By the end of Cycle 154A, all TCs in the AGR-2 test train had failed. The overall percentage of Failed TC records is high, to some extent, because TCs failed toward the end of irradiation when the recording frequency was higher. For sweep gas data, there were 10,1