National Library of Energy BETA

Sample records for automated bot upload

  1. NERSC FTP Upload Service

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    NERSC FTP Upload Service The NERSC FTP Upload service is designed to let external collaborators send data to NERSC staff and users. It allows you to create a...

  2. Help:Uploading Files | Open Energy Information

    Open Energy Info (EERE)

    Uploading Files Click on the upload file link at the bottom of the page (NOTE: you must be logged in to have this option). Uploading tutorial.JPG Click...

  3. FTP Document Upload Website | Open Energy Information

    Open Energy Info (EERE)

    FTP Document Upload Website Abstract The Drinking Water and Groundwater Protection Division's (DWGPD) File Transfer Protocol (FTP) document upload website. The DWGPD is a division...

  4. PARS II 104 Contractor Monthly Upload | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    PARS II 104 Contractor Monthly Upload (PDF). More Documents & Publications: PARS II Training ...

  5. File FTP Document Upload Website | Open Energy Information

    Open Energy Info (EERE)

    DOI: Not Provided. Check for DOI availability: http://crossref.org. Online/Internet link for File FTP Document Upload Website. Citation: Vermont Agency of Natural...

  6. Tips & Tricks for Uploading Images with Research Highlights

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Tips & Tricks for Uploading Images with Research Highlights Images: (optional) Only images in JPEG, BMP, GIF, or PNG format can be accepted, up to 10 MB. The image caption is limited to 500 characters. Tip: For comparisons, lay multiple images out side by side, vertically, or in a grid formation to create a single image file for uploading on the Research Highlight Submittal Form. Trick: If image editing software is unavailable, the task can be accomplished using Microsoft (MS) Word as follows: 1. Insert

  7. EISA 432 Compliance Tracking System Data Upload Templates

    Broader source: Energy.gov [DOE]

    These generic Excel templates are available for federal contractors and service providers to provide federal clients with reports in the format agencies are required to use. Providing data in these templates will make it easy for agencies to upload your data into the EISA 432 Compliance Tracking System.

  8. U-199: Drupal Drag & Drop Gallery Module Arbitrary File Upload Vulnerability

    Broader source: Energy.gov [DOE]

    The vulnerability is caused due to the sites/all/modules/dragdrop_gallery/upload.php script improperly validating uploaded files, which can be exploited to execute arbitrary PHP code by uploading a PHP file with e.g. an appended ".gif" file extension.
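
    A minimal, hedged sketch of the class of check whose absence the advisory describes: the upload handler must not trust a trailing image extension but should inspect the full filename and the file's actual bytes. This is illustrative Python, not the Drupal module's code; the function name and allowlist are assumptions.

      # Illustrative upload validation (not the Drupal module's code): reject
      # double extensions such as "shell.php.gif" and require real image bytes.
      import os

      ALLOWED_EXTENSIONS = {".gif", ".png", ".jpg", ".jpeg"}

      def is_safe_image_upload(filename: str, data: bytes) -> bool:
          root, ext = os.path.splitext(filename.lower())
          if ext not in ALLOWED_EXTENSIONS or "." in root:
              return False  # blocks "upload.php.gif"-style names
          # Check magic bytes so PHP source with an image extension is rejected.
          return (data[:6] in (b"GIF87a", b"GIF89a")
                  or data[:8] == b"\x89PNG\r\n\x1a\n"
                  or data[:3] == b"\xff\xd8\xff")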

  9. DropBot: An open-source digital microfluidic control system with precise control of electrostatic driving force and instantaneous drop velocity measurement

    SciTech Connect (OSTI)

    Fobel, Ryan; Donnelly Centre for Cellular and Biomolecular Research, 160 College St., Toronto, Ontario M5S 3E1 ; Fobel, Christian; Wheeler, Aaron R.; Donnelly Centre for Cellular and Biomolecular Research, 160 College St., Toronto, Ontario M5S 3E1; Department of Chemistry, University of Toronto, 80 St. George St., Toronto, Ontario M5S 3H6

    2013-05-13

    We introduce DropBot: an open-source instrument for digital microfluidics (http://microfluidics.utoronto.ca/dropbot). DropBot features two key functionalities for digital microfluidics: (1) real-time monitoring of instantaneous drop velocity (which we propose is a proxy for resistive forces), and (2) application of constant electrostatic driving forces through compensation for amplifier-loading and device capacitance. We anticipate that this system will enhance insight into failure modes and lead to new strategies for improved device reliability, and will be useful for the growing number of users who are adopting digital microfluidics for automated, miniaturized laboratory operation.
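
    A minimal sketch of the "constant driving force" idea in the abstract, assuming the standard digital-microfluidics electromechanical model f = 0.5 * c * V^2, where f is force per unit contact-line length and c is the capacitance per unit area of the dielectric stack. The numbers and function name are illustrative, not DropBot's firmware.

      import math

      # Illustrative only (not DropBot's firmware): pick the RMS drive voltage
      # that holds the electrostatic force per unit contact-line length constant
      # as the effective capacitance per unit area changes.
      def required_voltage(f_target, c_per_area):
          """f_target in N/m, c_per_area in F/m^2; returns volts (RMS)."""
          return math.sqrt(2.0 * f_target / c_per_area)

      # Hold roughly 0.03 N/m of driving force for a few assumed capacitances.
      for c in (4e-6, 5e-6, 6e-6):
          print(f"c = {c:.0e} F/m^2 -> V = {required_voltage(0.03, c):.0f} V")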

  10. V-033: ownCloud Cross-Site Scripting and File Upload Vulnerabilities

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    V-033: ownCloud Cross-Site Scripting and File Upload Vulnerabilities. November 26, 2012 - 2:00am. PROBLEM: ownCloud Cross-Site Scripting and File Upload Vulnerabilities. PLATFORM: ownCloud 4.5.2, 4.5.1, 4.0.9. ABSTRACT: Multiple vulnerabilities have been reported in ownCloud. REFERENCE LINKS: ownCloud Server Advisories; Secunia Advisory SA51357. IMPACT ASSESSMENT: Medium. DISCUSSION: 1) Input passed via the...

  11. V-151: RSA Archer eGRC Bugs Let Remote Authenticated Users Upload Files and

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    V-151: RSA Archer eGRC Bugs Let Remote Authenticated Users Upload Files and Let Remote Users Conduct Cross-Site Scripting Attacks. May 8, 2013 - 12:06am. PROBLEM: RSA Archer eGRC Bugs Let Remote Authenticated Users Upload Files and Let Remote Users Conduct Cross-Site Scripting Attacks

  12. V-151: RSA Archer eGRC Bugs Let Remote Authenticated Users Upload...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    V-151: RSA Archer eGRC Bugs Let Remote Authenticated Users Upload Files and Let Remote ... The vendor has issued a fix (5.3SP1). Related Articles: V-084: RSA Archer eGRC ...

  13. AntBot: Anti-pollution peer-to-peer botnets

    SciTech Connect (OSTI)

    Yan, Guanhua; Eidenbenz, Stephan; Ha, Duc T

    2009-01-01

    Botnets, which are responsible for many email spamming and DDoS (Distributed Denial of Service) attacks in the current Internet, have emerged as one of the most severe cyber-threats in recent years. To evade detection and improve resistance against countermeasures, botnets have evolved from the first generation that relies on IRC chat channels to deliver commands to the current generation that uses highly resilient P2P (Peer-to-Peer) protocols to spread their C&C (Command and Control) information. It is, however, revealed that P2P botnets, although relieved from the single point of failure that IRC botnets suffer, can be easily disrupted using pollution-based mitigation schemes [15]. In this paper, we play the devil's advocate and propose a new type of hypothetical botnet called AntBot, which aims to propagate its C&C information to individual bots even though there exists an adversary that persistently pollutes keys used by seized bots to search the command information. The key idea of AntBot is a tree-like structure that bots use to deliver the command so that captured bots reveal only limited information. To evaluate the effectiveness of AntBot against pollution-based mitigation in a virtual environment, we develop a distributed P2P botnet simulator. Using extensive experiments, we demonstrate that AntBot operates resiliently against pollution-based mitigation. We further present a few potential defense schemes that could effectively disrupt AntBot operations.

  14. File:INL-geothermal-mt.pdf | Open Energy Information

    Open Energy Info (EERE)

    File history: current version as of 12:41, 16 December 2010; 5,100 × 4,200 pixels (1.99 MB); uploaded by MapBot. Automated upload from NREL's "mapsearch"...

  15. V-177: VMware vCenter Chargeback Manager File Upload Handling...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Apache mod_proxy/mod_rewrite Bug Lets Remote Users Access Internal Servers; U-047: Siemens Automation License Manager Bugs Let Remote Users Deny Service or Execute Arbitrary Code...

  16. GPFA-AB_Phase1RiskAnalysisTask5DataUpload

    SciTech Connect (OSTI)

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster will have the same name, except with *.png as the file ending instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
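
    A short numpy sketch of the three combination rules named above (product, sum, and minimum of the four scaled risk factors), using made-up 2 x 2 grids in place of the project's co-registered rasters; the project's actual code is in the linked GitHub repository.

      import numpy as np

      # Made-up 2x2 grids standing in for the four scaled risk-factor rasters.
      reservoir   = np.array([[0.8, 0.6], [0.4, 0.9]])
      thermal     = np.array([[0.7, 0.5], [0.6, 0.8]])
      seismicity  = np.array([[0.9, 0.9], [0.5, 0.7]])
      utilization = np.array([[0.6, 0.4], [0.7, 0.8]])

      factors = np.stack([reservoir, thermal, seismicity, utilization])
      combined_product = factors.prod(axis=0)  # penalized by any poor factor
      combined_sum     = factors.sum(axis=0)   # additive aggregate
      combined_minimum = factors.min(axis=0)   # limited by the weakest factor
      print(combined_product, combined_sum, combined_minimum, sep="\n")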

  17. GPFA-AB_Phase1RiskAnalysisTask5DataUpload

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster will have the same name, except with *.png as the file ending instead of *.tif. Images with fairways or industries added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet list_of_contents.csv in the folder SupportingInfo. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder combining_metrics.

  18. RatBot: anti-enumeration peer-to-peer botnets

    SciTech Connect (OSTI)

    Yan, Guanhua; Eidenbenz, Stephan; Chen, Songqing

    2010-01-01

    Botnets have emerged as one of the most severe cyber threats in recent years. To obtain high resilience against a single point of failure, the new generation of botnets has adopted the peer-to-peer (P2P) structure. One critical question regarding these P2P botnets is: how big are they, really? To address this question, researchers have proposed both actively crawling and passively monitoring methods to enumerate existing P2P botnets. In this work, we go further to explore the potential strategies that botnets may have to obfuscate their true sizes. Towards this end, this paper introduces RatBot, a P2P botnet that applies some statistical techniques to defeat existing P2P botnet enumeration methods. The key ideas of RatBot are two-fold: (1) there exists a fraction of bots that are indistinguishable from their fake identities, which are spoofed IP addresses used to hide them; (2) we use a heavy-tailed distribution to generate the number of fake identities for each of these bots so that the sum of observed fake identities converges only slowly and thus has high variation. We use large-scale high-fidelity simulation to quantify the estimation errors under diverse settings, and the results show that a naive enumeration technique can overestimate the sizes of P2P botnets by one order of magnitude. We believe that our work reveals new challenges of accurately estimating the sizes of P2P botnets, and hope that it will raise security practitioners' awareness of these challenges. We further suggest a few countermeasures that can potentially defeat RatBot's anti-enumeration scheme.

  19. Program Automation

    Broader source: Energy.gov [DOE]

    Better Buildings Residential Network Data and Evaluation Peer Exchange Call Series: Program Automation, Call Slides and Discussion Summary, November 21, 2013. This data and evaluation peer exchange call discussed program automation.

  20. Automated diagnostic kiosk for diagnosing diseases

    DOE Patents [OSTI]

    Regan, John Frederick; Birch, James Michael

    2014-02-11

    An automated and autonomous diagnostic apparatus that is capable of dispensing collection vials and collection kits to users interested in collecting a biological sample and submitting their collected sample contained within a collection vial into the apparatus for automated diagnostic services. The user communicates with the apparatus through a touch-screen monitor. A user is able to enter personal information into the apparatus including medical history, insurance information, co-payment, and answer a series of questions regarding their illness, which is used to determine the assay most likely to yield a positive result. Remotely-located physicians can communicate with users of the apparatus using video tele-medicine and request specific assays to be performed. The apparatus archives submitted samples for additional testing. Users may receive their assay results electronically. Users may allow the uploading of their diagnoses into a central databank for disease surveillance purposes.

  1. TJ Automation | Open Energy Information

    Open Energy Info (EERE)

    TJ Automation Name: TJ Automation; Facility: TJ Automation; Sector: Wind energy; Facility Type: Small Scale Wind; Facility Status: In Service; Owner: TJ Automation...

  2. Multiplex automated genome engineering

    DOE Patents [OSTI]

    Church, George M; Wang, Harris H; Isaacs, Farren J

    2013-10-29

    The present invention relates to automated methods of introducing multiple nucleic acid sequences into one or more target cells.

  3. Shoe-String Automation

    SciTech Connect (OSTI)

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  4. Meikle Automation Inc | Open Energy Information

    Open Energy Info (EERE)

    Meikle Automation Inc Name: Meikle Automation Inc Place: Kitchener, Ontario, Canada Zip: N2E 3Z5 Product: Canadian manufacturer of automation systems...

  5. LANL to certify automated influenza surveillance system

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    LANL to certify automated influenza surveillance system LANL to certify automated influenza surveillance system A compact automated system for surveillance and screening of...

  6. Automated gas chromatography

    DOE Patents [OSTI]

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  7. Automated CCTV Tester

    Energy Science and Technology Software Center (OSTI)

    2000-09-13

    The purpose of an automated CCTV tester is to automatically and continuously monitor multiple perimeter security cameras for changes in a camera's measured resolution and alignment (camera looking at the proper area). It shall track and record the image quality and position of each camera and produce an alarm when a camera is out of specification.
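
    A hedged sketch of the kind of check described above: compare each camera's current frame against a stored reference for focus (a stand-in for measured resolution) and for alignment drift, and alarm when either falls outside a threshold. The metrics and thresholds are assumptions, not the tester's actual algorithm.

      import numpy as np

      # Illustrative metrics, not the actual tester's: gradient energy stands in
      # for resolution/focus, mean frame difference stands in for alignment.
      def sharpness(frame):
          gy, gx = np.gradient(frame.astype(float))
          return float(np.mean(gx ** 2 + gy ** 2))

      def alignment_error(frame, reference):
          return float(np.mean(np.abs(frame.astype(float) - reference.astype(float))))

      def check_camera(frame, reference, min_sharpness=50.0, max_misalignment=20.0):
          """Return alarm messages; an empty list means the camera is in spec."""
          alarms = []
          if sharpness(frame) < min_sharpness:
              alarms.append("resolution out of specification")
          if alignment_error(frame, reference) > max_misalignment:
              alarms.append("camera not viewing the proper area")
          return alarms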

  8. Honeywell Demonstrates Automated Demand Response Benefits for...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Honeywell Demonstrates Automated Demand Response Benefits for Utility, Commercial, and Industrial Customers Honeywell Demonstrates Automated Demand Response Benefits for Utility, ...

  9. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    High-Consequence Automation Robotics Homepage About Robotics Research & Development Advanced Controls Advanced Manipulation Cybernetics High-Consequence Automation Demilitarization...

  10. Automation Status | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Automation Status: Presented at the NREL Hydrogen and Fuel Cell Manufacturing R&D Workshop in Washington, DC, August 11-12, 2011 (PDF). More Documents & Publications: PEM Stack Manufacturing: Industry Status 2011; NREL/DOE Hydrogen and Fuel Cell Manufacturing R&D Workshop Report; Manufacturing Barriers to High Temperature PEM Commercialization

  11. Automated gas chromatography

    DOE Patents [OSTI]

    Mowry, C.D.; Blair, D.S.; Rodacy, P.J.; Reber, S.D.

    1999-07-13

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute. 7 figs.

  12. Automated macromolecular crystallization screening

    DOE Patents [OSTI]

    Segelke, Brent W.; Rupp, Bernhard; Krupka, Heike I.

    2005-03-01

    An automated macromolecular crystallization screening system wherein a multiplicity of reagent mixes are produced. A multiplicity of analysis plates is produced utilizing the reagent mixes combined with a sample. The analysis plates are incubated to promote growth of crystals. Images of the crystals are made. The images are analyzed with regard to suitability of the crystals for analysis by x-ray crystallography. A design of reagent mixes is produced based upon the expected suitability of the crystals for analysis by x-ray crystallography. A second multiplicity of mixes of the reagent components is produced utilizing the design and a second multiplicity of reagent mixes is used for a second round of automated macromolecular crystallization screening. In one embodiment the multiplicity of reagent mixes are produced by a random selection of reagent components.
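
    A hedged sketch of the screen-score-redesign loop the abstract describes: a first round of randomly selected reagent conditions, automated scoring of the imaged drops, and a second-round design biased toward the best-scoring conditions. Component names and the scoring stub are illustrative, not the patented system's logic.

      import random

      COMPONENTS = ["PEG 3350", "ammonium sulfate", "sodium citrate", "MPD", "NaCl"]

      def random_design(n=96):
          # Round 1: random selection of reagent components and conditions.
          return [{"precipitant": random.choice(COMPONENTS),
                   "concentration": round(random.uniform(0.5, 3.0), 2),
                   "pH": round(random.uniform(4.0, 9.0), 1)} for _ in range(n)]

      def score_images(conditions):
          # Stand-in for image analysis that rates crystal suitability for x-ray work.
          return [random.random() for _ in conditions]

      def refine_design(conditions, scores, n=96, keep=8):
          # Round 2: sample new mixes around the best-scoring round-1 conditions.
          best = [c for _, c in sorted(zip(scores, conditions), key=lambda t: t[0],
                                       reverse=True)[:keep]]
          return [dict(random.choice(best),
                       concentration=round(random.uniform(0.5, 3.0), 2))
                  for _ in range(n)]

      round1 = random_design()
      round2 = refine_design(round1, score_images(round1))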

  13. Automated Job Hazards Analysis

    Broader source: Energy.gov [DOE]

    AJHA Program - The Automated Job Hazard Analysis (AJHA) computer program is part of an enhanced work planning process employed at the Department of Energy's Hanford worksite. The AJHA system is routinely used to perform evaluations for medium- and high-risk work, and in the development of corrective maintenance work packages at the site. The tool is designed to ensure that workers are fully involved in identifying the hazards, requirements, and controls associated with tasks.

  14. zeller-sbl2011-upload.ppt

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    2011 FSI Models 41 * data in heavy use by model builders (U. Mosel) (P. dePerio) (T. Golan) * need measurements on other targets * and at higher energies (multi-) - LAr:...

  15. zeller-aps2011-upload.ppt

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... is on D 2 * QE considered the "golden channel" - it's simple ... clean - know size & ... 68k Pb, 65k Fe (DIS event reconstructed in iron) M I N E R A 1 s t g l i m p s e ...

  16. UTILITY Submit to BPA: Upload Template

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    achieved. (mmddyyyy) Was measure installed in a federal facility? (Dropdown box below) Utility assigned end user account or member number. Maximum length 50 characters....

  17. zeller-panic2011-upload.ppt

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    in precision & search for smaller and smaller effects) 1 2 3 m 2 ATM m 2 SOL S. Zeller, PANIC, July 26, 2011 Neutrino Cross Sections 3 NOvA T2K LBNE CNGS * pursuit...

  18. Upload Data - OpenEI Datasets

    Open Energy Info (EERE)

    open data DOE Open Data add to the catalog for DOE-funded data GDR: DOE's Geothermal Data Repository Geothermal Data add a submission to DOE's Geothermal Data Repository...

  19. Automated Proactive Fault Isolation: A Key to Automated Commissioning

    SciTech Connect (OSTI)

    Katipamula, Srinivas; Brambley, Michael R.

    2007-07-31

    In this paper, we present a generic model for automated continuous commissioning and then delve in detail into one of the processes, proactive testing for fault isolation, which is key to automating commissioning. The automated commissioning process uses passive observation-based fault detection and diagnostic techniques, followed by automated proactive testing for fault isolation, automated fault evaluation, and automated reconfiguration of controls together to continuously keep equipment controlled and running as intended. Only when hard failures occur or a physical replacement is required does the process require human intervention, and then sufficient information is provided by the automated commissioning system to target manual maintenance where it is needed. We then focus on fault isolation by presenting detailed logic that can be used to automatically isolate faults in valves, a common component in HVAC systems, as an example of how automated proactive fault isolation can be accomplished. We conclude the paper with a discussion of how this approach to isolating faults can be applied to other common HVAC components and their automated commissioning and a summary of key conclusions of the paper.
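
    A hedged sketch in the spirit of the proactive testing described above, for a heating-coil valve: drive the valve to its extremes, observe the air temperature rise across the coil, and classify the likely fault. The signals, thresholds, and rules are illustrative assumptions, not the paper's published logic.

      # Illustrative proactive test for a heating-coil valve (assumed thresholds
      # and rules, not the paper's published logic).
      def isolate_valve_fault(read_delta_t, command_valve, settle,
                              dt_open_min=5.0, dt_closed_max=1.0):
          """read_delta_t(): air temperature rise across the coil (deg C);
          command_valve(pos): drive valve fully open (1.0) or closed (0.0);
          settle(): wait for steady state after each command."""
          command_valve(1.0); settle()
          dt_open = read_delta_t()
          command_valve(0.0); settle()
          dt_closed = read_delta_t()

          if dt_open < dt_open_min and dt_closed < dt_closed_max:
              return "valve stuck closed (or no hot-water flow)"
          if dt_open >= dt_open_min and dt_closed >= dt_closed_max:
              return "valve leaking or stuck open"
          if dt_open < dt_open_min and dt_closed >= dt_closed_max:
              return "sensor or control fault suspected"
          return "no valve fault isolated"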

  20. Automated fiber pigtailing machine

    DOE Patents [OSTI]

    Strand, O.T.; Lowry, M.E.

    1999-01-05

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems. 26 figs.

  1. Automated fiber pigtailing machine

    DOE Patents [OSTI]

    Strand, Oliver T.; Lowry, Mark E.

    1999-01-01

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems.

  2. Automated Hazard Analysis

    Energy Science and Technology Software Center (OSTI)

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  3. Solar Automation Inc | Open Energy Information

    Open Energy Info (EERE)

    Solar Automation Inc Place: Albuquerque, New Mexico Zip: NM 8110 Product: Produces manufacturing equipment for PV cells. References: Solar Automation Inc1 This article is a...

  4. Brooks Automation Inc | Open Energy Information

    Open Energy Info (EERE)

    Product: Automation equipment supplier, including vacuum pumps for thin film PV manufacturing facilities. References: Brooks Automation Inc1 This article is a stub. You can...

  5. Automated Defect Classification (ADC)

    Energy Science and Technology Software Center (OSTI)

    1998-01-01

    The ADC Software System is designed to provide semiconductor defect feature analysis and defect classification capabilities. Defect classification is an important software method used by semiconductor wafer manufacturers to automate the analysis of defect data collected by a wide range of microscopy techniques in semiconductor wafer manufacturing today. These microscopies (e.g., optical bright and dark field, scanning electron microscopy, atomic force microscopy, etc.) generate images of anomalies that are induced or otherwise appear on wafer surfaces as a result of errant manufacturing processes or simple atmospheric contamination (e.g., airborne particles). This software provides methods for analyzing these images, extracting statistical features from the anomalous regions, and applying supervised classifiers to label the anomalies into user-defined categories.
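
    A hedged sketch of the pipeline the abstract outlines: extract simple statistical features from each segmented anomaly region and apply a supervised classifier to assign user-defined categories. scikit-learn, the feature set, and the category names are assumptions for illustration, not the ADC system's implementation.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def region_features(region):
          # Simple statistics of a grayscale defect region (illustrative features).
          p = region.astype(float).ravel()
          return [p.mean(), p.std(), p.min(), p.max(), float(p.size)]

      # Labeled training regions (random stand-ins for real segmented defects).
      rng = np.random.default_rng(0)
      train_regions = [rng.integers(0, 255, size=(16, 16)) for _ in range(40)]
      train_labels = ["particle"] * 20 + ["scratch"] * 20  # user-defined categories

      clf = RandomForestClassifier(n_estimators=50, random_state=0)
      clf.fit([region_features(r) for r in train_regions], train_labels)

      new_region = rng.integers(0, 255, size=(16, 16))
      print(clf.predict([region_features(new_region)])[0])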

  6. Robust automated knowledge capture.

    SciTech Connect (OSTI)

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task- and experience-related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  7. Recommendation 207: Automate the Stewardship Verification Process

    Broader source: Energy.gov [DOE]

    ORSSAB recommends DOE automate the Stewardship Verification Process for the Remediation Effectiveness Report.

  8. Automating Shallow Seismic Imaging

    SciTech Connect (OSTI)

    Steeples, Don W.

    2004-12-09

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. In some easy-access environments, this device could make SSR surveying considerably more efficient and less expensive, particularly when geophone intervals of 25 cm or less are required. The most recent research analyzed the difference in seismic response of the geophones with variable geophone spike length and geophones attached to various steel media. Experiments investigated the azimuthal dependence of the quality of data relative to the orientation of the rigidly attached geophones. Other experiments designed to test the hypothesis that the data are being amplified in much the same way that an organ pipe amplifies sound have so far proved inconclusive. Taken together, the positive results show that SSR imaging within a few meters of the earth's surface is possible if the geology is suitable, that SSR imaging can complement GPR imaging, and that SSR imaging could be made significantly more cost effective, at least in areas where the topography and the geology are favorable. Increased knowledge of the Earth's shallow subsurface through non-intrusive techniques is of potential benefit to management of DOE facilities. Among the most significant problems facing hydrologists today is the delineation of preferential permeability paths in sufficient detail to make a quantitative analysis possible. Aquifer systems dominated by fracture flow have a reputation of being particularly difficult to characterize and model. At chemically contaminated sites, including U.S. 
Department of Energy (DOE) facilities and others at Department of Defense (DOD) installations worldwide, establishing the spatial extent of the contamination, along with the fate of the contaminants and their transport-flow directions, is essential to the development of effective cleanup strategies. Detailed characterization of the shallow subsurface is important not only in environmental, groundwater, and geotechnical engineering applications, but also in neotectonics, mining geology, and the analysis of petroleum reservoir analogs. Near-surface seismology is in the vanguard of non-intrusive approaches to increase knowledge of the shallow subsurface; our work is a significant departure from conventional seismic-survey field procedures.

  9. Specimen coordinate automated measuring machine/fiducial automated measuring machine

    DOE Patents [OSTI]

    Hedglen, Robert E.; Jacket, Howard S.; Schwartz, Allan I.

    1991-01-01

    The Specimen Coordinate Automated Measuring Machine (SCAMM) and the Fiducial Automated Measuring Machine (FAMM) are computer-controlled metrology systems capable of measuring length, width, and thickness, and of locating fiducial marks. SCAMM and FAMM have many similarities in their designs, and they can be converted from one to the other without taking them out of the hot cell. Both have means for: supporting a plurality of samples and a standard; controlling the movement of the samples in the +/- X and Y directions; determining the coordinates of the sample; compensating for temperature effects; and verifying the accuracy of the measurements and repeating as necessary. SCAMM and FAMM are designed to be used in hot cells.

  10. Automated nutrient analyses in seawater

    SciTech Connect (OSTI)

    Whitledge, T.E.; Malloy, S.C.; Patton, C.J.; Wirick, C.D.

    1981-02-01

    This manual was assembled for use as a guide for analyzing the nutrient content of seawater samples collected in the marine coastal zone of the Northeast United States and the Bering Sea. Some modifications (changes in dilution or sample pump tube sizes) may be necessary to achieve optimum measurements in very pronounced oligotrophic, eutrophic or brackish areas. Information is presented under the following section headings: theory and mechanics of automated analysis; continuous flow system description; operation of autoanalyzer system; cookbook of current nutrient methods; automated analyzer and data analysis software; computer interfacing and hardware modifications; and trouble shooting. The three appendixes are entitled: references and additional reading; manifold components and chemicals; and software listings. (JGB)

  11. Automated Demand Response and Commissioning

    SciTech Connect (OSTI)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

    This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve the electric grid reliability and manage electricity costs. Fully-Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. We refer to this as Auto-DR. For evaluation, the control and communications systems must be properly configured and pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve the connection of demand-responsive building systems to the electric grid's demand response systems.
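
    A hedged sketch of a fully automated DR client in the spirit described: poll an external price/event server and apply a pre-programmed shed strategy when a signal arrives, with no human in the loop. The endpoint, payload fields, and shed actions are assumptions, not the project's actual price-client interface.

      import json
      import time
      import urllib.request

      PRICE_SERVER = "https://example.org/dr-signal"  # hypothetical endpoint

      def read_signal():
          with urllib.request.urlopen(PRICE_SERVER, timeout=10) as resp:
              return json.load(resp)  # e.g. {"event": "moderate_shed"}

      def apply_shed(level):
          # Pre-programmed strategies; a real site would command its BAS here.
          strategies = {"normal": "restore normal setpoints",
                        "moderate_shed": "raise zone setpoints 2 F, dim lighting 20%",
                        "high_shed": "raise setpoints 4 F, shut off noncritical loads"}
          print("Applying:", strategies.get(level, "restore normal setpoints"))

      while True:
          try:
              apply_shed(read_signal().get("event", "normal"))
          except OSError:
              pass  # keep polling if the gateway or relay is unreachable
          time.sleep(300)  # poll every 5 minutes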

  12. Making the transition to automation

    SciTech Connect (OSTI)

    Christenson, D.J.

    1992-10-01

    By 1995, the Bureau of Reclamation's hydropower plant near Hungry Horse, Montana, will be remotely operated from Grand Coulee dam (about 300 miles away) in Washington State. Automation at Hungry Horse will eliminate the need for four full-time power plant operators. Between now and then, a transition plan that offers employees choices for retraining, transferring, or taking early retirement will smooth the transition in reducing from five operators to one. The transition plan also includes the use of temporary employees to offset risks of reducing staff too soon. When completed in 1953, the Hungry Horse structure was the world's fourth largest and fourth highest concrete dam. The arch-gravity structure has a crest length of 2,115 feet; it is 3,565 feet above sea level. The four turbine-generator units in the powerhouse total 284 MW, and supply approximately 1 billion kilowatt-hours of electricity annually to the federal power grid managed by the Bonneville Power Administration. In 1988, Reclamation began to automate operations at many of its hydro plants, and to establish centralized control points. The control center concept will increase efficiency. It also will coordinate water movements and power supply throughout the West. In the Pacific Northwest, the Grand Coulee and Black Canyon plants are automated control centers. Several Reclamation-owned facilities in the Columbia River Basin, including Hungry Horse, will be connected to these centers via microwave and telephone lines. When automation is complete, constant monitoring by computer will replace hourly manual readings and equipment checks. Computers also are expected to increase water use efficiency by 1 to 2 percent by ensuring operation for maximum turbine efficiency. Unit efficiency curves for various heads will be programmed into the system.

  13. LANL to certify automated influenza surveillance system

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    LANL to certify automated influenza surveillance system LANL to certify automated influenza surveillance system A compact automated system for surveillance and screening of potential pandemic strains of influenza and other deadly infectious diseases is a step closer to reality. January 31, 2011 Los Alamos National Laboratory sits on top of a once-remote mesa in northern New Mexico with the Jemez mountains as a backdrop to research and innovation covering multi-disciplines from bioscience,

  14. Automated Transportation Logistics and Analysis System (ATLAS)

    Energy Savers [EERE]

    Automated Office Systems Support (AOSS) Quality Assurance Model (PDF): a quality assurance model, including checklists, for activity relative to network and desktop computer support. More Documents & Publications: Audit Report: CR-B-97-04; CITSS Project Plan Quality Assurance Checklist; Insulated Cladding Systems | Department of Energy

  15. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    About Robotics Robotic arm With more than 25 years of experience and hundreds of ... HCAR has been developing high-consequence automation solutions for more than 25 years. ...

  16. Investigating Potential Strategies for Automating Commissioning Activities

    SciTech Connect (OSTI)

    Brambley, Michael R.; Briggs, Robert S.; Katipamula, Srinivas; Dasher, Carolyn; Luskay, Larry; Irvine, Linda

    2002-05-31

    This paper provides summary results from a project on automated and continuous commissioning currently underway for the Air-Conditioning & Refrigeration Technology Institute (ARTI). The project focuses on developing methods for automating parts of the commissioning of heating, ventilating and air-conditioning (HVAC) equipment in newly-built, as well as existing, commercial buildings. This paper provides a summary of work completed to date, which has focused on selecting building systems; operation problems; and parts of the commissioning process where automation is likely to provide the greatest benefits. It also includes an overview of the approach planned for development and demonstration of methods for automating the selected areas.

  17. Distributed Automated Demand Response - Energy Innovation Portal

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Energy Analysis, Electricity Transmission. Distributed Automated Demand Response, Lawrence Livermore ...

  18. Ditec Automation Group | Open Energy Information

    Open Energy Info (EERE)

    Name: Ditec Automation Group Place: Mexico City, Mexico Product: Mexico City-based manufacturing and installation company. Focused on material handling, industrial ovens,...

  19. Automated High Throughput Drug Target Crystallography

    SciTech Connect (OSTI)

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  20. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    timing studies, observe layouts, and help estimate project costs. In the flexible automation pilot plant, robots are used to open and remove munitions from containers. The...

  1. Automation Alley Technology Center | Open Energy Information

    Open Energy Info (EERE)

    Alley Technology Center Name: Automation Alley Technology Center Place: United States Sector: Services Product: General Financial & Legal Services (...

  2. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    First responder support. Questions and comments about High Consequence, Automation, & Robotics? Contact us.

  3. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    team knew they needed a robot for the job and called Sandia's High Consequence, Automation, & Robotics (HCAR) team. Mighty Mouse Challenge Typically the cylinder moved back...

  4. Automated fuel pin loading system

    DOE Patents [OSTI]

    Christiansen, David W. (Kennewick, WA); Brown, William F. (West Richland, WA); Steffen, Jim M. (Richland, WA)

    1985-01-01

    An automated loading system for nuclear reactor fuel elements utilizes a gravity feed conveyor which permits individual fuel pins to roll along a constrained path perpendicular to their respective lengths. The individual lengths of fuel cladding are directed onto movable transports, where they are aligned coaxially with the axes of associated handling equipment at appropriate production stations. Each fuel pin can be reciprocated axially and/or rotated about its axis as required during handling steps. The fuel pins are inserted as a batch prior to welding of end caps by one of two disclosed welding systems.

  5. Automated fuel pin loading system

    DOE Patents [OSTI]

    Christiansen, D.W.; Brown, W.F.; Steffen, J.M.

    An automated loading system for nuclear reactor fuel elements utilizes a gravity feed conveyor which permits individual fuel pins to roll along a constrained path perpendicular to their respective lengths. The individual lengths of fuel cladding are directed onto movable transports, where they are aligned coaxially with the axes of associated handling equipment at appropriate production stations. Each fuel pin can be reciprocated axially and/or rotated about its axis as required during handling steps. The fuel pins are inserted as a batch prior to welding of end caps by one of two disclosed welding systems.

  6. Automated Centrifugal Chiller Diagnostician - Energy Innovation Portal

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Energy Analysis. Automated Centrifugal Chiller Diagnostician, Pacific Northwest National Laboratory. Technology Marketing Summary: Researchers and engineers at PNNL have developed an automated, sophisticated, multi-level, real-time centrifugal chiller diagnostician with diagnostics available under partial...

  7. Automation of the longwall mining system

    SciTech Connect (OSTI)

    Zimmerman, W.; Aster, R.; Harris, J.; High, J.

    1982-11-01

    The longwall automation study presented is the first phase of a study to evaluate mining automation opportunities. The objective was to identify cost-effective, safe, and technologically sound applications of automation technology to underground coal mining. The prime automation candidates resulting from the industry experience and survey were: (1) the shearer operation, (2) shield and conveyor pan-line advance, (3) a management information system to allow improved mine logistics support, and (4) component fault isolation and diagnostics to reduce untimely maintenance delays. A system network analysis indicated that a 40% improvement in productivity was feasible if system delays associated with all of the above four areas were removed. A technology assessment and conceptual system design of each of the four automation candidate areas showed that state-of-the-art digital computer, servomechanism, and actuator technologies could be applied to automate the longwall system. The final cost benefit analysis of all of the automation areas indicated a total net national benefit (profit) of roughly $200 million to the longwall mining industry if all automation candidates were installed. This cost benefit represented an approximate order of magnitude payback on the research and development (R and D) investment. In conclusion, it is recommended that the shearer operation be automated first because it provides a large number of other sensor inputs required for face alignment (i.e., shields and conveyor). Automation of the shield and conveyor pan-line advance is suggested as the next step since both the shearer and face alignment operations contributed the greatest time delays to the overall system downtime.

  8. Microsoft Word - Wireless Automation World for OE FINAL.doc

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Automation World Features New White Paper on Wireless Security, Interviews Authors April 16, 2009 The April 2009 issue of Automation World magazine features the white paper ...

  9. ISA Approves Standard for Wireless Automation in Process Control...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Wireless Automation in Process Control Applications September 22, 2009 On September 9, the Standards and Practices Board of the International Society for Automation (ISA) approved ...

  10. Small- and Medium-Size Building Automation and Control System...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Small- and Medium-Size Building Automation and Control System Needs: Scoping Study Small- and Medium-Size Building Automation and Control System Needs: Scoping Study Michael ...

  11. ISA Approves Standard for Wireless Automation in Process Control...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ISA Approves Standard for Wireless Automation in Process Control Applications On September 9, the Standards and Practices Board of the International Society for Automation (ISA) ...

  12. U-047: Siemens Automation License Manager Bugs Let Remote Users...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    U-047: Siemens Automation License Manager Bugs Let Remote Users Deny Service or Execute Arbitrary Code...

  13. Multiplex automated genome engineering Church, George M; Wang...

    Office of Scientific and Technical Information (OSTI)

    Multiplex automated genome engineering Church, George M; Wang, Harris H; Isaacs, Farren J The present invention relates to automated methods of introducing multiple nucleic acid...

  14. National SCADA Test Bed Substation Automation Evaluation Report...

    Office of Scientific and Technical Information (OSTI)

    Technical Report: National SCADA Test Bed Substation Automation Evaluation Report ...

  15. Hirschmann Automation and Control GmbH | Open Energy Information

    Open Energy Info (EERE)

    Hirschmann Automation and Control GmbH Name: Hirschmann Automation and Control GmbH Place: Neckartenzlingen, Baden-Württemberg, Germany Zip: 72654...

  16. DA (Distribution Automation) (Smart Grid Project) | Open Energy...

    Open Energy Info (EERE)

    DA (Distribution Automation) (Smart Grid Project) Project Name: DA (Distribution Automation) Country: Netherlands Coordinates: 52.132633, 5.291266...

  17. Belden Deutschland GmbH Lumberg Automation | Open Energy Information

    Open Energy Info (EERE)

    Belden Deutschland GmbH Lumberg Automation Name: Belden Deutschland GmbH - Lumberg Automation Place: Schalksmühle, North Rhine-Westphalia, Germany Zip:...

  18. Global health response more accurate with automated influenza...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Global health response more accurate with automated influenza surveillance Global health response more accurate with automated influenza surveillance Public health officials will...

  19. Highly Insulating Residential Windows Using Smart Automated Shading...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Highly Insulating Residential Windows Using Smart Automated Shading. Residential Smart Window with ...

  20. Small- and Medium-Size Building Automation and Control System...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Small- and Medium-Size Building Automation and Control System Needs: Scoping Study Small- and Medium-Size Building Automation and Control System Needs: Scoping Study Emerging ...

  1. Reference Model for Control and Automation Systems in Electrical...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Reference Model for Control and Automation Systems in Electrical Power (October 2005) Modern ...

  2. Automated Fresnel lens tester system

    SciTech Connect (OSTI)

    Phipps, G.S.

    1981-07-01

    An automated data collection system controlled by a desktop computer has been developed for testing Fresnel concentrators (lenses) intended for solar energy applications. The system maps the two-dimensional irradiance pattern (image) formed in a plane parallel to the lens, while the lens and detector assembly track the sun. A point detector silicon diode (0.5-mm-dia active area) measures the irradiance at each point of an operator-defined rectilinear grid of data positions. Comparison with a second detector measuring solar insolation levels results in solar concentration ratios over the image plane. Summation of image plane energies allows calculation of lens efficiencies for various solar cell sizes. Various graphical plots of concentration ratio data help to visualize energy distribution patterns.
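
    A short sketch of the arithmetic described above: the concentration ratio at each grid point is the measured image-plane irradiance divided by the insolation reading, and summing the image-plane power over a candidate cell area gives that cell's collection efficiency. The synthetic irradiance map, grid spacing, and lens size are illustrative.

      import numpy as np

      n, dx = 21, 0.002                       # 21 x 21 grid, 2 mm spacing (m)
      x = (np.arange(n) - n // 2) * dx
      X, Y = np.meshgrid(x, x)
      insolation = 1000.0                     # W/m^2 from the reference detector
      # Synthetic focused image (illustrative stand-in for measured data).
      irradiance = 35000.0 * np.exp(-(X ** 2 + Y ** 2) / (2 * 0.006 ** 2))

      concentration_ratio = irradiance / insolation        # map over image plane
      power_on_cell = irradiance.sum() * dx * dx           # W on the mapped cell
      power_on_lens = insolation * 0.1 * 0.1               # W on a 10 cm square lens
      print(round(concentration_ratio.max(), 1),
            round(power_on_cell / power_on_lens, 2))       # peak ratio, efficiency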

  3. Fueling Robot Automates Hydrogen Hose Reliability Testing (Fact Sheet)

    SciTech Connect (OSTI)

    Harrison, K.

    2014-01-01

    Automated robot mimics fueling action to test hydrogen hoses for durability in real-world conditions.

  4. MASS: An automated accountability system

    SciTech Connect (OSTI)

    Erkkila, B.H.; Kelso, F.

    1994-08-01

    All Department of Energy contractors who manage accountable quantities of nuclear materials are required to implement an accountability system that tracks and records the activities associated with those materials. At Los Alamos, the automated accountability system allows data entry on computer terminals and data base updating as soon as the entry is made. It is also able to generate all required reports in a timely fashion. Over the last several years, the hardware and software have been upgraded to provide the users with all the capability needed to manage a large variety of operations with a wide variety of nuclear materials. Enhancements to the system are implemented as the needs of the users are identified. The system has grown with the expanded needs of the users and has survived several years of changing operations and activity. The user community served by this system includes processing, materials control and accountability, and nuclear material management personnel. In addition to serving the local users, the accountability system supports the national data base (NMMSS). This paper contains a discussion of several details of the system design and operation. After several years of successful operation, this system provides an operating example of how computer systems can be used to manage a very dynamic data management problem.

  5. Automated Parallel Capillary Electrophoretic System

    DOE Patents [OSTI]

    Li, Qingbo; Kane, Thomas E.; Liu, Changsheng; Sonnenschein, Bernard; Sharer, Michael V.; Kernan, John R.

    2000-02-22

    An automated electrophoretic system is disclosed. The system employs a capillary cartridge having a plurality of capillary tubes. The cartridge has a first array of capillary ends projecting from one side of a plate. The first array of capillary ends are spaced apart in substantially the same manner as the wells of a microtitre tray of standard size. This allows one to simultaneously perform capillary electrophoresis on samples present in each of the wells of the tray. The system includes a stacked, dual carousel arrangement to eliminate cross-contamination resulting from reuse of the same buffer tray on consecutive executions of electrophoresis. The system also has a gel delivery module containing a gel syringe/stepper motor or a high-pressure chamber with a pump to quickly and uniformly deliver gel through the capillary tubes. The system further includes a multi-wavelength beam generator that produces a laser beam with a wide range of wavelengths. An off-line capillary reconditioner thoroughly cleans a capillary cartridge to enable simultaneous execution of electrophoresis with another capillary cartridge. The streamlined nature of the off-line capillary reconditioner offers the advantage of increased system throughput with a minimal increase in system cost.

  6. Image upload with broken thumbnail image | OpenEI Community

    Open Energy Info (EERE)

  7. Empty values in upload? | OpenEI Community

    Open Energy Info (EERE)

  8. Error 401 on upload? | OpenEI Community

    Open Energy Info (EERE)

  9. Automated Sorting of Transuranic Waste

    SciTech Connect (OSTI)

    Shurtliff, Rodney Marvin

    2001-03-01

    The HANDSS-55 Transuranic Waste Sorting Module is designed to sort out items found in 55-gallon drums of waste as determined by an operator. Innovative imaging techniques coupled with fast linear motor-based motion systems and a flexible end-effector system allow the operator to remove items from the waste stream with a touch of the finger. When all desired items are removed from the waste stream, the remaining objects are automatically moved to a repackaging port for removal from the glovebox/cell. The Transuranic Waste Sorting Module consists of 1) a high-accuracy XYZ Stereo Measurement and Imaging system, 2) a vibrating/tilting sorting table, 3) an XY Deployment System, 4) a ZR Deployment System, 5) several user-selectable end-effectors, 6) a waste bag opening system, 7) control and instrumentation, 8) a noncompliant waste load-out area, and 9) a Human/Machine Interface (HMI). The system is modular in design to accommodate database management tools, additional load-out ports, and other enhancements. Manually sorting the contents of a 55-gallon drum takes about one day per drum. The HANDSS-55 Waste Sorting Module is designed to significantly increase the throughput of this sorting process by automating those functions that are strenuous and tiresome for an operator to perform. The Waste Sorting Module uses the inherent ability of an operator to identify the items that need to be segregated from the waste stream and then, under computer control, picks each item out of the waste and deposits it in the appropriate location. The operator identifies an object by locating its visual image on a large color display and touching the image with a finger. The computer then determines the location of the object and, by performing a high-speed image analysis, determines its size and orientation so that a robotic gripper can be deployed to pick it up. Following operator verification by voice or function key, the object is deposited into a specified location.
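
    The touch-to-pick step (the operator touches an object's image, the computer measures its location, size, and orientation for the gripper) can be sketched with OpenCV as below; the threshold, synthetic scene, and returned pose fields are assumptions for illustration, not the HANDSS-55 implementation.

```python
import cv2
import numpy as np

def object_pose_from_touch(gray_image, touch_xy, intensity_threshold=128):
    """Given the pixel the operator touched, find the enclosing object and
    return its center, size, and orientation from a minimum-area rectangle."""
    _, mask = cv2.threshold(gray_image, intensity_threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.pointPolygonTest(contour, touch_xy, measureDist=False) >= 0:
            (cx, cy), (w, h), angle_deg = cv2.minAreaRect(contour)
            return {"center_px": (cx, cy), "size_px": (w, h), "angle_deg": angle_deg}
    return None

# Illustrative usage with synthetic data: a bright rectangle in a dark scene.
scene = np.zeros((480, 640), dtype=np.uint8)
cv2.rectangle(scene, (200, 150), (320, 220), color=255, thickness=-1)
pose = object_pose_from_touch(scene, touch_xy=(250.0, 180.0))
print(pose)   # a gripper deployment system (XY/ZR stages) would consume these values
```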

  10. Automated imaging system for single molecules

    DOE Patents [OSTI]

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.
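
    Two of the image-processing steps mentioned above, correcting uneven fluorescence illumination and maintaining focus, can be illustrated with a minimal sketch; the flat-field formula and the variance-of-Laplacian focus metric are common stand-ins, not necessarily the patented method.

```python
import numpy as np
from scipy.ndimage import laplace

def flat_field_correct(raw, dark, flat):
    """Correct uneven illumination: (raw - dark) divided by the normalized flat field."""
    gain = (flat - dark).astype(np.float64)
    gain /= gain.mean()
    return (raw - dark) / np.clip(gain, 1e-6, None)

def focus_score(image):
    """Variance of the Laplacian; larger values indicate sharper focus.
    An autofocus loop could move the stage to maximize this score."""
    return laplace(image.astype(np.float64)).var()

# Illustrative usage with synthetic frames (tilted illumination profile).
rng = np.random.default_rng(0)
dark = rng.normal(100, 2, (512, 512))
flat = dark + np.linspace(800, 1200, 512)[None, :]
raw = dark + np.linspace(800, 1200, 512)[None, :] * 0.5
corrected = flat_field_correct(raw, dark, flat)
print(round(focus_score(corrected), 2))
```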

  11. Beyond Commissioning: The Role of Automation

    SciTech Connect (OSTI)

    Brambley, Michael R.; Katipamula, Srinivas

    2005-02-01

    This article takes a brief look at the benefits of commissioning and describes a vision of the future where most of the objectives of commissioning will be accomplished automatically by capabilities built into the building systems themselves. Commissioning will become an activity that's performed continuously rather than periodically, and only repairs requiring replacement or overhaul of equipment will require manual intervention. The article then identifies some of the technologies that will be needed to realize this vision and ends with a call for all involved in the enterprise of building commissioning and automation to embrace and dedicate themselves to a future of automated commissioning.

  12. Preliminary Framework for Human-Automation Collaboration

    SciTech Connect (OSTI)

    Oxstrand, Johanna Helene; Le Blanc, Katya Lee; Spielman, Zachary Alexander

    2015-09-01

    The Department of Energy’s Advanced Reactor Technologies Program sponsors research, development and deployment activities through its Next Generation Nuclear Plant, Advanced Reactor Concepts, and Advanced Small Modular Reactor (aSMR) Programs to promote safety, technical, economical, and environmental advancements of innovative Generation IV nuclear energy technologies. The Human Automation Collaboration (HAC) Research Project is located under the aSMR Program, which identifies developing advanced instrumentation and controls and human-machine interfaces as one of four key research areas. It is expected that the new nuclear power plant designs will employ technology significantly more advanced than the analog systems in the existing reactor fleet, as well as utilize automation to a greater extent. Moving towards more advanced technology and more automation does not necessarily imply more efficient and safer operation of the plant. Instead, a number of concerns about how these technologies will affect human performance and the overall safety of the plant need to be addressed. More specifically, it is important to investigate how the operator and the automation work as a team to ensure effective and safe plant operation, also known as human-automation collaboration (HAC). The focus of the HAC research is to understand how various characteristics of automation (such as its reliability, processes, and modes) affect an operator’s use and awareness of plant conditions. In other words, the research team investigates how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. This report addresses the Department of Energy milestone M4AT-15IN2302054, Complete Preliminary Framework for Human-Automation Collaboration, by discussing the two-phase development of a preliminary HAC framework. The framework developed in the first phase was used as the basis for selecting topics to be investigated in more detail. The results and insights gained from the in-depth studies conducted during the second phase were used to revise the framework. This report describes the basis for the framework developed in phase 1, the changes made to the framework in phase 2, and the basis for the changes. Additional research needs are identified and presented in the last section of the report.

  13. Middleware Automated Deployment Utilities - MRW Suite

    Energy Science and Technology Software Center (OSTI)

    2014-09-18

    The Middleware Automated Deployment Utilities consist of these three components: MAD: a utility designed to automate the deployment of Java applications to multiple Java application servers; the product contains a front-end web utility and back-end deployment scripts. MAR: a web front end to maintain and update the components inside the database. MWR-Encrypt: a web utility to convert a text string to an encrypted string that is used by the Oracle Weblogic application server. The encryption is done using the built-in functions of the Oracle Weblogic product and is mainly used to create an encrypted version of a database password.

  14. KSL Kuttler Automation Systems GmbH | Open Energy Information

    Open Energy Info (EERE)

    KSL Kuttler Automation Systems GmbH. Name: KSL-Kuttler Automation Systems GmbH Place: Dauchingen, Baden-Württemberg, Germany Zip: 78083 Sector: Solar...

  15. U.S. Customs and Border Protection (CBP) Announces Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Customs and Border Protection (CBP) Announces Automation of Form I-94 Arrival/Departure Record. U.S. Customs and Border Protection (CBP) will begin automation of the I-94 records on...

  16. V-132: IBM Tivoli System Automation Application Manager Multiple

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Vulnerabilities | Department of Energy. April 12, 2013 - 6:00am. PROBLEM: IBM has acknowledged multiple vulnerabilities in IBM Tivoli System Automation Application Manager. PLATFORM: The vulnerabilities are reported in IBM Tivoli System Automation Application Manager versions 3.1, 3.2, 3.2.1, and 3.2.2. ABSTRACT: Multiple security

  17. Automated Transportation Logistics and Analysis System (ATLAS) | Department

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    of Energy. The Department of Energy's (DOE's) Automated Transportation Logistics and Analysis System is an integrated web-based logistics management system allowing users to manage inbound and outbound freight shipments by highway, rail, and air. PDF icon Automated Transportation Logistics and Analysis

  18. Automation World Features New White Paper on Wireless Security | Department

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    of Energy. The April 2009 issue of Automation World magazine features the white paper Wireless Systems Considerations When Implementing NERC Critical Infrastructure Protection Standards. PDF icon Automation World Features New White Paper on Wireless Security More Documents & Publications Wireless System Considerations When Implementing NERC Critical Infrastructure

  19. ISA Approves Standard for Wireless Automation in Process Control

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Applications | Department of Energy. On September 9, the Standards and Practices Board of the International Society for Automation (ISA) approved the ISA-100.11a wireless standard, "Wireless Systems for Industrial Automation: Process Control and Related Applications," making it an official ISA standard. PDF icon ISA Approves

  20. Reference Model for Control and Automation Systems in Electrical Power

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    (October 2005) | Department of Energy. Modern infrastructure automation systems are threatened by cyber attack. Their higher visibility in recent years and the increasing use of modern information technology (IT) components contribute to increased security risk. A means of analyzing these infrastructure automation systems is needed

  1. Classified Automated Information System Security Program

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1994-07-15

    To establish uniform requirements, policies, responsibilities, and procedures for the development and implementation of a Department of Energy (DOE) Classified Computer Security Program to ensure the security of classified information in automated data processing (ADP) systems. Cancels DOE O 5637.1. Canceled by DOE O 471.2.

  2. Apparatus for automated testing of biological specimens

    DOE Patents [OSTI]

    Layne, Scott P.; Beugelsdijk, Tony J.

    1999-01-01

    An apparatus for performing automated testing of infectious biological specimens is disclosed. The apparatus comprises a process controller for translating user commands into test instrument suite commands, and a test instrument suite comprising a means to treat the specimen to manifest an observable result, and a detector for measuring the observable result to generate specimen test results.

  3. Automated Office Systems Support (AOSS) Quality Assurance Model |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy. A quality assurance model, including checklists, for activity relative to network and desktop computer support. PDF icon Automated Office Systems Support (AOSS) Quality Assurance Model More Documents & Publications Audit Report: CR-B-97-04 CITSS Project Plan Quality Assurance Checklist

  4. Automation of Capacity Bidding with an Aggregator Using Open Automated Demand Response

    SciTech Connect (OSTI)

    Kiliccote, Sila; Piette, Mary Ann

    2008-10-01

    This report summarizes San Diego Gas & Electric Company's collaboration with the Demand Response Research Center to develop and test automation capability for the Capacity Bidding Program in 2007. The report describes the Open Automated Demand Response architecture and summarizes the history of technology development and pilot studies. It also outlines the Capacity Bidding Program and technology being used by an aggregator that participated in this demand response program. Due to delays, the program was not fully operational for summer 2007. However, a test event on October 3, 2007, showed that the project successfully achieved the objective to develop and demonstrate how an open, Web-based, interoperable automated notification system for capacity bidding can be used by aggregators for demand response. The system was effective in initiating a fully automated demand response shed at the aggregated sites. This project also demonstrated how aggregators can integrate their demand response automation systems with San Diego Gas & Electric Company's Demand Response Automation Server and capacity bidding program.
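
    As a loose illustration of the notification concept only (not the actual Open Automated Demand Response protocol or the utility's Demand Response Automation Server API), a participating site might poll a server for pending events and trigger a local shed; the endpoint URL, JSON fields, and shed hook below are hypothetical.

```python
import time
import requests

DRAS_URL = "https://dras.example.org/api/pending-events"   # hypothetical endpoint

def shed_load(kw):
    # Placeholder for the site's energy management system interface.
    print(f"Initiating automated shed of {kw} kW")

def poll_and_respond(site_id, interval_s=300):
    """Poll a (hypothetical) demand response automation server and trigger a
    local load shed whenever an event is pending for this aggregated site."""
    while True:
        resp = requests.get(DRAS_URL, params={"site": site_id}, timeout=30)
        resp.raise_for_status()
        for event in resp.json().get("events", []):         # assumed JSON shape
            if event.get("status") == "pending":
                shed_load(kw=event.get("shed_kw", 0))
        time.sleep(interval_s)
```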

  5. User:DWC Bot | Open Energy Information

    Open Energy Info (EERE)

    funds, proprietary trading companies and brokerage houses, by offering its world-class financial technology consulting services. DWC solutions are bleeding edge and innovative...

  6. Saturn facility oil transfer automation system

    SciTech Connect (OSTI)

    Joseph, Nathan R.; Thomas, Rayburn Dean; Lewis, Barbara Ann; Malagon, Hector M.

    2014-02-01

    The Saturn accelerator, owned by Sandia National Laboratories, has been in operation since the early 1980s and still has many of the original systems. A critical legacy system is the oil transfer system, which transfers 250,000 gallons of transformer oil from outside storage tanks to the Saturn facility. The oil transfer system was identified for upgrade to current technology standards. Using the existing valves, pumps, and relay controls, the system was automated using the National Instruments cRIO FPGA platform. Engineered safety practices, including a failure mode effects analysis, were used to develop error handling requirements. The uniqueness of the Saturn Oil Automated Transfer System (SOATS) is in the graphical user interface. The SOATS uses an HTML interface to communicate with the cRIO, creating a platform-independent control system. The SOATS was commissioned in April 2013.

  7. Automated Auditing Tool for Retrofit Building Projects

    Energy Science and Technology Software Center (OSTI)

    2011-06-23

    Building energy auditors regularly use notepads, physical forms, or simple spreadsheets to inventory energy-consuming devices in buildings and audit overall performance. Mobile computing devices such as smart phones or tablet computers with camera inputs may be used to automatically capture relevant information and format audit input in a way that streamlines work flows and reduces the likelihood of error. As an example, an auditor could walk through a space holding a mobile device, which automatically identifies appliances, windows, etc. This information would automatically be added to a mobile database associated with the building for later integration with a larger building audit database. The user experience would require little or no manual input and could integrate with tools used to automate data collection for building energy modeling.

  8. Flow through electrode with automated calibration

    DOE Patents [OSTI]

    Szecsody, James E [Richland, WA; Williams, Mark D [Richland, WA; Vermeul, Vince R [Richland, WA

    2002-08-20

    The present invention is an improved automated flow through electrode liquid monitoring system. The automated system has a sample inlet to a sample pump, a sample outlet from the sample pump to at least one flow through electrode with a waste port. At least one computer controls the sample pump and records data from the at least one flow through electrode for a liquid sample. The improvement relies upon (a) at least one source of a calibration sample connected to (b) an injection valve connected to said sample outlet and connected to said source, said injection valve further connected to said at least one flow through electrode, wherein said injection valve is controlled by said computer to select between said liquid sample or said calibration sample. Advantages include improved accuracy because of more frequent calibrations, no additional labor for calibration, no need to remove the flow through electrode(s), and minimal interruption of sampling.
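
    The calibration step the patent automates (routing standards through the injection valve, reading the electrode, and refitting the calibration without removing the electrode) reduces to a small amount of arithmetic; the Nernstian-style log-linear fit and the numbers below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def fit_electrode_calibration(known_concs, measured_mV):
    """Fit electrode potential vs. log10(concentration) from readings taken while
    the injection valve routes calibration standards instead of the process sample."""
    slope, intercept = np.polyfit(np.log10(known_concs), measured_mV, 1)
    return slope, intercept

def concentration_from_mV(mv, slope, intercept):
    """Convert a flow-through electrode reading back to concentration."""
    return 10 ** ((mv - intercept) / slope)

# Illustrative numbers only: three standards injected under computer control,
# then a process-sample reading converted with the fresh calibration.
slope, intercept = fit_electrode_calibration([1e-4, 1e-3, 1e-2], [210.0, 152.0, 95.0])
print(concentration_from_mV(120.0, slope, intercept))
```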

  9. Automated Algorithm for MFRSR Data Analysis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Automated Algorithm for MFRSR Data Analysis M. D. Alexandrov and B. Cairns Columbia University and National Aeronautics and Space Administration Goddard Institute for Space Studies New York, New York A. A. Lacis and B. E. Carlson National Aeronautics and Space Administration Goddard Institute for Space Studies New York, New York A. Marshak National Aeronautics and Space Administration Goddard Space Flight Center Greenbelt, Maryland We present a substantial upgrade of our previously developed

  10. An advanced power distribution automation model system

    SciTech Connect (OSTI)

    Niwa, Shigeharu; Kanoi, Minoru; Nishijima, Kazuo; Hayami, Mitsuo

    1995-12-31

    An advanced power distribution automation (APDA) model system has been developed on the basis of the present automated distribution systems in Japan, which have been used for remote switching operations and for urgent supply restoration during faults. The increased use of electronic apparatus sensitive to supply interruption requires very high supply reliability, and the final developed system is expected to be useful for this purpose. The developed model system adopts pole circuit breakers and remote termination units connected through 64 kbps optical fibers to the computer of the automated system in the control center. Immediate switching operations for supply restoration during faults are possible through restoration procedures, prepared beforehand by the computer, and fast telecommunications using optical fibers. Protection by the feeder circuit breaker in the substation, which would otherwise cause a blackout of the whole distribution line, can thus be avoided. The test results show the effectiveness of the model system: successful fault location and reconfiguration for supply restoration, including separation of the faulted sections (without blackout for ground faults and with a short period (within 1 s) of blackout for short-circuit faults).

  11. Automating Ontological Annotation with WordNet

    SciTech Connect (OSTI)

    Sanfilippo, Antonio P.; Tratz, Stephen C.; Gregory, Michelle L.; Chappell, Alan R.; Whitney, Paul D.; Posse, Christian; Paulson, Patrick R.; Baddeley, Bob L.; Hohimer, Ryan E.; White, Amanda M.

    2006-01-22

    Semantic Web applications require robust and accurate annotation tools that are capable of automating the assignment of ontological classes to words in naturally occurring text (ontological annotation). Most current ontologies do not include rich lexical databases and are therefore not easily integrated with word sense disambiguation algorithms that are needed to automate ontological annotation. WordNet provides a potentially ideal solution to this problem as it offers a highly structured lexical conceptual representation that has been extensively used to develop word sense disambiguation algorithms. However, WordNet has not been designed as an ontology, and while it can be easily turned into one, the result of doing this would present users with serious practical limitations due to the great number of concepts (synonym sets) it contains. Moreover, mapping WordNet to an existing ontology may be difficult and requires substantial labor. We propose to overcome these limitations by developing an analytical platform that (1) provides a WordNet-based ontology offering a manageable and yet comprehensive set of concept classes, (2) leverages the lexical richness of WordNet to give an extensive characterization of concept class in terms of lexical instances, and (3) integrates a class recognition algorithm that automates the assignment of concept classes to words in naturally occurring text. The ensuing framework makes available an ontological annotation platform that can be effectively integrated with intelligence analysis systems to facilitate evidence marshaling and sustain the creation and validation of inference models.
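
    A coarse WordNet-derived concept class of the kind described can be obtained from the lexicographer files exposed by NLTK; this sketch simply takes the most frequent sense and is not the authors' annotation platform, which integrates word sense disambiguation in context.

```python
from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

def coarse_concept_class(word, pos=wn.NOUN):
    """Map a word to a manageable concept class using the lexicographer file
    (e.g. 'noun.person', 'noun.artifact') of its most frequent WordNet sense.
    A real annotator would first disambiguate the sense in context."""
    synsets = wn.synsets(word, pos=pos)
    return synsets[0].lexname() if synsets else None

for w in ["reactor", "physicist", "uranium", "inspection"]:
    print(w, coarse_concept_class(w))
```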

  12. Home Network Technologies and Automating Demand Response

    SciTech Connect (OSTI)

    McParland, Charles

    2009-12-01

    Over the past several years, interest in large-scale control of peak energy demand and total consumption has increased. While motivated by a number of factors, this interest has primarily been spurred, on the demand side, by the increasing cost of energy and, on the supply side, by the limited ability of utilities to build sufficient electricity generation capacity to meet unrestrained future demand. To address peak electricity use, Demand Response (DR) systems are being proposed to motivate reductions in electricity use through the use of price incentives. DR systems are also designed to shift or curtail energy demand at critical times when the generation, transmission, and distribution systems (i.e. the 'grid') are threatened with instabilities. To be effectively deployed on a large scale, these proposed DR systems need to be automated. Automation will require robust and efficient data communications infrastructures across geographically dispersed markets. The present availability of widespread Internet connectivity and inexpensive, reliable computing hardware combined with the growing confidence in the capabilities of distributed, application-level communications protocols suggests that now is the time for designing and deploying practical systems. Centralized computer systems that are capable of providing continuous signals to automate customers' reduction of power demand are known as Demand Response Automation Servers (DRAS). The deployment of prototype DRAS systems has already begun - with most initial deployments targeting large commercial and industrial (C & I) customers. An examination of the current overall energy consumption by economic sector shows that the C & I market is responsible for roughly half of all energy consumption in the US. On a per customer basis, large C & I customers clearly have the most to offer - and to gain - by participating in DR programs to reduce peak demand. And, by concentrating on a small number of relatively sophisticated energy consumers, it has been possible to improve the DR 'state of the art' with a manageable commitment of technical resources on both the utility and consumer side. Although numerous C & I DR applications of a DRAS infrastructure are still in either prototype or early production phases, these early attempts at automating DR have been notably successful for both utilities and C & I customers. Several factors have strongly contributed to this success and will be discussed below. These successes have motivated utilities and regulators to look closely at how DR programs can be expanded to encompass the remaining (roughly) half of the state's energy load - the light commercial and, in numerical terms, the more important residential customer market. This survey examines technical issues facing the implementation of automated DR in the residential environment. In particular, we will look at the potential role of home automation networks in implementing wide-scale DR systems that communicate directly to individual residences.

  13. Multiplex automated genome engineering (Patent) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Multiplex automated genome engineering Citation Details In-Document Search Title: Multiplex automated genome engineering The present invention relates to automated methods of introducing multiple nucleic acid sequences into one or more target cells. Authors: Church, George M ; Wang, Harris H ; Isaacs, Farren J Publication Date: 2013-10-29 OSTI Identifier: 1107638 Report Number(s): 8,569,041 13/411,712 DOE Contract Number: FG02-02ER63445 Resource Type: Patent Research Org: Harvard University,

  14. Building America Expert Meeting: Minutes from Automated Home Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Management System | Department of Energy. These meeting minutes are from the U.S. Department of Energy Building America program expert meeting titled "Automated Home Energy Management System," held on October 1-2, 2010 in Denver, Colorado. PDF icon ahem_expert_meeting_minutes.pdf More Documents & Publications 2012 Smart Grid Peer Review

  15. Honeywell Demonstrates Automated Demand Response Benefits for Utility,

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Commercial, and Industrial Customers | Department of Energy. September 22, 2014 - 5:59pm. Honeywell's Smart Grid Investment Grant (SGIG) project demonstrates utility-scale performance of a hardware/software platform for automated demand response (ADR). This project stands

  16. Automated Home Energy Management (AHEM) Standing Technical Committee

    Energy Savers [EERE]

    Strategic Plan - February 2012 | Department of Energy. This report outlines the gaps, barriers, and opportunities in automated home energy management tools, as outlined by the Building America Standing Technical Committee. PDF icon strategic_plan_ahem_2_12.pdf More Documents & Publications Automated

  17. Aescusoft GmbH Automation | Open Energy Information

    Open Energy Info (EERE)

    Name: Aescusoft GmbH Automation Place: Ettenheim, Germany Zip: 77955 Product: Offers PV cell testing lines. Coordinates: 48.256309, 7.813654

  18. Demonstrations of Integrated Advanced RTU Controls and Automated...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Demonstrations of Integrated Advanced RTU Controls and Automated Fault Detection and ... of smart monitoring and diagnostic system in the field and lessons learned; 131...

  19. V-132: IBM Tivoli System Automation Application Manager Multiple...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    V-132: IBM Tivoli System Automation Application Manager Multiple Vulnerabilities April 12, ... T-694: IBM Tivoli Federated Identity Manager Products Multiple Vulnerabilities V-145: IBM ...

  20. Automated Home Energy Management (AHEM) Standing Technical Committee...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Home Energy Management (AHEM) Standing Technical Committee Strategic Plan - February 2012 Automated Home Energy Management (AHEM) Standing Technical Committee Strategic Plan - ...

  1. Automation and security of Supply (Smart Grid Project) | Open...

    Open Energy Info (EERE)

    and security of Supply (Smart Grid Project). Project Name: Automation and security of Supply. Country: Denmark. Coordinates: 56.26392, 9.501785

  2. Multiplex automated genome engineering (Patent) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    The present invention relates to automated methods of introducing multiple nucleic acid sequences into one or more target cells. Authors: Church, George M ; Wang, Harris H ; ...

  3. The present invention relates to automated methods of introducing...

    Office of Scientific and Technical Information (OSTI)

    The present invention relates to automated methods of introducing multiple nucleic acid sequences into one or more target cells. Authors: Church, George M. 1 ; Wang, Harris H. ...

  4. V-205: IBM Tivoli System Automation for Multiplatforms Java Multiple...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Automation Application Manager Multiple Vulnerabilities V-145: IBM Tivoli Federated Identity Manager Products Java Multiple Vulnerabilities V-122: IBM Tivoli Application...

  5. Automated inspection of hot steel slabs

    DOE Patents [OSTI]

    Martin, Ronald J.

    1985-01-01

    The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes.
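
    The dual-path validation the patent describes (edge detection and intensity thresholding run in parallel on the same frame, keeping only segmentation produced by both processes) can be sketched with OpenCV; the thresholds and synthetic frame below are placeholders, not values from the disclosure.

```python
import cv2
import numpy as np

def dual_path_segmentation(frame, canny_lo=50, canny_hi=150, intensity_thresh=200):
    """Run edge detection and intensity thresholding on the same image and keep
    only segmentation confirmed by both paths."""
    edges = cv2.Canny(frame, canny_lo, canny_hi)
    _, bright = cv2.threshold(frame, intensity_thresh, 255, cv2.THRESH_BINARY)
    # Dilate the edge map slightly so thin edges can overlap blob interiors.
    edges = cv2.dilate(edges, np.ones((3, 3), np.uint8))
    return cv2.bitwise_and(edges, bright)

# Illustrative frame: a bright streak (candidate imperfection) on a darker slab surface.
frame = np.full((240, 320), 120, dtype=np.uint8)
cv2.line(frame, (40, 60), (280, 90), color=250, thickness=3)
confirmed = dual_path_segmentation(frame)
print(int(confirmed.sum() / 255), "pixels confirmed by both paths")
```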

  6. Automated inspection of hot steel slabs

    DOE Patents [OSTI]

    Martin, R.J.

    1985-12-24

    The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes. 5 figs.

  7. Automated macromolecular crystal detection system and method

    DOE Patents [OSTI]

    Christian, Allen T.; Segelke, Brent; Rupp, Bernard; Toppani, Dominique

    2007-06-05

    An automated macromolecular method and system for detecting crystals in two-dimensional images, such as light microscopy images obtained from an array of crystallization screens. Edges are detected from the images by identifying local maxima of a phase congruency-based function associated with each image. The detected edges are segmented into discrete line segments, which are subsequently geometrically evaluated with respect to each other to identify any crystal-like qualities such as, for example, parallel lines, facing each other, similarity in length, and relative proximity. And from the evaluation a determination is made as to whether crystals are present in each image.
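
    The geometric evaluation stage (checking detected line segments for crystal-like relationships such as parallelism, similar length, and relative proximity) might look like the following pairwise test; the tolerances and sample segments are invented for illustration.

```python
import numpy as np

def segment_angle(seg):
    x1, y1, x2, y2 = seg
    return np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 180.0

def segment_length(seg):
    x1, y1, x2, y2 = seg
    return float(np.hypot(x2 - x1, y2 - y1))

def crystal_like(seg_a, seg_b, max_angle_diff=5.0, length_ratio=0.7, max_gap=40.0):
    """Small pairwise test: roughly parallel, similar length, and close together."""
    d_angle = abs(segment_angle(seg_a) - segment_angle(seg_b))
    d_angle = min(d_angle, 180.0 - d_angle)
    la, lb = segment_length(seg_a), segment_length(seg_b)
    mid_a = np.array([(seg_a[0] + seg_a[2]) / 2, (seg_a[1] + seg_a[3]) / 2])
    mid_b = np.array([(seg_b[0] + seg_b[2]) / 2, (seg_b[1] + seg_b[3]) / 2])
    return (d_angle <= max_angle_diff
            and min(la, lb) / max(la, lb) >= length_ratio
            and np.linalg.norm(mid_a - mid_b) <= max_gap)

# Two nearly parallel facets of similar length suggest a crystal-like feature.
print(crystal_like((10, 10, 110, 12), (12, 40, 108, 44)))   # True
```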

  8. Automated Expert Modeling and Student Evaluation

    Energy Science and Technology Software Center (OSTI)

    2012-09-12

    AEMASE searches a database of recorded events for combinations of events that are of interest. It compares matching combinations to a statistical model to determine similarity to previous events of interest and alerts the user as new matching examples are found. AEMASE is currently used by weapons tactics instructors to find situations of interest in recorded tactical training scenarios. AEMASE builds on a sub-component, the Relational Blackboard (RBB), which is being released as open-source software. AEMASE builds on RBB adding interactive expert model construction (automated knowledge capture) and re-evaluation of scenario data.

  9. Automated Knowledge Annotation for Dynamic Collaborative Environments

    SciTech Connect (OSTI)

    Cowell, Andrew J.; Gregory, Michelle L.; Marshall, Eric J.; McGrath, Liam R.

    2009-05-19

    This paper describes the Knowledge Encapsulation Framework (KEF), a suite of tools to enable automated knowledge annotation for modeling and simulation projects. This framework can be used to capture evidence (e.g., facts extracted from journal articles and government reports), discover new evidence (from similar peer-reviewed material as well as social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks.

  10. Automated fluid analysis apparatus and techniques

    DOE Patents [OSTI]

    Szecsody, James E.

    2004-03-16

    An automated device that couples a pair of differently sized sample loops with a syringe pump and a source of degassed water. A fluid sample is mounted at an inlet port and delivered to the sample loops. A selected sample from the sample loops is diluted in the syringe pump with the degassed water and fed to a flow through detector for analysis. The sample inlet is also directly connected to the syringe pump to selectively perform analysis without dilution. The device is airtight and used to detect oxygen-sensitive species, such as dithionite in groundwater following a remedial injection to treat soil contamination.

  11. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    SciTech Connect (OSTI)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-07-13

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion / analyte redox adjustment, chemical separations, radiochemical detection and data analysis / reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  12. Task automation in a successful industrial telerobot

    SciTech Connect (OSTI)

    Spelt, P.F.; Jones, S.L.

    1994-01-01

    In this paper, we discuss cooperative work by Oak Ridge National Laboratory and Remotec, Inc., to automate components of the operator's workload using Remotec's Andros telerobot, thereby providing an enhanced user interface which can be retrofit to existing fielded units as well as being incorporated into new production units. Remotec's Andros robots are presently used by numerous electric utilities to perform tasks in reactors where substantial exposure to radiation exists, as well as by the armed forces and numerous law enforcement agencies. The automation of task components, as well as the video graphics display of the robot's position in the environment, will enhance all tasks performed by these users, as well as enabling performance in terrain where the robots cannot presently perform due to lack of knowledge about, for instance, the degree of tilt of the robot. Enhanced performance of a successful industrial mobile robot leads to increased safety and efficiency of performance in hazardous environments. The addition of these capabilities will greatly enhance the utility of the robot, as well as its marketability.

  13. Sandia National Laboratories: Research: High Consequence, Automation, &

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Robotics: Advanced Controls. Sandia's High Consequence, Automation, & Robotics (HCAR) team

  14. Sandia National Laboratories: Research: High Consequence, Automation, &

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Robotics: Facilities. Advancing the evolution of robotic & intelligent system technologies. Robot Vehicle Range: a cutting-edge outdoor test &

  15. Microsoft Word - Wireless Automation World for OE FINAL.doc

    Energy Savers [EERE]

    Automation World Features New White Paper on Wireless Security, Interviews Authors April 16, 2009 The April 2009 issue of Automation World magazine features the white paper Wireless Systems Considerations When Implementing NERC Critical Infrastructure Protection Standards. The paper addresses wireless protection issues arising from requirements of the Critical Infrastructure Protection (CIP) Standards for the electricity sector, developed by the North American Electric Reliability Corporation

  16. Automated interferometric alignment system for paraboloidal mirrors

    DOE Patents [OSTI]

    Maxey, L. Curtis

    1993-01-01

    A method is described for a systematic method of interpreting interference fringes obtained by using a corner cube retroreflector as an alignment aid when aligning a paraboloid to a spherical wavefront. This is applicable to any general case where such alignment is required, but is specifically applicable in the case of aligning an autocollimating test using a diverging beam wavefront. In addition, the method provides information which can be systematically interpreted such that independent information about pitch, yaw and focus errors can be obtained. Thus, the system lends itself readily to automation. Finally, although the method is developed specifically for paraboloids, it can be seen to be applicable to a variety of other aspheric optics when applied in combination with a wavefront corrector that produces a wavefront which, when reflected from the correctly aligned aspheric surface will produce a collimated wavefront like that obtained from the paraboloid when it is correctly aligned to a spherical wavefront.

  17. Automated interferometric alignment system for paraboloidal mirrors

    DOE Patents [OSTI]

    Maxey, L.C.

    1993-09-28

    A method is described for a systematic method of interpreting interference fringes obtained by using a corner cube retroreflector as an alignment aid when aligning a paraboloid to a spherical wavefront. This is applicable to any general case where such alignment is required, but is specifically applicable in the case of aligning an autocollimating test using a diverging beam wavefront. In addition, the method provides information which can be systematically interpreted such that independent information about pitch, yaw and focus errors can be obtained. Thus, the system lends itself readily to automation. Finally, although the method is developed specifically for paraboloids, it can be seen to be applicable to a variety of other aspheric optics when applied in combination with a wavefront corrector that produces a wavefront which, when reflected from the correctly aligned aspheric surface will produce a collimated wavefront like that obtained from the paraboloid when it is correctly aligned to a spherical wavefront. 14 figures.

  18. An automated neutron monitor maintenance system

    SciTech Connect (OSTI)

    Moore, F.S.; Griffin, J.C.; Odell, D.M.C.

    1996-09-01

    Neutron detectors are commonly used by the nuclear materials processing industry to monitor fissile materials in process vessels and tanks. The proper functioning of these neutron monitors must be periodically evaluated. We have developed and placed in routine use a PC-based multichannel analyzer (MCA) system for on-line BF3 and He-3 gas-filled detector function testing. The automated system: 1) acquires spectral data from the monitor system, 2) analyzes the spectrum to determine the detector`s functionality, 3) makes suggestions for maintenance or repair, as required, and 4) saves the spectrum and results to disk for review. The operator interface has been designed to be user-friendly and to minimize the training requirements of the user. The system may also be easily customized for various applications

  19. Automated DNA Base Pair Calling Algorithm

    Energy Science and Technology Software Center (OSTI)

    1999-07-07

    The procedure solves the problem of calling the DNA base pair sequence from two channel electropherogram separations in an automated fashion. The core of the program involves a peak picking algorithm based upon first, second, and third derivative spectra for each electropherogram channel, signal levels as a function of time, peak spacing, base pair signal to noise sequence patterns, frequency vs ratio of the two channel histograms, and confidence levels generated during the run. The ratios of the two channels at peak centers can be used to accurately and reproducibly determine the base pair sequence. A further enhancement is a novel Gaussian deconvolution used to determine the peak heights used in generating the ratio.
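
    A stripped-down version of the two-channel idea, peak picking followed by a ratio-based call at each peak center, is sketched below with SciPy; the ratio bins, peak thresholds, and synthetic traces are placeholders rather than the algorithm's actual derivative-based criteria.

```python
import numpy as np
from scipy.signal import find_peaks

def call_bases(chan_a, chan_b, min_height=50.0, min_spacing=8):
    """Pick peaks on the summed two-channel electropherogram, then use the
    channel ratio at each peak center to assign a base (bin edges are illustrative)."""
    total = np.asarray(chan_a) + np.asarray(chan_b)
    peaks, _ = find_peaks(total, height=min_height, distance=min_spacing)
    calls = []
    for p in peaks:
        ratio = chan_a[p] / max(total[p], 1e-9)
        if ratio > 0.85:
            calls.append("A")
        elif ratio > 0.60:
            calls.append("C")
        elif ratio > 0.35:
            calls.append("G")
        else:
            calls.append("T")
    return peaks, calls

# Synthetic example: three Gaussian peaks with different channel ratios.
t = np.arange(300)
gauss = lambda c: np.exp(-0.5 * ((t - c) / 3.0) ** 2)
chan_a = 100 * gauss(50) + 70 * gauss(150) + 10 * gauss(250)
chan_b = 5 * gauss(50) + 30 * gauss(150) + 90 * gauss(250)
print(call_bases(chan_a, chan_b))
```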

  20. Automated eXpert Spectral Image Analysis

    Energy Science and Technology Software Center (OSTI)

    2003-11-25

    AXSIA performs automated factor analysis of hyperspectral images. In such images, a complete spectrum is collected at each point in a 1-, 2- or 3-dimensional spatial array. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful information. Multivariate factor analysis techniques have proven effective for extracting the essential information from high dimensional data sets into a limited number of factors that describe the spectral characteristics and spatial distributions of the pure components comprising the sample. AXSIA provides tools to estimate different types of factor models including Singular Value Decomposition (SVD), Principal Component Analysis (PCA), PCA with factor rotation, and Alternating Least Squares-based Multivariate Curve Resolution (MCR-ALS). As part of the analysis process, AXSIA can automatically estimate the number of pure components that comprise the data and can scale the data to account for Poisson noise. The data analysis methods are fundamentally based on eigenanalysis of the data crossproduct matrix coupled with orthogonal eigenvector rotation and constrained alternating least squares refinement. A novel method for automatically determining the number of significant components, which is based on the eigenvalues of the crossproduct matrix, has also been devised and implemented. The data can be compressed spectrally via PCA and spatially through wavelet transforms, and algorithms have been developed that perform factor analysis in the transform domain while retaining full spatial and spectral resolution in the final result. These latter innovations enable the analysis of larger-than-core-memory spectrum-images. AXSIA was designed to perform automated chemical phase analysis of spectrum-images acquired by a variety of chemical imaging techniques. Successful applications include Energy Dispersive X-ray Spectroscopy, X-ray Fluorescence Spectroscopy, Laser-Induced Fluorescence Spectroscopy and Time-of-Flight Secondary Ion Mass Spectroscopy.
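
    The eigenanalysis core, unfolding the spectrum-image, factoring it, and choosing the number of significant components from the eigenvalues, can be illustrated in a few lines of NumPy; this PCA-via-SVD sketch with a simple explained-variance cutoff is not the AXSIA code itself.

```python
import numpy as np

def pca_spectrum_image(cube, var_threshold=0.99):
    """Unfold an (ny, nx, nchan) spectrum-image into (npix, nchan), mean-center,
    and factor it with an SVD; keep enough components to explain var_threshold
    of the variance (a stand-in for AXSIA's eigenvalue-based selection)."""
    ny, nx, nchan = cube.shape
    data = cube.reshape(ny * nx, nchan).astype(np.float64)
    data -= data.mean(axis=0)
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    explained = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(explained, var_threshold)) + 1
    scores = (u[:, :k] * s[:k]).reshape(ny, nx, k)   # abundance-like maps
    loadings = vt[:k]                                # spectral components
    return scores, loadings, k

# Synthetic two-phase sample: two spectral signatures mixed across the image.
rng = np.random.default_rng(1)
sig = rng.random((2, 64))
mix = rng.random((32, 32, 2))
cube = mix @ sig + 0.01 * rng.standard_normal((32, 32, 64))
scores, loadings, k = pca_spectrum_image(cube)
print("components kept:", k)
```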

  1. Model-centric distribution automation: Capacity, reliability, and efficiency

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Onen, Ahmet; Jung, Jaesung; Dilek, Murat; Cheng, Danling; Broadwater, Robert P.; Scirbona, Charlie; Cocks, George; Hamilton, Stephanie; Wang, Xiaoyu

    2016-02-26

    A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.

  2. Identifying Requirements for Effective Human-Automation Teamwork

    SciTech Connect (OSTI)

    Jeffrey C. Joe; John O'Hara; Heather D. Medema; Johanna H. Oxstrand

    2014-06-01

    Previous studies have shown that poorly designed human-automation collaboration, such as poorly designed communication protocols, often leads to problems for the human operators, such as: lack of vigilance, complacency, and loss of skills. These problems often lead to suboptimal system performance. To address this situation, a considerable amount of research has been conducted to improve human-automation collaboration and to make automation function better as a team player. Much of this research is based on an understanding of what it means to be a good team player from the perspective of a human team. However, the research is often based on a simplified view of human teams and teamwork. In this study, we sought to better understand the capabilities and limitations of automation from the standpoint of human teams. We first examined human teams to identify the principles for effective teamwork. We next reviewed the research on integrating automation agents and human agents into mixed agent teams to identify the limitations of automation agents to conform to teamwork principles. This research resulted in insights that can lead to more effective human-automation collaboration by enabling a more realistic set of requirements to be developed based on the strengths and limitations of all agents.

  3. Automated soil gas monitoring chamber (Patent) | SciTech Connect

    Office of Scientific and Technical Information (OSTI)

    Title: Automated soil gas monitoring chamber A chamber for trapping soil gases as they evolve from the soil without disturbance to the soil and to the natural microclimate within ...

  4. Automated Home Energy Management Standing Technical Committee Presentation

    Broader source: Energy.gov [DOE]

    This presentation outlines the goals of the Automated Home Energy Management Standing Technical Committee, as presented at the Building America Spring 2012 Stakeholder meeting on February 29, 2012, in Austin, Texas.

  5. Building America Expert Meeting: Minutes from Automated Home...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    are from the U.S. Department of Energy Building America program expert meeting titled "Automated Home Energy Management System," held on October 1-2, 2010 in Denver, Colorado. ...

  6. Manz Automation India Pvt Ltd | Open Energy Information

    Open Energy Info (EERE)

    India Pvt Ltd. Name: Manz Automation India Pvt Ltd Place: New Delhi, Delhi (NCT), India Product: JV set up to sell cell and module manufacturing and test...

  7. Automated Design Space Exploration with Aspen

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Spafford, Kyle L.; Vetter, Jeffrey S.

    2015-01-01

    Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
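
    A toy version of the formulation described, a performance-model objective minimized over a design parameter subject to a resource constraint, can be posed as a nonlinear program with SciPy; the runtime and memory models below are invented stand-ins, not an Aspen model.

```python
import numpy as np
from scipy.optimize import minimize

# Toy analytical performance model (invented, not an Aspen model):
# runtime as a function of a tile-size design parameter for an FFT-like kernel.
def runtime(x, n=1024**3, flop_rate=1e12, bandwidth=2e11):
    tile = x[0]
    compute = 5.0 * n * np.log2(n) / flop_rate
    traffic = 8.0 * n * (1.0 + 1.0 / tile) / bandwidth    # smaller tiles move more data
    return compute + traffic

def memory_bytes(x, n=1024**3):
    return 8.0 * n * x[0]                                 # working set grows with tile size

mem_limit = 32e9   # assumed memory resource in the abstract machine model
result = minimize(
    runtime,
    x0=[2.0],
    bounds=[(1.0, 64.0)],
    constraints=[{"type": "ineq", "fun": lambda x: mem_limit - memory_bytes(x)}],
    method="SLSQP",
)
print("best tile size:", result.x[0], "predicted runtime (s):", result.fun)
```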

  8. Automated radiotherapy treatment plan integrity verification

    SciTech Connect (OSTI)

    Yang Deshan; Moore, Kevin L.

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
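
    The overall pattern, pulling plan values, evaluating them against predefined logical rules, and emitting an HTML summary, is easy to sketch; the plan fields, tolerances, and report layout below are invented examples, not the Pinnacle scripting interface or the clinic's actual checks.

```python
from html import escape

# Invented plan snapshot and rules; the real tool pulls these values out of the
# treatment planning system via scripting and applies clinic-specific tolerances.
plan = {"prescription_dose_cGy": 6000, "fractions": 30,
        "max_mu_per_beam": 480, "dose_grid_mm": 3.0}

rules = [
    ("Dose per fraction in 150-350 cGy",
     lambda p: 150 <= p["prescription_dose_cGy"] / p["fractions"] <= 350),
    ("Dose grid resolution <= 4 mm", lambda p: p["dose_grid_mm"] <= 4.0),
    ("Beam MU below 600", lambda p: p["max_mu_per_beam"] < 600),
]

rows = []
for name, check in rules:
    status = "PASS" if check(plan) else "REVIEW"
    rows.append(f"<tr><td>{escape(name)}</td><td>{status}</td></tr>")

report = ("<html><body><h2>Plan integrity checks</h2><table>"
          + "".join(rows) + "</table></body></html>")
with open("plan_qc_report.html", "w") as f:
    f.write(report)
print("checks flagged for review:", sum("REVIEW" in r for r in rows))
```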

  9. Automation Enhancement of Multilayer Laue Lenses

    SciTech Connect (OSTI)

    Lauer K. R.; Conley R.

    2010-12-01

    X-ray optics fabrication at Brookhaven National Laboratory has been facilitated by a new, state of the art magnetron sputtering physical deposition system. With its nine magnetron sputtering cathodes and substrate carrier that moves on a linear rail via a UHV brushless linear servo motor, the system is capable of accurately depositing the many thousands of layers necessary for multilayer Laue lenses. I have engineered a versatile and automated control program from scratch for the base system and many subsystems. Its main features include a custom scripting language, a fully customizable graphical user interface, wireless and remote control, and a terminal-based interface. This control system has already been successfully used in the creation of many types of x-ray optics, including several thousand layer multilayer Laue lenses. Before reaching the point at which a deposition can be run, stencil-like masks for the sputtering cathodes must be created to ensure the proper distribution of sputtered atoms. Quality of multilayer Laue lenses can also be difficult to measure, given the size of the thin film layers. I employ my knowledge of software and algorithms to further ease these previously painstaking processes with custom programs. Additionally, I will give an overview of an x-ray optic simulator package I helped develop during the summer of 2010. In the interest of keeping my software free and open, I have worked mostly with the multiplatform Python and the PyQt application framework, utilizing C and C++ where necessary.

  10. Chip breaking system for automated machine tool

    DOE Patents [OSTI]

    Arehart, Theodore A.; Carey, Donald O.

    1987-01-01

    The invention is a rotary selectively directional valve assembly for use in an automated turret lathe for directing a stream of high pressure liquid machining coolant to the interface of a machine tool and workpiece for breaking up ribbon-shaped chips during the formation thereof so as to inhibit scratching or other marring of the machined surfaces by these ribbon-shaped chips. The valve assembly is provided by a manifold arrangement having a plurality of circumferentially spaced apart ports each coupled to a machine tool. The manifold is rotatable with the turret when the turret is positioned for alignment of a machine tool in a machining relationship with the workpiece. The manifold is connected to a non-rotational header having a single passageway therethrough which conveys the high pressure coolant to only the port in the manifold which is in registry with the tool disposed in a working relationship with the workpiece. To position the machine tools the turret is rotated and one of the tools is placed in a material-removing relationship of the workpiece. The passageway in the header and one of the ports in the manifold arrangement are then automatically aligned to supply the machining coolant to the machine tool workpiece interface for breaking up of the chips as well as cooling the tool and workpiece during the machining operation.

  11. Automated D/3 to Visio Analog Diagrams

    Energy Science and Technology Software Center (OSTI)

    2000-08-10

    ADVAD1 reads an ASCII file containing the D/3 DCS MDL input for analog points for a D/3 continuous database. It uses the information in the files to create a series of Visio files representing the structure of each analog chain, one drawing per Visio file. The actual drawing function is performed by Visio (requires Visio version 4.5+). The user can configure the program to select which fields in the database are shown on the diagram and how the information is to be presented. This gives a visual representation of the structure of the analog chains, showing selected fields in a consistent manner. Updating documentation can be done easily and the automated approach eliminates human error in the CAD drafting process. The program can also create the drawings far faster than a human operator, creating approximately 270 typical diagrams in about 8 minutes on a Pentium II 400 MHz PC. The program allows for multiple option sets to be saved to provide different settings (i.e., different fields, different field presentations, and/or different diagram layouts) for various scenarios or facilities on one workstation. Option sets may be exported from the Windows registry to allow duplication of settings on another workstation.
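
    As a rough illustration of the parsing-and-grouping step only (the real MDL syntax and the Visio drawing calls are not reproduced here), the following Python sketch groups analog points into chains from an assumed record format and prints one drawing description per chain.

      # Minimal sketch under an assumed input format, not the real D/3 MDL syntax;
      # the drawing step is stubbed out with a print instead of driving Visio.
      from collections import defaultdict

      def parse_points(lines):
          """Parse 'POINT <name> CHAIN=<chain> DESC=<text>' records; format is assumed."""
          points = []
          for line in lines:
              if not line.startswith("POINT"):
                  continue
              _, name, *fields = line.split()
              attrs = dict(f.split("=", 1) for f in fields)
              points.append((attrs.get("CHAIN", "unassigned"), name, attrs))
          return points

      def chains(points):
          grouped = defaultdict(list)
          for chain, name, attrs in points:
              grouped[chain].append((name, attrs))
          return grouped

      if __name__ == "__main__":
          sample = [
              "POINT FI101 CHAIN=FLOW1 DESC=feed_flow",
              "POINT FC101 CHAIN=FLOW1 DESC=feed_controller",
              "POINT TI205 CHAIN=TEMP2 DESC=reactor_temp",
          ]
          for chain, members in chains(parse_points(sample)).items():
              print(f"drawing '{chain}.vsd': " + " -> ".join(n for n, _ in members))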

  12. Highly Insulating Residential Windows Using Smart Automated Shading |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy Highly Insulating Residential Windows Using Smart Automated Shading. Image caption (3 images): Residential Smart Window with integrated sensors, control logic and a motorized shade between glass panes. Image: Lawrence Berkeley National Laboratory

  13. Automated Process for the Fabrication of Highly Customized Thermally

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Insulated Cladding Systems | Department of Energy. Automated Process for the Fabrication of Highly Customized Thermally Insulated Cladding Systems. Image 1 of 2: Resin casting prototype. Image: Worcester Polytechnic Institute. Image 2 of 2: A project member cuts foam insulation via a process known as computer numerically controlled (CNC) foam cutting. Image: Worcester Polytechnic Institute Lead Performer:

  14. NREL: Energy Systems Integration - ESIF Fueling Robot Automates Hydrogen

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Hose Reliability Testing ESIF Fueling Robot Automates Hydrogen Hose Reliability Testing Watch how an automated robot in the Energy Systems Integration Facility (ESIF) mimics fueling action to test hydrogen hoses for durability in real-world conditions. Learn more about this work in this fact sheet.

  15. Sandia National Laboratories: Research: High Consequence, Automation, &

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Robotics: One-Control Many One-Control Many One Control Many Diagram As unmanned systems (UMS) are increasingly used in the battlefield, advantages provided by strategy, tactics, and training must be translated into UMS control systems. It's a challenge to effectively control large numbers of UMS. The human operator must focus on high-level perception, tactics, and strategy while the system automates lower-level functions. High Consequence, Automation, & Robotics (HCAR) is working to

  16. Scalable HPC monitoring and analysis for understanding and automated

    Office of Scientific and Technical Information (OSTI)

    response. (Conference) | SciTech Connect Scalable HPC monitoring and analysis for understanding and automated response. Citation Details In-Document Search Title: Scalable HPC monitoring and analysis for understanding and automated response. No abstract prepared. Authors: Mayo, Jackson R. ; Chen, Frank Xiaoxiao ; Pebay, Philippe Pierre ; Wong, Matthew H. ; Thompson, David ; Gentile, Ann C. ; Roe, Diana C. ; De Sapio, Vincent ; Brandt, James M. Publication Date: 2010-10-01 OSTI Identifier:

  17. Global health response more accurate with automated influenza surveillance

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Global health response more accurate with automated influenza surveillance Public health officials will be able to determine whether an outbreak of an infectious disease comes from a pandemic strain or one less virulent. January 31, 2011. Lance Green of LANL tests an earlier version of a modular laboratory like the ones that will be part of the High-Throughput Laboratory Network

  18. Automated Process for the Fabrication of Highly Customized Thermally

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Insulated Cladding Systems | Department of Energy. Automated Process for the Fabrication of Highly Customized Thermally Insulated Cladding Systems. Image 1 of 2: Resin casting prototype. Image: Worcester Polytechnic Institute. Image 2 of 2: A project member cuts foam insulation via a process known as computer numerically controlled (CNC) foam cutting. Image: Worcester Polytechnic Institute

  19. Automated Steel Cleanliness Analysis Tool (ASCAT) | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Automated Steel Cleanliness Analysis Tool (ASCAT) Automated Steel Cleanliness Analysis Tool (ASCAT) New Microscopy System Improves Steel Mill Performance and Allows Production of Higher Quality Steel Inclusions are particles of insoluble impurities formed during steelmaking and casting operations that are entrapped during solidification of metal. Characterizing inclusions is important because of an increasing demand for cleaner steels with low inclusion (defect) content. The composition, and

  20. Automated Demand Response Opportunities in Wastewater Treatment Facilities

    SciTech Connect (OSTI)

    Thompson, Lisa; Song, Katherine; Lekov, Alex; McKane, Aimee

    2008-11-19

    Wastewater treatment is an energy intensive process which, together with water treatment, comprises about three percent of U.S. annual energy use. Yet, since wastewater treatment facilities are often peripheral to major electricity-using industries, they are frequently an overlooked area for automated demand response opportunities. Demand response is a set of actions taken to reduce electric loads when contingencies, such as emergencies or congestion, occur that threaten supply-demand balance, and/or market conditions occur that raise electric supply costs. Demand response programs are designed to improve the reliability of the electric grid and to lower the use of electricity during peak times to reduce the total system costs. Open automated demand response is a set of continuous, open communication signals and systems provided over the Internet to allow facilities to automate their demand response activities without the need for manual actions. Automated demand response strategies can be implemented as an enhanced use of upgraded equipment and facility control strategies installed as energy efficiency measures. Conversely, installation of controls to support automated demand response may result in improved energy efficiency through real-time access to operational data. This paper argues that the implementation of energy efficiency opportunities in wastewater treatment facilities creates a base for achieving successful demand reductions. This paper characterizes energy use and the state of demand response readiness in wastewater treatment facilities and outlines automated demand response opportunities.

  1. Process development for automated solar-cell and module production. Task 4. Automated array assembly. Quarterly report No. 3

    SciTech Connect (OSTI)

    Hagerty, J. J.; Gifford, M.

    1981-04-15

    The Automated Lamination Station is mechanically complete and is currently undergoing final wiring. The high current driver and isolator boards have been completed and installed, and the main interface board is under construction. The automated vacuum chamber has had a minor redesign to increase stiffness and improve the cover open/close mechanism. Design of the Final Assembly Station has been completed and construction is underway.

  2. Automated Cache Performance Analysis And Optimization

    SciTech Connect (OSTI)

    Mohror, Kathryn

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done by hand requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters, cache behavior could only be measured reliably in the aggregate across tens or hundreds of thousands of instructions. With the newest iteration of PEBS technology, cache events can be tied to a tuple of instruction pointer, target address (for both loads and stores), memory hierarchy, and observed latency. With this information we can now begin asking questions regarding the efficiency of not only regions of code, but how these regions interact with particular data structures and how these interactions evolve over time. In the short term, this information will be vital for performance analysts understanding and optimizing the behavior of their codes for the memory hierarchy. In the future, we can begin to ask how data layouts might be changed to improve performance and, for a particular application, what the theoretical optimal performance might be. The overall benefit to be produced by this effort was a commercial quality easy-to-use and scalable performance tool that will allow both beginner and experienced parallel programmers to automatically tune their applications for optimal cache usage. Effective use of such a tool can literally save weeks of performance tuning effort. Easy to use. With the proposed innovations, finding and fixing memory performance issues would be more automated and hide most to all of the performance engineer expertise under the hood of the Open|SpeedShop performance tool.
One of the biggest public benefits from the proposed innovations is that it makes performance analysis more usable to a larger group of application developers. Intuitive reporting of results. The Open|SpeedShop performance analysis tool has a rich set of intuitive, yet detailed reports for presenting performance results to application developers. Our goal was to leverage this existing technology to present the results from our memory performance addition to Open|SpeedShop. Suitable for experts as well as novices. Application performance is getting more difficult to measure as the hardware platforms they run on become more complicated. This makes life difficult.
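
    A minimal sketch of the kind of attribution this data enables, assuming made-up PEBS-style samples and address ranges rather than actual Open|SpeedShop output: aggregate (instruction pointer, data address, latency) tuples by data-structure address range to rank structures by total observed latency.

      # Minimal sketch: attribute sampled memory-reference latency to data structures.
      # The sample tuples and address ranges below are made up for illustration.
      from collections import defaultdict

      def attribute_latency(samples, structures):
          """structures: {name: (start, end)}; samples: iterable of (ip, addr, latency)."""
          totals = defaultdict(lambda: [0, 0])          # name -> [event count, total cycles]
          for _ip, addr, latency in samples:
              for name, (start, end) in structures.items():
                  if start <= addr < end:
                      totals[name][0] += 1
                      totals[name][1] += latency
                      break
          return sorted(totals.items(), key=lambda kv: kv[1][1], reverse=True)

      if __name__ == "__main__":
          structures = {"matrix_a": (0x1000, 0x2000), "index_table": (0x2000, 0x2400)}
          samples = [(0x400123, 0x1010, 180), (0x400123, 0x1018, 35), (0x400200, 0x2010, 410)]
          for name, (count, cycles) in attribute_latency(samples, structures):
              print(f"{name}: {count} samples, {cycles} cycles total latency")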

  3. Automated Proactive Techniques for Commissioning Air-Handling Units

    SciTech Connect (OSTI)

    Katipamula, Srinivas ); Brambley, Michael R. ); Luskay, Larry

    2003-08-30

    Many buildings today use sophisticated building automation systems (BASs) to manage a wide and varied range of building systems. Although the capabilities of the BASs seem to have increased over time, many buildings still are not properly commissioned, operated or maintained. Lack of or improper commissioning, the inability of the building operators to grasp the complex controls, and lack of proper maintenance leads to inefficient operations and reduced lifetimes of the equipment. If regularly scheduled manual maintenance or re-commissioning practices are adopted, they can be expensive and time consuming. Automated proactive commissioning and diagnostic technologies address two of the main barriers to commissioning: cost and schedules. Automated proactive continuous commissioning tools can reduce both the cost and time associated with commissioning, as well as enhance the persistence of commissioning fixes. In the long run, automation even offers the potential for automatically correcting problems by reconfiguring controls or changing control algorithms dynamically. This paper will discuss procedures and processes that can be used to automate and continuously commission the economizer operation and outdoor-air ventilation systems of an air-handling unit.
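
    For illustration, a single automated proactive check in the spirit described above (a generic economizer rule, not the specific procedures in the paper); the point names, deadband, and damper thresholds are assumptions.

      # Minimal sketch of one automated economizer check; thresholds are illustrative.
      def economizer_fault(outdoor_temp_c, return_temp_c, damper_cmd_pct, cooling_on):
          """Return a message if the damper position contradicts free-cooling conditions."""
          free_cooling_available = outdoor_temp_c < return_temp_c - 1.0   # 1 C deadband
          if cooling_on and free_cooling_available and damper_cmd_pct < 50:
              return "Fault: free cooling available but outdoor-air damper mostly closed"
          if not free_cooling_available and damper_cmd_pct > 30:
              return "Fault: damper open beyond minimum with no free-cooling benefit"
          return None

      if __name__ == "__main__":
          # Proactive test: command the unit into a known state, then evaluate the rule.
          print(economizer_fault(outdoor_temp_c=12.0, return_temp_c=23.0,
                                 damper_cmd_pct=20, cooling_on=True))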

  4. Open Automated Demand Response Communications Specification (Version 1.0)

    SciTech Connect (OSTI)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
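
    A minimal client-side sketch of the concept, assuming a simplified JSON signal rather than the actual OpenADR data model: poll for a demand response signal and map its level to a preprogrammed action with no manual intervention.

      # Minimal sketch of automated response to a DR signal; the URL and JSON fields
      # are assumptions for illustration, not the OpenADR specification itself.
      import json
      import urllib.request

      def fetch_signal(url):
          """Retrieve a JSON demand response signal, e.g. {"event": "moderate"}."""
          with urllib.request.urlopen(url, timeout=10) as resp:
              return json.load(resp)

      PREPROGRAMMED_ACTIONS = {
          "normal":   "no change",
          "moderate": "raise cooling setpoint 2 F",
          "high":     "raise cooling setpoint 4 F and dim lighting 30%",
      }

      def respond(signal):
          level = signal.get("event", "normal")
          return PREPROGRAMMED_ACTIONS.get(level, "no change")

      if __name__ == "__main__":
          # Example with a canned signal instead of a live server.
          print(respond({"event": "high"}))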

  5. Framework for Human-Automation Collaboration: Conclusions from Four Studies

    SciTech Connect (OSTI)

    Johanna Oxstrand; Katya L. Le Blanc; John O'Hara; Jeffrey C. Joe; April M. Whaley; Heather Medema

    2013-11-01

    The Human Automation Collaboration (HAC) research project is investigating how advanced technologies that are planned for Advanced Small Modular Reactors (AdvSMR) will affect the performance and the reliability of the plant from a human factors and human performance perspective. The HAC research effort investigates the consequences of allocating functions between the operators and automated systems. More specifically, the research team is addressing how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. Oxstrand et al. (2013 - March) describes the efforts conducted by the researchers to identify the research needs for HAC. The research team reviewed the literature on HAC, developed a model of HAC, and identified gaps in the existing knowledge of human-automation collaboration. As described in Oxstrand et al. (2013 – June), the team then prioritized the research topics identified based on the specific needs in the context of AdvSMR. The prioritization was based on two sources of input: 1) The preliminary functions and tasks, and 2) The model of HAC. As a result, three analytical studies were planned and conducted: 1) Models of Teamwork, 2) Standardized HAC Performance Measurement Battery, and 3) Initiators and Triggering Conditions for Adaptive Automation. Additionally, one field study was also conducted at Idaho Falls Power.

  6. Integrated, Automated Distributed Generation Technologies Demonstration

    SciTech Connect (OSTI)

    Jensen, Kevin

    2014-09-30

    The purpose of the NETL Project was to develop a diverse combination of distributed renewable generation technologies and controls and demonstrate how the renewable generation could help manage substation peak demand at the ATK Promontory plant site. The Promontory plant site is located in the northwestern Utah desert approximately 25 miles west of Brigham City, Utah. The plant encompasses 20,000 acres and has over 500 buildings. The ATK Promontory plant primarily manufactures solid propellant rocket motors for both commercial and government launch systems. The original project objectives focused on distributed generation; a 100 kW (kilowatt) wind turbine, a 100 kW new technology waste heat generation unit, a 500 kW energy storage system, and an intelligent system-wide automation system to monitor and control the renewable energy devices then release the stored energy during the peak demand time. The original goal was to reduce peak demand from the electrical utility company, Rocky Mountain Power (RMP), by 3.4%. For a period of time we also sought to integrate our energy storage requirements with a flywheel storage system (500 kW) proposed for the Promontory/RMP Substation. Ultimately the flywheel storage system could not meet our project timetable, so the storage requirement was switched to a battery storage system (300 kW.) A secondary objective was to design/install a bi-directional customer/utility gateway application for real-time visibility and communications between RMP, and ATK. This objective was not achieved because of technical issues with RMP, ATK Information Technology Department’s stringent requirements based on being a rocket motor manufacturing facility, and budget constraints. Of the original objectives, the following were achieved: • Installation of a 100 kW wind turbine. • Installation of a 300 kW battery storage system. • Integrated control system installed to offset electrical demand by releasing stored energy from renewable sources during peak hours of the day. Control system also monitors the wind turbine and battery storage system health, power output, and issues critical alarms. Of the original objectives, the following were not achieved: • 100 kW new technology waste heat generation unit. • Bi-directional customer/utility gateway for real time visibility and communications between RMP and ATK. • 3.4% reduction in peak demand. 1.7% reduction in peak demand was realized instead.

  7. HEADLINE: DOE Pursues Automation in West Virginia Lab

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    HEADLINE: DOE Pursues Automation in West Virginia Lab By Elizabeth McGowan Hakan Inan can envision a day in the near future when electric utilities will be able to find and isolate faults, then restore service in record time and without human intervention. His goal is to create a model for automating the process of locating a fault and reconfiguring the feeder. And in utility circles, it's considered very tricky because nobody has yet perfected the process, even though the hardware and software

  8. Automated Steel Cleanliness Analysis Tool (ASCAT)

    SciTech Connect (OSTI)

    Gary Casuccio; Michael Potter; Fred Schwerer; Dr. Richard J. Fruehan; Dr. Scott Story

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet or bloom disposition; and alloy development. Additional benefits of ASCAT include the identification of inclusions that tend to clog nozzles or interact with refractory materials. Several papers outlining the benefits of the ASCAT have been presented and published in the literature. The paper entitled ''Inclusion Analysis to Predict Casting Behavior'' was awarded the American Iron and Steel Institute (AISI) Medal in 2004 for special merit and importance to the steel industry. The ASCAT represents a quantum leap in inclusion analysis and will allow steel producers to evaluate the quality of steel and implement appropriate process improvements. In terms of performance, the ASCAT (1) allows for accurate classification of inclusions by chemistry and morphological parameters, (2) can characterize hundreds of inclusions within minutes, (3) is easy to use (does not require experts), (4) is robust, and (5) has excellent image quality for conventional SEM investigations (e.g., the ASCAT can be utilized as a dual use instrument). In summary, the ASCAT will significantly advance the tools of the industry and addresses an urgent and broadly recognized need of the steel industry. 
Commercialization of the ASCAT will focus on (1) a sales strategy that leverages our Industry Partners; (2) use of ''technical selling'' through papers and seminars; (3) leveraging RJ Lee Group's consulting services, and packaging of the product with a extensive consulting and training program; (4) partnering with established SEM distributors; (5) establishing relationships with professional organizations associated with the steel industry; and (6) an individualized plant by plant direct sales program.

  9. ETM (Distribution Network Automation on 10 kV cable line stations...

    Open Energy Info (EERE)

    ETM (Distribution Network Automation on 10 kV cable line stations) (Smart Grid Project) Jump to: navigation, search Project Name ETM (Distribution Network Automation on 10 kV cable...

  10. Automated deduction for first-order logic with equality

    Energy Science and Technology Software Center (OSTI)

    2001-06-01

    Otter 3.2 is the current version of ANL's automated deduction system designed to search for proofs and countermodels of conjectures stated in first-order logic with equality. It is used mostly for research in mathematics and logic and also for various applications requiring deductive data processing.

  11. Automated deduction for first-order logic with equality

    Energy Science and Technology Software Center (OSTI)

    2003-09-01

    Otter 3.3 is the current version of ANL's automated deduction system designed to search for proofs and countermodels of conjectures stated in first-order logic with equality. It is used mostly for research in mathematics and logic and also for various applications requiring deductive data processing.

  12. Sandia National Laboratories: Research: High Consequence, Automation, &

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Robotics: Advanced Manipulation. Addressing robotics challenges: The Sandia Hand has overcome issues that have prevented widespread adoption of

  13. Ideas that Work!. Retuning the Building Automation System

    SciTech Connect (OSTI)

    Parker, Steven

    2015-03-01

    A building automation system (BAS) can save considerable energy by effectively and efficiently operating building energy systems (fans, pumps, chillers, boilers, etc.), but only when the BAS is properly set up and operated. Tuning, or retuning, the BAS is a cost-effective process worthy of your time and attention.

  14. National SCADA Test Bed Substation Automation Evaluation Report

    SciTech Connect (OSTI)

    Kenneth Barnes; Briam Johnson

    2009-10-01

    Increased awareness of the potential for cyber attack has recently resulted in improved cyber security practices associated with the electrical power grid. However, the level of practical understanding and deployment of cyber security practices has not been evenly applied across all business sectors. Much of the focus has been centered on information technology business centers and control rooms. This report explores the current level of substation automation, communication, and cyber security protection deployed in electrical substations throughout existing utilities in the United States. This report documents the evaluation of substation automation implementation and associated vulnerabilities. This evaluation used research conducted by Newton-Evans Research Company for some of its observations and results. The Newton Evans Report aided in the determination of what is the state of substation automation in North American electric utilities. Idaho National Laboratory cyber security experts aided in the determination of what cyber vulnerabilities may pose a threat to electrical substations. This report includes cyber vulnerabilities as well as recommended mitigations. It also describes specific cyber issues found in typical substation automation configurations within the electric utility industry. The evaluation report was performed over a 5-month period starting in October 2008

  15. Automated Energy Distribution and Reliability System (AEDR): Final Report

    SciTech Connect (OSTI)

    Buche, D. L.

    2008-07-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects.

  16. Automating Nuclear-Safety-Related SQA Procedures with Custom Applications

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Freels, James D.

    2016-01-01

    Nuclear safety-related procedures are rigorous for good reason. Small design mistakes can quickly turn into unwanted failures. Researchers at Oak Ridge National Laboratory worked with COMSOL to define a simulation app that automates the software quality assurance (SQA) verification process and provides results in less than 24 hours.

  17. ARES: automated response function code. Users manual. [HPGAM and LSQVM

    SciTech Connect (OSTI)

    Maung, T.; Reynolds, G.M.

    1981-06-01

    This ARES user's manual provides detailed instructions for a general understanding of the Automated Response Function Code and gives step by step instructions for using the complete code package on a HP-1000 system. This code is designed to calculate response functions of NaI gamma-ray detectors, with cylindrical or rectangular geometries.

  18. Jayhuggins's blog | OpenEI Community

    Open Energy Info (EERE)

    Submitted by Jayhuggins(34) Member 2 October, 2012 - 15:59 OpenEI Download/Upload Automation Scripts I've created a set of scripts for automating downloading a list of OpenEI...
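
    The blog post's actual scripts are not reproduced in this record; a generic Python sketch of the download half might look like the following, with placeholder URLs standing in for the OpenEI file list.

      # Minimal sketch of a bulk-download script; the URLs are placeholders only.
      import os
      import urllib.error
      import urllib.request

      def download_all(urls, dest_dir="downloads"):
          """Fetch each URL and save it under dest_dir, named after the last path segment."""
          os.makedirs(dest_dir, exist_ok=True)
          for url in urls:
              filename = os.path.join(dest_dir, url.rstrip("/").split("/")[-1] or "index")
              try:
                  urllib.request.urlretrieve(url, filename)
                  print(f"saved {url} -> {filename}")
              except urllib.error.URLError as err:
                  print(f"failed {url}: {err}")

      if __name__ == "__main__":
          download_all(["https://example.org/dataset1.csv",
                        "https://example.org/dataset2.csv"])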

  19. Automating the determination of 3D protein structure

    SciTech Connect (OSTI)

    Rayl, K.D.

    1993-12-31

    The creation of an automated method for determining 3D protein structure would be invaluable to the field of biology and presents an interesting challenge to computer science. Unfortunately, given the current level of protein knowledge, a completely automated solution method is not yet feasible, therefore, our group has decided to integrate existing databases and theories to create a software system that assists X-ray crystallographers in specifying a particular protein structure. By breaking the problem of determining overall protein structure into small subproblems, we hope to come closer to solving a novel structure by solving each component. By generating necessary information for structure determination, this method provides the first step toward designing a program to determine protein conformation automatically.

  20. Tritium Irrigation Facility & Automated Vadose Zone Monitoring System |

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Savannah River Ecology Laboratory Tritium Irrigation Facility and Automated Vadose Monitoring System The opportunity to study tritium movement in a natural system presents a rare opportunity for both physical and biological research. Researchers may take advantage of tritium's properties as a conservative tracer for modeling contaminant transport, as a radioactive tracer for examining biological processes involving water, or as an example of radionuclide contaminant behavior in natural

  1. Sandia National Laboratories: Research: High Consequence, Automation, &

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Robotics: Neural Control of Prosthetics Neural Control of Prosthetics Advanced prosthetics Researchers in High Consequence, Automation, & Robotics are working on ways to improve amputees' control over prosthetics with direct help from their own nervous systems. Neural interfaces operate where the nervous system and an artificial device intersect. Interfaces can monitor nerve signals or provide inputs that let amputees control prosthetic devices by direct neural signals, the same way they

  2. Optical Method for Automated Real Time Control of Elemental Composition,

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Distribution, and Film Thickness in CIGS Solar Cell Production - Energy Innovation Portal Find More Like This Return to Search Optical Method for Automated Real Time Control of Elemental Composition, Distribution, and Film Thickness in CIGS Solar Cell Production National Renewable Energy Laboratory Contact NREL About This Technology Technology Marketing Summary The solar industry has shown significant growth over the past decade. From 2002 to 2007 the market for Copper Indium Gallium

  3. Reference Model for Control and Automation Systems in Electrical Power

    Energy Savers [EERE]

    Reference Model for Control and Automation Systems in Electrical Power Version 1.2 October 12, 2005 Prepared by: Sandia National Laboratories' Center for SCADA Security Jason Stamp, Technical Lead Michael Berg, Co-Technical Lead Michael Baca, Project Lead This work was conducted for the DOE Office of Electricity Delivery and Energy Reliability under Contract M64SCADSNL Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department

  4. Sandia National Laboratories: Research: High Consequence, Automation, &

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Robotics: Guided Bullet Technology Guided Bullet Technology Robotics Facility Leveraging the capabilities of the High Consequence, Automation, & Robotics Precision Micro Assembly Lab, we have designed a self-guided .50 caliber projectile that utilizes a laser designated target and is configured to be fired from a small caliber, smooth bore gun barrel. Self-guided projectiles increase the probability of hit at targets at long range. Design The self-guided projectile utilizes a laser

  5. Automated defect spatial signature analysis for semiconductor manufacturing process

    DOE Patents [OSTI]

    Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed

    1999-01-01

    An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high level categories, classifying the categorized data contained in each high level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.
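
    A toy sketch of the three-stage flow named in the abstract (categorize, classify, correlate); the spatial categories, density threshold, and correlation rule are illustrative assumptions, not the patented algorithms.

      # Minimal sketch of a categorize -> classify -> correlate pipeline; all rules are illustrative.
      def categorize(defects, wafer_radius_mm=100.0):
          """Stage 1: split defect (x, y) coordinates into coarse spatial categories."""
          edge, interior = [], []
          for x, y in defects:
              (edge if (x * x + y * y) ** 0.5 > 0.9 * wafer_radius_mm else interior).append((x, y))
          return {"edge": edge, "interior": interior}

      def classify(categories):
          """Stage 2: label each category with a signature event based on defect count."""
          return {name: ("scratch-like cluster" if len(pts) > 10 else "random sparse")
                  for name, pts in categories.items()}

      def correlate(signatures):
          """Stage 3: map signature events to a candidate process condition."""
          if signatures.get("edge") == "scratch-like cluster":
              return "possible handling / edge-grip issue"
          return "no anomalous condition indicated"

      if __name__ == "__main__":
          defects = [(95.0, 5.0 + i * 0.2) for i in range(12)] + [(10.0, 20.0)]
          print(correlate(classify(categorize(defects))))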

  6. Automated closure system for nuclear reactor fuel assemblies

    DOE Patents [OSTI]

    Christiansen, David W. (Kennewick, WA); Brown, William F. (West Richland, WA)

    1985-01-01

    A welder for automated closure of fuel pins by a pulsed magnetic process in which the open end of a length of cladding is positioned within a complementary tube surrounded by a pulsed magnetic welder. Seals are provided at each end of the tube, which can be evacuated or can receive tag gas for direct introduction to the cladding interior. Loading of magnetic rings and end caps is accomplished automatically in conjunction with the welding steps carried out within the tube.

  7. Automated Measurement and Signaling Systems for the Transactional Network |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy Measurement and Signaling Systems for the Transactional Network Automated Measurement and Signaling Systems for the Transactional Network The Transactional Network Project is a multi-lab activity funded by the U.S. Department of Energy's Building Technologies Office. The project team included staff from Lawrence Berkeley National Laboratory, Pacific Northwest National Laboratory, and Oak Ridge National Laboratory. The team designed, prototyped, and tested a transactional

  8. Automated Sealing of Home Enclosures with Aerosol Particles | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Energy Sealing of Home Enclosures with Aerosol Particles Automated Sealing of Home Enclosures with Aerosol Particles This presentation was delivered during a Building America webinar on October 14, 2011, by the Building Industry Research Alliance team member Mark Modera. PDF icon bira_webinar_10_14_11.pdf More Documents & Publications Building America Technology Solutions for New and Existing Homes: Apartment Compartmentalization with an Aerosol-Based Sealing Process Building America

  9. Automated Image Analysis of Fibers - Energy Innovation Portal

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Automated Image Analysis of Fibers: Automatic Nanofiber Characterization and Recognition Software. Argonne National Laboratory. About This Technology: Image with recognized fiber edges; the diameter is measured between each yellow and red tail.

  10. Automated Surface Sampling Probe for Mass Spectrometry - Energy Innovation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Portal. Automated Surface Sampling Probe for Mass Spectrometry: Mass Spectrometry Imaging for Drug Discovery and Pharmaceutical Research. Oak Ridge National Laboratory. Technology Marketing Summary: Dr. Gary Van Berkel and colleagues have developed a liquid microjunction surface sampling probe (LMJ-SSP). The LMJ-SSP provides mass spectrometry with a simple and efficient ambient surface

  11. Automated Testing Instrument for Verification of Complex Computational

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Systems | Princeton Plasma Physics Lab Automated Testing Instrument for Verification of Complex Computational Systems Verifying the functionality and proper operation of both hardware and software of complex, low, medium and high speed, Real-Time Instrumentation, Acquisition, Control and Protection systems is typically time consuming and costly. When these systems are expanded, modified, enhanced with new features or software 'bugs' corrected, re-verification of correct operation must be

  12. Automated hybridization/imaging device for fluorescent multiplex DNA sequencing

    DOE Patents [OSTI]

    Weiss, R.B.; Kimball, A.W.; Gesteland, R.F.; Ferguson, F.M.; Dunn, D.M.; Di Sera, L.J.; Cherry, J.L.

    1995-11-28

    A method is disclosed for automated multiplex sequencing of DNA with an integrated automated imaging hybridization chamber system. This system comprises an hybridization chamber device for mounting a membrane containing size-fractionated multiplex sequencing reaction products, apparatus for fluid delivery to the chamber device, imaging apparatus for light delivery to the membrane and image recording of fluorescence emanating from the membrane while in the chamber device, and programmable controller apparatus for controlling operation of the system. The multiplex reaction products are hybridized with a probe, the enzyme (such as alkaline phosphatase) is bound to a binding moiety on the probe, and a fluorogenic substrate (such as a benzothiazole derivative) is introduced into the chamber device by the fluid delivery apparatus. The enzyme converts the fluorogenic substrate into a fluorescent product which, when illuminated in the chamber device with a beam of light from the imaging apparatus, excites fluorescence of the fluorescent product to produce a pattern of hybridization. The pattern of hybridization is imaged by a CCD camera component of the imaging apparatus to obtain a series of digital signals. These signals are converted by the controller apparatus into a string of nucleotides corresponding to the nucleotide sequence by an automated sequence reader. The method and apparatus are also applicable to other membrane-based applications such as colony and plaque hybridization and Southern, Northern, and Western blots. 9 figs.

  13. Automated hybridization/imaging device for fluorescent multiplex DNA sequencing

    DOE Patents [OSTI]

    Weiss, Robert B.; Kimball, Alvin W.; Gesteland, Raymond F.; Ferguson, F. Mark; Dunn, Diane M.; Di Sera, Leonard J.; Cherry, Joshua L.

    1995-01-01

    A method is disclosed for automated multiplex sequencing of DNA with an integrated automated imaging hybridization chamber system. This system comprises an hybridization chamber device for mounting a membrane containing size-fractionated multiplex sequencing reaction products, apparatus for fluid delivery to the chamber device, imaging apparatus for light delivery to the membrane and image recording of fluorescence emanating from the membrane while in the chamber device, and programmable controller apparatus for controlling operation of the system. The multiplex reaction products are hybridized with a probe, then an enzyme (such as alkaline phosphatase) is bound to a binding moiety on the probe, and a fluorogenic substrate (such as a benzothiazole derivative) is introduced into the chamber device by the fluid delivery apparatus. The enzyme converts the fluorogenic substrate into a fluorescent product which, when illuminated in the chamber device with a beam of light from the imaging apparatus, excites fluorescence of the fluorescent product to produce a pattern of hybridization. The pattern of hybridization is imaged by a CCD camera component of the imaging apparatus to obtain a series of digital signals. These signals are converted by the controller apparatus into a string of nucleotides corresponding to the nucleotide sequence by an automated sequence reader. The method and apparatus are also applicable to other membrane-based applications such as colony and plaque hybridization and Southern, Northern, and Western blots.

  14. GPFA-AB_Phase1UtilizationTask4DataUpload

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Teresa E. Jordan

    2015-09-30

    This submission of Utilization Analysis data to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (project DE-EE0006726). The submission includes data pertinent to the methods and results of an analysis of the Surface Levelized Cost of Heat (SLCOH) for US Census Bureau Places within the study area. This was calculated using a modification of a program called GEOPHIRES, available at http://koenraadbeckers.net/geophires/index.php. The MATLAB modules used in conjunction with GEOPHIRES, the MATLAB data input file, the GEOPHIRES output data file, and an explanation of the software components have been provided. Results of the SLCOH analysis appear on 4 .png image files as mapped risk of heat utilization. For each of the 4 image (.png) files, there is an accompanying georeferenced TIF (.tif) file by the same name. In addition to calculating SLCOH, this Task 4 also identified many sites that may be prospects for use of a geothermal district heating system, based on their size and industry, rather than on the SLCOH. An industry sorted listing of the sites (.xlsx) and a map of these sites plotted as a layer onto different iterations of maps combining the three geological risk factors (Thermal Quality, Natural Reservoir Quality, and Risk of Seismicity) has been provided. In addition to the 6 image (.png) files of the maps in this series, a shape (.shp) file and 7 associated files are included as well. Finally, supporting files (.pdf) describing the utilization analysis methodology and summarizing the anticipated permitting for a deep district heating system are supplied.

  15. GPFA-AB_Phase1ReservoirTask2DataUpload

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Teresa E. Jordan

    2015-10-22

    This submission to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin. The files included in this zip file contain all data pertinent to the methods and results of this task's output, which is a cohesive multi-state map of all known potential geothermal reservoirs in our region, ranked by their potential favorability. Favorability is quantified using a new metric, Reservoir Productivity Index, as explained in the Reservoirs Methodology Memo (included in zip file). Shapefile and images of the Reservoir Productivity and Reservoir Uncertainty are included as well.

  16. T-647: PHP File Upload Bug May Let Remote Users Overwrite Files on the Target System

    Broader source: Energy.gov [DOE]

    PHP is prone to a security-bypass vulnerability. Successful exploits will allow an attacker to delete files from the root directory, which may aid in further attacks. PHP 5.3.6 is vulnerable; other versions may also be affected.

  17. GPFA-AB_Phase1ReservoirTask2DataUpload

    SciTech Connect (OSTI)

    Teresa E. Jordan

    2015-10-22

    This submission to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin. The files included in this zip file contain all data pertinent to the methods and results of this task’s output, which is a cohesive multi-state map of all known potential geothermal reservoirs in our region, ranked by their potential favorability. Favorability is quantified using a new metric, Reservoir Productivity Index, as explained in the Reservoirs Methodology Memo (included in zip file). Shapefile and images of the Reservoir Productivity and Reservoir Uncertainty are included as well.

  18. V-177: VMware vCenter Chargeback Manager File Upload Handling Vulnerability

    Broader source: Energy.gov [DOE]

    The vCenter Chargeback Manager contains a critical vulnerability that allows for remote code execution

  19. GPFA-AB_Phase1UtilizationTask4DataUpload

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Teresa E. Jordan

    2015-09-30

    This submission of Utilization Analysis data to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (project DE-EE0006726). The submission includes data pertinent to the methods and results of an analysis of the Surface Levelized Cost of Heat (SLCOH) for US Census Bureau ‘Places’ within the study area. This was calculated using a modification of a program called GEOPHIRES, available at http://koenraadbeckers.net/geophires/index.php. The MATLAB modules used in conjunction with GEOPHIRES, the MATLAB data input file, the GEOPHIRES output data file, and an explanation of the software components have been provided. Results of the SLCOH analysis appear on 4 .png image files as mapped ‘risk’ of heat utilization. For each of the 4 image (.png) files, there is an accompanying georeferenced TIF (.tif) file by the same name. In addition to calculating SLCOH, this Task 4 also identified many sites that may be prospects for use of a geothermal district heating system, based on their size and industry, rather than on the SLCOH. An industry sorted listing of the sites (.xlsx) and a map of these sites plotted as a layer onto different iterations of maps combining the three geological risk factors (Thermal Quality, Natural Reservoir Quality, and Risk of Seismicity) has been provided. In addition to the 6 image (.png) files of the maps in this series, a shape (.shp) file and 7 associated files are included as well. Finally, supporting files (.pdf) describing the utilization analysis methodology and summarizing the anticipated permitting for a deep district heating system are supplied.

  20. The rotary zone thermal cycler: A low-power system enabling automated rapid

    Office of Scientific and Technical Information (OSTI)

    PCR (Journal Article) | DOE PAGES The rotary zone thermal cycler: A low-power system enabling automated rapid PCR Title: The rotary zone thermal cycler: A low-power system enabling automated rapid PCR In this study, advances in molecular biology, microfluidics, and laboratory automation continue to expand the accessibility and applicability of these methods beyond the confines of conventional, centralized laboratory facilities and into point of use roles in clinical,

  1. Automation for industrial wastewater treatment. (Latest citations from Pollution abstracts). Published Search

    SciTech Connect (OSTI)

    1996-02-01

    The bibliography contains citations concerning automated monitoring and purification of wastewater. The design and development of new automated systems and improvements to existing applications are described. The citations examine the benefits of automation, including more efficient use of chemicals, continuous operation, and early warning of equipment failure. Disadvantages are addressed, as well, including increased cost of energy, the need for sophisticated hardware and software, and inability to maintain operation during electric power failure. Case histories of operating automated industrial and municipal systems are presented. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  2. Automation systems for Demand Response, ForskEL (Smart Grid Project...

    Open Energy Info (EERE)

    systems for Demand Response, ForskEL (Smart Grid Project) Jump to: navigation, search Project Name Automation systems for Demand Response, ForskEL Country Denmark Coordinates...

  3. AUTOMATING GROUNDWATER SAMPLING AT HANFORD THE NEXT STEP

    SciTech Connect (OSTI)

    CONNELL CW; CONLEY SF; HILDEBRAND RD; CUNNINGHAM DE; R_D_Doug_Hildebrand@rl.gov; DeVon_E_Cunningham@rl.gov

    2010-01-21

    Historically, the groundwater monitoring activities at the Department of Energy's Hanford Site in southeastern Washington State have been very "people intensive." Approximately 1500 wells are sampled each year by field personnel or "samplers." These individuals have been issued pre-printed forms showing information about the well(s) for a particular sampling evolution. This information is taken from 2 official electronic databases: the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS). The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and other personnel posted the collected information onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. A pilot project for automating this extremely tedious process was launched in 2008. Initially, the automation was focused on water-level measurements. Now, the effort is being extended to automate the meta-data associated with collecting groundwater samples. The project allowed electronic forms produced in the field by samplers to be used in a workflow process where the data is transferred to the database and the electronic form is filed in managed records - thus eliminating manually completed forms. Eliminating the manual forms and streamlining the data entry not only improved the accuracy of the information recorded, but also enhanced the efficiency and sampling capacity of field office personnel.
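
    A minimal sketch of the workflow idea (an electronic field form posted directly to a database instead of being transcribed by hand), using a local SQLite table with hypothetical column names rather than the actual HWIS/HEIS interfaces.

      # Minimal sketch: post an electronic field form straight to a database table.
      # The table schema, form fields, and well name are hypothetical illustrations.
      import sqlite3

      def post_field_form(conn, form):
          """Insert one field form (a dict of named values) as a database row."""
          conn.execute(
              "INSERT INTO water_levels (well_name, sample_date, depth_to_water_ft, sampler)"
              " VALUES (:well_name, :sample_date, :depth_to_water_ft, :sampler)", form)
          conn.commit()

      if __name__ == "__main__":
          conn = sqlite3.connect(":memory:")
          conn.execute("CREATE TABLE water_levels (well_name TEXT, sample_date TEXT,"
                       " depth_to_water_ft REAL, sampler TEXT)")
          post_field_form(conn, {"well_name": "299-W10-1", "sample_date": "2010-01-21",
                                 "depth_to_water_ft": 231.4, "sampler": "field_office_01"})
          print(conn.execute("SELECT * FROM water_levels").fetchall())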

  4. Opportunities for Automated Demand Response in California Wastewater Treatment Facilities

    SciTech Connect (OSTI)

    Aghajanzadeh, Arian; Wray, Craig; McKane, Aimee

    2015-08-30

    Previous research over a period of six years has identified wastewater treatment facilities as good candidates for demand response (DR), automated demand response (Auto-DR), and Energy Efficiency (EE) measures. This report summarizes that work, including the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy used and demand, as well as details of the wastewater treatment process. It also discusses control systems and automated demand response opportunities. Furthermore, this report summarizes the DR potential of three wastewater treatment facilities. In particular, Lawrence Berkeley National Laboratory (LBNL) has collected data at these facilities from control systems, submetered process equipment, utility electricity demand records, and governmental weather stations. The collected data were then used to generate a summary of wastewater power demand, factors affecting that demand, and demand response capabilities. These case studies show that facilities that have implemented energy efficiency measures and that have centralized control systems are well suited to shed or shift electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. In summary, municipal wastewater treatment energy demand in California is large, and energy-intensive equipment offers significant potential for automated demand response. In particular, large load reductions were achieved by targeting effluent pumps and centrifuges. One of the limiting factors to implementing demand response is the reaction of effluent turbidity to reduced aeration at an earlier stage of the process. Another limiting factor is that cogeneration capabilities of municipal facilities, including existing power purchase agreements and utility receptiveness to purchasing electricity from cogeneration facilities, limit a facility’s potential to participate in other DR activities.

  5. Meeting Minutes from Automated Home Energy Management System Expert Meeting

    Energy Savers [EERE]

    Automated Home Energy Management System Expert Meeting October 1-2, 2009 AGENDA - Day 1 8:30 - 8:45 Welcome and Debriefing of Building America and Home Energy Management Research- Lew Pratsch, DOE 8:45 - 9:15 Utilities Trends- Smart Grid Projects and Integration With Home Controls - Mike Keesee, SMUD 9:15 - 9:45 Thoughts on Controls System Performance Requirements - Rich Brown, LBL 9:45 - 10:15 Efficiency Trends in Consumer Electronics - Kurtis McKenney, TIAX 10:15 - 10:30 Session Break 10:30 -

  6. Automated Vulnerability Detection for Compiled Smart Grid Software

    SciTech Connect (OSTI)

    Prowell, Stacy J; Pleszkoch, Mark G; Sayre, Kirk D; Linger, Richard C

    2012-01-01

    While testing performed with proper experimental controls can provide scientifically quantifiable evidence that software does not contain unintentional vulnerabilities (bugs), it is insufficient to show that intentional vulnerabilities exist, and impractical to certify devices for the expected long lifetimes of use. For both of these needs, rigorous analysis of the software itself is essential. Automated software behavior computation applies rigorous static software analysis methods based on function extraction (FX) to compiled software to detect vulnerabilities, intentional or unintentional, and to verify critical functionality. This analysis is based on the compiled firmware, takes into account machine precision, and does not rely on heuristics or approximations early in the analysis.

  7. Automated Comparison of Building Energy Simulation Engines (Presentation)

    SciTech Connect (OSTI)

    Polly, B.; Horowitz, S.; Booten, B.; Kruis, N.; Christensen, C.

    2012-08-01

    This presentation describes the BEopt comparative test suite, which is a tool that facilitates the automated comparison of building energy simulation engines. It also demonstrates how the test suite is improving the accuracy of building energy simulation programs. Building energy simulation programs inform energy efficient design for new homes and energy efficient upgrades for existing homes. Stakeholders rely on accurate predictions from simulation programs. Previous research indicates that software tends to over-predict energy usage for poorly-insulated leaky homes. NREL is identifying, investigating, and resolving software inaccuracy issues. Comparative software testing is one method of many that NREL uses to identify potential software issues.

  8. Highly insulating Residential Windows Using Smart Automated Shading

    Energy Savers [EERE]

    Highly Insulating Residential Windows Using Smart Automated Shading. 2015 Building Technologies Office Peer Review. Presenters: Robert Hart, rghart@lbl.gov, and Stephen Selkowitz, seselkowitz@lbl.gov, Lawrence Berkeley National Laboratory; Kevin Gaul, GaulKJ@pella.com, Pella Corporation. Project Summary - Timeline: start date 04/01/2013; planned end date 03/31/2016. Key Milestones: 1. Measured thermal performance of static prototype windows is within 0.03 Btu/h-ft2-F (NFRC tolerance) of design specifications, 09/30/2014

  9. Operating and maintenance benefits of automated oven wall temperature measurement

    SciTech Connect (OSTI)

    Leuchtmann, K.P.; Hinz, D.; Bergbau, D.; Platts, M.

    1997-12-31

    Despite its shortcomings, manual measurement of heating-flue temperature has long been the only method of monitoring the temperatures prevailing in a coke oven battery and discovering weak points in the heating system. In the course of the last few years, a number of automated temperature measuring systems have been developed that are intended to replace or supplement manual heating-flue measurement. These measuring systems and their advantages and disadvantages are briefly described in this paper. Additionally, operational experience gathered with the oven chamber wall temperature measuring system is discussed in detail.

  10. Automated process for solvent separation of organic/inorganic substance

    DOE Patents [OSTI]

    Schweighardt, Frank K. (Upper Macungie, PA)

    1986-01-01

    There is described an automated process for the solvent separation of organic/inorganic substances that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In the process, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control apparatus. The mixture in the filter is agitated by ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process.

  11. Automated titration method for use on blended asphalts

    DOE Patents [OSTI]

    Pauli, Adam T.; Robertson, Raymond E.; Branthaver, Jan F.; Schabron, John F.

    2012-08-07

    A system for determining parameters and compatibility of a substance such as an asphalt or other petroleum substance uses titration to highly accurately determine one or more flocculation occurrences and is especially applicable to the determination or use of Heithaus parameters and optimal mixing of various asphalt stocks. In a preferred embodiment, automated titration in an oxygen gas exclusive system and further using spectrophotometric analysis (2-8) of solution turbidity is presented. A reversible titration technique enabling in-situ titration measurement of various solution concentrations is also presented.

  12. Automated process for solvent separation of organic/inorganic substance

    DOE Patents [OSTI]

    Schweighardt, F.K.

    1986-07-29

    There is described an automated process for the solvent separation of organic/inorganic substances that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In the process, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control apparatus. The mixture in the filter is agitated by ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process. 4 figs.

  13. Experience in automating the Title V permit application

    SciTech Connect (OSTI)

    Ashcraft, T.; O'Brien, J.

    1995-12-31

    Title V of the Clean Air Act Amendments of 1990 requires that the owners and operators of certain types of industrial plants obtain a federal operating permit. In general, any plant that meets the definition of a major source must either obtain a Title V operating permit or take actions to "opt out" of the Title V program. This technical paper describes the authors' experience designing and implementing a computer system for data management and reporting of Clean Air Act Title V permit information. Recommendations are also provided to guide companies and industry groups planning to undertake similar automation projects.

  14. Method and apparatus for automated, modular, biomass power generation

    DOE Patents [OSTI]

    Diebold, James P; Lilley, Arthur; Browne, III, Kingsbury; Walt, Robb Ray; Duncan, Dustin; Walker, Michael; Steele, John; Fields, Michael; Smith, Trevor

    2013-11-05

    Method and apparatus for generating a low tar, renewable fuel gas from biomass and using it in other energy conversion devices, many of which were designed for use with gaseous and liquid fossil fuels. An automated, downdraft gasifier incorporates extensive air injection into the char bed to maintain the conditions that promote the destruction of residual tars. The resulting fuel gas and entrained char and ash are cooled in a special heat exchanger, and then continuously cleaned in a filter prior to usage in standalone as well as networked power systems.

  15. Method and apparatus for automated, modular, biomass power generation

    DOE Patents [OSTI]

    Diebold, James P.; Lilley, Arthur; Browne, Kingsbury III; Walt, Robb Ray; Duncan, Dustin; Walker, Michael; Steele, John; Fields, Michael; Smith, Trevor

    2011-03-22

    Method and apparatus for generating a low tar, renewable fuel gas from biomass and using it in other energy conversion devices, many of which were designed for use with gaseous and liquid fossil fuels. An automated, downdraft gasifier incorporates extensive air injection into the char bed to maintain the conditions that promote the destruction of residual tars. The resulting fuel gas and entrained char and ash are cooled in a special heat exchanger, and then continuously cleaned in a filter prior to usage in standalone as well as networked power systems.

  16. Automated identification and indexing of dislocations in crystal interfaces

    SciTech Connect (OSTI)

    Stukowski, Alexander; Bulatov, Vasily V.; Arsenlis, Athanasios

    2012-10-31

    Here, we present a computational method for identifying partial and interfacial dislocations in atomistic models of crystals with defects. Our automated algorithm is based on a discrete Burgers circuit integral over the elastic displacement field and is not limited to specific lattices or dislocation types. Dislocations in grain boundaries and other interfaces are identified by mapping atomic bonds from the dislocated interface to an ideal template configuration of the coherent interface to reveal incompatible displacements induced by dislocations and to determine their Burgers vectors. Additionally, the algorithm generates a continuous line representation of each dislocation segment in the crystal and also identifies dislocation junctions.
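
    As a rough illustration of the discrete Burgers circuit idea (not the authors' full algorithm, which maps bonds to an ideal template of the coherent interface), the sketch below sums bond-vector differences between the dislocated and reference configurations around a closed loop of atoms; a nonzero closure failure approximates the Burgers vector enclosed by the circuit. The inputs and array names are hypothetical.

```python
import numpy as np

def burgers_vector(circuit, actual_pos, reference_pos):
    """Sum bond-vector differences around a closed atom circuit.

    circuit        : ordered list of atom indices forming a closed loop
    actual_pos     : (N, 3) atomic positions in the dislocated crystal
    reference_pos  : (N, 3) positions of the same atoms in the ideal
                     (template) configuration
    A nonzero closure failure approximates the Burgers vector of the
    dislocation(s) threading the circuit.
    """
    b = np.zeros(3)
    for i, j in zip(circuit, circuit[1:] + circuit[:1]):  # close the loop
        b += (actual_pos[j] - actual_pos[i]) - (reference_pos[j] - reference_pos[i])
    return b
```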

  17. Automated edge finishing using an active XY table

    DOE Patents [OSTI]

    Loucks, Clifford S.; Starr, Gregory P.

    1993-01-01

    The disclosure is directed to an apparatus and method for automated edge finishing using hybrid position/force control of an XY table. The disclosure is particularly directed to learning the trajectory of the edge of a workpiece by "guarded moves". Machining is done by controllably moving the XY table, with the workpiece mounted thereon, along the learned trajectory with feedback from a force sensor. Other similar workpieces can be mounted on the XY table without a fixture, located, and the learned trajectory adjusted accordingly.

  18. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    SciTech Connect (OSTI)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  19. Northwest Open Automated Demand Response Technology Demonstration Project

    SciTech Connect (OSTI)

    Kiliccote, Sila; Piette, Mary Ann; Dudley, Junqiao

    2010-03-17

    The Lawrence Berkeley National Laboratory (LBNL) Demand Response Research Center (DRRC) demonstrated and evaluated open automated demand response (OpenADR) communication infrastructure to reduce winter morning and summer afternoon peak electricity demand in commercial buildings in the Seattle area. LBNL performed this demonstration for the Bonneville Power Administration (BPA) in the Seattle City Light (SCL) service territory at five sites: Seattle Municipal Tower, Seattle University, McKinstry, and two Target stores. This report describes the process and results of the demonstration. OpenADR is an information exchange model that uses a client-server architecture to automate demand response (DR) programs. These field tests evaluated the feasibility of deploying fully automated DR during both winter and summer peak periods. DR savings were evaluated for several building systems and control strategies. This project studied DR during hot summer afternoons and cold winter mornings, both periods when electricity demand is typically high. This is the DRRC project team's first experience using automation for year-round DR resources and evaluating the flexibility of commercial building end-use loads to participate in DR in dual-peaking climates. The lessons learned contribute to understanding end-use loads that are suitable for dispatch at different times of the year. The project was funded by BPA and SCL. BPA is a U.S. Department of Energy agency headquartered in Portland, Oregon, and serving the Pacific Northwest. BPA operates an electricity transmission system and markets wholesale electrical power at cost from federal dams, one non-federal nuclear plant, and other non-federal hydroelectric and wind energy generation facilities. Created by the citizens of Seattle in 1902, SCL is the second-largest municipal utility in America. SCL purchases approximately 40% of its electricity and the majority of its transmission from BPA through a preference contract. SCL also provides ancillary services within its own balancing authority. The relationship between BPA and SCL creates a unique opportunity to create DR programs that address both BPA's and SCL's markets simultaneously. Although simultaneously addressing both markets could significantly increase the value of DR programs for BPA, SCL, and the end user, establishing program parameters that maximize this value is challenging because of complex contractual arrangements and the absence of a central Independent System Operator or Regional Transmission Organization in the Northwest.
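
    The client-server exchange mentioned above can be pictured as a simple polling loop: a site client periodically asks a DR automation server for the current event state and maps it to a pre-programmed control strategy. The sketch below only illustrates that pattern under assumed names; the URL, JSON field, and strategy labels are hypothetical and do not represent the actual OpenADR message schema.

```python
import time
import requests  # third-party HTTP client

DRAS_URL = "https://example-dras.invalid/event-state"   # hypothetical endpoint

# Hypothetical mapping from event level to a pre-programmed building strategy
STRATEGIES = {"normal": None,
              "moderate": "raise cooling setpoints 2 F",
              "high": "raise setpoints 4 F and shed non-critical lighting"}

def poll_dr_server(interval_s=60):
    """Poll the DR server and dispatch the strategy for the reported level."""
    while True:
        reply = requests.get(DRAS_URL, timeout=10).json()
        level = reply.get("event_level", "normal")        # hypothetical field
        strategy = STRATEGIES.get(level)
        if strategy:
            print(f"DR event level '{level}': {strategy}")
        time.sleep(interval_s)
```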

  20. Modular Automated Processing System (MAPS) for analysis of biological samples.

    SciTech Connect (OSTI)

    Gil, Geun-Cheol; Chirica, Gabriela S.; Fruetel, Julia A.; VanderNoot, Victoria A.; Branda, Steven S.; Schoeniger, Joseph S.; Throckmorton, Daniel J.; Brennan, James S.; Renzi, Ronald F.

    2010-10-01

    We have developed a novel modular automated processing system (MAPS) that enables reliable, high-throughput analysis as well as sample-customized processing. This system is comprised of a set of independent modules that carry out individual sample processing functions: cell lysis, protein concentration (based on hydrophobic, ion-exchange, and affinity interactions), interferent depletion, buffer exchange, and enzymatic digestion of proteins of interest. Taking advantage of its unique capacity for enclosed processing of intact bioparticulates (viruses, spores) and complex serum samples, we have used MAPS for analysis of BSL1 and BSL2 samples to identify specific protein markers through integration with the portable microChemLab™ and MALDI.

  1. Integration of Real-Time Data Into Building Automation Systems

    SciTech Connect (OSTI)

    Mark J. Stunder; Perry Sebastian; Brenda A. Chube; Michael D. Koontz

    2003-04-16

    The project goal was to investigate the possibility of using predictive real-time information from the Internet as an input to building management system algorithms. The objectives were to identify the types of information most valuable to commercial and residential building owners, managers, and system designers; to comprehensively investigate and document currently available electronic real-time information suitable for use in building management systems; to verify the reliability of the information and recommend accreditation methods for data and providers; to assess methodologies to automatically retrieve and utilize the information; to characterize equipment required to implement automated integration; to demonstrate the feasibility and benefits of using the information in building management systems; and to identify evolutionary control strategies.
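
    As a minimal illustration of feeding predictive Internet data into a building management algorithm (the report itself does not prescribe a specific rule), the sketch below turns a retrieved temperature forecast into a simple pre-cooling decision; the threshold and function names are illustrative assumptions.

```python
def precool_tonight(forecast_highs_f, threshold_f=90.0):
    """Decide whether to pre-cool overnight based on tomorrow's forecast high.

    forecast_highs_f : list of forecast daily highs (deg F), index 0 = tomorrow
    threshold_f      : illustrative trigger temperature, not from the report
    """
    return bool(forecast_highs_f) and forecast_highs_f[0] >= threshold_f

# A 92 F forecast high for tomorrow triggers the pre-cooling strategy.
print(precool_tonight([92.0, 85.0]))   # True
```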

  2. Automated identification and indexing of dislocations in crystal interfaces

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Stukowski, Alexander; Bulatov, Vasily V.; Arsenlis, Athanasios

    2012-10-31

    Here, we present a computational method for identifying partial and interfacial dislocations in atomistic models of crystals with defects. Our automated algorithm is based on a discrete Burgers circuit integral over the elastic displacement field and is not limited to specific lattices or dislocation types. Dislocations in grain boundaries and other interfaces are identified by mapping atomic bonds from the dislocated interface to an ideal template configuration of the coherent interface to reveal incompatible displacements induced by dislocations and to determine their Burgers vectors. Additionally, the algorithm generates a continuous line representation of each dislocation segment in the crystal and also identifies dislocation junctions.

  3. Method and apparatus for globally-accessible automated testing

    DOE Patents [OSTI]

    Layne, Scott P.; Beugelsdijk, Tony J.

    1998-01-01

    A method and apparatus for sharing integrated testing services with a plurality of autonomous remote clients is disclosed. In the disclosed method, in response to an access request message, a process controller transmits an access enabling message to the remote client. The access enabling message includes instructions performable by a remote client to generate test equipment commands. A process controller interprets and transforms these commands into automated test instrument suite commands, which are provided to laboratory modules to perform the indicated tests. Test data results are then obtained and transmitted to the remote client.

  4. Open Automated Demand Response for Small Commercial Buildings

    SciTech Connect (OSTI)

    Dudley, June Han; Piette, Mary Ann; Koch, Ed; Hennage, Dan

    2009-05-01

    This report characterizes small commercial buildings by market segments, systems, and end-uses; develops a framework for identifying demand response (DR) enabling technologies and communication means; and reports on the design and development of a low-cost OpenADR enabling technology that delivers demand reductions as a percentage of the total predicted building peak electric demand. The results show that small offices, restaurants, and retail buildings are the major contributors, making up over one third of the small commercial peak demand. The majority of the small commercial buildings in California are located in southern inland areas and the Central Valley. Single-zone packaged units with manual and programmable thermostat controls make up the majority of heating, ventilation, and air conditioning (HVAC) systems for small commercial buildings with less than 200 kW peak electric demand. Fluorescent tubes with magnetic ballasts and manual controls dominate this customer group's lighting systems. There are various ways, each with pros and cons for a particular application, to communicate with these systems, and three methods to enable automated DR in small commercial buildings using the Open Automated Demand Response (OpenADR) communications infrastructure. Development of DR strategies must consider building characteristics, such as weather sensitivity and load variability, as well as system design (e.g., under-sizing, under-lighting, or over-sizing). Finally, field tests show that requesting demand reductions as a percentage of the total predicted building peak electric demand is feasible using the OpenADR infrastructure.
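
    A demand reduction requested as a percentage of the predicted building peak translates into an absolute shed target in a single step; the sketch below shows that arithmetic with illustrative numbers (the function and values are assumptions, not taken from the report).

```python
def shed_target_kw(predicted_peak_kw, requested_pct):
    """Convert a DR request given as % of predicted peak demand into kW."""
    return predicted_peak_kw * requested_pct / 100.0

# Example: a small commercial building with a 150 kW predicted peak
# asked to shed 15% must drop about 22.5 kW.
print(shed_target_kw(150.0, 15.0))   # 22.5
```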

  5. Methods for Automated and Continuous Commissioning of Building Systems

    SciTech Connect (OSTI)

    Larry Luskay; Michael Brambley; Srinivas Katipamula

    2003-04-30

    Avoidance of poorly installed HVAC systems is best accomplished at the close of construction by having a building and its systems put "through their paces" with a well-conducted commissioning process. This research project focused on developing key components to enable the development of tools that will automatically detect and correct equipment operating problems, thus providing continuous and automatic commissioning of the HVAC systems throughout the life of a facility. A study of pervasive operating problems revealed that the following would most benefit from an automated and continuous commissioning process: (1) faulty economizer operation; (2) malfunctioning sensors; (3) malfunctioning valves and dampers; and (4) access to project design data. Methodologies for detecting system operation faults in these areas were developed and validated in "bare-bones" forms within standard software such as spreadsheets, databases, and statistical or mathematical packages. Demonstrations included flow diagrams and simplified mock-up applications. Techniques to manage data were demonstrated by illustrating how test forms could be populated with original design information and the recommended sequence of operation for equipment systems. Proposed tools would use measured data, design data, and equipment operating parameters to diagnose system problems. Steps for future research are suggested to help move toward practical application of automated commissioning and its high potential to improve equipment availability, increase occupant comfort, and extend the life of system equipment.

  6. PR-PR: Cross-Platform Laboratory Automation System

    SciTech Connect (OSTI)

    Linshiz, G; Stawski, N; Goyal, G; Bi, CH; Poust, S; Sharma, M; Mutalik, V; Keasling, JD; Hillson, NJ

    2014-08-01

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  7. AUTOMATED, HIGHLY ACCURATE VERIFICATION OF RELAP5-3D

    SciTech Connect (OSTI)

    George L Mesina; David Aumiller; Francis Buschman

    2014-07-01

    Computer programs that analyze light water reactor safety solve complex systems of governing, closure, and special process equations to model the underlying physics. In addition, these programs incorporate many other features and are quite large. RELAP5-3D[1] has over 300,000 lines of coding for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. Verification ensures that a program is built right by checking that it meets its design specifications. Recently, there has been increasing emphasis on the development of automated verification processes that compare coding against its documented algorithms and equations and compare its calculations against analytical solutions and the method of manufactured solutions[2]. For the first time, the ability exists to ensure that the data transfer operations associated with timestep advancement/repeating and writing/reading a solution to a file have no unintended consequences. To ensure that the code performs as intended over its extensive list of applications, an automated and highly accurate verification method has been modified and applied to RELAP5-3D. Furthermore, mathematical analysis of the adequacy of the checks used in the comparisons is provided.
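
    As a generic, self-contained illustration of the method of manufactured solutions cited above (not the RELAP5-3D procedure itself), the sketch below checks that a centered second difference reproduces the known second derivative of a manufactured solution u(x) = sin(x) at the expected second-order rate when the grid is refined.

```python
import numpy as np

def observed_order(n_coarse=64, n_fine=128):
    """Measure the convergence order of a centered 2nd-difference operator
    against the manufactured solution u(x) = sin(x), u''(x) = -sin(x)."""
    errors = []
    for n in (n_coarse, n_fine):
        x = np.linspace(0.0, np.pi, n + 1)
        h = x[1] - x[0]
        u = np.sin(x)
        d2u = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2         # numerical operator
        errors.append(np.max(np.abs(d2u + np.sin(x[1:-1]))))  # vs. exact u''
    # Halving h should quarter the error for a 2nd-order scheme.
    return np.log(errors[0] / errors[1]) / np.log(2.0)

print(f"observed order of accuracy: {observed_order():.2f}")  # close to 2.00
```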

  8. Opportunities for Automated Demand Response in California Agricultural Irrigation

    SciTech Connect (OSTI)

    Olsen, Daniel; Aghajanzadeh, Arian; McKane, Aimee

    2015-08-01

    Pumping water for agricultural irrigation represents a significant share of California's annual electricity use and peak demand. It also represents a large source of potential flexibility, as farms possess a form of storage in their wetted soil. By carefully modifying their irrigation schedules, growers can participate in demand response without adverse effects on their crops. This report describes the potential for participation in demand response and automated demand response by agricultural irrigators in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use in California. Typical on-farm controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Case studies of demand response programs in California and across the country are reviewed, and their results along with overall California demand estimates are used to estimate statewide demand response potential. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  9. Evaluate fundamental approaches to longwall dust control: Subprogram D, Longwall automation technology

    SciTech Connect (OSTI)

    Ludlow, J.; Ruggieri, S.

    1990-05-01

    The use of automated equipment on longwall faces can offer significant benefits: the dust exposures of face personnel can be reduced by removing them from areas of high dust concentrations. While the advantages are clear, and sufficiently mature developments have been commercially available, the application of automated systems on US longwalls has met with limited success. The objective of this subprogram was to determine the engineering and economic restraints on the implementation and acceptance of longwall automation technology and to study the potential dust control benefits offered by the technology. Two of the more highly developed automation techniques were chosen for detailed investigation: automated shield advance and shearer remote control. This report discusses the manufacturer surveys, mining company surveys, and underground evaluations conducted as part of this effort. Specific conclusions and recommendations are offered regarding the use of the techniques. 4 figs., 5 tabs.

  10. The present invention relates to automated methods of introducing multiple nucleic acid sequences into one or more target cells.

    DOE Patents [OSTI]

    Church, George M. (Brookline, MA); Wang, Harris H. (Cambridge, MA); Isaacs, Farren J. (Brookline, MA)

    2012-04-10

    The present invention relates to automated methods of introducing multiple nucleic acid sequences into one or more target cells.

  11. Automated event generation for loop-induced processes

    SciTech Connect (OSTI)

    Hirschi, Valentin; Mattelaer, Olivier

    2015-10-22

    We present the first fully automated implementation of cross-section computation and event generation for loop-induced processes. This work is integrated in the MadGraph5_aMC@NLO framework. We describe the optimisations implemented at the level of the matrix element evaluation, phase space integration and event generation allowing for the simulation of large multiplicity loop-induced processes. Along with some selected differential observables, we illustrate our results with a table showing inclusive cross-sections for all loop-induced hadronic scattering processes with up to three final states in the SM as well as for some relevant 2 → 4 processes. Furthermore, many of these are computed here for the first time.

  12. Automated event generation for loop-induced processes

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Hirschi, Valentin; Mattelaer, Olivier

    2015-10-22

    We present the first fully automated implementation of cross-section computation and event generation for loop-induced processes. This work is integrated in the MadGraph5_aMC@NLO framework. We describe the optimisations implemented at the level of the matrix element evaluation, phase space integration and event generation allowing for the simulation of large multiplicity loop-induced processes. Along with some selected differential observables, we illustrate our results with a table showing inclusive cross-sections for all loop-induced hadronic scattering processes with up to three final states in the SM as well as for some relevant 2 → 4 processes. Furthermore, many of these are computed here for the first time.

  13. Automated ultrasonic inspection of turbine blade tenons results summary

    SciTech Connect (OSTI)

    Kotteakos, B.

    1996-12-31

    Cracks in turbine blade tenons can produce severe damage if not detected. Undetected cracks can propagate to a critical size, resulting in loss of shroud, excessive vibration, and consequent unit shutdown. Advances in the development of ultrasonic techniques have provided Southern California Edison Company (SCE) with an effective method of detecting tenon cracking before cracks propagate to critical size. The ultrasonic system utilized by SCE incorporates focused array technology and automated scanning techniques and provides many advantages over conventional manual scanning techniques. This paper addresses the system utilized by the company and the results of inspections since the introduction of the equipment to the power generation industry.

  14. Northwest Open Automated Demand Response Technology Demonstration Project

    SciTech Connect (OSTI)

    Kiliccote, Sila; Dudley, Junqiao Han; Piette, Mary Ann

    2009-08-01

    Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology demonstration and evaluation for the Bonneville Power Administration (BPA) in Seattle City Light's (SCL) service territory. This report summarizes the process and results of deploying open automated demand response (OpenADR) in the Seattle area with winter-morning-peaking commercial buildings. The field tests were designed to evaluate the feasibility of deploying fully automated demand response (DR) at four to six sites in the winter and the savings from various building systems. The project started in November of 2008 and lasted 6 months. The methodology for the study included site recruitment, control strategy development, automation system deployment and enhancements, and evaluation of the sites' participation in DR test events. LBNL subcontracted McKinstry and Akuacom for this project. McKinstry assisted with recruitment, site survey collection, strategy development, and overall participant and control vendor management. Akuacom established a new server and enhanced its operations to allow for scheduling winter morning day-of and day-ahead events. Each site signed a Memorandum of Agreement with SCL. SCL offered each site $3,000 for agreeing to participate in the study and an additional $1,000 for each event in which they participated. Each facility and their control vendor worked with LBNL and McKinstry to select and implement control strategies for DR and developed their automation based on the existing Internet connectivity and building control system. Once the DR strategies were programmed, McKinstry commissioned them before actual test events. McKinstry worked with LBNL to identify control points that could be archived at each facility. For each site, LBNL collected meter data and trend logs from the energy management and control system. The communication system allowed the sites to receive day-ahead as well as day-of DR test event signals. Measurement of DR was conducted using three different baseline models for estimating peak load reductions. One was the three-in-ten baseline, which is based on the site electricity consumption from 7 a.m. to 10 a.m. for the three days with the highest consumption of the previous ten business days. The second model, the LBNL outside air temperature (OAT) regression baseline model, is based on OAT data and site electricity consumption from the previous ten days, adjusted using weather regressions from the fifteen-minute electric load data during each DR test event for each site. A third baseline that simply averages the available load data was used for sites with less than 10 days of historical meter data. The evaluation also included surveying sites regarding any problems or issues that arose during the DR test events. Questions covered occupant comfort, control issues, and other potential problems.
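
    The three-in-ten baseline described above can be computed directly from interval meter data; the sketch below is a simplified rendering of that calculation (the data layout and interval length are assumptions, and no weather adjustment is applied).

```python
import numpy as np

def three_in_ten_savings(prior_days_kw, event_day_kw):
    """Estimate DR savings against a 3-in-10 baseline.

    prior_days_kw : list of 10 equal-length arrays, one per prior business
                    day, holding 7:00-10:00 a.m. demand readings (kW)
    event_day_kw  : array of the same morning window on the DR event day
    Returns the interval-by-interval load reduction (kW).
    """
    daily_means = [np.mean(day) for day in prior_days_kw]
    top3 = np.argsort(daily_means)[-3:]                  # 3 highest-use days
    baseline = np.mean([prior_days_kw[i] for i in top3], axis=0)
    return baseline - np.asarray(event_day_kw, dtype=float)
```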

  15. Automated cassette-to-cassette substrate handling system

    DOE Patents [OSTI]

    Kraus, Joseph Arthur; Boyer, Jeremy James; Mack, Joseph; DeChellis, Michael; Koo, Michael

    2014-03-18

    An automated cassette-to-cassette substrate handling system includes a cassette storage module for storing a plurality of substrates in cassettes before and after processing. A substrate carrier storage module stores a plurality of substrate carriers. A substrate carrier loading/unloading module loads substrates from the cassette storage module onto the plurality of substrate carriers and unloads substrates from the plurality of substrate carriers to the cassette storage module. A transport mechanism transports the plurality of substrates between the cassette storage module and the plurality of substrate carriers and transports the plurality of substrate carriers between the substrate carrier loading/unloading module and a processing chamber. A vision system recognizes recesses in the plurality of substrate carriers corresponding to empty substrate positions in the substrate carrier. A processor receives data from the vision system and instructs the transport mechanism to transport substrates to positions on the substrate carrier in response to the received data.

  16. Digital microfluidic hub for automated nucleic acid sample preparation.

    SciTech Connect (OSTI)

    He, Jim; Bartsch, Michael S.; Patel, Kamlesh D.; Kittlaus, Eric A.; Remillared, Erin M.; Pezzola, Genevieve L.; Renzi, Ronald F.; Kim, Hanyoup

    2010-07-01

    We have designed, fabricated, and characterized a digital microfluidic (DMF) platform to function as a central hub for interfacing multiple lab-on-a-chip sample processing modules towards automating the preparation of clinically-derived DNA samples for ultrahigh throughput sequencing (UHTS). The platform enables plug-and-play installation of a two-plate DMF device with consistent spacing, offers flexible connectivity for transferring samples between modules, and uses an intuitive programmable interface to control droplet/electrode actuations. Additionally, the hub platform uses transparent indium-tin oxide (ITO) electrodes to allow complete top and bottom optical access to the droplets on the DMF array, providing additional flexibility for various detection schemes.

  17. Measurement and evaluation techniques for automated demand response demonstration

    SciTech Connect (OSTI)

    Motegi, Naoya; Piette, Mary Ann; Watson, David S.; Sezgen, Osman; ten Hope, Laurie

    2004-08-01

    The recent electricity crisis in California and elsewhere has prompted new research to evaluate demand response strategies in large facilities. This paper describes an evaluation of fully automated demand response technologies (Auto-DR) in five large facilities. Auto-DR does not involve human intervention, but is initiated at a facility through receipt of an external communications signal. This paper summarizes the measurement and evaluation of the performance of demand response technologies and strategies in five large facilities. All the sites have data trending systems such as energy management and control systems (EMCS) and/or energy information systems (EIS). Additional sub-metering was applied where necessary to evaluate the facility's demand response performance. This paper reviews the control responses during the test period, and analyzes demand savings achieved at each site. Occupant comfort issues are investigated where data are available. This paper discusses methods to estimate demand savings and results from demand response strategies at five large facilities.

  18. Automated Pre-processing for NMR Assignments with Reduced Tedium

    Energy Science and Technology Software Center (OSTI)

    2004-05-11

    An important rate-limiting step in the resonance assignment process is accurate identification of resonance peaks in NMR spectra. NMR spectra are noisy. Hence, automatic peak-picking programs must navigate between the Scylla of reliable but incomplete picking and the Charybdis of noisy but complete picking. Each of these extremes complicates the assignment process: incomplete peak-picking results in the loss of essential connectivities, while noisy picking conceals the true connectivities under a combinatorial explosion of false positives. Intermediate processing can simplify the assignment process by preferentially removing false peaks from noisy peak lists. This is accomplished by requiring consensus between multiple NMR experiments, exploiting a priori information about NMR spectra, and drawing on empirical statistical distributions of chemical shifts extracted from the BioMagResBank. Experienced NMR practitioners currently apply many of these techniques "by hand", which is tedious and may appear arbitrary to the novice. To increase efficiency, we have created a systematic and automated approach to this process, known as APART. Automated pre-processing has three main advantages: reduced tedium, standardization, and pedagogy. In the hands of experienced spectroscopists, the main advantage is reduced tedium (a rapid increase in the ratio of true peaks to false peaks with minimal effort). When a project is passed from hand to hand, the main advantage is standardization. APART automatically documents the peak filtering process by archiving its original recommendations, the accompanying justifications, and whether a user accepted or overrode a given filtering recommendation. In the hands of a novice, this tool can reduce the stumbling block of learning to differentiate between real peaks and noise by providing real-time examples of how such decisions are made.
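
    One of the ideas above, requiring consensus between multiple NMR experiments, can be illustrated in a few lines: a peak from one experiment is kept only if a peak at a similar chemical shift appears in a second experiment. This is a simplified stand-in, not the APART implementation; the tolerance and peak lists are hypothetical.

```python
def consensus_filter(peaks_a, peaks_b, tol_ppm=0.05):
    """Keep peaks from experiment A that have a match (within tol_ppm)
    in experiment B, suppressing unmatched (likely noise) peaks.

    peaks_a, peaks_b : lists of 1-D chemical shifts in ppm
    """
    return [shift for shift in peaks_a
            if any(abs(shift - other) <= tol_ppm for other in peaks_b)]

# 7.52 ppm has no counterpart in the second list, so it is dropped.
print(consensus_filter([8.10, 7.52, 3.99], [8.12, 4.01, 1.20]))  # [8.1, 3.99]
```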

  19. Role of Standard Demand Response Signals for Advanced Automated Aggregation

    SciTech Connect (OSTI)

    Lawrence Berkeley National Laboratory; Kiliccote, Sila

    2011-11-18

    Emerging standards such as OpenADR enable Demand Response (DR) Resources to interact directly with Utilities and Independent System Operators, allowing their facility automation equipment to respond to a variety of DR signals ranging from day-ahead to real-time ancillary services. In addition, there are Aggregators in today's markets who are capable of bringing together collections of aggregated DR assets and selling them to the grid as a single resource. However, in most cases these aggregated resources are not automated, and when they are, they typically use proprietary technologies. There is a need for a framework for dealing with aggregated resources that supports the following requirements: (1) allow demand-side resources to participate in multiple DR markets ranging from wholesale ancillary services to retail tariffs without being completely committed to a single entity such as an Aggregator; (2) allow aggregated groups of demand-side resources to be formed in an ad hoc fashion to address specific grid-side issues and support the optimization of the collective response of an aggregated group along a number of different dimensions, which is important in order to tailor the aggregated performance envelope to the needs of the grid; and (3) allow aggregated groups to be formed in a hierarchical fashion so that each group can participate in a variety of markets from wholesale ancillary services to distribution-level retail tariffs. This paper explores the issues of aggregated groups of DR resources as described above, especially within the context of emerging smart grid standards and the role they will play in both the management and interaction of various grid-side entities with those resources.

  20. Automated Controlled-Potential Coulometer for the IAEA

    SciTech Connect (OSTI)

    Cordaro, J.V.; Holland, M.K.; Fields, T.

    1998-01-29

    An automated controlled-potential coulometer has been developed at the Savannah River Site (SRS) for the determination of plutonium for use at the International Atomic Energy Agency's (IAEA) Safeguards Analytical Laboratory in Siebersdorf, Austria. The system is functionally the same as earlier systems built for use at the Savannah River Site's Analytical Laboratory. All electronic circuits and printed circuit boards have been upgraded with state-of-the-art components. A higher-amperage potentiostat with improved control stability has been developed. The system achieves electronic calibration accuracy and linearity of better than 0.01 percent, and a precision and accuracy of better than 0.1 percent have been demonstrated. This coulometer features electrical calibration of the integration system, electrolysis current background corrections, and control-potential adjustment capabilities. These capabilities allow application of the system to plutonium measurements without chemical standards, achieving traceability to the international measurement system through electrical standards and Faraday's constant. The chemist is provided with the capability to perform measurements without depending upon chemical standards, which is a significant advantage for applications such as characterization of primary and secondary standards. Additional benefits include reduced operating costs to procure, prepare, and measure calibration standards and a corresponding decrease in radioactive waste generation. The design and documentation of the automated instrument are provided herein. Each individual module's operation, wiring, layout, and alignment are described. Interconnection of the modules and system calibration are discussed. A complete set of prints and a list of associated parts are included.
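
    Traceability through Faraday's constant rests on a simple relationship between integrated electrolysis charge and the amount of analyte. The sketch below shows that arithmetic for a one-electron couple; the isotope mass and charge value are illustrative, and the report's actual background-correction and calibration steps are not reproduced.

```python
FARADAY = 96485.332   # C/mol
M_PU239 = 239.05      # g/mol, illustrative isotopic molar mass

def plutonium_mass_mg(net_charge_c, n_electrons=1, molar_mass=M_PU239):
    """Mass of plutonium (mg) from the net integrated electrolysis charge,
    via Faraday's law: m = Q * M / (n * F). Assumes a one-electron
    Pu(III)/Pu(IV) couple (an illustrative assumption)."""
    moles = net_charge_c / (n_electrons * FARADAY)
    return moles * molar_mass * 1.0e3

# 4.0 C of net charge corresponds to roughly 9.9 mg of plutonium.
print(f"{plutonium_mass_mg(4.0):.2f} mg")
```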

  1. FEMP Presents Its Newest On-Demand eTraining Course on Building Automation Systems

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    November 19, 2013 - 12:00am. The U.S. Department of Energy (DOE) Federal Energy Management Program (FEMP) has launched its latest eTraining course, Building Automation Systems for Existing Federal Facilities, for no-cost, on-demand access. In this course, FEMP Technical Lead Brad Gustafson offers

  2. INITIATORS AND TRIGGERING CONDITIONS FOR ADAPTIVE AUTOMATION IN ADVANCED SMALL MODULAR REACTORS

    SciTech Connect (OSTI)

    Katya L Le Blanc; Johanna h Oxstrand

    2014-04-01

    It is anticipated that Advanced Small Modular Reactors (AdvSMRs) will employ high degrees of automation. High levels of automation can enhance system performance, but often at the cost of reduced human performance. Automation can lead to human out-of-the-loop issues, unbalanced workload, complacency, and other problems if it is not designed properly. Researchers have proposed adaptive automation (defined as dynamic or flexible allocation of functions) as a way to get the benefits of higher levels of automation without the human performance costs. Adaptive automation has the potential to balance operator workload and enhance operator situation awareness by allocating functions to the operators in a way that is sensitive to overall workload and capabilities at the time of operation. However, there are still a number of questions regarding how to effectively design adaptive automation to achieve that potential. One of those questions is related to how to initiate (or trigger) a shift in automation in order to provide maximal sensitivity to operator needs without introducing undesirable consequences (such as unpredictable mode changes). Several triggering mechanisms for shifts in adaptive automation have been proposed, including operator-initiated, critical-event, performance-based, physiological-measurement, model-based, and hybrid methods. As part of a larger project to develop design guidance for human-automation collaboration in AdvSMRs, researchers at Idaho National Laboratory have investigated the effectiveness and applicability of each of these triggering mechanisms in the context of AdvSMRs. Researchers reviewed the empirical literature on adaptive automation and assessed each triggering mechanism based on the human-system performance consequences of employing that mechanism. Researchers also assessed the practicality and feasibility of using the mechanism in the context of an AdvSMR control room. Results indicate that there are tradeoffs associated with each mechanism, but that some are more applicable to the AdvSMR domain. The two mechanisms that consistently improve performance in laboratory studies are operator-initiated adaptive automation based on hierarchical task delegation and the electroencephalogram (EEG)-based measure of engagement. Current EEG methods are intrusive and require intensive analysis; therefore, they are not recommended for AdvSMR control rooms at this time. Researchers also discuss limitations in the existing empirical literature and make recommendations for further research.

  3. A DISTRIBUTED INTELLIGENT AUTOMATED DEMAND RESPONSE BUILDING MANAGEMENT SYSTEM

    SciTech Connect (OSTI)

    Auslander, David; Culler, David; Wright, Paul; Lu, Yan; Piette, Mary

    2013-12-30

    The goal of the 2.5-year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce the peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as in typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (OpenADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (November-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated at 1924 MWh. The maximum peak load during the study period was 1175 kW. Several new tools facilitated this work, such as the Smart Energy Box, the distributed load controller or Energy Information Gateway, the web-based DR controller (dubbed the Central Load-Shed Coordinator or CLSC), and the Demand Response Capacity Assessment & Operation Assistance Tool (DRCAOT). In addition, an innovative data aggregator called sMAP (simple Measurement and Actuation Profile) allowed data from different sources to be collected in a compact form and facilitated detailed analysis of the building systems' operation. A smart phone application (RAP, or Rapid Audit Protocol) facilitated an inventory of the building's plug loads. Carbon dioxide sensors located in conference rooms and classrooms allowed demand-controlled ventilation. The extensive submetering and nimble access to these data provided great insight into the details of the building operation as well as quick diagnostics and analyses of tests. For example, students discovered a short-cycling chiller, a stuck damper, and a leaking cooling coil in the first field tests. For our final field tests, we were able to see how each zone was affected by the DR strategies (e.g., the offices on the 7th floor grew very warm quickly) and fine-tune the strategies accordingly.

  4. Harnessing Vehicle Automation for Public Mobility -- An Overview of Ongoing Efforts

    SciTech Connect (OSTI)

    Young, Stanley E.

    2015-11-05

    This presentation looks at efforts to harness automated vehicle technology for public transport. The European CityMobil2 is the leading demonstration project, in which automated shuttles were, or are planned to be, demonstrated in several cities and regions. The presentation provides a brief overview of the demonstrations at Oristano, Italy (July 2014); La Rochelle, France (December 2014); Lausanne, Switzerland (April 2015); Vantaa, Finland (July 2015); and Trikala, Greece (September 2015). In addition to technology exposition, the objectives included generating a legal framework for operation in each location and gauging the reaction of the public to unmanned shuttles, both of which were successfully achieved. Several such demonstrations are planned throughout the world, including efforts in North America in conjunction with the GoMentum Station in California. These early demonstrations with low-speed automated shuttles provide a glimpse of what is possible with a fully automated fleet of driverless vehicles providing a public transit service.

  5. Pilon: Automated Assembly Improvement Software (Seventh Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting 2012)

    ScienceCinema (OSTI)

    Walker, Bruce (Broad Institute)

    2013-02-11

    Bruce Walker on "Pilon: Automated Assembly Improvement Software" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  6. Automation of a high-speed imaging setup for differential viscosity measurements

    SciTech Connect (OSTI)

    Hurth, C.; Duane, B.; Whitfield, D.; Smith, S.; Nordquist, A.; Zenhausern, F.

    2013-12-28

    We present the automation of a setup previously used to assess the viscosity of pleural effusion samples and discriminate between transudates and exudates, an important first step in clinical diagnostics. The presented automation includes the design, testing, and characterization of a vacuum-actuated loading station that handles the 2 mm glass spheres used as sensors, as well as the engineering of electronic printed circuit boards (PCBs) incorporating a microcontroller and their synchronization with a commercial high-speed camera operating at 10,000 fps. The present work therefore focuses on the instrumentation-related automation efforts, as the general method and clinical application have been reported earlier [Hurth et al., J. Appl. Phys. 110, 034701 (2011)]. In addition, we validate the performance of the automated setup with the calibration for viscosity measurements using water/glycerol standard solutions and the determination of the viscosity of an "unknown" solution of hydroxyethyl cellulose.
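
    The falling-sphere measurement behind this setup can be related to viscosity through Stokes' law when the Reynolds number is small; the sketch below shows that relationship with illustrative glycerol-like numbers (no wall or inertia corrections, and not the calibration procedure used in the paper).

```python
G = 9.81  # m/s^2

def stokes_viscosity(radius_m, rho_sphere, rho_fluid, v_terminal):
    """Dynamic viscosity (Pa*s) from a sphere's terminal settling velocity:
    eta = 2 r^2 (rho_s - rho_f) g / (9 v). Valid only at low Reynolds number."""
    return 2.0 * radius_m**2 * (rho_sphere - rho_fluid) * G / (9.0 * v_terminal)

# A 2 mm glass sphere (r = 1 mm, ~2500 kg/m^3) settling at 2 mm/s in a
# glycerol-like fluid (~1260 kg/m^3) implies a viscosity of about 1.35 Pa*s.
print(f"{stokes_viscosity(1.0e-3, 2500.0, 1260.0, 2.0e-3):.2f} Pa*s")
```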

  7. Disposable and removable nucleic acid extraction and purification cartridges for automated flow-through systems

    DOE Patents [OSTI]

    Regan, John Frederick

    2014-09-09

    Removable cartridges are used on automated flow-through systems for the purpose of extracting and purifying genetic material from complex matrices. Different types of cartridges are paired with specific automated protocols to concentrate, extract, and purify pathogenic or human genetic material. Their flow-through nature allows large quantities of sample to be processed. Matrices may be filtered using size exclusion and/or affinity filters to concentrate the pathogen of interest. Lysed material is ultimately passed through a filter to remove the insoluble material before the soluble genetic material is delivered past a silica-like membrane that binds the genetic material, where it is washed, dried, and eluted. Cartridges are inserted into the housing areas of flow-through automated instruments, which are equipped with sensors to ensure proper placement and usage of the cartridges. Properly inserted cartridges create fluid- and air-tight seals with the flow lines of an automated instrument.

  8. Scientific Data Management Integrated Software Infrastructure Center (SDM/ISIC): Scientific Process Automation (SPA), FINAL REPORT

    SciTech Connect (OSTI)

    Bertram Ludaescher; Ilkay Altintas

    2012-07-03

    This is the final report from SDSC and UC Davis on DE-FC02-01ER25486, Scientific Data Management Integrated Software Infrastructure Center (SDM/ISIC): Scientific Process Automation (SPA).

  9. A Semi-Automated Functional Test Data Analysis Tool

    SciTech Connect (OSTI)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
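
    The core of the analysis described above is a comparison of measured performance against expected performance with a tolerance per test point; the sketch below shows that logic in schematic form (point names, values, and tolerances are illustrative assumptions, not the tool's actual rules).

```python
def evaluate_test(measured, expected, tolerances):
    """Return the test points whose measured value deviates from the expected
    value by more than the allowed tolerance (an empty dict means 'pass').

    measured, expected, tolerances : dicts keyed by point name
    """
    failures = {}
    for point, target in expected.items():
        deviation = abs(measured[point] - target)
        if deviation > tolerances[point]:
            failures[point] = round(deviation, 2)
    return failures

# A supply-air temperature 3.4 F above its expected value exceeds the 2 F
# tolerance and is flagged as a likely failure cause for follow-up.
print(evaluate_test({"supply_air_temp_F": 58.4},
                    {"supply_air_temp_F": 55.0},
                    {"supply_air_temp_F": 2.0}))
```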

  10. Automated management of radioactive sources in Saudi Arabia

    SciTech Connect (OSTI)

    Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.; Tuwaili, W. R.

    2014-09-30

    For usage of radioactive substances, any facility has to register and obtain a license from the relevant authority of the country in which it operates. In the Kingdom of Saudi Arabia (KSA), the authority for managing radioactive sources and licensing organizations for their usage is the National Center of Radiation Protection (NCRP). This paper describes the system that automates the registration and licensing process of the National Center of Radiation Protection. To provide 24/7 access to all NCRP customers, the system is developed as a web-based application that allows users to register online, request and renew licenses, check request status, and view historical data and reports; these and other features are provided as electronic services accessible via the Internet. The system is also designed to streamline and optimize NCRP's internal operations, besides providing ease of access to its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to the manual payment option, the system is integrated with SADAD (an online payment system), which avoids the lengthy and cumbersome procedures associated with the manual payment mechanism. Using the SADAD payment option, license fees can be paid through the Internet, an ATM, or a branch of any designated bank; payments are instantly reported to NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.

  11. Intelligent Control in Automation Based on Wireless Traffic Analysis

    SciTech Connect (OSTI)

    Kurt Derr; Milos Manic

    2007-09-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies present more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical in both maintaining the integrity of computer systems and increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with a comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control-type applications, as well as making its use more secure.

  12. Intelligent Control in Automation Based on Wireless Traffic Analysis

    SciTech Connect (OSTI)

    Kurt Derr; Milos Manic

    2007-08-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies pose more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both in maintaining the integrity of computer systems and in increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored with this technique. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with a comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control-type applications, as well as making its use more secure.

  13. Automated video screening for unattended background monitoring in dynamic environments.

    SciTech Connect (OSTI)

    Carlson, Jeffrey J.

    2004-03-01

    This report addresses the development of automated video-screening technology to assist security forces in protecting our homeland against terrorist threats. A threat of specific interest to this project is the covert placement and subsequent remote detonation of bombs (e.g., briefcase bombs) inside crowded public facilities. Unlike existing video motion detection systems, the video-screening technology described in this report is capable of detecting changes in the static background of an otherwise dynamic environment - environments where motion and human activities are persistent. Our goal was to quickly detect changes in the background - even under conditions when the background is visible to the camera less than 5% of the time. Instead of subtracting the background to detect movement or changes in a scene, we subtracted the dynamic scene variations to produce an estimate of the static background. Subsequent comparisons of static background estimates are used to detect changes in the background. Detected changes can be used to alert security forces to the presence and location of potential threats. The results of this research are summarized in two MS PowerPoint presentations included with this report.
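
    The idea of suppressing dynamic scene variations to recover the static background can be approximated with a per-pixel temporal median over many frames. The sketch below, using NumPy, is an illustrative simplification of that idea, not the algorithm developed in the report; the threshold value is an assumption:

```python
import numpy as np

def estimate_static_background(frames):
    """Estimate the static background as the per-pixel temporal median of a
    stack of grayscale frames (shape: n_frames x height x width)."""
    return np.median(np.asarray(frames), axis=0)

def detect_background_change(bg_old, bg_new, threshold=25):
    """Flag pixels whose static background estimate changed significantly
    between two observation windows (threshold is an illustrative value)."""
    diff = np.abs(bg_new.astype(float) - bg_old.astype(float))
    return diff > threshold

# Usage idea: compare background estimates from two successive frame windows.
# window1, window2 = frames_from_time_t0, frames_from_time_t1  (uint8 arrays)
# changed = detect_background_change(estimate_static_background(window1),
#                                    estimate_static_background(window2))
```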

  14. Defect Prevention and Detection in Software for Automated Test Equipment

    SciTech Connect (OSTI)

    E. Bean

    2006-11-30

    Developing software for automated test equipment can be tedious and monotonous, making it just as error-prone as other software. Active defect prevention and detection are also important for test applications. Incomplete or unclear requirements, a cryptic syntax used for some test applications (especially script-based test sets), variability in syntax or structure, and changing requirements are among the problems encountered in one tester. Such problems are common to all software but can be particularly problematic in test equipment software intended to test another product. Each of these issues increases the probability of error injection during test application development. This report describes a test application development tool designed to address these issues and others for a particular piece of test equipment. By addressing these problems in the development environment, the tool has powerful built-in defect prevention and detection capabilities. Regular expressions are widely used in the development tool as a means of formally defining test equipment requirements for the test application and verifying conformance to those requirements. A novel means of using regular expressions to perform range checking was developed. A reduction in rework and increased productivity are the results. These capabilities are described along with lessons learned and their applicability to other test equipment software. The test application development tool, or application builder, is known as the PT3800 AM Creation, Revision and Archiving Tool (PACRAT).
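
    One way to perform range checking with a regular expression, in the spirit the report describes, is to use a pattern that matches only integers within the allowed range, so the same conformance check can live alongside other regex-defined requirements. The helper below is a hypothetical illustration, not PACRAT's implementation:

```python
import re

# Hypothetical example: verify that a test-application parameter is a decimal
# integer in the range 0-255 using only a regular expression.
RANGE_0_255 = re.compile(r"^(25[0-5]|2[0-4][0-9]|1[0-9]{2}|[1-9]?[0-9])$")

def in_range_0_255(token: str) -> bool:
    """Return True if the token is a decimal integer between 0 and 255."""
    return RANGE_0_255.match(token) is not None

assert in_range_0_255("0")
assert in_range_0_255("255")
assert not in_range_0_255("256")
assert not in_range_0_255("07a")
```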

  15. Automated Tracing of Horizontal Neuron Processes During Retinal Development

    SciTech Connect (OSTI)

    Kerekes, Ryan A [ORNL; Martins, Rodrigo [St. Jude Children's Research Hospital; Dyer, Michael A [ORNL; Gleason, Shaun Scott [ORNL; Karakaya, Mahmut [ORNL; Davis, Denise [St. Jude Children's Research Hospital

    2011-01-01

    In the developing mammalian retina, horizontal neurons undergo a dramatic reorganization of their processes shortly after they migrate to their appropriate laminar position. This is an important process because it is now understood that the apical processes are important for establishing the regular mosaic of horizontal cells in the retina, and proper reorganization during lamination is required for synaptogenesis with photoreceptors and bipolar neurons. However, this process is difficult to study because the analysis of horizontal neuron anatomy is labor intensive and time-consuming. In this paper, we present a computational method for automatically tracing the three-dimensional (3-D) dendritic structure of horizontal retinal neurons in two-photon laser scanning microscope (TPLSM) imagery. Our method is based on 3-D skeletonization and is thus able to preserve the complex structure of the dendritic arbor of these cells. We demonstrate the effectiveness of our approach by comparing our tracing results against two sets of semi-automated traces over a set of 10 horizontal neurons ranging in age from P1 to P5. We observe an average agreement level of 81% between our automated trace and the manual traces. This automated method will serve as an important starting point for further refinement and optimization.

  16. Semi-Automated Discovery of Application Session Structure

    SciTech Connect (OSTI)

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
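
    A toy version of the session-extraction idea groups a host's connections into sessions whenever the gap between successive connection start times exceeds a threshold. The fixed cutoff and log format below are assumptions for illustration; the paper's actual method relies on Poisson-process statistics rather than a hard threshold:

```python
from collections import defaultdict

def group_sessions(connections, max_gap=30.0):
    """Group (timestamp, src_host, service) records into per-host sessions,
    splitting whenever the idle gap exceeds max_gap seconds. This fixed
    threshold is a simplification of the Poisson-based test in the paper."""
    by_host = defaultdict(list)
    for ts, host, service in sorted(connections):
        by_host[host].append((ts, service))

    sessions = []
    for host, conns in by_host.items():
        current = [conns[0]]
        for prev, cur in zip(conns, conns[1:]):
            if cur[0] - prev[0] > max_gap:
                sessions.append((host, current))
                current = []
            current.append(cur)
        sessions.append((host, current))
    return sessions

# Example: three quick HTTP connections form one session; a later SMTP
# connection starts a new one.
log = [(0.0, "10.0.0.5", "http"), (1.2, "10.0.0.5", "http"),
       (2.0, "10.0.0.5", "http"), (120.0, "10.0.0.5", "smtp")]
for host, conns in group_sessions(log):
    print(host, [svc for _, svc in conns])
```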

  17. Operations of the Automated Radioxenon Sampler/Analyzer - ARSA

    SciTech Connect (OSTI)

    Hayes, James C.; Abel, Keith H.; Bowyer, Ted W.; Heimbigner, Tom R.; Panisko, Mark E.; Reeder, Paul L.; McIntyre, Justin I.; Thompson, Robert C.; Todd, Lindsay C.; Warner, Ray A.

    1999-09-01

    The Automated Radioxenon Sampler/Analyzer (ARSA), designed and built by Pacific Northwest National Laboratory (PNNL) for the Department of Energy, has exceeded the measurement requirements for noble gas measurement systems established by the Comprehensive Nuclear-Test-Ban Treaty. Two units were built and extensively tested: one at PNNL and a second sent to DME Corp. of Florida. Both systems have successfully demonstrated stable xenon yields greater than 1.5 cm3 for an eight-hour collection period, corresponding to minimum detectable concentrations for 133Xe on the order of 0.1 mBq/m3 three times per day. High, stable xenon yields are critical to obtaining these low minimum detectable concentrations. A history of the testing and results that led to the high xenon yields of the ARSA system is presented. A compilation of field tests, laboratory tests, and baseline tests that led to cost reduction, power savings, and size reduction of the ARSA is also discussed. Lastly, the types of data generated by the ARSA that are of interest to data center personnel are discussed.
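
    Minimum detectable concentration figures of this kind are conventionally derived from a Currie-style detection limit. The sketch below is a simplified version of that calculation, with decay corrections for sampling, processing, and counting deliberately omitted, and the input numbers are illustrative only, not ARSA performance data:

```python
import math

def minimum_detectable_concentration(background_counts, efficiency,
                                     branching_ratio, count_time_s,
                                     air_volume_m3):
    """Simplified Currie-style minimum detectable concentration (Bq/m3).

    background_counts : expected background counts in the region of interest
    efficiency        : absolute detection efficiency (0-1)
    branching_ratio   : emission probability for the counted radiation (0-1)
    count_time_s      : counting time in seconds
    air_volume_m3     : volume of air sampled, in cubic meters

    Decay corrections that an operational system must apply are omitted here.
    """
    detectable_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return detectable_counts / (efficiency * branching_ratio *
                                count_time_s * air_volume_m3)

# Illustrative numbers only:
mdc = minimum_detectable_concentration(background_counts=50, efficiency=0.7,
                                       branching_ratio=0.37,
                                       count_time_s=6 * 3600, air_volume_m3=10)
print(f"MDC ~ {mdc * 1000:.2f} mBq/m3")
```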

  18. PaR-PaR Laboratory Automation Platform

    SciTech Connect (OSTI)

    Linshiz, G; Stawski, N; Poust, S; Bi, CH; Keasling, JD; Hilson, NJ

    2013-05-01

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  19. Automated genome mining of ribosomal peptide natural products

    SciTech Connect (OSTI)

    Mohimani, Hosein; Kersten, Roland; Liu, Wei; Wang, Mingxun; Purvine, Samuel O.; Wu, Si; Brewer, Heather M.; Pasa-Tolic, Ljiljana; Bandeira, Nuno; Moore, Bradley S.; Pevzner, Pavel A.; Dorrestein, Pieter C.

    2014-07-31

    Ribosomally synthesized and posttranslationally modified peptides (RiPPs), especially from microbial sources, are a large group of bioactive natural products that are a promising source of new (bio)chemistry and bioactivity (1). In light of exponentially increasing microbial genome databases and improved mass spectrometry (MS)-based metabolomic platforms, there is a need for computational tools that connect natural product genotypes predicted from microbial genome sequences with their corresponding chemotypes from metabolomic datasets. Here, we introduce RiPPquest, a tandem mass spectrometry database search tool for identification of microbial RiPPs, and apply it to lanthipeptide discovery. RiPPquest uses genomics to limit the search space to the vicinity of RiPP biosynthetic genes and proteomics to analyze extensive peptide modifications and compute p-values of peptide-spectrum matches (PSMs). We highlight RiPPquest by connecting multiple RiPPs from extracts of Streptomyces to their gene clusters and by the discovery of a new class III lanthipeptide, informatipeptin, from Streptomyces viridochromogenes DSM 40736, the first natural product to be identified in an automated fashion by genome mining. The presented tool is available at cyclo.ucsd.edu.

  20. Automated-In-Motion Vehicle Evaluation Environment (AIMVEE)

    Energy Science and Technology Software Center (OSTI)

    2006-05-04

    The AIMVEE/WIM system electronically retrieves deployment information, identifies vehicles automatically, and determines total weight, individual wheel weights, individual axle weights, axle spacing, and center-of-balance for any wheeled vehicle in motion. The AIMVEE/WIM system can also perform these functions statically for both wheeled vehicles and cargo. The AIMVEE/WIM system incorporates digital images and applies cubing algorithms to determine the length, width, and height (cubic dimensions) of both vehicle and cargo. Once all this information is stored, it electronically links to data collection and dissemination systems to provide "actual" weight and measurement information for planning, deployment, and in-transit visibility. The Static Scale Conversion (SSC) system is a unique enhancement to the AIMVEE/WIM system. It enables an SSC to weigh and measure vehicles and cargo dynamically (i.e., as they pass over the large scale) and is included in the AIMVEE computer code base. The material to be copyrighted is the Automated-In-Motion Vehicle Evaluation Environment (AIMVEE)/Weigh-In-Motion User Training and Testing material. It includes instructional material on the set-up, operation, and tear-down of the AIMVEE/WIM system. It also includes a final exam associated with the training.

  1. Pilot-plant automation for catalytic hydrotreating of heavy residua

    SciTech Connect (OSTI)

    Akimoto, O.; Iwamoto, Y.; Kodama, S.; Takeuchi, C.

    1983-08-01

    The research and development center of Chiyoda Chemical Engineering and Construction Co. has been investigating the catalytic hydrotreating of heavy residua via pilot plant technology. Chiyoda's 52 microreactors, bench-scale test units, and pilot plants are each used depending on the purpose of the process development for heavy oil upgrading. The microreactors are effective for catalyst screening. Heavier fractions such as asphaltene and sludge materials often disturbed steady-state operation. Many unique devices for the test units and improvements to operating procedures make extended operation easy as well as increasing reliability. The computerized data acquisition and data filing systems minimize the work not only for operators but for all research personnel. Currently, about 40 pilot plant units are continuously running while the others are in preparation. Fully automated operation requires only three operators for data checking at night. In the daytime, seven operators take care of feed supply, product removal, and condition changes. For start-up and shut-down, one operator can handle three microreactors, but only one bench-scale unit or pilot plant. Planning is underway for an improved start-up system for the pilot plants using personal computers. This system automatically sets the feed rate and raises the reactor temperature. (JMT)

  2. Pilot-plant automation for catalytic hydrotreating of heavy residua

    SciTech Connect (OSTI)

    Akimoto, O.; Iwamoto, Y.; Kodama, S.; Takeuchi, C.

    1983-08-01

    Chiyoda's 52 microreactors, bench-scale test units, and pilot plants are each used depending on the purpose of the process development for heavy oil upgrading. The microreactors are effective for catalyst screening. Heavier fractions such as asphaltene and sludge materials often disturbed steady-state operation. Many unique devices for the test units and improvements to operating procedures make extended operation easy as well as increasing reliability. The computerized data acquisition and data filing systems minimize the work not only for operators but for all research personnel. Currently, about 40 pilot plant units are continuously running while the others are in preparation. Fully automated operation requires only three operators for data checking at night. In the daytime, seven operators take care of feed supply, product removal, and condition changes. For start-up and shut-down, one operator can handle three microreactors, but only one bench-scale unit or pilot plant. Planning is underway for an improved start-up system for the pilot plants using personal computers. This system automatically sets the feed rate and raises the reactor temperature.

  3. Open Automated Demand Response Dynamic Pricing Technologies and Demonstration

    SciTech Connect (OSTI)

    Ghatikar, Girish; Mathieu, Johanna L.; Piette, Mary Ann; Koch, Ed; Hennage, Dan

    2010-08-02

    This study examines the use of the OpenADR communications specification, related data models, technologies, and strategies to send dynamic prices (e.g., real-time prices and peak prices) and Time of Use (TOU) rates to commercial and industrial electricity customers. OpenADR v1.0 is a Web services-based, flexible, open information model that has been used in California utilities' commercial automated demand response programs since 2007. We find that data models can be used to send real-time prices. These same data models can also be used to support peak pricing and TOU rates. We present a data model that can accommodate all three types of rates. For demonstration purposes, the data models were generated from the California Independent System Operator's real-time wholesale market prices and a California utility's dynamic prices and TOU rates. Customers can respond to dynamic prices either by using the actual prices or by mapping prices into "operation modes," which can act as inputs to control systems. We present several different methods for mapping actual prices. Some of these methods were implemented in demonstration projects. The study results demonstrate that OpenADR allows interoperability with existing and future systems and technologies and can be used for dynamic pricing activities within the Smart Grid.
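
    One of the simplest mapping strategies is to bin the published price into a small set of operation modes that a building control system can act on. The threshold values below are illustrative assumptions, not the tariff levels or mappings used in the study:

```python
# Illustrative mapping of a dynamic electricity price into discrete operation
# modes for a building control system. Thresholds are assumptions for
# demonstration only.
PRICE_MODE_THRESHOLDS = [
    (0.10, "normal"),        # below $0.10/kWh: no demand response action
    (0.20, "moderate"),      # $0.10-0.20/kWh: moderate shed (e.g., raise setpoints)
    (float("inf"), "high"),  # above $0.20/kWh: deep shed / pre-programmed DR strategy
]

def price_to_mode(price_per_kwh: float) -> str:
    """Return the operation mode corresponding to a price in $/kWh."""
    for threshold, mode in PRICE_MODE_THRESHOLDS:
        if price_per_kwh < threshold:
            return mode
    return "high"

assert price_to_mode(0.07) == "normal"
assert price_to_mode(0.15) == "moderate"
assert price_to_mode(0.45) == "high"
```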

  4. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    SciTech Connect (OSTI)

    Bennett, Bonnie; Boddy, Mark; Doyle, Frank; Jamshidi, Mo; Ogunnaike, Tunde

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been a central focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  5. Small- and Medium-Size Building Automation and Control System Needs: Scoping Study

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Emerging Technologies Project for the 2013 Building Technologies Office's Program Peer Review (emrgtech05_brambley_040213.pdf).

  6. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    SciTech Connect (OSTI)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  7. A Standard Analysis Method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: Validation and performance

    SciTech Connect (OSTI)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-11-01

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a plug-and-play manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAM). A SAM for the automated determination of polychlorinated biphenyls (PCBs) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAM consists of the following SLMs: a four-channel Soxhlet extractor, a high-volume concentrator, a column clean-up module, a gas chromatograph, a PCB data-interpretation module, a robot, and a human-computer interface. The SAM is configured to meet the requirements specified in the US Environmental Protection Agency's (EPA) SW-846 methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAM will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed.

  8. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    SciTech Connect (OSTI)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-10-01

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a plug-and-play manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAM). A SAM for the automated determination of polychlorinated biphenyls (PCBs) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAM consists of the following SLMs: a four-channel Soxhlet extractor, a high-volume concentrator, a column clean-up module, a gas chromatograph, a PCB data-interpretation module, a robot, and a human-computer interface. The SAM is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAM will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed.

  9. Automated matching and segmentation of lymphoma on serial CT examinations

    SciTech Connect (OSTI)

    Yan Jiayong; Zhao Binsheng; Curran, Sean; Zelenetz, Andrew; Schwartz, Lawrence H.

    2007-01-15

    In patients with lymphoma, identification and quantification of the tumor extent on serial CT examinations is critical for assessing tumor response to therapy. In this paper, we present a computer method to automatically match and segment lymphomas in follow-up CT images. The method requires that target lymph nodes in baseline CT images be known. A fast, approximate alignment technique along the x, y, and axial directions is developed to provide a good initial condition for the subsequent fast free form deformation (FFD) registration of the baseline and the follow-up images. As a result of the registration, the deformed lymph node contours from the baseline images are used to automatically determine internal and external markers for the marker-controlled watershed segmentation performed in the follow-up images. We applied this automated registration and segmentation method retrospectively to 29 lymph nodes in 9 lymphoma patients treated in a clinical trial at our cancer center. A radiologist independently delineated all lymph nodes on all slices in the follow-up images and his manual contours served as the "gold standard" for evaluation of the method. Preliminary results showed that 26/29 (89.7%) lymph nodes were correctly matched; i.e., there was a geometrical overlap between the deformed lymph node from the baseline and its corresponding mass in the follow-up images. Of the matched 26 lymph nodes, 22 (84.6%) were successfully segmented; for these 22 lymph nodes, several metrics were calculated to quantify the method's performance. Among them, the average distance and the Hausdorff distance between the contours generated by the computer and those generated by the radiologist were 0.9 mm (stdev. 0.4 mm) and 3.9 mm (stdev. 2.1 mm), respectively.
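
    The marker-controlled watershed step can be illustrated with scikit-image, where the deformed baseline contour would supply the internal marker and the complement of a dilated ring around it the external marker. The code below is a generic sketch of that step under the assumption that scikit-image is available; the marker construction is a common choice, not necessarily the authors' exact recipe:

```python
import numpy as np
from skimage.filters import sobel
from skimage.morphology import binary_dilation, disk
from skimage.segmentation import watershed

def segment_node(follow_up_slice, deformed_baseline_mask):
    """Marker-controlled watershed segmentation of one CT slice.

    follow_up_slice        : 2-D array of CT intensities
    deformed_baseline_mask : boolean mask of the baseline lymph node contour
                             after registration to the follow-up image
    The internal marker is taken directly from the deformed contour and the
    external marker from outside a dilated ring around it; both choices are
    generic illustrations rather than the paper's exact method.
    """
    gradient = sobel(follow_up_slice.astype(float))      # edge-strength image
    internal = deformed_baseline_mask
    external = ~binary_dilation(deformed_baseline_mask, disk(15))
    markers = np.zeros(follow_up_slice.shape, dtype=np.int32)
    markers[external] = 1   # background label
    markers[internal] = 2   # lymph node label
    labels = watershed(gradient, markers)
    return labels == 2      # boolean mask of the segmented node
```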

  10. Grid collector: An event catalog with automated file management

    SciTech Connect (OSTI)

    Wu, Kesheng; Zhang, Wei-Ming; Sim, Alexander; Gu, Junmin; Shoshani, Arie

    2003-10-17

    High Energy Nuclear Physics (HENP) experiments such as STAR at BNL and ATLAS at CERN produce large amounts of data that are stored as files on mass storage systems in computer centers. In these files, the basic unit of data is an event. Analysis is typically performed on a selected set of events. The files containing these events have to be located, copied from mass storage systems to disks before analysis, and removed when no longer needed. These file management tasks are tedious and time consuming. Typically, all events contained in the files are read into memory before a selection is made. Since the time to read the events dominates the overall execution time, reading unwanted events needlessly increases the analysis time. The Grid Collector is a set of software modules that work together to address these two issues. It automates the file management tasks and provides "direct" access to the selected events for analyses. It is currently integrated with the STAR analysis framework. Users can select events based on tags, such as "production date between March 10 and 20, and the number of charged tracks > 100." The Grid Collector locates the files containing relevant events, transfers the files across the Grid if necessary, and delivers the events to the analysis code through the familiar iterators. There have been some research efforts to address the file management issues; the Grid Collector is unique in that it addresses the event access issue together with the file management issues. This makes it more useful to a large variety of users.
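
    The tag-based selection can be pictured as a predicate evaluated over an event catalog whose entries also record which file holds each event. The sketch below is only a conceptual illustration of that idea (entries, field names, and file names are made up, and a production catalog would index the tags rather than scan them linearly):

```python
from datetime import date

# Conceptual event catalog: each entry carries tag values plus the file that
# holds the event. All values are illustrative.
catalog = [
    {"event_id": 1, "production_date": date(2003, 3, 12),
     "n_charged_tracks": 140, "file": "st_physics_4045.daq"},
    {"event_id": 2, "production_date": date(2003, 3, 25),
     "n_charged_tracks": 80, "file": "st_physics_4052.daq"},
]

def select_events(catalog, start, end, min_tracks):
    """Select events produced in [start, end] with more than min_tracks charged tracks."""
    return [e for e in catalog
            if start <= e["production_date"] <= end
            and e["n_charged_tracks"] > min_tracks]

selected = select_events(catalog, date(2003, 3, 10), date(2003, 3, 20), 100)
files_to_fetch = sorted({e["file"] for e in selected})
print(selected, files_to_fetch)
```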

  11. Development of Automated Production Line Processes for Solar Brightfield Modules: Final Report, 1 June 2003-30 November 2007

    SciTech Connect (OSTI)

    Nowlan, M.

    2008-04-01

    Summary of progress by Spire Corporation under NREL's PV Manufacturing R&D Project to develop new automated systems for fabricating very large photovoltaic modules.

  12. Drivers' activities and information needs in an automated highway system. Working paper, August 1995-May 1996

    SciTech Connect (OSTI)

    Levitan, L.; Bloomfield, J.

    1996-10-01

    In most visions of the AHS--including that of the National Automated Highway System Consortium--it has been assumed that when a vehicle was under automated control, the driver would be allowed to engage in any of a variety of activities not related to driving (e.g., working, reading, sleeping). The objective of the first study reported here--one of the noncommuter studies--was to determine what drivers do when traveling under automated control, and whether the age and/or gender of the driver and/or the intrastring gap have an influence on those activities. One of the objectives of the commuter experiment--of relevance for this report--was to determine whether what drivers do when traveling under automated control changes as a function of experience with the AHS (i.e., across trials). As conceptualization of the AHS proceeds, the details of the interface between the driver and the in-vehicle system will become more important. One part of that interface will be information supplied by the AHS to the driver, perhaps about such things as traffic conditions ahead, predicted trip time to the driver's selected exit, and so on. To maximize the utility of that information, it is important to determine what it is that drivers would like to know when traveling under automated control. The objective of the third study reported here--the second of the five noncommuter experiments--was to provide a first investigation of that issue.

  13. Automation of MCDOR at NMT-3 Los Alamos National Laboratory. Final report

    SciTech Connect (OSTI)

    Shahinpoor, M.

    1997-01-01

    The automation of various parts of multiple-cycle direct oxide reduction (MCDOR) at LANL's NMT-3 was the goal of this research and development activity. In particular, the following goals were originally assigned to the author by the NMT-3 technical staff leaders (Greg Bird, Jim McNeese, Joel Williams): (1) Design and fabricate an automation set-up; (2) Step-wise automation is preferred; (3) Step 1 involves automatic metering and mixing of powders; and (4) Step 2, automatic transport of powder to the furnace location. The initial task assigned in May 1991 was to develop the appropriate design and order equipment and parts to automatically weigh powders. In fact, the work statement read: "Create an experimental automation set up in the ME Department at UNM to automatically weigh powders using an electronic balance. Further, design the set up such that the electronic balance is reprogrammable for specific weight set points. Thus, when a set point in weight is reached by means of a vibratory feeder feeding a container on the balance, the electronic balance will send an electronic signal out to switch off the vibratory feeder." The automation of the reduction of plutonium oxide to plutonium is described.

  14. Direct versus Facility Centric Load Control for Automated Demand Response

    SciTech Connect (OSTI)

    Koch, Ed; Piette, Mary Ann

    2009-11-06

    Direct load control (DLC) refers to the scenario where third-party entities outside the home or facility are responsible for deciding how and when specific customer loads will be controlled in response to Demand Response (DR) events on the electric grid. Examples of third parties responsible for performing DLC are Utilities, Independent System Operators (ISO), Aggregators, or third-party control companies. DLC can be contrasted with facility centric load control (FCLC), where the decisions for how loads are controlled are made entirely within the facility or enterprise control systems. In FCLC the facility owner has more freedom of choice in how to respond to DR events on the grid. Both approaches are in use today in the automation of DR, and both will continue to be used in future market segments including industrial, commercial, and residential facilities. This paper presents a framework that can be used to differentiate between DLC and FCLC based upon where decisions are made on how specific loads are controlled in response to DR events. This differentiation is then used to compare and contrast DLC and FCLC to identify the impact each has on: (1) Utility/ISO and third-party systems for managing demand response, (2) facility systems for implementing load control, (3) communications networks for interacting with the facility, and (4) facility operators and managers. Finally, a survey of some of the existing DR-related specifications and communications standards is given, along with their applicability to DLC or FCLC. In general, FCLC adds more cost and responsibility to the facilities, whereas DLC represents higher costs and complexity for the Utility/ISO. This difference is primarily due to where the DR logic is implemented and the consequences that creates. DLC may be more certain than FCLC because it is more predictable; however, as more loads have the capability to respond to DR signals, people may prefer to have their own control of end-use loads and FCLC systems. Research is needed to understand the predictability of FCLC, which is related to the perceived value of the DR from the facility manager or home owner's perspective.

  15. CRDIAC: Coupled Reactor Depletion Instrument with Automated Control

    SciTech Connect (OSTI)

    Steven K. Logan

    2012-08-01

    When modeling the behavior of a nuclear reactor over time, it is important to understand how the isotopes in the reactor will change, or transmute, over that time. This is especially important in the reactor fuel itself. Many nuclear physics modeling codes model how particles interact in the system but do not model this over time. Thus, another code is used in conjunction with the nuclear physics code to accomplish this. In our work, the Monte Carlo N-Particle (MCNP) codes and the Multi Reactor Transmutation Analysis Utility (MRTAU) were chosen. In this way, MCNP produces the reaction rates in the different isotopes present, and MRTAU uses cross sections generated from these reaction rates to determine how the mass of each isotope is lost or gained. Between these two codes, the information must be altered and edited for use. For this, a Python 2.7 script was developed to aid the user in getting the information into the correct forms. This newly developed methodology was called the Coupled Reactor Depletion Instrument with Automated Controls (CRDIAC). As is the case for any newly developed methodology for modeling physical phenomena, CRDIAC needed to be verified against similar methodology and validated against data taken from an experiment, in our case AFIP-3. AFIP-3 was a reduced-enrichment plate-type fuel tested in the ATR. We verified our methodology against the MCNP Coupled with ORIGEN2 (MCWO) method and validated our work against the Post Irradiation Examination (PIE) data. When compared to MCWO, the difference in the concentration of U-235 throughout Cycle 144A was about 1%. When compared to the PIE data, the average bias for end-of-life U-235 concentration was about 2%. These results from CRDIAC therefore agree with the MCWO and PIE data, verifying and validating CRDIAC. CRDIAC provides an alternative to using ORIGEN-based methodology, which is useful because CRDIAC's depletion code, MRTAU, uses every available isotope in its depletion, unlike ORIGEN, which only depletes the isotopes specified by the user. This means that depletions done by MRTAU more accurately reflect reality. MRTAU also allows the user to build new isotope data sets, which means any isotope with nuclear data could be depleted, something that would help predict the outcomes of nuclear reaction testing in materials other than fuel, like beryllium or gold.
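
    The transport/depletion coupling that CRDIAC automates can be sketched as a simple time-stepping loop. The callables passed in below (run_transport, extract_rates, run_depletion) are hypothetical placeholders for the script's handling of MCNP and MRTAU input and output files; they are not real MCNP or MRTAU interfaces, and the toy usage at the bottom uses made-up numbers:

```python
import math

def deplete(initial_composition, time_steps_days,
            run_transport, extract_rates, run_depletion):
    """Coupled transport/depletion driver in the spirit of CRDIAC.

    The three callables stand in for: writing an MCNP deck and reading its
    tallies, reducing those tallies to effective reaction rates, and running
    the depletion solver over one time step. All are hypothetical here.
    """
    composition = dict(initial_composition)        # isotope -> amount
    history = [dict(composition)]
    for dt in time_steps_days:
        transport_out = run_transport(composition)           # transport solve
        rates = extract_rates(transport_out)                 # reaction rates
        composition = run_depletion(composition, rates, dt)  # deplete over dt
        history.append(dict(composition))
    return history

# Toy usage with stand-in callables (purely illustrative exponential burnup):
toy = deplete(
    {"U235": 1.0}, [30, 30, 30],
    run_transport=lambda comp: comp,
    extract_rates=lambda out: {"U235": 1.0e-3},   # 1/day, made-up rate
    run_depletion=lambda comp, rates, dt: {iso: n * math.exp(-rates[iso] * dt)
                                           for iso, n in comp.items()},
)
print([round(step["U235"], 3) for step in toy])
```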

  16. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect (OSTI)

    Jaeger, Calvin D.; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based approach to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  17. Scalable Distributed Automation System: Scalable Real-time Decentralized Volt/VAR Control

    SciTech Connect (OSTI)

    2012-03-01

    GENI Project: Caltech is developing a distributed automation system that allows distributed generators—solar panels, wind farms, thermal co-generation systems—to effectively manage their own power. To date, the main stumbling block for distributed automation systems has been the inability to develop software that can handle more than 100,000 distributed generators and be implemented in real time. Caltech’s software could allow millions of generators to self-manage through local sensing, computation, and communication. Taken together, localized algorithms can support certain global objectives, such as maintaining the balance of energy supply and demand, regulating voltage and frequency, and minimizing cost. An automated, grid-wide power control system would ease the integration of renewable energy sources like solar power into the grid by quickly transmitting power when it is created, eliminating the energy loss associated with the lack of renewable energy storage capacity of the grid.

  18. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    SciTech Connect (OSTI)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  19. Automated work packages architecture: An initial set of human factors and instrumentation and controls requirements

    SciTech Connect (OSTI)

    Agarwal, Vivek; Oxstrand, Johanna H.; Le Blanc, Katya L.

    2014-09-01

    The work management process in the nation's current fleet of nuclear power plants is so highly dependent on large technical staffs and on the quality of paper-based work instructions that it puts nuclear energy at somewhat of a long-term economic disadvantage and increases the possibility of human error. Technologies like mobile portable devices and computer-based procedures can play a key role in improving the plant work management process, thereby increasing productivity and decreasing cost. Automated work packages are fundamentally an enabling technology for improving worker productivity and human performance in nuclear power plant work activities, because virtually every plant work activity is accomplished using some form of a work package. As part of this year's research effort, an automated work packages architecture is identified, along with an initial set of requirements that are essential and necessary for implementation of automated work packages in nuclear power plants.

  20. Automated Commissioning for Lower-cost, Widely Deployed Building Commissioning of the Future

    SciTech Connect (OSTI)

    Brambley, Michael R.; Katipamula, Srinivas

    2011-08-16

    This chapter takes a brief look at the benefits of commissioning and describes a vision of the future where most of the objectives of commissioning will be accomplished automatically by capabilities built into the building systems themselves. Commissioning will become an activity that is performed continuously rather than periodically, and only repairs requiring replacement or overhaul of equipment will require manual intervention. This chapter then identifies some of the technologies that will be needed to realize this vision and ends with a call for all involved in the enterprise of building commissioning and automation to embrace and dedicate themselves to a future of automated commissioning.

  1. Integrated automation of the New Waddell Dam performance data acquisition system

    SciTech Connect (OSTI)

    Welch, L.R.; Fields, P.E.

    1999-07-01

    New Waddell Dam, a key feature of the US Bureau of Reclamation's Central Arizona Project, had elements of its dam safety data acquisition system incorporated into the design and construction. The instrumentation array is a reflection of the dam's large size and foundation complexity. Much of the instrumentation is automated. This automation was accomplished while maintaining independent communication connections to major divisions of the instrument array. Fiber optic cables are used to provide high-quality data, free from voltage surges that could originate in a nearby powerplant switchyard or from lightning. The system has been working well, but there are concerns about a lack of continued equipment manufacturer support.

  2. Distributed Microprocessor Automation Network for Synthesizing Radiotracers Used in Positron Emission Tomography [PET

    DOE R&D Accomplishments [OSTI]

    Russell, J. A. G.; Alexoff, D. L.; Wolf, A. P.

    1984-09-01

    This presentation describes an evolving distributed microprocessor network for automating the routine production synthesis of radiotracers used in Positron Emission Tomography. We first present a brief overview of the PET method for measuring biological function, and then outline the general procedure for producing a radiotracer. The paper identifies several reasons for our automating the syntheses of these compounds. There is a description of the distributed microprocessor network architecture chosen and the rationale for that choice. Finally, we speculate about how this network may be exploited to extend the power of the PET method from the large university or National Laboratory to the biomedical research and clinical community at large. (DT)

  3. An Automated Implementation of On-shell Methods for One-Loop Amplitudes

    Office of Scientific and Technical Information (OSTI)

    Journal Article: An Automated Implementation of On-shell Methods for One-Loop Amplitudes. Authors: Berger, C.F.; Bern, Z.; Dixon, L.J.; Febres Cordero, F.; Forde, D.; Ita, H.; Kosower, D.A.; Maitre, D. (MIT LNS; Santa Barbara, KITP; SLAC; UCLA; Saclay). Publication Date: 2008-04-11. OSTI Identifier: 927069.

  4. The Stanford Automated Mounter: Pushing the limits of sample exchange at the SSRL macromolecular crystallography beamlines

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; Cohen, Aina E.

    2016-02-24

    The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. As a result, sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade.

  5. Automation of the New Brunswick Laboratory controlled-potential coulometric method for plutonium

    SciTech Connect (OSTI)

    Mitchell, W.G.; Troutman, D.; Lewis, K.

    1988-11-01

    The development and evolution of the automated coulometric analysis of plutonium at the New Brunswick Laboratory is described. The importance of maintaining electrical calibration of the analysis so that the results are traceable to the Faraday is discussed. The innovations which were necessary to achieve the desired accuracy and precision are explained. The experimental tests to qualify the instruments for use in sample analysis at New Brunswick Laboratory are detailed. Details of the calculations performed by the automated coulometer system are given in the Appendix. 21 refs., 16 figs., 4 tabs.
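
    At the heart of the coulometric determination is Faraday's law: the amount of plutonium oxidized or reduced is proportional to the integrated electrolysis charge. The sketch below is a simplified version of that calculation, omitting the blank corrections, background-current corrections, and electrical calibration factors that the NBL system applies; the example charge is illustrative only:

```python
FARADAY_C_PER_MOL = 96485.33  # Faraday constant, C/mol

def plutonium_mass_mg(charge_coulombs, molar_mass_g_per_mol=239.05, electrons=1):
    """Mass of plutonium (mg) from the integrated charge via Faraday's law.

    Controlled-potential coulometry of plutonium uses the one-electron
    Pu(III)/Pu(IV) couple. Blank, background-current, and calibration
    corrections applied by the NBL system are omitted here for clarity.
    """
    moles = charge_coulombs / (electrons * FARADAY_C_PER_MOL)
    return moles * molar_mass_g_per_mol * 1000.0

# Example: 4.0 C of integrated charge corresponds to roughly 9.9 mg of Pu-239.
print(f"{plutonium_mass_mg(4.0):.2f} mg")
```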

  6. Y-12 Plant decontamination and decommissioning Technology Logic Diagram for Building 9201-4: Volume 3, Technology evaluation data sheets: Part B, Decontamination; robotics/automation; waste management

    SciTech Connect (OSTI)

    1994-09-01

    This volume consists of the Technology Logic Diagrams (TLDs) for the decontamination, robotics/automation, and waste management areas.

  7. NREL Research and Thoughts on Connected and Automated Vehicle Energy Impacts; NREL (National Renewable Energy Laboratory)

    SciTech Connect (OSTI)

    Gonder, Jeff; Wood, Eric; Lammert, Michael

    2014-12-09

    Jeff was invited to brief the EPA Mobile Sources Technical Review Subcommittee on potential energy and environmental considerations for connected and automated vehicles. For more information about the MSTRS see http://www2.epa.gov/caaac/mobile-sources-technical-review-subcommittee-mstrs-caaac.

  8. Development and evaluation of an automated reflectance microscope system for the petrographic characterization of bituminous coals

    SciTech Connect (OSTI)

    Hoover, D. S.; Davis, A.

    1980-10-01

    The development of automated coal petrographic techniques will lessen the demands on skilled personnel to do routine work. This project is concerned with the development and successful testing of an instrument which will meet these needs. The fundamental differences in reflectance of the three primary maceral groups should enable their differentiation in an automated-reflectance frequency histogram (reflectogram). Consequently, reflected-light photometry was chosen as the method for automating coal petrographic analysis. Three generations of an automated system (called Rapid Scan Versions I, II and III) were developed and evaluated for petrographic analysis. Their basic design was that of a reflected-light microscope photometer with an automatic stage, interfaced with a minicomputer. The hardware elements used in the Rapid Scan Version I limited the system's flexibility and presented problems with signal digitization and measurement precision. Rapid Scan Version II was designed to incorporate a new microscope photometer and computer system. A digital stepping stage was incorporated into the Rapid Scan Version III system. The precision of reflectance determination of this system was found to be +/- 0.02 percent reflectance. The limiting factor in the quantitative interpretation of Rapid Scan reflectograms is the resolution of the reflectance populations of the individual maceral groups. Statistical testing indicated that reflectograms were highly reproducible, and a new computer program, PETAN, was written to interpret the curves for vitrinite reflectance and other petrographic parameters.

  9. ProDeGe: A Computational Protocol for fully Automated Decontamination of Genomic Data

    Energy Science and Technology Software Center (OSTI)

    2015-12-01

    The Single Cell Data Decontamination Pipeline is a fully automated software tool that classifies unscreened contigs from single cell datasets through a combination of homology- and feature-based methodologies using the organism's nucleotide sequences and the known NCBI taxonomy. The software is freely available to download and install, and can be run on any system.

  10. Nondestructive and automated testing for soil and rock properties. ASTM special technical publication 1350

    SciTech Connect (OSTI)

    Marr, W.A.; Fairhurst, C.E.

    1999-07-01

    The purpose of the symposium was to highlight recent developments in nondestructive and automated testing for soil and rock properties. Speakers presented results of recent research in these areas that has practical application for the rapid and economical testing of soil and rock. Authors were encouraged to identify which testing equipment and methods have sufficient practical application to warrant standards development.

  11. Opportunities for Energy Efficiency and Automated Demand Response in Industrial Refrigerated Warehouses in California

    SciTech Connect (OSTI)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee; Rockoff, Alexandra; Piette, Mary Ann

    2009-05-11

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and open automated demand response opportunities for industrial refrigerated warehouses in California. The report describes refrigerated warehouses characteristics, energy use and demand, and control systems. It also discusses energy efficiency and open automated demand response opportunities and provides analysis results from three demand response studies. In addition, several energy efficiency, load management, and demand response case studies are provided for refrigerated warehouses. This study shows that refrigerated warehouses can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for open automated demand response (OpenADR) at little additional cost. These improved controls may prepare facilities to be more receptive to OpenADR due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.

  12. Function allocation for humans and automation in the context of team dynamics

    SciTech Connect (OSTI)

    Jeffrey C. Joe; John O'Hara; Jacques Hugo; Johanna Oxstrand

    2015-07-01

    Within Human Factors Engineering, a decision-making process called function allocation (FA) is used during the design life cycle of complex systems to distribute the system functions, often identified through a functional requirements analysis, to all human and automated machine agents (or teammates) involved in controlling the system. Most FA methods make allocation decisions primarily by comparing the capabilities of humans and automation, but also by considering secondary factors such as cost, regulations, and the health and safety of workers. The primary analysis of the strengths and weaknesses of humans and machines, however, is almost always considered in terms of individual human or machine capabilities. Yet FA is fundamentally about teamwork, in that the goal of the FA decision-making process is to determine the optimal allocation of functions among agents. Given this framing of FA, and the increasing use and sophistication of automation, there are two related social psychological issues that current FA methods need to address more thoroughly. First, many principles for effective human teamwork are not considered as central decision points or in the iterative hypothesis and testing phase in most FA methods, even though it is clear that social factors have numerous positive and negative effects on individual and team capabilities. Second, social psychological factors affect team performance and can be difficult to translate to automated agents, and most FA methods currently do not account for this. The implications of these issues are discussed.

  13. Automated Energy Distribution and Reliability System: Validation Integration - Results of Future Architecture Implementation

    SciTech Connect (OSTI)

    Buche, D. L.

    2008-06-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is second in a series of reports detailing this effort.

  14. A Test Control Language for a Computer-automated Battery Testing Lab

    Energy Science and Technology Software Center (OSTI)

    2006-02-08

    A test control language was developed for a computer-automated battery testing laboratory to permit an operator to construct testing scripts that define an arbitrary battery test regime. Statements written in the language are checked for correct syntax, and control block structures are produced by a compiler for downloading into the data acquisition computers.

  15. Manual of Security Requirements for the Classified Automated Information System Security Program

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1994-07-15

    This Manual provides specific instructions and delineates the requirements to ensure the graded security of classified information entrusted to the Department of Energy (DOE) that is processed, stored, transferred, or accessed on Automated Information Systems (AISs) and AIS networks. Canceled by DOE M 471.2-2.

  16. Application of bar codes to the automation of analytical sample data collection

    SciTech Connect (OSTI)

    Jurgensen, H A

    1986-01-01

    The Health Protection Department at the Savannah River Plant collects 500 urine samples per day for tritium analyses. Prior to automation, all sample information was compiled manually. Bar code technology was chosen for automating this program because it provides a more accurate, efficient, and inexpensive method for data entry. The system has three major functions: sample labeling, which is accomplished at remote bar code label stations composed of an Intermec 8220 (Intermec Corp.) interfaced to an IBM-PC; data collection, which is done on a central VAX 11/730 (Digital Equipment Corp.), where bar code readers are used to log in samples to be analyzed on liquid scintillation counters and the VAX 11/730 processes the data and generates reports; and data storage, which is on the VAX 11/730 and backed up on the plant's central computer. A brief description of several other bar code applications at the Savannah River Plant is also presented.

  17. Method and system for assigning a confidence metric for automated determination of optic disc location

    DOE Patents [OSTI]

    Karnowski, Thomas P.; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya; Chaum, Edward

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value, and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value is selected to represent an acceptable risk of misdiagnosis, by the automated technique, of a disease having retinal manifestations.
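
    The decision logic can be summarized in a few lines: run two independent optic-disc locators, derive an accuracy parameter from their results, and compare it to a risk cut-off. In the sketch below the accuracy parameter is simply the distance between the two coordinate estimates, and the cut-off value is an illustrative assumption; neither is the patent's specific definition:

```python
import math

def assign_confidence(coords_a, coords_b, primary_risk_cutoff_px=25.0):
    """Assign a confidence level to an automated optic-disc localization.

    coords_a, coords_b : (x, y) optic-disc locations from two independent
                         image analysis techniques
    The accuracy parameter here is the distance between the two estimates,
    and the cut-off value is an illustrative assumption.
    """
    accuracy = math.dist(coords_a, coords_b)
    return "high" if accuracy < primary_risk_cutoff_px else "low"

print(assign_confidence((412, 256), (418, 250)))   # close agreement -> "high"
print(assign_confidence((412, 256), (300, 180)))   # disagreement     -> "low"
```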

  18. Using after-action review based on automated performance assessment to enhance training effectiveness.

    SciTech Connect (OSTI)

    Stevens-Adams, Susan Marie; Gieseler, Charles J.; Basilico, Justin Derrick; Abbott, Robert G.; Forsythe, James Chris

    2010-09-01

    Training simulators have become increasingly popular tools for instructing humans on performance in complex environments. However, how to provide individualized and scenario-specific assessment and feedback to students remains largely an open question. In this work, we follow up on previous evaluations of the Automated Expert Modeling and Automated Student Evaluation (AEMASE) system, which automatically assesses student performance based on observed examples of good and bad performance in a given domain. The current study provides a rigorous empirical evaluation of the enhanced training effectiveness achievable with this technology. In particular, we found that students given feedback via the AEMASE-based debrief tool performed significantly better than students given only instructor feedback on two out of three domain-specific performance metrics.

  19. New Prototype Safeguards Technology Offers Improved Confidence and Automation for Uranium Enrichment Facilities

    SciTech Connect (OSTI)

    Brim, Cornelia P.

    2013-04-01

    An important requirement for the international safeguards community is the ability to determine the enrichment level of uranium in gas centrifuge enrichment plants and nuclear fuel fabrication facilities. This is essential to ensure that countries with nuclear nonproliferation commitments, such as States Party to the Nuclear Nonproliferation Treaty, are adhering to their obligations. However, current technologies to verify the uranium enrichment level in gas centrifuge enrichment plants or nuclear fuel fabrication facilities are technically challenging and resource-intensive. NNSA’s Office of Nonproliferation and International Security (NIS) supports the development, testing, and evaluation of future systems that will strengthen and sustain U.S. safeguards and security capabilities—in this case, by automating the monitoring of uranium enrichment in the entire inventory of a fuel fabrication facility. One such system is HEVA—hybrid enrichment verification array. This prototype was developed to provide an automated, nondestructive assay verification technology for uranium hexafluoride (UF6) cylinders at enrichment plants.

  20. Open Automated Demand Response Technologies for Dynamic Pricing and Smart Grid

    SciTech Connect (OSTI)

    Ghatikar, Girish; Mathieu, Johanna L.; Piette, Mary Ann; Kiliccote, Sila

    2010-06-02

    We present an Open Automated Demand Response Communications Specification (OpenADR) data model capable of communicating real-time prices to electricity customers. We also show how the same data model could be used for other types of dynamic pricing tariffs (including peak pricing tariffs, which are common throughout the United States). Customers participating in automated demand response programs with building control systems can respond to dynamic prices by using the actual prices as inputs to their control systems. Alternatively, prices can be mapped into "building operation modes," which can act as inputs to control systems. We present several different strategies customers could use to map prices to operation modes. Our results show that OpenADR can be used to communicate dynamic pricing within the Smart Grid and that OpenADR allows for interoperability with existing and future systems, technologies, and electricity markets.
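
    A minimal sketch of the price-to-operation-mode mapping strategy mentioned above; the thresholds and mode names are illustrative assumptions chosen for this example, not values taken from the OpenADR specification:

    # Map a real-time electricity price into a building operation mode that a
    # control system can act on. Thresholds and mode names are placeholders.
    PRICE_MODES = [
        (0.10, "normal"),    # below $0.10/kWh: normal operation
        (0.20, "moderate"),  # $0.10-0.20/kWh: modest setpoint adjustments
        (0.40, "high"),      # $0.20-0.40/kWh: shed discretionary loads
    ]

    def price_to_mode(price_per_kwh):
        for threshold, mode in PRICE_MODES:
            if price_per_kwh < threshold:
                return mode
        return "critical"    # above the highest threshold: maximum curtailment

    for price in (0.08, 0.15, 0.55):
        print(f"${price:.2f}/kWh -> {price_to_mode(price)}")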

  1. Field Demonstration of Automated Demand Response for Both Winter and Summer Events in Large Buildings in the Pacific Northwest

    SciTech Connect (OSTI)

    Piette, Mary Ann; Kiliccote, Sila; Dudley, Junqiao H.

    2011-11-11

    There are growing strains on the electric grid as cooling peaks grow and equipment ages. Increased penetration of renewables on the grid is also straining electricity supply systems, and the need for flexible demand is growing. This paper summarizes results of a series of field tests of automated demand response systems in large buildings in the Pacific Northwest. The objective of the research was twofold. One objective was to evaluate the use of demand response automation technologies. A second objective was to evaluate control strategies that could change the electric load shape in both winter and summer conditions. Winter conditions focused on cold winter mornings, a time when the electric grid is often stressed. The summer test evaluated DR strategies in the afternoon. We found that we could automate both winter and summer control strategies with the open automated demand response communication standard. The buildings were able to provide significant demand response in both winter and summer events.

  2. RapTOR: Automated sequencing library preparation and suppression for rapid pathogen characterization ( 7th Annual SFAF Meeting, 2012)

    ScienceCinema (OSTI)

    Lane, Todd [SNL]

    2013-02-11

    Todd Lane on "RapTOR: Automated sequencing library preparation and suppression for rapid pathogen characterization" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.

  3. Estimate of Fuel Consumption and GHG Emission Impact on an Automated Mobility District: Preprint

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Estimate of Fuel Consumption and GHG Emission Impact on an Automated Mobility District: Preprint. Yuche Chen, Stanley Young, and Jeff Gonder (National Renewable Energy Laboratory) and Xuewei Qi (University of California, Riverside). Presented at the 4th International Conference on Connected Vehicles & Expo (ICCVE 2015), Shenzhen, China, October 19-23, 2015. Conference Paper NREL/CP-5400-65257, December 2015.

  4. Automated fabrication, characterization and transport of ICF pellets. Final report, March 1, 1979-October 31, 1980

    SciTech Connect (OSTI)

    Clifford, D W; Boyd, B A; Lilienkamp, R H

    1980-12-01

    The near-term objectives of the contract were threefold: (1) evaluate techniques for the production of frozen hydrogen microspheres and demonstrate concepts for coating them; (2) develop and demonstrate an optical characterization system which could lead to automated pellet inspection; and (3) develop and demonstrate a preliminary electrostatic pellet transport control system. This report describes the equipment assembled for these experiments and the results obtained.

  5. Conventional Versus Automated Implantation of Loose Seeds in Prostate Brachytherapy: Analysis of Dosimetric and Clinical Results

    SciTech Connect (OSTI)

    Genebes, Caroline; Filleron, Thomas; Graff, Pierre; Jonca, Frédéric; Huyghe, Eric; Thoulouzan, Matthieu; Soulie, Michel; Malavaud, Bernard; Aziza, Richard; Brun, Thomas; Delannes, Martine; Bachaud, Jean-Marc

    2013-11-15

    Purpose: To review the clinical outcome of I-125 permanent prostate brachytherapy (PPB) for low-risk and intermediate-risk prostate cancer and to compare 2 techniques of loose-seed implantation. Methods and Materials: 574 consecutive patients underwent I-125 PPB for low-risk and intermediate-risk prostate cancer between 2000 and 2008. Two successive techniques were used: conventional implantation from 2000 to 2004 and automated implantation (Nucletron, FIRST system) from 2004 to 2008. Dosimetric and biochemical recurrence-free (bNED) survival results were reported and compared for the 2 techniques. Univariate and multivariate analysis researched independent predictors for bNED survival. Results: 419 (73%) and 155 (27%) patients with low-risk and intermediate-risk disease, respectively, were treated (median follow-up time, 69.3 months). The 60-month bNED survival rates were 95.2% and 85.7%, respectively, for patients with low-risk and intermediate-risk disease (P=.04). In univariate analysis, patients treated with automated implantation had worse bNED survival rates than did those treated with conventional implantation (P<.0001). By day 30, patients treated with automated implantation showed lower values of dose delivered to 90% of prostate volume (D90) and volume of prostate receiving 100% of prescribed dose (V100). In multivariate analysis, implantation technique, Gleason score, and V100 on day 30 were independent predictors of recurrence-free status. Grade 3 urethritis and urinary incontinence were observed in 2.6% and 1.6% of the cohort, respectively, with no significant differences between the 2 techniques. No grade 3 proctitis was observed. Conclusion: Satisfactory 60-month bNED survival rates (93.1%) and acceptable toxicity (grade 3 urethritis <3%) were achieved by loose-seed implantation. Automated implantation was associated with worse dosimetric and bNED survival outcomes.

  6. Automated position control of a surface array relative to a liquid microjunction surface sampler

    DOE Patents [OSTI]

    Van Berkel, Gary J.; Kertesz, Vilmos; Ford, Michael James

    2007-11-13

    A system and method utilizes an image analysis approach for controlling the probe-to-surface distance of a liquid junction-based surface sampling system for use with mass spectrometric detection. Such an approach enables a hands-free formation of the liquid microjunction used to sample solution composition from the surface and for re-optimization, as necessary, of the microjunction thickness during a surface scan to achieve a fully automated surface sampling system.

  7. Reducing Fuel Consumption through Semi-Automated Platooning with Class 8 Tractor Trailer Combinations (Poster)

    SciTech Connect (OSTI)

    Lammert, M.; Gonder, J.

    2014-07-01

    This poster describes the National Renewable Energy Laboratory's evaluation of the fuel savings potential of semi-automated truck platooning. Platooning involves reducing aerodynamic drag by grouping vehicles together and decreasing the distance between them through the use of electronic coupling, which allows multiple vehicles to accelerate or brake simultaneously. The NREL study addressed the need for data on American-style line-haul sleeper cabs with modern aerodynamics operating over a range of trucking speeds common in the United States.

  8. Enhancing Seismic Calibration Research Through Software Automation and Scientific Information Management

    SciTech Connect (OSTI)

    Ruppert, S D; Dodge, D A; Ganzberger, M D; Harris, D B; Hauk, T F

    2009-07-07

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Development (GNEMRD) Program at LLNL continues to make significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. New information management and analysis tools have resulted in demonstrated gains in efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. In contrast to previous years, software development work this past year has emphasized development of automation at the data ingestion level. This change reflects a gradually changing emphasis in our program from processing a few large data sets that result in a single integrated delivery, to processing many different data sets from a variety of sources. The increase in the number of sources has resulted in a large increase in the amount of metadata relative to the final volume of research products. Software developed this year addresses the problems of: (1) Efficient metadata ingestion and conflict resolution; (2) Automated ingestion of bulletin information; (3) Automated ingestion of waveform information from global data centers; and (4) Site Metadata and Response transformation required for certain products. This year, we also made a significant step forward in meeting a long-standing goal of developing and using a waveform correlation framework. Our objective for such a framework is to extract additional calibration data (e.g. mining blasts) and to study the extent to which correlated seismicity can be found in global and regional scale environments.

  9. The rotary zone thermal cycler: A low-power system enabling automated rapid PCR

    SciTech Connect (OSTI)

    Bartsch, Michael S.; Edwards, Harrison S.; Lee, Daniel; Moseley, Caroline E.; Tew, Karen E.; Renzi, Ronald F.; Van de Vreugde, James L.; Kim, Hanyoup; Knight, Daniel L.; Sinha, Anupama; Branda, Steven S.; Patel, Kamlesh D.; Wanunu, Meni

    2015-03-31

    Advances in molecular biology, microfluidics, and laboratory automation continue to expand the accessibility and applicability of these methods beyond the confines of conventional, centralized laboratory facilities and into point-of-use roles in clinical, military, forensic, portable, and field-deployed applications. As a result, there is a growing need to adapt the unit operations of molecular biology such as aliquoting, centrifuging, mixing, and thermal cycling to compact, portable, low-power, and automation-ready formats. Here we present one such adaptation, the rotary zone thermal cycler (RZTC), a novel wheel-based device capable of cycling up to four different fixed-temperature blocks into contact with a stationary 4-microliter capillary-bound sample to realize 1-3 second transitions with steady-state heater power of less than 10 W. We further demonstrate the utility of the RZTC for DNA amplification as part of a highly integrated rotary zone PCR (rzPCR) system using low-volume valves and syringe-based fluid handling to automate sample loading and unloading, thermal cycling, and between-run cleaning functionalities in a compact, modular form factor. In addition to characterizing the performance of the RZTC and the efficacy of different online cleaning protocols, we present preliminary results for rapid single-plex PCR, multiplex short tandem repeat (STR) amplification, and second strand cDNA synthesis.

  10. The rotary zone thermal cycler: A low-power system enabling automated rapid PCR

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Bartsch, Michael S.; Edwards, Harrison S.; Lee, Daniel; Moseley, Caroline E.; Tew, Karen E.; Renzi, Ronald F.; Van de Vreugde, James L.; Kim, Hanyoup; Knight, Daniel L.; Sinha, Anupama; et al

    2015-03-31

    Advances in molecular biology, microfluidics, and laboratory automation continue to expand the accessibility and applicability of these methods beyond the confines of conventional, centralized laboratory facilities and into point of use roles in clinical, military, forensic, and field-deployed applications. As a result, there is a growing need to adapt the unit operations of molecular biology (e.g., aliquoting, centrifuging, mixing, and thermal cycling) to compact, portable, low-power, and automation-ready formats. Here we present one such adaptation, the rotary zone thermal cycler (RZTC), a novel wheel-based device capable of cycling up to four different fixed-temperature blocks into contact with a stationary 4-microliter capillary-bound sample to realize 1-3 second transitions with steady state heater power of less than 10 W. We demonstrate the utility of the RZTC for DNA amplification as part of a highly integrated rotary zone PCR (rzPCR) system that uses low-volume valves and syringe-based fluid handling to automate sample loading and unloading, thermal cycling, and between-run cleaning functionalities in a compact, modular form factor. In addition to characterizing the performance of the RZTC and the efficacy of different online cleaning protocols, we present preliminary results for rapid single-plex PCR, multiplex short tandem repeat (STR) amplification, and second strand cDNA synthesis.

  11. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    SciTech Connect (OSTI)

    Homan, Gregory K.; Aghajanzadeh, Arian; McKane, Aimee

    2015-08-30

    During periods of peak electrical demand on the energy grid or when there is a shortage of supply, the stability of the grid may be compromised or the cost of supplying electricity may rise dramatically, respectively. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  12. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    SciTech Connect (OSTI)

    Williams, Alex C; Hitt, Austin N; Voisin, Sophie; Tourassi, Georgia

    2013-01-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth, we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well-established methods. All three methods confirmed that patients with breast cancer indeed have statistically significantly higher fluctuating asymmetry of their breast volumes. However, a statistically significant difference between patients with cancer and those with benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such a biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher-than-normal cancer risk is suspected.
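
    A minimal sketch of a fluctuating-asymmetry measure of the kind the abstract describes, assuming left and right breast volumes have already been estimated from the automated segmentation (the exact index used in the study is not reproduced here):

    def fluctuating_asymmetry(volume_left, volume_right):
        """Relative bilateral volume asymmetry: |L - R| normalized by the mean volume.

        Returns 0 for perfect symmetry; larger values indicate greater asymmetry.
        """
        mean_volume = (volume_left + volume_right) / 2.0
        return abs(volume_left - volume_right) / mean_volume

    # Example with volumes in cubic centimeters (made-up numbers for illustration).
    print(round(fluctuating_asymmetry(612.0, 571.0), 3))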

  13. Assessment of the Current Level of Automation in the Manufacture of Fuel Cell Systems for Combined Heat and Power Applications

    SciTech Connect (OSTI)

    Ulsh, M.; Wheeler, D.; Protopappas, P.

    2011-08-01

    The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.

  14. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    SciTech Connect (OSTI)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-05-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually and the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference Fourier maps.

  15. Development and evaluation of fully automated demand response in large facilities

    SciTech Connect (OSTI)

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost of service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Response involves the use of building energy management control systems for load shedding, where a preprogrammed load shedding strategy is initiated by facilities staff. Fully-Automated Demand Response is initiated at a building or facility through receipt of an external communications signal--facility staff set up a pre-programmed load shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to ''opt out'' or ''override'' an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing, characterization, and evaluation relating to Auto-DR. This evaluation also included the related decisionmaking perspectives of the facility owners and managers. Another goal of this project was to develop and test a real-time signal for automated demand response that provided a common communication infrastructure for diverse facilities. The six facilities recruited for this project were selected from the facilities that received CEC funds for new DR technology during California's 2000-2001 electricity crises (AB970 and SB-5X).

  16. Demonstration of automated price response in large customers in New York City using Auto-DR and OpenADR

    SciTech Connect (OSTI)

    Kim, Joyce Jihyun; Schetrit, Oren; Yin, Rongxin; Kiliccote, Sila

    2014-05-01

    Demand response (DR) – allowing customers to respond to reliability requests and market prices by changing electricity use from their normal consumption pattern – continues to be seen as an attractive means of demand-side management and a fundamental smart-grid improvement that links supply and demand. From October 2011 to December 2013, the Demand Response Research Center at Lawrence Berkeley National Laboratory, the New York State Energy Research and Development Authority, and partners Honeywell and Akuacom conducted a demonstration project enabling Automated Demand Response (Auto-DR) in large commercial buildings located in New York City using Open Automated Demand Response (OpenADR) communication protocols. In particular, this project focuses on demonstrating how the OpenADR platform, enabled by Akuacom, can automate and simplify interactions between buildings and various stakeholders in New York State and enable the automation of customers’ price response to yield bill savings under dynamic pricing. In this paper, we present the cost control opportunities under day-ahead hourly pricing and Auto-DR control strategies for four demonstration buildings; present the breakdown of Auto-DR enablement costs; summarize the field test results and their load impact; and show potential bill savings by enabling automated price response under Consolidated Edison’s Mandatory Hourly Pricing (MHP) tariff. For one of the sites, the potential bill savings at the site’s current retail rate are shown. Facility managers were given granular equipment-level opt-out capability to ensure full control of the sites during the Auto-DR implementation. The expected bill savings ranged from 1.1% to 8.0% of the total MHP bill. The automation and enablement costs ranged from $70 to $725 per kW shed. The results show that OpenADR can facilitate the automation of price response and deliver savings to customers, while the opt-out capability keeps facility managers in control of their sites.

  17. Heavy Oil Process Monitor: Automated On-Column Asphaltene Precipitation and Re-Dissolution

    SciTech Connect (OSTI)

    John F. Schabron; Joseph F. Rovani; Mark Sanderson

    2007-03-31

    An automated separation technique was developed that provides a new approach to measuring the distribution profiles of the most polar, or asphaltenic, components of an oil, using a continuous flow system to precipitate and re-dissolve asphaltenes from the oil. Methods of analysis based on this new technique were explored. One method involves precipitation of a portion of a residua sample in heptane on a polytetrafluoroethylene-packed (PTFE) column. The precipitated material is re-dissolved in three steps using solvents of increasing polarity: cyclohexane, toluene, and methylene chloride. The amount of asphaltenes that dissolve in cyclohexane is a useful diagnostic of the thermal history of oil and its proximity to coke formation. For example, about 40% (w/w) of the heptane asphaltenes from unpyrolyzed residua dissolves in cyclohexane. As pyrolysis progresses, this number decreases to below 15% as coke and toluene-insoluble pre-coke materials appear. Currently, the procedure for the isolation of heptane asphaltenes and the determination of the amount of asphaltenes soluble in cyclohexane spans three days. The automated procedure takes one hour. Another method uses a single solvent, methylene chloride, to re-dissolve the material that precipitates in heptane on the PTFE-packed column. The area of this second peak can be used to calculate a value which correlates with gravimetric asphaltene content. Currently the gravimetric procedure to determine asphaltenes takes about 24 hours. The automated procedure takes 30 minutes. Results for four series of original and pyrolyzed residua were compared with data from the gravimetric methods. Methods based on the new on-column precipitation and re-dissolution technique provide significantly more detail about the polar constituents of oils than does the gravimetric determination of asphaltenes.
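
    A minimal sketch of the kind of calibration implied by the correlation between the methylene chloride peak area and gravimetric asphaltene content; all numbers are invented placeholders, not data from the report:

    import statistics

    # Paired observations: (second-peak area from the automated separation,
    # gravimetric asphaltene content in wt%). Values are invented placeholders.
    peak_areas = [1.8e6, 3.1e6, 4.4e6, 6.0e6]
    asphaltene_wt_pct = [4.2, 7.1, 10.3, 13.9]

    # Ordinary least-squares line relating peak area to asphaltene content.
    fit = statistics.linear_regression(peak_areas, asphaltene_wt_pct)

    def asphaltenes_from_peak_area(area):
        return fit.slope * area + fit.intercept

    # Predicted wt% asphaltenes for a new sample's measured peak area.
    print(round(asphaltenes_from_peak_area(5.0e6), 1))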

  18. Collaboration, Automation, and Information Management at Hanford High Level Radioactive Waste (HLW) Tank Farms

    SciTech Connect (OSTI)

    Aurah, Mirwaise Y.; Roberts, Mark A.

    2013-12-12

    Washington River Protection Solutions (WRPS), operator of High Level Radioactive Waste (HLW) Tank Farms at the Hanford Site, is taking an over 20-year leap in technology, replacing systems that were monitored with clipboards and obsolete computer systems, as well as solving major operations and maintenance hurdles in the area of process automation and information management. While WRPS is fully compliant with procedures and regulations, the current systems are not integrated and do not share data efficiently, hampering how information is obtained and managed.

  19. Automated apparatus for solvent separation of a coal liquefaction product stream

    DOE Patents [OSTI]

    Schweighardt, Frank K. (Upper Macungie, PA)

    1985-01-01

    An automated apparatus for the solvent separation of a coal liquefaction product stream that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In use of the apparatus, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control means. The mixture in the filter is agitated by means of ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process.

  20. Automated Testing of Supercomputers ANL/ALCF/TM-13/3 Argonne Leadership Computing Facility

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Automated Testing of Supercomputers, ANL/ALCF/TM-13/3, Argonne Leadership Computing Facility. The report is available at no cost at http://www.osti.gov/bridge. (Remaining front matter in this record consists of the OSTI availability address and the standard report disclaimer.)

  1. SU-E-CAMPUS-T-01: Automation of the Winston-Lutz Test for Stereotactic Radiosurgery

    SciTech Connect (OSTI)

    Litzenberg, D; Irrer, J; Kessler, M; Lam, K; Keranen, W

    2014-06-15

    Purpose: To optimize clinical efficiency and shorten patient wait time by minimizing the time and effort required to perform the Winston-Lutz test before stereotactic radiosurgery (SRS) through automation of the delivery, analysis, and documentation of results. Methods: The radiation fields of the Winston-Lutz (WL) test were created in a machine-QA patient saved in ARIA for use before SRS cases. Images of the BRW target ball placed at mechanical isocenter are captured with the portal imager for each of four 2 cm x 2 cm MLC-shaped beams. When the WL plan is delivered and closed, this event is detected by in-house software called EventNet, which automates subsequent processes with the aid of the ARIA web services. Images are automatically retrieved from the ARIA database and analyzed to determine the offset of the target ball from radiation isocenter. The results are posted to a website and a composite summary image of the results is pushed back into ImageBrowser for review and authenticated documentation. Results: The total time to perform the test was reduced from 20-25 minutes to less than 4 minutes. The results were found to be more accurate and consistent than the previous method, which used radiochromic film. The images were also analyzed with DoseLab for comparison. The differences between the film and automated WL results in the X and Y directions and the radius were (-0.17 +/- 0.28) mm, (0.21 +/- 0.20) mm, and (-0.14 +/- 0.27) mm, respectively. The differences between the DoseLab and automated WL results were (-0.05 +/- 0.06) mm, (-0.01 +/- 0.02) mm, and (0.01 +/- 0.07) mm, respectively. Conclusions: This process reduced patient wait times by 15-20 minutes, making the treatment machine available to treat another patient. Accuracy and consistency of results were improved over the previous method and were comparable to other commercial solutions. Access to the ARIA web services is made possible through an Eclipse co-development agreement with Varian Medical Systems.
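
    A minimal sketch of the offset bookkeeping behind an automated Winston-Lutz analysis, assuming the displacement of the target ball from the radiation field center has already been measured from each portal image (the image segmentation itself is omitted, and the numbers are invented):

    import math

    # Measured (x, y) displacements, in mm, of the target ball from the field
    # center in each of the four MLC-shaped beams; placeholder values only.
    displacements_mm = [(0.21, -0.08), (-0.15, 0.12), (0.05, 0.18), (-0.11, -0.14)]

    def winston_lutz_summary(displacements):
        """Mean 2D offset and worst-case radius of the target ball from isocenter."""
        mean_x = sum(dx for dx, _ in displacements) / len(displacements)
        mean_y = sum(dy for _, dy in displacements) / len(displacements)
        max_radius = max(math.hypot(dx, dy) for dx, dy in displacements)
        return (mean_x, mean_y), max_radius

    (offset_x, offset_y), radius = winston_lutz_summary(displacements_mm)
    print(f"mean offset = ({offset_x:+.2f}, {offset_y:+.2f}) mm, max radius = {radius:.2f} mm")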

  2. Functional requirements for the Automated Transportation Management System: TTP number: RL 439002

    SciTech Connect (OSTI)

    Portsmouth, J.H.

    1992-12-31

    This requirements analysis documents Department of Energy (DOE) transportation management procedures for the purpose of providing a clear and mutual understanding between users and designers of the proposed Automated Transportation Management System (ATMS). It is imperative that one understand precisely how DOE currently performs traffic management tasks; only then can an integrated system be proposed that successfully satisfies the major requirements of transportation managers and other system users. Accordingly, this report describes the current workings of DOE transportation organizations and then proposes a new system that represents a synthesis of procedures (both current and desired) and forms the basis for further systems development activities.

  3. Automated Feature Generation in Large-Scale Geospatial Libraries for Content-Based Indexing.

    SciTech Connect (OSTI)

    Tobin Jr, Kenneth William; Bhaduri, Budhendra L; Bright, Eddie A; Cheriydat, Anil; Karnowski, Thomas Paul; Palathingal, Paul J; Potok, Thomas E; Price, Jeffery R

    2006-05-01

    We describe a method for indexing and retrieving high-resolution image regions in large geospatial data libraries. An automated feature extraction method is used that generates a unique and specific structural description of each segment of a tessellated input image file. These tessellated regions are then merged into similar groups, or sub-regions, and indexed to provide flexible and varied retrieval in a query-by-example environment. The methods of tessellation, feature extraction, sub-region clustering, indexing, and retrieval are described and demonstrated using a geospatial library representing a 153 km2 region of land in East Tennessee at 0.5 m per pixel resolution.

  4. Opportunities for Energy Efficiency and Open Automated Demand Response in Wastewater Treatment Facilities in California -- Phase I Report

    SciTech Connect (OSTI)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee; Song, Katherine; Piette, Mary Ann

    2009-04-01

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and automated demand response opportunities for wastewater treatment facilities in California. The report describes the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy use and demand, as well as details of the wastewater treatment process. It also discusses control systems and energy efficiency and automated demand response opportunities. In addition, several energy efficiency and load management case studies are provided for wastewater treatment facilities. This study shows that wastewater treatment facilities can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for automated demand response at little additional cost. These improved controls may prepare facilities to be more receptive to open automated demand response due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.

  5. MyEMSL

    Energy Science and Technology Software Center (OSTI)

    2012-09-06

    MyEMSL provides the following features: user-assisted upload of data from scientific instruments to an archive, tracking data through the upload process, providing metadata for recovery of uploaded data, and providing users with access to the data. These features are supplied by a mixture of open source software and internally developed software. MyEMSL also provides a notification system that allows the system to communicate with users and other parts of the MyEMSL system. This allows the system to support automated workflows and user interaction. Reprocessing of already ingested data is also supported.

  6. Automated solar cell assembly team process research. Annual subcontract report, 1 January 1993--31 December 1993

    SciTech Connect (OSTI)

    Nowlan, M.J.; Hogan, S.J.; Darkazalli, G.; Breen, W.F.; Murach, J.M.; Sutherland, S.F.; Patterson, J.S.

    1994-06-01

    This report describes work done under the Photovoltaic Manufacturing Technology (PVMaT) project, Phase 3A, which addresses problems that are generic to the photovoltaic (PV) industry. Spire's objective during Phase 3A was to use its light soldering technology and experience to design and fabricate solar cell tabbing and interconnecting equipment to develop new, high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells. Areas that were addressed include processing rates, process control, yield, throughput, material utilization efficiency, and increased use of automation. Spire teamed with Solec International, a PV module manufacturer, and the University of Massachusetts at Lowell's Center for Productivity Enhancement (CPE), automation specialists, who are lower-tier subcontractors. A number of other PV manufacturers, including Siemens Solar, Mobil Solar, Solar Web, and Texas Instruments, agreed to evaluate the processes developed under this program.

  7. Estimate of Fuel Consumption and GHG Emission Impact from an Automated Mobility District

    SciTech Connect (OSTI)

    Chen, Yuche; Young, Stanley; Qi, Xuewei; Gonder, Jeffrey

    2015-10-19

    This study estimates the range of fuel and emissions impact of an automated-vehicle (AV) based transit system that services campus-based developments, termed an automated mobility district (AMD). The study develops a framework to quantify the fuel consumption and greenhouse gas (GHG) emission impacts of a transit system comprised of AVs, taking into consideration average vehicle fleet composition, fuel consumption/GHG emission of vehicles within specific speed bins, and the average occupancy of passenger vehicles and transit vehicles. The framework is exercised using a previous mobility analysis of a personal rapid transit (PRT) system, a system which shares many attributes with envisioned AV-based transit systems. Total fuel consumption and GHG emissions with and without an AMD are estimated, providing a range of potential system impacts on sustainability. The results of a previous case study based on a proposed implementation of PRT on the Kansas State University (KSU) campus in Manhattan, Kansas, serve as the basis to estimate personal miles traveled supplanted by an AMD at varying levels of service. The results show that an AMD has the potential to reduce total system fuel consumption and GHG emissions, but the amount is largely dependent on operating and ridership assumptions. The study points to the need to better understand ride-sharing scenarios and calls for future research on sustainability benefits of an AMD system at both vehicle and system levels.
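
    A minimal sketch of the accounting framework described above, assuming per-mile fuel and GHG factors are available by speed bin and that miles are converted to person-miles through an average occupancy; every number below is an illustrative placeholder, not a result from the study:

    def fleet_totals(miles_by_bin, gal_per_mile_by_bin, ghg_per_mile_by_bin, occupancy):
        """Sum fuel (gal) and GHG (kg CO2e) over speed bins; report person-miles served."""
        fuel = sum(m * g for m, g in zip(miles_by_bin, gal_per_mile_by_bin))
        ghg = sum(m * e for m, e in zip(miles_by_bin, ghg_per_mile_by_bin))
        person_miles = sum(miles_by_bin) * occupancy
        return fuel, ghg, person_miles

    # Baseline: personal vehicles; AMD case: automated shuttles replacing those trips.
    baseline = fleet_totals([4000, 6000], [0.045, 0.038], [0.42, 0.35], occupancy=1.2)
    amd_case = fleet_totals([1500, 2200], [0.110, 0.095], [1.05, 0.90], occupancy=6.0)

    for label, (fuel, ghg, pmi) in (("baseline", baseline), ("AMD", amd_case)):
        print(f"{label}: {fuel:.0f} gal, {ghg:.0f} kg CO2e, {pmi:.0f} person-miles")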

  8. Estimate of Fuel Consumption and GHG Emission Impact on an Automated Mobility District: Preprint

    SciTech Connect (OSTI)

    Chen, Yuche; Young, Stanley; Gonder, Jeff; Qi, Xuewei

    2015-12-11

    This study estimates the range of fuel and emissions impact of an automated-vehicle (AV) based transit system that services campus-based developments, termed an automated mobility district (AMD). The study develops a framework to quantify the fuel consumption and greenhouse gas (GHG) emission impacts of a transit system comprised of AVs, taking into consideration average vehicle fleet composition, fuel consumption/GHG emission of vehicles within specific speed bins, and the average occupancy of passenger vehicles and transit vehicles. The framework is exercised using a previous mobility analysis of a personal rapid transit (PRT) system, a system which shares many attributes with envisioned AV-based transit systems. Total fuel consumption and GHG emissions with and without an AMD are estimated, providing a range of potential system impacts on sustainability. The results of a previous case study based on a proposed implementation of PRT on the Kansas State University (KSU) campus in Manhattan, Kansas, serve as the basis to estimate personal miles traveled supplanted by an AMD at varying levels of service. The results show that an AMD has the potential to reduce total system fuel consumption and GHG emissions, but the amount is largely dependent on operating and ridership assumptions. The study points to the need to better understand ride-sharing scenarios and calls for future research on sustainability benefits of an AMD system at both vehicle and system levels.

  9. Effects of an Advanced Reactor’s Design, Use of Automation, and Mission on Human Operators

    SciTech Connect (OSTI)

    Jeffrey C. Joe; Johanna H. Oxstrand

    2014-06-01

    The roles, functions, and tasks of the human operator in existing light water nuclear power plants (NPPs) are based on sound nuclear and human factors engineering (HFE) principles, are well defined by the plant’s conduct of operations, and have been validated by years of operating experience. However, advanced NPPs whose engineering designs differ from existing light-water reactors (LWRs) will impose changes on the roles, functions, and tasks of the human operators. The plans to increase the use of automation, reduce staffing levels, and add to the mission of these advanced NPPs will also affect the operator’s roles, functions, and tasks. We assert that these factors, which do not appear to have received as much attention from the design engineers of advanced NPPs as the conceptual design of these reactors, can have significant risk implications for the operators and overall plant safety if not mitigated appropriately. This paper presents a high-level analysis of a specific advanced NPP and how its engineered design, its plan to use greater levels of automation, and its expanded mission have risk-significant implications for operator performance and overall plant safety.

  10. Automated Work Packages Prototype: Initial Design, Development, and Evaluation. Light Water Reactor Sustainability Program

    SciTech Connect (OSTI)

    Oxstrand, Johanna Helene; Ahmad Al Rashdan; Le Blanc, Katya Lee; Bly, Aaron Douglas; Agarwal, Vivek

    2015-07-01

    The goal of the Automated Work Packages (AWP) project is to demonstrate how to enhance work quality, cost management, and nuclear safety through the use of advanced technology. The work described in this report is part of the digital architecture for a highly automated plant project of the technical program plan for advanced instrumentation, information, and control (II&C) systems technologies. This report addresses the DOE Milestone M2LW-15IN0603112: Describe the outcomes of field evaluations/demonstrations of the AWP prototype system and plant surveillance and communication framework requirements at host utilities. A brief background on the need for AWP research is provided; then two human factors field evaluation studies are described. These studies focus on the user experience of conducting a task (in this case, a preventive maintenance task and a surveillance test) while using an AWP system. The remaining part of the report describes an II&C effort to provide real-time status updates to the technician by wireless transfer of equipment indications and a dynamic user interface.

  11. Design and performance of an automated video-based laser beam alignment system

    SciTech Connect (OSTI)

    Rundle, W.J.; Kartz, M.W.; Bliss, E.S.; English, R.E. Jr.; Peterson, R.L.; Thompson, G.R.; Uhlich, D.M.

    1992-07-14

    This paper describes the design and performance of an automated, closed-loop, laser beam alignment system. Its function is to sense a beam alignment error in a laser beam transport system and automatically steer mirrors preceding the sensor location as required to maintain beam alignment. The laser beam is sampled by an optomechanical package which uses video cameras to sense pointing and centering errors. The camera outputs are fed to an image processing module, which includes video digitizers and uses image storage and software to sense the centroid of the image. Signals are sent through a VMEbus to an "optical device controller" (ODC), which drives stepper-motor actuators on mirror mounts preceding the beam-sampling location to return the beam alignment to the prescribed condition. Photodiodes are also used to extend the control bandwidth beyond that which is achievable with video cameras. This system has been operated at LLNL in the Atomic Vapor Laser Isotope Separation (AVLIS) program to maintain the alignment of copper and dye laser beams, the latter to within ±2 µrad in pointing and less than 1 mm in centering. The optomechanical design of the instrumented package, which includes lens, mirror, and video mounts in a rigid housing, the automated control system architecture, and the performance of this equipment are described.

  12. Design and performance of an automated video-based laser beam alignment system

    SciTech Connect (OSTI)

    Rundle, W.J.; Kartz, M.W.; Bliss, E.S.; English, R.E. Jr.; Peterson, R.L.; Thompson, G.R.; Uhlich, D.M.

    1992-07-14

    This paper describes the design and performance of an automated, closed-loop, laser beam alignment system. Its function is to sense a beam alignment error in a laser beam transport system and automatically steer mirrors preceding the sensor location as required to maintain beam alignment. The laser beam is sampled by an optomechanical package which uses video cameras to sense pointing and centering errors. The camera outputs are fed to an image processing module, which includes video digitizers and uses image storage and software to sense the centroid of the image. Signals are sent through a VMEbus to an "optical device controller" (ODC), which drives stepper-motor actuators on mirror mounts preceding the beam-sampling location to return the beam alignment to the prescribed condition. Photodiodes are also used to extend the control bandwidth beyond that which is achievable with video cameras. This system has been operated at LLNL in the Atomic Vapor Laser Isotope Separation (AVLIS) program to maintain the alignment of copper and dye laser beams, the latter to within ±2 µrad in pointing and less than 1 mm in centering. The optomechanical design of the instrumented package, which includes lens, mirror, and video mounts in a rigid housing, the automated control system architecture, and the performance of this equipment are described.

  13. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; Morris, Alan

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  14. Automation and optimization of the design parameters in tactical military pipeline systems. Master's thesis

    SciTech Connect (OSTI)

    Frick, R.M.

    1988-12-01

    Tactical military petroleum pipeline systems will play a vital role in any future conflict due to an increased consumption of petroleum products by our combined Armed Forces. The tactical pipeline must be rapidly constructed and highly mobile to keep pace with the constantly changing battle zone. Currently, the design of these pipeline systems is time-consuming and inefficient, which may cause shortages of fuel and pipeline components at the front lines. Therefore, a need for a computer program that will both automate and optimize the pipeline design process is quite apparent. These design needs are satisfied by developing a software package using the Advanced BASIC (IBM DOS) programming language, made to run on an IBM-compatible personal computer. The program affords the user the options of either finding the optimum pump station locations for a proposed pipeline or calculating the maximum operating pressures for an existing pipeline. By automating the design procedure, a field engineer can vary the pipeline length, diameter, roughness, viscosity, gravity, flow rate, pump station pressure, or terrain profile and see how it affects the other parameters in just a few seconds. The design process was optimized by implementing a weighting scheme based on the volume percent of each fuel in the pipeline at any given time.
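
    A minimal sketch of the hydraulic core such a design tool needs, using the Darcy-Weisbach relation with the Swamee-Jain friction factor to estimate head loss and a simple count of pump stations for a given head per station; the parameter values are illustrative and the thesis program's actual method and units are not reproduced here:

    import math

    def head_loss_m(length_m, diameter_m, flow_m3_s, roughness_m, kin_visc_m2_s):
        """Darcy-Weisbach head loss with the Swamee-Jain explicit friction factor."""
        area = math.pi * diameter_m ** 2 / 4.0
        velocity = flow_m3_s / area
        reynolds = velocity * diameter_m / kin_visc_m2_s
        friction = 0.25 / math.log10(roughness_m / (3.7 * diameter_m)
                                     + 5.74 / reynolds ** 0.9) ** 2
        return friction * (length_m / diameter_m) * velocity ** 2 / (2 * 9.81)

    def pump_stations_needed(total_head_m, head_per_station_m):
        """Smallest whole number of stations that can supply the required head."""
        return math.ceil(total_head_m / head_per_station_m)

    # Example: 80 km of 0.15 m pipe moving 0.02 m^3/s of fuel (illustrative values).
    loss = head_loss_m(80_000, 0.15, 0.02, 4.5e-5, 4.0e-6)
    print(f"head loss ~ {loss:.0f} m, stations needed ~ {pump_stations_needed(loss, 300)}")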

  15. Automated Demand Response Technology Demonstration Project for Small and Medium Commercial Buildings

    SciTech Connect (OSTI)

    Page, Janie; Kiliccote, Sila; Dudley, Junqiao Han; Piette, Mary Ann; Chiu, Albert K.; Kellow, Bashar; Koch, Ed; Lipkin, Paul

    2011-07-01

    Small and medium commercial customers make up about 20-25% of electric peak load in California. With the rollout of smart meters, which enable granular measurement of electricity consumption, to this customer group, the investor-owned utilities will offer dynamic prices as default tariffs by the end of 2011. Pacific Gas and Electric Company, which successfully deployed Automated Demand Response (AutoDR) programs to its large commercial and industrial customers, began investigating the same infrastructure's application to small and medium commercial customers. This project aims to identify available technologies suitable for automating demand response in small and medium commercial buildings; to validate the extent to which those technologies do what they claim to do; and to determine the extent to which customers find the technology useful for DR purposes. Ten sites, enabled by eight vendors, participated in at least four test AutoDR events per site in the summer of 2010. The results showed that while existing technology can reliably receive OpenADR signals and translate them into pre-programmed response strategies, better levels of load shed could likely be obtained than those reported here if the building systems were better understood and the DR response strategies were carefully designed and optimized for each site.

  16. Demand Response and Open Automated Demand Response Opportunities for Data Centers

    SciTech Connect (OSTI)

    Ghatikar, Girish; Piette, Mary Ann; Fujita, Sydny; McKane, Aimee; Dudley, Junqiao Han; Radspieler, Anthony; Mares, K.C.; Shroyer, Dave

    2009-12-30

    This study examines data center characteristics, loads, control systems, and technologies to identify demand response (DR) and automated DR (Open Auto-DR) opportunities and challenges. The study was performed in collaboration with technology experts, industrial partners, and data center facility managers and existing research on commercial and industrial DR was collected and analyzed. The results suggest that data centers, with significant and rapidly growing energy use, have significant DR potential. Because data centers are highly automated, they are excellent candidates for Open Auto-DR. 'Non-mission-critical' data centers are the most likely candidates for early adoption of DR. Data center site infrastructure DR strategies have been well studied for other commercial buildings; however, DR strategies for information technology (IT) infrastructure have not been studied extensively. The largest opportunity for DR or load reduction in data centers is in the use of virtualization to reduce IT equipment energy use, which correspondingly reduces facility cooling loads. DR strategies could also be deployed for data center lighting, and heating, ventilation, and air conditioning. Additional studies and demonstrations are needed to quantify benefits to data centers of participating in DR and to address concerns about DR's possible impact on data center performance or quality of service and equipment life span.

  17. Analysis of Open Automated Demand Response Deployments in California and Guidelines to Transition to Industry Standards

    SciTech Connect (OSTI)

    Ghatikar, Girish; Riess, David; Piette, Mary Ann

    2014-01-02

    This report reviews the Open Automated Demand Response (OpenADR) deployments within the territories serviced by California's investor-owned utilities (IOUs) and the transition from the OpenADR 1.0 specification to the formal standard, OpenADR 2.0. As demand response service providers and customers start adopting OpenADR 2.0, it is necessary to ensure that the existing Automated Demand Response (AutoDR) infrastructure investment continues to be useful and takes advantage of the formal standard and its many benefits. This study focused on OpenADR deployments and systems used by the California IOUs and included a summary of the OpenADR deployment from the U.S. Department of Energy-funded demonstration conducted by the Sacramento Municipal Utility District (SMUD). Lawrence Berkeley National Laboratory collected and analyzed data about OpenADR 1.0 deployments, categorized architectures, developed a data model mapping to understand the technical compatibility of each version, and compared the capabilities and features of the two specifications. The findings, for the first time, provided evidence of the total enabled load shed and average first cost for system enablement in the IOU and SMUD service territories. The OpenADR 2.0a profile specification semantically supports AutoDR system architectures and data propagation with a testing and certification program that promotes interoperability, scaled deployments by multiple vendors, and provides additional features that support future services.

  18. Development of an Automated Decision-Making Tool for Supervisory Control System

    SciTech Connect (OSTI)

    Cetiner, Sacit M.; Muhlheim, Michael David; Flanagan, George F.; Fugate, David L.; Kisner, Roger A.

    2014-09-01

    This technical report was generated as a product of the Supervisory Control for Multi-Modular Small Modular Reactor (SMR) Plants project within the Instrumentation, Control and Human-Machine Interface technology area under the Advanced Small Modular Reactor (AdvSMR) Research and Development Program of the US Department of Energy. The report documents the definition of strategies, functional elements, and the structural architecture of a supervisory control system for multi-modular AdvSMR plants. This research activity advances the state of the art by incorporating real-time, probability-based decision-making into the supervisory control system architectural layers through the introduction of a tiered-plant system approach. The report provides background information on the state of the art of automated decision-making, including a description of existing methodologies. It then presents a description of a generalized decision-making framework, upon which the supervisory control decision-making algorithm is based. The probabilistic portion of automated decision-making is demonstrated through a simple hydraulic loop example.
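
    As a loose illustration of probability-based decision-making of the kind described, the toy sketch below selects among candidate control actions by expected cost; the failure probabilities and costs are invented placeholders, not values from the report's hydraulic loop example.

        # Toy illustration of probability-weighted decision-making of the kind a
        # supervisory controller might perform. Probabilities and costs are
        # illustrative assumptions, not values from the report.
        candidate_actions = {
            "continue_full_power":  {"p_failure": 0.020, "failure_cost": 1000.0, "lost_output": 0.0},
            "reduce_to_80_percent": {"p_failure": 0.005, "failure_cost": 1000.0, "lost_output": 50.0},
            "shut_down_module":     {"p_failure": 0.001, "failure_cost": 1000.0, "lost_output": 400.0},
        }

        def expected_cost(option):
            """Expected cost = probability-weighted failure cost plus certain lost output."""
            return option["p_failure"] * option["failure_cost"] + option["lost_output"]

        best = min(candidate_actions, key=lambda name: expected_cost(candidate_actions[name]))
        for name, option in candidate_actions.items():
            print(f"{name}: expected cost {expected_cost(option):.1f}")
        print("Selected action:", best)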

  19. Automated design synthesis of robotic/human workcells for improved manufacturing system design in hazardous environments

    SciTech Connect (OSTI)

    Williams, Joshua M.

    2012-06-12

    Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. Other factors, such as cost, ergonomics, maintenance, and efficiency, also affect task allocation and other design choices. Handling the tradeoffs of these factors can be complex, and lack of experience can be an issue when trying to determine whether feasible automation/robotics options exist and what they are. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. In regard to concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated for task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements/constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates/decomposition. In the process, criteria guide function allocation of components/operators and help ensure compatibility and feasibility. Through multiple function assignment options and varied function structures, multiple design concepts are created. All of the generated designs are then evaluated based on a number of relevant evaluation criteria: cost, dose, ergonomics, hazards, efficiency, etc. These criteria are computed using physical properties/parameters of each system based on the qualities an engineer would use to make evaluations. Nuclear processes such as oxide conversion and electrorefining are utilized to aid algorithm development and provide test cases for the completed program. Through our approach, we capture design knowledge related to manufacturing and other operations in hazardous environments to enable a computational program to automatically generate and evaluate system design concepts.
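
    The concept-generation step can be pictured as enumerating compatible component assignments for each process function and scoring the resulting designs. The sketch below is a greatly simplified, hypothetical illustration; the functions, components, and scoring weights are assumptions, not the report's manufacturing basis or component database.

        # Hypothetical sketch of function allocation during concept generation: each
        # process function is matched against a small component database and every
        # compatible assignment is expanded into a candidate design.
        import itertools

        FUNCTIONS = ["transfer_material", "inspect_product"]

        COMPONENTS = {
            "gantry_robot":   {"can_do": {"transfer_material"}, "dose_risk": 0.0, "cost": 8.0},
            "human_operator": {"can_do": {"transfer_material", "inspect_product"}, "dose_risk": 5.0, "cost": 2.0},
            "vision_system":  {"can_do": {"inspect_product"}, "dose_risk": 0.0, "cost": 3.0},
        }

        def candidate_designs():
            """Enumerate one component per function, keeping only compatible assignments."""
            options_per_function = [
                [c for c, props in COMPONENTS.items() if f in props["can_do"]] for f in FUNCTIONS
            ]
            return list(itertools.product(*options_per_function))

        def evaluate(design):
            """Simple weighted score over cost and radiological dose risk (lower is better)."""
            cost = sum(COMPONENTS[c]["cost"] for c in design)
            dose = sum(COMPONENTS[c]["dose_risk"] for c in design)
            return cost + 2.0 * dose

        for design in sorted(candidate_designs(), key=evaluate):
            print(design, "score:", evaluate(design))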

  20. ENHANCING SEISMIC CALIBRATION RESEARCH THROUGH SOFTWARE AUTOMATION AND SCIENTIFIC INFORMATION MANAGEMENT

    SciTech Connect (OSTI)

    Ruppert, S D; Dodge, D A; Ganzberger, M D; Hauk, T F; Matzel, E M

    2007-07-06

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Engineering (GNEM R&E) Program at LLNL has made significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Several achievements in schema design, data visualization, synthesis, and analysis were completed this year. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. As data volumes have increased, scientific information management issues such as data quality assessment, ontology mapping, and metadata collection, which are essential for production and validation of derived calibrations, have negatively impacted researchers' abilities to produce products. New information management and analysis tools have resulted in demonstrated gains in efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. Significant software engineering and development efforts have produced an object-oriented framework that provides database-centric coordination between scientific tools, users, and data. Nearly a half billion parameters, signals, measurements, and metadata entries are all stored in a relational database accessed by an extensive object-oriented multi-technology software framework that includes elements of stored procedures, real-time transactional database triggers and constraints, as well as coupled Java and C++ software libraries to handle the information interchange and validation requirements. Significant resources were applied to schema design to enable recording of processing flow and metadata. A core capability is the ability to rapidly select and present subsets of related signals and measurements to the researchers for analysis and distillation, both visually (Java GUI client applications) and in batch mode (instantiation of multi-threaded applications on clusters of processors). Development of efficient data exploitation methods has become increasingly important throughout academic and government seismic research communities to address multi-disciplinary large-scale initiatives. Effective frameworks must also simultaneously provide the researcher with robust measurement and analysis tools that can handle and extract groups of events effectively and isolate the researcher from the now onerous task of database management and metadata collection necessary for validation and error analysis. Sufficient information management robustness is required to avoid loss of metadata that would lead to incorrect calibration results, in addition to increasing the data management burden. Our specific automation methodology and tools improve the researcher's ability to assemble quality-controlled research products for delivery into the NNSA Knowledge Base (KB). The software and scientific automation tasks also provide the robust foundation upon which synergistic and efficient development of GNEM R&E Program seismic calibration research may be built.

  1. Automated High Throughput Protein Crystallization Screening at Nanoliter Scale and Protein Structural Study on Lactate Dehydrogenase

    SciTech Connect (OSTI)

    Fenglei Li

    2006-08-09

    The purposes of our research were: (1) To develop an economical, easy to use, automated, high throughput system for large scale protein crystallization screening. (2) To develop a new protein crystallization method with high screening efficiency, low protein consumption, and complete compatibility with the high throughput screening system. (3) To determine the structure of lactate dehydrogenase complexed with NADH by x-ray protein crystallography to study its inherent structural properties. First, we demonstrated that large scale protein crystallization screening can be performed in a high throughput manner with low cost and easy operation. The overall system integrates liquid dispensing, crystallization, and detection, and serves as a complete solution to protein crystallization screening. The system can dispense protein and multiple different precipitants at nanoliter scale and in parallel. A new detection scheme, native fluorescence, has been developed in this system to form a two-detector system with a visible light detector for detecting protein crystallization screening results. This detection scheme has the capability of eliminating common false positives by distinguishing protein crystals from inorganic crystals in a high throughput and non-destructive manner. The entire system, from liquid dispensing and crystallization to crystal detection, is essentially parallel, high throughput, and compatible with automation. The system was successfully demonstrated by lysozyme crystallization screening. Second, we developed a new crystallization method with high screening efficiency, low protein consumption, and compatibility with automation and high throughput. In this crystallization method, a gas permeable membrane is employed to achieve the gentle evaporation required by protein crystallization. Protein consumption is significantly reduced to nanoliter scale for each condition and thus permits exploring more conditions in a phase diagram for a given amount of protein. In addition, the evaporation rate can be controlled or adjusted in this method during the crystallization process to favor either nucleation or growth for optimizing the crystallization process. The protein crystals obtained by this method were experimentally shown to possess high x-ray diffraction quality. Finally, we crystallized human lactate dehydrogenase 1 (H4) complexed with NADH and determined its structure by x-ray crystallography. The structure of LDH/NADH displays a significantly different structural feature compared with the LDH/NADH/inhibitor ternary complex structure: subunits in the LDH/NADH complex show an open conformation, or two conformations, at the active site, while the subunits in LDH/NADH/inhibitor are all in the closed conformation. Multiple LDH/NADH crystals were obtained and used for x-ray diffraction experiments. Differences in subunit conformation were observed among the structures independently solved from multiple individual LDH/NADH crystals. Structural differences observed among crystals suggest the existence of multiple conformers in solution.

  2. Effects of Levels of Automation for Advanced Small Modular Reactors: Impacts on Performance, Workload, and Situation Awareness

    SciTech Connect (OSTI)

    Johanna Oxstrand; Katya Le Blanc

    2014-07-01

    The Human-Automation Collaboration (HAC) research effort is a part of the Department of Energy (DOE) sponsored Advanced Small Modular Reactor (AdvSMR) program conducted at Idaho National Laboratory (INL). The DOE AdvSMR program focuses on plant design and management, reduction of capital costs as well as plant operations and maintenance (O&M) costs, and factory production cost benefits.

  3. Vision 20/20: Automation and advanced computing in clinical radiation oncology

    SciTech Connect (OSTI)

    Moore, Kevin L.; Moiseenko, Vitali; Kagadis, George C.; McNutt, Todd R.; Mutic, Sasa

    2014-01-15

    This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.

  4. Automated method and system for the alignment and correlation of images from two different modalities

    DOE Patents [OSTI]

    Giger, Maryellen L.; Chen, Chin-Tu; Armato, Samuel; Doi, Kunio

    1999-10-26

    A method and system for the computerized registration of radionuclide images with radiographic images, including generating image data from radiographic and radionuclide images of the thorax. Techniques include contouring the lung regions in each type of chest image, scaling and registration of the contours based on the location of the lung apices, and superimposition after appropriate shifting of the images. Specific applications are given for the automated registration of radionuclide lung scans with chest radiographs. The method in the example given yields a system that spatially registers and correlates digitized chest radiographs with V/Q scans in order to correlate V/Q functional information with the greater structural detail of chest radiographs. Final output could be the computer-determined contours from each type of image superimposed on any of the original images, or superimposition of the radionuclide image data, which contains high activity, onto the radiographic chest image.
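
    A schematic version of the shift-and-superimpose step is sketched below, assuming lung contours are already available as pixel coordinates; it aligns on the lung apex only and omits the scaling described in the patent, so it is an illustration rather than the patented algorithm.

        # Schematic sketch of the shifting step: align a radionuclide image to a
        # radiograph using the offset between detected lung apices. Contour
        # extraction is omitted; the contour coordinates are assumed inputs.
        import numpy as np

        def apex(contour_xy: np.ndarray) -> np.ndarray:
            """Return the most superior contour point (smallest row index)."""
            return contour_xy[np.argmin(contour_xy[:, 0])]

        def register(radionuclide_img, nm_contour_xy, cxr_contour_xy):
            """Shift the radionuclide image so its lung apex lands on the radiograph's apex."""
            shift_rows, shift_cols = (apex(cxr_contour_xy) - apex(nm_contour_xy)).astype(int)
            return np.roll(np.roll(radionuclide_img, shift_rows, axis=0), shift_cols, axis=1)

        # Tiny synthetic example: a 2-pixel vertical offset between the two contours.
        nm_img = np.zeros((10, 10)); nm_img[4:7, 4:7] = 1.0
        nm_contour = np.array([[4, 5], [5, 4], [6, 5]])
        cxr_contour = np.array([[2, 5], [3, 4], [4, 5]])
        print(register(nm_img, nm_contour, cxr_contour).nonzero())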

  5. Automated feature detection and identification in digital point-ordered signals

    DOE Patents [OSTI]

    Oppenlander, Jane E.; Loomis, Kent C.; Brudnoy, David M.; Levy, Arthur J.

    1998-01-01

    A computer-based automated method to detect and identify features in digital point-ordered signals. The method is used for processing of non-destructive test signals, such as eddy current signals obtained from calibration standards. The signals are first automatically processed to remove noise and to determine a baseline. Next, features are detected in the signals using mathematical morphology filters. Finally, verification of the features is made using an expert system of pattern recognition methods and geometric criteria. The method has the advantage that standard features can be located without prior knowledge of the number or sequence of the features. Further advantages are that standard features can be differentiated from irrelevant signal features such as noise, and detected features are automatically verified by parameters extracted from the signals. The method proceeds fully automatically without initial operator set-up and without subjective operator feature judgement.
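
    The morphology-filter idea can be illustrated on a synthetic one-dimensional signal: a grey-scale opening wider than the expected features estimates the baseline, and features are detected where the residual exceeds a threshold. The signal, window size, and threshold below are illustrative assumptions, not parameters from the patent.

        # Sketch of morphology-based feature detection in a point-ordered signal,
        # in the spirit of the patented method but greatly simplified.
        import numpy as np
        from scipy.ndimage import grey_opening

        rng = np.random.default_rng(0)
        n = 500
        signal = 0.05 * rng.standard_normal(n)          # noise floor
        for center in (100, 250, 400):                  # three "calibration standard" features
            signal[center - 5:center + 5] += 1.0

        baseline = grey_opening(signal, size=25)        # opening suppresses narrow peaks -> baseline
        residual = signal - baseline                    # features stand out above the baseline
        feature_mask = residual > 0.5                   # simple amplitude criterion

        # Group contiguous samples into detected features and report their centers.
        edges = np.flatnonzero(np.diff(feature_mask.astype(int)))
        starts, stops = edges[::2] + 1, edges[1::2] + 1
        print([int((a + b) // 2) for a, b in zip(starts, stops)])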

  6. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    SciTech Connect (OSTI)

    JUDI, DAVID; KALYANAPU, ALFRED; MCPHERSON, TIMOTHY; BERSCHEID, ALAN

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in the ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with rapid cross-section generation, the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels, using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
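
    The station-point extraction can be sketched outside of ArcGIS: build a line perpendicular to a centerline segment and sample the DEM at regular offsets. The Python sketch below uses a synthetic DEM and assumed spacing, and omits the COCoA overlap correction.

        # Simplified sketch of CMT-style cross-section extraction: build station points
        # perpendicular to a stream centerline segment and sample a DEM for elevations.
        import numpy as np

        def cross_section(dem, p0, p1, half_width, n_stations):
            """Sample DEM elevations along a line perpendicular to segment p0->p1,
            centered at the segment midpoint. dem is indexed as dem[row, col]."""
            p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
            mid = (p0 + p1) / 2.0
            tangent = (p1 - p0) / np.linalg.norm(p1 - p0)
            normal = np.array([-tangent[1], tangent[0]])      # rotate 90 degrees
            offsets = np.linspace(-half_width, half_width, n_stations)
            points = mid + offsets[:, None] * normal
            rows = np.clip(points[:, 0].round().astype(int), 0, dem.shape[0] - 1)
            cols = np.clip(points[:, 1].round().astype(int), 0, dem.shape[1] - 1)
            return offsets, dem[rows, cols]

        # Synthetic valley: elevation rises with distance from column index 50.
        dem = np.abs(np.arange(100)[None, :] - 50) * np.ones((100, 1)) * 0.5
        stations, elevations = cross_section(dem, p0=(20, 50), p1=(80, 50), half_width=30, n_stations=7)
        print(list(zip(stations.round(1), elevations.round(1))))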

  7. Automation, Control and Modeling of Compound Semiconductor Thin-Film Growth

    SciTech Connect (OSTI)

    Breiland, W.G.; Coltrin, M.E.; Drummond, T.J.; Horn, K.M.; Hou, H.Q.; Klem, J.F.; Tsao, J.Y.

    1999-02-01

    This report documents the results of a laboratory-directed research and development (LDRD) project on control and agile manufacturing in the critical metalorganic chemical vapor deposition (MOCVD) and molecular beam epitaxy (MBE) materials growth processes essential to high-speed microelectronics and optoelectronic components. This effort is founded on a modular and configurable process automation system that serves as a backbone allowing integration of process-specific models and sensors. We have developed and integrated MOCVD- and MBE-specific models in this system, and demonstrated the effectiveness of sensor-based feedback control in improving the accuracy and reproducibility of semiconductor heterostructures. In addition, within this framework we have constructed ''virtual reactor'' models for growth processes, with the goal of greatly shortening the epitaxial growth process development cycle.

  8. Process automation using combinations of process and machine control technologies with application to a continuous dissolver

    SciTech Connect (OSTI)

    Spencer, B.B.; Yarbro, O.O.

    1991-01-01

    Operation of a continuous rotary dissolver, designed to leach uranium-plutonium fuel from chopped sections of reactor fuel cladding using nitric acid, has been automated. The dissolver is a partly continuous, partly batch process that interfaces at both ends with batchwise processes, thereby requiring synchronization of certain operations. Liquid acid is fed and flows through the dissolver continuously, whereas chopped fuel elements are fed to the dissolver in small batches and move through the compartments of the dissolver stagewise. Sequential logic (or machine control) techniques are used to control discrete activities such as the sequencing of isolation valves. Feedback control is used to control acid flowrates and temperatures. Expert systems technology is used for on-line material balances and diagnostics of process operation. 1 ref., 3 figs.

  9. Automated high pressure cell for pressure jump x-ray diffraction

    SciTech Connect (OSTI)

    Brooks, Nicholas J.; Gauthe, Beatrice L. L. E.; Templer, Richard H.; Ces, Oscar; Seddon, John M.; Terrill, Nick J.; Rogers, Sarah E.

    2010-06-15

    A high pressure cell for small and wide-angle x-ray diffraction measurements of soft condensed matter samples has been developed, incorporating a fully automated pressure generating network. The system allows both static and pressure jump measurements in the range of 0.1-500 MPa. Pressure jumps can be performed as quickly as 5 ms, both with increasing and decreasing pressures. Pressure is generated by a motorized high pressure pump, and the system is controlled remotely via a graphical user interface to allow operation by a broad user base, many of whom may have little previous experience of high pressure technology. Samples are loaded through a dedicated port allowing the x-ray windows to remain in place throughout an experiment; this facilitates accurate subtraction of background scattering. The system has been designed specifically for use at beamline I22 at the Diamond Light Source, United Kingdom, and has been fully integrated with the I22 beamline control systems.

  10. Automated high-throughput flow-through real-time diagnostic system

    DOE Patents [OSTI]

    Regan, John Frederick

    2012-10-30

    An automated real-time flow-through system capable of processing multiple samples in an asynchronous, simultaneous, and parallel fashion for nucleic acid extraction and purification, followed by assay assembly, genetic amplification, multiplex detection, analysis, and decontamination. The system is able to hold and access an unlimited number of fluorescent reagents that may be used to screen samples for the presence of specific sequences. The apparatus works by associating extracted and purified sample with a series of reagent plugs that have been formed in a flow channel and delivered to a flow-through real-time amplification detector that has a multiplicity of optical windows, to which the sample-reagent plugs are placed in an operative position. The diagnostic apparatus includes sample multi-position valves, a master sample multi-position valve, a master reagent multi-position valve, reagent multi-position valves, and an optical amplification/detection system.

  11. Optimal Control and Coordination of Connected and Automated Vehicles at Urban Traffic Intersections

    SciTech Connect (OSTI)

    Zhang, Yue J.; Malikopoulos, Andreas; Cassandras, Christos G.

    2016-01-01

    We address the problem of coordinating online a continuous flow of connected and automated vehicles (CAVs) crossing two adjacent intersections in an urban area. We present a decentralized optimal control framework whose solution yields for each vehicle the optimal acceleration/deceleration at any time in the sense of minimizing fuel consumption. The solution, when it exists, allows the vehicles to cross the intersections without the use of traffic lights, without creating congestion on the connecting road, and under the hard safety constraint of collision avoidance. The effectiveness of the proposed solution is validated through simulation considering two intersections located in downtown Boston, and it is shown that coordination of CAVs can significantly reduce both fuel consumption and travel time.
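
    A back-of-envelope version of the coordination idea is sketched below: each vehicle receives an assigned arrival time, computes a smooth (here simply constant) acceleration to meet it, and a minimum headway is checked. This toy example, with invented distances and times, is not the decentralized optimal control law derived in the paper.

        # Toy sketch of schedule-based coordination: constant acceleration to meet an
        # assigned arrival time, plus a simple rear-end safety check. All values assumed.
        def constant_accel_to_meet_schedule(distance_m, speed_ms, arrival_time_s):
            """Constant acceleration covering distance_m in arrival_time_s from speed_ms:
            d = v*t + 0.5*a*t^2."""
            return 2.0 * (distance_m - speed_ms * arrival_time_s) / arrival_time_s**2

        def safe_headway(lead_arrival_s, follow_arrival_s, min_gap_s=1.5):
            """Simple collision-avoidance check at the intersection entry."""
            return follow_arrival_s - lead_arrival_s >= min_gap_s

        lead = {"distance": 200.0, "speed": 12.0, "arrival": 14.0}
        follower = {"distance": 220.0, "speed": 12.0, "arrival": 16.0}
        for veh in (lead, follower):
            veh["accel"] = constant_accel_to_meet_schedule(veh["distance"], veh["speed"], veh["arrival"])
        print("lead accel %.2f m/s^2, follower accel %.2f m/s^2" % (lead["accel"], follower["accel"]))
        print("headway ok:", safe_headway(lead["arrival"], follower["arrival"]))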

  12. Device and method for automated separation of a sample of whole blood into aliquots

    DOE Patents [OSTI]

    Burtis, Carl A.; Johnson, Wayne F.

    1989-01-01

    A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.

  13. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    SciTech Connect (OSTI)

    Thomas, John

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While STPA has proven to be very effective on real systems, no formal structure has been defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  14. Computerized detection of breast cancer on automated breast ultrasound imaging of women with dense breasts

    SciTech Connect (OSTI)

    Drukker, Karen; Sennett, Charlene A.; Giger, Maryellen L.

    2014-01-15

    Purpose: Develop a computer-aided detection method and investigate its feasibility for detection of breast cancer in automated 3D ultrasound images of women with dense breasts. Methods: The HIPAA compliant study involved a dataset of volumetric ultrasound image data (views) acquired with an automated U-Systems SomoV ABUS system for 185 asymptomatic women with dense breasts (BI-RADS Composition/Density 3 or 4). For each patient, three whole-breast views (3D image volumes) per breast were acquired. A total of 52 patients had breast cancer (61 cancers), diagnosed through any follow-up at most 365 days after the original screening mammogram. Thirty-one of these patients (32 cancers) had a screening mammogram with a clinically assigned BI-RADS Assessment Category 1 or 2, i.e., were mammographically negative. All software used for analysis was developed in-house and involved 3 steps: (1) detection of initial tumor candidates, (2) characterization of candidates, and (3) elimination of false-positive candidates. Performance was assessed by calculating the cancer detection sensitivity as a function of the number of marks (detections) per view. Results: At a single mark per view, i.e., six marks per patient, the median detection sensitivity by cancer was 50.0% (16/32) ± 6% for patients with a screening-mammogram-assigned BI-RADS category 1 or 2 (similar to the radiologists' performance sensitivity of 49.9% for this dataset in a prior reader study) and 45.9% (28/61) ± 4% for all patients. Conclusions: Promising detection sensitivity was obtained for the computer on a 3D ultrasound dataset of women with dense breasts at a rate of false-positive detections that may be acceptable for clinical implementation.
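
    The reported metric, detection sensitivity as a function of marks per view, can be illustrated with synthetic candidate scores: keep the top-scoring candidates in each view and count the fraction of cancer-containing views that retain a true detection. The data below are random placeholders, not study data.

        # Illustrative sketch of the performance metric used above, on synthetic data.
        import numpy as np

        rng = np.random.default_rng(1)
        n_views, n_candidates = 40, 12
        scores = rng.random((n_views, n_candidates))              # candidate suspicion scores
        is_cancer = rng.random((n_views, n_candidates)) < 0.05    # truth: candidate hits a cancer

        def sensitivity_at(marks_per_view):
            """Fraction of cancer-containing views in which a cancer candidate survives
            when only the top-scoring `marks_per_view` candidates are marked."""
            top = np.argsort(-scores, axis=1)[:, :marks_per_view]
            hit = np.take_along_axis(is_cancer, top, axis=1).any(axis=1)
            has_cancer = is_cancer.any(axis=1)
            return hit[has_cancer].mean() if has_cancer.any() else float("nan")

        for k in (1, 2, 4):
            print(f"{k} mark(s)/view: sensitivity {sensitivity_at(k):.2f}")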

  15. Signatures and Methods for the Automated Nondestructive Assay of UF6 Cylinders at Uranium Enrichment Plants

    SciTech Connect (OSTI)

    Smith, Leon E.; Mace, Emily K.; Misner, Alex C.; Shaver, Mark W.

    2010-08-08

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire cylinder inventory. These measurements are time-consuming, expensive, and assay only a small fraction of the total cylinder volume. An automated nondestructive assay system capable of providing enrichment measurements over the full volume of the cylinder could improve upon current verification practices in terms of manpower and assay accuracy. Such a station would use sensors that can be operated in an unattended mode at an industrial facility: medium-resolution scintillators for gamma-ray spectroscopy (e.g., NaI(Tl)) and moderated He-3 neutron detectors. This sensor combination allows the exploitation of additional, more-penetrating signatures beyond the traditional 185-keV emission from U-235: neutrons produced from F-19(α,n) reactions (spawned primarily from U-234 alpha emission) and high-energy gamma rays (extending up to 8 MeV) induced by neutrons interacting in the steel cylinder. This paper describes a study of these non-traditional signatures for the purposes of cylinder enrichment verification. The signatures and the radiation sensors designed to collect them are described, as are proof-of-principle cylinder measurements and analyses. Key sources of systematic uncertainty in the non-traditional signatures are discussed, and the potential benefits of utilizing these non-traditional signatures, in concert with an automated form of the traditional 185-keV-based assay, are examined.

  16. SU-E-J-191: Automated Detection of Anatomic Changes in H&N Patients

    SciTech Connect (OSTI)

    Usynin, A; Ramsey, C [Thompson Cancer Survival Center, Knoxville, TN (United States)]

    2014-06-01

    Purpose: To develop a novel statistics-based method for automated detection of anatomical changes using cone-beam CT data. A method was developed that can provide a reliable and automated early warning system that enables just-in-time adaptation of the treatment plan. Methods: Anatomical changes were evaluated by comparing the original treatment planning CT with daily CBCT images taken prior to treatment delivery. The external body contour was computed on a given CT slice and compared against the corresponding contour on the daily CBCT. In contrast to threshold-based techniques, a statistical approach was employed to evaluate the difference between the contours using a given confidence level. The detection tool used the two-sample Kolmogorov-Smirnov (KS) test, a non-parametric technique that compares two samples drawn from arbitrary probability distributions. 11 H&N patients were retrospectively selected from a clinical imaging database with a total of 186 CBCT images. Six patients in the database were confirmed to have anatomic changes during the course of radiotherapy. Five of the H&N patients did not have significant changes. The KS test was applied to the contour data using a sliding window analysis. A confidence level of 0.99 was used to moderate false detections. Results: The algorithm was able to correctly detect anatomical changes in 6 out of 6 patients with excellent spatial accuracy, as early as the 14th elapsed day. The algorithm provided a consistent and accurate delineation of the detected changes. The output of the anatomical change tool is easily interpretable and can be shown overlaid on a 3D rendering of the patient's anatomy. Conclusion: The detection method provides the basis for one of the key components of Adaptive Radiation Therapy. The method uses tools that are readily available in the clinic, including daily CBCT imaging and image co-registration facilities.
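
    A minimal sketch of the sliding-window KS comparison is given below, using synthetic body-contour samples, an assumed window size, and the 0.99 confidence level mentioned above; it illustrates the statistical idea rather than reproducing the clinical tool.

        # Minimal sketch: slide a window along matched contour samples from the planning
        # CT and a daily CBCT and flag windows where a two-sample KS test rejects agreement.
        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(3)
        n = 360                                            # contour samples around the slice
        planning = 100 + rng.normal(0, 1.0, n)             # radial distance of body contour, planning CT
        daily = planning + rng.normal(0, 1.0, n)
        daily[200:260] -= 8.0                              # simulated anatomic change over part of the neck

        window, alpha = 40, 0.01                           # alpha = 1 - 0.99 confidence level
        flagged = []
        for start in range(0, n - window, window // 2):
            stat, p = ks_2samp(planning[start:start + window], daily[start:start + window])
            if p < alpha:
                flagged.append((start, start + window))
        print("windows with detected anatomic change:", flagged)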

  17. Effects of Granular Control on Customers’ Perspective and Behavior with Automated Demand Response Systems

    SciTech Connect (OSTI)

    Schetrit, Oren; Kim, Joyce; Yin, Rongxin; Kiliccote, Sila

    2014-08-01

    Automated demand response (Auto-DR) is expected to close the loop between buildings and the grid by providing machine-to-machine communications to curtail loads without the need for human intervention. Hence, it can offer more reliable and repeatable demand response results to the grid than the manual approach and make demand response participation a hassle-free experience for customers. However, many building operators misunderstand Auto-DR and are afraid of losing control over their building operation. To ease the transition from manual to Auto-DR, we designed and implemented granular control of Auto-DR systems so that building operators could modify or opt out of individual load-shed strategies whenever they wanted. This paper reports the research findings from this effort demonstrated through a field study in large commercial buildings located in New York City. We focused on (1) understanding how providing granular control affects building operators’ perspective on Auto-DR, and (2) evaluating the usefulness of granular control by examining their interaction with the Auto-DR user interface during test events. Through trend log analysis, interviews, and surveys, we found that: (1) the opt-out capability during Auto-DR events can remove the feeling of being forced into load curtailments and increase their willingness to adopt Auto-DR; (2) being able to modify individual load-shed strategies allows flexible Auto-DR participation that meets the building’s changing operational requirements; (3) a clear display of automation strategies helps building operators easily identify how Auto-DR is functioning and can build trust in Auto-DR systems.

  18. A Fully Automated Method for CT-on-Rails-Guided Online Adaptive Planning for Prostate Cancer Intensity Modulated Radiation Therapy

    SciTech Connect (OSTI)

    Li, Xiaoqiang; Quan, Enzhuo M.; Li, Yupeng; Pan, Xiaoning; Zhou, Yin; Wang, Xiaochun; Du, Weiliang; Kudchadker, Rajat J.; Johnson, Jennifer L.; Kuban, Deborah A.; Lee, Andrew K.; Zhang, Xiaodong

    2013-08-01

    Purpose: This study was designed to validate a fully automated adaptive planning (AAP) method which integrates automated recontouring and automated replanning to account for interfractional anatomical changes in prostate cancer patients receiving adaptive intensity modulated radiation therapy (IMRT) based on daily repeated computed tomography (CT)-on-rails images. Methods and Materials: Nine prostate cancer patients treated at our institution were randomly selected. For the AAP method, contours on each repeat CT image were automatically generated by mapping the contours from the simulation CT image using deformable image registration. An in-house automated planning tool incorporated into the Pinnacle treatment planning system was used to generate the original and the adapted IMRT plans. The cumulative dose-volume histograms (DVHs) of the target and critical structures were calculated based on the manual contours for all plans and compared with those of plans generated by the conventional method, that is, shifting the isocenters by aligning the images based on the center of the volume (COV) of the prostate (prostate COV-aligned). Results: The target coverage from our AAP method for every patient was acceptable, while 1 of the 9 patients showed target underdosing from prostate COV-aligned plans. The normalized volume receiving at least 70 Gy (V70) and the mean dose were reduced by 8.9% and 6.4 Gy for the rectum and by 4.3% and 5.3 Gy for the bladder, respectively, for the AAP method compared with the values obtained from prostate COV-aligned plans. Conclusions: The AAP method, which is fully automated, is effective for online replanning to compensate for target dose deficits and critical organ overdosing caused by interfractional anatomical changes in prostate cancer.

  19. Automated Algorithms for Quantum-Level Accuracy in Atomistic Simulations: LDRD Final Report.

    SciTech Connect (OSTI)

    Thompson, Aidan P.; Schultz, Peter A.; Crozier, Paul; Moore, Stan Gerald; Swiler, Laura Painton; Stephens, John Adam; Trott, Christian Robert; Foiles, Stephen M.; Tucker, Garritt J.

    2014-09-01

    This report summarizes the result of LDRD project 12-0395, titled "Automated Algorithms for Quantum-level Accuracy in Atomistic Simulations." During the course of this LDRD, we have developed an interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected on to a basis of hyperspherical harmonics in four dimensions. The SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. Global optimization methods in the DAKOTA software package are used to seek out good choices of hyperparameters that define the overall structure of the SNAP potential. FitSnap.py, a Python-based software package interfacing to both LAMMPS and DAKOTA, is used to formulate the linear regression problem, solve it, and analyze the accuracy of the resultant SNAP potential. We describe a SNAP potential for tantalum that accurately reproduces a variety of solid and liquid properties. Most significantly, in contrast to existing tantalum potentials, SNAP correctly predicts the Peierls barrier for screw dislocation motion. We also present results from SNAP potentials generated for indium phosphide (InP) and silica (SiO2). We describe efficient algorithms for calculating SNAP forces and energies in molecular dynamics simulations using massively parallel computers and advanced processor architectures. Finally, we briefly describe the MSM method for efficient calculation of electrostatic interactions on massively parallel computers.
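
    The fitting step, weighted linear least squares of reference energies against per-configuration descriptor sums, can be sketched in a few lines. The random descriptors, energies, and weights below are placeholders standing in for bispectrum components and QM training data.

        # Conceptual sketch of the regression step in a SNAP-style fit, on synthetic data.
        import numpy as np

        rng = np.random.default_rng(7)
        n_configs, n_descriptors = 200, 30
        A = rng.standard_normal((n_configs, n_descriptors))   # summed descriptors per configuration
        beta_true = rng.standard_normal(n_descriptors)
        energies = A @ beta_true + 0.01 * rng.standard_normal(n_configs)
        weights = rng.uniform(0.5, 2.0, n_configs)            # per-configuration fitting weights

        # Solve the weighted least-squares problem via row scaling and ordinary lstsq.
        sqrt_w = np.sqrt(weights)[:, None]
        coeffs, *_ = np.linalg.lstsq(sqrt_w * A, sqrt_w[:, 0] * energies, rcond=None)
        print("max coefficient error: %.3e" % np.max(np.abs(coeffs - beta_true)))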

  20. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    SciTech Connect (OSTI)

    Ruebel, Oliver

    2009-12-01

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework, MATLAB, and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.

  1. Automated Chemical Analysis of Internally Mixed Aerosol Particles Using X-ray Spectromicroscopy at the Carbon K-Edge

    SciTech Connect (OSTI)

    Moffet, Ryan C.; Henn, Tobias R.; Laskin, Alexander; Gilles, Mary K.

    2010-10-01

    We have developed an automated data analysis method for atmospheric particles using scanning transmission X-ray microscopy coupled with near edge X-ray fine structure spectroscopy (STXM/NEXAFS). This method is applied to complex internally mixed submicron particles containing organic and inorganic material. Several algorithms were developed to exploit NEXAFS spectral features in the energy range from 278-320 eV for quantitative mapping of the spatial distribution of elemental carbon, organic carbon, potassium, and non-carbonaceous elements in particles of mixed composition. This energy range encompasses the carbon K-edge and potassium L2 and L3 edges. STXM/NEXAFS maps of different chemical components were complemented with a subsequent analysis using elemental maps obtained by scanning electron microscopy coupled with energy dispersive X-ray analysis (SEM/EDX). We demonstrate application of the automated mapping algorithms for data analysis and the statistical classification of particles.

  2. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    SciTech Connect (OSTI)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States)]; Hart, A. John, E-mail: ajhart@mit.edu [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)]

    2013-11-15

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to their relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and performance limitations such as heating and cooling rates restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput and the successful and efficient translation of materials processing knowledge to production-scale systems rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called Robofurnace. Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  3. Assessing the Energy Impact of Connected and Automated Vehicle (CAV) Technologies (Presentation), NREL (National Renewable Energy Laboratory)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Assessing the Energy Impact of Connected and Automated Vehicle (CAV) Technologies. SAE 2016 Government/Industry Meeting, January 21, 2016. Jeff Gonder, Yuche Chen, Mike Lammert, Eric Wood; Transportation and Hydrogen Systems Center (THSC), National Renewable Energy Laboratory (NREL). NREL/PR-5400-65743. Presentation outline: overall energy impact assessment; example feature-level impacts; real-world/off-cycle benefit calculation; on-going work by DOE and its national labs.

  4. Context-based automated defect classification system using multiple morphological masks

    DOE Patents [OSTI]

    Gleason, Shaun S.; Hunt, Martin A.; Sari-Sarraf, Hamed

    2002-01-01

    Detection of defects during the fabrication of semiconductor wafers is largely automated, but the classification of those defects is still performed manually by technicians. This invention includes novel digital image analysis techniques that generate unique feature vector descriptions of semiconductor defects as well as classifiers that use these descriptions to automatically categorize the defects into one of a set of pre-defined classes. Feature extraction techniques based on multiple-focus images, multiple-defect mask images, and segmented semiconductor wafer images are used to create unique feature-based descriptions of the semiconductor defects. These feature-based defect descriptions are subsequently classified by a defect classifier into categories that depend on defect characteristics and defect contextual information, that is, the semiconductor process layer(s) with which the defect comes in contact. At the heart of the system is a knowledge database that stores and distributes historical semiconductor wafer and defect data to guide the feature extraction and classification processes. In summary, this invention takes as its input a set of images containing semiconductor defect information, and generates as its output a classification for the defect that describes not only the defect itself, but also the location of that defect with respect to the semiconductor process layers.

  5. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    SciTech Connect (OSTI)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
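
    The simplest of the four methods, the output ratio approach, can be sketched directly: scale the simulated consumption by the measured-to-simulated ratio and apply the same factor to a retrofit run. The monthly values below are synthetic placeholders, not BEopt/DOE-2.2 output.

        # Sketch of an output-ratio calibration on synthetic monthly utility data.
        measured_kwh  = [1210, 1050, 980, 870, 990, 1320, 1510, 1480, 1260, 1010, 980, 1150]
        simulated_kwh = [1100, 1000, 900, 800, 950, 1250, 1400, 1390, 1180, 950, 920, 1080]

        ratio = sum(measured_kwh) / sum(simulated_kwh)              # single annual calibration factor
        calibrated = [ratio * s for s in simulated_kwh]

        retrofit_simulated_kwh = [0.85 * s for s in simulated_kwh]  # e.g., a hypothetical lighting retrofit run
        predicted_savings = sum(calibrated) - ratio * sum(retrofit_simulated_kwh)
        print(f"calibration ratio {ratio:.3f}, predicted annual savings {predicted_savings:.0f} kWh")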

  6. Automated data extraction from in situ protein stable isotope probing studies

    SciTech Connect (OSTI)

    Slysz, Gordon W.; Steinke, Laurey A.; Ward, David M.; Klatt, Christian G.; Clauss, Therese RW; Purvine, Samuel O.; Payne, Samuel H.; Anderson, Gordon A.; Smith, Richard D.; Lipton, Mary S.

    2014-01-27

    Protein stable isotope probing (protein-SIP) has strong potential for revealing key metabolizing taxa in complex microbial communities. While most protein-SIP work to date has been performed under controlled laboratory conditions to allow extensive isotope labeling of the target organism, a key application will be in situ studies of microbial communities under conditions that result in small degrees of partial labeling. One hurdle restricting large scale in situ protein-SIP studies is the lack of algorithms and software for automated data processing of the massive data sets resulting from such studies. In response, we developed Stable Isotope Probing Protein Extraction Resources software (SIPPER) and applied it for large scale extraction and visualization of data from short term (3 h) protein-SIP experiments performed in situ on Yellowstone phototrophic bacterial mats. Several metrics incorporated into the software allow it to support exhaustive analysis of the complex composite isotopic envelope observed as a result of low amounts of partial label incorporation. SIPPER also enables the detection of labeled molecular species without the need for any prior identification.

  7. Grid-Competitive Residential and Commercial Fully Automated PV Systems Technology: Final technical Report, August 2011

    SciTech Connect (OSTI)

    Brown, Katie E.; Cousins, Peter; Culligan, Matt; Botkin, Jonathan; DeGraaff, David; Bunea, Gabriella; Rose, Douglas; Bourne, Ben; Koehler, Oliver

    2011-08-26

    Under DOE's Technology Pathway Partnership program, SunPower Corporation developed turn-key, high-efficiency residential and commercial systems that are cost effective. Key program objectives include a reduction in LCOE values to 9-12 cents/kWh and 13-18 cents/kWh respectively for the commercial and residential markets. Target LCOE values for the commercial ground, commercial roof, and residential markets are 10, 11, and 13 cents/kWh. For this effort, SunPower collaborated with a variety of suppliers and partners to complete the tasks below. Subcontractors included: Solaicx, SiGen, Ribbon Technology, Dow Corning, Xantrex, Tigo Energy, and Solar Bridge. SunPower's TPP addressed nearly the complete PV value chain: from ingot growth through system deployment. Throughout the award period of performance, SunPower has made progress toward achieving these reduced costs through the development of 20%+ efficient modules, increased cell efficiency through the understanding of loss mechanisms and improved manufacturing technologies, novel module development, automated design tools and techniques, and reduced system development and installation time. Based on an LCOE assessment using NREL's Solar Advisor Model, SunPower achieved the 2010 target range, as well as progress toward 2015 targets.

  8. Development and Demonstration of the Open Automated Demand Response Standard for the Residential Sector

    SciTech Connect (OSTI)

    Herter, Karen; Rasin, Josh; Perry, Tim

    2009-11-30

    The goal of this study was to demonstrate a demand response system that can signal nearly every customer in all sectors through the integration of two widely available and non-proprietary communications technologies--Open Automated Demand Response (OpenADR) over Internet protocol and Utility Messaging Channel (UMC) over FM radio. The outcomes of this project were as follows: (1) a software bridge to allow translation of pricing signals from OpenADR to UMC; and (2) a portable demonstration unit with an Internet-connected notebook computer, a portfolio of DR-enabling technologies, and a model home. The demonstration unit provides visitors the opportunity to send electricity-pricing information over the Internet (through OpenADR and UMC) and then watch as the model appliances and lighting respond to the signals. The integration of OpenADR and UMC completed and demonstrated in this study enables utilities to send hourly or sub-hourly electricity pricing information simultaneously to the residential, commercial and industrial sectors.

  9. Automated whole-genome multiple alignment of rat, mouse, and human

    SciTech Connect (OSTI)

    Brudno, Michael; Poliakov, Alexander; Salamov, Asaf; Cooper, Gregory M.; Sidow, Arend; Rubin, Edward M.; Solovyev, Victor; Batzoglou, Serafim; Dubchak, Inna

    2004-07-04

    We have built a whole genome multiple alignment of the three currently available mammalian genomes using a fully automated pipeline which combines the local/global approach of the Berkeley Genome Pipeline and the LAGAN program. The strategy is based on progressive alignment, and consists of two main steps: (1) alignment of the mouse and rat genomes; and (2) alignment of human to either the mouse-rat alignments from step 1, or the remaining unaligned mouse and rat sequences. The resulting alignments demonstrate high sensitivity, with 87% of all human gene-coding areas aligned in both mouse and rat. The specificity is also high: <7% of the rat contigs are aligned to multiple places in human and 97% of all alignments with human sequence > 100kb agree with a three-way synteny map built independently using predicted exons in the three genomes. At the nucleotide level <1% of the rat nucleotides are mapped to multiple places in the human sequence in the alignment; and 96.5% of human nucleotides within all alignments agree with the synteny map. The alignments are publicly available online, with visualization through the novel Multi-VISTA browser that we also present.

  10. A Semi-automated Commissioning Tool for VAV Air Handling Units:Functional Test Analyzer

    SciTech Connect (OSTI)

    Haves, Philip; Kim, Moosung; Najafi, Massieh; Xu, Peng

    2007-01-01

    A software tool that automates the analysis of functional tests for air-handling units is described. The tool compares the performance observed during manual tests with the performance predicted by simple models of the components under test that are configured using design information and catalog data. Significant differences between observed and expected performance indicate the presence of faults. Fault diagnosis is performed by analyzing the variation of these differences with operating points using expert rules and fuzzy inferencing. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display compares the measured and expected performance, highlighting significant differences that indicate the presence of faults. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retrocommissioning, as well as by building owners and operators conducting routine tests to check the performance of their HVAC systems. This paper describes the input data requirements of the tool, the software structure, and the graphical interface, and summarizes the development and testing process used.
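
    The comparison step can be pictured as predicting expected performance from a simple component model and flagging large observed-minus-expected differences. The sketch below uses an invented cooling-coil model, threshold, and test data, and omits the expert-rule and fuzzy-inference diagnosis stage.

        # Simplified sketch of the observed-vs-expected comparison in a functional test.
        # The model form, threshold, and test data are illustrative assumptions.
        def expected_supply_temp(mixed_air_f, valve_cmd_pct, design_delta_f=20.0):
            """Very simple cooling-coil model: full valve command gives design_delta_f of cooling."""
            return mixed_air_f - design_delta_f * (valve_cmd_pct / 100.0)

        test_points = [  # (mixed air temp F, cooling valve %, measured supply temp F)
            (75.0, 0, 74.6),
            (75.0, 50, 68.0),
            (75.0, 100, 62.5),   # expected about 55 F -> suspicious
        ]

        for mat, valve, measured in test_points:
            expected = expected_supply_temp(mat, valve)
            status = "possible fault" if abs(measured - expected) > 3.0 else "ok"
            print(f"valve {valve:3d}%: expected {expected:.1f} F, measured {measured:.1f} F -> {status}")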

  11. Rnnotator: an automated de novo transcriptome assembly pipeline from stranded RNA-Seq reads

    SciTech Connect (OSTI)

    Martin, Jeffrey; Bruno, Vincent M.; Fang, Zhide; Meng, Xiandong; Blow, Matthew; Zhang, Tao; Sherlock, Gavin; Snyder, Michael; Wang, Zhong

    2010-11-19

    Background: Comprehensive annotation and quantification of transcriptomes are outstanding problems in functional genomics. While high throughput mRNA sequencing (RNA-Seq) has emerged as a powerful tool for addressing these problems, its success is dependent upon the availability and quality of reference genome sequences, thus limiting the organisms to which it can be applied. Results: Here, we describe Rnnotator, an automated software pipeline that generates transcript models by de novo assembly of RNA-Seq data without the need for a reference genome. We have applied the Rnnotator assembly pipeline to two yeast transcriptomes and compared the results to the reference gene catalogs of these organisms. The contigs produced by Rnnotator are highly accurate (95%) and reconstruct full-length genes for the majority of the existing gene models (54.3%). Furthermore, our analyses revealed many novel transcribed regions that are absent from well annotated genomes, suggesting Rnnotator serves as a complementary approach to analysis based on a reference genome for comprehensive transcriptomics. Conclusions: These results demonstrate that the Rnnotator pipeline is able to reconstruct full-length transcripts in the absence of a complete reference genome.

  12. Automated Nondestructive Assay of UF6 Cylinders: Detector Characterization and Initial Measurements

    SciTech Connect (OSTI)

    Mace, Emily K.; Smith, Leon E.

    2011-10-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders assumed to be representative of the facility's entire cylinder inventory. These measurements are time-consuming and assay only a small fraction of the total cylinder volume. An automated nondestructive assay system capable of providing enrichment measurements over the full volume of the cylinder could improve upon current verification practices in terms of manpower and assay accuracy. Pacific Northwest National Laboratory is developing an Integrated Cylinder Verification System (ICVS) intended for this purpose and has developed a field prototype of the nondestructive assay (NDA) components of an ICVS. The nondestructive assay methods would combine the 'traditional' enrichment-meter signature (i.e. 186-keV emission from 235U) as well as 'non-traditional' high-energy photon signatures derived from neutrons produced primarily by 19F(α,n) reactions. This paper describes the design, calibration and characterization of the NaI(Tl) and LaBr3(Ce) spectrometers utilized in the field prototype. An overview of a recent field measurement campaign is then provided, supported by example gamma-ray pulse-height spectra collected on cylinders of known enrichment.

  13. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    SciTech Connect (OSTI)

    Robertson, Joseph; Polly, Ben; Collis, Jon

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
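
    As an illustration of the simplest of the four methods, the following is a minimal sketch of an output-ratio calibration, assuming made-up monthly simulated and billed energy values; the study itself applies its methods to BEopt/DOE-2.2 models and synthetic utility data:

        # Simple output-ratio calibration sketch (illustrative values only).
        def output_ratio_calibrate(simulated_monthly_kwh, billed_monthly_kwh):
            """Scale simulated monthly energy so its annual total matches the utility bills."""
            ratio = sum(billed_monthly_kwh) / sum(simulated_monthly_kwh)
            return ratio, [kwh * ratio for kwh in simulated_monthly_kwh]

        simulated = [850, 780, 700, 650, 900, 1400, 1700, 1650, 1200, 800, 700, 820]
        billed    = [910, 820, 760, 700, 960, 1500, 1830, 1790, 1280, 860, 740, 880]
        ratio, calibrated = output_ratio_calibrate(simulated, billed)
        print(f"Calibration ratio: {ratio:.3f}")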

  14. Performance of Integrated Systems of Automated Roller Shade Systems and Daylight Responsive Dimming Systems

    SciTech Connect (OSTI)

    Park, Byoung-Chul; Choi, An-Seop; Jeong, Jae-Weon; Lee, Eleanor S.

    2010-07-08

    Daylight responsive dimming systems have been used in few buildings to date because their reliability requires improvement. The key underlying factor contributing to poor performance is the variability of the ratio of the photosensor signal to daylight workplane illuminance with sun position, sky condition, and fenestration condition. Therefore, this paper describes integrated systems combining automated roller shade systems and daylight responsive dimming systems with an improved closed-loop proportional control algorithm, and the relative performance of the integrated systems and the single systems. The concept of the improved closed-loop proportional control algorithm for the integrated systems is to predict the varying correlation of photosensor signal to daylight workplane illuminance according to roller shade height and sky condition, improving the system accuracy. In this study, the performance of the integrated systems with two improved closed-loop proportional control algorithms was compared with that of the current (modified) closed-loop proportional control algorithm. In the results, the average maintenance percentage of the target illuminance, the average discrepancies from the target illuminance, and the average time under 90% of the target illuminance for the integrated systems improved significantly in comparison with the current closed-loop proportional control algorithm for daylight responsive dimming systems used as a single system.
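
    A minimal sketch of one closed-loop proportional dimming step, assuming a hypothetical linear correlation between photosensor signal and daylight workplane illuminance that varies with roller shade height; the paper's improved algorithm predicts this correlation from shade height and sky condition:

        # Illustrative dimming step; the gain model and constants are assumptions.
        def daylight_workplane_lux(photosensor_signal, shade_height_fraction):
            # Assumed correlation: the lower the shade, the less workplane daylight per unit signal.
            gain = 0.6 + 0.8 * shade_height_fraction   # hypothetical calibration
            return gain * photosensor_signal

        def dimming_level(photosensor_signal, shade_height_fraction,
                          target_lux=500.0, electric_max_lux=600.0):
            """Return the electric-light output fraction needed to hold the target illuminance."""
            daylight = daylight_workplane_lux(photosensor_signal, shade_height_fraction)
            needed = max(0.0, target_lux - daylight)
            return min(1.0, needed / electric_max_lux)

        print(dimming_level(photosensor_signal=300.0, shade_height_fraction=0.75))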

  15. New automated inventory/material accounting system (AIMAS) version for former Soviet Union countries

    SciTech Connect (OSTI)

    Kuzminski, Jozef; Ewing, Tom; Sakunov, Igor; Drapey, Sergey; Nations, Jim

    2009-01-01

    AIMAS (Automated Inventory/Material Accounting System) is a PC-based application for site-level nuclear material accountancy that was originally developed in the late 1990s as a part of the U.S. Department of Energy Assistance Program to Ukraine. Designed to be flexible and secure, and to place minimal demands on computing infrastructure, it was originally developed to run in early Windows operating system (OS) environments such as Windows 98 and Windows 3.1. The development, support, and maintenance of AIMAS were transferred to Ukraine in 2002. Because it is highly flexible and can be configured to meet diverse end-users' needs, the software has been used at several facilities in Ukraine. Added functionality is planned to support nuclear installations in the Republic of Kazakhstan and Uzbekistan as well. An improved 32-bit version of AIMAS has recently been developed by AVIS, the Ukrainian developer, to operate effectively on modern PCs running the latest Windows OS. In the paper we discuss the status of AIMAS and plans for new functions, and describe the strategy for addressing a sustainable software life-cycle while meeting user requirements in multiple FSU countries.

  16. HEAVY OIL PROCESS MONITOR: AUTOMATED ON-COLUMN ASPHALTENE PRECIPITATION AND RE-DISSOLUTION

    SciTech Connect (OSTI)

    John F. Schabron; Joseph F. Rovani Jr; Mark Sanderson

    2006-06-01

    About 37-50% (w/w) of the heptane asphaltenes from unpyrolyzed residua dissolve in cyclohexane. As pyrolysis progresses, this number decreases to below 15% as coke and toluene-insoluble pre-coke materials appear. This solubility measurement can be used after coke begins to form, unlike the flocculation titration, which cannot be applied to multi-phase systems. Currently, the procedure for the isolation of heptane asphaltenes and the determination of the amount of asphaltenes soluble in cyclohexane spans three days. A more rapid method to measure asphaltene solubility was explored using a novel on-column asphaltene precipitation and re-dissolution technique. This was automated using high performance liquid chromatography (HPLC) equipment with a step gradient sequence using the solvents heptane, cyclohexane, and toluene:methanol (98:2). Results for four series of original and pyrolyzed residua were compared with data from the gravimetric method. The measurement time was reduced from three days to forty minutes. The separation was expanded further with the use of four solvents: heptane, cyclohexane, toluene, and cyclohexanone or methylene chloride. This provides a fourth peak which represents the most polar components in the oil.

  17. Automated real-time detection of defects during machining of ceramics

    DOE Patents [OSTI]

    Ellingson, W.A.; Sun, J.

    1997-11-18

    Apparatus for the automated real-time detection and classification of defects during the machining of ceramic components employs an elastic optical scattering technique using polarized laser light. A ceramic specimen is continuously moved while being machined. Polarized laser light is directed onto the ceramic specimen surface at a fixed position just aft of the machining tool for examination of the newly machined surface. Any foreign material near the location of the laser light on the ceramic specimen is cleared by an air blast. As the specimen is moved, its surface is continuously scanned by the polarized laser light beam to provide a two-dimensional image presented in real-time on a video display unit, with the motion of the ceramic specimen synchronized with the data acquisition speed. By storing known "feature masks" representing various surface and sub-surface defects and comparing measured defects with the stored feature masks, detected defects may be automatically characterized. Using multiple detectors, various types of defects may be detected and classified. 14 figs.

  18. Automated real-time detection of defects during machining of ceramics

    DOE Patents [OSTI]

    Ellingson, William A.; Sun, Jiangang

    1997-01-01

    Apparatus for the automated real-time detection and classification of defects during the machining of ceramic components employs an elastic optical scattering technique using polarized laser light. A ceramic specimen is continuously moved while being machined. Polarized laser light is directed onto the ceramic specimen surface at a fixed position just aft of the machining tool for examination of the newly machined surface. Any foreign material near the location of the laser light on the ceramic specimen is cleared by an air blast. As the specimen is moved, its surface is continuously scanned by the polarized laser light beam to provide a two-dimensional image presented in real-time on a video display unit, with the motion of the ceramic specimen synchronized with the data acquisition speed. By storing known "feature masks" representing various surface and sub-surface defects and comparing measured defects with the stored feature masks, detected defects may be automatically characterized. Using multiple detectors, various types of defects may be detected and classified.

  19. Application of an automated wireless structural monitoring system for long-span suspension bridges

    SciTech Connect (OSTI)

    Kurata, M.; Lynch, J. P.; Linden, G. W. van der; Hipley, P.; Sheng, L.-H.

    2011-06-23

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  20. Electricity and technical progress: The bituminous coal mining industry, mechanization to automation

    SciTech Connect (OSTI)

    Devine, W.D. Jr.

    1987-07-01

    Development and use of electric mobile machinery facilitated the mechanization of underground bituminous coal mining and has played a lesser but important role in the growth of surface mining. Electricity has been central to the rise of mechanically integrated mining, both underground (after 1950) and on the surface (recently). Increasing labor productivity in coal mining and decreasing total energy use per ton of coal mined are associated with penetration of new electric technology through at least 1967. Productivity declined and energy intensity increased during the 1970s due in part to government regulations. Recent productivity gains stem partly from new technology that permits automation of certain mining operations. On most big electric excavating machines, a pair of large alternating current (ac) motors operate continuously at full speed. These drive direct current (dc) generators that energize dc motors, each matched to the desired power and speed range of a particular machine function. Direct-current motors provide high torque at low speeds, thus reducing the amount of gearing required; each crawler is independently propelled forward or backward by its own variable-speed dc motors. The principal advantages of electric power are that mechanical power-transmission systems - shafts, gears, etc. - are eliminated or greatly simplified. Reliability is higher, lifetime is longer, and maintenance is much simpler with electric power than with diesel power, and the spare parts inventory is considerably smaller. 100 refs., 11 figs., 12 tabs.

  1. Optimizing RF gun cavity geometry within an automated injector design system

    SciTech Connect (OSTI)

    Alicia Hofler, Pavel Evtushenko

    2011-03-28

    RF guns play an integral role in the success of several light sources around the world, and properly designed and optimized cw superconducting RF (SRF) guns can provide a path to higher average brightness. As the need for these guns grows, it is important to have automated optimization software tools that vary the geometry of the gun cavity as part of the injector design process. This will allow designers to improve existing designs for present installations, extend the utility of these guns to other applications, and develop new designs. An evolutionary algorithm (EA) based system can provide this capability because EAs can search in parallel a large parameter space (often non-linear) and in a relatively short time identify promising regions of the space for more careful consideration. The injector designer can then evaluate more cavity design parameters during the injector optimization process against the beam performance requirements of the injector. This paper will describe an extension to the APISA software that allows the cavity geometry to be modified as part of the injector optimization and provide examples of its application to existing RF and SRF gun designs.
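
    A minimal sketch of an evolutionary search over two cavity geometry parameters, assuming a made-up surrogate objective in place of the cavity field and beam-dynamics evaluations that the APISA extension performs:

        # Illustrative evolutionary algorithm; geometry parameters, bounds, and objective are assumptions.
        import random

        def surrogate_objective(geometry):
            # Hypothetical figure of merit; in practice this would come from a cavity
            # field solver plus beam-dynamics tracking of the injector.
            iris_radius, cell_length = geometry
            return (iris_radius - 3.5) ** 2 + (cell_length - 10.2) ** 2

        def evolve(bounds, population=20, generations=50, sigma=0.2):
            pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(population)]
            for _ in range(generations):
                pop.sort(key=surrogate_objective)
                parents = pop[: population // 4]          # keep the fittest quarter
                pop = parents + [
                    [min(hi, max(lo, g + random.gauss(0.0, sigma)))
                     for g, (lo, hi) in zip(random.choice(parents), bounds)]
                    for _ in range(population - len(parents))
                ]
            return min(pop, key=surrogate_objective)

        print(evolve(bounds=[(2.0, 5.0), (8.0, 12.0)]))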

  2. SAE2.py: a Python script to automate parameter studies using SCREAMER with application to magnetic switching on Z.

    SciTech Connect (OSTI)

    Orndorff-Plunkett, Franklin

    2011-05-01

    The SCREAMER simulation code is widely used at Sandia National Laboratories for designing and simulating pulsed power accelerator experiments on super power accelerators. A preliminary parameter study of Z with a magnetic switching retrofit illustrates the utility of the automating script for optimizing pulsed power designs. SCREAMER is a circuit based code commonly used in pulsed-power design and requires numerous iterations to find optimal configurations. System optimization using simulations like SCREAMER is by nature inefficient and incomplete when done manually. This is especially the case when the system has many interactive elements whose emergent effects may be unforeseeable and complicated. For increased completeness, efficiency and robustness, investigators should probe a suitably confined parameter space using deterministic, genetic, cultural, ant-colony algorithms or other computational intelligence methods. I have developed SAE2 - a user-friendly, deterministic script that automates the search for optima of pulsed-power designs with SCREAMER. This manual demonstrates how to make input decks for SAE2 and optimize any pulsed-power design that can be modeled using SCREAMER. Application of SAE2 to magnetic switching on a model of a potential Z refurbishment illustrates the power of SAE2. With respect to the manual optimization, the automated optimization resulted in 5% greater peak current (10% greater energy) and a 25% increase in safety factor for the most highly stressed element.
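
    A minimal sketch of a deterministic parameter sweep in the spirit of SAE2, assuming a hypothetical run_screamer() wrapper and a toy circuit response; the real script writes SCREAMER input decks, runs the code, and parses the outputs:

        # Illustrative grid sweep; the parameters and the response model are assumptions.
        import itertools

        def run_screamer(switch_turns, core_area_m2):
            # Placeholder for "write input deck, execute SCREAMER, parse peak current".
            return 1.0e6 * core_area_m2 * switch_turns / (1.0 + 0.05 * switch_turns ** 2)

        def sweep(turns_values, area_values):
            best = None
            for turns, area in itertools.product(turns_values, area_values):
                peak_current = run_screamer(turns, area)
                if best is None or peak_current > best[0]:
                    best = (peak_current, turns, area)
            return best

        peak, turns, area = sweep(turns_values=range(1, 11), area_values=[0.01, 0.02, 0.04])
        print(f"Best peak current {peak:.3e} A at turns={turns}, core area={area} m^2")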

  3. Automating Natural Disaster Impact Analysis: An Open Resource to Visually Estimate a Hurricane's Impact on the Electric Grid

    SciTech Connect (OSTI)

    Barker, Alan M; Freer, Eva B; Omitaomu, Olufemi A; Fernandez, Steven J; Chinthavali, Supriya; Kodysh, Jeffrey B

    2013-01-01

    An ORNL team working on the Energy Awareness and Resiliency Standardized Services (EARSS) project developed a fully automated procedure to take wind speed and location estimates provided by hurricane forecasters and provide a geospatial estimate of the impact on the electric grid in terms of outage areas and projected duration of outages. Hurricane Sandy was one of the worst US storms ever, with reported injuries and deaths, millions of people without power for several days, and billions of dollars in economic impact. Hurricane advisories were released for Sandy from October 22 through 31, 2012. The fact that the geoprocessing was automated was significant: there were 64 advisories for Sandy. Manual analysis typically takes about one hour for each advisory. During a storm event, advisories are released every two to three hours around the clock, and an analyst capable of performing the manual analysis has other tasks they would like to focus on. Initial predictions of a big impact and landfall usually occur three days in advance, so time is of the essence to prepare for utility repair. Automated processing developed at ORNL allowed this analysis to be completed and made publicly available within minutes of each new advisory being released.

  4. Automated Thermal Image Processing for Detection and Classification of Birds and Bats - FY2012 Annual Report

    SciTech Connect (OSTI)

    Duberstein, Corey A.; Matzner, Shari; Cullinan, Valerie I.; Virden, Daniel J.; Myers, Joshua R.; Maxwell, Adam R.

    2012-09-01

    Surveying wildlife at risk from offshore wind energy development is difficult and expensive. Infrared video can be used to record birds and bats that pass through the camera view, but it is also time consuming and expensive to review video and determine what was recorded. We proposed to conduct algorithm and software development to identify and to differentiate thermally detected targets of interest that would allow automated processing of thermal image data to enumerate birds, bats, and insects. During FY2012 we developed computer code within MATLAB to identify objects recorded in video and extract attribute information that describes the objects recorded. We tested the efficiency of track identification using observer-based counts of tracks within segments of sample video. We examined object attributes, modeled the effects of random variability on attributes, and produced data smoothing techniques to limit random variation within attribute data. We also began drafting and testing methodology to identify objects recorded on video. We also recorded approximately 10 hours of infrared video of various marine birds, passerine birds, and bats near the Pacific Northwest National Laboratory (PNNL) Marine Sciences Laboratory (MSL) at Sequim, Washington. A total of 6 hours of bird video was captured overlooking Sequim Bay over a series of weeks. An additional 2 hours of video of birds was also captured during two weeks overlooking Dungeness Bay within the Strait of Juan de Fuca. Bats and passerine birds (swallows) were also recorded at dusk on the MSL campus during nine evenings. An observer noted the identity of objects viewed through the camera concurrently with recording. These video files will provide the information necessary to produce and test software developed during FY2013. The annotation will also form the basis for creation of a method to reliably identify recorded objects.

  5. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    SciTech Connect (OSTI)

    Lorenz, Matthias; Ovchinnikova, Olga S; Van Berkel, Gary J

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling when employed in conjunction with liquid capture followed by nanoelectrospray ionization provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 μm x 160 μm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant materials.

  6. Prototype Radiation Detector Positioning System For The Automated Nondestructive Assay Of Uf6 Cylinders

    SciTech Connect (OSTI)

    Hatchell, Brian K.; Valdez, Patrick LJ; Orton, Christopher R.; Mace, Emily K.

    2011-08-07

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility’s entire cylinder inventory. These measurements are time-consuming, expensive, and assay only a small fraction of the total cylinder volume. An automated nondestructive assay system capable of providing enrichment measurements over the full volume of the cylinder could improve upon current verification practices in terms of efficiency and assay accuracy. This paper describes an approach denoted the Integrated Cylinder Verification Station (ICVS) that supports 100% cylinder verification, provides volume-averaged cylinder enrichment assay, and reduces inspector manpower needs. To allow field measurements to be collected to validate data collection algorithms, a prototype radiation detector positioning system was constructed. The system was designed to accurately position an array of radiation detectors along the length of a cylinder to measure UF6 enrichment. A number of alternative radiation shields for the detectors were included with the system. A collimated gamma-ray spectrometer module that allows translation of the detectors in the surrounding shielding to adjust the field of view, and a collimating plug in the end to further reduce the low-energy field of view, were also developed. Proof-of-principle measurements of neutron and high-energy gamma-ray signatures, using moderated neutron detectors and large-volume spectrometers in a fixed-geometry, portal-like configuration, supported an early assessment of the viability of the concept. The system has been used successfully on two testing campaigns at an AREVA fuel fabrication plant to scan over 30 product cylinders. This paper will describe the overall design of the detector positioning system and provide an overview of the Integrated Cylinder Verification Station (ICVS) approach.

  7. Federal Automated Information System of Nuclear Material Control and Accounting: Uniform System of Reporting Documents

    SciTech Connect (OSTI)

    Pitel, M V; Kasumova, L; Babcock, R A; Heinberg, C

    2003-06-12

    One of the fundamental regulations of the Russian State System for Nuclear Material Accounting and Control (SSAC), "Basic Nuclear Material Control and Accounting Rules," directed that a uniform report system be developed to support the operation of the SSAC. According to the "Regulation on State Nuclear Material Control and Accounting," adopted by the Russian Federation Government, Minatom of Russia is responsible for the development and adoption of report forms, as well as the reporting procedure and schedule. The report forms are being developed in tandem with the creation of an automated national nuclear material control and accounting system, the Federal Information System (FIS). The forms are in different stages of development and implementation. The first report forms (the Summarized Inventory Listing (SIL), Summarized Inventory Change Report (SICR) and federal and agency registers of nuclear material) have already been created and implemented. The second set of reports (nuclear material movement reports and the special anomaly report) is currently in development. A third set of reports (reports on import/export operations, and foreign nuclear material temporarily located in the Russian Federation) is still in the conceptual stage. To facilitate the development of a unified document system, the FIS must establish a uniform philosophy for the reporting system and determine the requirements for each reporting level, adhering to the following principles: completeness--the unified report system provides the entire range of information that the FIS requires to perform SSAC tasks; requisite level of detail; hierarchical structure--each report is based on the information provided in a lower-level report and is the source of information for reports at the next highest level; consistency checking--reports can be checked against other reports. A similar philosophy should eliminate redundancy in the different reports, support a uniform approach to the contents of previously developed and new reports within the FIS, as well as identify the main priorities for the direction of the FIS.

  8. Automated integration of genomic physical mapping data via parallel simulated annealing

    SciTech Connect (OSTI)

    Slezak, T.

    1994-06-01

    The Human Genome Center at the Lawrence Livermore National Laboratory (LLNL) is nearing closure on a high-resolution physical map of human chromosome 19. We have built automated tools to assemble 15,000 fingerprinted cosmid clones into 800 contigs with minimal spanning paths identified. These islands are being ordered, oriented, and spanned by a variety of other techniques including: Fluorescence In Situ Hybridization (FISH) at 3 levels of resolution, ECO restriction fragment mapping across all contigs, and a multitude of different hybridization and PCR techniques to link cosmid, YAC, AC, PAC, and P1 clones. The FISH data provide us with partial order and distance data as well as orientation. We made the observation that map builders need a much rougher presentation of data than do map readers; the former wish to see raw data since these can expose errors or interesting biology. We further noted that by ignoring our length and distance data we could simplify our problem into one that could be readily attacked with optimization techniques. The data integration problem could then be seen as an M x N ordering of our N cosmid clones which "intersect" M larger objects by defining "intersection" to mean either contig/map membership or hybridization results. Clearly, the goal of making an integrated map is now to rearrange the N cosmid clone "columns" such that the number of gaps on the object "rows" is minimized. Our FISH partially-ordered cosmid clones provide us with a set of constraints that cannot be violated by the rearrangement process. We solved the optimization problem via simulated annealing performed on a network of 40+ Unix machines in parallel, using a server/client model built on explicit socket calls. For current maps we can create a map in about 4 hours on the parallel net versus 4+ days on a single workstation. Our biologists are now using this software on a daily basis to guide their efforts toward final closure.
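
    A minimal sketch of the column-ordering optimization by simulated annealing, assuming a toy 0/1 membership matrix and a simple swap move; the LLNL system ran the annealing in parallel across a network of workstations and also enforced the FISH-derived ordering constraints, which are omitted here:

        # Illustrative single-process annealer; matrix, cooling schedule, and move set are assumptions.
        import math, random

        def gaps(matrix, order):
            """Total number of breaks in each object row under the given clone ordering."""
            total = 0
            for row in matrix:
                blocks, previous = 0, 0
                for col in order:
                    if row[col] and not previous:
                        blocks += 1
                    previous = row[col]
                total += max(0, blocks - 1)
            return total

        def anneal(matrix, steps=20000, t0=2.0):
            order = list(range(len(matrix[0])))
            cost = gaps(matrix, order)
            for step in range(steps):
                t = t0 * (1.0 - step / steps) + 1e-6
                i, j = random.sample(range(len(order)), 2)
                order[i], order[j] = order[j], order[i]
                new_cost = gaps(matrix, order)
                if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
                    cost = new_cost
                else:
                    order[i], order[j] = order[j], order[i]   # reject the swap
            return order, cost

        membership = [[1, 0, 1, 1, 0], [0, 1, 0, 0, 1], [1, 1, 0, 1, 0]]
        print(anneal(membership))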

  9. Automated laser-based barely visible impact damage detection in honeycomb sandwich composite structures

    SciTech Connect (OSTI)

    Girolamo, D.; Yuan, F. G.; Girolamo, L.

    2015-03-31

    Nondestructive evaluation (NDE) for detection and quantification of damage in composite materials is fundamental in the assessment of the overall structural integrity of modern aerospace systems. Conventional NDE systems have been extensively used to detect the location and size of damages by propagating ultrasonic waves normal to the surface. However they usually require physical contact with the structure and are time consuming and labor intensive. An automated, contactless laser ultrasonic imaging system for barely visible impact damage (BVID) detection in advanced composite structures has been developed to overcome these limitations. Lamb waves are generated by a Q-switched Nd:YAG laser, raster scanned by a set of galvano-mirrors over the damaged area. The out-of-plane vibrations are measured through a laser Doppler Vibrometer (LDV) that is stationary at a point on the corner of the grid. The ultrasonic wave field of the scanned area is reconstructed in polar coordinates and analyzed for high resolution characterization of impact damage in the composite honeycomb panel. Two methodologies are used for ultrasonic wave-field analysis: scattered wave field analysis (SWA) and standing wave energy analysis (SWEA) in the frequency domain. The SWA is employed for processing the wave field and estimate spatially dependent wavenumber values, related to discontinuities in the structural domain. The SWEA algorithm extracts standing waves trapped within damaged areas and, by studying the spectrum of the standing wave field, returns high fidelity damage imaging. While the SWA can be used to locate the impact damage in the honeycomb panel, the SWEA produces damage images in good agreement with X-ray computed tomographic (X-ray CT) scans. The results obtained prove that the laser-based nondestructive system is an effective alternative to overcome limitations of conventional NDI technologies.

  10. Exploiting structure similarity in refinement: automated NCS and target-structure restraints in BUSTER

    SciTech Connect (OSTI)

    Smart, Oliver S. Womack, Thomas O.; Flensburg, Claus; Keller, Peter; Paciorek, Włodek; Sharff, Andrew; Vonrhein, Clemens; Bricogne, Gérard

    2012-04-01

    Local structural similarity restraints (LSSR) provide a novel method for exploiting NCS or structural similarity to an external target structure. Two examples are given where BUSTER re-refinement of PDB entries with LSSR produces marked improvements, enabling further structural features to be modelled. Maximum-likelihood X-ray macromolecular structure refinement in BUSTER has been extended with restraints facilitating the exploitation of structural similarity. The similarity can be between two or more chains within the structure being refined, thus favouring NCS, or to a distinct ‘target’ structure that remains fixed during refinement. The local structural similarity restraints (LSSR) approach considers all distances less than 5.5 Å between pairs of atoms in the chain to be restrained. For each, the difference from the distance between the corresponding atoms in the related chain is found. LSSR applies a restraint penalty on each difference. A functional form that reaches a plateau for large differences is used to avoid the restraints distorting parts of the structure that are not similar. Because LSSR are local, there is no need to separate out domains. Some restraint pruning is still necessary, but this has been automated. LSSR have been available to academic users of BUSTER since 2009 with the easy-to-use -autoncs and -target target.pdb options. The use of LSSR is illustrated in the re-refinement of PDB entries http://scripts.iucr.org/cgi-bin/cr.cgi?rm, where -target enables the correct ligand-binding structure to be found, and http://scripts.iucr.org/cgi-bin/cr.cgi?rm, where -autoncs contributes to the location of an additional copy of the cyclic peptide ligand.

  11. SpArcFiRe: Scalable automated detection of spiral galaxy arm segments

    SciTech Connect (OSTI)

    Davis, Darren R.; Hayes, Wayne B. E-mail: whayes@uci.edu

    2014-08-01

    Given an approximately centered image of a spiral galaxy, we describe an entirely automated method that finds, centers, and sizes the galaxy (possibly masking nearby stars and other objects if necessary in order to isolate the galaxy itself) and then automatically extracts structural information about the spiral arms. For each arm segment found, we list the pixels in that segment, allowing image analysis on a per-arm-segment basis. We also perform a least-squares fit of a logarithmic spiral arc to the pixels in that segment, giving per-arc parameters, such as the pitch angle, arm segment length, location, etc. The algorithm takes about one minute per galaxy, and can easily be scaled using parallelism. We have run it on all ~644,000 Sloan objects that are larger than 40 pixels across and classified as 'galaxies'. We find a very good correlation between our quantitative description of a spiral structure and the qualitative description provided by Galaxy Zoo humans. Our objective, quantitative measures of structure demonstrate the difficulty in defining exactly what constitutes a spiral 'arm', leading us to prefer the term 'arm segment'. We find that pitch angle often varies significantly segment-to-segment in a single spiral galaxy, making it difficult to define the pitch angle for a single galaxy. We demonstrate how our new database of arm segments can be queried to find galaxies satisfying specific quantitative visual criteria. For example, even though our code does not explicitly find rings, a good surrogate is to look for galaxies having one long, low-pitch-angle arm, which is how our code views ring galaxies. SpArcFiRe is available at http://sparcfire.ics.uci.edu.
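
    A minimal sketch of the per-segment logarithmic spiral fit, assuming arm-segment pixels already expressed in polar coordinates about the galaxy center with unwrapped angles; SpArcFiRe's full pipeline also finds, centers, and segments the arms:

        # Fit r = a*exp(b*theta) by linear least squares on ln r; pitch angle = arctan(b).
        import math

        def fit_log_spiral(theta, r):
            """Least-squares fit of ln r = ln a + b*theta; returns (a, b, pitch angle in degrees)."""
            n = len(theta)
            log_r = [math.log(ri) for ri in r]
            mean_t = sum(theta) / n
            mean_lr = sum(log_r) / n
            b = sum((t - mean_t) * (lr - mean_lr) for t, lr in zip(theta, log_r)) \
                / sum((t - mean_t) ** 2 for t in theta)
            a = math.exp(mean_lr - b * mean_t)
            return a, b, math.degrees(math.atan(b))

        # Synthetic arm-segment pixels on a 15-degree pitch-angle spiral
        true_b = math.tan(math.radians(15.0))
        thetas = [i * 0.1 for i in range(60)]
        radii = [5.0 * math.exp(true_b * t) for t in thetas]
        print(fit_log_spiral(thetas, radii))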

  12. Automated next-to-leading order predictions for new physics at the LHC: The case of colored scalar pair production

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Degrande, Céline; Fuks, Benjamin; Hirschi, Valentin; Proudom, Josselin; Shao, Hua -Sheng

    2015-05-05

    We present for the first time the full automation of collider predictions matched with parton showers at the next-to-leading accuracy in QCD within nontrivial extensions of the standard model. The sole inputs required from the user are the model Lagrangian and the process of interest. As an application of the above, we explore scenarios beyond the standard model where new colored scalar particles can be pair produced in hadron collisions. Using simplified models to describe the new field interactions with the standard model, we present precision predictions for the LHC within the MadGraph5_aMC@NLO framework.

  13. Using the BEopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    SciTech Connect (OSTI)

    Tabares-Velasco, P. C.; Maguire, J.; Horowitz, S.; Christensen, C.

    2014-09-01

    Verification and validation are crucial software quality control procedures when developing and implementing models. This is particularly important as a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes models that isolate the impacts of specific building components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models; these discrepancies are caused by differences in the models used by the engines or coding errors.
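
    A minimal sketch of the kind of per-component comparison such a test suite enables, assuming hypothetical annual end-use totals from two engines and a simple relative-difference tolerance:

        # Illustrative engine-to-engine comparison; component names and values are assumptions.
        def component_discrepancies(engine_a_kwh, engine_b_kwh, tolerance=0.05):
            """Flag components whose annual energy differs by more than the tolerance fraction."""
            flagged = {}
            for component, a_kwh in engine_a_kwh.items():
                b_kwh = engine_b_kwh[component]
                relative = abs(a_kwh - b_kwh) / max(a_kwh, b_kwh)
                if relative > tolerance:
                    flagged[component] = relative
            return flagged

        engine_a = {"wall_insulation": 4120.0, "windows": 2870.0, "infiltration": 1980.0}
        engine_b = {"wall_insulation": 4150.0, "windows": 2510.0, "infiltration": 2005.0}
        print(component_discrepancies(engine_a, engine_b))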

  14. Using the Beopt Automated Residential Simulation Test Suite to Enable Comparative Analysis Between Energy Simulation Engines: Preprint

    SciTech Connect (OSTI)

    Tabares-Velasco, Paulo Cesar; Maguire, Jeff; Horowitz, Scott; Christensen, Craig

    2014-09-01

    Verification and validation are crucial software quality control procedures to follow when developing and implementing models. This is particularly important because a variety of stakeholders rely on accurate predictions from building simulation programs. This study uses the BEopt Automated Residential Simulation Test Suite (BARTS) to facilitate comparison of two energy simulation engines across various building components and includes building models that isolate the impacts of specific components on annual energy consumption. As a case study, BARTS has been used to identify important discrepancies between the engines for several components of the building models. These discrepancies are caused by differences in the algorithms used by the engines or coding errors.

  15. Flow Mapping in a Gas-Solid Riser via Computer Automated Radioactive Particle Tracking (CARPT)

    SciTech Connect (OSTI)

    Muthanna Al-Dahhan; Milorad P. Dudukovic; Satish Bhusarapu; Timothy J. O'hern; Steven Trujillo; Michael R. Prairie

    2005-06-04

    Statement of the Problem: Developing and disseminating a general and experimentally validated model for turbulent multiphase fluid dynamics suitable for engineering design purposes in industrial-scale applications of riser reactors and pneumatic conveying requires collecting reliable data on solids trajectories, velocities (averaged and instantaneous), solids holdup distribution, and solids fluxes in the riser as a function of operating conditions. Such data are currently not available on the same system. The Multiphase Fluid Dynamics Research Consortium (MFDRC) was established to address these issues on a chosen example of a circulating fluidized bed (CFB) reactor, which is widely used in the petroleum and chemical industries, including coal combustion. This project addresses the lack of reliable data needed to advance CFB technology. Project Objectives: The objective of this project is to advance the understanding of the solids flow pattern and mixing in a well-developed flow region of a gas-solid riser, operated at different gas flow rates and solids loadings, using state-of-the-art non-intrusive measurements. This work creates insight and a reliable database for local solids fluid-dynamic quantities in a pilot-plant-scale CFB, which can then be used to validate and develop phenomenological models for the riser. This study also attempts to provide benchmark data for validation of Computational Fluid Dynamics (CFD) codes and their current closures. Technical Approach: The non-invasive Computer Automated Radioactive Particle Tracking (CARPT) technique provides the complete Eulerian solids flow field (time-averaged velocity map and various turbulence parameters such as the Reynolds stresses, turbulent kinetic energy, and eddy diffusivities). It also directly gives the Lagrangian information of solids flow and yields the true solids residence time distribution (RTD). Another radiation-based technique, Computed Tomography (CT), yields detailed time-averaged local holdup profiles at various planes. Together, these two techniques can provide the needed local solids flow dynamic information for the same setup under identical operating conditions, and the data obtained can be used as a benchmark for development and refinement of the appropriate riser models. For these reasons the two techniques were implemented in this study on a fully developed section of the riser. To derive the global mixing information in the riser, an accurate solids RTD is needed; it was obtained by monitoring the entry and exit of a single radioactive tracer. Other global parameters, such as the Cycle Time Distribution (CTD), overall solids holdup in the riser, and solids recycle percentage at the bottom section of the riser, were evaluated from different solids travel time distributions. In addition, a novel method was applied to measure the overall solids mass flux accurately and in situ.

  16. Automated Demand Response: The Missing Link in the Electricity Value Chain

    SciTech Connect (OSTI)

    McKane, Aimee; Rhyne, Ivin; Piette, Mary Ann; Thompson, Lisa; Lekov, Alex

    2008-08-01

    In 2006, the Public Interest Energy Research Program (PIER) Demand Response Research Center (DRRC) at Lawrence Berkeley National Laboratory initiated research into Automated Demand Response (OpenADR) applications in California industry. The goal is to improve electric grid reliability and lower electricity use during periods of peak demand. The purpose of this research is to begin to define the relationship among a portfolio of actions that industrial facilities can undertake relative to their electricity use. This 'electricity value chain' defines energy management and demand response (DR) at six levels of service, distinguished by the magnitude, type, and rapidity of response. One element in the electricity supply chain is OpenADR, an open-standards based communications system to send signals to customers to allow them to manage their electric demand in response to supply conditions, such as prices or reliability, through a set of standard, open communications. Initial DRRC research suggests that industrial facilities that have undertaken energy efficiency measures are probably more, not less, likely to initiate other actions within this value chain such as daily load management and demand response. Moreover, OpenADR appears to afford some facilities the opportunity to develop the supporting control structure and to 'demo' potential reductions in energy use that can later be applied to either more effective load management or a permanent reduction in use via energy efficiency. Under the right conditions, some types of industrial facilities can shift or shed loads, without any, or minimal disruption to operations, to protect their energy supply reliability and to take advantage of financial incentives. In 2007 and 2008, 35 industrial facilities agreed to implement OpenADR, representing a total capacity of nearly 40 MW. This paper describes how integrated or centralized demand management and system-level network controls are linked to OpenADR systems. Case studies of refrigerated warehouses and wastewater treatment facilities are used to illustrate OpenADR load reduction potential. Typical shed and shift strategies include: turning off or operating compressors, aerator blowers and pumps at reduced capacity, increasing temperature set-points or pre-cooling cold storage areas and over-oxygenating stored wastewater prior to a DR event. This study concludes that understanding industrial end-use processes and control capabilities is a key to support reduced service during DR events and these capabilities, if DR enabled, hold significant promise in reducing the electricity demand of the industrial sector during utility peak periods.
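
    A minimal sketch of an automated shed decision on receipt of a price signal, assuming a hypothetical price threshold and control hooks; a real OpenADR client receives signals from a demand response automation server and drives the facility's own control system:

        # Illustrative shed/shift handler; threshold, offsets, and control keys are assumptions.
        SHED_PRICE_USD_PER_KWH = 0.30

        def on_price_signal(price_usd_per_kwh, controls):
            """Apply pre-programmed shed strategies when the price signal exceeds the threshold."""
            if price_usd_per_kwh >= SHED_PRICE_USD_PER_KWH:
                controls["compressor_capacity_fraction"] = 0.5    # run compressors at reduced capacity
                controls["cold_storage_setpoint_offset_c"] = 2.0  # raise temperature set-point
                controls["aerator_blowers_on"] = False            # rely on pre-oxygenated wastewater
            else:
                controls.update(compressor_capacity_fraction=1.0,
                                cold_storage_setpoint_offset_c=0.0,
                                aerator_blowers_on=True)
            return controls

        print(on_price_signal(0.42, {}))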

  17. Automated Demand Response: The Missing Link in the Electricity Value Chain

    SciTech Connect (OSTI)

    McKane, Aimee; Rhyne, Ivin; Lekov, Alex; Thompson, Lisa; Piette, MaryAnn

    2009-08-01

    In 2006, the Public Interest Energy Research Program (PIER) Demand Response Research Center (DRRC) at Lawrence Berkeley National Laboratory initiated research into Automated Demand Response (OpenADR) applications in California industry. The goal is to improve electric grid reliability and lower electricity use during periods of peak demand. The purpose of this research is to begin to define the relationship among a portfolio of actions that industrial facilities can undertake relative to their electricity use. This 'electricity value chain' defines energy management and demand response (DR) at six levels of service, distinguished by the magnitude, type, and rapidity of response. One element in the electricity supply chain is OpenADR, an open-standards based communications system to send signals to customers to allow them to manage their electric demand in response to supply conditions, such as prices or reliability, through a set of standard, open communications. Initial DRRC research suggests that industrial facilities that have undertaken energy efficiency measures are probably more, not less, likely to initiate other actions within this value chain such as daily load management and demand response. Moreover, OpenADR appears to afford some facilities the opportunity to develop the supporting control structure and to "demo" potential reductions in energy use that can later be applied to either more effective load management or a permanent reduction in use via energy efficiency. Under the right conditions, some types of industrial facilities can shift or shed loads, without any, or minimal disruption to operations, to protect their energy supply reliability and to take advantage of financial incentives. In 2007 and 2008, 35 industrial facilities agreed to implement OpenADR, representing a total capacity of nearly 40 MW. This paper describes how integrated or centralized demand management and system-level network controls are linked to OpenADR systems. Case studies of refrigerated warehouses and wastewater treatment facilities are used to illustrate OpenADR load reduction potential. Typical shed and shift strategies include: turning off or operating compressors, aerator blowers and pumps at reduced capacity, increasing temperature set-points or pre-cooling cold storage areas and over-oxygenating stored wastewater prior to a DR event. This study concludes that understanding industrial end-use processes and control capabilities is a key to support reduced service during DR events and these capabilities, if DR enabled, hold significant promise in reducing the electricity demand of the industrial sector during utility peak periods.

  18. SU-D-BRD-06: Automated Population-Based Planning for Whole Brain Radiation Therapy

    SciTech Connect (OSTI)

    Schreibmann, E; Fox, T; Crocker, I; Shu, H

    2014-06-01

    Purpose: Treatment planning for whole brain radiation treatment is technically a simple process but in practice it takes valuable clinical time of repetitive and tedious tasks. This report presents a method that automatically segments the relevant target and normal tissues and creates a treatment plan in only a few minutes after patient simulation. Methods: Segmentation is performed automatically through morphological operations on the soft tissue. The treatment plan is generated by searching a database of previous cases for patients with similar anatomy. In this search, each database case is ranked in terms of similarity using a customized metric designed for sensitivity by including only geometrical changes that affect the dose distribution. The database case with the best match is automatically modified to replace relevant patient info and isocenter position while maintaining original beam and MLC settings. Results: Fifteen patients were used to validate the method. In each of these cases the anatomy was accurately segmented to mean Dice coefficients of 0.970 ± 0.008 for the brain, 0.846 ± 0.009 for the eyes and 0.672 ± 0.111 for the lens as compared to clinical segmentations. Each case was then subsequently matched against a database of 70 validated treatment plans and the best matching plan (termed auto-planned), was compared retrospectively with the clinical plans in terms of brain coverage and maximum doses to critical structures. Maximum doses were reduced by a maximum of 20.809 Gy for the left eye (mean 3.533), by 13.352 (1.311) for the right eye, and by 27.471 (4.856), 25.218 (6.315) for the left and right lens. Time from simulation to auto-plan was 3-4 minutes. Conclusion: Automated database-based matching is an alternative to classical treatment planning that improves quality while providing a cost-effective solution to planning through modifying previous validated plans to match a current patient's anatomy.

  19. Characterization and Application of Superlig 620 Solid Phase Extraction Resin for Automated Process Monitoring of 90Sr

    SciTech Connect (OSTI)

    Devol, Timothy A.; Clements, John P.; Farawila, Anne F.; O'Hara, Matthew J.; Egorov, Oleg; Grate, Jay W.

    2009-11-30

    Characterization of SuperLig 620 solid phase extraction resin was performed in order to develop an automated on-line process monitor for 90Sr. The main focus was on strontium separation from barium, with the goal of developing an automated separation process for 90Sr in high-level wastes. High-level waste contains significant 137Cs activity, of which 137mBa is of great concern as an interference to the quantification of strontium. In addition barium, yttrium and plutonium were studied as potential interferences to strontium uptake and detection. A number of complexants were studied in a series of batch Kd experiments, as SuperLig 620 was not previously known to elute strontium in typical mineral acids. The optimal separation was found using a 2M nitric acid load solution with a strontium elution step of ~0.49M ammonium citrate and a barium elution step of ~1.8M ammonium citrate. 90Sr quantification of Hanford high-level tank waste was performed on a sequential injection analysis microfluidics system coupled to a flow-cell detector. The results of the on-line procedure are compared to standard radiochemical techniques in this paper.

  20. Automated insertion of sequences into a ribosomal RNA alignment: An application of computational linguistics in molecular biology

    SciTech Connect (OSTI)

    Taylor, R.C.

    1991-11-01

    This thesis involved the construction of (1) a grammar that incorporates knowledge on base invariancy and secondary structure in a molecule and (2) a parser engine that uses the grammar to position bases into the structural subunits of the molecule. These concepts were combined with a novel pinning technique to form a tool that semi-automates insertion of a new species into the alignment for the 16S rRNA molecule (a component of the ribosome) maintained by Dr. Carl Woese's group at the University of Illinois at Urbana. The tool was tested on species extracted from the alignment and on a group of entirely new species. The results were very encouraging, and the tool should be a substantial aid to the curators of the 16S alignment. The construction of the grammar was itself automated, allowing application of the tool to alignments for other molecules. The logic programming language Prolog was used to construct all programs involved. The computational linguistics approach used here was found to be a useful way to attack the problem of insertion into an alignment.

  1. Automated insertion of sequences into a ribosomal RNA alignment: An application of computational linguistics in molecular biology

    SciTech Connect (OSTI)

    Taylor, R.C.

    1991-11-01

    This thesis involved the construction of (1) a grammar that incorporates knowledge on base invariancy and secondary structure in a molecule and (2) a parser engine that uses the grammar to position bases into the structural subunits of the molecule. These concepts were combined with a novel pinning technique to form a tool that semi-automates insertion of a new species into the alignment for the 16S rRNA molecule (a component of the ribosome) maintained by Dr. Carl Woese's group at the University of Illinois at Urbana. The tool was tested on species extracted from the alignment and on a group of entirely new species. The results were very encouraging, and the tool should be a substantial aid to the curators of the 16S alignment. The construction of the grammar was itself automated, allowing application of the tool to alignments for other molecules. The logic programming language Prolog was used to construct all programs involved. The computational linguistics approach used here was found to be a useful way to attack the problem of insertion into an alignment.

  2. Intelligent Production Monitoring and Control based on Three Main Modules for Automated Manufacturing Cells in the Automotive Industry

    SciTech Connect (OSTI)

    Berger, Ulrich; Kretzschmann, Ralf; Algebra, A. Vargas Veronica

    2008-06-12

    The automotive industry is distinguished by regionalization and customization of products. As a consequence, the diversity of products will increase while the lot sizes will decrease. Thus, more product types will be handled along the process chain and common production paradigms will fail. Although Rapid Manufacturing (RM) methodology will be used for producing small individual lot sizes, new solutions for joining and assembling these components are needed. On the other hand, the non-availability of existing operational knowledge and the absence of dynamic and explicit knowledge retrieval limit the achievement of on-demand capabilities. Thus, in this paper, an approach for an Intelligent Production System will be introduced. The concept is based on three interlinked main modules: a Technology Data Catalogue (TDC) based on an ontology system, an Automated Scheduling Processor (ASP) based on graph theory and a central Programmable Automation Controller (PAC) for real-time sensor/actor communication. The concept is being implemented in a laboratory set-up with several assembly and joining processes and will be experimentally validated in some research and development projects.

  3. RAPID-L Highly Automated Fast Reactor Concept Without Any Control Rods (1) Reactor concept and plant dynamics analyses

    SciTech Connect (OSTI)

    Kambe, Mitsuru [Central Research Institute of Electric Power Industry (CRIEPI), 2-11-1, Iwado Kita, Komae-shi, Tokyo, 201-8511 (Japan); Tsunoda, Hirokazu [Mitsubishi Research Institute, Inc. 3-6, Otemachi 2-chome, Chiyoda-ku, Tokyo, 100-8141 (Japan); Mishima, Kaichiro [Research Reactor Institute, Kyoto University, Kumatori-cho, Sennan-gun, Osaka, 590-20494 (Japan); Iwamura, Takamichi [Japan Atomic Energy Research Institute, 2-4, Shirakata-shirane, Tokai-mura, Naka-gun, Ibaraki-ken, 319-1195 (Japan)

    2002-07-01

    The 200 kWe uranium-nitride fueled lithium cooled fast reactor concept 'RAPID-L' to achieve highly automated reactor operation has been demonstrated. RAPID-L is designed for a lunar base power system. It is one of the variants of RAPID (Refueling by All Pins Integrated Design), a fast reactor concept that enables quick and simplified refueling. The essential feature of the RAPID concept is that the reactor core consists of an integrated fuel assembly instead of conventional fuel subassemblies. In this small reactor core, 2700 fuel pins are integrated altogether and encased in a fuel cartridge. Refueling is conducted by replacing a fuel cartridge. The reactor can be operated without refueling for up to 10 years. Unique challenges in reactivity control systems design have been addressed in the RAPID-L concept. The reactor has no control rod, but involves the following innovative reactivity control systems: Lithium Expansion Modules (LEM) for inherent reactivity feedback, Lithium Injection Modules (LIM) for inherent ultimate shutdown, and Lithium Release Modules (LRM) for automated reactor startup. All these systems adopt lithium-6 as a liquid poison instead of B4C rods. In combination with LEMs, LIMs and LRMs, RAPID-L can be operated without an operator. This is the first reactor concept of this kind ever established. This reactor concept is also applicable to terrestrial fast reactors. In this paper, the RAPID-L reactor concept and its transient characteristics are presented. (authors)

  4. Index of /datasets/files/41/pub/PUBID8_3989

    Open Energy Info (EERE)

    Directory Listing. This is a file-level view of...

  5. Index of /datasets/files/41/pub/PUBID8_1234

    Open Energy Info (EERE)

    Directory listing: MaxTech 21-May-2012 23:38 ...

  6. Automated Price and Demand Response Demonstration for Large Customers in New York City using OpenADR

    SciTech Connect (OSTI)

    Kim, Joyce Jihyun; Yin, Rongxin; Kiliccote, Sila

    2013-10-01

    Open Automated Demand Response (OpenADR), an XML-based information exchange model, is used to facilitate continuous price-responsive operation and demand response participation for large commercial buildings in New York that are subject to the default day-ahead hourly pricing. We summarize the existing demand response programs in New York and discuss OpenADR communication, prioritization of demand response signals, and control methods. Building energy simulation models are developed and field tests are conducted to evaluate continuous energy management and demand response capabilities of two commercial buildings in New York City. Preliminary results reveal that providing machine-readable prices to commercial buildings can facilitate both demand response participation and continuous energy cost savings. Hence, efforts should be made to develop more sophisticated algorithms for building control systems to minimize customers' utility bills based on price and reliability information from the electricity grid.
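    As an illustration of the kind of price-responsive control logic the abstract describes, the sketch below maps a day-ahead hourly price vector to cooling-setpoint offsets. It is a minimal, hypothetical example: the threshold, offset, and function names are ours and are not taken from the OpenADR specification or the paper.

```python
# Hypothetical price-responsive control rule: hours priced well above the
# daily average get a setpoint offset (load shed); all other hours keep the
# normal setpoint. Not the OpenADR schema and not the paper's algorithm.
from statistics import mean

def setpoint_offsets(hourly_prices, shed_offset_degf=2.0, ratio=1.25):
    """Return a cooling-setpoint offset (deg F) for each of 24 hourly prices."""
    avg = mean(hourly_prices)
    return [shed_offset_degf if price > ratio * avg else 0.0
            for price in hourly_prices]

if __name__ == "__main__":
    prices = [0.08] * 13 + [0.25, 0.30, 0.28, 0.22] + [0.08] * 7  # $/kWh, 24 values
    print(setpoint_offsets(prices))
```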

  7. Remote-controlled NDA (nondestructive assay) systems for feed and product storage at an automated MOX (mixed oxide) facility

    SciTech Connect (OSTI)

    Menlove, H.O.; Augustson, R.H.; Ohtani, T.; Seya, M.; Takahashi, S.; Abedin-Zadeh, R.; Hassan, B.; Napoli, S.

    1989-01-01

    Nondestructive assay (NDA) systems have been developed for use in an automated mixed oxide (MOX) fabrication facility. Unique features have been developed for the NDA systems to accommodate robotic sample handling and remote operation. In addition, the systems have been designed to obtain International Atomic Energy Agency inspection data without the need for an inspector at the facility at the time of the measurements. The equipment is being designed to operate continuously in an unattended mode with data storage for periods of up to one month. The two systems described in this paper include a canister counter for the assay of MOX powder at the input to the facility and a capsule counter for the assay of complete liquid-metal fast breeder reactor fuel assemblies at the output of the plant. The design, performance characteristics, and authentication of the two systems will be described. The data related to reliability, precision, and stability will be presented. 5 refs., 10 figs., 4 tabs.

  8. Automated solar cell assembly teamed process research. Semiannual subcontract report, 7 January 1993--30 June 1993

    SciTech Connect (OSTI)

    Nowlan, M.J.; Hogan, S.J.; Darkazalli, G.; Breen, W.F.; Murach, J.M.; Sutherland, S.F.

    1994-02-01

    This report describes work done under Phase 3A of the PVMaT project to address problems that are generic to the photovoltaics (PV) industry. Crystalline silicon solar cells were used in the majority of all terrestrial power modules shipped in 1992. Spire's analysis in Phase 1 of the PVMaT project indicated that the use of thin ({le}200-{mu}m) silicon cells can substantially reduce module manufacturing costs, provided that processing yields remain as high as they are now for processing standard-thickness cells. Because present solar cell tabbing and interconnecting processes have unacceptably high yield losses with such thin cells, the objective of this Phase 3A subcontract is to use Spire's light soldering technology and experience in designing and fabricating solar cell tabbing and interconnecting equipment to develop high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells.

  9. Digital field-bus mode SCADA is key to offshore efficiency. [Automation of offshore gas production platforms

    SciTech Connect (OSTI)

    Cuthbert, P.

    1994-02-01

    An all-digital SCADA network has been installed in one of the North Sea's largest natural gas fields, controlling the delivery of gas from Shell UK Exploration and Production's southern-area fields to a British Gas terminal at Bacton, UK. The innovative use of digital technology -- based on the industry-standard HART field protocol -- to complete a digital communications link stretching from the onshore SCADA host right out to the process variable transmitters on the platforms is playing a key role in the automation of the monitoring and control system by allowing Shell UK Expro to run the majority of the platforms unmanned. The SCADA system is part of a major refit being carried out by Shell Expro on its Leman field. The refit is part of the company's long-term strategy to extend the lifetime of this established field, which started operations in the late 1960s. In order to meet this goal, the prime requirements are to reduce operational costs and risk exposure, and the key element in this area was to reduce the need for resident staff and all of their associated support and equipment costs through the deployment of automation. The system will achieve the project's cost-cutting aims, but will also break new ground in control and monitoring technology for the gas industry. The use of a smart transmitter scheme for digital field communications within the wide-area network, employing the protocol's all-digital capability in preference to the commonly used 4-20 mA-compatible mode, will allow real-time monitoring and control, plus maintenance and diagnostics, to take place remotely. This paper describes the design of this system.

  10. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    SciTech Connect (OSTI)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup module, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and delivers the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  11. ESC-CMD

    Energy Science and Technology Software Center (OSTI)

    2003-07-01

    Web-based conference management product based on integrated relational databases and dynamic webpage generation that can collect a vast array of varying data, such as proposals, contact information, PDF abstracts and full papers, registration information, etc., to organize and administrate a technical conference. Program functions and capabilities include: session proposals (electronic submission); abstract submission (electronic, with PDF upload); full paper submission (electronic, with PDF upload); conference registration; session administration; abstract and full paper administration; and automated e-mail capabilities.

  12. Breast Imaging Reporting and Data System (BI-RADS) breast composition descriptors: Automated measurement development for full field digital mammography

    SciTech Connect (OSTI)

    Fowler, E. E.; Sellers, T. A.; Lu, B.; Heine, J. J.

    2013-11-15

    Purpose: The Breast Imaging Reporting and Data System (BI-RADS) breast composition descriptors are used for standardized mammographic reporting and are assessed visually. This reporting is clinically relevant because breast composition can impact mammographic sensitivity and is a breast cancer risk factor. New techniques are presented and evaluated for generating automated BI-RADS breast composition descriptors using both raw and calibrated full field digital mammography (FFDM) image data. Methods: A matched case-control dataset with FFDM images was used to develop three automated measures for the BI-RADS breast composition descriptors. Histograms of each calibrated mammogram in the percent glandular (pg) representation were processed to create the new BR{sub pg} measure. Two previously validated measures of breast density derived from calibrated and raw mammograms were converted to the new BR{sub vc} and BR{sub vr} measures, respectively. These three measures were compared with the radiologist-reported BI-RADS composition assessments from the patient records. The authors used two optimization strategies with differential evolution to create these measures: method-1 used breast cancer status, and method-2 matched the reported BI-RADS descriptors. Weighted kappa (κ) analysis was used to assess the agreement between the new measures and the reported measures. Each measure's association with breast cancer was evaluated with odds ratios (ORs) adjusted for body mass index, breast area, and menopausal status. ORs were estimated as per unit increase with 95% confidence intervals. Results: The three BI-RADS measures generated by method-1 had κ between 0.25 and 0.34. These measures were significantly associated with breast cancer status in the adjusted models: (a) OR = 1.87 (1.34, 2.59) for BR{sub pg}; (b) OR = 1.93 (1.36, 2.74) for BR{sub vc}; and (c) OR = 1.37 (1.05, 1.80) for BR{sub vr}. The measures generated by method-2 had κ between 0.42 and 0.45. Two of these measures were significantly associated with breast cancer status in the adjusted models: (a) OR = 1.95 (1.24, 3.09) for BR{sub pg}; (b) OR = 1.42 (0.87, 2.32) for BR{sub vc}; and (c) OR = 2.13 (1.22, 3.72) for BR{sub vr}. The radiologist-reported measures from the patient records showed a similar association, OR = 1.49 (0.99, 2.24), although only borderline statistically significant. Conclusions: A general framework was developed and validated for converting calibrated mammograms and continuous measures of breast density to fully automated approximations for the BI-RADS breast composition descriptors. The techniques are general and suitable for a broad range of clinical and research applications.
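    For reference, the agreement and association statistics named above have standard textbook forms; the notation below is ours (a sketch, not taken from the paper), with w_{ij} the disagreement weights, O_{ij} the observed proportions, E_{ij} the chance-expected proportions, and beta the fitted logistic-regression coefficient for a one-unit increase in the density measure.

```latex
% Weighted kappa and per-unit odds ratio (generic definitions, our notation)
\kappa_w = 1 - \frac{\sum_{i,j} w_{ij}\,O_{ij}}{\sum_{i,j} w_{ij}\,E_{ij}},
\qquad
\mathrm{OR} = e^{\beta}, \qquad
95\%\ \mathrm{CI} = \exp\!\bigl(\beta \pm 1.96\,\mathrm{SE}(\beta)\bigr)
```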

  13. TU-C-BRE-02: A Novel, Highly Efficient and Automated Quality Assurance Tool for Modern Linear Accelerators

    SciTech Connect (OSTI)

    Goddu, S; Sun, B; Yaddanapudi, S; Kamal, G; Mutic, S; Baltes, C; Rose, S; Stinson, K

    2014-06-15

    Purpose: Quality assurance (QA) of complex linear accelerators is critical and highly time consuming. Varian's Machine Performance Check (MPC) uses the IsoCal phantom to test geometric and dosimetric aspects of TrueBeam systems in <5 min. In this study we independently tested the accuracy and robustness of the MPC tools. Methods: MPC is automated for simultaneous image acquisition, using kV and MV onboard imagers (EPIDs), while delivering kV and MV beams in a set routine of varying gantry, collimator, and couch angles. MPC software tools analyze the images to test: i) beam output and uniformity, ii) positional accuracy of the isocenter, EPIDs, collimating jaws (CJs), MLC leaves, and couch, and iii) rotational accuracy of the gantry, collimator, and couch. 6 MV beam dose output and uniformity were tested using an ionization chamber (IC) and an IC array. Winston-Lutz tests (WLT) were performed to measure isocenter offsets caused by gantry, collimator, and couch rotations. Positional accuracy of the EPIDs was evaluated using the radio-opaque markers of the IsoCal phantom. Furthermore, to test the robustness of the MPC tools we purposefully miscalibrated a non-clinical TrueBeam by introducing errors in beam output, energy, symmetry, gantry angle, couch translations, CJs, and MLC leaf positions. Results: 6 MV output and uniformity were within 0.6% for most measurements, with a maximum deviation of 1.0%. The average isocenter offset caused by gantry and collimator rotations was 0.316 ± 0.011 mm, agreeing with IsoLock (0.274 mm) and WLT (0.41 mm). The average rotation-induced couch shift from MPC was 0.378 ± 0.032 mm, agreeing with WLT (0.35 mm). MV and kV imager offsets measured by MPC were within 0.15 mm. MPC predicted all machine miscalibrations within acceptable clinical tolerance. MPC detected the output miscalibrations within 0.61%, while the MLC and couch positions were within 0.06 mm and 0.14 mm, respectively. Gantry angle miscalibrations were detected within 0.1°. Conclusions: MPC is a useful tool for QA of TrueBeam systems, and its automation makes it highly efficient for testing both geometric and dosimetric aspects of the machine. This is very important for hypo-fractionated SBRT treatments. Received support from Varian Medical Systems, Palo Alto, CA 94304-1038.

  14. Automated fit of high-dimensional potential energy surfaces using cluster analysis and interpolation over descriptors of chemical environment

    SciTech Connect (OSTI)

    Fournier, René; Orel, Slava

    2013-12-21

    We present a method for fitting high-dimensional potential energy surfaces that is almost fully automated, can be applied to systems with various chemical compositions, and involves no particular choice of function form. We tested it on four systems: Ag{sub 20}, Sn{sub 6}Pb{sub 6}, Si{sub 10}, and Li{sub 8}. The cost for energy evaluation is smaller than the cost of a density functional theory (DFT) energy evaluation by a factor of 1500 for Li{sub 8}, and 60000 for Ag{sub 20}. We achieved intermediate accuracy (errors of 0.4 to 0.8 eV on atomization energies, or, 1% to 3% on cohesive energies) with rather small datasets (between 240 and 1400 configurations). We demonstrate that this accuracy is sufficient to correctly screen the configurations with lowest DFT energy, making this function potentially very useful in a hybrid global optimization strategy. We show that, as expected, the accuracy of the function improves with an increase in the size of the fitting dataset.

  15. Self-propelled in-tube shuttle and control system for automated measurements of magnetic field alignment

    SciTech Connect (OSTI)

    Boroski, W.N.; Nicol, T.H.; Pidcoe, S.V. (Space Systems Div.); Zink, R.A.

    1990-03-01

    A magnetic field alignment gauge is used to measure the field angle as a function of axial position in each of the magnets for the Superconducting Super Collider (SSC). Present measurements are made by manually pushing the gauge through the magnet bore tube and stopping at intervals to record field measurements. Gauge location is controlled through graduation marks and alignment pins on the push rods. Field measurements are recorded on a logging multimeter with tape output. Described is a computerized control system being developed to replace the manual procedure for field alignment measurements. The automated system employs a pneumatic walking device to move the measurement gauge through the bore tube. Movement of the device, called the Self-Propelled In-Tube Shuttle (SPITS), is accomplished through an integral, gas driven, double-acting cylinder. The motion of the SPITS is transferred to the bore tube by means of a pair of controlled, retractable support feet. Control of the SPITS is accomplished through an RS-422 interface from an IBM-compatible computer to a series of solenoid-actuated air valves. Direction of SPITS travel is determined by the air-valve sequence, and is managed through the control software. Precise axial position of the gauge within the magnet is returned to the control system through an optically-encoded digital position transducer attached to the shuttle. Discussed is the performance of the transport device and control system during preliminary testing of the first prototype shuttle. 1 ref., 7 figs.

  16. Evaluation of coal-mineral association and coal cleanability by using SEM-based automated image analysis

    SciTech Connect (OSTI)

    Straszheim, W.E.; Younkin, K.A.; Markuszewski, R.; Smith, F.J.

    1988-06-01

    A technique employing SEM-based automated image analysis (AIA) has been developed for assessing the association of mineral particles with coal, and thus the cleanability of that coal, when the characteristics of the separation process are known. Data resulting from AIA include the mineral distribution by particle size, mineral phase, and extent of association with coal. This AIA technique was applied to samples of -325 mesh (-44 {mu}m) coal from the Indiana No. 3, Upper Freeport, and Sunnyside (UT) seams. The coals were subjected to cleaning by float-sink separations at 1.3, 1.4, 1.6, and 1.9 specific gravity and by froth flotation. For the three coals, the float-sink procedure at a given specific gravity produced different amounts of clean coal, but with similar ash content. Froth flotation removed much less ash, yielding a product ash content of approximately 8% for the Upper Freeport coal, regardless of recovery, while reducing the ash content to less than 5% for the other two coals. The AIA results documented significantly more association of minerals with the Upper Freeport coal, which thus led to the poor ash reduction.

  17. Automated-In-Motion Vehicle Evaluation Environment (AIMVEE) Weigh-In Motion (WIM) User Training and Testing

    Energy Science and Technology Software Center (OSTI)

    2006-05-04

    The AIMVEE/WIM system electronically retrieves deployment information, identifies vehicles automatically, and determines total weight, individual wheel weights, individual axle weights, axle spacing, and center of balance for any wheeled vehicle in motion. The AIMVEE/WIM system can also perform these functions statically for both wheeled vehicles and cargo. The AIMVEE/WIM system incorporates digital images and applies cubing algorithms to determine length, width, and height for cubic dimensions of both vehicle and cargo. Once all this information is stored, it electronically links to data collection and dissemination systems to provide "actual" weight and measurement information for planning, deployment, and in-transit visibility. The Static Scale Conversion (SSC) system is a unique enhancement to the AIMVEE/WIM system. It enables an SSC to weigh and measure vehicles and cargo dynamically (i.e., as they pass over the large scale) and is included in the AIMVEE computer code base. The material to be copyrighted is the Automated-In-Motion Vehicle Evaluation Environment (AIMVEE)/Weigh-In-Motion User Training and Testing material. It includes instructional material on the set-up, operation, and tear-down of the AIMVEE/WIM system. It also includes a final exam associated with the training.

  18. Automated quadrilateral surface discretization method and apparatus usable to generate mesh in a finite element analysis system

    DOE Patents [OSTI]

    Blacker, Teddy D.

    1994-01-01

    An automatic quadrilateral surface discretization method and apparatus is provided for automatically discretizing a geometric region without decomposing the region. The automated quadrilateral surface discretization method and apparatus automatically generates a mesh of all quadrilateral elements which is particularly useful in finite element analysis. The generated mesh of all quadrilateral elements is boundary sensitive, orientation insensitive and has few irregular nodes on the boundary. A permanent boundary of the geometric region is input and rows are iteratively layered toward the interior of the geometric region. Also, an exterior permanent boundary and an interior permanent boundary for a geometric region may be input and the rows are iteratively layered inward from the exterior boundary in a first counter clockwise direction while the rows are iteratively layered from the interior permanent boundary toward the exterior of the region in a second clockwise direction. As a result, a high quality mesh for an arbitrary geometry may be generated with a technique that is robust and fast for complex geometric regions and extreme mesh gradations.

  19. Automated on-line L-edge measurement of SNM concentration for near-real-time accounting

    SciTech Connect (OSTI)

    Russo, P.A.; Marks, T. Jr.; Stephens, M.M.; Hsue, S.T.; Baker, A.L.; Cobb, D.D.

    1982-01-01

    The L-edge densitometer developed at Los Alamos National Laboratory has been modified, tested, and demonstrated for on-line assay of special nuclear material concentration in flowing solution streams. The demonstration was part of a larger demonstration of near-real-time nuclear materials accounting during a continuous, week-long, cold operation of the Allied General Nuclear Services facility in Barnwell, South Carolina. The L-edge data were automatically analyzed and the results were transmitted to the materials accounting computer once every 5.5 min for the duration of the cold run. This report compares the results of the L-edge analyses with the delayed results obtained from destructive analysis of samples withdrawn from the same process line. Comparisons are also made with the results obtained in near real time from an automated process control instrument installed in series with the L-edge densitometer. The performance of the L-edge instrument was reliable throughout the continuous operation. The assay precision was consistent with that predicted by the counting statistics of the measurement. The results of the L-edge assays show good agreement with those of the destructive assays. A gradually varying discrepancy (of a few percent) between the L-edge and the process control results remains unexplained. 9 figures.

  20. Opportunities for Automated Demand Response in Wastewater Treatment Facilities in California - Southeast Water Pollution Control Plant Case Study

    SciTech Connect (OSTI)

    Olsen, Daniel; Goli, Sasank; Faulkner, David; McKane, Aimee

    2012-12-20

    This report details a study into the demand response potential of a large wastewater treatment facility in San Francisco. Previous research had identified wastewater treatment facilities as good candidates for demand response and automated demand response, and this study was conducted to investigate facility attributes that are conducive to demand response or which hinder its implementation. One year's worth of operational data was collected from the facility's control system, submetered process equipment, utility electricity demand records, and governmental weather stations. These data were analyzed to determine factors which affected facility power demand and demand response capabilities. The average baseline demand at the Southeast facility was approximately 4 MW. During the rainy season (October-March) the facility treated 40% more wastewater than in the dry season, but demand only increased by 4%. Submetering of the facility's lift pumps and centrifuges predicted load shift capabilities of 154 kW and 86 kW, respectively, with large lift pump shifts in the rainy season. Analysis of demand data during maintenance events confirmed the magnitude of these possible load shifts, and indicated other areas of the facility with demand response potential. Load sheds were seen to be possible by shutting down a portion of the facility's aeration trains (average shed of 132 kW). Load shifts were seen to be possible by shifting operation of centrifuges, the gravity belt thickener, lift pumps, and external pump stations. These load shifts were made possible by the storage capabilities of the facility and of the city's sewer system. Large load reductions (an average of 2,065 kW) were seen from operating the cogeneration unit, but normal practice is continuous operation, precluding its use for demand response. The study also identified potential demand response opportunities that warrant further study: modulating variable-demand aeration loads, shifting operation of sludge-processing equipment besides centrifuges, and utilizing schedulable self-generation.

  1. AUTOMATED DEAD-END ULTRAFILTRATION FOR ENHANCED SURVEILLANCE OF LEGIONELLA PNEUMOPHILA AND LEGIONELLA SPP. IN COOLING TOWER WATERS

    SciTech Connect (OSTI)

    Brigmon, R.; Leskinen, S.; Kearns, E.; Jones, W.; Miller, R.; Betivas, C.; Kingsley, M.; Lim, D.

    2011-10-10

    Detection of Legionella pneumophila in cooling towers and domestic hot water systems involves concentration by centrifugation or membrane filtration prior to inoculation onto growth media or analysis using techniques such as PCR or immunoassays. The Portable Multi-use Automated Concentration System (PMACS) was designed for concentrating microorganisms from large volumes of water in the field and was assessed for enhancing surveillance of L. pneumophila at the Savannah River Site, SC. PMACS samples (100 L; n = 28) were collected from six towers between August 2010 and April 2011, with grab samples (500 ml; n = 56) being collected before and after each PMACS sample. All samples were analyzed for the presence of L. pneumophila by direct fluorescence immunoassay (DFA) using FITC-labeled monoclonal antibodies targeting serogroups 1, 2, 4, and 6. qPCR was utilized for detection of Legionella spp. in the same samples. Counts of L. pneumophila from DFA and of Legionella spp. from qPCR were normalized to cells/L tower water. Concentrations were similar between grab and PMACS samples collected throughout the study by DFA analysis (P = 0.4461; repeated measures ANOVA). The same trend was observed with qPCR. However, PMACS concentration proved advantageous over membrane filtration by providing larger-volume, more representative samples of the cooling tower environment, which led to reduced variability among sampling events and increased the probability of detection of low-level targets. These data highlight the utility of the PMACS for enhanced surveillance of L. pneumophila by providing improved sampling of the cooling tower environment.

  2. Long-term elemental dry deposition fluxes measured around Lake Michigan with an automated dry deposition sampler

    SciTech Connect (OSTI)

    Shahin, U.; Yi, S.M.; Paode, R.D.; Holsen, T.M.

    2000-05-15

    Long-term measurements of mass and elemental dry deposition (Mg, Al, V, Cr, Mn, Ni, Co, Cu, Zn, As, Sr, Mo, Cd, Sb, Ba, and Pb) were made with an automated dry deposition sampler (Eagle II) containing knife-edge surrogate surfaces during the Lake Michigan Mass Balance/Mass Budget Study. Measurements were made over a roughly 700-day period in Chicago, IL; in South Haven and Sleeping Bear Dunes, MI; and over Lake Michigan on the 68th Street drinking water intake cribs from December 1993 to October 1995. Average mass fluxes in Chicago, South Haven, Sleeping Bear Dunes, and the 68th Street crib were 65, 10, 3.6, and 12 mg m{sup {minus}2} day{sup {minus}1}, respectively. Primarily crustal elemental fluxes were significantly smaller than the mass fluxes but higher than primarily anthropogenic elemental fluxes. For example, the average elemental fluxes of Al in Chicago, South Haven, Sleeping Bear Dunes, and the 68th Street crib were 1.0, 0.34, 0.074, and 0.34 mg m{sup {minus}2} day{sup {minus}1}, respectively. The average Pb fluxes in Chicago, South Haven, Sleeping Bear Dunes, and the 68th Street crib were 0.038, 0.023, 0.035, and 0.032 mg m{sup {minus}2} day{sup {minus}1}, respectively. The measured fluxes at the various sites were used to calculate the dry deposition loadings to the lake. These estimated fluxes were highest for Mg and lowest for Cd.

  3. Automated UF6 Cylinder Enrichment Assay: Status of the Hybrid Enrichment Verification Array (HEVA) Project: POTAS Phase II

    SciTech Connect (OSTI)

    Jordan, David V.; Orton, Christopher R.; Mace, Emily K.; McDonald, Benjamin S.; Kulisek, Jonathan A.; Smith, Leon E.

    2012-06-01

    Pacific Northwest National Laboratory (PNNL) intends to automate the UF6 cylinder nondestructive assay (NDA) verification currently performed by the International Atomic Energy Agency (IAEA) at enrichment plants. PNNL is proposing the installation of a portal monitor at a key measurement point to positively identify each cylinder, measure its mass and enrichment, store the data along with operator inputs in a secure database, and maintain continuity of knowledge on measured cylinders until inspector arrival. This report summarizes the status of the research and development of an enrichment assay methodology supporting the cylinder verification concept. The enrichment assay approach exploits a hybrid of two passively-detected ionizing-radiation signatures: the traditional enrichment meter signature (186-keV photon peak area) and a non-traditional signature, manifested in the high-energy (3 to 8 MeV) gamma-ray continuum, generated by neutron emission from UF6. PNNL has designed, fabricated, and field-tested several prototype assay sensor packages in an effort to demonstrate proof-of-principle for the hybrid assay approach, quantify the expected assay precision for various categories of cylinder contents, and assess the potential for unsupervised deployment of the technology in a portal-monitor form factor. We refer to recent sensor-package prototypes as the Hybrid Enrichment Verification Array (HEVA). The report provides an overview of the assay signatures and summarizes the results of several HEVA field measurement campaigns on populations of Type 30B UF6 cylinders containing low-enriched uranium (LEU), natural uranium (NU), and depleted uranium (DU). Approaches to performance optimization of the assay technique via radiation transport modeling are briefly described, as are spectroscopic and data-analysis algorithms.
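    The traditional signature mentioned above rests on the enrichment-meter principle: for UF6 that is thicker than the "infinite thickness" for 186-keV gamma rays, the net 186-keV peak count rate is proportional to the {sup 235}U enrichment for a fixed container wall and detector geometry. A schematic form (our notation, with K an empirical calibration constant; not quoted from the report) is:

```latex
% Enrichment-meter principle (schematic; K absorbs geometry and wall attenuation)
E_{235} \;\propto\; C_{186}
\quad\Longrightarrow\quad
E_{235} = K \, C_{186}
```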

  4. Toward Semi-automated Assessment of Target Volume Delineation in Radiotherapy Trials: The SCOPE 1 Pretrial Test Case

    SciTech Connect (OSTI)

    Gwynne, Sarah; Spezi, Emiliano; Wills, Lucy; Nixon, Lisette; Hurt, Chris; Joseph, George; Evans, Mererid; Griffiths, Gareth; Crosby, Tom; Staffurth, John

    2012-11-15

    Purpose: To evaluate different conformity indices (CIs) for use in the analysis of outlining consistency within the pretrial quality assurance (Radiotherapy Trials Quality Assurance [RTTQA]) program of a multicenter chemoradiation trial of esophageal cancer and to make recommendations for their use in future trials. Methods and Materials: The National Cancer Research Institute SCOPE 1 trial is an ongoing Cancer Research UK-funded phase II/III randomized controlled trial of chemoradiation with capecitabine and cisplatin with or without cetuximab for esophageal cancer. The pretrial RTTQA program included a detailed radiotherapy protocol, an educational package, and a single mid-esophageal tumor test case that were sent to each investigator to outline. Investigator gross tumor volumes (GTVs) were received from 50 investigators in 34 UK centers, and CERR (Computational Environment for Radiotherapy Research) was used to perform an assessment of each investigator GTV against a predefined gold-standard GTV using different CIs. A new metric, the local conformity index (l-CI), that can localize areas of maximal discordance was developed. Results: The median Jaccard conformity index (JCI) was 0.69 (interquartile range, 0.62-0.70), with 14 of 50 investigators (28%) achieving a JCI of 0.7 or greater. The median geographical miss index was 0.09 (interquartile range, 0.06-0.16), and the mean discordance index was 0.27 (95% confidence interval, 0.25-0.30). The l-CI was highest in the middle section of the volume, where the tumor was bulky and more easily definable, and identified 4 slices where fewer than 20% of investigators achieved an l-CI of 0.7 or greater. Conclusions: The available CIs analyze different aspects of a gold standard-observer variation, with JCI being the most useful as a single metric. Additional information is provided by the l-CI and can focus the efforts of the RTTQA team in these areas, possibly leading to semi-automated outlining assessment.
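    Of the indices discussed, the Jaccard conformity index has the simplest closed form; for an investigator volume A and the gold-standard volume B (a standard definition in our notation, not a quotation from the paper):

```latex
% Jaccard conformity index: 1 = perfect agreement, 0 = no overlap
\mathrm{JCI}(A,B) = \frac{|A \cap B|}{|A \cup B|}
```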

  5. Implementing New Methods of Laser Marking of Items in the Nuclear Material Control and Accountability System at SSC RF-IPPE: An Automated Laser Marking System

    SciTech Connect (OSTI)

    Regoushevsky, V I; Tambovtsev, S D; Dvukhsherstnov, V G; Efimenko, V F; Ilyantsev, A I; Russ III, G P

    2009-05-18

    For over ten years SSC RF-IPPE, together with the US DOE National Laboratories, has been working on implementing automated control and accountability methods for nuclear materials and other items. Initial efforts to use adhesive bar codes or ones printed (painted) onto metal revealed that these methods were inconvenient and lacked durability under operational conditions. For NM disk applications in critical stands, there is the additional requirement that labels not affect the neutron characteristics of the critical assembly. This is particularly true for the many stainless-steel clad disks containing highly enriched uranium (HEU) and plutonium that are used at SSC RF-IPPE for modeling nuclear power reactors. In search of an alternate method for labeling these disks, we tested several technological options, including laser marking and two-dimensional codes. As a result, the method of laser coloring was chosen in combination with Data Matrix ECC200 symbology. To implement laser marking procedures for the HEU disks and meet all the nuclear material (NM) handling standards and rules, IPPE staff, with U.S. technical and financial support, implemented an automated laser marking system; there are also specially developed procedures for NM movements during laser marking. For the laser marking station, a Zenith 10F system by Telesis Technologies (10 watt Ytterbium Fiber Laser and Merlin software) is used. The presentation includes a flowchart for the automated system and a list of specially developed procedures with comments. Among other things, approaches are discussed for human-factor considerations. To date, markings have been applied to numerous steel-clad HEU disks, and the work continues. In the future this method is expected to be applied to other MC&A items.

  6. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    SciTech Connect (OSTI)

    Liu, H; Liang, X; Kalbasi, A; Lin, A; Ahn, P; Both, S

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit the robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP, and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.

  7. SU-C-9A-02: Structured Noise Index as An Automated Quality Control for Nuclear Medicine: A Two Year Experience

    SciTech Connect (OSTI)

    Nelson, J; Christianson, O; Samei, E

    2014-06-01

    Purpose: Flood-field uniformity evaluation is an essential element in the assessment of nuclear medicine (NM) gamma cameras. It serves as the central element of the quality control (QC) program, acquired and analyzed on a daily basis prior to clinical imaging. Uniformity images are traditionally analyzed using pixel value-based metrics, which often fail to capture subtle structure and patterns caused by changes in gamma camera performance, requiring additional visual inspection, which is subjective and time demanding. The goal of this project was to develop and implement a robust QC metrology for NM that is effective in identifying non-uniformity issues, reporting issues in a timely manner for efficient correction prior to clinical involvement, all incorporated into an automated effortless workflow, and to characterize the program over a two year period. Methods: A new quantitative uniformity analysis metric was developed based on 2D noise power spectrum metrology and confirmed based on expert observer visual analysis. The metric, termed the Structured Noise Index (SNI), was then integrated into an automated program to analyze, archive, and report on daily NM QC uniformity images. The effectiveness of the program was evaluated over a period of 2 years. Results: The SNI metric successfully identified visually apparent non-uniformities overlooked by the pixel value-based analysis methods. Implementation of the program has resulted in nonuniformity identification in about 12% of daily flood images. In addition, due to the vigilance of staff response, the percentage of days exceeding the trigger value shows a decline over time. Conclusion: The SNI provides a robust quantification of the NM performance of gamma camera uniformity. It operates seamlessly across a fleet of multiple camera models. The automated process provides effective workflow within the NM spectra between physicist, technologist, and clinical engineer. The reliability of this process has made it the preferred platform for NM uniformity analysis.
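    A generic noise-power-spectrum estimate of the kind the SNI builds on can be computed from a flood image by averaging the squared FFT magnitude of mean-subtracted regions of interest. The sketch below is illustrative only; the ROI size, normalization, and function names are assumptions, not the authors' SNI implementation.

```python
# Generic 2D noise-power-spectrum estimate from a uniform flood image.
import numpy as np

def nps_2d(flood_image, pixel_size_mm=1.0, roi=64):
    img = np.asarray(flood_image, dtype=float)
    ny, nx = img.shape
    spectra = []
    for y in range(0, ny - roi + 1, roi):
        for x in range(0, nx - roi + 1, roi):
            patch = img[y:y + roi, x:x + roi]
            patch = patch - patch.mean()            # remove the uniform (DC) component
            f = np.fft.fftshift(np.fft.fft2(patch))
            spectra.append(np.abs(f) ** 2)
    # Average over ROIs and apply a simple area normalization.
    return np.mean(spectra, axis=0) * (pixel_size_mm ** 2) / (roi * roi)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    flood = 1000 + rng.normal(0, 30, size=(256, 256))  # synthetic uniform image
    print(nps_2d(flood).shape)
```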

  8. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    SciTech Connect (OSTI)

    Wu Binbin; McNutt, Todd; Zahurak, Marianna; Simari, Patricio; Pang, Dalong; Taylor, Russell; Sanguineti, Giuseppe

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  9. Poster Thur Eve 51: An analysis of the effectiveness of automated pre-, post- and intra-treatment auditing of electronic health records

    SciTech Connect (OSTI)

    Joseph, A.; Seuntjens, J.; Parker, W.; Kildea, J.; Freeman, C.

    2014-08-15

    We describe development of automated, web-based, electronic health record (EHR) auditing software for use within our paperless radiation oncology clinic. By facilitating access to multiple databases within the clinic, each patient's EHR is audited prior to treatment, regularly during treatment, and post treatment. Anomalies such as missing documentation, non-compliant workflow and treatment parameters that differ significantly from the norm may be monitored, flagged and brought to the attention of clinicians. By determining historical trends using existing patient data and by comparing new patient data with the historical, we expect our software to provide a measurable improvement in the quality of radiotherapy at our centre.

  10. An automated vacuum system

    SciTech Connect (OSTI)

    Atkins, W.H.; Vaughn, G.D.; Bridgman, C.

    1991-01-01

    Software tools available with the Ground Test Accelerator (GTA) control system provide the capability to express a control problem as a finite state machine. System states and transitions are expressed in terms of accelerator parameters and actions are taken based on state transitions. This is particularly useful for sequencing operations which are modal in nature or are unwieldy when implemented with conventional programming. State diagrams are automatically translated into code which is executed by the control system. These tools have been applied to the vacuum system for the GTA accelerator to implement automatic sequencing of operations. With a single request, the operator may initiate a complete pump-down sequence. He can monitor the progress and is notified if an anomaly occurs requiring intervention. The operator is not required to have detailed knowledge of the vacuum system and is protected from taking inappropriate actions. 1 ref., 6 figs.
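    A toy sketch of the finite-state-machine approach described above, expressed as a transition table: the states, events, and sequence are hypothetical illustrations, not the GTA control system's actual implementation.

```python
# Hypothetical pump-down state machine: a dict maps (state, event) -> next state.
PUMPDOWN_FSM = {
    "idle":        {"start_request": "roughing"},
    "roughing":    {"pressure_below_crossover": "high_vacuum", "fault": "fault"},
    "high_vacuum": {"pressure_below_base": "ready", "fault": "fault"},
    "ready":       {"vent_request": "idle", "fault": "fault"},
    "fault":       {"operator_reset": "idle"},
}

def step(state, event):
    """Return the next state, or stay in place if the event is not valid here."""
    return PUMPDOWN_FSM.get(state, {}).get(event, state)

if __name__ == "__main__":
    state = "idle"
    for event in ["start_request", "pressure_below_crossover", "pressure_below_base"]:
        state = step(state, event)
        print(event, "->", state)
```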

  11. Real time automated inspection

    DOE Patents [OSTI]

    Fant, K.M.; Fundakowski, R.A.; Levitt, T.S.; Overland, J.E.; Suresh, B.R.; Ulrich, F.W.

    1985-05-21

    A method and apparatus are described relating to the real time automatic detection and classification of characteristic type surface imperfections occurring on the surfaces of material of interest such as moving hot metal slabs produced by a continuous steel caster. A data camera transversely scans continuous lines of such a surface to sense light intensities of scanned pixels and generates corresponding voltage values. The voltage values are converted to corresponding digital values to form a digital image of the surface which is subsequently processed to form an edge-enhanced image having scan lines characterized by intervals corresponding to the edges of the image. The edge-enhanced image is thresholded to segment out the edges and objects formed by the edges by interval matching and bin tracking. Features of the objects are derived and such features are utilized to classify the objects into characteristic type surface imperfections. 43 figs.

  12. Real time automated inspection

    DOE Patents [OSTI]

    Fant, Karl M.; Fundakowski, Richard A.; Levitt, Tod S.; Overland, John E.; Suresh, Bindinganavle R.; Ulrich, Franz W.

    1985-01-01

    A method and apparatus relating to the real time automatic detection and classification of characteristic type surface imperfections occurring on the surfaces of material of interest such as moving hot metal slabs produced by a continuous steel caster. A data camera transversely scans continuous lines of such a surface to sense light intensities of scanned pixels and generates corresponding voltage values. The voltage values are converted to corresponding digital values to form a digital image of the surface which is subsequently processed to form an edge-enhanced image having scan lines characterized by intervals corresponding to the edges of the image. The edge-enhanced image is thresholded to segment out the edges and objects formed by the edges are segmented out by interval matching and bin tracking. Features of the objects are derived and such features are utilized to classify the objects into characteristic type surface imperfections.

  13. Home Automation Interoperability

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    within a home and share information through the use of industry standard interfaces. ... It includes references to applicable standards, electrical codes, regulations, and best ...

  14. Automated Estimating System

    Energy Science and Technology Software Center (OSTI)

    1996-04-15

    AES6.1 is a PC software package developed to aid in the preparation and reporting of cost estimates. AES6.1 provides an easy means for entering and updating the detailed cost, schedule information, project work breakdown structure, and escalation information contained in a typical project cost estimate through the use of menus and formatted input screens. AES6.1 combines this information to calculate both unescalated and escalated cost for a project, which can be reported at varying levels of detail. The major modifications to AES6.0f are as follows: the contingency update was modified to provide greater flexibility for user updates; the schedule update was modified to give the user the ability to schedule Bills of Material at the WBS/Participant/Cost Code level; the schedule plot was modified to graphically show the schedule by WBS/Participant/Cost Code; all fiscal year reporting was modified to use the new schedule format; the Schedule 1-B-7, Cost Schedule, and WBS/Participant reports were modified to determine Phase of Work from the B/M Cost Code; the utility program was modified to allow selection by cost code and to update the cost code in the Global Schedule update; generic summary and line item download were added to the utility program; and an option was added to all reports which allows the user to indicate where overhead is to be reported (bottom line or in body of report).
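    As a minimal illustration of the escalated-versus-unescalated calculation such a package performs, the sketch below compounds a base-year cost through per-year escalation rates; the rates and function name are hypothetical and do not reflect AES6.1's internal method.

```python
# Hypothetical escalation calculation: compound a base-year cost through
# a list of annual escalation rates (one rate per fiscal year of spending).
def escalated_cost(base_cost, annual_rates):
    cost = base_cost
    for rate in annual_rates:
        cost *= (1.0 + rate)
    return cost

if __name__ == "__main__":
    # $100,000 base-year cost escalated through three years at 3%, 3%, 2.5%
    print(round(escalated_cost(100_000.0, [0.03, 0.03, 0.025]), 2))
```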

  15. Automating power supply checkout

    SciTech Connect (OSTI)

    Laster, J.; Bruno, D.; D'Ottavio, T.; Drozd, J.; Marr, G.; Mi, C.

    2011-03-28

    Power Supply checkout is a necessary, pre-beam, time-critical function. At odds are the desire to decrease the amount of time needed to perform the checkout and the desire to maximize the number and types of checks that can be performed and to analyze the results quickly (in case any problems exist that must be addressed). Controls and Power Supply Group personnel have worked together to develop tools to accomplish these goals. Power Supply checkouts are now accomplished in a time-frame of hours rather than days, reducing the number of person-hours needed to accomplish the checkout and making the system available more quickly for beam development. The goal of the Collider-Accelerator Department (CAD) at Brookhaven National Laboratory is to provide experimenters with collisions of heavy ions and polarized protons. The Relativistic Heavy-Ion Collider (RHIC) magnets are controlled by hundreds of power supplies of varying types. There is a concentrated effort to perform routine maintenance on the supplies during shutdown periods. There is an effort at RHIC to streamline the time needed for system checkout in order to quickly arrive at a period of beam operations for RHIC. This time-critical period is when the checkout of the power supplies is performed as the RHIC ring becomes cold and the supplies are connected to their physical magnets. The checkout process is used to identify problems in voltage and current regulation by examining data signals related to each for problems in settling and regulation (ripple).

  16. Automated manual transmission controller

    DOE Patents [OSTI]

    Lawrie, Robert E.; Reed, Jr., Richard G.; Bernier, David R.

    1999-12-28

    A powertrain system for a hybrid vehicle. The hybrid vehicle includes a heat engine, such as a diesel engine, and an electric machine, which operates as both an electric motor and an alternator, to power the vehicle. The hybrid vehicle also includes a manual-style transmission configured to operate as an automatic transmission from the perspective of the driver. The engine and the electric machine drive an input shaft which in turn drives an output shaft of the transmission. In addition to driving the transmission, the electric machine regulates the speed of the input shaft in order to synchronize the input shaft during either an upshift or downshift of the transmission by either decreasing or increasing the speed of the input shaft. When decreasing the speed of the input shaft, the electric motor functions as an alternator to produce electrical energy which may be stored by a storage device. Operation of the transmission is controlled by a transmission controller which receives input signals and generates output signals to control shift and clutch motors to effect smooth launches, upshifts, and downshifts of the transmission, so that the transmission functions substantially as an automatic transmission from the perspective of the driver, while internally substantially functioning as a manual transmission.

  17. Secretary (Office Automation)

    Broader source: Energy.gov [DOE]

    This position is located in the Power and Operations Planning (PGP) organization of Generation Asset Management (PG), Power Services (P), Bonneville Power Administration. Power and Operations...

  18. National Metal Casting Research Institute final report. Development of an automated ultrasonic inspection cell for detecting subsurface discontinuities in cast gray iron. Volume 3

    SciTech Connect (OSTI)

    Burningham, J.S.

    1995-08-01

    This inspection cell consisted of an ultrasonic flaw detector, transducer, robot, immersion tank, computer, and software. Normal beam pulse-echo ultrasonic nondestructive testing, using the developed automated cell, was performed on 17 bosses on each rough casting. Ultrasonic transducer selection, initial inspection criteria, and ultrasonic flaw detector (UFD) setup parameters were developed for the gray iron castings used in this study. Software was developed for control of the robot and UFD in real time. The software performed two main tasks: emulating the manual operation of the UFD, and evaluating the ultrasonic signatures for detecting subsurface discontinuities. A random lot of 105 castings was tested; the 100 castings that passed were returned to the manufacturer for machining into finished parts and subsequent inspection. The other 5 castings had one boss each with ultrasonic signatures consistent with subsurface discontinuities. The cell was successful in quantifying the ultrasonic echo signatures for the existence of signature characteristics consistent with Go/NoGo criteria developed from simulated defects. Manual inspection showed that no defects in the areas inspected by the automated cell avoided detection in the 100 castings machined into finished parts. Of the 5 bosses found to have subsurface discontinuities, two were verified by manual inspection. The cell correctly classified 1782 of the 1785 bosses (99.832%) inspected.
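    The Go/NoGo evaluation described above can be pictured as a simple gated-amplitude test on each pulse-echo A-scan. The sketch below is a hypothetical illustration; the gate positions and threshold are invented and are not the study's actual criteria.

```python
# Hypothetical gated Go/NoGo check on a pulse-echo A-scan: flag the boss if any
# echo between the front-wall and back-wall gates exceeds a threshold amplitude.
import numpy as np

def go_nogo(a_scan, gate_start, gate_end, threshold):
    gated = np.asarray(a_scan, dtype=float)[gate_start:gate_end]
    return "NoGo" if np.any(np.abs(gated) > threshold) else "Go"

if __name__ == "__main__":
    signal = np.zeros(500)
    signal[220] = 0.6   # synthetic flaw echo inside the gate
    print(go_nogo(signal, 100, 400, 0.3))
```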

  19. Opportunities for Open Automated Demand Response in Wastewater Treatment Facilities in California - Phase II Report. San Luis Rey Wastewater Treatment Plant Case Study

    SciTech Connect (OSTI)

    Thompson, Lisa; Lekov, Alex; McKane, Aimee; Piette, Mary Ann

    2010-08-20

    This case study enhances the understanding of open automated demand response opportunities in municipal wastewater treatment facilities. The report summarizes the findings of a 100-day submetering project at the San Luis Rey Wastewater Treatment Plant, a municipal wastewater treatment facility in Oceanside, California. The report reveals that key energy-intensive equipment such as pumps and centrifuges can be targeted for large load reductions. Demand response tests on the effluent pumps resulted in a 300 kW load reduction, and tests on centrifuges resulted in a 40 kW load reduction. Although tests on the facility's blowers resulted in peak period load reductions of 78 kW, sharp, short-lived increases in the turbidity of the wastewater effluent were experienced within 24 hours of the test. The results of these tests, which were conducted on blowers without variable speed drive capability, would not be acceptable and warrant further study. This study finds that wastewater treatment facilities have significant open automated demand response potential. However, limiting factors to implementing demand response are the reaction of effluent turbidity to reduced aeration load, along with the cogeneration capabilities of municipal facilities, including existing power purchase agreements and utility receptiveness to purchasing electricity from cogeneration facilities.

  20. Automated fibroglandular tissue segmentation and volumetric density estimation in breast MRI using an atlas-aided fuzzy C-means method

    SciTech Connect (OSTI)

    Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Kontos, Despina

    2013-12-15

    Purpose: Breast magnetic resonance imaging (MRI) plays an important role in the clinical management of breast cancer. Studies suggest that the relative amount of fibroglandular (i.e., dense) tissue in the breast as quantified in MR images can be predictive of the risk for developing breast cancer, especially for high-risk women. Automated segmentation of the fibroglandular tissue and volumetric density estimation in breast MRI could therefore be useful for breast cancer risk assessment. Methods: In this work the authors develop and validate a fully automated segmentation algorithm, namely, an atlas-aided fuzzy C-means (FCM-Atlas) method, to estimate the volumetric amount of fibroglandular tissue in breast MRI. The FCM-Atlas is a 2D segmentation method working on a slice-by-slice basis. FCM clustering is first applied to the intensity space of each 2D MR slice to produce an initial voxelwise likelihood map of fibroglandular tissue. Then a prior learned fibroglandular tissue likelihood atlas is incorporated to refine the initial FCM likelihood map to achieve enhanced segmentation, from which the absolute volume of the fibroglandular tissue (|FGT|) and the relative amount (i.e., percentage) of the |FGT| relative to the whole breast volume (FGT%) are computed. The authors' method is evaluated by a representative dataset of 60 3D bilateral breast MRI scans (120 breasts) that span the full breast density range of the American College of Radiology Breast Imaging Reporting and Data System. The automated segmentation is compared to manual segmentation obtained by two experienced breast imaging radiologists. Segmentation performance is assessed by linear regression, Pearson's correlation coefficients, Student's paired t-test, and Dice's similarity coefficients (DSC). Results: The inter-reader correlation is 0.97 for FGT% and 0.95 for |FGT|. When compared to the average of the two readers' manual segmentations, the proposed FCM-Atlas method achieves a correlation of r = 0.92 for FGT% and r = 0.93 for |FGT|, and the automated segmentation is not statistically significantly different (p = 0.46 for FGT% and p = 0.55 for |FGT|). The bilateral correlation between left breasts and right breasts for the FGT% is 0.94, 0.92, and 0.95 for reader 1, reader 2, and the FCM-Atlas, respectively; likewise, for the |FGT|, it is 0.92, 0.92, and 0.93, respectively. For the spatial segmentation agreement, the automated algorithm achieves a DSC of 0.69 ± 0.1 when compared to reader 1 and 0.61 ± 0.1 for reader 2, respectively, while the DSC between the two readers' manual segmentations is 0.67 ± 0.15. Additional robustness analysis shows that the segmentation performance of the authors' method is stable both with respect to selecting different cases and to varying the number of cases needed to construct the prior probability atlas. The authors' results also show that the proposed FCM-Atlas method outperforms the commonly used two-cluster FCM-alone method. The authors' method runs at ~5 min for each 3D bilateral MR scan (56 slices) for computing the FGT% and |FGT|, compared to ~55 min needed for manual segmentation for the same purpose. Conclusions: The authors' method achieves robust segmentation and can serve as an efficient tool for processing large clinical datasets for quantifying the fibroglandular tissue content in breast MRI. It holds great potential to support clinical applications in the future, including breast cancer risk assessment.
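    The spatial-agreement metric reported above, the Dice similarity coefficient, has a simple generic implementation; the sketch below is ours (not the authors' evaluation code) and applies to any pair of binary segmentation masks.

```python
# Dice similarity coefficient: 2|A∩B| / (|A| + |B|) for two binary masks.
import numpy as np

def dice(mask_a, mask_b):
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

if __name__ == "__main__":
    a = np.zeros((4, 4), bool); a[1:3, 1:3] = True
    b = np.zeros((4, 4), bool); b[1:4, 1:3] = True
    print(round(dice(a, b), 3))
```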

  1. Glow-to-arc transition events in H{sub 2}-Ar direct current pulsed plasma: Automated measurement of current and voltage

    SciTech Connect (OSTI)

    Mendes, Luciano A.; Rodrigues, Jhonatam C.; Mafra, Marcio

    2012-01-15

    The glow-to-arc transition phenomenon (arcing) observed in plasma reactors used in materials processing was studied through the arcs' characteristic current and voltage waveforms. In order to capture these arc signals, a LabVIEW-based automated instrumentation system (ARCVIEW) was developed, including the integration of an oscilloscope equipped with proper current and voltage probes. The system also allows capturing the process parameters at the moments of arc occurrence, which were used to map the conditions of the arc events. Experiments in H{sub 2}-Ar DC pulsed plasma returned signal data from 215 arc events, which were analyzed through software routines. According to the results, an anti-arcing system should react within a few microseconds to prevent most of the damage caused by the undesired arcing phenomenon.
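
    As an illustration of how captured waveforms can be screened for arc events, here is a hedged Python sketch (not the ARCVIEW routines): it flags excursions of the discharge current above a threshold and estimates how fast the current rises, which is what bounds the microsecond-scale reaction time mentioned above. The threshold and the synthetic waveform are assumptions.

    # Illustrative sketch only: detect glow-to-arc transitions as current excursions above
    # a threshold and report the rise time from half threshold to the crossing point.
    import numpy as np

    def detect_arc_events(t, current, i_threshold):
        """Return (start_time, rise_time) for each excursion of `current` above i_threshold."""
        above = current > i_threshold
        starts = np.flatnonzero(~above[:-1] & above[1:]) + 1   # upward crossings
        events = []
        for idx in starts:
            base = idx
            # walk back to the last sample at or below half the threshold
            while base > 0 and current[base] > 0.5 * i_threshold:
                base -= 1
            events.append((t[idx], t[idx] - t[base]))
        return events

    # Usage with synthetic data: a 2 A glow current with one ~2 us wide, 20 A arc spike.
    t = np.linspace(0.0, 1e-3, 100_000)                     # 1 ms at 10 ns resolution
    current = 2.0 + 18.0 * np.exp(-((t - 5e-4) / 2e-6) ** 2)
    for start, rise in detect_arc_events(t, current, i_threshold=10.0):
        print(f"arc at {start*1e6:.1f} us, rise time ~{rise*1e6:.2f} us")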

  2. MicroCT: Automated Analysis of CT Reconstructed Data of Home Made Explosive Materials Using the Matlab MicroCT Analysis GUI

    SciTech Connect (OSTI)

    Seetho, I M; Brown, W D; Kallman, J S; Martz, H E; White, W T

    2011-09-22

    This Standard Operating Procedure (SOP) provides the specific procedural steps for analyzing reconstructed CT images obtained under the IDD Standard Operating Procedures for data acquisition [1] and MicroCT image reconstruction [2], per the IDD Quality Assurance Plan for MicroCT Scanning [3]. Although intended to apply primarily to MicroCT data acquired in the HEAFCAT Facility at LLNL, these procedures may also be applied to data acquired at Tyndall from the YXLON cabinet and at TSL from the HEXCAT system. This SOP also provides the procedural steps for preparing the tables and graphs to be used in the reporting of analytical results. This SOP applies to production work - for R and D there are two other semi-automated methods as given in [4, 5].

  3. OSTI, US Dept of Energy, Office of Scientific and Technical Information |

    Office of Scientific and Technical Information (OSTI)

    Automated Protocols Used by National Labs (AN 241.1): DOE laboratories have three options for automated submission of AN 241.1 information: AN 241.1 via Batch Upload, AN 241.1 via Harvesting, and AN 241.1 via Web Service. All three options must meet the same basic E-Link business rules in terms of required and optional metadata. The differences come from the software code underlying each submittal option/mechanism. Sometimes there are...
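
    Since all three submission routes must satisfy the same E-Link business rules for required and optional metadata, a submitter could run one shared validation step regardless of transport. The sketch below is hypothetical; the field names are assumptions and are not the actual E-Link schema.

    # Hypothetical pre-submission check (not the actual E-Link schema): whichever route is
    # used (batch upload, harvesting, or web service), validate the record against one
    # shared list of required fields before sending it. Field names below are assumed.
    REQUIRED_FIELDS = {"title", "report_number", "publication_date", "site_url"}   # assumption
    OPTIONAL_FIELDS = {"abstract", "subject_categories", "doi"}                    # assumption

    def validate_record(record: dict) -> list[str]:
        """Return a list of problems; an empty list means the record passes the shared rules."""
        problems = [f"missing required field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
        unknown = record.keys() - REQUIRED_FIELDS - OPTIONAL_FIELDS
        problems += [f"unrecognized field: {f}" for f in sorted(unknown)]
        return problems

    # The same validation runs no matter which transport submits the record.
    record = {"title": "Example report", "publication_date": "2016-01-01"}
    print(validate_record(record))   # e.g. ['missing required field: report_number', ...]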

  4. SU-E-I-89: Assessment of CT Radiation Dose and Image Quality for An Automated Tube Potential Selection Algorithm Using Pediatric Anthropomorphic and ACR Phantoms

    SciTech Connect (OSTI)

    Mahmood, U; Erdi, Y; Wang, W

    2014-06-01

    Purpose: To assess the impact of General Electric's automated tube potential algorithm, kV assist (kVa), on radiation dose and image quality, with an emphasis on optimizing protocols based on noise texture. Methods: Radiation dose was assessed by inserting optically stimulated luminescence dosimeters (OSLs) throughout the body of a pediatric anthropomorphic phantom (CIRS). The baseline protocol was 120 kVp, 80 mA, 0.7 s rotation time. Image quality was assessed by calculating the contrast to noise ratio (CNR) and noise power spectrum (NPS) from the ACR CT accreditation phantom. CNRs were calculated according to the steps described in the ACR CT phantom testing document. NPS was determined by taking the 3D FFT of the uniformity section of the ACR phantom. NPS and CNR were evaluated with and without kVa and for all available adaptive iterative statistical reconstruction (ASiR) settings, ranging from 0 to 100%. Each NPS was also evaluated for its peak frequency difference (PFD) with respect to the baseline protocol. Results: For the baseline protocol, CNR was found to decrease from 0.460 ± 0.182 to 0.420 ± 0.057 when kVa was activated. When compared against the baseline protocol, the PFD at ASiR of 40% yielded a decrease in noise magnitude as realized by the increase in CNR = 0.620 ± 0.040. The liver dose decreased by 30% with kVa activation. Conclusion: Application of kVa reduces the liver dose by up to 30%. However, reduction in image quality for abdominal scans occurs when using the automated tube voltage selection feature at the baseline protocol. As demonstrated by the CNR and NPS analysis, the texture and magnitude of the noise in reconstructed images at ASiR 40% was found to be the same as in our baseline images. We have demonstrated that 30% dose reduction is possible when using 40% ASiR with kVa in pediatric patients.
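
    The noise-texture comparison described above rests on two quantities: a noise power spectrum (NPS) estimated by Fourier transforming uniform-phantom ROIs, and the shift of its peak frequency between protocols (the PFD). A simplified 2D Python sketch follows (the abstract uses a 3D FFT); the ROI handling, normalization, and binning are assumptions, not the authors' code.

    # Simplified sketch: radially averaged NPS from square, equal-sized uniform-phantom ROIs,
    # plus the peak frequency difference (PFD) between two protocols.
    import numpy as np

    def nps_radial(rois, pixel_mm):
        """rois: iterable of square 2D arrays (HU) from the uniform section; returns (freq, nps)."""
        rois = [np.asarray(r, dtype=float) for r in rois]
        rois = [r - r.mean() for r in rois]                   # remove the zero-frequency term
        n = rois[0].shape[0]
        nps2d = np.zeros((n, n))
        for r in rois:
            nps2d += np.abs(np.fft.fftshift(np.fft.fft2(r))) ** 2
        nps2d *= (pixel_mm ** 2) / (len(rois) * n * n)        # common 2D NPS normalization
        freqs = np.fft.fftshift(np.fft.fftfreq(n, d=pixel_mm))
        fy, fx = np.meshgrid(freqs, freqs, indexing="ij")
        radius = np.hypot(fx, fy)
        bins = np.linspace(0.0, radius.max(), 50)
        which = np.digitize(radius.ravel(), bins)
        flat = nps2d.ravel()
        nps1d = np.array([flat[which == i].mean() if np.any(which == i) else 0.0
                          for i in range(1, len(bins))])
        return bins[1:], nps1d

    def peak_frequency_difference(freq, nps_a, nps_b):
        """PFD: shift of the NPS peak of protocol B relative to protocol A (cycles/mm)."""
        return freq[np.argmax(nps_b)] - freq[np.argmax(nps_a)]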

  5. Index of /datasets/files/41/pub/PUBID8_0730

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  6. Index of /datasets/files/41/pub/PUBID8_4282

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  7. Index of /datasets/files/41/pub/PUBID8_4070

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  8. Index of /datasets/files/41/pub/PUBID8_3941

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); subdirectories Exist and MaxTech; remaining contents unavailable: HTTP 429 Throttled (bot load).

  9. Index of /datasets/files/41/pub/PUBID8_3090

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  10. Index of /datasets/files/41/pub/PUBID8_0772

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  11. Index of /datasets/files/41/pub/PUBID8_1095

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  12. Index of /datasets/files/41/pub/PUBID8_3857

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  13. Index of /datasets/files/41/pub/PUBID8_0332

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); subdirectory MaxTech; remaining contents unavailable: HTTP 429 Throttled (bot load).

  14. Index of /datasets/files/41/pub/PUBID8_3844

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  15. Index of /datasets/files/41/pub/PUBID8_0536

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  16. Index of /datasets/files/41/pub/PUBID8_0245

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  17. Index of /datasets/files/41/pub/PUBID8_1803

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  18. Index of /datasets/files/41/pub/PUBID8_2886

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  19. Index of /datasets/files/41/pub/PUBID8_5373

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  20. Index of /datasets/files/41/pub/PUBID8_0697

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  1. Index of /datasets/files/41/pub/PUBID8_3365

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  2. Index of /datasets/files/41/pub/PUBID8_0001/Base

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); subdirectory Sim1; remaining contents unavailable: HTTP 429 Throttled (bot load).

  3. Index of /datasets/files/41/pub/PUBID8_4214

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  4. Index of /datasets/files/41/pub/PUBID8_0001/Exist

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); subdirectory Sim1; remaining contents unavailable: HTTP 429 Throttled (bot load).

  5. Index of /datasets/files/41/pub/PUBID8_1613

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  6. Index of /datasets/files/41/pub/PUBID8_0074

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  7. Index of /datasets/files/41/pub/PUBID8_1997

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  8. Index of /datasets/files/41/pub/PUBID8_1506

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); subdirectories Exist and MaxTech; remaining contents unavailable: HTTP 429 Throttled (bot load).

  9. Index of /datasets/files/41/pub/PUBID8_2278

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  10. Index of /datasets/files/41/pub/PUBID8_4510

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  11. Index of /datasets/files/41/pub/PUBID8_0001/MaxTech

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); subdirectory Sim1; remaining contents unavailable: HTTP 429 Throttled (bot load).

  12. Index of /datasets/files/41/pub/PUBID8_5935

    Open Energy Info (EERE)

    Directory listing (file-level view of datasets); contents unavailable: HTTP 429 Throttled (bot load).

  13. Oak Ridge K-25 Site Technology Logic Diagram. Volume 3, Technology evaluation data sheets; Part B, Remedial action, robotics/automation, waste management

    SciTech Connect (OSTI)

    Fellows, R.L.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration (ER) and waste management (WM) problems at the Oak Ridge K-25 Site. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remediation, decontamination, and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This Vol. 3B provides the Technology Evaluation Data Sheets (TEDS) for ER/WM activities (Remedial Action, Robotics and Automation, Waste Management) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than is given for each technology in Vol. 2. The TEDS are arranged alphanumerically by the TEDS code number in the upper right corner of each data sheet. Volume 3 can be used in two ways: (1) technologies that are identified from Vol. 2 can be referenced directly in Vol. 3 by using the TEDS codes, and (2) technologies and general technology areas (alternatives) can be located in the index in the front of this volume.

  14. Automated detection of cloud and cloud-shadow in single-date Landsat imagery using neural networks and spatial post-processing

    SciTech Connect (OSTI)

    Hughes, Michael J. [University of Tennessee, Knoxville (UTK)] [University of Tennessee, Knoxville (UTK); Hayes, Daniel J [ORNL] [ORNL

    2014-01-01

    Use of Landsat data to answer ecological questions is contingent on the effective removal of cloud and cloud shadow from satellite images. We develop a novel algorithm to identify and classify clouds and cloud shadow, SPARCS: Spatial Procedures for Automated Removal of Cloud and Shadow. The method uses neural networks to determine cloud, cloud-shadow, water, snow/ice, and clear-sky membership of each pixel in a Landsat scene, and then applies a set of procedures to enforce spatial rules. In a comparison to FMask, a high-quality cloud and cloud-shadow classification algorithm currently available, SPARCS performs favorably, with similar omission errors for clouds (0.8% and 0.9%, respectively), substantially lower omission error for cloud-shadow (8.3% and 1.1%), and fewer errors of commission (7.8% and 5.0%). Additionally, SPARCS provides a measure of uncertainty in its classification that can be exploited by other processes that use the cloud and cloud-shadow detection. To illustrate this, we present an application that constructs obstruction-free composites of images acquired on different dates in support of algorithms detecting vegetation change.
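
    For reference, the omission and commission percentages quoted above can be computed from per-pixel truth and prediction maps as in the Python sketch below; the label convention (1 = cloud) is an assumption, not taken from the paper.

    # Sketch of the error definitions behind the quoted percentages:
    # omission error   = fraction of true cloud pixels the classifier missed,
    # commission error = fraction of predicted cloud pixels that are not truly cloud.
    import numpy as np

    def omission_commission(truth, predicted, cls):
        """truth, predicted: integer label arrays; cls: the class of interest (e.g. cloud)."""
        is_true = truth == cls
        is_pred = predicted == cls
        omission = np.count_nonzero(is_true & ~is_pred) / max(np.count_nonzero(is_true), 1)
        commission = np.count_nonzero(is_pred & ~is_true) / max(np.count_nonzero(is_pred), 1)
        return 100.0 * omission, 100.0 * commission

    # Example with a tiny labeled scene (0 = clear, 1 = cloud).
    truth = np.array([[1, 1, 0], [0, 1, 0]])
    pred  = np.array([[1, 0, 0], [1, 1, 0]])
    print(omission_commission(truth, pred, cls=1))   # (33.3..., 33.3...)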

  15. Nuclear Energy Research Initiative Project No. 02 103 Innovative Low Cost Approaches to Automating QA/QC of Fuel Particle Production Using On Line Nondestructive Methods for Higher Reliability Final Project Report

    SciTech Connect (OSTI)

    Ahmed, Salahuddin; Batishko, Charles R.; Flake, Matthew; Good, Morris S.; Mathews, Royce; Morra, Marino; Panetta, Paul D.; Pardini, Allan F.; Sandness, Gerald A.; Tucker, Brian J.; Weier, Dennis R.; Hockey, Ronald L.; Gray, Joseph N.; Saurwein, John J.; Bond, Leonard J.; Lowden, Richard A.; Miller, James H.

    2006-02-28

    This Nuclear Energy Research Initiative (NERI) project was tasked with exploring, adapting, developing, and demonstrating innovative nondestructive test methods to automate nuclear coated particle fuel inspection, so as to provide the United States (US) with the improved and economical quality assurance and control (QA/QC) needed for the fuels of several reactor concepts being proposed for both near-term deployment [DOE NE & NERAC, 2001] and Generation IV nuclear systems. Replacing present-day QA/QC methods, done manually and in many cases destructively, with higher speed automated nondestructive methods will make fuel production for advanced reactors economically feasible. For successful deployment of next generation reactors that employ particle fuels, or fuels in the form of pebbles based on particles, extremely large numbers of fuel particles will require inspection at throughput rates that do not significantly impact the proposed manufacturing processes. The focus of the project is nondestructive examination (NDE) technologies that can be automated for production speeds and make either (i) on-process measurements or (ii) in-line measurements. The inspection technologies selected will enable particle quality qualification as a particle or group of particles passes a sensor. A multiple-attribute-dependent signature will be measured and used for qualification or process control decisions. A primary task for achieving this objective is to establish standard signatures for both good/acceptable particles and the most problematic types of defects using several nondestructive methods.

  16. SU-E-I-81: Assessment of CT Radiation Dose and Image Quality for An Automated Tube Potential Selection Algorithm Using Adult Anthropomorphic and ACR Phantoms

    SciTech Connect (OSTI)

    Mahmood, U; Erdi, Y; Wang, W

    2014-06-01

    Purpose: To assess the impact of General Electric's (GE) automated tube potential algorithm, kV assist (kVa), on radiation dose and image quality, with an emphasis on optimizing protocols based on noise texture. Methods: Radiation dose was assessed by inserting optically stimulated luminescence dosimeters (OSLs) throughout the body of an adult anthropomorphic phantom (CIRS). The baseline protocol was 120 kVp, Auto mA (180 to 380 mA), noise index (NI) = 14, adaptive iterative statistical reconstruction (ASiR) of 20%, 0.8 s rotation time. Image quality was evaluated by calculating the contrast to noise ratio (CNR) and noise power spectrum (NPS) from the ACR CT accreditation phantom. CNRs were calculated according to the steps described in the ACR CT phantom testing document. NPS was determined by taking the 3D FFT of the uniformity section of the ACR phantom. NPS and CNR were evaluated with and without kVa and for all available adaptive iterative statistical reconstruction (ASiR) settings, ranging from 0 to 100%. Each NPS was also evaluated for its peak frequency difference (PFD) with respect to the baseline protocol. Results: The CNR for the adult male was found to decrease from CNR = 0.912 ± 0.045 for the baseline protocol without kVa to a CNR = 0.756 ± 0.049 with kVa activated. When compared against the baseline protocol, the PFD at ASiR of 40% yielded a decrease in noise magnitude as realized by the increase in CNR = 0.903 ± 0.023. The difference in the central liver dose with and without kVa was found to be 0.07%. Conclusion: Dose reduction was insignificant in the adult phantom. As determined by NPS analysis, ASiR of 40% produced images with similar noise texture to the baseline protocol. However, the CNR at ASiR of 40% with kVa fails to meet the current ACR CNR passing requirement of 1.0.

  17. Departmental Personnel Security- Clearance Automation

    Broader source: Energy.gov [DOE]

    The primary objective of the DOE Integrated Security System (eDISS+) Initiative is to support the integration of multiple DOE security systems and databases. This integrated environment provides...

  18. Automated internal pipe cutting device

    DOE Patents [OSTI]

    Godlewski, William J.; Haffke, Gary S.; Purvis, Dale; Bashar, Ronald W.; Jones, Stewart D.; Moretti, Jr., Henry; Pimentel, James

    2003-01-21

    The invention is a remotely controlled internal pipe cutting device primarily used for cutting pipes where the outside of the pipe is inaccessible at the line where the cut is to be made. The device includes an axial ram within a rotational cylinder which is enclosed in a housing. The housing is adapted for attachment to an open end of the pipe and for supporting the ram and cylinder in cantilever fashion within the pipe. A radially movable cutter, preferably a plasma arc torch, is attached to the distal end of the ram. A drive mechanism, containing motors and mechanical hardware for operating the ram and cylinder, is attached to the proximal end of the housing. The ram and cylinder provide for moving the cutter axially and circumferentially, and a cable assembly attached to a remote motor provides for the movement of the cutter radially within the pipe. The control system can be adjusted and operated remotely to control the position and movement of the cutter to obtain the desired cut. The control system can also provide automatic standoff control for a plasma arc torch.

  19. Automated solar collector installation design

    DOE Patents [OSTI]

    Wayne, Gary; Frumkin, Alexander; Zaydman, Michael; Lehman, Scott; Brenner, Jules

    2014-08-26

    Embodiments may include systems and methods to create and edit a representation of a worksite, to create various data objects, and to classify such objects as various types of pre-defined "features" with attendant properties and layout constraints. As part of or in addition to classification, an embodiment may include systems and methods to create, associate, and edit intrinsic and extrinsic properties of these objects. A design engine may apply design rules to the features described above to generate one or more solar collector installation design alternatives, including generation of on-screen and/or paper representations of the physical layout or arrangement of the one or more design alternatives.
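
    The flow described above (classify worksite objects as typed features, then let a design engine apply rules to produce layout alternatives) can be illustrated with the following sketch; all class names, properties, and rules are invented for illustration and are not the patented implementation.

    # Illustrative sketch: typed worksite "features" with properties, pruned by design rules.
    from dataclasses import dataclass, field

    @dataclass
    class Feature:
        kind: str                      # e.g. "roof_plane", "obstruction"
        properties: dict = field(default_factory=dict)

    def rule_avoid_obstructions(features, layout):
        """Drop candidate panel positions that overlap any obstruction footprint."""
        blocked = {pos for f in features if f.kind == "obstruction"
                   for pos in f.properties.get("footprint", [])}
        return [pos for pos in layout if pos not in blocked]

    def rule_respect_setback(features, layout):
        """Keep only positions at least `setback` cells away from the roof edge."""
        for f in features:
            if f.kind == "roof_plane":
                rows, cols = f.properties["rows"], f.properties["cols"]
                margin = f.properties.get("setback", 1)
                return [(r, c) for r, c in layout
                        if margin <= r < rows - margin and margin <= c < cols - margin]
        return layout

    def design_engine(features, rules):
        """Start from every grid cell of the roof plane and let each rule prune the layout."""
        roof = next(f for f in features if f.kind == "roof_plane")
        layout = [(r, c) for r in range(roof.properties["rows"])
                  for c in range(roof.properties["cols"])]
        for rule in rules:
            layout = rule(features, layout)
        return layout

    features = [Feature("roof_plane", {"rows": 6, "cols": 8, "setback": 1}),
                Feature("obstruction", {"footprint": [(2, 3), (2, 4)]})]
    print(design_engine(features, [rule_respect_setback, rule_avoid_obstructions]))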

  20. Techniques for Automated Performance Analysis

    SciTech Connect (OSTI)

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
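
    As a concrete, if simplified, example of the kind of predictive model the abstract alludes to, one can regress measured runtimes on configuration variables; the sketch below uses synthetic data and is not the paper's formulation.

    # Minimal example: least-squares fit of runtime vs. configuration variables, then a
    # prediction for an untried configuration. All numbers are synthetic.
    import numpy as np

    # columns: [1 (intercept), OpenMP threads, MPI ranks, optimization level]
    configs = np.array([
        [1,  4, 16, 2],
        [1,  8, 16, 2],
        [1,  8, 32, 3],
        [1, 16, 32, 3],
        [1, 16, 64, 2],
    ], dtype=float)
    runtimes = np.array([812.0, 610.0, 455.0, 390.0, 371.0])   # seconds (synthetic)

    coeffs, *_ = np.linalg.lstsq(configs, runtimes, rcond=None)
    prediction = np.array([1, 32, 64, 3], dtype=float) @ coeffs
    print(f"predicted runtime for 32 threads / 64 ranks / -O3: {prediction:.0f} s")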

  1. Security barriers with automated reconnaissance

    DOE Patents [OSTI]

    McLaughlin, James O; Baird, Adam D; Tullis, Barclay J; Nolte, Roger Allen

    2015-04-07

    An intrusion delaying barrier includes primary and secondary physical structures and can be instrumented with multiple sensors incorporated into an electronic monitoring and alarm system. Such an instrumented intrusion delaying barrier may be used as a perimeter intrusion defense and assessment system (PIDAS). The problem of providing effective delay against breaches by intentional intruders and/or terrorists who would otherwise evade detection is solved by attaching the secondary structures to the primary structure and attaching at least some of the sensors to the secondary structures. Physically interconnecting multiple sensors of various types enables sensors on different parts of the overall structure to respond to common disturbances and thereby provide effective corroboration that a disturbance is not merely a nuisance or false alarm. Use of a machine learning network such as a neural network exploits such corroboration.
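
    The corroboration idea can be illustrated with a simple fixed rule: raise an alarm only when sensors on the same or physically connected structures report a disturbance within a short window. The sketch below is illustrative only; the sensor names, window, and rule are assumptions, and a learned network could replace the fixed rule.

    # Illustrative corroboration logic (names and thresholds are assumptions).
    from collections import namedtuple

    Event = namedtuple("Event", "sensor_id structure t")   # t in seconds

    def corroborated_alarms(events, connected, window_s=2.0, min_sensors=2):
        """connected: dict mapping a structure to the set of structures attached to it."""
        alarms = []
        events = sorted(events, key=lambda e: e.t)
        for i, e in enumerate(events):
            group = {e.sensor_id}
            for other in events[i + 1:]:
                if other.t - e.t > window_s:
                    break
                if other.structure == e.structure or other.structure in connected.get(e.structure, set()):
                    group.add(other.sensor_id)
            if len(group) >= min_sensors:
                alarms.append((e.t, sorted(group)))
        return alarms

    connected = {"fence_panel_7": {"outrigger_7"}, "outrigger_7": {"fence_panel_7"}}
    events = [Event("vib-12", "fence_panel_7", 10.0),
              Event("strain-3", "outrigger_7", 10.8),
              Event("vib-30", "fence_panel_41", 55.0)]       # lone event: treated as nuisance
    print(corroborated_alarms(events, connected))            # one corroborated alarm near t=10 s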

  2. Automated Nuclear Data Test Suite

    Energy Science and Technology Software Center (OSTI)

    2013-01-09

    Provides Python routines to create a database of test problems in a user-defined directory tree, to query the database using user-defined parameters, to generate a list of test runs, and to run them automatically with user-defined particle transport codes. Includes natural isotope abundance data and a table of benchmark effective multiplication factors for fast critical assemblies. Does not include input decks, cross-section libraries, or particle transport codes.

  3. Automated soil gas monitoring chamber

    DOE Patents [OSTI]

    Edwards, Nelson T.; Riggs, Jeffery S.

    2003-07-29

    A chamber for trapping soil gases as they evolve from the soil without disturbance to the soil and to the natural microclimate within the chamber has been invented. The chamber opens between measurements and therefore does not alter the metabolic processes that influence soil gas efflux rates. A multiple chamber system provides for repetitive multi-point sampling, undisturbed metabolic soil processes between sampling, and an essentially airtight sampling chamber operating at ambient pressure.

  4. Automated Export Control.PDF

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ... Page 5 Details of Findings PINS Features PINS is a web-based computer system utilized by ... The font and color on the screens change as the end of the 30-day timeframe nears and ...

  5. Automated manual transmission clutch controller

    DOE Patents [OSTI]

    Lawrie, Robert E.; Reed, Jr., Richard G.; Rausen, David J.

    1999-11-30

    A powertrain system for a hybrid vehicle. The hybrid vehicle includes a heat engine, such as a diesel engine, and an electric machine, which operates as both an electric motor and an alternator, to power the vehicle. The hybrid vehicle also includes a manual-style transmission configured to operate as an automatic transmission from the perspective of the driver. The engine and the electric machine drive an input shaft which in turn drives an output shaft of the transmission. In addition to driving the transmission, the electric machine regulates the speed of the input shaft in order to synchronize the input shaft during either an upshift or downshift of the transmission by either decreasing or increasing the speed of the input shaft. When decreasing the speed of the input shaft, the electric motor functions as an alternator to produce electrical energy which may be stored by a storage device. Operation of the transmission is controlled by a transmission controller which receives input signals and generates output signals to control shift and clutch motors to effect smooth launches, upshifts, and downshifts of the transmission, so that the transmission functions substantially as an automatic transmission from the perspective of the driver, while internally substantially functioning as a manual transmission.
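
    The synchronization step described above (the electric machine drives the input shaft toward the output-shaft speed times the target gear ratio) can be illustrated with a toy proportional loop; the gains, ratios, and lumped shaft response below are assumptions, not the patented controller.

    # Toy illustration of shift synchronization with a proportional speed command.
    def synchronize(input_rpm, output_rpm, target_ratio, kp=0.5, dt=0.01, tol_rpm=20.0):
        """Return the input-shaft speed trajectory until it matches the target within tol_rpm."""
        target_rpm = output_rpm * target_ratio
        trajectory = [input_rpm]
        while abs(target_rpm - input_rpm) > tol_rpm:
            # negative command: the machine acts as an alternator and slows the shaft
            command = kp * (target_rpm - input_rpm)
            input_rpm += command * dt * 60.0          # crude lumped response, illustrative only
            trajectory.append(input_rpm)
        return trajectory

    # Upshift example: output shaft at 1200 rpm, shifting from a 2.5:1 to a 1.8:1 gear.
    path = synchronize(input_rpm=3000.0, output_rpm=1200.0, target_ratio=1.8)
    print(f"synchronized after {len(path)} steps at about {path[-1]:.0f} rpm")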

  6. Sandia Analyst Aide V0.5

    Energy Science and Technology Software Center (OSTI)

    2005-05-06

    Allows people to analyze the contents of a text document set, enabling the user to: 1. generate automated reports providing information about various aspects of the document set; 2. produce models which can be uploaded into Sandia's cognitive model framework; 3. produce visualizations of various aspects of the document set, including relationships among key terms; 4. compute various topical contexts discovered within the document set; 5. start a web spider which can scour the web for more information about the topics found in the document set.

  7. Justification Memo DOE - DOE Directives, Delegations, and Requirements

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Diane Johnson Upload File Upload the file here JM-NonNNSA -8.doc - 37 KB Short Name justification_memo_doe-1

  8. Justification Memo NNSA - DOE Directives, Delegations, and Requirements

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Diane Johnson Upload File Upload the file here JM-NNSA -8 -4.doc - 39 KB Short Name justification_memo_nnsa-1

  9. MS/MS Automated Selected Ion Chromatograms

    Energy Science and Technology Software Center (OSTI)

    2005-12-12

    This program can be used to read an LC-MS/MS data file from either a Finnigan ion trap mass spectrometer (.Raw file) or an Agilent ion trap mass spectrometer (.MGF and .CDF files) and create a selected ion chromatogram (SIC) for each of the parent ion masses chosen for fragmentation. The largest peak in each SIC is also identified, with reported statistics including peak elution time, height, area, and signal-to-noise ratio. It creates several output files, including a base peak intensity (BPI) chromatogram for the survey scan, a BPI for the fragmentation scans, an XML file containing the SIC data for each parent ion, and a "flat file" (ready for import into a database) containing summaries of the SIC data statistics.
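
    The core computation (building a selected ion chromatogram for a parent m/z and characterizing its largest peak) can be sketched as below; the m/z tolerance, the 5% peak bounds, and the noise estimate are assumptions, not the released tool's behavior.

    # Minimal sketch: SIC for one parent m/z from survey scans, plus largest-peak statistics.
    import numpy as np

    def selected_ion_chromatogram(scan_times, scan_mzs, scan_intensities, parent_mz, tol=0.5):
        """Sum intensity within +/- tol of parent_mz for each survey scan."""
        sic = np.array([
            intensities[np.abs(mzs - parent_mz) <= tol].sum()
            for mzs, intensities in zip(scan_mzs, scan_intensities)
        ])
        return np.asarray(scan_times, dtype=float), sic

    def largest_peak_stats(times, sic):
        apex = int(np.argmax(sic))
        height = sic[apex]
        # walk outward from the apex to where the signal falls to 5% of the peak height
        lo = apex
        while lo > 0 and sic[lo] > 0.05 * height:
            lo -= 1
        hi = apex
        while hi < len(sic) - 1 and sic[hi] > 0.05 * height:
            hi += 1
        area = np.trapz(sic[lo:hi + 1], times[lo:hi + 1])
        baseline = np.concatenate([sic[:lo], sic[hi + 1:]])
        noise = float(np.median(baseline)) if baseline.size else 1.0
        noise = noise if noise > 0 else 1.0
        return {"elution_time": times[apex], "height": height,
                "area": area, "signal_to_noise": height / noise}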

  10. Manz Automation AG | Open Energy Information

    Open Energy Info (EERE)

    Germany Zip: D-72768 Sector: Solar Product: German manufacturer of solar and LCD capital equipment. Coordinates: 48.49159, 9.21487

  11. Automated pupil remapping with binary optics

    DOE Patents [OSTI]

    Neal, Daniel R.; Mansell, Justin

    1999-01-01

    Methods and apparatuses for pupil remapping employing non-standard lenslet shapes in arrays; divergence of lenslet focal spots from on-axis arrangements; use of lenslet arrays to resize two-dimensional inputs to the array; and use of lenslet arrays to map an aperture shape to a different detector shape. Applications include wavefront sensing, astronomical applications, optical interconnects, keylocks, and other binary optics and diffractive optics applications.

  12. Automated pupil remapping with binary optics

    DOE Patents [OSTI]

    Neal, D.R.; Mansell, J.

    1999-01-26

    Methods and apparatuses are disclosed for pupil remapping employing non-standard lenslet shapes in arrays; divergence of lenslet focal spots from on-axis arrangements; use of lenslet arrays to resize two-dimensional inputs to the array; and use of lenslet arrays to map an aperture shape to a different detector shape. Applications include wavefront sensing, astronomical applications, optical interconnects, keylocks, and other binary optics and diffractive optics applications. 24 figs.

  13. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Weight and Leak Check System (WALS) The Nuclear Weapons Complex stores radioactive nuclear ... A redundant computer to independently monitor forces on the pit during robot motion. ...

  14. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    pressure and shear loads on human soft tissues (e.g., skin) in a prosthetic device, exoskeleton, or shoe. Need While several commercially available tactile sensors exist,...

  15. Automated fiber pigtailing machine (Patent) | DOEPatents

    Office of Scientific and Technical Information (OSTI)

    to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices ... optical; fibers; optoelectronic; devices; laser; diodes; photodiodes; waveguide; devices; ...

  16. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Volant Land multi-modal vehicle Sea swimming multi-modal vehicle Air Volant multi-modal vehicle Imagine a mission where you have to covertly fly into an area, traverse through...

  17. Automated system for handling tritiated mixed waste

    SciTech Connect (OSTI)

    Dennison, D.K.; Merrill, R.D.; Reitz, T.C.

    1995-03-01

    Lawrence Livermore National Laboratory (LLNL) is developing a semi-automated system for handling, characterizing, processing, sorting, and repackaging hazardous wastes containing tritium. The system combines an IBM-developed gantry robot with a special glove box enclosure designed to protect operators and minimize the potential release of tritium to the atmosphere. All hazardous waste handling and processing will be performed remotely, using the robot in a teleoperational mode for one-of-a-kind functions and in an autonomous mode for repetitive operations. Initially, this system will be used in conjunction with a portable gas system designed to capture any gaseous-phase tritium released into the glove box. This paper presents the objectives of this development program, provides background related to LLNL's robotics and waste handling program, describes the major system components, outlines system operation, and discusses current status and plans.

  18. Automated collection and processing of environmental samples

    DOE Patents [OSTI]

    Troyer, Gary L.; McNeece, Susan G.; Brayton, Darryl D.; Panesar, Amardip K.

    1997-01-01

    For monitoring an environmental parameter such as the level of nuclear radiation, at distributed sites, bar coded sample collectors are deployed and their codes are read using a portable data entry unit that also records the time of deployment. The time and collector identity are cross referenced in memory in the portable unit. Similarly, when later recovering the collector for testing, the code is again read and the time of collection is stored as indexed to the sample collector, or to a further bar code, for example as provided on a container for the sample. The identity of the operator can also be encoded and stored. After deploying and/or recovering the sample collectors, the data is transmitted to a base processor. The samples are tested, preferably using a test unit coupled to the base processor, and again the time is recorded. The base processor computes the level of radiation at the site during exposure of the sample collector, using the detected radiation level of the sample, the delay between recovery and testing, the duration of exposure and the half life of the isotopes collected. In one embodiment, an identity code and a site code are optically read by an image grabber coupled to the portable data entry unit.
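
    The site-level computation described above can be written out explicitly: correct the counted activity back to the recovery time using the isotope half-life, then divide by the decay-weighted buildup over the deployment interval. The sketch below assumes a constant ambient level and an illustrative calibration factor; it is not the patent's exact algorithm.

    # Worked illustration of the exposure-level computation (assumptions noted inline).
    import math

    def site_level(measured_activity_bq, delay_h, exposure_h, half_life_h, calib=1.0):
        """Average ambient deposition rate during exposure, per unit of the assumed calibration.

        measured_activity_bq : activity counted at the lab
        delay_h              : hours between collector recovery and counting
        exposure_h           : hours the collector was deployed
        half_life_h          : half-life of the collected isotope, in hours
        calib                : collector efficiency (activity per unit deposition rate), assumed
        """
        lam = math.log(2.0) / half_life_h
        activity_at_recovery = measured_activity_bq * math.exp(lam * delay_h)   # undo decay in transit
        # for a constant ambient level R, activity builds up as (R * calib / lam) * (1 - exp(-lam * T))
        buildup = (1.0 - math.exp(-lam * exposure_h)) / lam
        return activity_at_recovery / (calib * buildup)

    # Example: 120 Bq counted 36 h after recovery, 14-day deployment, 8.02-day half-life (I-131).
    print(f"{site_level(120.0, delay_h=36.0, exposure_h=14*24, half_life_h=8.02*24):.3f} Bq/h")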

  19. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    modes, dramatically improving its overall safety and enabling broader application of the technology. Our command and control system is the only system of its kind that has...

  20. Sandia National Laboratories: Research: High Consequence, Automation...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    operating toward a common goal. Our control technology is advancing the state-of-the-art in unmanned systems and robotics technology and is enabling ground-breaking...