wiki GsT Equivalent URI: cleanenergysolutions.org/content/geospatial-toolkit-gst Language: English Policies: Deployment Programs Deployment Programs: Technical Assistance...
The Geospatial Toolkit is an NREL-developed map-based software application that integrates resource data and other geographic information systems (GIS) data for integrated resource assessment. The non-resource, country-specific data for each toolkit comes from a variety of agencies within each country as well as from global datasets. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. The revised version of the Geospatial Toolkit has been released for all original toolkit countries/regions and each software package is made available on NREL's website,
NREL developed the Geospatial Toolkit (GsT), a map-based software application that integrates resource data and geographic information systems (GIS) for integrated resource assessment. A variety of agencies within countries, along with global datasets, provided country-specific data. Originally developed in 2005, the Geospatial Toolkit was completely redesigned and re-released in November 2010 to provide a more modern, easier-to-use interface with considerably faster analytical querying capabilities. Toolkits are available for 21 countries and each one can be downloaded separately. The source code for the toolkit is also available. [Taken and edited from http://www.nrel.gov/international/geospatial_toolkits.html
Nicaragua from NREL (Abstract): Geographic Information Systems (GIS) data intended for use in the Geospatial toolkit or with any GIS software. (Purpose): The Solar and Wind Energy...
Cuba from NREL (Abstract): Geographic Information Systems (GIS) data intended for use in the Geospatial toolkit or with any GIS software. (Purpose): The Solar and Wind Energy...
show good potential for renewable energy projects. The toolkit displays renewable energy data along with information about the geography, location of population centers,...
and easy to use geographic toolkit that allows non-GIS users to relate the renewable energy resource (solar and wind) data to other geographic data, such as land use, protected...
Company Organization: National Renewable Energy Laboratory Sector: Energy Focus Area: Solar, Wind Phase: Determine Baseline Topics: Resource assessment Resource Type: Guide...
Ghana, Guatemala, Honduras, India, Nepal, Nicaragua, Oaxaca, Pakistan, Sri Lanka, Turkey Cost: Free Southern Asia, South America, Eastern Asia,...
The COR Toolkit cited in the attachments to Policy Flash 2012-25 and posted to/linked from various DOE Internet pages has been withdrawn until further notice.
Potential Toolkit Building Energy Assessment Toolkit Power System Screening and Design Toolkit Land Use Assessment Toolkit Bioenergy Assessment Toolkit Transportation...
The LOCAL Toolkit contains tools and libraries developed under the LLNL LOCAL LDRD project for managing and processing large unstructured data sets, primarily from parallel numerical simulations, such as triangular, tetrahedral, and hexahedral meshes, point sets, and graphs. The tools have three main functionalities: cache-coherent, linear ordering of multidimensional data; lossy and lossless data compression optimized for different data types; and an out-of-core streaming I/O library with simple processing modules for unstructured data.
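The "cache-coherent, linear ordering of multidimensional data" mentioned above is commonly achieved with a space-filling curve such as the Morton (Z-order) curve, which maps nearby points in space to nearby positions in a linear stream. The sketch below illustrates that general technique only; it is not the LOCAL toolkit's actual code.

```python
def morton_encode_2d(x: int, y: int) -> int:
    """Interleave the bits of x and y to produce a Z-order (Morton) index.

    Points that are close together in 2-D tend to land close together
    along the curve, which improves cache behavior when streaming
    multidimensional data out of core.
    """
    code = 0
    for bit in range(16):  # supports 16-bit coordinates
        code |= ((x >> bit) & 1) << (2 * bit)
        code |= ((y >> bit) & 1) << (2 * bit + 1)
    return code

# Sort a small point set into cache-friendly Z-order.
points = [(3, 1), (0, 0), (1, 1), (2, 2)]
points.sort(key=lambda p: morton_encode_2d(*p))
```

The same bit-interleaving idea extends to 3-D mesh vertices by interleaving three coordinates instead of two.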
The Geospatial Science Steering Committee (GSSC) functions in an advisory role to the DOE national laboratories, major facilities, and to headquarters and field office elements to actively promote...
Practices and Application in the U.S. Geospatial Toolkit (GsT) Webinar Introduction to Hydrogen for Code Officials LEDS GP analysis impacts DIA Webinar on Development Impact...
Knowledge Portals LEDS Benefits Back to top LEDS-Related Toolkits DIA Toolkit This Development Impacts Assessment Toolkit helps country, regional, and local policymakers...
Training Toolkit BETTER BUILDINGS RESIDENTIAL NETWORK Learn more at betterbuildings.energy.gov/bbrn The Better Buildings Residential Network Training Toolkit can be used by residential energy efficiency programs interested in realizing the value of providing training opportunities for contractors, staff, and volunteers. For example, according to a comprehensive evaluation of more than 140 energy efficiency programs across the country that participated in a $500 million grant program,
Quarterly Cybersecurity Awareness Campaigns and Toolkits The OCIO coordinates quarterly cybersecurity awareness campaigns ...
Larson, J. W.; Jacob, R. L.; Foster, I.; Guo, J.
The advent of coupled earth system models has raised an important question in parallel computing: What is the most effective method for coupling many parallel models to form a high-performance coupled modeling system? We present our solution to this problem: the Model Coupling Toolkit (MCT). We explain how our effort to construct the Next-Generation Coupler for the NCAR Community Climate System Model motivated us to create this toolkit. We describe in detail the conceptual design of the MCT and explain its usage in constructing parallel coupled models. We present preliminary performance results for the toolkit's parallel data transfer facilities. Finally, we outline an agenda for future development of the MCT.
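The coupling pattern the MCT abstract describes can be illustrated at its simplest: independent component models exchange fields through a coupler's data-transfer step each cycle. All class names, field names, and the toy physics below are invented for illustration; MCT's real interface is a Fortran library and far richer.

```python
# Toy sketch of the component-coupling pattern (not the MCT API).

class Component:
    def __init__(self, name):
        self.name = name
        self.exports = {}   # fields this model produces
        self.imports = {}   # fields handed to it by the coupler

class Atmosphere(Component):
    def step(self):
        sst = self.imports.get("sea_surface_temp", 15.0)
        self.exports["wind_stress"] = 0.1 * sst        # toy physics

class Ocean(Component):
    def step(self):
        stress = self.imports.get("wind_stress", 0.0)
        self.exports["sea_surface_temp"] = 15.0 + stress  # toy physics

def couple(src, dst, field):
    """The coupler's data-transfer step: move one field between models."""
    dst.imports[field] = src.exports[field]

atm, ocn = Atmosphere("atm"), Ocean("ocn")
for _ in range(3):  # three coupling cycles
    atm.step()
    couple(atm, ocn, "wind_stress")
    ocn.step()
    couple(ocn, atm, "sea_surface_temp")
```

In a real coupled model the transfer step also handles parallel redistribution and regridding between the models' different decompositions, which is exactly the machinery MCT provides.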
Moore, Branden J.; Voskuilen, Gwendolyn Renae; Rodrigues, Arun F.; Hammond, Simon David; Hemmert, Karl Scott
This is a presentation outlining a lunch and learn lecture for the Structural Simulation Toolkit, supported by Sandia National Laboratories.
The Water Security Toolkit (WST) provides software for modeling and analyzing water distribution systems to minimize the potential impact of contamination incidents. WST wraps capabilities for contaminant transport, impact assessment, and sensor network design with response action plans, including source identification, rerouting, and decontamination, to provide a range of water security planning and real-time applications.
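Sensor network design of the kind WST automates can be sketched as a greedy covering problem: repeatedly place a sensor at the location that detects the most not-yet-covered contamination scenarios. This is only an illustration of the problem shape; WST itself formulates sensor placement as a formal optimization, and all names below are invented.

```python
def greedy_placement(scenarios_detected_by, budget):
    """Pick up to `budget` sensor locations greedily.

    scenarios_detected_by: {node: set of contamination-scenario ids
    that a sensor at that node would detect}.
    """
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(scenarios_detected_by,
                   key=lambda n: len(scenarios_detected_by[n] - covered))
        if not scenarios_detected_by[best] - covered:
            break  # no remaining location adds coverage
        chosen.append(best)
        covered |= scenarios_detected_by[best]
    return chosen, covered

# Hypothetical junctions in a small distribution network.
coverage = {
    "junction_A": {1, 2, 3},
    "junction_B": {3, 4},
    "junction_C": {5},
}
sensors, detected = greedy_placement(coverage, budget=2)
```

A real tool would derive the coverage sets from contaminant-transport simulation and weight scenarios by estimated health or economic impact rather than counting them equally.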
Voluntary Initiative: Partnerships Toolkit BETTER BUILDINGS RESIDENTIAL NETWORK Residential energy efficiency programs are delivered by many different types of organizations and their partners, including utilities, state and local governments, non-profit organizations, and for-profit companies, but no matter which sector delivers the program, the need to work in partnership with different entities can make or break program success. Definition Partnerships are relationships between two or more
Bioenergy Toolkit Stage 3 LEDS Home Introduction to Framework Assess current country plans, policies, practices, and capacities Develop BAU Stage 4:...
To aid in optimal design of microgrids, and help avoid potential problems with maintenance, safety, power quality, and stability, Sandia has developed the Microgrid Design Toolkit ...
Policies: deployment programs, Resource assessment, Pathways analysis, Background analysis Resource Type: Dataset References: REEEP Toolkit 1
Small Forest Enterprises: A Facilitator's Toolkit Tool Summary LAUNCH TOOL Name: Supporting Small Forest Enterprises: A Facilitator's Toolkit Agency...
Opportunities 3b.1. Assess technical potential for sector technologies Renewable Energy Technical Potential Toolkit Building Energy Assessment Toolkit Power System Screening...
Social Media Toolkit This Better Buildings Residential Network toolkit can be used to help residential energy efficiency programs learn to engage potential customers through social ...
Integration National Dataset Toolkit NREL is working on a Solar Integration National Dataset (SIND) Toolkit to enable researchers to perform U.S. regional solar generation ...
This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open source, in the hope of improving their own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation: the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (and will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): The COM package provides encapsulation of user applications and their data. COM also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): The SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.
Koch, Daniel B.
As with many professions, safety planners and first responders tend to be specialists in certain areas. To be truly useful, tools should be tailored to meet their specific needs. Thus, general software suites aimed at the professional geographic information system (GIS) community might not be the best solution for a first responder with little training in GIS terminology and techniques. On the other hand, commonly used web-based map viewers may not have the capability to be customized for the planning, response, and recovery (PR&R) mission. Data formats should be open and foster easy information flow among local, state, and federal partners. Tools should be free or low-cost to address real-world budget constraints at the local level. They also need to work both with and without a network connection to be robust. The Incident Management Preparedness and Coordination Toolkit (IMPACT) can satisfy many of these needs while working in harmony with established systems at the local, state, and federal levels. The IMPACT software framework, termed the Geospatial Integrated Problem Solving Environment (GIPSE), organizes tasks, tools, and resources for the end user. It uses the concept of software wizards to both customize and extend its functionality. On the Tasks panel are a number of buttons used to initiate various operations. Similar to macros, these task buttons launch scripts that utilize the full functionality of the underlying foundational components such as the SQL spatial database and ORNL-developed map editor. The user is presented with a series of instruction pages which are implemented with HTML for interactivity. On each page are links which initiate specific actions such as creating a map showing various features. Additional tasks may be quickly programmed and added to the panel. The end user can customize the graphical interface to facilitate its use during an emergency. One of the major components of IMPACT is the ORNL Geospatial Viewer (OGV). It is used to
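The Tasks-panel design described above, buttons that launch scripts against shared foundational components, is essentially a task registry. The sketch below shows that general pattern; the task name, context fields, and behavior are invented and are not IMPACT's actual code.

```python
# Minimal task-registry pattern: named tasks bound to callables.
TASKS = {}

def task(name):
    """Decorator that registers a callable under a button name."""
    def register(fn):
        TASKS[name] = fn
        return fn
    return register

@task("Create Feature Map")
def create_feature_map(context):
    # A real task would drive the map editor and spatial database here.
    return f"map of {context['region']} with {len(context['features'])} features"

def launch(name, context):
    """What a Tasks-panel button press would invoke."""
    return TASKS[name](context)

result = launch("Create Feature Map",
                {"region": "Oak Ridge", "features": ["shelters", "roads"]})
```

Because tasks are looked up by name at launch time, new ones can be registered without modifying the panel code, which matches the abstract's point that additional tasks "may be quickly programmed and added."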
Training Toolkit Better Buildings Training Toolkit The Better Buildings Residential Network Training Toolkit can be used by residential energy efficiency programs interested in realizing the value of providing training opportunities for contractors, staff, and volunteers. Training Toolkit (666.76 KB) More Documents & Publications Better Buildings Network View | March 2015 Better Buildings Residential Network Membership Form Better Buildings Network View | November 2015
MESQUITE is a linkable software library to be used by simulation and mesh generation tools to improve the quality of meshes. Mesh quality is improved by node movement and/or local topological modifications. Various aspects of mesh quality, such as smoothness, element shape, size, and orientation, are controlled by choosing the appropriate mesh quality metric, objective function template, and numerical optimization solver. MESQUITE uses the TSTT mesh interface specification to provide an interoperable toolkit that can be used by applications which adopt the standard. A flexible code design makes it easy for meshing researchers to add additional mesh quality metrics, templates, and solvers to develop new quality improvement algorithms by making use of the MESQUITE infrastructure.
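Node-movement smoothing, the first mechanism the MESQUITE abstract mentions, can be illustrated with plain Laplacian smoothing: each interior node moves to the average of its neighbors while boundary nodes stay fixed. MESQUITE's real solvers optimize an explicit quality metric rather than averaging; this sketch shows only the simplest instance of the idea, with invented node names.

```python
def laplacian_smooth(coords, neighbors, fixed, iterations=10):
    """coords: {node: (x, y)}; neighbors: {node: [node, ...]};
    fixed: boundary nodes that must not move."""
    for _ in range(iterations):
        new = dict(coords)
        for node, nbrs in neighbors.items():
            if node in fixed:
                continue
            xs = [coords[n][0] for n in nbrs]
            ys = [coords[n][1] for n in nbrs]
            new[node] = (sum(xs) / len(xs), sum(ys) / len(ys))
        coords = new
    return coords

# A badly placed interior node "m" inside a unit square a-b-c-d.
coords = {"a": (0, 0), "b": (1, 0), "c": (1, 1), "d": (0, 1), "m": (0.9, 0.9)}
smoothed = laplacian_smooth(
    coords, {"m": ["a", "b", "c", "d"]}, fixed={"a", "b", "c", "d"})
```

Metric-based smoothers improve on this by refusing moves that would worsen element shape, which plain Laplacian smoothing cannot guarantee on concave domains.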
The TAO project focuses on the development of software for large-scale optimization problems. TAO uses an object-oriented design to create a flexible toolkit with a strong emphasis on the reuse of external tools where appropriate. Our design enables bi-directional connection to lower-level linear algebra support (for example, parallel sparse matrix data structures) as well as higher-level application frameworks. The Toolkit for Advanced Optimization (TAO) is aimed at the solution of large-scale optimization problems on high-performance architectures. Our main goals are portability, performance, scalable parallelism, and an interface independent of the architecture. TAO is suitable for both single-processor and massively parallel architectures. The current version of TAO has algorithms for unconstrained and bound-constrained optimization.
Hydropower RAPID Toolkit Providing project permitting process information for hydropower developers. Navigating the complex system of federal and state regulations to secure project approvals can be one of the biggest hurdles hydropower developers face. The U.S. Department of Energy (DOE) Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit offers a solution. The Hydropower RAPID Toolkit makes permitting information easily
Geospatial Science Program June 21, 2011 - 3:50pm The overarching mission of the Department of Energy (DOE) is to discover solutions to power and secure America's future. DOE's Geospatial Science Program was established to optimize geospatial investments across our complex and to enable prudent stewardship of the resources provided by the American taxpayer. The term 'geospatial science' encompasses both the concepts of geographic information science and
LeAnn Oliver, Associate Chief Information Officer for IT Policy and Governance, US Department of Energy, 202-586-0166, Geospatial@hq.doe.gov
Toolkit Overview SEP Toolkit Overview Overview.pdf (124.2 KB) More Documents & Publications Industrial SEP Ratepayer-funded Accelerator Factsheet SEP Overview Slides How Industrial Energy Efficiency Can Support State Climate and Energy Planning
The Regulatory and Permitting Information Desktop (RAPID) Toolkit offers one location for agencies, developers, and industry stakeholders to work together on state and federal hydropower regulatory processes by using a wiki environment to share permitting guidance, regulations, contacts, and other relevant information.
Community solar gardens can be an excellent opportunity for cities, counties, and other local governments to get involved in solar energy and engage community members. This toolkit has been created by Clean Energy Resource Teams to help consumers learn more about community solar.
ParGrid is a 'wrapper' that integrates a coupled power grid simulation toolkit, consisting of a library to manage the synchronization and communication of independent simulations. The included library code in ParGrid, named FSKIT, is intended to support the coupling of multiple continuous and discrete-event parallel simulations. The code is designed using modern object-oriented C++ methods utilizing C++11 and current Boost libraries to ensure compatibility with multiple operating systems and environments.
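The synchronization a co-simulation library like FSKIT must manage can be sketched as conservative lockstep time-stepping: no simulator advances past the next global synchronization point before coupling data is exchanged. The sketch below is a generic illustration of that scheme in Python, not FSKIT's C++ implementation, and the simulator names are invented.

```python
class Sim:
    """A stand-in for one independent simulator with its own time step."""
    def __init__(self, name, dt):
        self.name, self.dt, self.time = name, dt, 0.0

    def advance_to(self, t_sync):
        # Internal integration (continuous) or event processing (discrete)
        # proceeds only up to the synchronization barrier.
        while self.time + self.dt <= t_sync + 1e-9:
            self.time += self.dt

def run_coupled(sims, t_end, dt_sync):
    """Advance all simulators window-by-window to a shared barrier."""
    t = 0.0
    while t < t_end - 1e-9:
        t += dt_sync
        for s in sims:
            s.advance_to(t)   # each advances independently to the barrier
        # ... coupling data would be exchanged here before the next window ...
    return [s.time for s in sims]

times = run_coupled([Sim("grid", 0.01), Sim("comms", 0.025)],
                    t_end=1.0, dt_sync=0.1)
```

Choosing `dt_sync` trades accuracy of the coupling against synchronization overhead, which is the central tuning knob in this class of co-simulation.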
Yue, Peng; Gong, Jianya; Di, Liping; He, Lianlian; Wei, Yaxing
Abstract A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
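The semantics-enhanced discovery described above can be boiled down to one mechanism: a query matches a record not only on its literal tags but on concepts inferred from an ontology. The tiny subsumption hierarchy, record ids, and tags below are invented for illustration; real CSW/ebRIM catalogues are far more elaborate.

```python
# Each concept maps to its broader concepts (a toy subsumption hierarchy).
ONTOLOGY = {
    "landsat_scene": {"satellite_imagery"},
    "satellite_imagery": {"geospatial_data"},
    "dem": {"elevation", "geospatial_data"},
}

def expand(concepts):
    """Transitively add broader concepts, so a query for a general term
    also matches records tagged with narrower ones."""
    result = set(concepts)
    frontier = list(concepts)
    while frontier:
        for broader in ONTOLOGY.get(frontier.pop(), ()):
            if broader not in result:
                result.add(broader)
                frontier.append(broader)
    return result

CATALOGUE = [
    {"id": "rec1", "tags": {"landsat_scene"}},
    {"id": "rec2", "tags": {"dem"}},
    {"id": "rec3", "tags": {"population"}},
]

def semantic_search(query_concept):
    return [r["id"] for r in CATALOGUE if query_concept in expand(r["tags"])]

hits = semantic_search("geospatial_data")
```

A keyword search for "geospatial_data" would return nothing here, since no record carries that literal tag; the ontology expansion is what surfaces the Landsat scene and the DEM.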
To issue the Procurement Toolkit CD to all Grantees, to share with their local agencies, for use in the Weatherization Assistance Program.
Outreach Presentation Template Workplace Charging Toolkit: Workshop Outreach Presentation Template Educate workshop attendees and employers about the benefits of workplace charging ...
"This Toolkit has been developed jointly by PricewaterhouseCoopers (PwC) and the World Business Council for Sustainable Development (WBCSD). It is a globally applicable resource...
meant to support the creation and implementation of a Low Emission Development Strategy. Pages in category "LEDS Toolkit" The following 14 pages are in this category, out of...
Instruction Letter Template Workplace Charging Toolkit: Workshop Speaker Instruction Letter Template Inform speakers participating in the employer experience panel about their role ...
Better Buildings Residential Network Peer Exchange Call Series: Voluntary Initiative on Incentives: Toolkit Training Webinar, Call Slides and Discussion Summary, March 26, 2015.
Agency/Company Organization: Nick Langle Complexity/Ease of Use: Not Available Cost: Free Transport Toolkit Region(s): Asia, Europe, Africa & Middle East, Australia & North...
higher-than-expected downtimes and maintenance costs all undermine project profitability. ... The toolkit is being implemented within the Weather Research and Forecasting (WRF) model, ...
Invitation Template Workplace Charging Toolkit: Workshop Invitation Template Engage possible workplace charging event attendees with this template invitation. File General Workshop ...
Transportation Toolkit Home Tools Training Request Assistance Tools for Low Emission Development Strategies in Transportation...
A targeted proteomics toolkit for high-throughput absolute quantification of Escherichia coli proteins
Cyber Security Audit and Attack Detection Toolkit: National SCADA Test Bed May 2008 This project of ...
Transport Bikes in Spain Toolkit for Low Emissions Development Strategies in Transport Red buses Toolkit for Low Emissions Development Strategies in Transport A Rationale and...
Galvin Electricity Initiative Policy Toolkit Tool Summary LAUNCH TOOL Name: Galvin Electricity Initiative Policy Toolkit Agency/Company Organization:...
Toolkit Website Tool Summary LAUNCH TOOL Name: Renewable Energy and Energy Efficiency Toolkit Website Focus Area: Renewable Energy Topics: Policy...
Contact Us < LEDSGP | Transportation Toolkit Home Tools Training Request Assistance Contacts for the LEDS GP...
Project (CEnergy) Toolkit Tool Summary LAUNCH TOOL Name: Climate Change and Clean Energy Project (CEnergy) Toolkit Agency/Company Organization:...
Geospatial Technology Summit Geospatial Technologies from the Ground Up-The State Perspective_0.pdf (2.8 MB) Real Time Monitoring of Energy Infrastructure Status_0.pdf (3.71 MB) Geospatial Analysis and the OpenCarto Framework-Spatial Analysis, Data Provision, and Decision Support_0.pdf (3.32 MB) Flight Planning Tool_0.pdf (2.49 MB) Overview of GIS Groups at DOE_0.pdf (701.47 KB) Developing Renewable Energy Resources and Associated Infrastructure-Location, Location,
The Python-ARM Radar Toolkit (Py-ART) is a collection of radar quality control and retrieval codes which all work on two unifying Python objects: the PyRadar and PyGrid objects. By building ingests for several popular radar formats and then abstracting the interface, Py-ART greatly simplifies data processing compared with several other available utilities. In addition, Py-ART makes use of NumPy arrays as its primary storage mechanism, enabling use of existing and extensive community software tools.
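The design described above, many format-specific ingests that all return one common object, is the classic adapter pattern. The stdlib-only sketch below illustrates that pattern with invented reader names and field layouts; Py-ART's real radar object stores NumPy masked arrays and carries much more metadata.

```python
class Radar:
    """The single unifying container every ingest returns."""
    def __init__(self, fields, metadata):
        self.fields = fields          # e.g. {"reflectivity": [...]}
        self.metadata = metadata

def read_sigmet(raw):
    # Format-specific parsing would happen here.
    return Radar({"reflectivity": raw["dbz"]}, {"source": "sigmet"})

def read_nexrad(raw):
    return Radar({"reflectivity": raw["REF"]}, {"source": "nexrad"})

READERS = {"sigmet": read_sigmet, "nexrad": read_nexrad}

def read(fmt, raw):
    """One entry point, any supported format, one kind of object out."""
    return READERS[fmt](raw)

# Downstream QC and retrieval code never needs to know the input format.
r1 = read("sigmet", {"dbz": [10, 20]})
r2 = read("nexrad", {"REF": [30, 40]})
```

Because every reader normalizes to the same object, a quality-control routine written once runs unchanged on data from any supported radar format, which is the simplification the abstract claims.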
BioenergizeME Infographic Challenge Toolkit, U.S. Department of Energy. Developing future leaders who will determine the bioenergy landscape of tomorrow. DOE/EE-1139 | published 9/22/15. TABLE OF CONTENTS: 1 INTRODUCTION: Why Bioenergy? Why Now?
Toolkit BioenergizeME Infographic Challenge Toolkit Toolkit for the BioenergizeME Infographic Challenge. bioenergizeme_toolkit.pdf (1.89 MB) More Documents & Publications Webinar: BioenergizeME Office Hours Webinar: Guide to the 2016 BioenergizeME Infographic Challenge BioenergizeME Infographic Challenge Flyer Webinar: BioenergizeME Office Hours Webinar: Biomass Basics
The U.S. Climate Resilience Toolkit provides scientific tools, information, and expertise to help professionals manage their climate-related risks and opportunities, and improve their resilience to extreme events.
Biomass and Biofuels Geospatial Decision Making System Idaho National Laboratory Contact INL About This Technology Technology Marketing Summary The INL has developed a geospatial decision-making process that assists agricultural producers in optimizing the operating conditions of combine harvesters by detecting the presence of grain and distinguishing it from residual plant material. Upon detecting grain in the process, the system sends
NYSERDA-Wind Energy Toolkit Tool Summary LAUNCH TOOL Name: NYSERDA-Wind Energy Toolkit Agency/Company Organization: New York State Energy Research and...
Wind Integration National Dataset (WIND) Toolkit Webinar Caroline Draxl and Bri-Mathias Hodge July 14, 2015 NREL/PR-5000-64691 Content * Motivation * Creation of the WIND Toolkit ...
Hosted by the National Oceanic and Atmospheric Administration (NOAA), this webinar will demonstrate the U.S. Climate Resilience Toolkit.
The SIERRA Toolkit is a collection of libraries to facilitate the development of parallel engineering analysis applications. These libraries supply basic core services that an engineering analysis application may need, such as a parallel distributed and dynamic mesh database (for unstructured meshes), mechanics algorithm support (parallel infrastructure only), interfaces to parallel solvers, parallel mesh and data I/O, and various utilities (timers, diagnostic tools, etc.). The toolkit is intended to reduce the effort required to develop an engineering analysis application by removing the need to develop core capabilities that most every application would require.
Simulation Toolkit Team June 9, 2016 Co-Optimization of Fuels and Engines (Co-Optima) - Simulation Toolkit Team This presentation does not contain any proprietary, confidential, or otherwise restricted information FT040 M. McNenly, 1 S. Som, 2 D. Carrington, 3 K. D. Edwards, 4 R. Grout, 5 J. Kodavasal, 2 G. Lacaze, 6 J. Oefelein, 6 P. Pal, 2 V. Ram, 2 N. Van Dam, 2 J. Waters, 3 and R. Whitesides 1 1. Lawrence Livermore National Laboratory 2. Argonne National Laboratory 3. Los Alamos National
Wyoming Game and Fish Department Geospatial Data Jump to: navigation, search OpenEI Reference LibraryAdd to library Map: Wyoming Game and Fish Department Geospatial DataInfo...
PETSc 2.0 is a software toolkit for portable, parallel (and serial) numerical solution of partial differential equations and minimization problems. It includes software for the solution of linear and nonlinear systems of equations. These codes are written in a data-structure-neutral manner to enable easy reuse and flexibility.
An earth system model is a computer code designed to simulate the interrelated processes that determine the earth's weather and climate, such as atmospheric circulation, atmospheric physics, atmospheric chemistry, oceanic circulation, and biosphere. I propose a toolkit that would support a modular, or object-oriented, approach to the implementation of such models.
Trahan, Michael Wayne; Foehse, Mark C.
The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.
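The Self-Organizing Map the abstract names can be shown at its smallest scale: a line of units whose weights are repeatedly nudged toward training inputs, with each input also pulling the lattice neighbors of its best-matching unit. The real text-analysis SOM operates on high-dimensional document vectors; this stdlib-only sketch uses scalar inputs purely to show the update rule, and every name in it is invented.

```python
import random

def train_som(data, n_units=4, epochs=200, lr=0.3, radius=1, seed=0):
    """Train a tiny 1-D SOM on scalar data and return its sorted weights."""
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(n_units)]
    for _ in range(epochs):
        x = rng.choice(data)
        # Best-matching unit: the unit whose weight is closest to the input.
        bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
        for i in range(n_units):
            if abs(i - bmu) <= radius:   # neighborhood on the 1-D lattice
                weights[i] += lr * (x - weights[i])
    return sorted(weights)

# Two well-separated clusters; trained units settle within the data's range.
data = [0.1, 0.12, 0.09, 0.9, 0.88, 0.91]
units = train_som(data)
```

In the text-mining setting, documents whose vectors map to the same or nearby units form clusters, and inputs that land far from every trained unit are candidates for the "surprise" indicators the toolkit looks for.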
Hempstead, Antoinette R.; Brown, Kenneth L.
A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.
The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or "computational grids". The Globus toolkit is an implementation of a "bag of services" architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, and remote data access. Different applications and tools can combine these services in different ways to construct "grid-enabled" systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high-throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.
This toolkit supports component-level model-based fault detection methods in commercial building HVAC systems. The toolbox consists of five basic modules: a parameter estimator for model calibration, a preprocessor, an AHU model simulator, a steady-state detector, and a comparator. Each of these modules and the fuzzy logic rules for fault diagnosis are described in detail. The toolbox is written in C++ and also invokes the SPARK simulation program.
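The steady-state detector and comparator modules can be illustrated with a minimal sketch. The thresholds, temperatures, and variable names below are hypothetical, and the toolkit's fuzzy logic rules are reduced here to a single crisp residual threshold:

```python
def is_steady(window, tol=0.5):
    """Steady-state detector (simplified): a window of readings is
    'steady' when every reading stays within tol of the window mean."""
    mean = sum(window) / len(window)
    return all(abs(v - mean) <= tol for v in window)

def compare(measured, predicted, threshold=2.0):
    """Comparator: flag a fault when the residual between measurement
    and model prediction exceeds the threshold (a crisp stand-in for
    the toolkit's fuzzy diagnosis rules)."""
    residual = measured - predicted
    return ("fault" if abs(residual) > threshold else "ok", residual)

# Hypothetical supply-air temperatures (deg C): steady, but about 3 C
# above the calibrated AHU model's prediction -> diagnosed as a fault.
temps = [16.1, 16.0, 16.2, 15.9, 16.0]
status, residual = ("skipped", None)
if is_steady(temps):
    status, residual = compare(sum(temps) / len(temps), predicted=13.0)
```

The gate matters: comparing model to measurement during a transient would produce spurious residuals, which is why the steady-state detector runs first.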
LBNL: Architecture 2030 District Program and Small Commercial Toolkit. Lead Performer: Lawrence Berkeley National Laboratory, Berkeley, CA. Partners: Architecture 2030, Santa Fe, NM; Cleveland 2030 District, Cleveland, OH; Green Building Alliance/Pittsburgh 2030 District, Pittsburgh, PA; Seattle 2030 District, Seattle, WA; ...
Workplace Charging Toolkit: Workshop Speaker Instruction Letter Template. Informs speakers participating in the employer experience panel about their role in the event. General Speaker Instruction Letter Template (707.41 KB); Clean Cities Branded Speaker Instruction Letter Template (707.23 KB).
Online Toolkit Fosters Bioenergy Innovation. January 21, 2011. Learn more about the Bioenergy Knowledge Discovery Framework, an online data sharing and mapping toolkit. Paul Bryan, Biomass Program Manager, Office of Energy Efficiency & Renewable Energy. What will the project do? The $241 million loan guarantee for Diamond Green Diesel will support the construction of a facility that will nearly triple the amount of renewable ...
Energy Supply Chain Risk Management (SCRM) Awareness Toolkit. The Office of the Chief Information Officer (OCIO) Supply Chain Risk Management Resource Center developed the SCRM Awareness Toolkit to introduce DOE employees to the basic terms and concepts of the technology supply chain and associated threats. For additional information on the DOE Enterprise SCRM Resource Center and program initiatives, please contact Sue Farrand at ...
Cyber Security Audit and Attack Detection Toolkit: Bandolier and Portaledge, March 2010. This project of the cyber security audit and attack detection toolkit will employ Bandolier Audit Files for optimizing security configurations and the Portaledge event detection capability for energy control systems. By building configuration audit and attack detection capabilities into ...
The Oak Ridge Institute for Science and Education (ORISE) distributed more than 400 radiological terrorism toolkits filled with key resources, such as training guidelines, clinical directives, and details about radioactive ...
The Development Impacts Assessment (DIA) Toolkit offers a structured way to consider the impact of proposed low emission development strategies (LEDS) on other national or local priorities, and provides links to ...
Title: YT: A Multi-Code Analysis Toolkit for ... Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing ...
Overview of DOE's Regulatory and Permitting Information Desktop (RAPID) Toolkit project, providing information on where to go to view documents and who to contact to get involved.
Impacts Assessment Toolkit: Remote Expert Assistance on LEDS. The Low Emission Development Strategies (LEDS) Global Partnership provides timely, ...
Guide/Handbook: ACHP - Section 106 Applicant Toolkit (Permitting/Regulatory Guidance) ...
Tool Summary. Name: Grid-Connected Renewable Energy Generation Toolkit - Biomass. Agency/Company/Organization: United States Agency for ...
Website: pdf.wri.orgworkingpapersgfitenureindicatorssep09.pdf. Cost: Free. WRI - The Governance of Forests Toolkit.
This wiki-based Wind Working Group Toolkit provides links to information, methods, and resources. This wiki is a work in progress, and we welcome your contributions.
workshops that have been presented to the Asia-Pacific Economic Cooperation (APEC) and other nations around the world. By developing training toolkits and providing...
Workplace Charging Toolkit: Workshop Host Outreach Letter Template. Approach employers in your community that already have workplace charging to serve ...
Tool Summary. Name: Distributed Renewable Energy Finance and Policy Toolkit. Agency/Company/Organization: Clean Energy States Alliance ...
Web Site: Renewable Energy and Defense Geospatial Database. Abstract: This database provides GIS data ...
McLendon, William Clarence, III; Wylie, Brian Neil
Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.
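A typical operation in a graph-informatics toolkit of this kind, finding the shortest relationship chain between two entities, might look like the following sketch. The entity names and graph layout are invented for illustration; this is not the Sandia toolkit's API:

```python
from collections import deque

def bfs_path(graph, src, dst):
    """Shortest relationship chain between two entities in an
    undirected graph (adjacency-list dict), via breadth-first search."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            # reconstruct the chain by walking parents back to src
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nbr in graph.get(node, ()):
            if nbr not in parent:
                parent[nbr] = node
                queue.append(nbr)
    return None

# Toy entity-relationship graph (people, organizations, shipments)
graph = {
    "person_a": ["org_x"],
    "org_x": ["person_a", "shipment_1"],
    "shipment_1": ["org_x", "org_y"],
    "org_y": ["shipment_1", "person_b"],
    "person_b": ["org_y"],
}
chain = bfs_path(graph, "person_a", "person_b")
```

Chains like this one (person to organization to shipment to person) are the kind of result an analyst would then inspect in a visualization front end.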
Koch, Daniel B; Payne, Patricia W
Although the use of Geographic Information Systems (GIS) by centrally-located operations staff is well established in the area of emergency response, utilization by first responders in the field is uneven. Cost, complexity, and connectivity are often the deciding factors preventing wider adoption. For the past several years, Oak Ridge National Laboratory (ORNL) has been developing a mobile GIS solution using free and open-source software targeting the needs of front-line personnel. Termed IMPACT, for Incident Management Preparedness and Coordination Toolkit, this ORNL application can complement existing GIS infrastructure and extend its power and capabilities to responders first on the scene of a natural or man-made disaster.
Smith, Brian Edward
The parallel analysis toolkit (ParCAT) provides parallel statistical processing of large climate model simulation datasets. ParCAT provides parallel point-wise average calculations, frequency distributions, sums/differences of two datasets, and difference-of-average and average-of-difference for two datasets for arbitrary subsets of simulation time. ParCAT is a command-line utility that can be easily integrated into scripts or embedded in other applications. ParCAT supports CMIP5 post-processed datasets as well as non-CMIP5 post-processed datasets. ParCAT reads and writes standard netCDF files.
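The distinction between difference-of-average and average-of-difference can be sketched as follows (toy serial Python, not ParCAT itself). For fully aligned series the two agree by linearity; they diverge once each side is averaged over a different subset of simulation time:

```python
def mean(xs):
    return sum(xs) / len(xs)

def difference_of_averages(a, b):
    """Average each dataset over time first, then subtract the averages."""
    return mean(a) - mean(b)

def average_of_differences(a, b):
    """Subtract point-wise first, then average the differences."""
    return mean([x - y for x, y in zip(a, b)])

# Two toy simulation time series (e.g., a variable from two model runs)
run_a = [1.0, 2.0, 3.0, 4.0]
run_b = [0.5, 1.5, 2.5, 3.5]
d1 = difference_of_averages(run_a, run_b)
d2 = average_of_differences(run_a, run_b)
```

Here both quantities equal 0.5; offering both operations matters when the operands are reduced over different time subsets or when missing points make the point-wise pairing incomplete.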
This package contains a number of systems utilities for managing a set of computers joined in a "cluster". The utilities assist a team of systems administrators in managing the cluster by automating routine tasks, centralizing information, and monitoring individual computers within the cluster. Included in the toolkit are scripts used to boot a computer from a floppy, a program to turn on and off the power to a system, and a system for using a database to organize cluster information.
Stephan, Eric G.; Burke, John S.; Carlson, Carrie A.; Gillen, David S.; Joslyn, Cliff A.; Olsen, Bryan K.; Critchlow, Terence J.
The Department of Homeland Security law enforcement faces the continual challenge of analyzing custom data sources in a geospatial context. From a strategic perspective, law enforcement must first broadly characterize a given situation using these custom data sources and then, once the situation is summarily understood, analyze the data geospatially in detail.
This Memorandum serves to update and supersede the Financial Toolkit, previously issued in Weatherization Program Notice (WPN) 12-4, Weatherization Assistance Program Financial Management Training Toolkit dated December 9, 2011, and the Procurement Toolkit previously issued in WPN 10-3, Procurement Toolkit CD, dated October 30, 2009.
Brim, Michael J; Lothian, Josh
We discuss the design and ongoing development of the Monitoring Extreme-scale Lustre Toolkit (MELT), a unified Lustre performance monitoring and analysis infrastructure that provides continuous, low-overhead summary information on the health and performance of Lustre, as well as on-demand, in-depth problem diagnosis and root-cause analysis. The MELT infrastructure leverages a distributed overlay network to enable monitoring of center-wide Lustre filesystems where clients are located across many network domains. We preview interactive command-line utilities that help administrators and users to observe Lustre performance at various levels of resolution, from individual servers or clients to whole filesystems, including job-level reporting. Finally, we discuss our future plans for automating the root-cause analysis of common Lustre performance problems.
There is a growing interest in using renewable energy options like solar to improve community resiliency. During extreme weather events, solar can help prevent power outages by providing emergency energy to critical facilities and recovery efforts. Solar can provide electricity to remote or less accessible areas, and is flexible enough to be a mobile or temporary power source. In addition to resiliency planning, there are efforts to promote new safety education and guidelines for solar installation, especially related to fire prevention. Also, there is rising interest in how solar can influence homeland security. The Solar for Safety, Security, and Resilience Toolkit demonstrates the different ways of thinking about and incorporating solar energy into ongoing public safety, homeland security, and resiliency initiatives.
The TEVA-SPOT Toolkit (SPOT) supports the design of contaminant warning systems (CWSs) that use real-time sensors to detect contaminants in municipal water distribution networks. Specifically, SPOT provides the capability to select the locations for installing sensors in order to maximize the utility and effectiveness of the CWS. SPOT models the sensor placement process as an optimization problem, and the user can specify a wide range of performance objectives for contaminant warning system design, including population health effects, time to detection, extent of contamination, volume consumed, and number of failed detections. For example, a SPOT user can integrate expert knowledge during the design process by specifying required sensor placements or designating network locations as forbidden. Further, cost considerations can be integrated by limiting the design with user-specified installation costs at each location.
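Sensor placement of this kind is often approached greedily. The sketch below is a simplified stand-in for SPOT's optimization solvers, with an invented impact table: it repeatedly adds the candidate location that most reduces total scenario impact, while honoring required and forbidden locations:

```python
def greedy_placement(impact, candidates, budget, forbidden=(), required=()):
    """Greedy sensor placement: repeatedly add the candidate location
    that most reduces the summed impact across contamination scenarios.

    impact[scenario][location] gives the damage if that scenario is
    first detected by a sensor at that location (lower is better)."""
    chosen = list(required)

    def total_impact(sensors):
        # each scenario is scored by its best (lowest-impact) sensor
        return sum(min(row[s] for s in sensors) for row in impact)

    while len(chosen) < budget:
        best = min((loc for loc in candidates
                    if loc not in chosen and loc not in forbidden),
                   key=lambda loc: total_impact(chosen + [loc]))
        chosen.append(best)
    return chosen

# Toy table: 3 contamination scenarios x 4 candidate node locations
impact = [[9, 2, 7, 5],
          [1, 8, 6, 4],
          [5, 5, 2, 3]]
sensors = greedy_placement(impact, candidates=[0, 1, 2, 3], budget=2)
```

Seeding `required` with expert-mandated locations and listing others in `forbidden` mirrors the expert-knowledge constraints the abstract describes.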
Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.
Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the 'diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.
Many applications of modeling spatial dynamic systems focus on a single system and a single process, ignoring the geographic and systemic context of the processes being modeled. A solution to this problem is the coupled modeling of spatial dynamic systems. Coupled modeling is challenging for both technical reasons, as well as conceptual reasons. This paper explores the benefits and challenges to coupling or linking spatial dynamic models, from loose coupling, where information transfer between models is done by hand, to tight coupling, where two (or more) models are merged as one. To illustrate the challenges, a coupled model of Urbanization and Wildfire Risk is presented. This model, called Vesta, was applied to the Santa Barbara, California region (using real geospatial data), where Urbanization and Wildfires occur and recur, respectively. The preliminary results of the model coupling illustrate that coupled modeling can lead to insight into the consequences of processes acting on their own.
Toolkit banner, typically across the top of the page, which features a unique blue color scheme and simple menu. Parameters: none. Usage: it should be called in the following ...
Simulation Toolkit Promises Better Wind Predictions, Increased Farm Production. May 11, 2016. Wind farm production frequently falls short of expectations. Poor forecasts of low-altitude winds, suboptimal wind plant design and operation, and higher-than-expected downtimes and maintenance costs all undermine project profitability. Each of these issues results from ...
Geospatial Analysis of Renewable Energy Technical Potential on Tribal Lands. E. Doris, A. Lopez, and D. Beckley, National Renewable Energy Laboratory. U.S. Department of Energy | Office of Indian Energy, 1000 Independence Ave. SW, Washington, DC 20585 | 202-586-1272 | energy.gov/indianenergy. NOTICE: This report was prepared as an account of work sponsored by an agency of the United States government.
Next Generation (NextGen) Geospatial Information System (GIS). July 12, 2013. The U.S. Department of Energy Office of Legacy Management (LM) manages environmental records from Cold War legacy sites spanning nearly 40 years. These records are a key LM asset and must be managed and maintained efficiently and effectively. There are over 16 different applications that support the databases containing ...
The Uncertainty Quantification (UQ) Toolkit is a software library for the characterization and propagation of uncertainties in computational models. For the characterization of uncertainties, Bayesian inference tools are provided to infer uncertain model parameters, as well as Bayesian compressive sensing methods for discovering sparse representations of high-dimensional input-output response surfaces, and Karhunen-Loève expansions for representing stochastic processes. Uncertain parameters are treated as random variables and represented with Polynomial Chaos expansions (PCEs). The library implements several spectral basis function types (e.g., Hermite basis functions in terms of Gaussian random variables or Legendre basis functions in terms of uniform random variables) that can be used to represent random variables with PCEs. For propagation of uncertainty, tools are provided to propagate PCEs that describe the input uncertainty through the computational model using either intrusive methods (Galerkin projection of equations onto basis functions) or non-intrusive methods (performing deterministic operations at sampled values of the random variables and projecting the obtained results onto basis functions).
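The non-intrusive route can be made concrete with a small sketch: estimate PCE coefficients by projecting sampled model outputs onto probabilists' Hermite polynomials. This is a Monte Carlo sketch of the general technique, not UQ Toolkit code (the library uses quadrature and other estimators):

```python
import math
import random

def hermite(n, x):
    """Probabilists' Hermite polynomial He_n(x) via the recurrence
    He_{k+1}(x) = x He_k(x) - k He_{k-1}(x)."""
    h0, h1 = 1.0, x
    if n == 0:
        return h0
    for k in range(1, n):
        h0, h1 = h1, x * h1 - k * h0
    return h1

def nipc_coeffs(model, mu, sigma, order=2, n_samples=100000, seed=1):
    """Non-intrusive projection: c_k = E[model(x) He_k(xi)] / k!,
    with x = mu + sigma * xi and xi ~ N(0, 1), estimated by sampling.
    (k! is the squared norm of He_k under the Gaussian weight.)"""
    rng = random.Random(seed)
    sums = [0.0] * (order + 1)
    for _ in range(n_samples):
        xi = rng.gauss(0.0, 1.0)
        y = model(mu + sigma * xi)
        for k in range(order + 1):
            sums[k] += y * hermite(k, xi)
    return [s / n_samples / math.factorial(k) for k, s in enumerate(sums)]

# Quadratic model: the exact PCE is [mu^2 + sigma^2, 2*mu*sigma, sigma^2]
c = nipc_coeffs(lambda x: x * x, mu=1.0, sigma=0.5)
```

With mu = 1 and sigma = 0.5 the estimated coefficients converge to (1.25, 1.0, 0.25), and the output variance follows from the coefficients as c1^2 * 1! + c2^2 * 2!.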
Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.
The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects, culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, the Office of Naval Research (ONR) National Center for Advanced Secure Systems Research (NCASSR), and the Office of the Secretary of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to more than 6 government organizations and 30 DoD users. For the past five DoD-wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.
The C-Space Toolkit provides a software library that makes it easier to program motion planning, simulation, robotics, and virtual reality codes using the Configuration Space abstraction. Key functionality (1) enables the user to create specialized representations of movable and stationary rigid geometric objects, and (2) performs fast distance, interference (clash) detection, collision detection, closest-feature-pair, and contact queries in terms of object configuration. Not only can queries be computed at any given point in configuration space, but they can be done exactly over linear-translational path segments and approximately for rotational path segments. Interference detection and distance computations can be done with respect to the Minkowski sum of the original geometry and a piece of convex geometry. The Toolkit takes as raw model input (1) collections of convex polygons that form the boundaries of models and (2) convex polyhedra, cones, cylinders, and discs that are models and model components. Configurations are given in terms of homogeneous transforms. A simple OpenGL-based system for displaying and animating the geometric objects is included in the implementation. This version, 2.5 Beta, incorporates feature additions and enhancements, improvements in algorithms, improved robustness, bug fixes and cleaned-up source code, better compliance with standards and recent programming conventions, changes to the build process for the software, support for more recent hardware and software platforms, and improvements to documentation and source-code comments.
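The exact query over a linear-translational path segment has a neat closed form for simple shapes: by the Minkowski-sum idea, a disc swept along a segment clashes with a disc obstacle exactly when the segment-to-center distance is at most the sum of the radii. A 2-D toy sketch of that reduction (not the Toolkit's API):

```python
import math

def seg_point_dist(p0, p1, q):
    """Minimum distance from point q to the segment p0-p1."""
    vx, vy = p1[0] - p0[0], p1[1] - p0[1]
    wx, wy = q[0] - p0[0], q[1] - p0[1]
    vv = vx * vx + vy * vy
    # clamp the projection parameter to the segment
    t = 0.0 if vv == 0 else max(0.0, min(1.0, (wx * vx + wy * vy) / vv))
    cx, cy = p0[0] + t * vx, p0[1] + t * vy
    return math.hypot(q[0] - cx, q[1] - cy)

def clash_along_path(start, end, r_moving, obstacle, r_obstacle):
    """Exact clash check for a disc translated along a straight path:
    shrink the moving disc to a point and grow the obstacle by the
    Minkowski sum of the two radii, then test segment-point distance."""
    return seg_point_dist(start, end, obstacle) <= r_moving + r_obstacle

# A unit disc sliding from (0,0) to (10,0) past two disc obstacles
hit = clash_along_path((0, 0), (10, 0), 1.0, obstacle=(5, 1.2), r_obstacle=0.4)
miss = clash_along_path((0, 0), (10, 0), 1.0, obstacle=(5, 3.0), r_obstacle=0.4)
```

The same point-versus-grown-obstacle reduction is what makes exact queries over translational segments tractable for the convex primitives the Toolkit accepts.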
XOP 2.1 - a new version of the x-ray optics software toolkit. No abstract ...
King, J.; Clifton, A.; Hodge, B. M.
Renewable energy integration studies require wind data sets of high quality with realistic representations of the variability, ramping characteristics, and forecast performance for current wind power plants. The Wind Integration National Data Set (WIND) Toolkit is meant to be an update for and expansion of the original data sets created for the weather years from 2004 through 2006 during the Western Wind and Solar Integration Study and the Eastern Wind Integration Study. The WIND Toolkit expands these data sets to include the entire continental United States, increasing the total number of sites represented, and it includes the weather years from 2007 through 2012. In addition, the WIND Toolkit has a finer resolution for both the temporal and geographic dimensions. Three separate data sets will be created: a meteorological data set, a wind power data set, and a forecast data set. This report describes the validation of the wind power data set.
Following the release of The Solar Foundation's Brighter Future: A Study on Solar in U.S. Schools under the Solar Outreach Partnership, the organization has been working to help more K-12 public schools go solar through its technical assistance program. As part of this effort, they developed the Toolkit for Installing Solar on K-12 Schools to compile new and existing resources, designed to provide public school officials with a starting point for pursuing their own solar projects. Hands-on guidance in putting the ideas contained within this toolkit into action is available through the SolarOPs Technical Assistance program.
Bollinger, J.; Garrett, Alfred; Koffman, Larry; Hayes, David
3-D hydrodynamic models are used by the Savannah River National Laboratory (SRNL) to simulate the transport of thermal and radionuclide discharges in coastal estuary systems. Development of such models requires accurate bathymetry, coastline, and boundary condition data in conjunction with the ability to rapidly discretize model domains and interpolate the required geospatial data onto the domain. To facilitate rapid and accurate hydrodynamic model development, SRNL has developed a pre- and post-processor application in a geospatial framework to automate the creation of models using existing data. This automated capability allows development of very detailed models to maximize exploitation of available surface water radionuclide sample data and thermal imagery.
Del Rio, Nicholas R.; Pinheiro da Silva, Paulo
When constructing visualization pipelines using toolkits such as Visualization Toolkit (VTK) and Generic Mapping Tools (GMT), developers must understand (1) what toolkit operators will transform their data from its raw state to some required view state and (2) what viewers are available to present the generated view. Traditionally, developers learn about how to construct visualization pipelines by reading documentation and inspecting code examples, which can be costly in terms of the time and effort expended. Once an initial pipeline is constructed, developers may still have to undergo a trial and error process before a satisfactory visualization is generated. This paper presents the Visualization Knowledge Project (VisKo) that is built on a knowledge base of visualization toolkit operators and how they can be piped together to form visualization pipelines. Developers may now rely on VisKo to guide them when constructing visualization pipelines and in some cases, when VisKo has complete knowledge about some set of operators (i.e., sequencing and parameter settings), automatically generate a fully functional visualization pipeline.
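The pipeline-planning idea behind VisKo can be sketched as a search over operator declarations. The operator names and data states below are invented for illustration, not actual VTK or GMT operators:

```python
from collections import deque

# Hypothetical operator registry: each operator declares the data state
# it consumes and the data state it produces.
OPERATORS = {
    "extract_isosurface": ("raw_grid", "polygons"),
    "decimate": ("polygons", "small_polygons"),
    "render_polygons": ("small_polygons", "image"),
    "volume_render": ("raw_grid", "image"),
}

def plan_pipeline(start, goal):
    """Breadth-first search over the declarations: returns the shortest
    operator sequence transforming `start` data into the `goal` view."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, ops = queue.popleft()
        if state == goal:
            return ops
        for name, (src, dst) in OPERATORS.items():
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append((dst, ops + [name]))
    return None

pipeline = plan_pipeline("raw_grid", "image")
```

Given these declarations the planner prefers the one-step volume-rendering route over the three-step surface route, which is the kind of automatic pipeline generation the paper describes when the knowledge base fully covers an operator set.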
CUBIT prepares models to be used in computer-based simulation of real-world events. CUBIT is a full-featured software toolkit for robust generation of two- and three-dimensional finite element meshes (grids) and geometry preparation. Its main goal is to reduce the time to generate meshes, particularly large hex meshes of complicated, interlocking assemblies.
Renne, D. S.; Kelly, M.; Elliott, D.; George, R.; Scott, G.; Haymes, S.; Heimiller, D.; Milbrandt, A.; Cowlin, S.; Gilman, P.; Perez, R.
The U.S. National Renewable Energy Laboratory (NREL) has recently completed the production of high-resolution wind and solar energy resource maps and related data products for Afghanistan and Pakistan. The resource data have been incorporated into a geospatial toolkit (GsT), which allows the user to manipulate the resource information along with country-specific geospatial information such as highway networks, power facilities, transmission corridors, protected land areas, etc. The toolkit allows users to then transfer resource data for specific locations into NREL's micropower optimization model known as HOMER.
Doris, E.; Lopez, A.; Beckley, D.
This technical report uses an established geospatial methodology to estimate the technical potential for renewable energy on tribal lands, to help Tribes prioritize the development of renewable energy resources either for community-scale use on tribal land or for revenue-generating electricity sales.
Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.
In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density on more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at 2-km and at a 5-minute resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, as then corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to provide users with tools to validate data of their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compares to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
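The error metrics named above are straightforward to compute. The sketch below uses plain Python rather than the toolkit's R code, with made-up wind speeds, to show bias, RMSE, centered RMSE, MAE, and percent error:

```python
import math

def validation_metrics(model, obs):
    """Compare modeled and observed wind speeds with the usual metrics.
    Note the identity rmse^2 = bias^2 + crmse^2: centering removes the
    bias component from the error."""
    n = len(model)
    errors = [m - o for m, o in zip(model, obs)]
    bias = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    crmse = math.sqrt(sum((e - bias) ** 2 for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n
    pct = 100.0 * bias / (sum(obs) / n)   # percent error vs. observed mean
    return {"bias": bias, "rmse": rmse, "crmse": crmse,
            "mae": mae, "pct": pct}

model_ws = [5.0, 6.5, 7.0, 8.5]   # modeled wind speed (m/s), invented
obs_ws = [5.5, 6.0, 7.5, 8.0]     # observed wind speed (m/s), invented
m = validation_metrics(model_ws, obs_ws)
```

Here the errors alternate in sign, so the bias (and percent error) vanish while RMSE, centered RMSE, and MAE all equal 0.5 m/s, illustrating why a near-zero bias alone does not certify the data.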
Cyber Security Audit and Attack Detection Toolkit: National SCADA Test Bed, May 2008. This project of the cyber security audit and attack detection toolkit is adding control system intelligence to widely deployed enterprise vulnerability scanners and security event managers. While many energy utilities employ vulnerability scanners and security event managers (SEMs) on their enterprise ...
Roberts, Randy S.; Trucano, Timothy G.; Pope, Paul A.; Aragon, Cecilia R.; Jiang, Ming; Wei, Thomas; Chilton, Lawrence; Bakel, A. J.
Verification and validation (V&V) of geospatial image analysis algorithms is a difficult task and is becoming increasingly important. While there are many types of image analysis algorithms, we focus on developing V&V methodologies for algorithms designed to provide textual descriptions of geospatial imagery. In this paper, we present a novel methodological basis for V&V that employs a domain-specific ontology, which provides a naming convention for a domain-bounded set of objects and a set of named relationships between these objects. We describe a validation process that proceeds by objectively comparing benchmark imagery, produced using the ontology, with algorithm results. As an example, we describe how the proposed V&V methodology would be applied to algorithms designed to provide textual descriptions of facilities.
Koch, Daniel B
Effective planning, response, and recovery (PRR) involving terrorist attacks or natural disasters come with a vast array of information needs. Much of the required information originates from disparate sources in widely differing formats. However, one common attribute the information often possesses is physical location. The organization and visualization of this information can be critical to the success of the PRR mission. Organizing information geospatially is often the most intuitive for the user. In the course of developing a field tool for the U.S. Department of Homeland Security (DHS) Office for Bombing Prevention, a geospatial integrated problem solving environment software framework was developed by Oak Ridge National Laboratory. This framework has proven useful as well in a number of other DHS, Department of Defense, and Department of Energy projects. An overview of the software architecture along with application examples are presented.
This article contains a brief summary of some of the 2006 annual committee reports presented to the Energy Minerals Division (EMD) of the American Association of Petroleum Geologists. The purpose of the reports is to advise EMD leadership and members of the current status of research and developments of energy resources (other than conventional oil and natural gas that typically occur in sandstone and carbonate rocks), energy economics, and geospatial information. This summary presented here by the EMD is a service to the general geologic community. Included in this summary are reviews of the current research and activities related to coal, coalbed methane, gas hydrates, gas shales, geospatial information technology related to energy resources, geothermal resources, oil sands, and uranium resources.
Pabian, Frank Vincent
This paper focuses on the importance and potential role of the new, freely available geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency', and they can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as to support ongoing monitoring and verification of various treaty-relevant (e.g., NPT, FMCT) activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information from some recent public media reporting on a former clandestine underground plutonium production complex (now being converted to a 'Tourist Attraction' following the site's abandonment by China in the early 1980s). That open source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible after the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram through original analysis of the overhead imagery.
The Regulatory and Permitting Information Desktop (RAPID) Toolkit provides information about permits and regulations that affect renewable energy and bulk transmission projects.
Vatsavai, Raju; Bhaduri, Budhendra L
Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on the spectral characteristics of thematic classes, whose statistical distributions are often overlapping. The spectral response distributions of thematic classes depend on many factors, including elevation, soil types, and the atmospheric conditions present at the time of data acquisition. A second problem with statistical classifiers is the requirement of a large number of accurate training samples, which are often costly and time consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing a large number of easily available unlabeled training samples. Unfortunately, there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows over 15% improvement in classification accuracy over conventional classification schemes.
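The core loop of such a semi-supervised scheme — fit class-conditional models from a handful of labeled samples, then refine the parameter estimates with EM over plentiful unlabeled samples — can be sketched in miniature. This is a 1-D, one-Gaussian-per-class illustration on synthetic data, not the paper's multisource algorithm:

```python
import math
import random

def gauss_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_ml(samples):
    # ML estimates of mean and variance from a few labeled samples.
    mu = sum(samples) / len(samples)
    var = sum((s - mu) ** 2 for s in samples) / len(samples)
    return mu, max(var, 1e-6)

def em_refine(params, unlabeled, iters=20):
    # Refine per-class (mean, variance) with EM over unlabeled samples.
    priors = [1.0 / len(params)] * len(params)
    for _ in range(iters):
        # E-step: soft class memberships for each unlabeled sample.
        resp = []
        for x in unlabeled:
            w = [p * gauss_pdf(x, mu, var)
                 for p, (mu, var) in zip(priors, params)]
            tot = sum(w) or 1.0
            resp.append([wi / tot for wi in w])
        # M-step: re-estimate parameters and priors from soft counts.
        new_params = []
        for k in range(len(params)):
            nk = max(sum(r[k] for r in resp), 1e-9)
            mu = sum(r[k] * x for r, x in zip(resp, unlabeled)) / nk
            var = sum(r[k] * (x - mu) ** 2 for r, x in zip(resp, unlabeled)) / nk
            new_params.append((mu, max(var, 1e-6)))
            priors[k] = nk / len(unlabeled)
        params = new_params
    return params

def classify(x, params):
    scores = [gauss_pdf(x, mu, var) for mu, var in params]
    return scores.index(max(scores))

random.seed(2)
labeled = [[random.gauss(0.0, 1.0) for _ in range(5)],   # class 0, e.g. "water"
           [random.gauss(5.0, 1.0) for _ in range(5)]]   # class 1, e.g. "forest"
unlabeled = ([random.gauss(0.0, 1.0) for _ in range(200)] +
             [random.gauss(5.0, 1.0) for _ in range(200)])
params = em_refine([fit_ml(s) for s in labeled], unlabeled)
```

On well-separated synthetic classes, the EM-refined means typically land much closer to the true class centers than the five-sample ML estimates do.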
Bhaduri, Budhendra L
For many decades, the Department of Energy (DOE) has been a leader in basic scientific and engineering research that utilizes geospatial science to advance the state of knowledge in disciplines impacting national security, energy sustainability, and environmental stewardship. DOE recently established a comprehensive Geospatial Science Program that will provide an enterprise geographic information system infrastructure connecting all elements of DOE to critical geospatial data and associated geographic information services (GIServices). The Geospatial Science Program will provide a common platform for enhanced scientific and technical collaboration across DOE's national laboratories and facilities.
Prabhat, Mr; Ruebel, Oliver; Byna, Surendra; Wu, Kesheng; Li, Fuyu; Wehner, Michael; Bethel, E. Wes
We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.
Caroline Draxl: NREL
Regional wind integration studies require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind datasets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, and must be time-synchronized with available load profiles. As described in this presentation, the WIND Toolkit fulfills these requirements by providing a state-of-the-art national (US) wind resource, power production, and forecast dataset.
Chin, Shih-Miao; Hwang, Ho-Ling; Peterson, Bruce E
This paper highlights geospatial science-related innovations and developments conducted by the Center for Transportation Analysis (CTA) at the Oak Ridge National Laboratory. CTA researchers have been developing integrated inter-modal transportation solutions through innovative and cost-effective research and development for many years. Specifically, this paper profiles CTA-developed Geographic Information System (GIS) products that are publicly available. Examples of these GIS-related products include: the CTA Transportation Networks; GeoFreight system; and the web-based Multi-Modal Routing Analysis System. In addition, an application on assessment of railroad Hazmat routing alternatives is also discussed.
Neher, L A
The focus of this contract (in the summer and fall of 2001) was originally to help the California Energy Commission (CEC) locate and evaluate potential sites for electric power generation facilities and to assist the CEC in addressing areas of congestion on transmission lines and natural gas supply line corridors. Subsequent events have reduced the immediate urgency, although not the ultimate need, for such analyses. Software technology for deploying interactive geographic information systems (GIS) accessible over the Internet has developed to the point that it is now practical to develop and publish GIS web sites that have substantial viewing, movement, query, and even map-making capabilities. As part of a separate project not funded by the CEC, the GIS Center at LLNL, on an experimental basis, has developed a web site to explore the technical difficulties as well as the interest in such a web site by agencies and others concerned with energy research. This exploratory effort offers the potential for developing an interactive GIS web site for use by the CEC for energy research, policy analysis, site evaluation, and permit and regulatory matters. To help ground the geospatial capabilities in the realistic requirements and needs of the CEC staff, the CEC requested that the GIS Center conduct interviews of several CEC staff persons to establish their current and envisioned use of spatial data and requirements for geospatial analyses. This survey will help define a web-accessible central GIS database for the CEC, which will augment the well-received work of the CEC Cartography Unit. Individuals within each siting discipline have been contacted and their responses to three question areas have been summarized. The web-based geospatial data and analytical tools developed within this project will be available to CEC staff for initial area studies, queries, and informal, small-format maps. It is not designed for fine cartography or for large-format posters such as the
Young, K. R.; Levine, A.
The Regulatory and Permitting Information Desktop (RAPID) Toolkit combines the former Geothermal Regulatory Roadmap, National Environmental Policy Act (NEPA) Database, and other resources into a Web-based tool that gives the regulatory and utility-scale geothermal developer communities rapid and easy access to permitting information. RAPID currently comprises five tools - Permitting Atlas, Regulatory Roadmap, Resource Library, NEPA Database, and Best Practices. A beta release of an additional tool, the Permitting Wizard, is scheduled for late 2014. Because of the huge amount of information involved, RAPID was developed in a wiki platform to allow industry and regulatory agencies to maintain the content in the future so that it continues to provide relevant and accurate information to users. In 2014, the content was expanded to include regulatory requirements for utility-scale solar and bulk transmission development projects. Going forward, development of the RAPID Toolkit will focus on expanding the capabilities of current tools, developing additional tools, including additional technologies, and continuing to increase stakeholder involvement.
Standart, G. D.; Stulken, K. R.; Zhang, Xuesong; Zong, Ziliang
The Earth Resources Observation and Science (EROS) Center of the U.S. Geological Survey is currently managing and maintaining the world's largest satellite image distribution system, which provides 24/7 free download service for researchers all over the globe in many areas such as geology, hydrology, climate modeling, and Earth sciences. A large amount of geospatial data contained in the satellite images maintained by EROS is generated every day. However, this data is not well utilized due to the lack of efficient data visualization tools. This software implements a method for visualizing various characteristics of the global satellite image download requests. More specifically, Keyhole Markup Language (KML) files are generated which can be loaded into an earth browser such as Google Earth. Colored rectangles associated with stored satellite scenes are painted onto the earth browser, and the color and opacity of each rectangle is varied as a function of the popularity of the corresponding satellite image. An analysis of the geospatial information obtained relative to specified time constraints provides the ability to relate image download requests to environmental, political, and social events.
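The rendering step described above amounts to emitting one KML Placemark per scene footprint, with the polygon's alpha channel driven by download popularity. A stdlib-only sketch (the scene names, footprints, and popularity numbers below are invented for illustration):

```python
import xml.etree.ElementTree as ET

def scene_placemark(name, west, south, east, north, popularity, max_pop):
    # KML colors are aabbggrr hex; scale the alpha byte with popularity.
    alpha = int(64 + 191 * popularity / max_pop)
    color = "%02xff0000" % alpha            # semi-transparent blue
    ring = "%f,%f %f,%f %f,%f %f,%f %f,%f" % (
        west, south, east, south, east, north, west, north, west, south)
    return ("<Placemark><name>%s</name>"
            "<Style><PolyStyle><color>%s</color></PolyStyle></Style>"
            "<Polygon><outerBoundaryIs><LinearRing><coordinates>%s"
            "</coordinates></LinearRing></outerBoundaryIs></Polygon>"
            "</Placemark>" % (name, color, ring))

def build_kml(scenes):
    body = "".join(scene_placemark(*s) for s in scenes)
    return ('<kml xmlns="http://www.opengis.net/kml/2.2">'
            "<Document>%s</Document></kml>" % body)

# Invented scene footprints: (name, W, S, E, N, downloads, max downloads).
scenes = [("scene_p22_r39", -85.0, 34.0, -83.5, 35.5, 120, 400),
          ("scene_p22_r38", -85.0, 35.5, -83.5, 37.0, 400, 400)]
kml = build_kml(scenes)
ET.fromstring(kml)   # well-formed XML, loadable by Google Earth
```

KML encodes colors as aabbggrr hex, so varying only the leading byte changes opacity while keeping the hue fixed.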
Draxl, C.; Hodge, B. M.; Clifton, A.; McCaa, J.
The Wind Integration National Dataset (WIND) Toolkit described in this report fulfills these requirements, and constitutes a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013 for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit is a wind resource data set, wind forecast data set, and wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.
Andrea Alfonsi; Cristian Rabiti; Aaron S. Epiney; Yaqi Wang; Joshua Cogliati
The principal idea of this paper is to present the new capabilities available in the PHISICS toolkit connected with the implementation of the depletion code MRTAU, a generic depletion/decay/burn-up code developed at the Idaho National Laboratory. It is programmed in a modular structure using modern Fortran 95/2003. The code tracks the time evolution of the isotopic concentration of a given material, accounting for nuclear reactions occurring in the presence of a neutron flux as well as natural decay. MRTAU has two different methods to perform the depletion calculation, letting the user choose the one best suited to their needs. Both the methodologies and some significant results are reported in this paper.
MSTK, or Mesh Toolkit, is a mesh framework that allows users to represent, manipulate, and query unstructured 3D arbitrary-topology meshes in a general manner without the need to code their own data structures. MSTK is a flexible framework in that it allows (or will eventually allow) a wide variety of underlying representations for the mesh while maintaining a common interface. It will allow users to choose from different mesh representations either at initialization or during program execution so that the optimal data structures are used for the particular algorithm. The interaction of users and applications with MSTK is through a functional interface that acts as though the mesh always contains vertices, edges, faces, and regions and maintains connectivity between all these entities.
The goals of NECWGDN were to establish integrated geospatial databases that interfaced with existing open-source environmental data server technologies (e.g., HydroDesktop) and included ecological and human data to enable evaluation, prediction, and adaptation in coastal environments to climate- and human-induced threats to the coastal marine resources within the Gulf of Maine. We have completed the development and testing of a "test bed" architecture that is compatible with HydroDesktop and have identified key metadata structures that will enable seamless integration and delivery of environmental, ecological, and human data, as well as models to predict threats, to end-users. Uniquely, this database integrates point data as well as model data and so offers capabilities to end-users that are unique among databases. Future efforts will focus on the development of integrated environmental-human dimension models that can serve, in near real time, visualizations of threats to coastal resources and habitats.
The U.S. Department of Energy today announced the publication of a high-impact, informational resources toolkit through its Better Buildings Energy Data Accelerator developed in partnership with 18...
Hovland, P; Lee, S; McInnes, L; Norris, B; Smith, B
The increased use of object-oriented toolkits in large-scale scientific simulation presents new opportunities and challenges for the use of automatic (or algorithmic) differentiation (AD) techniques, especially in the context of optimization. Because object-oriented toolkits use well-defined interfaces and data structures, there is potential for simplifying the AD process. Furthermore, derivative computation can be improved by exploiting high-level information about numerical and computational abstractions. However, challenges to the successful use of AD with these toolkits also exist. Among the greatest challenges is balancing the desire to limit the scope of the AD process with the desire to minimize the work required of a user. They discuss their experiences in integrating AD with the PETSc, PVODE, and TAO toolkits and the plans for future research and development in this area.
Draxl, Caroline; Hodge, Bri-Mathias
A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.
The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., a GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and
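The instrumentation pattern described — timestamped key=value event records emitted at critical points, then parsed and correlated offline — can be illustrated with a small generic sketch of a ULM-style text log. This mimics only the flavor of the format and is not NetLogger's actual API; the event and field names (`transfer.start`, `host`, `size`) are invented for illustration:

```python
import io
import time

def log_event(stream, event, **fields):
    # One ULM-style record: timestamp, event name, then sorted key=value pairs.
    ts = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
    pairs = " ".join("%s=%s" % kv for kv in sorted(fields.items()))
    stream.write("DATE=%s NL.EVNT=%s %s\n" % (ts, event, pairs))

def parse_event(line):
    # Inverse: split a record back into a dict for correlation and sorting.
    return dict(pair.split("=", 1) for pair in line.split())

# Instrumenting a hypothetical transfer: one event at each critical point.
buf = io.StringIO()
log_event(buf, "transfer.start", host="nodeA", size=1048576)
log_event(buf, "transfer.end", host="nodeA", size=1048576)
records = [parse_event(line) for line in buf.getvalue().splitlines()]
```

Because every record carries a synchronized timestamp and a shared format, records from different components can be merged and sorted to reconstruct where time was spent along the path.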
Brost, Randolph C.; McLendon, William Clarence
Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information, which is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
Bhaduri, Budhendra L
The U.S. Department of Energy (DOE) has a rich history of significant contributions to geospatial science spanning the past four decades. In the early years, work focused on basic research, such as development of algorithms for processing geographic data and early use of LANDSAT imagery. The emphasis shifted in the mid-1970s to development of geographic information system (GIS) applications to support programs such as the National Uranium Resource Evaluation (NURE), and later to issue-oriented GIS applications supporting programs such as environmental restoration and management (mid-1980s through present). Throughout this period, the DOE national laboratories represented a strong chorus of voices advocating the importance of geospatial science and technology in the decades to come. The establishment of a Geospatial Science Program by the DOE Office of the Chief Information Officer in 2005 reflects the continued potential of geospatial science to enhance DOE's science, projects, and operations, as is well demonstrated by historical analysis.
Arguello, Bryan; Gearhart, Jared Lee; Jones, Katherine A.; Eddy, John P.
The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
Watson, Jean-Paul; Strip, David R.; McLendon, William C.; Parekh, Ojas D.; Diegert, Carl F.; Martin, Shawn Bryan; Rintoul, Mark Daniel
While collection capabilities have yielded an ever-increasing volume of aerial imagery, analytic techniques for identifying patterns in and extracting relevant information from this data have seriously lagged. The vast majority of imagery is never examined, due to a combination of the limited bandwidth of human analysts and limitations of existing analysis tools. In this report, we describe an alternative, novel approach to both encoding and analyzing aerial imagery, using the concept of a geospatial semantic graph. The advantages of our approach are twofold. First, intuitive templates can be easily specified in terms of the domain language in which an analyst converses. These templates can be used to automatically and efficiently search large graph databases, for specific patterns of interest. Second, unsupervised machine learning techniques can be applied to automatically identify patterns in the graph databases, exposing recurring motifs in imagery. We illustrate our approach using real-world data for Anne Arundel County, Maryland, and compare the performance of our approach to that of an expert human analyst.
Alkadi, Nasr E; Starke, Michael R; Ma, Ookie; Nimbalkar, Sachin U; Cox, Daryl
IGATE-E is an energy analysis tool for industrial energy evaluation. The tool applies statistical modeling to multiple publicly available datasets and provides information at the geospatial resolution of a zip code using bottom-up approaches. Within each zip code, the current version of the tool estimates the electrical energy consumption of manufacturing industries, by industry type, using DOE's Industrial Assessment Center database (IAC-DB) and DOE's Energy Information Administration Manufacturing Energy Consumption Survey database (EIA-MECS DB), in addition to other commercially available databases such as the Manufacturing News database (MNI, Inc.). Ongoing and future work includes adding modules for the prediction of fuel energy consumption streams, manufacturing process-step energy consumption, and major energy-intensive processes (EIPs) within each industry type, among other metrics of interest. The tool provides validation against DOE's EIA-MECS state-level energy estimates and permits several statistical examinations. IGATE-E is intended to be a decision support and planning tool for a wide spectrum of energy analysts, researchers, government organizations, private consultants, industry partners, and the like.
Wei, Yaxing; SanthanaVannan, Suresh K; Cook, Robert B
Geospatial data are important for understanding Earth-ecosystem dynamics, land cover changes, resource management, and human interactions with the Earth, to name a few. One of the biggest difficulties users face is to discover, access, and assemble distributed, large-volume, heterogeneous geospatial data to conduct geo-analysis. Traditional methods of geospatial data discovery, visualization, and delivery lack the capabilities of resource sharing and automation across systems or organizational boundaries. They require users to download the data "as-is" in their original file format, projection, and extent. Also, discovering data served by traditional methods requires prior knowledge of data location, and processing requires specialized expertise. These drawbacks of traditional methods create an additional burden for users, introduce excessive overhead into research, and also reduce the potential usage of the data. At the Oak Ridge National Laboratory (ORNL), researchers working on the NASA-sponsored Distributed Active Archive Center (DAAC) and Modeling and Synthesis Thematic Data Center (MAST-DC) projects have tapped into the benefits of Open Geospatial Consortium (OGC) standards to overcome the drawbacks of traditional methods of geospatial data discovery, visualization, and delivery. The OGC standards-based approach facilitates data sharing and interoperability across network, organizational, and geopolitical boundaries. Tools and services based on OGC standards deliver the data in many user-defined formats and allow users to visualize the data prior to download. This paper introduces an approach taken to visualize and deliver ORNL DAAC, MAST-DC, and other relevant geospatial data through OGC standards-based Web Services, including Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). It also introduces a WebGIS system built on top of OGC services that helps users discover, visualize, and access geospatial data.
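To make the services concrete: a WMS GetMap call is just an HTTP request whose query string carries a fixed set of parameters defined by the OGC WMS 1.3.0 specification. The endpoint and layer name below are placeholders, not real ORNL services:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def wms_getmap_url(endpoint, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png"):
    # Required GetMap parameters per the OGC WMS 1.3.0 specification.
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "CRS": crs, "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and layer; WMS 1.3.0 + EPSG:4326 orders BBOX lat-first.
url = wms_getmap_url("https://example.org/wms", "modis_ndvi",
                     (25.0, -125.0, 50.0, -65.0), 800, 400)
```

Note that WMS 1.3.0 with EPSG:4326 uses latitude-first axis order in BBOX, a common source of errors when migrating clients from WMS 1.1.1.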
Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William E.; Phillips, Cynthia A.; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica
The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination, and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice: The Water Security Toolkit (WST) v.1.2, Copyright (c) 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below). In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License); argparse (Python Software Foundation License); Boost (Boost Software License); Coopr (Revised BSD License); Coverage (BSD License); Distribute (Python Software Foundation License / Zope Public License); EPANET (Public Domain); EPANET-ERD (Revised BSD License); EPANET-MSX (GNU Lesser General Public License (LGPL) v.3); gcovr (Revised BSD License); GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files); LZMA SDK (Public Domain); nose (GNU Lesser General Public License (LGPL) v.2.1); ordereddict (MIT License); pip (MIT License); PLY (BSD License); PyEPANET (Revised BSD License); Pyro (MIT License); PyUtilib (Revised BSD License); Py
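As a concrete flavor of task (1) above — choosing sensor locations to detect contamination — a classic simplification is greedy set cover over a precomputed table of which contamination scenarios each candidate location would detect. WST itself solves such problems with the optimization machinery listed above; this stand-alone sketch, with invented junction names and scenarios, only illustrates the idea:

```python
def greedy_sensor_placement(detects, budget):
    # detects: {candidate location: set of contamination scenarios it detects}.
    # Greedily pick up to `budget` sensors, each maximizing newly covered
    # scenarios; ties go to the first candidate in dict order.
    chosen, covered = [], set()
    for _ in range(budget):
        best = max(detects, key=lambda loc: len(detects[loc] - covered))
        gain = detects[best] - covered
        if not gain:
            break
        chosen.append(best)
        covered |= gain
    return chosen, covered

# Invented junctions (J*) and a tank (T*) with the scenarios each would detect.
detects = {
    "J12": {"s1", "s2"},
    "J40": {"s2", "s3", "s4"},
    "T07": {"s4"},
    "J55": {"s5"},
}
sensors, covered = greedy_sensor_placement(detects, budget=2)
```

The greedy heuristic is a common baseline for this problem class because coverage gain is submodular; real tools weigh detections by estimated public-health impact rather than simple counts.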
Vatsavai, Raju; Symons, Christopher T; Chandola, Varun; Jun, Goo
One of the practical issues in clustering is the specification of the appropriate number of clusters, which is not obvious when analyzing geospatial datasets, partly because they are huge (both in size and spatial extent) and high dimensional. In this paper we present a computationally efficient model-based split and merge clustering algorithm that incrementally finds model parameters and the number of clusters. Additionally, we attempt to provide insights into this problem and other data mining challenges that are encountered when clustering geospatial data. The basic algorithm we present is similar to the G-means and X-means algorithms; however, our proposed approach avoids certain limitations of these well-known clustering algorithms that are pertinent when dealing with geospatial data. We compare the performance of our approach with the G-means and X-means algorithms. Experimental evaluation on simulated data and on multispectral and hyperspectral remotely sensed image data demonstrates the effectiveness of our algorithm.
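The split side of such an algorithm can be sketched in one dimension, with a BIC test deciding whether a 2-means split of a cluster pays for its extra parameters, plus a merge pass to undo over-splits — a simplification of what the G-means/X-means family does (real geospatial features are high-dimensional and far larger):

```python
import math
import random

def two_means(points, iters=25):
    # Plain 2-means in 1-D, seeded from the extremes.
    c0, c1 = min(points), max(points)
    for _ in range(iters):
        a = [x for x in points if abs(x - c0) <= abs(x - c1)]
        b = [x for x in points if abs(x - c0) > abs(x - c1)]
        if not a or not b:
            return [points]
        c0, c1 = sum(a) / len(a), sum(b) / len(b)
    return [a, b]

def bic(clusters):
    # Gaussian BIC: penalize mean+variance per cluster plus mixture weights.
    n = sum(len(c) for c in clusters)
    ll = 0.0
    for pts in clusters:
        mu = sum(pts) / len(pts)
        var = max(sum((x - mu) ** 2 for x in pts) / len(pts), 1e-6)
        ll += -0.5 * len(pts) * (math.log(2 * math.pi * var) + 1)
        ll += len(pts) * math.log(len(pts) / n)   # mixture weight term
    n_params = 2 * len(clusters) + (len(clusters) - 1)
    return n_params * math.log(n) - 2 * ll

def split_clusters(points):
    # Recursively split while a 2-way split lowers BIC.
    parts = two_means(points)
    if len(parts) == 1 or bic(parts) >= bic([points]):
        return [points]
    return split_clusters(parts[0]) + split_clusters(parts[1])

def merge_pass(clusters):
    # Merge neighbors whenever one Gaussian explains their union better.
    done = False
    while not done and len(clusters) > 1:
        done = True
        clusters.sort(key=lambda c: sum(c) / len(c))
        for i in range(len(clusters) - 1):
            pair = [clusters[i], clusters[i + 1]]
            if bic([pair[0] + pair[1]]) < bic(pair):
                clusters[i:i + 2] = [pair[0] + pair[1]]
                done = False
                break
    return clusters

random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(100)] +
        [random.gauss(10.0, 1.0) for _ in range(100)] +
        [random.gauss(20.0, 1.0) for _ in range(100)])
clusters = merge_pass(split_clusters(data))
```

Starting from a single cluster and letting the model-selection criterion drive both splits and merges is what frees the analyst from specifying the number of clusters up front.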
Tobin Jr, Kenneth William; Bhaduri, Budhendra L; Bright, Eddie A; Cheriydat, Anil; Karnowski, Thomas Paul; Palathingal, Paul J; Potok, Thomas E; Price, Jeffery R
We describe a method for indexing and retrieving high-resolution image regions in large geospatial data libraries. An automated feature extraction method is used that generates a unique and specific structural description of each segment of a tessellated input image file. These tessellated regions are then merged into similar groups, or sub-regions, and indexed to provide flexible and varied retrieval in a query-by-example environment. The methods of tessellation, feature extraction, sub-region clustering, indexing, and retrieval are described and demonstrated using a geospatial library representing a 153 km2 region of land in East Tennessee at 0.5 m per pixel resolution.
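The pipeline — tessellate the image, extract a feature vector per tile, index, then retrieve in a query-by-example fashion — can be miniaturized over a synthetic 2-D intensity grid. Mean and standard deviation stand in for the paper's richer structural descriptors, and the sub-region merging step is omitted:

```python
import math
import random

def tessellate(image, tile):
    # Yield ((row, col), pixels) for each tile-by-tile block of a 2-D grid.
    for r in range(0, len(image), tile):
        for c in range(0, len(image[0]), tile):
            pixels = [image[r + i][c + j]
                      for i in range(tile) for j in range(tile)]
            yield (r // tile, c // tile), pixels

def features(pixels):
    # Tiny stand-in feature vector: (mean intensity, standard deviation).
    mu = sum(pixels) / len(pixels)
    sd = math.sqrt(sum((p - mu) ** 2 for p in pixels) / len(pixels))
    return (mu, sd)

def build_index(image, tile):
    return {key: features(px) for key, px in tessellate(image, tile)}

def query(index, example_key, k=3):
    # Query-by-example: the k nearest tiles by Euclidean feature distance.
    qf = index[example_key]
    ranked = sorted(index, key=lambda key: math.dist(index[key], qf))
    return [key for key in ranked if key != example_key][:k]

# Synthetic 16x16 "image": smooth bright terrain on the left half,
# noisy dark terrain on the right half.
random.seed(3)
image = [[200 if c < 8 else random.randint(0, 80) for c in range(16)]
         for r in range(16)]
index = build_index(image, tile=4)
matches = query(index, (0, 0))   # example tile from the smooth-bright region
```

Querying with a smooth-bright tile returns other tiles from the same textured region, which is the essence of query-by-example retrieval at library scale.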
Long, Philip E.; Wurstner, Signe K.; Sullivan, E. C.; Schaef, Herbert T.; Bradley, Donald J.
Ice coverage of the Arctic Ocean is predicted to become thinner and to cover less area with time. The combination of more ice-free waters for exploration and navigation, along with increasing demand for hydrocarbons and improvements in technologies for the discovery and exploitation of new hydrocarbon resources, has focused attention on the hydrocarbon potential of the Arctic Basin and its margins. The purpose of this document is to 1) summarize results of a review of published hydrocarbon resources in the Arctic, including both conventional oil and gas and methane hydrates, and 2) develop a set of digital maps of the hydrocarbon potential of the Arctic Ocean. These maps can be combined with predictions of ice-free areas to enable estimates of the likely regions and sequence of hydrocarbon production development in the Arctic. In this report, conventional oil and gas resources are explicitly linked with potential gas hydrate resources. This has not been attempted previously and is particularly powerful as the likelihood of gas production from marine gas hydrates increases. Available or planned infrastructure, such as pipelines, combined with the geospatial distribution of hydrocarbons, is a very strong determinant of the temporal-spatial development of Arctic hydrocarbon resources. Significant unknowns decrease the certainty of predictions for development of hydrocarbon resources. These include: 1) areas in the Russian Arctic that are poorly mapped, 2) disputed ownership, primarily the Lomonosov Ridge, 3) lack of detailed information on gas hydrate distribution, and 4) technical risk associated with the ability to extract methane gas from gas hydrates. Logistics may control areas of exploration more than hydrocarbon potential. Accessibility, established ownership, and leasing of exploration blocks may trump quality of source rock, reservoir, and size of target. With this in mind, the main areas that are likely to be explored first are the Bering Strait and Chukchi
This study analyzes the market needs for building performance evaluation tools. It identifies the existing gaps and provides a roadmap for the U.S. Department of Energy (DOE) to develop a toolkit with which to optimize energy performance of a commercial building over its life cycle.
Model SN CRAC ToolKit Model Variable, Flat SN CRAC, 80% TPP (TK187SN-03FS3BPA-PropVariableFlatSNN24-Jun-03.xls, 3.1 MB) Data Input Files (required to run the above...
Knowlton, Robert G.; Melton, Brad J; Anderson, Robert J.
operation of the SESSA toolkit in order to give the user enough information to start using the toolkit. SESSA is currently a prototype system and this documentation covers the initial release of the toolkit. Funding for SESSA was provided by the Department of Defense (DoD), Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Rapid Fielding (RF) organization. The project was managed by the Defense Forensic Science Center (DFSC), formerly known as the U.S. Army Criminal Investigation Laboratory (USACIL). ACKNOWLEDGEMENTS The authors wish to acknowledge the funding support for the development of the Site Exploitation System for Situational Awareness (SESSA) toolkit from the Department of Defense (DoD), Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Rapid Fielding (RF) organization. The project was managed by the Defense Forensic Science Center (DFSC), formerly known as the U.S. Army Criminal Investigation Laboratory (USACIL). Special thanks to Mr. Garold Warner, of DFSC, who served as the Project Manager. Individuals that worked on the design, functional attributes, algorithm development, system architecture, and software programming include: Robert Knowlton, Brad Melton, Robert Anderson, and Wendy Amai.
Pasha, M. Fayzul K.; Yang, Majntxov; Yeasmin, Dilruba; Saetern, Sen; Kao, Shih-Chieh; Smith, Brennan T.
Benefiting from the rapid development of multiple geospatial data sets on topography, hydrology, and existing energy-water infrastructures, reconnaissance-level hydropower resource assessment can now be conducted using geospatial models in all regions of the US. Furthermore, the updated techniques can be used to estimate the total undeveloped hydropower potential across all regions, and may eventually help identify further hydropower opportunities that were previously overlooked. To enhance the characterization of higher energy density stream-reaches, this paper explored the sensitivity of geospatial resolution on the identification of hydropower stream-reaches using the geospatial merit matrix based hydropower resource assessment (GMM-HRA) model. GMM-HRA model simulation was conducted with eight different spatial resolutions on six U.S. Geological Survey (USGS) 8-digit hydrologic units (HUC8) located in three different terrains: Flat, Mild, and Steep. The results showed that more hydropower potential from higher energy density stream-reaches can be identified with increasing spatial resolution. Both Flat and Mild terrains exhibited lower impacts compared to the Steep terrain. Consequently, greater attention should be applied when selecting the discretization resolution for hydropower resource assessments in future studies.
Vatsavai, Raju; Bhaduri, Budhendra L; Cheriyadat, Anil M; Arrowood, Lloyd; Bright, Eddie A; Gleason, Shaun Scott; Diegert, Carl; Katsaggelos, Aggelos K; Pappas, Thrasos N; Porter, Reid; Bollinger, Jim; Chen, Barry; Hohimer, Ryan
With the increasing understanding and availability of nuclear technologies, and the increasing pursuit of nuclear technologies by several new countries, it is becoming increasingly important to monitor nuclear proliferation activities. There is a great need to develop technologies that automatically or semi-automatically detect nuclear proliferation activities using remote sensing. Images acquired from earth observation satellites are an important source of information for detecting proliferation activities. High-resolution remote sensing images are highly useful in verifying the correctness, as well as completeness, of any nuclear program. DOE national laboratories are interested in detecting nuclear proliferation by developing advanced geospatial image mining algorithms. In this paper we describe the current understanding of geospatial image mining techniques, enumerate key gaps, and identify future research needs in the context of nuclear proliferation.
Chandola, Varun; Vatsavai, Raju; Bhaduri, Budhendra L
We demonstrate an interactive visualization and analysis system for integrating climate data with other geospatial data sets, such as environmental and demographic data. The iGlobe system is a desktop-based visualization and analysis environment which allows seamless integration of multiple geospatial data sets from varied sources and provides an interface to interactively analyze the different data sets and apply sophisticated data analysis and mining algorithms in a near real time fashion. The framework is highly desirable in domains such as earth and climate sciences where great emphasis is placed on simultaneous analysis of different data sets such as remote sensing images, climate model simulation outputs, and other environmental and demographic databases, to understand weather and climate systems and the impact of climate change on nature and people.
UC Davis Models Geospatial Station Network Design Tool & Hydrogen Infrastructure Rollout Economic Analysis Model (University of California-Davis) Objectives Analyze regional strategies for early rollout of hydrogen infrastructure in support of fuel cell vehicle commercialization. Estimate how many hydrogen fueling stations would be needed and how much it will cost to develop a cost-competitive hydrogen supply. Compare the cost of hydrogen from different types and sizes of hydrogen stations
Tobin Jr, Kenneth William; Bhaduri, Budhendra L; Bright, Eddie A; Cheriydat, Anil; Karnowski, Thomas Paul; Palathingal, Paul J; Potok, Thomas E; Price, Jeffery R
We describe a method for indexing and retrieving high-resolution image regions in large geospatial data libraries. An automated feature extraction method is used that generates a unique and specific structural description of each segment of a tessellated input image file. These tessellated regions are then merged into similar groups and indexed to provide flexible and varied retrieval in a query-by-example environment.
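As a hedged illustration of the query-by-example retrieval idea described above (not the authors' actual indexing scheme), each tessellated region can be reduced to a descriptor vector and a query answered by ranking indexed tiles by distance to the example. The tile names and descriptor values below are invented for demonstration:

```python
import math

# Hypothetical structural descriptors for indexed image tiles.
index = {
    "tile_07": [0.90, 0.10, 0.30],
    "tile_12": [0.20, 0.80, 0.50],
    "tile_31": [0.85, 0.15, 0.35],
}

def query_by_example(example, k=2):
    # Rank indexed tiles by Euclidean distance to the example descriptor
    # and return the k closest matches.
    ranked = sorted(index, key=lambda tile: math.dist(index[tile], example))
    return ranked[:k]

print(query_by_example([0.88, 0.12, 0.32]))  # nearest tiles to the example region
```

A real system would replace the dictionary with a spatial index (e.g., a tree structure) so retrieval scales to large libraries, but the ranking step is the same.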
Vatsavai, Raju; Bhaduri, Budhendra L
Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on spectral characteristics of thematic classes whose statistical distributions (class conditional probability densities) are often overlapping. The spectral response distributions of thematic classes depend on many factors including elevation, soil types, and ecological zones. A second problem with statistical classifiers is the requirement for a large number of accurate training samples (on the order of 10 to 30 per feature dimension), which are often costly and time consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing a large number of easily available unlabeled training samples. Unfortunately, there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows a 25 to 35% improvement in overall classification accuracy over conventional classification schemes.
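The baseline ML classifier the abstract refers to can be sketched in a few lines: assign each pixel to the class whose class-conditional density gives it the highest likelihood. This is a minimal sketch with a diagonal-covariance Gaussian assumption; the class names, spectral statistics, and sample values are invented, not taken from the paper:

```python
import math

def gaussian_log_likelihood(x, mean, var):
    # Log of a univariate Gaussian density; features are treated as independent,
    # which makes the class-conditional density a product over dimensions.
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def ml_classify(sample, class_stats):
    # Maximum likelihood decision rule: pick the class with the highest
    # summed log-likelihood over all feature dimensions.
    best_class, best_score = None, float("-inf")
    for label, (means, variances) in class_stats.items():
        score = sum(gaussian_log_likelihood(x, m, v)
                    for x, m, v in zip(sample, means, variances))
        if score > best_score:
            best_class, best_score = label, score
    return best_class

# Hypothetical spectral statistics (per-band mean, variance) for two
# overlapping thematic classes.
stats = {
    "forest": ([0.3, 0.5], [0.01, 0.02]),
    "water":  ([0.1, 0.2], [0.01, 0.01]),
}
print(ml_classify([0.28, 0.45], stats))  # sample lies near the 'forest' statistics
```

The overlap problem the abstract describes shows up exactly here: when two classes have similar means and wide variances, the log-likelihoods are close and the decision becomes unreliable, which is what ancillary data and unlabeled samples are meant to mitigate.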
Zhang, Zhiyuan; Tong, Xiaonan; McDonnell, Kevin T.; Zelenyuk, Alla; Imre, D.; Mueller, Klaus
Climate research produces a wealth of multivariate data. These data often have a geospatial reference and so it is of interest to show them within their geospatial context. One can consider this configuration as a multi-field visualization problem, where the geospace provides the expanse of the field. However, there is a limit on the amount of multivariate information that can be fit within a certain spatial location, and the use of linked multivariate information displays has previously been devised to bridge this gap. In this paper we focus on the interactions in the geographical display, present an implementation that uses Google Earth, and demonstrate it within a tightly linked parallel coordinates display. Several other visual representations, such as pie and bar charts, are integrated into the Google Earth display and can be interactively manipulated. Further, we also demonstrate new brushing and visualization techniques for parallel coordinates, such as fixed-window brushing and correlation-enhanced display. We conceived our system with a team of climate researchers, who already made a few important discoveries using it. This demonstrates our system's great potential to enable scientific discoveries, possibly also in other domains where data have a geospatial reference.
Pabian, Frank V
This slide presentation focuses on the growing role and importance of imagery analysis for IAEA safeguards applications and how commercial satellite imagery, together with the newly available geospatial tools, can be used to promote 'all-source synergy.' As additional sources of openly available information, satellite imagery in conjunction with the geospatial tools can be used to significantly augment and enhance existing information gathering techniques, procedures, and analyses in the remote detection and assessment of nonproliferation-relevant activities, facilities, and programs. Foremost of the geospatial tools are the 'Digital Virtual Globes' (i.e., GoogleEarth, Virtual Earth, etc.) that are far better than previously used simple 2-D plan-view line drawings for visualization of known and suspected facilities of interest which can be critical to: (1) Site familiarization and true geospatial context awareness; (2) Pre-inspection planning; (3) Onsite orientation and navigation; (4) Post-inspection reporting; (5) Site monitoring over time for changes; (6) Verification of states' site declarations and for input to State Evaluation reports; and (7) A common basis for discussions among all interested parties (Member States). Additionally, as an 'open-source', such virtual globes can also provide a new, essentially free, means to conduct broad area search for undeclared nuclear sites and activities - either alleged through open source leads; identified on internet BLOGS and WIKI Layers, with input from a 'free' cadre of global browsers and/or by knowledgeable local citizens (a.k.a.: 'crowdsourcing'), that can include ground photos and maps; or by other initiatives based on existing information and in-house country knowledge. They also provide a means to acquire ground photography taken by locals, hobbyists, and tourists of the surrounding locales that can be useful in identifying and discriminating between relevant and non-relevant facilities and their associated
Co-Optimization of Fuels and Engines (Co-Optima) - Simulation Toolkit Team. This presentation does not contain any proprietary, confidential, or otherwise restricted information. [Author list not recoverable from the source.]
Walker, H; Chou, R; Chubb, K; Schek, J
The objective of this project is to evaluate existing and emerging Open Geospatial Consortium (OGC) standards for use in LLNL programs that rely heavily on geographic data. OGC standards are intended to facilitate interoperability between geospatial processing systems to avoid duplication of effort, lower development costs, and encourage competition based on improved capability and performance rather than vendor lock-in. Some of these standards appear to be gaining traction in the geospatial data community, the Federal government, DOE and DHS. A serious evaluation of this technology is appropriate at this time due to increasing interest and mandated compliance in the Federal government in some situations. A subset of OGC standards is identified and reviewed with a focus on applications to LLNL programs. Each standard or recommendation reviewed was evaluated in general terms. In addition, for specific programs such as Gen&SIS and NARAC, a specific evaluation was made of several of the standards and how they could be used most effectively. It is also important to evaluate the acceptance of these standards in the commercial arena. The implementation of OGC standards by the largest GIS vendor (ESRI) was reviewed. At present, OGC standards are primarily useful in specific situations. More generally, many of the standards are immature and their impact on the government and commercial sectors is unclear. Consequently, OGC and related developments need to be observed. As specific standards or groups of standards mature and establish their relevance, these can also be incorporated in LLNL programs as requirements dictate, especially if open implementations and commercial products are available.
The iGlobe system is a desktop-based visualization and analysis environment which allows seamless integration of multiple geospatial data sets from varied sources and provides an interface to interactively analyze the different data sets and apply sophisticated data analysis and mining algorithms in a near real time fashion. The framework is highly desirable in domains such as earth and climate sciences where great emphasis is placed on simultaneous analysis of different data sets such as remote sensing images, climate model simulation outputs, and other environmental and demographic databases, to understand weather and climate systems and the impact of climate change on nature and people.
Mandelli, Diego; Prescott, Steven R; Smith, Curtis L; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Kinoshita, Robert A
In the Risk Informed Safety Margin Characterization (RISMC) approach we want to understand not just the frequency of an event like core damage, but how close we are (or are not) to key safety-related events and how we might increase our safety margins. The RISMC Pathway uses the probabilistic margin approach to quantify impacts to reliability and safety by coupling both probabilistic (via stochastic simulation) and mechanistic (via physics models) approaches. This coupling takes place through the interchange of physical parameters and operational or accident scenarios. In this paper we apply the RISMC approach to evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes that is responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., system activation) and to perform statistical analyses (e.g., run multiple RELAP-7 simulations where sequencing/timing of events have been changed according to a set of stochastic distributions). By using the RISMC toolkit, we can evaluate how a power uprate affects the system recovery measures needed to avoid core damage after the PWR has lost all available AC power due to tsunami-induced flooding. The simulation of the actual flooding is performed by using a smooth particle hydrodynamics code: NEUTRINO.
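The probabilistic half of this coupling can be hedged-sketched without any of the physics: sample the timing of competing events over many runs and count the fraction where the bad ordering occurs. The distributions and parameters below are entirely hypothetical stand-ins for what RAVEN would obtain from RELAP-7 simulations, chosen only to illustrate the sampling loop:

```python
import random

def estimate_core_damage_probability(n_runs=10000, seed=42):
    # Toy stand-in for a station-blackout scenario: core damage occurs if
    # AC power is not recovered before the DC batteries are exhausted.
    # Both distributions and all parameters are invented for illustration.
    rng = random.Random(seed)
    damage = 0
    for _ in range(n_runs):
        battery_life_h = rng.gauss(8.0, 1.0)        # hypothetical battery depletion time
        recovery_time_h = rng.expovariate(1 / 6.0)  # hypothetical AC recovery time
        if recovery_time_h > battery_life_h:
            damage += 1
    return damage / n_runs

p = estimate_core_damage_probability()
print(round(p, 3))
```

In the actual toolkit each "run" is a full thermal-hydraulic simulation rather than two random draws, but the margin estimate is built from the same kind of event-timing comparison across sampled scenarios.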
Groth, Katrina; Tchouvelev, Andrei V.
There has been increasing interest in using Quantitative Risk Assessment (QRA) to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g., fueling fuel cell vehicles) or stationary (e.g., back-up power) applications is a relatively new area for the application of QRA compared with traditional industrial production and use, and as a result there are few tools designed to enable QRA for this emerging sector. There are few existing QRA tools containing models that have been developed and validated for use in small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.
Mandelli, Diego; Prescott, Steven; Smith, Curtis; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua; Kinoshita, Robert
In this paper we evaluate the impact of a power uprate on a pressurized water reactor (PWR) for a tsunami-induced flooding test case. This analysis is performed using the RISMC toolkit: the RELAP-7 and RAVEN codes. RELAP-7 is the new generation of system analysis codes that is responsible for simulating the thermal-hydraulic dynamics of PWR and boiling water reactor systems. RAVEN has two capabilities: to act as a controller of the RELAP-7 simulation (e.g., component/system activation) and to perform statistical analyses. In our case, the simulation of the flooding is performed by using an advanced smooth particle hydrodynamics code called NEUTRINO. The obtained results allow the user to investigate and quantify the impact of timing and sequencing of events on system safety. The impact of power uprate is determined in terms of both core damage probability and safety margins.
Merzari, E.; Shemon, E. R.; Yu, Y. Q.; Thomas, J. W.; Obabko, A.; Jain, Rajeev; Mahadevan, Vijay; Tautges, Timothy; Solberg, Jerome; Ferencz, Robert Mark; Whitesides, R.
This report describes the use of SHARP to perform a first-of-a-kind analysis of the core radial expansion phenomenon in an SFR. This effort required significant advances in the SHARP Reactor Simulation Toolkit framework used to drive the coupled simulations, manipulate the mesh in response to the deformation of the geometry, and generate the necessary modified mesh files. Furthermore, the model geometry is fairly complex, and consistent mesh generation for the three physics modules required significant effort. Fully-integrated simulations of a 7-assembly mini-core test problem have been performed, and the results are presented here. Physics models of a full-core model of the Advanced Burner Test Reactor have also been developed for each of the three physics modules. Standalone results of each of the three physics modules for the ABTR are presented here, which provides a demonstration of the feasibility of the fully-integrated simulation.
Zeldin, Oliver B.; Brewster, Aaron S.; Hattne, Johan; Uervirojnangkoorn, Monarin; Lyubimov, Artem Y.; Zhou, Qiangjun; Zhao, Minglei; Weis, William I.; Sauter, Nicholas K.; Brunger, Axel T.
Ultrafast diffraction at X-ray free-electron lasers (XFELs) has the potential to yield new insights into important biological systems that produce radiation-sensitive crystals. An unavoidable feature of the 'diffraction before destruction' nature of these experiments is that images are obtained from many distinct crystals and/or different regions of the same crystal. Combined with other sources of XFEL shot-to-shot variation, this introduces significant heterogeneity into the diffraction data, complicating processing and interpretation. To enable researchers to get the most from their collected data, a toolkit is presented that provides insights into the quality of, and the variation present in, serial crystallography data sets. These tools operate on the unmerged, partial intensity integration results from many individual crystals, and can be used on two levels: firstly to guide the experimental strategy during data collection, and secondly to help users make informed choices during data processing.
Draxl, C.; Hodge, B. M.; Orwig, K.; Jones, W.; Searight, K.; Getman, D.; Harrold, S.; McCaa, J.; Cline, J.; Clark, C.
Regional wind integration studies in the United States require detailed wind power output data at many locations to perform simulations of how the power system will operate under high-penetration scenarios. The wind data sets that serve as inputs into the study must realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of the simulated wind plants, as well as be time synchronized with available load profiles. The Wind Integration National Dataset (WIND) Toolkit described in this paper fulfills these requirements. A wind resource dataset, wind power production time series, and simulated forecasts from a numerical weather prediction model run on a nationwide 2-km grid at 5-min resolution will be made publicly available for more than 110,000 onshore and offshore wind power production sites.
Zhou, Wei; Minnick, Matthew; Geza, Mengistu; Murray, Kyle; Mattson, Earl
The Colorado School of Mines (CSM) was awarded a grant by the National Energy Technology Laboratory (NETL), Department of Energy (DOE) to conduct a research project entitled GIS- and Web-based Water Resource Geospatial Infrastructure for Oil Shale Development in October of 2008. The ultimate goal of this research project is to develop a water resource geospatial infrastructure that serves as "baseline data" for creating solutions on water resource management and for supporting decision making on oil shale resource development. The project came to an end on September 30, 2012. This final project report presents the key findings from the project activity, major accomplishments, and expected impacts of the research. In the meantime, the gamma version (also known as Version 4.0) of the geodatabase as well as other various deliverables stored on digital storage media will be sent to the program manager at NETL, DOE via express mail. The key findings from the project activity include the quantitative spatial and temporal distribution of the water resource throughout the Piceance Basin, water consumption with respect to oil shale production, and data gaps identified. Major accomplishments of this project include the creation of a relational geodatabase, automated data processing scripts (Matlab) for database linkage with surface water and geological models, an ArcGIS model for hydrogeologic data processing for groundwater model input, a 3D geological model, surface water/groundwater models, an energy resource development systems model, as well as a web-based geospatial infrastructure for data exploration, visualization and dissemination. This research will have broad impacts on the development of oil shale resources in the US. The geodatabase provides "baseline" data for further study of oil shale development and identification of further data collection needs. The 3D geological model provides better understanding through data interpolation and
Stracuzzi, David John; Brost, Randolph C.; Phillips, Cynthia A.; Robinson, David G.; Wilson, Alyson G.; Woodbridge, Diane M. -K.
Geospatial semantic graphs provide a robust foundation for representing and analyzing remote sensor data. In particular, they support a variety of pattern search operations that capture the spatial and temporal relationships among the objects and events in the data. However, in the presence of large data corpora, even a carefully constructed search query may return a large number of unintended matches. This work considers the problem of calculating a quality score for each match to the query, given that the underlying data are uncertain. As a result, we present a preliminary evaluation of three methods for determining both match quality scores and associated uncertainty bounds, illustrated in the context of an example based on overhead imagery data.
Schuster, Andre; Bruno, Kenneth S.; Collett, James R.; Baker, Scott E.; Seiboth, Bernhard; Kubicek, Christian P.; Schmoll, Monika
The ascomycete fungus Trichoderma reesei (anamorph of Hypocrea jecorina) represents a biotechnological workhorse and is currently one of the most proficient cellulase producers. While strain improvement was traditionally accomplished by random mutagenesis, a detailed understanding of cellulase regulation can only be gained using recombinant technologies. RESULTS: Aiming at high efficiency and high throughput methods, we present here a construction kit for gene knockout in T. reesei. We provide a primer database for gene deletion using the pyr4, amdS and hph selection markers. For high throughput generation of gene knockouts, we constructed vectors using yeast-mediated recombination and then transformed a T. reesei strain deficient in non-homologous end joining (NHEJ) by spore electroporation. This NHEJ defect was subsequently removed by crossing of mutants with a sexually competent strain derived from the parental strain, QM9414. CONCLUSIONS: Using this strategy and the materials provided, high throughput gene deletion in T. reesei becomes feasible. Moreover, with the application of sexual development, the NHEJ defect can be removed efficiently and without the need for additional selection markers. The same advantages apply for the construction of multiple mutants by crossing of strains with different gene deletions, which is now possible with considerably less hands-on time and minimal screening effort compared to a transformation approach. Consequently this toolkit can considerably boost research towards efficient exploitation of the resources of T. reesei for cellulase expression and hence second generation biofuel production.
Shih, C.Y.; Scown, C.D.; Soibelman, L.; Matthews, H.S.; Garrett, J.H.; Dodrill, K.; McSurdy, S.
Critical infrastructures maintain our society's stability, security, and quality of life. These systems are also interdependent, which means that the disruption of one infrastructure system can significantly impact the operation of other systems. Because of the heavy reliance on electricity production, it is important to assess possible vulnerabilities. Determining the source of these vulnerabilities can provide insight for risk management and emergency response efforts. This research uses data warehousing and visualization techniques to explore the interdependencies between coal mines, rail transportation, and electric power plants. By merging geospatial and nonspatial data, we are able to model the potential impacts of a disruption to one or more mines, rail lines, or power plants, and visually display the results using a geographical information system. A scenario involving a severe earthquake in the New Madrid Seismic Zone is used to demonstrate the capabilities of the model when given input in the form of a potentially impacted area. This type of interactive analysis can help decision makers to understand the vulnerabilities of the coal distribution network and the potential impact it can have on electricity production.
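The interdependency query described above can be sketched with a tiny dependency model: hypothetical mines feed rail lines that feed power plants, and removing a disrupted asset propagates to the generation it supports. All asset names, links, and capacities below are invented for illustration; the real model works over geospatial data in a GIS, not a hand-written dictionary:

```python
# Toy coal-supply dependency model; every asset and capacity is hypothetical.
supply_chain = {
    "plant_A": {"rail": "rail_1", "mines": ["mine_1", "mine_2"], "capacity_mw": 600},
    "plant_B": {"rail": "rail_2", "mines": ["mine_2"], "capacity_mw": 400},
}

def generation_at_risk(disrupted, chain=supply_chain):
    # A plant's generation is at risk if the plant itself, its rail link,
    # or every one of its supplying mines is in the disrupted set.
    at_risk = 0
    for plant, info in chain.items():
        rail_down = info["rail"] in disrupted
        mines_down = all(m in disrupted for m in info["mines"])
        if plant in disrupted or rail_down or mines_down:
            at_risk += info["capacity_mw"]
    return at_risk

print(generation_at_risk({"rail_1"}))  # plant_A loses its only rail link
print(generation_at_risk({"mine_2"}))  # only plant_B loses all of its mines
```

An earthquake scenario like the New Madrid example would translate a shaded impact polygon into the `disrupted` set by spatial intersection, then report the at-risk generation the same way.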
Gleason, Shaun Scott; Ferrell, Regina Kay; Cheriyadat, Anil M; Vatsavai, Raju; Sari-Sarraf, Hamed; Dema, Mesfin A
As a result of increasing geospatial image libraries, many algorithms are being developed to automatically extract and classify regions of interest from these images. However, limited work has been done to compare, validate and verify these algorithms due to the lack of datasets with high accuracy ground truth annotations. In this paper, we present an approach to generate a large number of synthetic images accompanied by perfect ground truth annotation via learning scene statistics from few training images through Maximum Entropy (ME) modeling. The ME model [1,2] embeds a Stochastic Context Free Grammar (SCFG) to model object attribute variations with Markov Random Fields (MRF) with the final goal of modeling contextual relations between objects. Using this model, 3D scenes are generated by configuring a 3D object model to obey the learned scene statistics. Finally, these plausible 3D scenes are captured by ray tracing software to produce synthetic images with the corresponding ground truth annotations that are useful for evaluating the performance of a variety of image analysis algorithms.
Pasha, M. Fayzul K.; Yeasmin, Dilruba; Saetern, Sen; Yang, Majntxov; Kao, Shih-Chieh; Smith, Brennan T.
Hydraulic head and mean annual streamflow, two main input parameters in hydropower resource assessment, are not measured at every point along the stream. Translation and interpolation are used to derive these parameters, resulting in uncertainties. This study estimates the uncertainties and their effects on model output parameters: the total potential power and the number of potential locations (stream-reaches). These parameters are quantified through Monte Carlo Simulation (MCS) linked with a geospatial merit matrix based hydropower resource assessment (GMM-HRA) model. The methodology is applied to flat, mild, and steep terrains. Results show that the uncertainty associated with the hydraulic head is within 20% for mild and steep terrains, and the uncertainty associated with streamflow is around 16% for all three terrains. Output uncertainty increases as input uncertainty increases. However, output uncertainty is around 10% to 20% of the input uncertainty, demonstrating the robustness of the GMM-HRA model. Output parameters are more sensitive to hydraulic head in steep terrain than in flat and mild terrains. Furthermore, output parameters are more sensitive to mean annual streamflow in flat terrain.
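As an illustration of the uncertainty propagation the abstract describes (not the GMM-HRA model itself), hydraulic power P = ρgQH can be pushed through a Monte Carlo loop with perturbed head and flow. The ±20% and ±16% relative uncertainties echo the abstract's figures; the uniform perturbation shape, the site values, and everything else are assumptions for the sketch:

```python
import random
import statistics

RHO_G = 9810.0  # water density (kg/m^3) times gravitational acceleration (m/s^2)

def mc_power(head_m, flow_m3s, head_unc=0.20, flow_unc=0.16, n=5000, seed=1):
    # Propagate relative input uncertainties through P = rho * g * Q * H (watts)
    # by sampling perturbed head and flow and collecting the power estimates.
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        h = head_m * (1 + rng.uniform(-head_unc, head_unc))
        q = flow_m3s * (1 + rng.uniform(-flow_unc, flow_unc))
        samples.append(RHO_G * q * h)
    mean = statistics.fmean(samples)
    rel_spread = statistics.pstdev(samples) / mean
    return mean, rel_spread

mean_w, spread = mc_power(head_m=10.0, flow_m3s=2.0)
print(round(mean_w / 1000), round(spread, 2))  # ~196 kW nominal, modest relative spread
```

Counting potential stream-reaches under uncertainty works the same way: apply a power threshold inside the loop and tally how often each reach clears it.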
McCaskey, Alex; Billings, Jay Jay; de Almeida, Valmor F
This report details the progress made in the development of the Reprocessing Plant Toolkit (RPTk) for the DOE Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. RPTk is an ongoing development effort intended to provide users with an extensible, integrated, and scalable software framework for the modeling and simulation of spent nuclear fuel reprocessing plants by enabling the insertion and coupling of user-developed physicochemical modules of variable fidelity. The NEAMS Safeguards and Separations IPSC (SafeSeps) and the Enabling Computational Technologies (ECT) supporting program element have partnered to release an initial version of RPTk with a focus on software usability and utility. RPTk implements a data flow architecture that is the source of the system's extensibility and scalability. Data flows through physicochemical modules sequentially, with each module importing data, evolving it, and exporting the updated data to the next downstream module. This is accomplished through various architectural abstractions designed to give RPTk true plug-and-play capabilities. A simple application of this architecture, as well as RPTk data flow and evolution, is demonstrated in Section 6 with an application consisting of two coupled physicochemical modules. The remaining sections describe this ongoing work in full, from system vision and design inception to full implementation. Section 3 describes the relevant software development processes used by the RPTk development team. These processes allow the team to manage system complexity and ensure stakeholder satisfaction. This section also details the work done on the RPTk "black box" and "white box" models, with a special focus on the separation of concerns between the RPTk user interface and application runtime. Sections 4 and 5 discuss the application runtime component in more detail, and describe the dependencies, behavior, and rigorous testing of its constituent components.
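The data-flow idea described above, with modules that import data, evolve it, and export the result to the next downstream module, can be illustrated with a toy pipeline. The module names and numbers below are invented for illustration; the real RPTk abstractions differ.

```python
from typing import Callable, Dict, List

# A "module" is any callable that takes the shared data dict and returns
# an updated copy -- a stand-in for a physicochemical module.
Module = Callable[[Dict[str, float]], Dict[str, float]]

def run_pipeline(data: Dict[str, float], modules: List[Module]) -> Dict[str, float]:
    """Push data through each module in sequence: import, evolve, export."""
    for module in modules:
        data = module(dict(data))  # copy so modules stay side-effect free
    return data

# Two toy coupled modules (hypothetical names and yields):
def dissolver(d):
    d["dissolved_kg"] = d.get("fuel_kg", 0.0) * 0.95
    return d

def extractor(d):
    d["extracted_kg"] = d.get("dissolved_kg", 0.0) * 0.99
    return d
```

Because each module only sees the data dictionary, modules of different fidelity can be swapped in without touching the pipeline driver, which is the plug-and-play property the report emphasizes.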
Spiga, J.; Siegbahn, E. A.; Braeuer-Krisch, E.; Randaccio, P.; Bravin, A.
Theoretical dose distributions for microbeam radiation therapy (MRT) are computed in this paper using the GEANT4 Monte Carlo (MC) simulation toolkit. MRT is an innovative experimental radiotherapy technique carried out using an array of parallel microbeams of synchrotron-wiggler-generated x rays. Although the biological mechanisms underlying the effects of microbeams are still largely unknown, the effectiveness of MRT can be traced back to the natural ability of normal tissues to rapidly repair small damages to the vasculature, and on the lack of a similar healing process in tumoral tissues. Contrary to conventional therapy, in which each beam is at least several millimeters wide, the narrowness of the microbeams allows a rapid regeneration of the blood vessels along the beams' trajectories. For this reason the calculation of the 'valley' dose is of crucial importance and the correct use of MC codes for such purposes must be understood. GEANT4 offers, in addition to the standard libraries, a specialized package specifically designed to deal with electromagnetic interactions of particles with matter for energies down to 250 eV. This package implements two different approaches for electron and photon transport, one based on evaluated data libraries, the other adopting analytical models. These features are exploited to cross-check theoretical computations for MRT. The lateral and depth dose profiles are studied for the irradiation of a 20 cm diameter, 20 cm long cylindrical phantom, with cylindrical sources of different size and energy. Microbeam arrays are simulated with the aid of superposition algorithms, and the ratios of peak-to-valley doses are computed for typical cases used in preclinical assays. Dose profiles obtained using the GEANT4 evaluated data libraries and analytical models are compared with simulation results previously obtained using the PENELOPE code. The results show that dose profiles computed with GEANT4's analytical model are almost
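The peak-to-valley dose ratio computed via superposition can be illustrated with a toy lateral-profile model: each microbeam contributes a narrow core plus a broad scatter tail, and the array profile is their sum. The Gaussian shapes and all parameter values below are illustrative assumptions, not Geant4 or PENELOPE output.

```python
import math

def microbeam_array_profile(x_mm, centers_mm, sigma_mm=0.01,
                            tail_frac=0.05, tail_sigma_mm=1.0):
    """Lateral dose at x from a superposition of narrow Gaussian cores
    plus broad scatter tails (illustrative model only)."""
    dose = 0.0
    for c in centers_mm:
        dx = x_mm - c
        dose += math.exp(-0.5 * (dx / sigma_mm) ** 2)              # core
        dose += tail_frac * math.exp(-0.5 * (dx / tail_sigma_mm) ** 2)  # tail
    return dose

def pvdr(centers_mm, **kw):
    """Peak-to-valley dose ratio: central peak over an inter-beam valley."""
    peak = microbeam_array_profile(centers_mm[len(centers_mm) // 2],
                                   centers_mm, **kw)
    valley_x = 0.5 * (centers_mm[0] + centers_mm[1])
    valley = microbeam_array_profile(valley_x, centers_mm, **kw)
    return peak / valley
```

Even this crude model reproduces the qualitative point of the abstract: the valley dose is dominated by the overlapping scatter tails of all beams, so the PVDR depends on beam spacing and tail width, not just the core profile.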
Mandelli, Diego; Smith, Curtis Lee; Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua Joseph
The RISMC approach is developing an advanced set of methodologies and algorithms for performing Probabilistic Risk Analyses (PRAs). In contrast to classical PRA methods, which are based on Event-Tree and Fault-Tree methods, the RISMC approach largely employs system simulator codes coupled with stochastic analysis tools. The basic idea is to randomly perturb (by employing sampling algorithms) the timing and sequencing of events and the internal parameters of the system codes (i.e., uncertain parameters) in order to estimate stochastic parameters such as core damage probability. Applied to complex systems such as nuclear power plants, this approach requires performing a series of computationally expensive simulation runs over a large set of uncertain parameters. These types of analysis are affected by two issues. Firstly, the space of possible solutions (a.k.a. the issue space, or the response surface) can be sampled only very sparsely, which precludes the ability to fully analyze the impact of uncertainties on the system dynamics. Secondly, large amounts of data are generated, and tools to generate knowledge from such data sets are not yet available. This report focuses on the first issue and, in particular, employs novel methods that optimize the information generated by the sampling process by sampling unexplored and risk-significant regions of the issue space: adaptive (smart) sampling algorithms. They infer the system response from surrogate models constructed from existing samples and predict the most relevant location of the next sample. It is therefore possible to understand features of the issue space with a small number of carefully selected samples. In this report, we present how it is possible to perform adaptive sampling using the RISMC toolkit and highlight the advantages compared to more classical sampling approaches such as Monte Carlo. We employ RAVEN to perform such statistical analyses using both analytical cases and another RISMC code: RELAP-7.
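The adaptive-sampling idea, using a surrogate built from existing samples to choose the next, most informative sample, can be sketched for a one-dimensional limit-surface search. This is a hedged illustration of the concept, not RAVEN's algorithm: the "surrogate" here is just piecewise-linear interpolation, and refinement is bisection of the interval where it crosses a risk threshold.

```python
import bisect

def adaptive_sample(f, lo, hi, threshold, n_init=3, n_adapt=10):
    """Toy limit-surface search: keep sampling where a piecewise-linear
    surrogate of f crosses `threshold`, instead of sampling uniformly."""
    xs = [lo + i * (hi - lo) / (n_init - 1) for i in range(n_init)]
    ys = [f(x) for x in xs]
    for _ in range(n_adapt):
        # Find an interval where the surrogate crosses the threshold.
        for i in range(len(xs) - 1):
            if (ys[i] - threshold) * (ys[i + 1] - threshold) < 0:
                x_new = 0.5 * (xs[i] + xs[i + 1])  # refine near the crossing
                break
        else:
            break  # no crossing found anywhere
        j = bisect.bisect(xs, x_new)
        xs.insert(j, x_new)
        ys.insert(j, f(x_new))
    return xs, ys
```

With a handful of model evaluations, the samples concentrate near the threshold crossing, which is exactly the "risk-significant region" a smart sampler targets; a brute-force Monte Carlo run would spend most of its budget far from it.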
Ramanathan, Arvind; Pullum, Laura L; Hobson, Tanner C; Steed, Chad A; Chennubhotla, Chakra; Quinn, Shannon
With novel emerging infectious diseases being reported across different parts of the world, there is a need to build effective bio-surveillance systems that can track, monitor and report such events in a timely manner. Apart from monitoring for emerging disease outbreaks, it is also important to identify susceptible geographic regions and populations where these diseases may have a significant impact. The digitization of health related information through electronic health records (EHR) and electronic healthcare claim reimbursements (eHCR) and the continued growth of self-reported health information through social media provides both tremendous opportunities and challenges in developing novel public health surveillance tools. In this paper, we present an overview of Oak Ridge Bio-surveillance Toolkit (ORBiT), which we have developed specifically to address data analytic challenges in the realm of public health surveillance. In particular, ORBiT provides an extensible environment to pull together diverse, large-scale datasets and analyze them to identify spatial and temporal patterns for various bio-surveillance related tasks. We demonstrate the utility of ORBiT in automatically extracting a small number of spatial and temporal patterns during the 2009-2010 pandemic H1N1 flu season using eHCR data. These patterns provide quantitative insights into the dynamics of how the pandemic flu spread across different parts of the country. We discovered that the eHCR data exhibits multi-scale patterns from which we could identify a small number of states in the United States (US) that act as bridge regions contributing to one or more specific influenza spread patterns. Similar to previous studies, the patterns show that the south-eastern regions of the US were widely affected by the H1N1 flu pandemic. Several of these south-eastern states act as bridge regions, which connect the north-east and central US in terms of flu occurrences. These quantitative insights show how the e
Zhang, Xuesong; Izaurralde, Roberto C.; Manowitz, David H.; Sahajpal, Ritvik; West, Tristram O.; Thomson, Allison M.; Xu, Min; Zhao, Kaiguang; LeDuc, Stephen D.; Williams, Jimmy R.
Accurate quantification and clear understanding of regional scale cropland carbon (C) cycling is critical for designing effective policies and management practices that can contribute toward stabilizing atmospheric CO2 concentrations. However, extrapolating site-scale observations to regional scales represents a major challenge confronting the agricultural modeling community. This study introduces a novel geospatial agricultural modeling system (GAMS) exploring the integration of the mechanistic Environmental Policy Integrated Climate model, spatially-resolved data, surveyed management data, and supercomputing functions for cropland C budgets estimates. This modeling system creates spatially-explicit modeling units at a spatial resolution consistent with remotely-sensed crop identification and assigns cropping systems to each of them by geo-referencing surveyed crop management information at the county or state level. A parallel computing algorithm was also developed to facilitate the computationally intensive model runs and output post-processing and visualization. We evaluated GAMS against National Agricultural Statistics Service (NASS) reported crop yields and inventory estimated county-scale cropland C budgets averaged over 2000–2008. We observed good overall agreement, with spatial correlation of 0.89, 0.90, 0.41, and 0.87, for crop yields, Net Primary Production (NPP), Soil Organic C (SOC) change, and Net Ecosystem Exchange (NEE), respectively. However, we also detected notable differences in the magnitude of NPP and NEE, as well as in the spatial pattern of SOC change. By performing crop-specific annual comparisons, we discuss possible explanations for the discrepancies between GAMS and the inventory method, such as data requirements, representation of agroecosystem processes, completeness and accuracy of crop management data, and accuracy of crop area representation. Based on these analyses, we further discuss strategies to improve GAMS by updating input
Pasha, M. Fayzul K.; Yeasmin, Dilruba; Kao, Shih-Chieh; Hadjerioua, Boualem; Wei, Yaxing; Smith, Brennan T
Even after a century of development, the total hydropower potential from undeveloped rivers is still considered to be abundant in the United States. However, unlike evaluating hydropower potential at existing hydropower plants or non-powered dams, locating a feasible new hydropower plant involves many unknowns, and hence the total undeveloped potential is harder to quantify. In light of the rapid development of multiple national geospatial datasets for topography, hydrology, and environmental characteristics, a merit matrix based geospatial algorithm is proposed to help identify possible hydropower stream-reaches for future development. These hydropower stream-reaches, sections of natural streams with suitable head, flow, and slope for possible future development, are identified and compared using three different scenarios. A case study was conducted in the Alabama-Coosa-Tallapoosa (ACT) and Apalachicola-Chattahoochee-Flint (ACF) hydrologic subregions. It was found that a merit matrix based algorithm, which is based on the product of hydraulic head, annual mean flow, and average channel slope, can effectively identify stream-reaches with high power density and small surface inundation. The identified stream-reaches can then be efficiently evaluated for their potential environmental impact, land development cost, and other competing water usage in detailed feasibility studies. Given that the selected datasets are available nationally (at least within the conterminous US), the proposed methodology will have wide applicability across the country.
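The merit-matrix ranking itself is simple to sketch: score each candidate stream-reach by the product of hydraulic head, annual mean flow, and average channel slope, then sort. The function names and the example reaches are illustrative, not taken from the study.

```python
def merit_score(head_m, mean_flow_m3s, slope):
    """Merit-matrix score: product of hydraulic head, annual mean flow,
    and average channel slope, as described for the screening algorithm."""
    return head_m * mean_flow_m3s * slope

def rank_reaches(reaches):
    """reaches: list of (name, head_m, flow_m3s, slope) tuples.
    Returns reach names ordered best-first by merit score."""
    return [name for name, h, q, s in
            sorted(reaches, key=lambda r: merit_score(r[1], r[2], r[3]),
                   reverse=True)]
```

Ranking by the product (rather than any single factor) is what favors reaches with high power density but small inundation: a reach with a large head concentrated over a steep, short section outranks a long flat one of equal power.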
Zhou, Wei; Minnick, Matthew D; Mattson, Earl D; Geza, Mengistu; Murray, Kyle E.
Oil shale deposits of the Green River Formation (GRF) in Northwestern Colorado, Southwestern Wyoming, and Northeastern Utah may become one of the first oil shale deposits to be developed in the U.S. because of their richness, accessibility, and extensive prior characterization. Oil shale is an organic-rich fine-grained sedimentary rock that contains significant amounts of kerogen, from which liquid hydrocarbons can be produced. Water is needed to retort or extract oil shale at an approximate rate of three volumes of water for every volume of oil produced. Concerns have been raised over the demand and availability of water to produce oil shale, particularly in semiarid regions where water consumption must be limited and optimized to meet demands from other sectors. The economic benefit of oil shale development in this region may have tradeoffs within the local and regional environment. Due to these potential environmental impacts of oil shale development, water usage issues need to be further studied. A basin-wide baseline for oil shale and water resource data is the foundation of the study. This paper focuses on the design and construction of a centralized geospatial infrastructure for managing a large amount of oil shale and water resource related baseline data, and for setting up the frameworks for analytical and numerical models including but not limited to three-dimensional (3D) geologic, energy resource development systems, and surface water models. Such a centralized geospatial infrastructure made it possible to directly generate model inputs from the same database and to indirectly couple the different models through inputs/outputs. This ensures consistency of analyses conducted by researchers from different institutions, and helps decision makers to balance the water budget based on the spatial distribution of the oil shale and water resources, and the spatial variations of the geologic, topographic, and hydrogeological characterization of the basin. This endeavor
Karpinets, Tatiana V; Park, Byung; Syed, Mustafa H; Uberbacher, Edward C; Leuze, Michael Rex
The Carbohydrate-Active Enzyme (CAZy) database provides a rich set of manually annotated enzymes that degrade, modify, or create glycosidic bonds. Despite rich and invaluable information stored in the database, software tools utilizing this information for annotation of newly sequenced genomes by CAZy families are limited. We have employed two annotation approaches to fill the gap between manually curated high-quality protein sequences collected in the CAZy database and the growing number of other protein sequences produced by genome or metagenome sequencing projects. The first approach is based on a similarity search against the entire non-redundant sequences of the CAZy database. The second approach performs annotation using links or correspondences between the CAZy families and protein family domains. The links were discovered using the association rule learning algorithm applied to sequences from the CAZy database. The approaches complement each other and in combination achieved high specificity and sensitivity when cross-evaluated with the manually curated genomes of Clostridium thermocellum ATCC 27405 and Saccharophagus degradans 2-40. The capability of the proposed framework to predict the function of unknown protein domains (DUF) and of hypothetical proteins in the genome of Neurospora crassa is demonstrated. The framework is implemented as a Web service, the CAZymes Analysis Toolkit (CAT), and is available at http://cricket.ornl.gov/cgi-bin/cat.cgi.
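The second (domain-link) annotation approach can be sketched as a lookup from protein family domains to CAZy families. The mappings below are invented placeholders standing in for the association rules the CAT framework learns from the CAZy database; the real links differ.

```python
# Hypothetical domain -> CAZy family links of the kind the association-rule
# mining step would discover (placeholder values, not CAT's actual rules).
DOMAIN_TO_CAZY = {
    "Glyco_hydro_5": "GH5",
    "Glyco_hydro_9": "GH9",
    "CBM_2": "CBM2",
}

def annotate(protein_domains, links=DOMAIN_TO_CAZY):
    """Assign CAZy families to a protein from its domain content.
    Domains with no learned link (e.g. DUFs) are simply skipped."""
    return sorted({links[d] for d in protein_domains if d in links})
```

In the full framework this lookup complements the similarity search: a protein with no close CAZy homolog can still be assigned a family if it carries a domain that the learned rules associate with one.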
Ramanathan, Arvind; Pullum, Laura L.; Hobson, Tanner C.; Stahl, Christopher G.; Steed, Chad A.; Valkova, Silvia; Quinn, Shannon; Chennubhotla, Chakra
Here, we describe a data-driven unsupervised machine learning approach to extract geo-temporal co-occurrence patterns of asthma and the flu from large-scale electronic healthcare reimbursement claims (eHRC) datasets. Specifically, we examine the eHRC data from the 2009 to 2010 pandemic H1N1 influenza season and analyze whether different geographic regions within the United States (US) showed an increase in co-occurrence patterns of the flu and asthma. Our analyses reveal that the temporal patterns extracted from the eHRC data show a distinct lag time between the peak incidence of asthma and the flu. While the increased occurrence of asthma contributed to increased flu incidence during the pandemic, this co-occurrence is predominant for female patients. The geo-temporal patterns reveal that the co-occurrence of the flu and asthma is typically concentrated within the south-east US. Further, in agreement with previous studies, large urban areas (such as New York, Miami, and Los Angeles) exhibit co-occurrence patterns that suggest a peak incidence of asthma and flu significantly earlier in the spring and winter seasons. Together, our data-analytic approach, integrated within the Oak Ridge Bio-surveillance Toolkit platform, demonstrates how eHRC data can provide novel insights into co-occurring disease patterns.
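The lag-time finding above can be illustrated with a minimal cross-correlation over two incidence series. This is a conceptual stand-in for the unsupervised pattern extraction in ORBiT, and the weekly counts below are synthetic.

```python
def best_lag(a, b, max_lag):
    """Return the shift of series b relative to series a that maximizes
    their dot-product overlap -- a crude way to read off the lag between
    two incidence curves (e.g. weekly asthma vs. flu counts)."""
    def score(lag):
        pairs = [(a[i], b[i + lag]) for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        return sum(x * y for x, y in pairs)
    return max(range(-max_lag, max_lag + 1), key=score)
```

On real claims data one would normalize the series and test the lag's significance; here the point is only that a distinct peak-to-peak lag, like the one reported between asthma and flu incidence, shows up directly as the maximizing shift.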
Wilke, Jeremiah J; Kenny, Joseph P.
Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
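The discrete-event core such a simulator builds on can be sketched as a time-ordered event heap: the simulator repeatedly pops the earliest event, advances virtual time to it, and runs its action, which may schedule further events. This is a generic illustration of the mechanism, not SST's actual engine.

```python
import heapq
import itertools

class DiscreteEventSim:
    """Minimal discrete-event core: events live on a heap ordered by
    virtual time; a counter breaks ties so actions never get compared."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._ids = itertools.count()

    def schedule(self, delay, action):
        """Schedule action(sim) to run `delay` time units from now."""
        heapq.heappush(self._queue, (self.now + delay, next(self._ids), action))

    def run(self):
        """Pop events in time order until the queue is empty."""
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)
```

The key property, and the reason virtual time can race far ahead of wall-clock time, is that `self.now` jumps directly between event timestamps instead of ticking through idle periods.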
Zhang, Xuesong; Sahajpal, Ritvik; Manowitz, D.; Zhao, Kaiguang; LeDuc, Stephen D.; Xu, Min; Xiong, Wei; Zhang, Aiping; Izaurralde, Roberto C.; Thomson, Allison M.; West, Tristram O.; Post, W. M.
The development of effective measures to stabilize atmospheric CO2 concentration and mitigate negative impacts of climate change requires accurate quantification of the spatial variation and magnitude of the terrestrial carbon (C) flux. However, the spatial pattern and strength of terrestrial C sinks and sources remain uncertain. In this study, we designed a spatially-explicit agroecosystem modeling system by integrating the Environmental Policy Integrated Climate (EPIC) model with multiple sources of geospatial and surveyed datasets (including crop type map, elevation, climate forcing, fertilizer application, tillage type and distribution, and crop planting and harvesting date), and applied it to examine the sensitivity of cropland C flux simulations to two widely used soil databases (i.e. State Soil Geographic-STATSGO at a scale of 1:250,000 and Soil Survey Geographic-SSURGO at a scale of 1:24,000) in Iowa, USA. To efficiently execute the numerous EPIC runs resulting from the use of high resolution spatial data (56 m), we developed a parallelized version of EPIC. Both STATSGO and SSURGO led to similar simulations of crop yields and Net Ecosystem Production (NEP) estimates at the State level. However, substantial differences were observed at the county and sub-county (grid) levels. In general, the fine resolution SSURGO data outperformed the coarse resolution STATSGO data for county-scale crop-yield simulation, and within STATSGO, the area-weighted approach provided more accurate results. Further analysis showed that the spatial distribution and magnitude of simulated NEP were more sensitive to the resolution difference between SSURGO and STATSGO at the county or grid scale. For over 60% of the cropland areas in Iowa, the deviations between STATSGO- and SSURGO-derived NEP were larger than 1 Mg C ha^-1 yr^-1, or about half of the average cropland NEP, highlighting the significant uncertainty in spatial distribution and magnitude of simulated C fluxes resulting from
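The area-weighted approach mentioned for STATSGO can be sketched directly: aggregate per-soil-polygon simulated values weighted by polygon area. Names and numbers here are illustrative, not from the study.

```python
def area_weighted(values_and_areas):
    """Area-weighted aggregation of per-polygon simulated values,
    e.g. (crop yield, polygon area) pairs within one county."""
    total_area = sum(area for _, area in values_and_areas)
    if total_area == 0:
        raise ValueError("total area must be positive")
    return sum(value * area for value, area in values_and_areas) / total_area
```

Weighting by area rather than averaging polygons equally is what keeps a few small, atypical soil map units from skewing the county-scale estimate, which is consistent with the finding that the area-weighted variant was the more accurate way to use the coarse soil data.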
Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O; Sims, Kelly M; Stewart, Robert N; Urban, Marie L
Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to that the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; and iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.
After a couple of outings, a principal technologist at Sandia National Laboratories saw a need for a travel kit containing the necessary tools to make the task of site surveys more manageable and safer. The team has already had great success using the kit in the field.
Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
The software analyzes large time-dependent data sets from fleets of vehicles and their fueling infrastructure to characterize performance metrics including efficiency, durability, fueling rates and usage patterns.
AISLDT is a library of utility functions supporting other AISL software. Code provides various utility functions for Common Lisp, including an object-oriented database, distributed objects, logic query engine, web content management, chart drawing, packet sniffing, text processing, and various data structures.
PACT is a set of tools to help software developers create applications that will run on any platform and data that can be written/read on any platform.
This software is a set of tools for the design and analysis of binary optics. It consists of a series of stand-alone programs written in C and some scripts written in an application-specific language interpreted by a CAD program called DW2000. This software can be used to optimize the design and placement of a complex lens array from input to output and produce contours, mask designs, and data exported for diffractive optic analysis.
This package contains a number of systems administration utilities to assist a team of system administrators in managing a computer environment by automating routine tasks and centralizing information. Included are utilities to help install software on a network of computers and programs to make an image of a disk drive, to manage and distribute configuration files for a number of systems, and to run self-tests on systems, as well as an example of using a database to manage host information and various utilities.
This report is a user manual for an ASCII file of Fortran source code which must be compiled before use. The software will assist in creating plastic models of molecules whose specifications are described in the Brookhaven Protein Databank. Other data files can be used if they are in the same format as the files in the databank. The output file is a program for a 3-D Systems Stereolithography Apparatus and the program is run on a SGI Indigo workstation.
LMAT is designed to take as input a collection of raw metagenomic sequencer reads, search each read against a reference genome database, assign a taxonomic label and confidence value to each read, and report a summary of the predicted taxonomic contents of the metagenomic sample.
Residential energy efficiency programs are delivered by many different types of organizations and their partners, including utilities, state and local governments, nonprofit organizations, and for-profit companies. No matter which sector delivers the program, the need to work in partnership with different entities can make or break program success.
Ramanathan, Arvind; Pullum, Laura L; Steed, Chad A; Chennubhotla, Chakra; Quinn, Shannon
In this position paper, we describe the design and implementation of the Oak Ridge Bio-surveillance Toolkit (ORBiT): a collection of novel statistical and machine learning tools implemented for (1) integrating heterogeneous traditional (e.g. emergency room visits, prescription sales data, etc.) and non-traditional (social media such as Twitter and Instagram) data sources, (2) analyzing large-scale datasets and (3) presenting the results from the analytics as a visual interface for the end-user to interact and provide feedback. We present examples of how ORBiT can be used to summarize extremely large-scale datasets effectively and how user interactions can translate into the data analytics process for bio-surveillance. We also present a strategy to estimate parameters relevant to disease spread models from near real time data feeds and show how these estimates can be integrated with disease spread models for large-scale populations. We conclude with a perspective on how integrating data and visual analytics could lead to better forecasting and prediction of disease spread as well as improved awareness of disease susceptible regions.
Lau, A; Chen, Y; Ahmad, S
Purpose: Proton therapy exhibits several advantages over photon therapy due to the depth-dose distributions from proton interactions within the target material. However, uncertainties associated with the proton beam range in the patient limit the advantage of proton therapy applications. To quantify beam range, positron-emitting nuclei (PEN) and prompt gamma (PG) techniques have been developed. These techniques use de-excitation photons to describe the location of the beam in the patient. To develop a detector system for implementing the PG technique for range verification applications in proton therapy, we studied the yields, energy and angular distributions of the secondary particles emitted from a PMMA phantom. Methods: Proton pencil beams of various energies incident onto a PMMA phantom with dimensions of 5 x 5 x 50 cm3 were used for simulation with the Geant4 toolkit using the standard electromagnetic packages as well as the packages based on the binary-cascade nuclear model. The emitted secondary particles are analyzed. Results: For 160 MeV incident protons, the yields of secondary neutrons and photons per 100 incident protons were ~6 and ~15, respectively. The secondary photon energy spectrum showed several energy peaks in the range between 0 and 10 MeV. The energy peaks located between 4 and 6 MeV were attributed to direct proton interactions with 12C (~4.4 MeV) and 16O (~6 MeV), respectively. Most of the escaping secondary neutrons were found to have energies between 10 and 100 MeV. Isotropic emissions were found for lower energy neutrons (<10 MeV) and for photons at all energies, while higher energy neutrons were emitted predominantly in the forward direction. The yields of emitted photons and neutrons increased with the increase of incident proton energies. Conclusions: A detector system is currently being developed incorporating the yields, energy and angular distributions of secondary particles from proton interactions obtained from this study.
Access the software at http://antfarm.rubyforge.org.
MATK provides basic functionality to facilitate model analysis within the Python computational environment. Model analysis setup within MATK includes: - define parameters - define observations - define model (python function) - define samplesets (sets of parameter combinations) Currently supported functionality includes: - forward model runs - Latin-Hypercube sampling of parameters - multi-dimensional parameter studies - parallel execution of parameter samples - model calibration using internal Levenberg-Marquardt algorithm - model calibration using lmfit package - model calibration using levmar package - Markov Chain Monte Carlo using pymc package MATK facilitates model analysis using: - scipy - calibration (scipy.optimize) - rpy2 - Python interface to R
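The Latin-hypercube sampling MATK supports can be illustrated with a small pure-Python version: each parameter's range is split into n equal strata, each stratum is sampled exactly once, and the strata are shuffled independently per parameter. This is a generic LHS sketch, not MATK's implementation.

```python
import random

def latin_hypercube(bounds, n, seed=0):
    """Latin-hypercube sample of n points over per-parameter (lo, hi)
    bounds. Each parameter dimension covers all n strata exactly once."""
    rng = random.Random(seed)
    dims = []
    for lo, hi in bounds:
        strata = list(range(n))
        rng.shuffle(strata)  # independent stratum order per dimension
        width = (hi - lo) / n
        dims.append([lo + (s + rng.random()) * width for s in strata])
    return list(zip(*dims))
```

Compared with plain Monte Carlo, the stratification guarantees that even a small sampleset spreads over each parameter's full range, which is why LHS is the default choice for cheap parameter studies.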
Performance-based Service Acquisition (PBA) means an acquisition structured around the results to be achieved as opposed to the manner by which the work is to be performed.
This section provides links to previous successful workplace charging events. These links lead directly to each organization's website and contain event agendas and presentation materials.
Design incentives that motivate potential customers to act by lowering the risk, decreasing the cost, or offering additional benefits of home energy upgrades.
The Pacific Northwest National Laboratory (PNNL) has developed a prototype stream extraction algorithm that semi-automatically extracts and characterizes streams using a variety of multisensor imagery and digital terrain elevation data (DTED). The system is currently optimized for three types of single-band imagery: radar, visible, and thermal. Method of solution: DRAGON (1) classifies pixels into clumps of water objects based on the classification of water pixels by spectral signatures and neighborhood relationships, (2) uses morphology operations (erosion and dilation) to separate out large lakes (or embayments), isolated lakes, ponds, wide rivers, and narrow rivers, and (3) translates the river objects into vector objects. In detail, the process can be broken down into the following steps. A. Water pixels are initially identified using the extent range and slope values (if an optional DEM file is available). B. Erode to the distance that defines a large water body and then dilate back. The resulting mask can be used to identify large lake and embayment objects, which are then removed from the image. Since this operation can be time-consuming, it is only performed if a simple test (i.e., a large box can be found somewhere in the image that contains only water pixels) indicates a large water body is present. C. All water pixels are "clumped" (in Imagine terminology, clumping is when pixels of a common classification that touch are connected) and clumps which do not contain pure water pixels (e.g., dark cloud shadows) are removed. D. The resulting true water pixels are clumped, and water objects which are too small (e.g., ponds) or isolated lakes (i.e., isolated objects with a small compactness ratio) are removed. Note that at this point lakes have been identified as a byproduct of the filtering process and can be output as vector layers if needed. E. At this point only river pixels are left in the image.
To separate out wide rivers all objects in the image are eroded by the half width of narrow rivers. This causes all narrow rivers to be removed and leaves only the core of wide rivers. This core is dilated out by the same distance to create a mask that is used with the original river image to separate out rivers into two separate images of narrow rivers and wide rivers F. If in the image that contains wide rivers there are small isolated short (less than 300 meters if NGA criteria is used) segments these segments are transferred to the narrow river file in order to be treated has parts of single line rivers G. The narrow river file is optionally dilated and eroded. This ÃÂ¢ÃÂÃÂclosingÃÂ¢ÃÂÃÂ has the effect of removing small islands, filling small gaps, and smoothing the outline H. The user also has the option of ÃÂ¢ÃÂÃÂclosingÃÂ¢ÃÂÃÂ objects in the wide river file. However, this depends on the degree to which the user wants to remove small islands in the large rivers. I. To make the translation from raster to single vector easier the objects in the narrow river image are reduced to a single center line (i.e. thinned) with binary morphology operations.« less
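The wide-versus-narrow river separation in step E is a standard morphological opening: eroding by the half width of a narrow river deletes narrow rivers entirely, and dilating the surviving cores back rebuilds a mask covering only the wide rivers. The sketch below illustrates that idea with scipy.ndimage on a toy binary mask; it is not DRAGON's actual implementation (which runs in the Imagine environment), and the function name and half-width parameter are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import binary_erosion, binary_dilation

def separate_rivers(river_mask: np.ndarray, half_width: int):
    """Split a binary river mask into wide and narrow rivers via
    morphological opening (erode by half_width, then dilate back).
    Illustrative sketch of step E, not DRAGON's actual code."""
    # Erosion by the narrow-river half width removes narrow rivers
    # entirely, leaving only the cores of wide rivers.
    core = binary_erosion(river_mask, iterations=half_width)
    # Dilating the cores back out gives a mask that covers the wide
    # rivers; intersect with the original to stay inside the rivers.
    wide = binary_dilation(core, iterations=half_width) & river_mask
    # Whatever the wide-river mask does not cover is a narrow river.
    narrow = river_mask & ~wide
    return wide, narrow

# Toy example: a 3-pixel-wide river above a 1-pixel-wide river.
mask = np.zeros((7, 10), dtype=bool)
mask[1:4, :] = True   # wide river (3 px across)
mask[6, :] = True     # narrow river (1 px across)
wide, narrow = separate_rivers(mask, half_width=1)
```

After the split, the narrow river survives only in `narrow`, while the interior of the 3-pixel band is recovered in `wide` (edge pixels can be misclassified near image borders, which is why DRAGON's later closing steps smooth the results).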
energy infrastructure, demographics and land ownership, and the earth's physical geography (topography, land use, rivers). NREL's geographic information system models enable...
Patil, Shrikant; Moeys, Sara; von Dassow, Peter; Huysman, Marie J. J.; Mapleson, Daniel; De Veylder, Lieven; Sanges, Remo; Vyverman, Wim; Montresor, Marina; Ferrante, Maria Immacolata
Sexual reproduction is an obligate phase in the life cycle of most eukaryotes. Meiosis varies among organisms, which is reflected in the variability of the gene set associated with the process. Diatoms are unicellular organisms that belong to the stramenopile clade and have unique life cycles that can include a sexual phase. The exploration of five diatom genomes and one diatom transcriptome led to the identification of 42 genes potentially involved in meiosis. While these include the majority of known meiosis-related genes, several meiosis-specific genes, including DMC1, could not be identified. Furthermore, phylogenetic analyses supported gene identification and revealed ancestral loss and recent expansion in the RAD51 family in diatoms. The two sexual species Pseudo-nitzschia multistriata and Seminavis robusta were used to explore the expression of meiosis-related genes: RAD21, SPO11-2, RAD51-A, RAD51-B and RAD51-C were upregulated during meiosis, whereas other paralogs in these families showed no differential expression patterns, suggesting that they may play a role during vegetative divisions. An almost identical toolkit is shared by Pseudo-nitzschia multiseries and Fragilariopsis cylindrus, as well as by two species for which sex has not been observed, Phaeodactylum tricornutum and Thalassiosira pseudonana, suggesting that these two may retain a facultative sexual phase. Lastly, our results reveal a conserved meiotic toolkit in six diatom species and indicate that stramenopiles share major modifications of the canonical meiosis processes ancestral to eukaryotes, with important divergences in each kingdom.
The OCIO coordinates a variety of internal cybersecurity awareness campaigns to provide DOE employees with timely information on current cyber threats, recommended mitigations, and sound practices. The OCIO also develops and distributes cyber awareness information and resources to enhance employees' general knowledge of cybersecurity practices, policies, and terms.
Modeling the Global Trade and Environmental Impacts of Biofuel Policies Modified Microgrid Concept for Rural Electrification in Africa NREL-How to Estimate the Economic...
wind energy costs and impacts to neighbors and the environment. At the same time, the benefits of wind energy and diversity of possible applications have continued to increase....
The goal of this project was to develop cybersecurity audit and attack detection tools for industrial control systems (ICS). Digital Bond developed and released a tool named Bandolier that audits ICS components commonly used in the energy sector against an optimal security configuration. The Portaledge project developed a capability for the PI Historian, the most widely used historian in the energy sector, to aggregate security events and detect cyber attacks.
Antigua and Barbuda-Regional Implementation Plan for CARICOM's Climate Change Resilience Framework Bahamas-Regional Implementation Plan for CARICOM's Climate Change...
... Strategies Whole Building Retrofit Tool Open Energy Information System Energy Management Package HVAC: Rooftop Units HVAC: Heat Pumps HVAC: Boilers HVAC: Air Cooled Elec. ...
Invite employers in your community that already have charging to speak on an employer experience panel. File General Speaker Outreach Letter Template File Clean Cities Branded ...