National Library of Energy BETA

Sample records for outage counts high-side

  1. Outage Log

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Queue Look Scheduled Outages Outage Log Science Gateway Status Login Node Status ... It is a historical record and may not be updated while a system event is in progress. ...

  2. NERSC Scheduled System Outages

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scheduled System Outages NERSC Scheduled System Outages Last edited: 2016-04-29 11:35:00

  3. Improving Outage Performance: Outage Optimization Process

    SciTech Connect (OSTI)

    LaPlatney, Jere J.

    2006-07-01

    Planned outage performance is a key measure of how well a Nuclear Power Plant (NPP) is operated. Performance during planned outages strongly affects virtually all of a plant's performance metrics. In recognition of this fact, NPP operators worldwide have focused, and continue to focus, on improving their outage performance. The process of improving outage performance is commonly referred to as 'Outage Optimization' in the industry. This paper starts with a summary of the principles of Outage Optimization. It then provides an overview of a process in common use in the USA and elsewhere to manage the improvement of planned outages. The program described is comprehensive in that it involves managing improvement in both the Preparation and Execution phases of outage management. (author)

  4. Systems Outage Notification Policy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    the need for a maintenance event window no less than 24 hours in advance of the outage (emergency fixes). Users will be notified of regularly scheduled maintenance in advance, on ...

  5. Shopping for outage management systems

    SciTech Connect (OSTI)

    Chou, Y.C.; Konneker, L.K.; Watkins, T.R.

    1995-12-31

    Customer service is becoming increasingly important to electric utilities. Outage management is an important part of customer service. Good outage management means quickly responding to outages and keeping customers informed about outages. Each outage equals lost customer satisfaction and lost revenue. Outage management is increasingly important because of new competition among utilities for customers, pressure from regulators, and internal pressure to cut costs. The market has several existing software products for outage management. How does a utility judge whether these products satisfy its specific needs? Technology is changing rapidly to support outage management. Which technology is proven and cost-effective? The purpose of this paper is to outline the procedure for evaluating outage management systems, and to discuss the key features to look for. It also gives our opinion of the features that represent the state of the art. This paper will not discuss specific products or list vendor names.

  6. Track NERSC Outages in Google Calendar

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Outages in Google Calendar Track NERSC Outages in Google Calendar March 22, 2013 by Jack Deslippe Outages are now available in Google calendar form. You can subscribe to this...

  7. OutageMapURL Phases Energy Services

    Open Energy Info (EERE)

    OutageMapURL Phases Energy Services County Electric Power Assn http://outages.county.org A&N Electric Coop Virginia AEP Generating Company https://www.aepaccount.com/zipr...

  8. outages | OpenEI Community

    Open Energy Info (EERE)

    outages Home Graham7781's picture Submitted by Graham7781(2017) Super contributor 29 October, 2012 - 14:46 East Coast Utilities prepare for Hurricane Sandy East Coast Hurricane...

  9. August 14, 2003 Power Outages … Announcement

    Broader source: Energy.gov (indexed) [DOE]

    Ellen P. Vancko evancko@nerc.com Power Outage Update 8/16/2003 11 a.m. EDT The bulk ... will continue to experience rotating outages due to generating capacity availability. ...

  10. RESOLVED: Projectb filesystem outage July 9, 2012

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    filesystem outage July 9, 2012 July 9, 2012 The projectb filesystem had a hardware failure that potentially generated IO errors. The filesystem logs indicate that the...

  11. North American Electric Reliability Council Outage Announcement...

    Broader source: Energy.gov (indexed) [DOE]

    Recommendations Blackout 2003: Blackout Final Implementation Report U.S. - Canada Power System Outage Task Force: Final Report on the Implementation of Task Force Recommendations

  12. Property:OutagePhoneNumber | Open Energy Information

    Open Energy Info (EERE)

    OutagePhoneNumber Jump to: navigation, search Property Name OutagePhoneNumber Property Type String Description An outage hotline or 24-hour customer service number Note: uses...

  13. GUIDELINES FOR IMPLEMENTATION OF AN ADVANCED OUTAGE CONTROL CENTER TO IMPROVE OUTAGE COORDINATION, PROBLEM RESOLUTION, AND OUTAGE RISK MANAGEMENT

    SciTech Connect (OSTI)

    Germain, Shawn St; Farris, Ronald; Whaley, April M; Medema, Heather; Gertman, David

    2014-09-01

    This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by the Department of Energy (DOE) and performed in close collaboration with industry R&D programs that provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. Managing NPP outages is a complex and difficult task due to the large number of maintenance and repair activities that are accomplished in a short period of time. During an outage, the outage control center (OCC) is the temporary command center for outage managers and provides several critical functions for the successful execution of the outage schedule. Essentially, the OCC functions to facilitate information inflow, assist outage management in processing information, and to facilitate the dissemination of information to stakeholders. Currently, outage management activities primarily rely on telephone communication, face-to-face reports of status, and periodic briefings in the OCC. It is a difficult task to keep the information related to outage progress and discovered conditions current. Several advanced communication and collaboration technologies have shown promise for facilitating the information flow into, across, and out of the OCC. The use of these technologies will allow information to be shared electronically, providing greater amounts of real-time information to the decision makers and allowing OCC coordinators to meet with supporting staff remotely. Passively monitoring status electronically through advances in the areas of mobile worker technologies, computer-based procedures, and automated work packages will reduce the current reliance on manually

  14. Managing turbine-generator outages by computer

    SciTech Connect (OSTI)

    Reinhart, E.R. [Reinhart and Associates, Inc., Austin, TX (United States)]

    1997-09-01

    This article describes software being developed to address the need for computerized planning and documentation programs that can help manage outages. Downsized power-utility companies and the growing demand for independent, competitive engineering and maintenance services have created a need for a computer-assisted planning and technical-direction program for turbine-generator outages. To meet this need, a software tool is now under development that can run on a desktop or laptop personal computer to assist utility personnel and technical directors in outage planning. Total Outage Planning Software (TOPS), which runs on Windows, takes advantage of the mass data storage available with compact-disc technology by archiving the complete outage documentation on CD. Previous outage records can then be indexed, searched, and viewed on a computer with the click of a mouse. Critical-path schedules, parts lists, parts order tracking, work instructions and procedures, custom data sheets, and progress reports can be generated by computer on-site during an outage.
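
    As an aside on the critical-path schedules mentioned above: the underlying computation is the standard critical path method (CPM) over the outage task network. The sketch below is a minimal, generic CPM pass in Python; the task names, durations and dependencies are invented for illustration and are not taken from TOPS or any real turbine-generator outage.

        # Minimal critical-path (CPM) forward/backward pass over a small,
        # hypothetical outage task network. Task names and durations are invented.
        tasks = {
            # name: (duration_in_hours, [predecessors])
            "isolate_unit":      (8,  []),
            "open_casing":       (24, ["isolate_unit"]),
            "rotor_inspection":  (72, ["open_casing"]),
            "generator_tests":   (48, ["isolate_unit"]),
            "reassembly":        (36, ["rotor_inspection", "generator_tests"]),
            "return_to_service": (12, ["reassembly"]),
        }

        def critical_path(tasks):
            """Return (project_finish_hours, set_of_critical_tasks)."""
            # Forward pass: earliest start/finish (dict is in topological order).
            es, ef = {}, {}
            for name, (dur, preds) in tasks.items():
                es[name] = max((ef[p] for p in preds), default=0)
                ef[name] = es[name] + dur
            project_finish = max(ef.values())

            # Backward pass: latest start/finish.
            ls, lf = {}, {}
            for name in reversed(list(tasks)):
                succs = [s for s, (_, preds) in tasks.items() if name in preds]
                lf[name] = min((ls[s] for s in succs), default=project_finish)
                ls[name] = lf[name] - tasks[name][0]

            critical = {n for n in tasks if es[n] == ls[n]}  # zero total float
            return project_finish, critical

        finish, critical = critical_path(tasks)
        print(f"Planned outage length: {finish} h")
        print("Critical path tasks:", sorted(critical))

    In a real planner the same forward/backward pass would run over thousands of tasks pulled from the outage database rather than a hard-coded dictionary.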

  15. Property:OutageMapURL | Open Energy Information

    Open Energy Info (EERE)

    + Agralite Electric Coop + https://pyxis-oms.com/OutageMap/AgraliteOutageMap.html + Alfalfa Electric Coop, Inc + https://ebill.alfalfaelectric.com/woViewer/mapviewer.html?config...

  16. A Review of Power Outages and Restoration Following the June...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    A Review of Power Outages and Restoration Following the June 2012 Derecho A Review of Power Outages and Restoration Following the June 2012 Derecho This report reviews power ...

  17. North American Electric Reliability Council Power Outage Update...

    Office of Environmental Management (EM)

    will continue to experience rotating outages due to generating capacity availability. North American Electric Reliability Council Power Outage Update (48.2 KB) More Documents & ...

  18. Homeowners: Respond to Power Outages | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Guidelines Homeowners: Respond to Power Outages Homeowners: Respond to Power Outages ... Learn more Certify your electrical systems-If your house sustains flood or wind damage ...

  19. Track NERSC Outages in Google Calendar

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Track NERSC Outages in Google Calendar Track NERSC Outages in Google Calendar March 22, 2013 by Jack Deslippe Outages are now available in Google calendar form. You can subscribe to this calendar by following the link, http://goo.gl/A4n3k, and then clicking the add button on the bottom right. If you find any issues with the calendar content, please contact NERSC consultants by email at consult(at)nersc.gov.

  20. Potomac River Project Outage Schedule Clarification | Department...

    Office of Environmental Management (EM)

    Re: Potomac River Generating Station Department of Energy, Case No. EO-05-01: Potomac Electric Power Company (PEPCO) revised plan for transmission outages for the 230 kV circuits ...

  1. Advanced Test Reactor outage risk assessment

    SciTech Connect (OSTI)

    Thatcher, T.A.; Atkinson, S.A.

    1997-12-31

    Beginning in 1997, risk assessment was performed for each Advanced Test Reactor (ATR) outage, aiding the coordination of plant configuration and work activities (maintenance, construction projects, etc.) to minimize the risk of reactor fuel damage and to improve defense-in-depth. The risk assessment activities move beyond simply meeting Technical Safety Requirements to increase the awareness of risk-sensitive configurations, to focus increased attention on the higher risk activities, and to seek cost-effective design or operational changes that reduce risk. A detailed probabilistic risk assessment (PRA) had been performed to assess the risk of fuel damage during shutdown operations including heavy load handling. This resulted in several design changes to improve safety; however, evaluation of individual outages had not been performed previously and many risk insights were not being utilized in outage planning. The shutdown PRA provided the necessary framework for assessing relative and absolute risk levels and assessing defense-in-depth. Guidelines were written identifying combinations of equipment outages to avoid. Screening criteria were developed for the selection of work activities to receive review. Tabulation of inherent and work-related initiating events and their relative risk level versus plant mode has aided identification of the risk level the scheduled work involves. Pre-outage reviews are conducted and post-outage risk assessment is documented to summarize the positive and negative aspects of the outage with regard to risk. The risk for the outage is compared to the risk level that would result from optimal scheduling of the work to be performed and to baseline or average past performance.

  2. Outage project productivity improvement of TVA fossil

    SciTech Connect (OSTI)

    Picard, H.E.; Seay, C.R. Jr.

    1996-10-01

    Competition in the utility industry forces management to look closely at the cost effectiveness of power plant outage projects. At TVA Fossil and Hydro Power, innovative work measurement is proving effective as a project management tool to do more with less. Labor-hours to complete outage work scopes are reduced by some 20 to 30%, not by working harder or sacrificing safety or quality, but by working and managing smarter. Fossil power plant outages and shutdowns are costly. They are labor-intensive construction projects, often with expanding work scope, and executed on a fast track. Outage work is inherently complex and dynamic, and often unpredictable. Many activities and tasks must be integrated, coordinated and completed safely and efficiently by multiple crafts and work groups. As a result, numerous productivity factors can influence the cost and schedule of outage completion. This provides owners, contractors and labor with unique opportunities for competitive advantage through radical changes in how they manage labor-hours and time.

  3. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect (OSTI)

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. The modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
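
    To make the idea of cascading "tiers" concrete, here is a deliberately simplified Python sketch of a sequential cascading-outage loop. It is not the paper's method: a real study recomputes a DC or AC power flow after every trip, whereas this toy just pushes the flow of a tripped line onto its untripped neighbours in proportion to their remaining headroom. The line names, flows and limits are invented.

        # Toy sequential cascading-outage loop illustrating tiered trips.
        lines = {
            # line: (flow_MW, limit_MW, neighbouring_lines)
            "L1": (90.0, 100.0, ["L2", "L3"]),
            "L2": (80.0, 100.0, ["L1", "L4"]),
            "L3": (70.0, 100.0, ["L1", "L4"]),
            "L4": (60.0, 100.0, ["L2", "L3"]),
        }

        def cascade(lines, initiating_outage):
            state = {k: list(v[:2]) for k, v in lines.items()}   # mutable [flow, limit]
            neighbours = {k: v[2] for k, v in lines.items()}
            tripped, tiers = {initiating_outage}, [[initiating_outage]]
            to_redistribute = [initiating_outage]
            while to_redistribute:
                # Redistribute flow from lines tripped in the previous tier
                # (stand-in for re-running a power flow).
                for line in to_redistribute:
                    flow = state[line][0]
                    targets = [n for n in neighbours[line] if n not in tripped]
                    headroom = sum(state[n][1] - state[n][0] for n in targets) or 1.0
                    for n in targets:
                        share = (state[n][1] - state[n][0]) / headroom
                        state[n][0] += flow * share
                    state[line][0] = 0.0
                # Next tier: every surviving line now above its limit trips.
                next_tier = [n for n in state
                             if n not in tripped and state[n][0] > state[n][1]]
                tripped.update(next_tier)
                if next_tier:
                    tiers.append(next_tier)
                to_redistribute = next_tier
            return tiers

        print(cascade(lines, "L1"))   # -> [['L1'], ['L2', 'L3'], ['L4']]

    Each element of the returned list is one cascading tier: the set of lines that trip together after the previous tier's flow is redistributed.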

  4. Outage management and health physics issue, 2007

    SciTech Connect (OSTI)

    Agnihotri, Newal

    2007-05-15

    The focus of the May-June issue is on outage management and health physics. Major articles/reports in this issue include: India: a potential commercial opportunity, a U.S. Department of Commerce Report, by Joe Neuhoff and Justin Rathke; The changing climate for nuclear energy, by Skip Bowman, Nuclear Energy Institute; Selecting protective clothing, by J. Mark Price, Southern California Edison; and Successful refurbishment outage, by Sudesh K. Gambhir, Omaha Public Power District. Industry innovation articles in this issue are: Containment radiation monitoring spiking, by Michael W. Lantz and Robert Routolo, Arizona Public Service Company; Improved outage performance, by Michael Powell and Troy Wilfong, Arizona Public Service Company, Palo Verde Nuclear Generating Station; Stop repacking valves and achieve leak-free performance, by Kenneth Hart, PPL Susquehanna LLC; and Head assembly upgrade package, by Timothy Petit, Dominion Nuclear.

  5. Outage management and health physics issue, 2008

    SciTech Connect (OSTI)

    Agnihotri, Newal

    2008-05-15

    The focus of the May-June issue is on outage management and health physics. Major articles include: Outage optimization initiatives, by George B. Beam, AREVA NP, Inc.; New plant based on excellent track records, by Jim Scarola, Progress Energy; Meeting customer needs and providing environmental benefits, by Peter S. Hastings, Duke Energy; Plants with 3-D design, by Jack A. Bailey, Tennessee Valley Authority; and Highest quality with exceptional planning, by Jason A. Walls, Duke Energy. Industry innovation articles include: Integrated exposure reduction plan, by Ed Wolfe, Exelon; Performance-based radiation worker training, by Joe Giuffre and Timothy Vriezerma, American Electric Power.

  6. RESOLVED: Projectb filesystem outage July 9, 2012

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    RESOLVED: Projectb filesystem outage July 9, 2012 RESOLVED: Projectb filesystem outage July 9, 2012 July 9, 2012 The projectb filesystem had a hardware failure that potentially generated I/O errors. The filesystem logs indicate that the earliest abnormal event on the filesystem occurred at 9:19AM and the filesystem was taken down for maintenance at 10:42AM. The filesystem returned to service at 11:20AM. Jobs running on the cluster would not have been able to read from or write to the projectb

  7. Refinery Outages: First-Half 2016

    U.S. Energy Information Administration (EIA) Indexed Site

    Outages: First-Half 2016 March 2016 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States

  8. Benchmark Report on Key Outage Attributes: An Analysis of Outage Improvement Opportunities and Priorities

    SciTech Connect (OSTI)

    Germain, Shawn St.; Farris, Ronald

    2014-09-01

    The Advanced Outage Control Center (AOCC) is a multi-year pilot project targeted at Nuclear Power Plant (NPP) outage improvement. The purpose of this pilot project is to improve management of NPP outages through the development of an AOCC that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report documents the results of a benchmarking effort to evaluate the transferability of technologies demonstrated at Idaho National Laboratory and the primary pilot project partner, Palo Verde Nuclear Generating Station. The initial assumption for this pilot project was that NPPs generally do not take advantage of advanced technology to support outage management activities. Several researchers involved in this pilot project have commercial NPP experience and believed that very little technology has been applied towards outage communication and collaboration. To verify that the technology options researched and demonstrated through this pilot project would in fact have broad application for the US commercial nuclear fleet, and to look for additional outage management best practices, LWRS program researchers visited several additional nuclear facilities.

  9. Hopper compilers and DDT short outage next Wed, May 16

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    compilers and DDT short outage next Wed, May 16 Hopper compilers and DDT short outage next Wed, May 16 May 10, 2012 Due to a scheduled maintenance for the License Servers, most of...

  10. Outage management and health physics issue, 2009

    SciTech Connect (OSTI)

    Agnihotri, Newal

    2009-05-15

    The focus of the May-June issue is on outage management and health physics. Major articles include the following: Planning and scheduling to minimize refueling outage, by Pat McKenna, AmerenUE; Prioritizing safety, quality and schedule, by Tom Sharkey, Dominion; Benchmarking to high standards, by Margie Jepson, Energy Nuclear; Benchmarking against U.S. standards, by Magnox North, United Kingdom; Enabling suppliers for new build activity, by Marcus Harrington, GE Hitachi Nuclear Energy; Identifying, cultivating and qualifying suppliers, by Thomas E. Silva, AREVA NP; Creating new U.S. jobs, by Francois Martineau, Areva NP. Industry innovation articles include: MSL Acoustic source load reduction, by Amir Shahkarami, Exelon Nuclear; Dual Methodology NDE of CRDM nozzles, by Michael Stark, Dominion Nuclear; and Electronic circuit board testing, by James Amundsen, FirstEnergy Nuclear Operating Company. The plant profile article is titled The future is now, by Julia Milstead, Progress Energy Service Company, LLC.

  11. Outage management and health physics issue, 2006

    SciTech Connect (OSTI)

    Agnihotri, Newal

    2006-05-15

    The focus of the May-June issue is on outage management and health physics. Major articles/reports in this issue include: A design with experience for the U.S., by Michael J. Wallace, Constellation Generation Group; Hope to be among the first, by Randy Hutchinson, Entergy Nuclear; Plans to file COLs in 2008, by Garry Miller, Progress Energy; Evolution of ICRP's recommendations, by Lars-Erik Holm, ICRP; European network on education and training in radiological protection, by Michele Coeck, SCK-CEN, Belgium; Outage management: an important tool for improving nuclear power plant performance, by Thomas Mazour and Jiri Mandula, IAEA, Austria; and Plant profile: Exploring new paths to excellence, by Anne Thomas, Exelon Nuclear.

  12. Advanced Outage and Control Center: Strategies for Nuclear Plant Outage Work Status Capabilities

    SciTech Connect (OSTI)

    Gregory Weatherby

    2012-05-01

    The research effort is a part of the Light Water Reactor Sustainability (LWRS) Program. LWRS is a research and development program sponsored by the Department of Energy, performed in close collaboration with industry to provide the technical foundations for licensing and managing the long-term, safe and economical operation of current nuclear power plants. The LWRS Program serves to help the US nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The Outage Control Center (OCC) Pilot Project was directed at carrying out applied research to develop and pilot technology designed to enhance safe outage and maintenance operations, improve human performance and reliability, increase overall operational efficiency, and improve plant status control. Plant outage management is a high priority concern for the nuclear industry from cost and safety perspectives. Unfortunately, many of the underlying technologies supporting outage control are the same as those used in the 1980’s. They depend heavily upon large teams of staff, multiple work and coordination locations, and manual administrative actions that require large amounts of paper. Previous work in human reliability analysis suggests that many repetitive tasks, including paperwork tasks, may have a failure rate of 1.0E-3 or higher (Gertman, 1996). With between 10,000 and 45,000 subtasks being performed during an outage (Gomes, 1996), the opportunity for human error of some consequence is a realistic concern. Although a number of factors exist that can make these errors recoverable, reducing and effectively coordinating the sheer number of tasks to be performed, particularly those that are error prone, has the potential to enhance outage efficiency and safety. Additionally, outage management requires precise coordination of work groups that do not always share similar objectives. Outage
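
    The scale of the concern follows directly from the two figures cited above (a per-subtask human error rate of roughly 1.0E-3 and 10,000 to 45,000 subtasks per outage); a quick back-of-the-envelope calculation:

        # Expected number of human errors per outage, using only the figures
        # quoted in the abstract (Gertman 1996; Gomes 1996). Purely illustrative
        # arithmetic, not a risk model.
        failure_rate = 1.0e-3                      # errors per subtask
        for subtasks in (10_000, 45_000):
            print(f"{subtasks:>6} subtasks -> ~{subtasks * failure_rate:.0f} expected errors")
        #  10000 subtasks -> ~10 expected errors
        #  45000 subtasks -> ~45 expected errors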

  13. How individual traces and interactive timelines could support outage execution - Toward an outage historian concept

    SciTech Connect (OSTI)

    Parfouru, S.; De-Beler, N.

    2012-07-01

    In the context of a project that is designing innovative ICT-based solutions for the organizational concept of outage management, we focus on the informational process of the OCR (Outage Control Room) underlying the execution of the outages. These informational processes are based on structured and unstructured documents that have a key role in the collaborative processes and management of the outage. We especially track the structured and unstructured documents, electronic or not, from creation to sharing. Our analysis allows us to consider that the individual traces produced by an individual participant with a specific role could be multi-purpose and support sharing between participants without creating duplication of work. The ultimate goal is to be able to generate an outage historian, not just focused on highly structured information, which could be useful to improve the continuity of information between participants. We study the implementation of this approach through web technologies and social media tools to address this issue. We also investigate the issue of data access through interactive visualization timelines coupled with other modalities to assist users in the navigation and exploration of the proposed historian. (authors)

  14. Application of Standard Maintenance Windows in PHWR Outage

    SciTech Connect (OSTI)

    Fuming Jiang

    2006-07-01

    The concept of Standard Maintenance Windows has been widely used in the planned outages of light water reactors around the world. However, due to the specific features of the Pressurized Heavy Water Reactor (PHWR), PHWR owners have not reached a consensus on adopting Standard Maintenance Windows for planned outages aimed at optimizing outage duration. Third Qinshan Nuclear Power Company (TQNPC), with the experience gained in previous outages and with reference to other PHWR power plants, has identified a set of Standard Maintenance Windows for planned outages. It can be applied to similar PHWR plants, with a few windows that are specific to Qinshan Phase III NPP. The use of these Standard Maintenance Windows in planned outages has proved effective in controlling shutdown nuclear safety, minimizing the unavailability of safety systems, improving the efficient utilization of outage duration, and improving the flexibility of the outage schedule when an emergency issue forces a revision of the schedule. It has also formed a solid foundation for benchmarking. The identification of Standard Maintenance Windows and their application will be discussed with relevant cases for the common improvement of outage duration. (author)

  15. Plant maintenance and outage management issue, 2005

    SciTech Connect (OSTI)

    Agnihotri, Newal (ed.)

    2005-01-15

    The focus of the January-February issue is on plant maintenance and outage management. Major articles/reports in this issue include: Dawn of a new era, by Joe Colvin, Nuclear Energy Institute (NEI); Plant profile: Beloyarsk NPP, Russia, by Nikolai Oshkanov, Beloyarsk NPP, Russia; Improving economic performance, by R. Spiegelberg-Planner, John De Mella, and Marius Condu, IAEA; A model for improving performance, by Pet Karns, MRO Software; ASME codes and standards, by Shannon Burke, ASME International; and Refurbishment programs, by Craig S. Irish, Nuclear Logistics, Inc.

  16. Outlook for Refinery Outages and Available Refinery Capacity...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    of gasoline and distillate, and to include a more detailed consideration of the impact of unexpected outages on product supplies. This report reviews the potential...

  17. Homeowners: Respond to Power Outages | Department of Energy

    Broader source: Energy.gov (indexed) [DOE]

    Power Outages After a disaster, electric utilities and government officials will first work to restore power to critical infrastructure like power plants and transmission lines, ...

  18. A Review of Power Outages and Restoration Following the June...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    August 2012 A Review of Power Outages and Restoration Following the June 2012 Derecho Infrastructure Security and Energy Restoration Office of Electricity Delivery and Energy ...

  19. Outlook for Refinery Outages and Available Refinery Capacity...

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    level of refinery outages outlined in this report. This report does not consider the impacts of refined product logistics and distribution, which could affect the movement of...

  20. SAS Output

    U.S. Energy Information Administration (EIA) Indexed Site

    Table B. U.S. Transformer Sustained Automatic Outage Counts and Hours by High-Voltage Size and NERC Region, 2013. Column headings: Sustained Automatic Outage Counts, High-Side Voltage (kV), Eastern...

  1. Development of Improved Graphical Displays for an Advanced Outage Control Center, Employing Human Factors Principles for Outage Schedule Management

    SciTech Connect (OSTI)

    St Germain, Shawn Walter; Farris, Ronald Keith; Thomas, Kenneth David

    2015-09-01

    The long-term viability of existing nuclear power plants in the United States (U.S.) is dependent upon a number of factors, including maintaining high capacity factors, maintaining nuclear safety, and reducing operating costs, particularly those associated with refueling outages. Refueling outages typically take 20-30 days, and for existing light water NPPs in the U.S., the reactor cannot be in operation during the outage. Furthermore, given that many NPPs generate between $1-1.5 million/day in revenue when in operation, there is considerable interest in shortening the length of refueling outages. Yet refueling outages are highly complex operations, involving multiple concurrent and dependent activities that are challenging to coordinate; therefore, finding ways to improve refueling outage performance while maintaining nuclear safety has proven to be difficult. The Advanced Outage Control Center (AOCC) project is a research and development (R&D) demonstration activity under the LWRS Program. LWRS is an R&D program that works closely with industry R&D programs to establish technical foundations for the licensing and managing of long-term, safe, and economical operation of the current fleet of NPPs. As such, the LWRS Advanced Outage Control Center project has the goal of improving the management of commercial NPP refueling outages. To accomplish this goal, INL is developing an advanced outage control center (OCC) that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. The overall focus is on developing an AOCC with capabilities that enable plant and OCC staff to: collaborate in real time to address emergent issues; effectively communicate outage status to all workers involved in the outage; effectively communicate discovered conditions in the field to the OCC; provide real-time work status; and provide automatic pending support notifications

  2. Fermi 2: Independent safety assessment of refueling outage

    SciTech Connect (OSTI)

    Arora, H.O. [Detroit Edison, MI (United States)]

    1994-12-31

    Industry experience and studies conducted by the U.S. Nuclear Regulatory Commission (NRC) have shown that plants are susceptible to a variety of events that can challenge safety during shutdowns. While these events have neither posed nor indicated an undue risk to public health and safety, they do serve to underscore the importance of effective outage planning and control. The NUMARC 91-06 guidelines suggest that proper planning and execution of outage activities can reduce the likelihood and consequences of events, which ultimately enhances safety during shutdown. The Fermi 2 Independent Safety Engineering Group (ISEG) is charged with the independent safety review of the refueling outage plan and its implementation. The ISEG is responsible for performing a detailed and critical review of the proposed outage plan prior to the start of the outage, maintaining surveillance of the adequacy and consistency of the "defense-in-depth" provided during the outage, reviewing outage plan changes for potential vulnerabilities that could affect safety functions, and investigating selected events that emerge during the course of the outage.

  3. Analysis of scrams and forced outages at boiling water reactors

    SciTech Connect (OSTI)

    Earle, R. T.; Sullivan, W. P.; Miller, K. R.; Schwegman, W. J.

    1980-07-01

    This report documents the results of a study of scrams and forced outages at General Electric Boiling Water Reactors (BWRs) operating in the United States. This study was conducted for Sandia Laboratories under a Light Water Reactor Safety Program which it manages for the United States Department of Energy. Operating plant data were used to identify the causes of scrams and forced outages. Causes of scrams and forced outages have been summarized as a function of operating plant and plant age and also ranked according to the number of events per year, outage time per year, and outage time per event. From this ranking, identified potential improvement opportunities were evaluated to determine the associated benefits and impact on plant availability.
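
    The three rankings described above (events per year, outage time per year, and outage time per event) are straightforward to reproduce once cause-level data are tabulated. A small Python sketch follows; the cause categories and numbers are invented, not BWR operating data.

        # Hypothetical cause data illustrating the three rankings used in the study.
        causes = {
            # cause: (events_per_year, outage_hours_per_year)
            "feedwater control":   (4.0, 120.0),
            "turbine/generator":   (1.5, 200.0),
            "instrument drift":    (6.0,  60.0),
            "offsite grid events": (0.8,  40.0),
        }

        metrics = {
            "events/year":       lambda ev, hrs: ev,
            "outage hours/year": lambda ev, hrs: hrs,
            "hours/event":       lambda ev, hrs: hrs / ev,
        }

        for name, metric in metrics.items():
            ranked = sorted(causes, key=lambda c: metric(*causes[c]), reverse=True)
            print(f"Ranked by {name}: {ranked}")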

  4. A stochastic model for the measurement of electricity outage costs

    SciTech Connect (OSTI)

    Grosfeld-Nir, A.; Tishler, A. (Tel Aviv Univ. (Israel))

    1993-01-01

    The measurement of customer outage costs has recently become an important subject of research for electric utilities. This paper uses a stochastic dynamic model as the starting point in developing a market-based method for the evaluation of outage costs. Specifically, the model postulates that once an electricity outage occurs, all production activity stops. Full production is resumed once the electricity outage is over. This process repeats itself indefinitely. The business customer maximizes his expected discounted profits (the expected value of the firm), taking into account his limited ability to respond to repeated random electricity outages. The model is applied to 11 industrial branches in Israel. The estimates exhibit a large variation across branches. 34 refs., 3 tabs.
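
    A minimal sketch of this kind of model, under assumptions the paper does not necessarily make: outages arrive at a Poisson rate, repair times are exponential, the firm earns profit at a constant rate only while power is on, and future profit is discounted at a constant rate. Under those assumptions the expected discounted value of the firm has a simple closed form, and the outage cost can be read off as the value lost relative to perfectly reliable supply. All parameter values below are invented.

        # Alternating on/off (outage/repair) process with exponential durations:
        #   Value while "on":  V_on  = pi/(r+lam) + lam/(r+lam) * V_off
        #   Value while "off": V_off = mu/(r+mu)  * V_on
        # Solving the pair:    V_on  = pi*(r+mu) / (r*(r+lam+mu))

        def firm_value_on(pi, r, lam, mu):
            """Expected discounted value of the firm, starting with power on."""
            return pi * (r + mu) / (r * (r + lam + mu))

        pi, r = 1_000_000.0, 0.08          # profit rate ($/yr) and discount rate
        lam, mu = 2.0, 365.0 / 4.0         # 2 outages/yr, mean outage ~4 days

        v_reliable = pi / r                # value with perfectly reliable supply
        v_actual = firm_value_on(pi, r, lam, mu)
        print(f"Value with no outages: ${v_reliable:,.0f}")
        print(f"Value with outages:    ${v_actual:,.0f}")
        print(f"Implied outage cost:   ${v_reliable - v_actual:,.0f}")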

  5. Technology Integration Initiative In Support of Outage Management

    SciTech Connect (OSTI)

    Gregory Weatherby; David Gertman

    2012-07-01

    Plant outage management is a high priority concern for the nuclear industry from cost and safety perspectives. Often, command and control during outages is maintained in the outage control center, where many of the underlying technologies supporting outage control are the same as those used in the 1980’s. This research reports on the use of advanced integrating software technologies and hand held mobile devices as a means by which to reduce cycle time, improve accuracy, and enhance transparency among outage team members. This paper reports on the first phase of research supported by the DOE Light Water Reactor Sustainability (LWRS) Program that is performed in close collaboration with industry to examine the introduction of newly available technology allowing for safe and efficient outage performance. It is thought that this research will result in: improved resource management among various plant stakeholder groups, reduced paperwork, and enhanced overall situation awareness for the outage control center management team. Field data collection methods, including personnel interview data, success factors, end-user evaluation, and the integration of hand held devices into an integrated design, are also described. Finally, the necessity of obtaining operations cooperation and support in field studies and technology evaluation is acknowledged.

  6. U.S. - Canada Power System Outage Task Force: Final Report on...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    - Canada Power System Outage Task Force: Final Report on the Implementation of Task Force Recommendations U.S. - Canada Power System Outage Task Force: Final Report on the ...

  7. Preparing for a Power Outage | Department of Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    of September, FEMA's website Ready.gov will focus on a different emergency scenario and share tips on how to be prepared in case of floods, wildfires, hurricanes or power outages. ...

  8. SAMPLE RESULTS FROM MCU SOLIDS OUTAGE

    SciTech Connect (OSTI)

    Peters, T.; Washington, A.; Oji, L.; Coleman, C.; Poirier, M.

    2014-09-22

    Savannah River National Laboratory (SRNL) has received several solid and liquid samples from MCU in an effort to understand and recover from the system outage starting on April 6, 2014. SRNL concludes that the presence of solids in the Salt Solution Feed Tank (SSFT) is the likely root cause for the outage, based upon the following discoveries: a solids sample from extraction contactor #1 proved to be mostly sodium oxalate; a solids sample from scrub contactor #1 proved to be mostly sodium oxalate; a solids sample from the Salt Solution Feed Tank (SSFT) proved to be mostly sodium oxalate; an archived sample from Tank 49H taken last year was shown to contain a fine precipitate of sodium oxalate; a solids sample from the extraction contactor #1 drain pipe proved to be mostly sodium aluminosilicate; and a liquid sample from the SSFT was shown to have elevated levels of oxalate anion compared to the expected concentration in the feed. Visual inspection of the SSFT indicated the presence of precipitated or transferred solids, which were likely also in the Salt Solution Receipt Tank (SSRT). The presence of the solids, coupled with agitation performed to maintain feed temperature, resulted in oxalate solids migration through the MCU system and caused hydraulic issues that resulted in unplanned phase carryover from the extraction into the scrub, and ultimately the strip contactors. Not only did this carryover result in the Strip Effluent (SE) being pushed out of waste acceptance specification, but it resulted in the deposition of solids into several of the contactors. At the same time, extensive deposits of aluminosilicates were found in the drain tube in extraction contactor #1. However, it is not known at this time how the aluminosilicate solids are related to the oxalate solids. The solids were successfully cleaned out of the MCU system. However, future consideration must be given to the exclusion of oxalate solids from the MCU system

  9. Development of Methodologies for Technology Deployment for Advanced Outage Control Centers that Improve Outage Coordination, Problem Resolution and Outage Risk Management

    SciTech Connect (OSTI)

    Shawn St. Germain; Ronald Farris; Heather Medeman

    2013-09-01

    This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by the Department of Energy (DOE) and performed in close collaboration with industry R&D programs that provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The long term viability of existing nuclear power plants in the U.S. will depend upon maintaining high capacity factors, avoiding nuclear safety issues and reducing operating costs. The slow progress in the construction of new nuclear power plants has placed increased importance on maintaining the output of the current fleet of nuclear power plants. Recently expanded natural gas production has placed increased economic pressure on nuclear power plants due to lower cost competition. Until recently, power uprate projects had steadily increased the total output of the U.S. nuclear fleet. Errors made during power plant upgrade projects have now removed three nuclear power plants from the U.S. fleet and economic considerations have caused the permanent shutdown of a fourth plant. Additionally, several utilities have cancelled power uprate projects citing economic concerns. For the past several years net electrical generation from U.S. nuclear power plants has been declining. One of the few remaining areas where significant improvements in plant capacity factors can be made is in minimizing the duration of refueling outages. Managing nuclear power plant outages is a complex and difficult task. Due to the large number of complex tasks and the uncertainty that accompanies them, outage durations routinely exceed the planned duration. The ability to complete an outage on or near

  10. Multiplicity Counting

    SciTech Connect (OSTI)

    Geist, William H.

    2015-12-01

    This set of slides begins by giving background and a review of neutron counting; three attributes of a verification item are discussed: 240Pueff mass; α, the ratio of (α,n) neutrons to spontaneous fission neutrons; and leakage multiplication. It then takes up neutron detector systems – theory & concepts (coincidence counting, moderation, die-away time); detector systems – some important details (deadtime, corrections); introduction to multiplicity counting; multiplicity electronics and example distributions; singles, doubles, and triples from measured multiplicity distributions; and the point model: multiplicity mathematics.
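
    The "singles, doubles, and triples from measured multiplicity distributions" step amounts to taking reduced factorial moments of the recorded multiplicity histogram. The sketch below shows only that step, with an invented histogram, and deliberately omits the deadtime, gate-fraction and accidentals corrections that a real multiplicity analysis applies before the point-model equations are solved.

        # Reduced factorial moments of a measured multiplicity distribution; the
        # singles, doubles, and triples rates used in point-model analysis are
        # proportional to these moments (corrections omitted here).
        from math import comb

        # counts[n] = number of trigger events with n neutrons recorded in the gate
        counts = [52000, 31000, 11000, 2600, 480, 70, 8]        # invented histogram
        total = sum(counts)
        p = [c / total for c in counts]        # normalized multiplicity distribution

        singles = sum(n * p[n] for n in range(len(p)))            # 1st factorial moment
        doubles = sum(comb(n, 2) * p[n] for n in range(len(p)))   # 2nd reduced moment
        triples = sum(comb(n, 3) * p[n] for n in range(len(p)))   # 3rd reduced moment

        print(f"singles moment: {singles:.4f}")
        print(f"doubles moment: {doubles:.4f}")
        print(f"triples moment: {triples:.4f}")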

  11. Overview of Common Mode Outages in Power Systems

    SciTech Connect (OSTI)

    Papic, Milorad; Awodele, Kehinde; Billinton, Roy; Dent, Chris; Eager, Dan; Hamoud, Gomaa; Jirutitijaroen, Panida; Kumbale, Murali; Mitra, Joydeep; Samaan, Nader A.; Schneider, Alex; Singh, Chanan

    2012-11-10

    This paper is a result of ongoing activity carried out by the Probability Applications for Common Mode Events (PACME) Task Force under the Reliability Risk and Probability Applications (RRPA) Subcommittee. The paper is intended to constitute a valid source of information and references about dealing with common-mode outages in power systems reliability analysis. This effort involves reviewing published literature and presenting state-of-the-art research and practical applications in the area of common-mode outages. Evaluation of available outage statistics shows that there is a definite need for a collective effort from academia and industry not only to recommend procedures for data collection and monitoring but also to provide appropriate mathematical models to assess such events.

  12. Hopper compilers and DDT short outage next Wed, May 16

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    compilers and DDT short outage next Wed, May 16 Hopper compilers and DDT short outage next Wed, May 16 May 10, 2012 Due to a scheduled maintenance for the License Servers, most of the compilers (except GNU) and the DDT debugger on Hopper will not be available from 10:30 am to 12:30 pm on Wednesday, May 16. If there are any questions or concerns, please contact "consult at nersc dot gov".

  13. Hopper scheduled maintenance tomorrow (Sept 19) and /project outage

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    scheduled maintenance tomorrow (Sept 19) and /project outage Hopper scheduled maintenance tomorrow (Sept 19) and /project outage September 18, 2012 by Helen He There will be a scheduled hardware and software maintenance for Hopper next Wednesday, Sept 19, from 6:30 am to midnight Pacific time. Please plan your work accordingly and check the NERSC Message of the Day (MOTD) for status update: http://www.nersc.gov/live-status/motd/. The /project file system (also known as /global/project) will be

  14. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, synonymously termed Extreme Events. The implemented simulation method is currently confined to simulating steady state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide a new insight into bulk power transmission network planning that at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper proposes to demonstrate a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.
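
    A toy illustration of the critical-events-corridor idea: run (or collect) cascade sequences for several initiating events and look for outage subsequences that repeat across them. The cascade sequences and line names below are invented and have nothing to do with the California system model used in the paper.

        # Find consecutive outage pairs that repeat across cascades started by
        # different initiating events (candidate "corridors").
        from collections import Counter

        cascades = {  # initiating event -> observed sequence of subsequent line outages
            "E1": ["L7", "L3", "L9", "L12"],
            "E2": ["L3", "L9", "L12"],
            "E3": ["L5", "L3", "L9", "L8"],
        }

        pair_counts = Counter()
        for seq in cascades.values():
            for a, b in zip(seq, seq[1:]):        # consecutive outage pairs in the cascade
                pair_counts[(a, b)] += 1

        # Pairs seen in more than one cascade mark a candidate corridor that a
        # planner could try to "break" with targeted reinforcement.
        corridors = [(pair, n) for pair, n in pair_counts.items() if n > 1]
        print(corridors)   # -> [(('L3', 'L9'), 3), (('L9', 'L12'), 2)]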

  15. Notification of Planned 230kV Outage at Potomac River Generating...

    Energy Savers [EERE]

    Subject: Notification of Planned 230kV Outage at Potomac River Generating Station To ... The next planned outage on 23106 high voltage circuit between xxxxxxx xx xxxx and ...

  16. A framework and review of customer outage costs: Integration and analysis of electric utility outage cost surveys

    SciTech Connect (OSTI)

    Lawton, Leora; Sullivan, Michael; Van Liere, Kent; Katz, Aaron; Eto, Joseph

    2003-11-01

    A clear understanding of the monetary value that customers place on reliability and the factors that give rise to higher and lower values is an essential tool in determining investment in the grid. The recent National Transmission Grid Study recognizes the need for this information as one of growing importance for both public and private decision makers. In response, the U.S. Department of Energy has undertaken this study, as a first step toward addressing the current absence of consistent data needed to support better estimates of the economic value of electricity reliability. Twenty-four studies, conducted by eight electric utilities between 1989 and 2002 and representing residential and commercial/industrial (small, medium and large) customer groups, were chosen for analysis. The studies cover virtually all of the Southeast, most of the western United States, including California, rural Washington and Oregon, and the Midwest south and east of Chicago. All variables were standardized to a consistent metric and dollar amounts were adjusted to the 2002 CPI. The data were then incorporated into a meta-database in which each outage scenario (e.g., the loss of electric service for one hour on a weekday summer afternoon) is treated as an independent case or record, both to permit comparisons between outage characteristics and to increase the statistical power of analysis results. Unadjusted average outage costs and Tobit models that estimate customer damage functions are presented. The customer damage functions express customer outage costs for a given outage scenario and customer class as a function of location, time of day, consumption, and business type. One can use the damage functions to calculate outage costs for specific customer types. For example, using the customer damage functions, the cost experienced by an "average" customer resulting from a 1 hour summer afternoon outage is estimated to be approximately $3 for a residential customer, $1,200 for small
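
    A customer damage function of the general shape described above can be written as a small function of outage duration, timing, and customer size. The sketch below uses an invented log-linear form whose coefficients are calibrated only so that the two printed scenarios land near the $3 and $1,200 figures quoted in the abstract; it is not the study's Tobit model, and the consumption figures are also illustrative.

        import math

        def outage_cost(duration_h, annual_kwh, summer_afternoon, customer_class):
            """Hypothetical customer damage function (dollars per outage event)."""
            base = {"residential": -1.86, "small_ci": 3.31}[customer_class]
            return math.exp(
                base
                + 0.7 * math.log(max(duration_h, 0.25))   # cost grows less than linearly with duration
                + 0.3 * math.log(annual_kwh)              # larger customers lose more
                + 0.2 * summer_afternoon                  # timing effect (1 = summer weekday afternoon)
            )

        # One-hour summer weekday afternoon outage:
        print(f"residential customer: ${outage_cost(1, 10_000, 1, 'residential'):,.2f}")
        print(f"small C&I customer:   ${outage_cost(1, 150_000, 1, 'small_ci'):,.0f}")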

  17. Risk Assessment of Cascading Outages: Methodologies and Challenges

    SciTech Connect (OSTI)

    Vaiman, Marianna; Bell, Keith; Chen, Yousu; Chowdhury, Badrul; Dobson, Ian; Hines, Paul; Papic, Milorad; Miller, Stephen; Zhang, Pei

    2012-05-31

    Abstract- This paper is a result of ongoing activity carried out by Understanding, Prediction, Mitigation and Restoration of Cascading Failures Task Force under IEEE Computer Analytical Methods Subcommittee (CAMS). The task force's previous papers are focused on general aspects of cascading outages such as understanding, prediction, prevention and restoration from cascading failures. This is the first of two new papers, which extend this previous work to summarize the state of the art in cascading failure risk analysis methodologies and modeling tools. This paper is intended to be a reference document to summarize the state of the art in the methodologies for performing risk assessment of cascading outages caused by some initiating event(s). A risk assessment should cover the entire potential chain of cascades starting with the initiating event(s) and ending with some final condition(s). However, this is a difficult task and heuristic approaches and approximations have been suggested. This paper discusses different approaches to this and suggests directions for future development of methodologies. The second paper summarizes the state of the art in modeling tools for risk assessment of cascading outages.

  18. Survey of tools for risk assessment of cascading outages

    SciTech Connect (OSTI)

    Papic, Milorad; Bell, Keith; Chen, Yousu; Dobson, Ian; Fonte, Louis; Haq, Enamul; Hines, Paul; Kirschen, Daniel; Luo, Xiaochuan; Miller, Stephen; Samaan, Nader A.; Vaiman, Marianna; Varghese, Matthew; Zhang, Pei

    2011-10-01

    Abstract-This paper is a result of ongoing activity carried out by Understanding, Prediction, Mitigation and Restoration of Cascading Failures Task Force under IEEE Computer Analytical Methods Subcommittee (CAMS). The task force's previous papers [1, 2] are focused on general aspects of cascading outages such as understanding, prediction, prevention and restoration from cascading failures. This is the second of two new papers, which extend this previous work to summarize the state of the art in cascading failure risk analysis methodologies and modeling tools. The first paper reviews the state of the art in methodologies for performing risk assessment of potential cascading outages [3]. This paper describes the state of the art in cascading failure modeling tools, documenting the view of experts representing utilities, universities and consulting companies. The paper is intended to constitute a valid source of information and references about presently available tools that deal with prediction of cascading failure events. This effort involves reviewing published literature and other documentation from vendors, universities and research institutions. The assessment of cascading outage risk is in continuous evolution. Investigations to gain even better understanding and identification of cascading events are the subject of several research programs underway aimed at solving the complexity of these events that electrical utilities face today. Assessing the risk of cascading failure events in planning and operation of power transmission systems requires adequate mathematical tools/software.

  19. Design Concepts for an Outage Control Center Information Dashboard

    SciTech Connect (OSTI)

    Hugo, Jacques Victor; St Germain, Shawn Walter; Thompson, Cheradan Jo; Whitesides, McKenzie Jo; Farris, Ronald Keith

    2015-12-01

    The nuclear industry, and the business world in general, is facing a rapidly increasing amount of data to be dealt with on a daily basis. In the last two decades, the steady improvement of data storage devices and means to create and collect data along the way influenced the manner in which we deal with information. Most data is still stored without filtering and refinement for later use. Many functions at a nuclear power plant generate vast amounts of data, with scheduled and unscheduled outages being a prime example of a source of some of the most complex data sets at the plant. To make matters worse, modern information and communications technology is making it possible to collect and store data faster than our ability to use it for making decisions. However, in most applications, especially outages, raw data has no value in itself; instead, managers, engineers and other specialists want to extract the information contained in it. The complexity and sheer volume of data could lead to information overload, resulting in getting lost in data that may be irrelevant to the task at hand, processed in an inappropriate way, or presented in an ineffective way. To prevent information overload, many data sources are ignored so production opportunities are lost because utilities lack the ability to deal with the enormous data volumes properly. Decision-makers are often confronted with large amounts of disparate, conflicting and dynamic information, which are available from multiple heterogeneous sources. Information and communication technologies alone will not solve this problem. Utilities need effective methods to exploit and use the hidden opportunities and knowledge residing in unexplored data resources. Superior performance before, during and after outages depends upon the right information being available at the right time to the right people. Acquisition of raw data is the easy part; instead, it is the ability to use advanced analytical, data processing and data

  20. Study outlines why outages go long, short, or on-time

    SciTech Connect (OSTI)

    Not Available

    1993-09-01

    A recent report by a nuclear industry professional, based on a survey of outage managers at US nuclear power plants, declares that "preplanned outage schedules appear to be grossly inaccurate, and the outage management planners and schedulers do not have a grasp of the requirements and/or the resources needed to complete the actual activities on schedule." It declares that "the scheduled duration of a planned outage must be realistic." The study identifies personnel, planning and scheduling, and equipment/hardware as "the primary reasons why refueling outages and outage activities finished ahead of, right on, or behind schedule."

  1. Risk Assessment of Cascading Outages: Part I - Overview of Methodologies

    SciTech Connect (OSTI)

    Vaiman, Marianna; Bell, Keith; Chen, Yousu; Chowdhury, Badrul; Dobson, Ian; Hines, Paul; Papic, Milorad; Miller, Stephen; Zhang, Pei

    2011-07-31

    This paper is a result of ongoing activity carried out by Understanding, Prediction, Mitigation and Restoration of Cascading Failures Task Force under IEEE Computer Analytical Methods Subcommittee (CAMS). The task force's previous papers are focused on general aspects of cascading outages such as understanding, prediction, prevention and restoration from cascading failures. This is the first of two new papers, which will extend this previous work to summarize the state of the art in cascading failure risk analysis methodologies and modeling tools. This paper is intended to be a reference document to summarize the state of the art in the methodologies for performing risk assessment of cascading outages caused by some initiating event(s). A risk assessment should cover the entire potential chain of cascades starting with the initiating event(s) and ending with some final condition(s). However, this is a difficult task and heuristic approaches and approximations have been suggested. This paper discusses different approaches to this and suggests directions for future development of methodologies.

  2. Further Notice of 230kV Circuit Planned Outages | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Further Notice of 230kV Circuit Planned Outages. Docket No. EO-05-01, Order No. 202-05-03: Pursuant to the United States Department of Energy ("DOE") Order No. 202-05-3, issued December 20, 2005 ("DOE Potomac River Order"), Pepco hereby files this Further Notice of 230kV Circuit Planned Outages serving the Potomac River Substation, and through that station, the District of Columbia.

  3. A Study of Outage Management Practices at Selected U.S. Nuclear Plants

    SciTech Connect (OSTI)

    Lin, James C.

    2002-07-01

    This paper presents insights gained from a study of the outage management practices at a number of U.S. nuclear plants. The objective of the study was to conduct an in-depth review of the current practices of outage management at these selected plants and identify important factors that have contributed to the recent success of their outage performance. Two BWR-4, three BWR-6, and two 3-loop Westinghouse PWR plants were selected for this survey. The results of this study can be used to formulate outage improvement efforts for nuclear plants in other countries. (author)

  4. Status of U.S. Nuclear Outages - U.S. Energy Information Administratio...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Status of U.S. Nuclear Outages. Download. Link to: Nuclear Regulatory Commission's Power Reactor Status Report

  5. Procedures and equipment for shortening refueling outages in Babcock and Wilcox PWRs. Final report

    SciTech Connect (OSTI)

    Baker, H.A.; Carr, C.W.

    1985-04-01

    New refueling equipment and procedures - plus a software package bid specification for outage management - can reduce refueling outages in Babcock and Wilcox PWRs. At Duke Power Company's Oconee nuclear station, a single modification in the fuel-handling system cut 5 days off the refueling schedule.

  6. Quantitative evaluation of savings in outage costs by using emergency actions strategy

    SciTech Connect (OSTI)

    Akhtar, A.; Asuhaimi, A.; Shaibon, H. [Univ. Teknologi Malaysia, Johor Bharu (Malaysia); Lo, K.L. [Univ. of Strathclyde, Glasgow (United Kingdom)

    1995-12-31

    This paper presents the results of a study carried out to assess the savings in consumer outage costs that can be accrued as a result of implementing an Emergency Actions Strategy. The use of an Emergency Actions Strategy plays a significant role in curtailing the consumer outage costs ensuing from unreliable electric service. In order to calculate the savings in outage costs, the probabilistic framework of the frequency and duration method has been used in conjunction with emergency actions. First, the outage costs of various consumer sectors are estimated without considering the emergency actions. Second, the consumer outage costs are calculated by combining the frequency and duration method, and unserved energy, with the emergency actions invoked. The savings in consumer outage costs that can be accrued by utilizing an Emergency Actions Strategy are presented for a synthetic system. The results of the study show that substantial savings in consumer outage costs are obtained by devising and implementing an emergency actions strategy in situations of capacity outages. The results are of particular relevance and utility to underdeveloped and developing countries, where capacity shortages occur quite frequently. These results also suggest the importance of an emergency actions strategy for electric utilities in reducing the consumer economic losses arising from unreliable electric service.
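
    The abstract does not reproduce the cost model itself; the sketch below is only a hedged illustration of the kind of before/after comparison it describes, approximating each sector's outage cost as frequency times duration times unserved load times an assumed interruption cost rate. The event lists and cost rate are invented for illustration, not values from the paper.

    ```python
    # Illustrative sketch (not the paper's model): savings in consumer outage
    # costs from emergency actions, using a simple frequency-and-duration style
    # estimate of expected outage cost.

    def outage_cost(events, cost_rate_per_kwh):
        """events: list of (frequency per year, duration h, unserved load kW)."""
        return sum(f * d * load * cost_rate_per_kwh for f, d, load in events)

    # Hypothetical capacity-outage events without and with emergency actions
    # (emergency actions shorten the curtailments in this toy example).
    events_no_action   = [(2.0, 4.0, 500.0), (0.5, 8.0, 1200.0)]
    events_with_action = [(2.0, 1.5, 500.0), (0.5, 3.0, 1200.0)]

    COST_RATE = 3.5  # assumed $ per kWh of unserved energy for one consumer sector

    savings = (outage_cost(events_no_action, COST_RATE)
               - outage_cost(events_with_action, COST_RATE))
    print(f"Estimated annual savings in outage cost: ${savings:,.0f}")
    ```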

  7. A Review of Power Outages and Restoration Following the June 2012 Derecho

    Broader source: Energy.gov [DOE]

    The Office of Electricity Delivery and Energy Reliability has released a report that reviews power outages and restoration efforts following the June 29, 2012 Derecho and compares them to outages and restoration efforts following other spring and summer storms in the Ohio Valley and Mid-Atlantic regions.

  8. Use of collaboration software to improve nuclear power plant outage management

    SciTech Connect (OSTI)

    Germain, Shawn

    2015-02-01

    Nuclear Power Plant (NPP) refueling outages create some of the most challenging activities the utilities face in both tracking and coordinating thousands of activities in a short period of time. Other challenges, including nuclear safety concerns arising from atypical system configurations and resource allocation issues, can create delays and schedule overruns, driving up outage costs. Today, the majority of outage communication is done using processes that do not take advantage of advances in modern technologies that enable enhanced communication, collaboration and information sharing. Some of the common practices include: runners that deliver paper-based requests for approval, radios, telephones, desktop computers, daily schedule printouts, and static whiteboards that are used to display information. Many gains have been made to reduce the challenges facing outage coordinators; however, new opportunities can be realized by utilizing modern technological advancements in communication and information tools that can enhance the collective situational awareness of plant personnel, leading to improved decision-making. Ongoing research as part of the Light Water Reactor Sustainability Program (LWRS) has been targeting NPP outage improvement. As part of this research, various applications of collaborative software have been demonstrated through pilot project utility partnerships. Collaboration software can be utilized as part of the larger concept of Computer-Supported Cooperative Work (CSCW). Collaborative software can be used for emergent issue resolution, Outage Control Center (OCC) displays, and schedule monitoring. Use of collaboration software enables outage staff and subject matter experts (SMEs) to view and update critical outage information from any location on site or off.

  9. Job Counting Guidelines

    Office of Environmental Management (EM)

    Definitions and Guidelines for Counting Monthly and Quarterly EM Recovery Act Full Time Equivalents (FTEs) and Cumulative Head-Count The following updated definitions and...

  10. Smart Grid Week: Hurricane Season and the Department’s Efforts to Make the Grid More Resilient to Power Outages

    Broader source: Energy.gov [DOE]

    Next up in our Smart Grid Week series -- improving electric grid technologies to adequately prepare for emergencies with power outages.

  11. Notice of Unplanned Outage at the Mirant Potomac River Plant | Department

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Notice of Unplanned Outage at the Mirant Potomac River Plant. Docket No. EO-05-01, Order No. 202-05-03: Pursuant to the United States Department of Energy ("DOE") Order No. 202-05-3, issued December 20, 2005 ("DOE Potomac River Order"), Pepco hereby files this notice of an unplanned outage of one of the 230kV circuits serving the Potomac River Substation, and through that station, the District of Columbia.

  12. Notification of Planned 230kV Outage at Potomac River Generating Station |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    The next planned outage on xxxxx high voltage circuit between Palmers Corner Substation and the Potomac River Generating Station is scheduled for Sunday, June 3, 2007 and will begin at 4:00 AM, with a scheduled return date of Saturday, June 9, 2007 at 2:00 PM. Notification of Planned 230kV Outage at Potomac River Generating Station (34.76 KB)

  13. Hoboken Hopes To Reduce Power Outages With New 'Smart Grid' System

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... HOBOKEN, N.J. (CBSNewYork) - Officials are hoping to reduce power outages during future storms in Hoboken by designing a "smart grid" system using military-inspired technology. The ...

  14. Notification of Planned 230kV Outage at Potomac River Generating Station

    Office of Energy Efficiency and Renewable Energy (EERE)

    Docket No. EO-05-01. The next planned outage on xxxxx high voltage circuit between xxxxx and xxxxx is tentatively scheduled for Saturday May 19, 2007 and will begin at 4:00 AM with a scheduled...

  15. Notification of Planned 230kV Outage at Potomac River Generating...

    Office of Environmental Management (EM)

    Re: Potomac River Generating Station. Department of Energy, Case No. EO-05-01: Potomac Electric Power Company (PEPCO) revised plan for transmission outages for the 230 kV circuits ...

  16. Notification of Planned 230kV Outage at Potomac River Generating...

    Energy Savers [EERE]

    Re: Potomac River Generating Station Department of Energy Case No. EO-05-01: Advanced Notice of Power Outages. Special Environmental Analysis For Actions Taken under U.S. ...

  17. Pepco Update on Current Construction Work and Mirant Generation Needs for Pepco's Planned June Line Outage

    Office of Energy Efficiency and Renewable Energy (EERE)

    Docket No. EO-05-01.  Pepco needs the following to occur to provide necessary reliability to the central D.C. area during this scheduled June outage in order to complete installation of new...

  18. U.S. - Canada Power System Outage Task Force: Final Report on the

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    U.S. - Canada Power System Outage Task Force: Final Report on the Implementation of Task Force Recommendations. On August 14, 2003, the largest power blackout in North American history affected an area with an estimated 50 million people and 61,800 megawatts (MW) of electric load in the states of Ohio, Michigan,

  19. OTRA-THS MAC to reduce Power Outage Data Collection Latency in a smart meter network

    SciTech Connect (OSTI)

    Garlapati, Shravan K; Kuruganti, Phani Teja; Buehrer, Richard M; Reed, Jeffrey H

    2014-01-01

    The deployment of advanced metering infrastructure by electric utilities poses unique communication challenges, particularly as the number of meters per aggregator increases. During a power outage, a smart meter tries to report it instantaneously to the electric utility. In a densely populated residential/industrial locality, it is possible that a large number of smart meters simultaneously try to access the communication network to report the power outage. If the number of smart meters is very high, on the order of tens of thousands (as in metropolitan areas), the power outage data flooding can lead to Random Access CHannel (RACH) congestion. Several utilities are considering the use of cellular networks for smart meter communications. In 3G/4G cellular networks, RACH congestion not only leads to collisions, retransmissions and increased RACH delays, but also has the potential to disrupt dedicated traffic flow by increasing interference levels (3G CDMA). In order to overcome this problem, in this paper we propose a Time Hierarchical Scheme (THS) that reduces the intensity of power outage data flooding and the power outage reporting delay by 6/7th and 17/18th, respectively, compared to their values without THS. We also propose an Optimum Transmission Rate Adaptive (OTRA) MAC to optimize the latency of power outage data collection. The analysis and simulation results presented in this paper show that the OTRA and THS features of the proposed MAC together result in a Power Outage Data Collection Latency (PODCL) that is 1/10th of the 4G LTE PODCL.
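
    As a quick, purely arithmetic illustration of the reduction factors quoted above (the baseline values are invented, not taken from the paper):

    ```python
    # Applying the abstract's quoted reductions ("by 6/7th" and "by 17/18th")
    # to hypothetical baseline values for a dense smart meter deployment.
    baseline_flooding = 10000.0  # assumed simultaneous outage reports/s without THS
    baseline_delay_s  = 90.0     # assumed outage reporting delay without THS, seconds

    flooding_with_ths = baseline_flooding * (1.0 - 6.0 / 7.0)    # reduced by 6/7th
    delay_with_ths    = baseline_delay_s  * (1.0 - 17.0 / 18.0)  # reduced by 17/18th

    print(f"Flooding intensity with THS: {flooding_with_ths:.0f} reports/s")
    print(f"Reporting delay with THS:    {delay_with_ths:.1f} s")
    ```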

  20. Status Report on the Development of Micro-Scheduling Software for the Advanced Outage Control Center Project

    SciTech Connect (OSTI)

    Shawn St. Germain; Kenneth Thomas; Ronald Farris; Jeffrey Joe

    2014-09-01

    The long-term viability of existing nuclear power plants (NPPs) in the United States (U.S.) is dependent upon a number of factors, including maintaining high capacity factors, maintaining nuclear safety, and reducing operating costs, particularly those associated with refueling outages. Refueling outages typically take 20-30 days, and for existing light water NPPs in the U.S., the reactor cannot be in operation during the outage. Furthermore, given that many NPPs generate between $1-1.5 million/day in revenue when in operation, there is considerable interest in shortening the length of refueling outages. Yet, refueling outages are highly complex operations, involving multiple concurrent and dependent activities that are difficult to coordinate. Finding ways to improve refueling outage performance while maintaining nuclear safety has proven to be difficult. The Advanced Outage Control Center project is a research and development (R&D) demonstration activity under the Light Water Reactor Sustainability (LWRS) Program. LWRS is an R&D program which works with industry R&D programs to establish technical foundations for the licensing and managing of long-term, safe, and economical operation of current NPPs. The Advanced Outage Control Center project has the goal of improving the management of commercial NPP refueling outages. To accomplish this goal, this INL R&D project is developing an advanced outage control center (OCC) that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report describes specific recent efforts to develop a capability called outage Micro-Scheduling. Micro-Scheduling is the ability to allocate and schedule outage support task resources on a sub-hour basis. Micro-Scheduling is the real-time fine-tuning of the outage schedule to react to the actual progress of the primary outage activities to ensure that support task resources are

  1. Evaluation of the marginal outage costs in interconnected and composite power systems

    SciTech Connect (OSTI)

    Ghajar, R.; Billinton, R.

    1995-05-01

    The structure of electric utilities is undergoing dramatic changes as new and expanded service options are added. The concepts of unbundling the electric service and offering customers a range of new services that more closely track actual costs are expanding the options open to customers. Spot pricing provides the economic structure for many of these new service options. An important component of spot prices is the marginal outage cost incurred by customers due to an incremental change in load. This paper presents a formalized approach to calculating the marginal outage cost in interconnected generating systems and composite generation and transmission systems using quantitative reliability techniques. The effects of selected pertinent factors on the marginal outage cost in composite systems are also presented. The proposed methods are illustrated by application to the IEEE Reliability Test System (IEEE-RTS).
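
    The abstract does not give the underlying reliability model; the following is a minimal sketch, under common generating-capacity reliability assumptions rather than the authors' actual method, of how a marginal outage cost could be approximated: enumerate a small capacity-outage table, compute expected energy not supplied (EENS) at a load level, price it with an assumed interrupted energy assessment rate (IEAR), and take a finite difference with respect to load. The unit data, IEAR, and load level are hypothetical.

    ```python
    # Rough sketch (not the authors' method): marginal outage cost approximated
    # as the finite difference of expected outage cost with respect to load.
    from itertools import product

    UNITS = [(200.0, 0.02), (200.0, 0.02), (300.0, 0.04)]  # (capacity MW, forced outage rate)
    IEAR = 4.0       # assumed $ per kWh of unserved energy
    HOURS = 8760.0   # exposure period, hours

    def eens(load_mw):
        """Expected energy not supplied (MWh) over the period for a constant load."""
        total = 0.0
        for states in product([0, 1], repeat=len(UNITS)):   # 1 = unit on forced outage
            prob, cap = 1.0, 0.0
            for (c, q), down in zip(UNITS, states):
                prob *= q if down else (1.0 - q)
                cap += 0.0 if down else c
            total += prob * max(load_mw - cap, 0.0) * HOURS
        return total

    def marginal_outage_cost(load_mw, dl=1.0):
        """Approximate $ per additional MW of constant load carried over the period."""
        return (eens(load_mw + dl) - eens(load_mw)) * 1000.0 * IEAR / dl

    print(f"Marginal outage cost at 550 MW: ${marginal_outage_cost(550.0):,.0f} per MW-year")
    ```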

  2. Nuclear Safety Risk Management in Refueling Outage of Qinshan Nuclear Power Plant

    SciTech Connect (OSTI)

    Meijing Wu; Guozhang Shen

    2006-07-01

    During a refueling outage, an NPP plans maintenance, in-service inspection, surveillance testing, fuel handling and design modifications; at the same time, operator response capability is reduced because some plant systems are out of service or without power. Based on the experience of eight refueling outages at the Qinshan NPP, this article provides good practices and lessons learned for nuclear safety risk management, focused on four safety function areas: residual heat removal capability, inventory control, power availability and reactivity control. (authors)

  3. Pepco Update on Current Construction Work and Mirant Generation Needs for Pepco's Planned June Line Outage

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    May 25, 2007 Kevin Kolevar Director of the Office of Electricity Deliverability and Energy Reliability Department of Energy 1000 Independence Ave., SW Washington, DC 20585 Dear Mr. Kolevar, DOE has requested that Pepco provide an update on the current work to install two new 230 kilovolt circuits into Potomac River substation and to evaluate the need for generation from the Potomac River plant to support the anticipated line outage during June, 2007. An outage on one of the 230 kV circuits is

  4. Application of Hybrid Geo-Spatially Granular Fragility Curves to Improve Power Outage Predictions

    SciTech Connect (OSTI)

    Fernandez, Steven J; Allen, Melissa R; Omitaomu, Olufemi A; Walker, Kimberly A

    2014-01-01

    Fragility curves depict the relationship between a weather variable (wind speed, gust speed, ice accumulation, precipitation rate) and the observed outages for a targeted infrastructure network. This paper describes an empirical study of the county-by-county distribution of power outages and one-minute weather variables during Hurricane Irene, with the objectives of comparing 1) as-built fragility curves (statistical approach) to engineering as-designed (bottom-up) fragility curves for skill in forecasting outages during future hurricanes; 2) county-specific fragility curves to find examples of significant deviation from average behavior; and 3) the engineering practices of outlier counties to suggest future engineering studies of robustness. Outages in more than 90% of the impacted counties could be anticipated through an average or generic fragility curve. The remaining counties could be identified and handled as exceptions through geographic data sets. The counties with increased or decreased robustness were characterized by terrain more or less susceptible to persistent flooding in areas where above-ground poles located their foundations. Land use characteristics of the area served by the power distribution system can suggest trends in as-built power grid vulnerabilities to extreme weather events that would be subjects for site-specific studies.
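
    The paper's statistical procedure is not spelled out in the abstract; below is a minimal sketch, using assumed data and an assumed logistic functional form, of fitting an empirical "as-built" fragility curve relating peak wind gust to the fraction of customers out in a county.

    ```python
    # Minimal sketch (assumed data and model, not the paper's method): fit a
    # logistic "as-built" fragility curve, outage fraction vs. peak wind gust,
    # by simple least-squares gradient descent.
    import math

    # Hypothetical county observations: (peak gust m/s, fraction of customers out)
    obs = [(10, 0.01), (15, 0.03), (20, 0.10), (25, 0.30), (30, 0.55), (35, 0.80), (40, 0.92)]

    def logistic(v, a, b):
        return 1.0 / (1.0 + math.exp(-(a * v + b)))

    a, b, lr = 0.1, -3.0, 1e-3
    for _ in range(20000):
        grad_a = grad_b = 0.0
        for v, y in obs:
            p = logistic(v, a, b)
            err = p - y
            grad_a += err * p * (1.0 - p) * v
            grad_b += err * p * (1.0 - p)
        a -= lr * grad_a
        b -= lr * grad_b

    print(f"Fitted fragility curve: P(outage) = 1/(1+exp(-({a:.3f}*gust{b:+.3f})))")
    print("Predicted outage fraction at 28 m/s gust:", round(logistic(28, a, b), 2))
    ```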

  5. Blackout 2003: Energy Secretary Bodman and Minister of Natural Resources for Canada Lunn Release the 2003 Power Outage Final Report, October 3, 2006

    Broader source: Energy.gov [DOE]

    Energy Secretary Bodman and Minister of Natural Resources for Canada Lunn Release the 2003 Power Outage Final Report. In accordance with the mandate of the U.S.-Canada Power Outage Task Force, the...

  6. Fast counting electronics for neutron coincidence counting

    DOE Patents [OSTI]

    Swansen, J.E.

    1985-03-05

    An amplifier-discriminator is tailored to output a very short pulse upon an above-threshold input from a detector which may be a ³He detector. The short pulse output is stretched and energizes a light emitting diode (LED) to provide a visual output of operation and pulse detection. The short pulse is further fed to a digital section for processing and possible ORing with other like generated pulses. Finally, the output (or ORed output) is fed to a derandomizing buffer which converts the rapidly and randomly occurring pulses into synchronized and periodically spaced-apart pulses for the accurate counting thereof. Provision is also made for the internal and external disabling of each individual channel of amplifier-discriminators in an ORed plurality of same.

  7. Fast counting electronics for neutron coincidence counting

    DOE Patents [OSTI]

    Swansen, James E.

    1987-01-01

    An amplifier-discriminator is tailored to output a very short pulse upon an above-threshold input from a detector which may be a ³He detector. The short pulse output is stretched and energizes a light emitting diode (LED) to provide a visual output of operation and pulse detection. The short pulse is further fed to a digital section for processing and possible ORing with other like generated pulses. Finally, the output (or ORed output) is fed to a derandomizing buffer which converts the rapidly and randomly occurring pulses into synchronized and periodically spaced-apart pulses for the accurate counting thereof. Provision is also made for the internal and external disabling of each individual channel of amplifier-discriminators in an ORed plurality of same.

  8. LOW ENERGY COUNTING CHAMBERS

    DOE Patents [OSTI]

    Hayes, P.M.

    1960-02-16

    A beta particle counter adapted to use an end window made of polyethylene terephthalate was designed. The extreme thinness of the film results in a correspondingly high transmission of incident low-energy beta particles by the window. As a consequence, the counting efficiency of the present counter is over 40% greater than counters using conventional mica end windows.

  9. Recent Performance of and Plasma Outage Studies with the SNS H- Source

    SciTech Connect (OSTI)

    Stockli, Martin P; Han, Baoxi; Murray Jr, S N; Pennisi, Terry R; Piller, Chip; Santana, Manuel; Welton, Robert F

    2016-01-01

    SNS ramps to higher power levels that can be sustained with high availability. The goal is 1.4 MW despite a compromised RFQ, which requires higher RF power than design levels to approach the nominal beam transmission. Unfortunately at higher power the RFQ often loses its thermal stability, a problem apparently enhanced by beam losses and high influxes of hydrogen. Delivering as much H- beam as possible with the least amount of hydrogen led to plasma outages. The root cause is the dense 1-ms long ~55-kW 2-MHz plasma pulses reflecting ~90% of the continuous ~300W, 13-MHz power, which was mitigated with a 4-ms filter for the reflected power signal and an outage resistant, slightly-detuned 13-MHz match. Lowering the H2 also increased the H- beam current to ~55 mA, and increased the transmission by ~7%.

  10. Method for estimating power outages and restoration during natural and man-made events

    DOE Patents [OSTI]

    Omitaomu, Olufemi A.; Fernandez, Steven J.

    2016-01-05

    A method of modeling electric supply and demand with a data processor in combination with a recordable medium, and for estimating spatial distribution of electric power outages and affected populations. A geographic area is divided into cells to form a matrix. Within the matrix, supply cells are identified as containing electric substations and demand cells are identified as including electricity customers. Demand cells of the matrix are associated with the supply cells as a function of the capacity of each of the supply cells and the proximity and/or electricity demand of each of the demand cells. The method includes estimating a power outage by applying disaster event prediction information to the matrix, and estimating power restoration using the supply and demand cell information of the matrix and standardized and historical restoration information.
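
    Paraphrasing the claim language, a toy version of the cell-association step might look like the following; the grid, substation capacities, and the capacity-over-distance weighting rule are assumptions for illustration, not the patented method itself.

    ```python
    # Toy sketch of the cell-association idea (assumed weighting rule): assign
    # each demand cell to the supply cell (substation) with the highest
    # capacity/distance score, then estimate customers out when given supply
    # cells are disabled by a disaster event.
    import math

    supply = {"subA": {"xy": (2, 2), "capacity": 50.0},
              "subB": {"xy": (8, 7), "capacity": 30.0}}
    demand = {(x, y): 100 + 10 * x for x in range(10) for y in range(10)}  # customers per cell

    def score(cell, sub):
        d = math.dist(cell, sub["xy"]) + 0.1          # offset avoids divide-by-zero
        return sub["capacity"] / d

    assignment = {cell: max(supply, key=lambda s: score(cell, supply[s])) for cell in demand}

    def customers_out(disabled_substations):
        return sum(n for cell, n in demand.items() if assignment[cell] in disabled_substations)

    print("Customers affected if subA is lost:", customers_out({"subA"}))
    ```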

  11. Final Remediation Report for the K-Area Bingham Pump Outage Pit (643-1G)

    SciTech Connect (OSTI)

    Morganstern, M.

    2002-06-18

    The K-Area Bingham Pump Outage Pit (K BPOP) Building Number 643-1G, is situated immediately south and outside the K-Reactor fence line and is approximately 400 feet in length and 60 feet in width. For the K BPOP operable unit, the Land Use Control (LUC) objectives are to prevent contact, removal, or excavation of buried waste in the area and to preclude residential use of the area.

  12. Energy Secretary Bodman and Minister of Natural Resources for Canada Lunn Release the 2003 Power Outage Final Report

    Broader source: Energy.gov [DOE]

    WASHINGTON, D.C. - U.S. Department of Energy Secretary Samuel W. Bodman and Minister of Natural Resources for Canada Gary Lunn, today released the final report on the power outage that affected 50...

  13. Energy Secretary Bodman and Minister of Natural Resources for Canada Lunn Release the 2003 Power Outage Final Report

    Broader source: Energy.gov [DOE]

    U.S. Department of Energy Secretary Samuel W. Bodman and Minister of Natural Resources for Canada Gary Lunn, today released the final report on the power outage that affected 50 million North Americans in August 2003.

  14. Methodology to predict the number of forced outages due to creep failure

    SciTech Connect (OSTI)

    Palermo, J.V. Jr.

    1996-12-31

    All alloy metals at temperatures above 950 degrees Fahrenheit experience creep damage. Creep failures in boiler tubes usually begin after 25 to 40 years of operation. Since creep damage is irreversible, the only remedy is to replace the tube sections. By predicting the number of failures per year, the utility can make the best economic decision concerning tube replacement. This paper describes a methodology to calculate the number of forced outages per year due to creep failures. This methodology is particularly useful to utilities whose boilers have at least 25 years of operation.
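
    The abstract does not state the underlying failure model; one common way to frame such an estimate (an assumption here, not necessarily the paper's approach) is a Weibull time-to-failure model for tube sections, with the expected forced outages in a given service year taken as the number of surviving sections times the conditional probability of failure during that year.

    ```python
    # Hedged sketch (assumed Weibull model, not the paper's methodology):
    # expected creep-related forced outages per year for a tube population.
    import math

    BETA, ETA = 4.0, 45.0      # assumed Weibull shape and characteristic life (years)
    N_SECTIONS = 2000          # assumed tube sections in high-temperature service

    def weibull_cdf(t):
        return 1.0 - math.exp(-((t / ETA) ** BETA))

    def expected_failures(year):
        """Expected failures during service year `year`, given survival to its start."""
        f_start, f_end = weibull_cdf(year - 1), weibull_cdf(year)
        cond_prob = (f_end - f_start) / (1.0 - f_start)
        return N_SECTIONS * cond_prob

    for yr in (25, 30, 35, 40):
        print(f"Year {yr}: ~{expected_failures(yr):.1f} expected forced outages")
    ```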

  15. Analytical Tools to Predict Distribution Outage Restoration Load. Final Project Report.

    SciTech Connect (OSTI)

    Law, John

    1994-11-14

    The main activity of this project has been twofold: (1) development of a computer model to predict CLPU (Cold Load Pickup) and (2) development of a field measurement and analysis method to obtain the input parameters of the CLPU model. The field measurement and analysis method is called the Step-Voltage-Test (STEPV). The Kootenai Electric Cooperative Appleway 51 feeder in Coeur d'Alene was selected for analysis in this project, and STEPV tests were performed in the winters of 1992 and 1993. The STEPV data was analyzed (method and results presented within this report) to obtain the Appleway 51 feeder parameters for prediction by the CLPU model. Only one CLPU record was obtained in winter 1994. Unfortunately, the actual CLPU was not dramatic (short outage and moderate temperature) and did not display cyclic restoration current. A predicted Appleway 51 feeder CLPU was generated using the parameters obtained via the STEPV measurement/analysis/algorithm method at the same ambient temperature and outage duration as the measured actual CLPU. The predicted CLPU corresponds reasonably well with the single actual CLPU record obtained in winter 1994 on the Appleway 51 feeder.
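
    The report's model details are not given in the abstract; a widely used functional form for cold load pickup (an assumption here, not necessarily the project's model) is a restoration demand that decays exponentially from an undiversified level back to the diversified pre-outage level.

    ```python
    # Illustrative CLPU sketch (assumed exponential-decay form, not necessarily
    # the project's model): feeder demand after restoration following an outage.
    import math

    S_DIV = 4.0      # assumed diversified (pre-outage) feeder demand, MW
    S_UNDIV = 9.0    # assumed undiversified demand at the moment of restoration, MW
    TAU_MIN = 25.0   # assumed decay time constant, minutes

    def clpu_demand(t_min):
        """Feeder demand t_min minutes after restoration."""
        return S_DIV + (S_UNDIV - S_DIV) * math.exp(-t_min / TAU_MIN)

    for t in (0, 10, 30, 60, 120):
        print(f"t = {t:3d} min: {clpu_demand(t):.2f} MW")
    ```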

  16. Reducing Duration of Refueling Outage by Optimizing Core Design and Shuffling Sequence

    SciTech Connect (OSTI)

    Wakker, P.H.; Verhagen, F.C.M.; Bloois, J.T. van; Sutton, W.R. III

    2005-07-15

    Reducing the duration of a refueling outage is possible by optimizing the core design and the shuffling sequence. For both options, software tools have been developed and applied to the three most recent cycles of the Borssele plant in the Netherlands. Applicability of the shuffling sequence optimization to boiling water reactors has been demonstrated by a comparison to a recent shuffle plan used in the Hatch plant located in the United States. Their use has shown that both core design and shuffling sequence optimization can be exploited to reduce the time needed for reloading a core with an in-core shuffling scheme. Ex-core shuffling schemes for pressurized water reactors can still derive substantial benefit from a core design using a minimized number of insert shuffles.

  17. Understanding the Benefits of Dispersed Grid-Connected Photovoltaics: From Avoiding the Next Major Outage to Taming Wholesale Power Markets

    SciTech Connect (OSTI)

    Letendre, Steven E.; Perez, Richard

    2006-07-15

    Thanks to new solar resource assessment techniques using cloud cover data available from geostationary satellites, it is apparent that grid-connected PV installations can serve to enhance electric grid reliability, preventing or hastening recovery from major power outages and serving to mitigate extreme price spikes in wholesale energy markets. (author)

  18. Olkiluoto 1 and 2 - Plant efficiency improvement and lifetime extension-project (PELE) implemented during outages 2010 and 2011

    SciTech Connect (OSTI)

    Kosonen, M.; Hakola, M.

    2012-07-01

    Teollisuuden Voima Oyj (TVO) is a non-listed public company founded in 1969 to produce electricity for its stakeholders. TVO is the operator of the Olkiluoto nuclear power plant and follows the principle of continuous improvement in the operation and maintenance of the Olkiluoto plant units. The PELE project (Plant Efficiency Improvement and Lifetime Extension), mainly completed during the annual outages in 2010 and 2011, forms one part of the systematic development of the Olkiluoto units. TVO maintains a long-term development program that aims at systematically modernizing the plant unit systems and equipment based on the latest technology. According to the program, the Olkiluoto 1 and Olkiluoto 2 plant units are constantly renovated with the intention of keeping them safe and reliable. The aim of the modernization projects is to improve the safety, reliability, and performance of the plant units. The PELE project was carried out at Olkiluoto 1 in 2010 and at Olkiluoto 2 in 2011. The outage length at Olkiluoto 1 was 26 d 12 h 4 min and at Olkiluoto 2 was 28 d 23 h 46 min. (A normal service outage is about 14 days including refueling, and a refueling-only outage is about seven days. See figure 1.) The PELE project consisted of several single projects collected into one for coordinated project management. Some of the main projects were as follows: - Low pressure turbines: rotor, stator vane, casing and turbine instrumentation replacement. - Replacement of Condenser Cooling Water pumps (later called seawater pumps). - Replacement of inner isolation valves on the main steam lines. - Generator and generator cooling system replacement. - Low voltage switchgear replacement (this project will continue during future outages). PELE was a success: 100 TVO employees and 1500 subcontractor employees participated in the project, and execution went extremely well during the outages. The replacement of the low pressure turbines and seawater pumps improved the

  19. Property:EditCount | Open Energy Information

    Open Energy Info (EERE)

    EditCount Jump to: navigation, search Property Name EditCount Property Type Number Description Number of user edits. Pages using the property "EditCount" Showing 25 pages using...

  20. Plant Outage Time Savings Provided by Subcritical Physics Testing at Vogtle Unit 2

    SciTech Connect (OSTI)

    Cupp, Philip [Southern Nuclear Company (United States); Heibel, M.D. [Westinghouse Electric Company, LLC (United States)

    2006-07-01

    The most recent core reload design verification physics testing done at Southern Nuclear Company's (SNC) Vogtle Unit 2, performed prior to initial power operations in operating cycle 12, was successfully completed while the reactor was at least 1% ΔK/K subcritical. The testing program used was the first application of the Subcritical Physics Testing (SPT) program developed by the Westinghouse Electric Company LLC. The SPT program centers on the application of the Westinghouse Subcritical Rod Worth Measurement (SRWM) methodology that was developed in cooperation with the Vogtle Reactor Engineering staff. The SRWM methodology received U.S. Nuclear Regulatory Commission (NRC) approval in August of 2005. The first application of the SPT program occurred at Vogtle Unit 2 in October of 2005. The results of the core design verification measurements obtained during the SPT program demonstrated excellent agreement with prediction, demonstrating that the predicted core characteristics were in excellent agreement with the actual operating characteristics of the core. This paper presents an overview of the SPT Program used at Vogtle Unit 2 during operating cycle 12, and a discussion of the critical path outage time savings the SPT program is capable of providing. (authors)

  1. Risk-based evaluation of Allowed Outage Times (AOTs) considering risk of shutdown

    SciTech Connect (OSTI)

    Mankamo, T.; Kim, I.S.; Samanta, P.K.

    1992-12-31

    When safety systems fail during power operation, Technical Specifications (TS) usually limit the repair within the Allowed Outage Time (AOT). If the repair cannot be completed within the AOT, or no AOT is allowed, the plant is required to be shut down for the repair. However, if the capability to remove decay heat is degraded, shutting down the plant with the need to operate the affected decay-heat removal systems may impose a substantial risk compared to continued power operation over a usual repair time. Thus, defining a proper AOT in such situations can be considered as a risk comparison between the repair in the full power state, with a temporarily increased level of risk, and the alternative of shutting down the plant for the repair in the zero power state, with a specific associated risk. The methodology of the risk-comparison approach, with due consideration of the shutdown risk, has been further developed and applied to the AOT considerations of residual heat removal and standby service water systems of a boiling water reactor (BWR) plant. Based on the completed work, several improvements to the TS requirements for the systems studied can be suggested.

  2. Risk-based evaluation of Allowed Outage Times (AOTs) considering risk of shutdown

    SciTech Connect (OSTI)

    Mankamo, T. ); Kim, I.S.; Samanta, P.K. )

    1992-01-01

    When safety systems fail during power operation, Technical Specifications (TS) usually limit the repair within the Allowed Outage Time (AOT). If the repair cannot be completed within the AOT, or no AOT is allowed, the plant is required to be shut down for the repair. However, if the capability to remove decay heat is degraded, shutting down the plant with the need to operate the affected decay-heat removal systems may impose a substantial risk compared to continued power operation over a usual repair time. Thus, defining a proper AOT in such situations can be considered as a risk comparison between the repair in the full power state, with a temporarily increased level of risk, and the alternative of shutting down the plant for the repair in the zero power state, with a specific associated risk. The methodology of the risk-comparison approach, with due consideration of the shutdown risk, has been further developed and applied to the AOT considerations of residual heat removal and standby service water systems of a boiling water reactor (BWR) plant. Based on the completed work, several improvements to the TS requirements for the systems studied can be suggested.
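
    As a hedged illustration of the risk-comparison idea described above (all frequencies and the decision rule below are invented numbers, not values from the papers), one can compare the incremental at-power risk accumulated over a candidate AOT against an assumed one-time shutdown risk:

    ```python
    # Hedged sketch of an AOT risk comparison (illustrative numbers only):
    # stay at power for the repair while the integrated incremental risk is
    # smaller than the risk of a shutdown/restart with degraded heat removal.

    BASELINE_CDF = 1.0e-5   # assumed core damage frequency at power, per year
    DEGRADED_CDF = 8.0e-5   # assumed CDF with the safety train out of service
    SHUTDOWN_RISK = 2.0e-6  # assumed one-time risk of shutting down and restarting
                            # with the affected decay-heat systems degraded

    def continued_operation_risk(repair_hours):
        """Incremental risk of remaining at power for the repair, above baseline."""
        return (DEGRADED_CDF - BASELINE_CDF) * repair_hours / 8760.0

    for aot_h in (24, 72, 168, 336):
        delta = continued_operation_risk(aot_h)
        verdict = "continued operation favored" if delta < SHUTDOWN_RISK else "shutdown favored"
        print(f"AOT {aot_h:4d} h: incremental at-power risk {delta:.2e} -> {verdict}")
    ```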

  3. LINEAR COUNT-RATE METER

    DOE Patents [OSTI]

    Henry, J.J.

    1961-09-01

    A linear count-rate meter is designed to provide a highly linear output while receiving counting rates from one cycle per second to 100,000 cycles per second. Input pulses enter a linear discriminator and then are fed to a trigger circuit which produces positive pulses of uniform width and amplitude. The trigger circuit is connected to a one-shot multivibrator. The multivibrator output pulses have a selected width. Feedback means are provided for preventing transistor saturation in the multivibrator which improves the rise and decay times of the output pulses. The multivibrator is connected to a diode-switched, constant current metering circuit. A selected constant current is switched to an averaging circuit for each pulse received, and for a time determined by the received pulse width. The average output meter current is proportional to the product of the counting rate, the constant current, and the multivibrator output pulse width.
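
    The patent states that the average output meter current is proportional to the product of the counting rate, the switched constant current, and the one-shot pulse width; a one-line check of that relationship with assumed values (valid while rate times pulse width is much less than 1, so pulses do not overlap) is:

    ```python
    # Quick check of the stated relationship I_avg = rate * I_const * pulse_width
    # (values are assumptions for illustration, not from the patent).
    rate_cps      = 1.0e4   # counts per second
    i_const_a     = 2.0e-3  # switched constant current, amperes
    pulse_width_s = 5.0e-6  # one-shot multivibrator pulse width, seconds

    i_avg = rate_cps * i_const_a * pulse_width_s   # duty-cycle-averaged current
    print(f"Average meter current: {i_avg * 1e6:.1f} microamps")
    ```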

  4. Short-Term Energy Outlook Supplement: 2013 Outlook for Gulf of Mexico Hurricane-Related Production Outages

    U.S. Energy Information Administration (EIA) Indexed Site

    2013 Outlook for Gulf of Mexico Hurricane-Related Production Outages, June 2013, U.S. Energy Information Administration. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other

  5. Short-Term Energy Outlook Supplement: 2015 Outlook for Gulf of Mexico Hurricane-Related Production Outages

    Gasoline and Diesel Fuel Update (EIA)

    2014 Outlook for Gulf of Mexico Hurricane-Related Production Outages, June 2014, U.S. Energy Information Administration. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other

  6. Count-doubling time safety circuit

    DOE Patents [OSTI]

    Rusch, Gordon K.; Keefe, Donald J.; McDowell, William P.

    1981-01-01

    There is provided a nuclear reactor count-factor-increase time monitoring circuit which includes a pulse-type neutron detector, and means for counting the number of detected pulses during specific time periods. Counts are compared and the comparison is utilized to develop a reactor scram signal, if necessary.
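
    A software analogue of the comparison logic described above might look like the following; the window counts and the use of a factor-of-two trip threshold per period are illustrative assumptions rather than the patented circuit's actual parameters.

    ```python
    # Hedged sketch of the comparison logic in the abstract (assumed window
    # length and trip factor): count detector pulses in successive fixed time
    # periods and assert a scram signal if the count increases by more than
    # the allowed factor between consecutive periods.

    TRIP_FACTOR = 2.0   # assumed count-factor-increase limit per period

    def scram_signal(window_counts):
        """window_counts: sequence of pulse counts from consecutive time periods."""
        for prev, curr in zip(window_counts, window_counts[1:]):
            if prev > 0 and curr / prev >= TRIP_FACTOR:
                return True
        return False

    print(scram_signal([100, 120, 130, 150]))   # False: slow increase
    print(scram_signal([100, 120, 260, 300]))   # True: counts more than doubled
    ```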

  7. Well coincidence counting and analysis

    SciTech Connect (OSTI)

    Lu, Ming-Shih; Teichmann, T.; Ceo, R.N.; Collins, L.L.

    1994-03-01

    In several recent papers a physical/mathematical model was developed to describe the nuclear multiplicative processes in samples containing fissile material from a general statistical viewpoint, starting with the basic underlying physical phenomena. The results of this model agreed with the established picture used in "standard" HLNCC (High Level Neutron Coincidence Counter) measurements, but considerably extended them, and allowed a more detailed interpretation of the underlying physical mechanisms and of the higher moments of the neutron counts. The present paper examines some recent measurements made at Y-12 (Oak Ridge) using the AWCC, in the light of this model. The results show internal consistency under a variety of conditions, and give good agreement between experiment and theory.

  8. Low Background Counting At SNOLAB

    SciTech Connect (OSTI)

    Lawson, Ian; Cleveland, Bruce [SNOLAB, 1039 Regional Rd 24, Lively, ON P3Y 1N2 (Canada)

    2011-04-27

    It is a continuous and ongoing effort to maintain radioactivity in materials and in the environment surrounding most underground experiments at very low levels. These low levels are required so that experiments can achieve the required detection sensitivities for the detection of low-energy neutrinos, searches for dark matter and neutrinoless double-beta decay. SNOLAB has several facilities which are used to determine these low background levels in the materials and the underground environment. This proceedings will describe the SNOLAB High Purity Germanium Detector which has been in continuous use for the past five years and give results of many of the items that have been counted over that period. Brief descriptions of SNOLAB's alpha-beta and electrostatic counters will be given, and the radon levels at SNOLAB will be discussed.

  9. Anomalous liquid scintillation counting of chromium-51

    SciTech Connect (OSTI)

    Charig, A.; Blake-Haskins, J.; Eigen, E.

    1985-12-01

    Unusual behavior of chromium-51 in liquid scintillation cocktail is described. Rapidly declining count rate is attributed to first-order binding of chromate to glass vials.

  10. Low Background Counting at LBNL

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Smith, A. R.; Thomas, K. J.; Norman, E. B.; Chan, Y. D.; Lesko, K. T.; Hurley, D. L.

    2015-03-24

    The Low Background Facility (LBF) at Lawrence Berkeley National Laboratory in Berkeley, California provides low background gamma spectroscopy services to a wide array of experiments and projects. The analysis of samples takes place within two unique facilities; locally within a carefully-constructed, low background cave and remotely at an underground location that historically has operated underground in Oroville, CA, but has recently been relocated to the Sanford Underground Research Facility (SURF) in Lead, SD. These facilities provide a variety of gamma spectroscopy services to low background experiments primarily in the form of passive material screening for primordial radioisotopes (U, Th, K) or common cosmogenic/anthropogenic products, as well as active screening via Neutron Activation Analysis for specific applications. The LBF also provides hosting services for general R&D testing in low background environments on the surface or underground for background testing of detector systems or similar prototyping. A general overview of the facilities, services, and sensitivities is presented. Recent activities and upgrades will also be presented, such as the completion of a 3π anticoincidence shield at the surface station and environmental monitoring of Fukushima fallout. The LBF is open to any users for counting services or collaboration on a wide variety of experiments and projects.

  11. Low Background Counting at LBNL

    SciTech Connect (OSTI)

    Smith, A. R.; Thomas, K. J.; Norman, E. B.; Chan, Y. D.; Lesko, K. T.; Hurley, D. L.

    2015-03-24

    The Low Background Facility (LBF) at Lawrence Berkeley National Laboratory in Berkeley, California provides low background gamma spectroscopy services to a wide array of experiments and projects. The analysis of samples takes place within two unique facilities; locally within a carefully-constructed, low background cave and remotely at an underground location that historically has operated underground in Oroville, CA, but has recently been relocated to the Sanford Underground Research Facility (SURF) in Lead, SD. These facilities provide a variety of gamma spectroscopy services to low background experiments primarily in the form of passive material screening for primordial radioisotopes (U, Th, K) or common cosmogenic/anthropogenic products, as well as active screening via Neutron Activation Analysis for specific applications. The LBF also provides hosting services for general R&D testing in low background environments on the surface or underground for background testing of detector systems or similar prototyping. A general overview of the facilities, services, and sensitivities is presented. Recent activities and upgrades will also be presented, such as the completion of a 3π anticoincidence shield at the surface station and environmental monitoring of Fukushima fallout. The LBF is open to any users for counting services or collaboration on a wide variety of experiments and projects.

  12. Time Variant Floating Mean Counting Algorithm

    Energy Science and Technology Software Center (OSTI)

    1999-06-03

    This software was written to test a time variant floating mean counting algorithm. The algorithm was developed by Westinghouse Savannah River Company and a provisional patent has been filed on the algorithm. The test software was developed to work with the Val Tech model IVB prototype version II count rate meter hardware. The test software was used to verify the algorithm developed by WSRC could be correctly implemented with the vendor's hardware.

  13. The impact of fuel cladding failure events on occupational radiation exposures at nuclear power plants: Case study, PWR (pressurized-water reactor) during an outage

    SciTech Connect (OSTI)

    Moeller, M.P.; Martin, G.F.; Kenoyer, J.L.

    1987-08-01

    This report is the second in a series of case studies designed to evaluate the magnitude of increase in occupational radiation exposures at commercial US nuclear power plants resulting from small incidents or abnormal events. The event evaluated is fuel cladding failure, which can result in elevated primary coolant activity and increased radiation exposure rates within a plant. For this case study, radiation measurements were made at a pressurized-water reactor (PWR) during a maintenance and refueling outage. The PWR had been operating for 22 months with fuel cladding failure characterized as 105 pin-hole leakers, the equivalent of 0.21% failed fuel. Gamma spectroscopy measurements, radiation exposure rate determinations, thermoluminescent dosimeter (TLD) assessments, and air sample analyses were made in the plant's radwaste, pipe penetration, and containment buildings. Based on the data collected, evaluations indicate that the relative contributions of activation products and fission products to the total exposure rates were constant over the duration of the outage. This constancy is due to the significant contribution from the longer-lived isotopes of cesium (a fission product) and cobalt (an activation product). For this reason, fuel cladding failure events remain as significant to occupational radiation exposure during an outage as during routine operations. As documented in the previous case study (NUREG/CR-4485 Vol. 1), fuel cladding failure events increased radiation exposure rates an estimated 540% at some locations of the plant during routine operations. Consequently, such events can result in significantly greater radiation exposure rates in many areas of the plant during the maintenance and refueling outages than would have been present under normal fuel conditions.

  14. FLOP Counts for "Small" Single-Node Miniapplication Tests

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    These data, obtained using the NERSC Hopper system, are provided...

  15. Refinery Outages: Fall 2014

    Gasoline and Diesel Fuel Update (EIA)

    gasoline supply in a particular region because pipeline infrastructure, geography and marine shipping regulations constrain the amount of product that can flow among the different...

  16. Refinery Outages: Fall 2014

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    some Libyan crude oil production to the market, and increasing U.S. crude production. Economic growth in 2014 outside of the United States has been slow, and some recent data...

  17. Photon-counting solid-state photomultiplier

    SciTech Connect (OSTI)

    Petroff, M.D.; Stapelbroek, M.G.

    1989-02-01

    The Solid-State Photomultiplier is a silicon device capable of continuous detection of individual photons in the wavelength range from 0.4 to 28 μm. Operated with an applied bias near 7 volts, the device responds to the absorption of an incident photon with a submicrosecond-rise-time current pulse with a narrow amplitude distribution well above the electronic readout noise level. Optimal photon-counting performance occurs between 6 and 10 K and for count rates less than 10¹⁰ counts/s per cm² of detector area. A 60% counting quantum efficiency has been demonstrated at 20 μm, and near 60% was observed in the visible light region. The underlying mechanism involves extremely fast internal charge amplification by impact ionization of impurity-band electrons and results in a pulse for each photoelectrically or thermally induced free carrier. The thermally induced dark pulse rate at 7 K is sufficiently low that background-limited detector performance is obtained at a background of less than 10⁶ photons/cm² per second.

  18. Differential white cell count by centrifugal microfluidics.

    SciTech Connect (OSTI)

    Sommer, Gregory Jon; Tentori, Augusto M.; Schaff, Ulrich Y.

    2010-07-01

    We present a method for counting white blood cells that is uniquely compatible with centrifugation-based microfluidics. Blood is deposited on top of one or more layers of density media within a microfluidic disk. Spinning the disk causes the cell populations within whole blood to settle through the media, reaching an equilibrium based on the density of each cell type. Separation and fluorescence measurement of cell types stained with a DNA dye is demonstrated using this technique. The integrated signal from bands of fluorescent microspheres is shown to be proportional to their initial concentration in suspension. Among the current generation of medical diagnostics are devices based on the principle of centrifuging a CD-sized disk functionalized with microfluidics. These portable 'lab on a disk' devices are capable of conducting multiple assays directly from a blood sample, embodied by platforms developed by Gyros, Samsung, and Abaxis. [1,2] However, no centrifugal platform to date includes a differential white blood cell count, which is an important metric complementary to diagnostic assays. Measuring the differential white blood cell count (the relative fraction of granulocytes, lymphocytes, and monocytes) is a standard medical diagnostic technique useful for identifying sepsis, leukemia, AIDS, radiation exposure, and a host of other conditions that affect the immune system. Several methods exist for measuring the relative white blood cell count, including flow cytometry, electrical impedance, and visual identification from a stained drop of blood under a microscope. However, none of these methods is easily incorporated into a centrifugal microfluidic diagnostic platform.

  19. Alternative Fuels Data Center: Alternative Fueling Station Counts by State

    Alternative Fuels and Advanced Vehicles Data Center [Office of Energy Efficiency and Renewable Energy (EERE)]


  20. Active neutron multiplicity counting of bulk uranium

    SciTech Connect (OSTI)

    Ensslin, N.; Krick, M.S.; Langner, D.G.; Miller, M.C. )

    1991-01-01

    This paper describes a new nondestructive assay technique being developed to assay bulk uranium containing kilogram quantities of ²³⁵U. The new technique uses neutron multiplicity analysis of data collected with a coincidence counter outfitted with AmLi neutron sources. The authors have calculated the expected neutron multiplicity count rate and assay precision for this technique and will report on its expected performance as a function of detector design characteristics, ²³⁵U sample mass, AmLi source strength, and source-to-sample coupling.

  1. SAS Output

    Gasoline and Diesel Fuel Update (EIA)

    Table A. U.S. Transmission Circuit Outages by Type and NERC Region, 2013. Columns: Outage Type; FRCC; MRO; NPCC; RFC; SERC; SPP; TRE; WECC; Contiguous U.S. Rows give circuit outage counts, beginning with Automatic Outages...

  2. SAS Output

    U.S. Energy Information Administration (EIA) Indexed Site

    Table B. U.S. Transformer Outages by Type and NERC Region, 2013. Columns: Outage Type; Eastern Interconnection; TRE; WECC; Contiguous U.S. Rows give circuit outage counts, beginning with Automatic Outages (Sustained): 59.00, --...

  3. Low background counting techniques at SNOLAB

    SciTech Connect (OSTI)

    Lawson, Ian; Cleveland, Bruce [SNOLAB, 1039 Regional Rd 24, Lively, ON P3Y 1N2 (Canada)] [SNOLAB, 1039 Regional Rd 24, Lively, ON P3Y 1N2 (Canada)

    2013-08-08

    Many of the experiments currently searching for dark matter, studying properties of neutrinos or searching for neutrinoless double beta decay require very low levels of radioactive backgrounds both in their own construction materials and in the surrounding environment. These low background levels are required so that the experiments can achieve the required sensitivities for their searches. SNOLAB has several facilities which are used to directly measure these radioactive backgrounds. This proceedings will describe SNOLAB's High Purity Germanium Detectors, one of which has been in continuous use for the past seven years measuring materials for many experiments in operation or under construction at SNOLAB. A description of the characterisation of SNOLAB's new germanium well detector will be presented. In addition, brief descriptions of SNOLAB's alpha-beta and electrostatic counters will be presented and a description of SNOLAB's future low background counting laboratory will be given.

  4. Proposed Plan for the R-Area Bingham Pump Outage Pits (643-8G, -9G, -10G) and R-Area Unknown Pits No.1, No.2, No.3 (RUNK-1, -2, -3)

    SciTech Connect (OSTI)

    Mundy, S.

    2002-07-31

    The purpose of this proposed plan is to describe the preferred remedial alternative for the R-Area Bingham Pump Outage Pits (R BPOPs) and the R-Area Unknowns (RUNKs) operable unit (OU) and to provide for public involvement in the decision-making process.

  5. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOE Patents [OSTI]

    Prasad, Mano K.; Snyderman, Neal J.; Rowland, Mark S.

    2015-12-01

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  6. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOE Patents [OSTI]

    Prasad, Manoj K.; Snyderman, Neal J.; Rowland, Mark S.

    2012-06-05

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
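
    As a toy illustration of the generation step described in the claims (sampling fission chain distributions and spreading the resulting counts in time), the sketch below uses an invented chain-size distribution, source rate, and die-away time; it is not the patented assay method.

    ```python
    # Toy sketch of generating a time-evolving event-count sequence from
    # randomly sampled fission chains (all parameters invented for illustration).
    import random

    CHAIN_SIZE_PMF = {1: 0.60, 2: 0.20, 3: 0.10, 5: 0.07, 8: 0.03}  # assumed chain sizes
    SOURCE_RATE = 200.0     # assumed fission chains initiated per second
    DIE_AWAY_S = 50e-6      # assumed detector die-away time, seconds

    def simulate_event_times(duration_s, rng=random.Random(1)):
        """Return sorted detection times from randomly sampled fission chains."""
        times = []
        t = 0.0
        while True:
            t += rng.expovariate(SOURCE_RATE)          # Poisson chain start times
            if t > duration_s:
                break
            size = rng.choices(list(CHAIN_SIZE_PMF),
                               weights=list(CHAIN_SIZE_PMF.values()))[0]
            times += [t + rng.expovariate(1.0 / DIE_AWAY_S) for _ in range(size)]
        return sorted(times)

    events = simulate_event_times(0.01)
    print(f"{len(events)} simulated detection events in 10 ms")
    ```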

  7. Optical People Counting for Demand Controlled Ventilation: A...

    Office of Scientific and Technical Information (OSTI)

    Title: Optical People Counting for Demand Controlled Ventilation: A Pilot Study of Counter Performance. This pilot ...

  8. When DNA Needs to Stand Up and Be Counted

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    biomolecules must be properly oriented to perform their biological function. In other words, the DNA literally must stand up to be counted. Understanding both the attachment...

  9. Counting small RNA in disease-causing organisms

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Los Alamos researchers demonstrated improved technical methods capable of directly counting small RNA molecules in pathogenic (disease-causing)...

  10. First AID (Atom counting for Isotopic Determination).

    SciTech Connect (OSTI)

    Roach, J. L.; Israel, K. M.; Steiner, R. E.; Duffy, C. J.; Roench, F. R.

    2002-01-01

    Los Alamos National Laboratory (LANL) has established an in vitro bioassay monitoring program in compliance with the requirements in the Code of Federal Regulations, 10 CFR 835, Occupational Radiation Protection. One aspect of this program involves monitoring plutonium levels in at-risk workers. High-risk workers are monitored using the ultra-sensitive Thermal Ionization Mass Spectrometry (TIMS) technique to ensure compliance with DOE standards. TIMS is used to measure atom ratios of 239Pu and 240Pu with respect to a tracer isotope ('Pu). These ratios are then used to calculate the amount of 239Pu and 240Pu present. This low-level atom counting technique allows the calculation of the concentration levels of 239Pu and 240Pu in urine for at-risk workers. From these concentration levels, dose assessments can be made and worker exposure levels can be monitored. Detection limits for TIMS analysis are on the order of millions of atoms, which translates to activity levels of 150 aCi for 239Pu and 500 aCi for 240Pu. Our poster presentation will discuss the ultra-sensitive, low-level analytical technique used to measure plutonium isotopes and the data verification methods used for validating isotopic measurements.

  11. Galaxy number counts to second order and their bispectrum

    SciTech Connect (OSTI)

    Dio, Enea Di; Durrer, Ruth; Marozzi, Giovanni; Montanari, Francesco E-mail: Ruth.Durrer@unige.ch E-mail: Francesco.Montanari@unige.ch

    2014-12-01

    We determine the number counts to second order in cosmological perturbation theory in the Poisson gauge and allowing for anisotropic stress. The calculation is performed using an innovative approach based on the recently proposed ''geodesic light-cone'' gauge. This allows us to determine the number counts in a purely geometric way, without using Einstein's equation. The result is valid for general dark energy models and (most) modified gravity models. We then evaluate numerically some relevant contributions to the number counts bispectrum. In particular we consider the terms involving the density, redshift space distortion and lensing.

  12. When DNA Needs to Stand Up and Be Counted

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

When DNA Needs to Stand Up and Be Counted Wednesday, 31 May 2006 00:00 DNA microarrays are small metal, glass, or silicon chips covered with patterns of short single-stranded DNA (ssDNA). These "DNA chips" are revolutionizing biotechnology, allowing scientists to identify and count many DNA sequences simultaneously. They are the enabling technology for genomic-based medicine and are a critical component of advanced diagnostic systems for

  13. Correlated neutron counting for the 21st century

    SciTech Connect (OSTI)

    Evans, Louise G

    2010-12-01

Correlated neutron counting techniques, such as neutron coincidence and multiplicity counting, are widely employed at nuclear fuel cycle facilities for the accountancy of nuclear material such as plutonium. These techniques need to be improved and enhanced to meet the challenges of complex measurement items and future nuclear safeguards applications, for example: the non-destructive assay of spent nuclear fuel, high counting rate applications, small sample measurements, and Helium-3 replacement. At the same time, simulation tools used for the design of detection systems based on these techniques are being developed in anticipation of future needs. This seminar will present the theory and current state of the practice of temporally correlated neutron counting. A range of future safeguards applications will then be presented in the context of research projects at Los Alamos National Laboratory.

  14. Multianode cylindrical proportional counter for high count rates

    DOE Patents [OSTI]

    Hanson, James A.; Kopp, Manfred K.

    1981-01-01

A cylindrical, multiple-anode proportional counter is provided for counting of low-energy photons (<60 keV) at count rates of greater than 10⁵ counts/sec. A gas-filled proportional counter cylinder forming an outer cathode is provided with a central coaxially disposed inner cathode and a plurality of anode wires disposed in a cylindrical array in coaxial alignment with and between the inner and outer cathodes to form a virtual cylindrical anode coaxial with the inner and outer cathodes. The virtual cylindrical anode configuration improves the electron drift velocity by providing a more uniform field strength throughout the counter gas volume, thus decreasing the electron collection time following the detection of an ionizing event. This avoids pulse pile-up and coincidence losses at these high count rates. Conventional RC position encoding detection circuitry may be employed to extract the spatial information from the counter anodes.

  15. Multianode cylindrical proportional counter for high count rates

    DOE Patents [OSTI]

    Hanson, J.A.; Kopp, M.K.

    1980-05-23

A cylindrical, multiple-anode proportional counter is provided for counting of low-energy photons (< 60 keV) at count rates of greater than 10⁵ counts/sec. A gas-filled proportional counter cylinder forming an outer cathode is provided with a central coaxially disposed inner cathode and a plurality of anode wires disposed in a cylindrical array in coaxial alignment with and between the inner and outer cathodes to form a virtual cylindrical anode coaxial with the inner and outer cathodes. The virtual cylindrical anode configuration improves the electron drift velocity by providing a more uniform field strength throughout the counter gas volume, thus decreasing the electron collection time following the detection of an ionizing event. This avoids pulse pile-up and coincidence losses at these high count rates. Conventional RC position encoding detection circuitry may be employed to extract the spatial information from the counter anodes.

  16. Gathering total items count for pagination | OpenEI Community

    Open Energy Info (EERE)

Gathering total items count for pagination Hi, I'm using the following base link plus some restrictions to sector, utility, and locations to poll for...

  17. When DNA Needs to Stand Up and Be Counted

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    stand up to be counted. Understanding both the attachment and orientation of DNA on gold surfaces was the goal of recent experiments performed at ALS Beamline 8.0.1 by an...

  18. Students Count -- From the Classroom to the Conference | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Energy Students Count -- From the Classroom to the Conference January 18, 2012 - 5:42pm Secretary Chu and former Governor of California Arnold Schwarzenegger speak with students at the 2011 Energy Innovation Summit. | Photo courtesy of ARPA-E. Alexa McClanahan, Communications Support

  19. When DNA Needs to Stand Up and Be Counted

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

When DNA Needs to Stand Up and Be Counted DNA microarrays are small metal, glass, or silicon chips covered with patterns of short single-stranded DNA (ssDNA). These "DNA chips" are revolutionizing biotechnology, allowing scientists to identify and count many DNA sequences simultaneously. They are the enabling technology for genomic-based medicine and are a critical component of advanced diagnostic systems for medical and homeland security applications. Like digital chips, DNA

  20. When DNA Needs to Stand Up and Be Counted

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

When DNA Needs to Stand Up and Be Counted DNA microarrays are small metal, glass, or silicon chips covered with patterns of short single-stranded DNA (ssDNA). These "DNA chips" are revolutionizing biotechnology, allowing scientists to identify and count many DNA sequences simultaneously. They are the enabling technology for genomic-based medicine and are a critical component of advanced diagnostic systems for medical and homeland security applications. Like digital chips, DNA

  1. When DNA Needs to Stand Up and Be Counted

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

When DNA Needs to Stand Up and Be Counted DNA microarrays are small metal, glass, or silicon chips covered with patterns of short single-stranded DNA (ssDNA). These "DNA chips" are revolutionizing biotechnology, allowing scientists to identify and count many DNA sequences simultaneously. They are the enabling technology for genomic-based medicine and are a critical component of advanced diagnostic systems for medical and homeland security applications. Like digital chips, DNA

  2. Counting Down to the Collegiate Wind Competition 2016 | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

Energy Counting Down to the Collegiate Wind Competition 2016 May 19, 2016 - 10:15am Competitors test their turbines in a wind tunnel at the Collegiate Wind Competition 2015, held at the National Renewable Energy Laboratory's National Wind Technology Center just south of Boulder, Colorado. (Photo by Dennis Schroeder / NREL)

  3. Refinery Outages: First Half 2015

    Gasoline and Diesel Fuel Update (EIA)

to increase by 820,000 bbl/d in 2015. While global oil supply growth has been strong, economic growth outside of the United States has been slow, particularly in Russia and...

  4. Compensated count-rate circuit for radiation survey meter

    DOE Patents [OSTI]

    Todd, Richard A.

    1981-01-01

A count-rate compensating circuit is provided which may be used in a portable Geiger-Mueller (G-M) survey meter to ideally compensate for counting loss errors in the G-M tube detector. In a G-M survey meter, wherein the pulse rate from the G-M tube is converted into a pulse rate current applied to a current meter calibrated to indicate dose rate, the compensating circuit generates and controls a reference voltage in response to the rate of pulses from the detector. This reference voltage is gated to the current-generating circuit at a rate identical to the rate of pulses coming from the detector so that the current flowing through the meter is varied in accordance with both the frequency and amplitude of the reference voltage pulses applied thereto so that the count rate is compensated ideally to indicate a true count rate within 1% up to a 50% duty cycle for the detector. A positive feedback circuit is used to control the reference voltage so that the meter output tracks true count rate indicative of the radiation dose rate.
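For readers unfamiliar with counting-loss compensation, the standard non-paralyzable dead-time correction shows the numerical relationship such a circuit is designed to reproduce in hardware. The sketch below is a textbook formula offered only as background; it is not the patented reference-voltage circuit, and the dead-time value in the example is hypothetical.

```python
def true_count_rate(measured_cps, dead_time_s):
    """Non-paralyzable counting-loss correction: n = m / (1 - m*tau).

    Shown only to illustrate the kind of compensation the analog circuit
    performs; the patent's reference-voltage method is not reproduced here.
    """
    loss_fraction = measured_cps * dead_time_s
    if loss_fraction >= 1.0:
        raise ValueError("detector saturated: m*tau >= 1")
    return measured_cps / (1.0 - loss_fraction)

# Example: a tube with an assumed ~100 us dead time reading 2000 counts/s
print(round(true_count_rate(2000, 100e-6)))   # ~2500 cps true rate
```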

  5. Photon counting spectroscopy as done with a Thomson scattering diagnostic

    SciTech Connect (OSTI)

    Den Hartog, D.J.; Ruppert, D.E.

    1993-11-01

The measurement and reduction of photon counting spectral data is demonstrated within the context of a Thomson scattering diagnostic. This diagnostic contains a microchannel plate (MCP) photomultiplier tube (PMT) as the photon sensing device. The MCP PMT is not an ideal photon sensor: the loss of photoelectrons at the MCP input and the broad charge pulse distribution at the output add to the uncertainty in recorded data. Computer simulations are used to demonstrate an approach to quantification of this added uncertainty and to develop an understanding of its source; the methodology may be applicable to the development of an understanding of photon detectors other than an MCP PMT. Emphasis is placed on the Poisson statistical character of the data, because the assumption that a Gaussian probability distribution is a reasonable statistical description of photon counting data is often questionable. When the count rate is low, the product of the possible number of photon counts and the probability of measurement of a single photon is usually not sufficiently large to justify Gaussian statistics. Rather, because probabilities of measurement are so low, the Poisson probability distribution best quantifies the inherent statistical fluctuations in such counting measurements. The method of maximum likelihood is applied to derive the Poisson statistics equivalent of χ². A Poisson-statistics-based data-fitting code is implemented using the Newton-Raphson method of multi-dimensional root finding; we also demonstrate an algorithm to estimate the uncertainties in derived quantities.
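A minimal sketch of the fitting approach the abstract describes, maximizing the Poisson likelihood (the "Poisson equivalent of χ²") by Newton-Raphson, is given below for a one-parameter model. The paper's implementation is multi-dimensional; the model, shape, and data here are synthetic assumptions for illustration only.

```python
import numpy as np

def fit_amplitude_poisson(counts, shape, background, a0=1.0, tol=1e-10):
    """Maximum-likelihood amplitude fit for Poisson counts.

    Model: mu_i = a * shape_i + background_i.  Minimizing the Poisson
    statistic C = 2 * sum(mu_i - n_i * ln mu_i) gives the root-finding
    problem dC/da = 0, solved here by Newton-Raphson in one dimension.
    """
    n, s, b = map(np.asarray, (counts, shape, background))
    a = a0
    for _ in range(100):
        mu = a * s + b
        grad = np.sum(s * (1.0 - n / mu))      # dC/da up to a factor of 2
        curv = np.sum(s**2 * n / mu**2)        # d2C/da2 up to a factor of 2
        step = grad / curv
        a -= step
        if abs(step) < tol:
            break
    return a

# Synthetic example: true amplitude 3.0 on a flat background of 2 counts/bin.
rng = np.random.default_rng(1)
shape = np.exp(-0.5 * ((np.arange(50) - 25) / 4.0) ** 2)
data = rng.poisson(3.0 * shape + 2.0)
print(fit_amplitude_poisson(data, shape, np.full(50, 2.0)))
```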

  6. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 5: Analysis of core damage frequency from seismic events for plant operational state 5 during a refueling outage

    SciTech Connect (OSTI)

    Budnitz, R.J.; Davis, P.R.; Ravindra, M.K.; Tong, W.H.

    1994-08-01

In 1989 the US Nuclear Regulatory Commission (NRC) initiated an extensive program to examine carefully the potential risks during low-power and shutdown operations. The program included two parallel projects, one at Sandia National Laboratories studying a boiling water reactor (Grand Gulf), and the other at Brookhaven National Laboratory studying a pressurized water reactor (Surry Unit 1). Both the Sandia and Brookhaven projects have examined only accidents initiated by internal plant faults, so-called "internal initiators." This project, which has explored the likelihood of seismic-initiated core damage accidents during refueling outage conditions, is complementary to the internal-initiator analyses at Brookhaven and Sandia. This report covers the seismic analysis at Grand Gulf. All of the many systems modeling assumptions, component non-seismic failure rates, and human error rates that were used in the internal-initiator study at Grand Gulf have been adopted here, so that the results of the study can be as comparable as possible. Both the Sandia study and this study examine only one shutdown plant operating state (POS) at Grand Gulf, namely POS 5, representing cold shutdown during a refueling outage. This analysis has been limited to work analogous to a level-1 seismic PRA, in which estimates have been developed for the core-damage frequency from seismic events during POS 5. The results of the analysis are that the core-damage frequency for earthquake-initiated accidents during refueling outages in POS 5 is found to be quite low in absolute terms, less than 10⁻⁷/year.

  7. Compensated count-rate circuit for radiation survey meter

    DOE Patents [OSTI]

    Todd, R.A.

    1980-05-12

A count-rate compensating circuit is provided which may be used in a portable Geiger-Mueller (G-M) survey meter to ideally compensate for counting loss errors in the G-M tube detector. In a G-M survey meter, wherein the pulse rate from the G-M tube is converted into a pulse rate current applied to a current meter calibrated to indicate dose rate, the compensation circuit generates and controls a reference voltage in response to the rate of pulses from the detector. This reference voltage is gated to the current-generating circuit at a rate identical to the rate of pulses coming from the detector so that the current flowing through the meter is varied in accordance with both the frequency and amplitude of the reference voltage pulses applied thereto so that the count rate is compensated ideally to indicate a true count rate within 1% up to a 50% duty cycle for the detector. A positive feedback circuit is used to control the reference voltage so that the meter output tracks true count rate indicative of the radiation dose rate.

  8. It's the little things that count | National Nuclear Security

    National Nuclear Security Administration (NNSA)

Administration | (NNSA) It's the little things that count April 16, 2012 OAK RIDGE, Tenn. -- In just five months, the Jack Case Center at the National Nuclear Security Administration's Y-12 National Security Complex has not only achieved compliance with a national building standard for energy sustainability, but has also accomplished a 21.4 percent reduction in energy consumption.

  9. Modeling patterns in count data using loglinear and related models

    SciTech Connect (OSTI)

    Atwood, C.L.

    1995-12-01

This report explains the use of loglinear and logit models for analyzing Poisson and binomial counts in the presence of explanatory variables. The explanatory variables may be unordered categorical variables or numerical variables, or both. The report shows how to construct models to fit data, and how to test whether a model is too simple or too complex. The appropriateness of the methods with small data sets is discussed. Several example analyses, using the SAS computer package, illustrate the methods.
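The report's worked examples use SAS; a comparable loglinear (Poisson GLM) fit can be sketched in Python with statsmodels, as below. The data-frame values, the "system" factor, and the exposure offset are invented solely to show the model structure, not taken from the report.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical count data: events observed in two systems over different exposures.
df = pd.DataFrame({
    "counts":   [3, 7, 1, 12, 5, 9],
    "system":   ["A", "A", "B", "B", "A", "B"],
    "exposure": [100.0, 250.0, 80.0, 400.0, 160.0, 300.0],   # e.g., component-hours
})

# Loglinear model: log E[counts] = b0 + b1*[system B] + log(exposure)
X = pd.get_dummies(df["system"], drop_first=True).astype(float)
X = sm.add_constant(X)
model = sm.GLM(df["counts"], X,
               family=sm.families.Poisson(),
               offset=np.log(df["exposure"]))
print(model.fit().summary())
```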

  10. Learning How to Count: A High Multiplicity Search for the LHC...

    Office of Scientific and Technical Information (OSTI)

Learning How to Count: A High Multiplicity Search for the LHC. Authors:...

  11. Development of Counted Single Donor Devices using in-situ Single...

    Office of Scientific and Technical Information (OSTI)

Development of Counted Single Donor Devices using in-situ Single Ion Detectors on the SNL NanoImplanter.

  12. FTCP-08-001, Methodology for Counting TQP Personnel and Qualifications...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

FTCP-08-001, Methodology for Counting TQP Personnel and Qualifications. FTCP Issue Paper: FTCP-08-001. Approved by FTCP, ...

  13. Optimal gate-width setting for passive neutron multiplicity counting

    SciTech Connect (OSTI)

    Croft, Stephen; Evans, Louise G; Schear, Melissa A

    2010-01-01

When setting up a passive neutron coincidence counter it is natural to ask what coincidence gate settings should be used to optimize the counting precision. If the gate width is too short then signal is lost and the precision is compromised because in a given period only a few coincidence events will be observed. On the other hand, if the gate is too large the signal will be maximized but it will also be compromised by the high level of random pile-up or Accidental coincidence events which must be subtracted. In the case of shift register electronics connected to an assay chamber with an exponential dieaway profile operating in the regime where the Accidentals rate dominates the Reals coincidence rate but where dead-time is not a concern, simple arguments allow one to show that the relative precision on the net Reals rate is minimized when the coincidence gate is set to about 1.2 times the 1/e die-away time of the system. In this work we show that, making the same assumptions, it is easy to show that the relative precision on the Triples rates is also at a minimum when the relative precision of the Doubles (or Reals) is at a minimum. Although the analysis is straightforward, to our knowledge such a discussion has not been documented in the literature before. Actual measurement systems do not always behave in the idealized way we choose to model them. Fortunately, however, the variation in the relative precision as a function of gate width is rather flat for traditional safeguards counters and so the performance is somewhat forgiving of the exact choice. The derivation further serves to delineate the important parameters which determine the relative counting precision of the Doubles and Triples rates under the regime considered. To illustrate the similarities and differences we consider the relative standard deviation that might be anticipated for a passive correlation count of an axial section of a spent nuclear fuel assembly under practically achievable conditions.
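The scaling argument summarized above can be checked numerically. Assuming the Accidentals-dominated regime and ignoring predelay and dead-time, the relative precision of the Reals rate varies as sqrt(G)/(1 - exp(-G/τ)), whose minimum falls near G ≈ 1.26 die-away times, consistent with the "about 1.2" quoted in the abstract. A quick sketch (the functional form is the simplified one stated here, not the paper's full derivation):

```python
import numpy as np

# Relative precision of the net Reals rate, up to constant factors, in the
# Accidentals-dominated regime: R ∝ 1 - exp(-G/tau), sigma_R ∝ sqrt(G).
def rel_precision(g_over_tau):
    return np.sqrt(g_over_tau) / (1.0 - np.exp(-g_over_tau))

x = np.linspace(0.1, 5.0, 100_000)
best = x[np.argmin(rel_precision(x))]
print(f"optimal gate ≈ {best:.2f} die-away times")   # ≈ 1.26, i.e. "about 1.2"
```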

  14. Neutron counting and gamma spectroscopy with PVT detectors.

    SciTech Connect (OSTI)

    Mitchell, Dean James; Brusseau, Charles A.

    2011-06-01

Radiation portals normally incorporate a dedicated neutron counter and a gamma-ray detector with at least some spectroscopic capability. This paper describes the design and presents characterization data for a detection system called PVT-NG, which uses large polyvinyl toluene (PVT) detectors to monitor both types of radiation. The detector material is surrounded by polyvinyl chloride (PVC), which emits high-energy gamma rays following neutron capture reactions. Assessments based on high-energy gamma rays are well suited for the detection of neutron sources, particularly in border security applications, because few isotopes in the normal stream of commerce have significant gamma ray yields above 3 MeV. Therefore, an increased count rate for high-energy gamma rays is a strong indicator for the presence of a neutron source. The sensitivity of the PVT-NG sensor to bare 252Cf is 1.9 counts per second per nanogram (cps/ng) and the sensitivity for 252Cf surrounded by 2.5 cm of polyethylene is 2.3 cps/ng. The PVT-NG sensor is a proof-of-principle sensor that was not fully optimized. The neutron detector sensitivity could be improved, for instance, by using additional moderator. The PVT-NG detectors and associated electronics are designed to provide improved resolution, gain stability, and performance at high count rates relative to PVT detectors in typical radiation portals. As well as addressing the needs for neutron detection, these characteristics are also desirable for analysis of the gamma-ray spectra. Accurate isotope identification results were obtained despite the common impression that the absence of photopeaks makes data collected by PVT detectors unsuitable for spectroscopic analysis. The PVT detectors in the PVT-NG unit are used for both gamma-ray and neutron detection, so the sensitive volume exceeds the volume of the detection elements in portals that use dedicated components to detect each type of radiation.

  15. Los Alamos Middle School team wins regional MathCounts competition

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Los Alamos Middle School team wins regional MathCounts competition. Competes against 60 other middle schools for the title. March 1, 2013. Los Alamos Middle School won the regional MathCounts competition.

  16. Metals processing control by counting molten metal droplets

    DOE Patents [OSTI]

    Schlienger, Eric; Robertson, Joanna M.; Melgaard, David; Shelmidine, Gregory J.; Van Den Avyle, James A.

    2000-01-01

Apparatus and method for controlling metals processing (e.g., ESR) by melting a metal ingot and counting molten metal droplets during melting. An approximate amount of metal in each droplet is determined, and a melt rate is computed therefrom. Impedance of the melting circuit is monitored, such as by calculating the root-mean-square voltage and current of the circuit and dividing the voltage by the current. Analysis of the impedance signal is performed to look for a trace characteristic of the formation of a molten metal droplet, such as by examining slew rate, curvature, or a higher moment.

  17. Optical People Counting for Demand Controlled Ventilation: A Pilot Study of Counter Performance

    SciTech Connect (OSTI)

    Fisk, William J.; Sullivan, Douglas

    2009-12-26

This pilot scale study evaluated the counting accuracy of two people counting systems that could be used in demand controlled ventilation systems to provide control signals for modulating outdoor air ventilation rates. The evaluations included controlled challenges of the people counting systems using pre-planned movements of occupants through doorways and evaluations of counting accuracies when naive occupants (i.e., occupants unaware of the counting systems) passed through the entrance doors of the building or room. The two people counting systems had high counting accuracies, with errors typically less than 10 percent, for typical non-demanding counting events. However, counting errors were high in some highly challenging situations, such as multiple people passing simultaneously through a door. Counting errors, for at least one system, can be very high if people stand in the field of view of the sensor. Both counting systems have limitations and would need to be used only at appropriate sites and where the demanding situations that led to counting errors were rare.

  18. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1: Analysis of core damage frequency from internal events for Plant Operational State 5 during a refueling outage. Volume 2, Part 3: Internal Events Appendices I and J

    SciTech Connect (OSTI)

    Yakle, J.; Darby, J.; Whitehead, D.; Staple, B.

    1994-06-01

This report provides supporting documentation for various tasks associated with the performance of the probabilistic risk assessment for Plant Operational State 5 during a refueling outage at Grand Gulf, Unit 1 as documented in Volume 2, Part 1 of NUREG/CR-6143.

  19. Full-scale demonstration of low-NO{sub x} cell{trademark} burner retrofit: Addendum to long-term testing report, September 1994 outage: Examination of corrosion test panel and UT survey in DP&L Unit {number_sign}4

    SciTech Connect (OSTI)

    Kung, S.C.; Kleisley, R.J.

    1995-06-01

As part of this DOE demonstration program, a corrosion test panel was installed on the west sidewall of Dayton Power & Light Unit No. 4 at the J. M. Stuart Station (JMSS4) during the burner retrofit outage in November 1991. The test panel consisted of four sections of commercial coatings separated by bare SA213-T2 tubing. During the retrofit outage, a UT survey was performed to document the baseline wall thicknesses of the test panel, as well as several furnace wall areas outside the test panel. The purpose of the UT survey was to generate the baseline data so that the corrosion wastage associated with the operation of Low NOx Cell Burners (LNCB burners) could be quantitatively determined. The corrosion test panel in JMSS4 was examined in April 1993 after the first 15-month operation of the LNCB burners. Details of the corrosion analysis and UT data were documented in the Long-Term Testing Report. The second JMSS4 outage following the LNCB burner retrofit took place in September 1994. Up to this point, the test panel in JMSS4 had been exposed to the corrosive combustion environment for approximately 31 months under normal boiler operation of JMSS4. This test period excluded the down time for the April 1993 outage. During the September 1994 outage, 70 tube samples of approximately one-foot length were cut from the bottom of the test panel. These samples were evaluated by the Alliance Research Center of B&W using the same metallurgical techniques as those employed for the previous outage. In addition, UT measurements were taken at the same locations on the lower furnace walls in JMSS4 as during the prior outages. Results of the metallurgical analyses and UT surveys from different exposure times were compared, and the long-term performance of waterwall materials was analyzed. The corrosion data obtained from the long-term field study at JMSS4 after 32 months of LNCB burner operation are summarized in this report.

  20. SAS Output

    U.S. Energy Information Administration (EIA) Indexed Site

Table A. U.S. Transmission Circuit Sustained Automatic Outage Counts and Hours by High-Voltage Size and NERC Region, 2013. (Table excerpt: sustained automatic outage counts tabulated by voltage class, NERC region, type, and operating...)

  1. Active Well Counting Using New PSD Plastic Detectors

    SciTech Connect (OSTI)

    Hausladen, Paul; Newby, Jason; McElroy, Robert Dennis

    2015-11-01

This report presents results and analysis from a series of proof-of-concept measurements to assess the suitability of segmented detectors constructed from Eljen EJ-299-34 PSD-plastic scintillator with pulse-shape discrimination capability for the purpose of quantifying uranium via active neutron coincidence counting. Present quantification of bulk uranium materials for international safeguards and domestic materials control and accounting relies on active neutron coincidence counting systems, such as the Active Well Coincidence Counter (AWCC) and the Uranium Neutron Coincidence Collar (UNCL), that use moderated He-3 proportional counters along with necessarily low-intensity 241Am(Li) neutron sources. Scintillation-based fast-neutron detectors are a potentially superior technology to the existing AWCC and UNCL designs due to their spectroscopic capability and their inherently short neutron coincidence times, which largely eliminate random coincidences and enable interrogation by stronger sources. One of the past impediments to the investigation and adoption of scintillation counters for the purpose of quantifying bulk uranium was that scintillators having the necessary neutron-gamma pulse-shape discrimination properties were commercially available only as flammable liquids. Recently, Eljen EJ-299-34 PSD-plastic scintillator became commercially available. The present work is the first assessment of an array of PSD-plastic detectors for the purpose of quantifying bulk uranium. The detector panel used in the present work was originally built as the focal plane for a fast-neutron imager, but it was repurposed for the present investigation by construction of a stand to support the inner well of an AWCC immediately in front of the detector panel. The detector panel and data acquisition of this system are particularly well suited for performing active-well fast-neutron counting of LEU and HEU samples because the active detector volume is solid, the 241Am(Li) interrogating

  2. Doubles counting of highly multiplying items in reflective surroundings

    SciTech Connect (OSTI)

    Croft, Stephen; Evans, Louise G; Schear, Melissa A; Tobin, Stephen J

    2010-11-18

When neutrons are counted from a spontaneously fissile multiplying item in a reflecting environment, the temporal behavior of the correlated signal following neutron birth is complex. At early times the signal is dominated by prompt fission events coming from spontaneous fission bursts and also from prompt fast-neutron induced fission events. At later times neutrons 'returning' from the surroundings induce fission and give rise to an additional chain of correlated events. The prompt and returning components probe the fissile and fertile constituents of the item in different ways, and it is potentially beneficial to exploit this fact. In this work we look at how the two components can be represented using a linear combination of two simple functions. Fitting of the composite function to the capture time distribution represents one way of quantifying the proportion of each contribution. Another approach, however, is to use a dual shift register analysis where after each triggering event two coincidence gates are opened, one close to the trigger that responds preferentially to the prompt dynamics and one later in time which is more sensitive to the returning-neutron induced events. To decide on the best gate positions and gate widths, and also to estimate the counting precision, we can use the analytical fit to work out the necessary gate utilization factors which are required in both these calculations. In this work, we develop the approach. Illustrative examples are given using spent Low Enriched Uranium (LEU) Pressurized light Water Reactor (LWR) fuel assemblies submersed in borated water and counted in a ring of 3He gas-filled proportional counters. In this case the prompt component is dominated by 244Cm spontaneous fission and induced fast-neutron fission in, for example, 238U, while the returning low-energy neutrons induce fission mainly in the fissile nuclides such as 239Pu, 241Pu, and 235U. One requirement is to calculate the Random
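One way to quantify the prompt and returning contributions, as the abstract suggests, is to fit a linear combination of two simple functions to the capture-time distribution. The sketch below uses two decaying exponentials as stand-in shapes and synthetic data; the actual functional forms, time constants, and amplitudes used in the paper are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_component(t, a_prompt, tau_prompt, a_return, tau_return):
    """Two simple decaying terms standing in for the prompt and
    returning-neutron parts of the capture-time distribution."""
    return a_prompt * np.exp(-t / tau_prompt) + a_return * np.exp(-t / tau_return)

# Synthetic capture-time histogram (times in microseconds; values are illustrative).
rng = np.random.default_rng(2)
t = np.linspace(0.0, 500.0, 200)
truth = two_component(t, 800.0, 20.0, 120.0, 150.0)
hist = rng.poisson(truth).astype(float)

popt, _ = curve_fit(two_component, t, hist, p0=(500.0, 10.0, 100.0, 100.0))
a_p, tau_p, a_r, tau_r = popt
prompt_fraction = a_p * tau_p / (a_p * tau_p + a_r * tau_r)   # ratio of areas
print(popt, f"prompt fraction ≈ {prompt_fraction:.2f}")
```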

  3. Status of in-plant neutron coincidence counting

    SciTech Connect (OSTI)

    Enssling, N.; Krick, M.; Menlove, H.; Stewart, J.

    1986-01-01

Neutron coincidence counters are used in nuclear material processing plants to assay bulk quantities of plutonium or uranium. Passive assays of plutonium are often made with the High-Level Neutron Counter (HLNC or HLNC-II), the Dual-Range Coincidence Counter, or customized detector geometries. Active assays of uranium are often made with the Active Well Coincidence Counter or the Uranium Neutron Coincidence Collar. Modern counters may have flattened efficiency profiles, fast AMPTEK amplifier/discriminators mounted directly next to the 3He detection tubes, external background shields, or special sample-loading mechanisms. Typical counting times and accuracies that can be obtained for plutonium are summarized. If isotopic composition is known, large plutonium samples can be assayed in 100 to 200 s - comparable to the time required to input sample data into the counter's calculator or computer.

  4. Neutron multiplicity counting: Confidence intervals for reconstruction parameters

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Verbeke, Jerome M.

    2016-03-09

From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to evaluate quantitatively proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.

  5. Working with SRNL - Our Facilities- Ultra Low-Level Underground Counting

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

The Ultra Low-Level Underground Counting Facility is the only facility of its kind in the country. This facility is located 50 feet below ground level, and has four-inch thick walls of pre-nuclear weapons era steel. This allows highly sensitive measurements of ultra-low amounts of environmental radioactivity, free from interference by background radiation.

  6. Cryogenic, high-resolution x-ray detector with high count rate capability

    DOE Patents [OSTI]

    Frank, Matthias; Mears, Carl A.; Labov, Simon E.; Hiller, Larry J.; Barfknecht, Andrew T.

    2003-03-04

A cryogenic, high-resolution X-ray detector with high count rate capability has been invented. The new X-ray detector is based on superconducting tunnel junctions (STJs), and operates without thermal stabilization at or below 500 mK. The X-ray detector exhibits good resolution (~5-20 eV FWHM) for soft X-rays in the keV region, and is capable of counting at count rates of more than 20,000 counts per second (cps). Simple, FET-based charge amplifiers, current amplifiers, or conventional spectroscopy shaping amplifiers can provide the electronic readout of this X-ray detector.

  7. Modeling the Number of Ignitions Following an Earthquake: Developing Prediction Limits for Overdispersed Count Data

    Broader source: Energy.gov [DOE]

    Modeling the Number of Ignitions Following an Earthquake: Developing Prediction Limits for Overdispersed Count Data Elizabeth J. Kelly and Raymond N. Tell

  8. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11--14)

    SciTech Connect (OSTI)

    Whitehead, D.; Darby, J.; Yakle, J.

    1994-06-01

    This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

  9. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1: Evaluation of severe accident risks for plant operational state 5 during a refueling outage. Main report and appendices, Volume 6, Part 1

    SciTech Connect (OSTI)

    Brown, T.D.; Kmetyk, L.N.; Whitehead, D.; Miller, L.; Forester, J.; Johnson, J.

    1995-03-01

Traditionally, probabilistic risk assessments (PRAs) of severe accidents in nuclear power plants have considered initiating events potentially occurring only during full power operation. Recent studies and operational experience have, however, implied that accidents during low power and shutdown could be significant contributors to risk. In response to this concern, in 1989 the Nuclear Regulatory Commission (NRC) initiated an extensive program to carefully examine the potential risks during low power and shutdown operations. Two plants, Surry (pressurized water reactor) and Grand Gulf (boiling water reactor), were selected as the plants to be studied. The program consists of two parallel projects being performed by Brookhaven National Laboratory (Surry) and Sandia National Laboratories (Grand Gulf). The program objectives include assessing the risks of severe accidents initiated during plant operational states other than full power operation and comparing the estimated risks with the risk associated with accidents initiated during full power operation as assessed in NUREG-1150. The scope of the program is that of a Level-3 PRA. The subject of this report is the PRA of the Grand Gulf Nuclear Station, Unit 1. The Grand Gulf plant utilizes a 3833 MWt BWR-6 boiling water reactor housed in a Mark III containment. The Grand Gulf plant is located near Port Gibson, Mississippi. The regime of shutdown analyzed in this study was plant operational state (POS) 5 during a refueling outage, which is approximately Cold Shutdown as defined by Grand Gulf Technical Specifications. The entire PRA of POS 5 is documented in a multi-volume NUREG report (NUREG/CR-6143). The internal events accident sequence analysis (Level 1) is documented in Volume 2. The Level 1 internal fire and internal flood analyses are documented in Volumes 3 and 4, respectively.

  10. The IAEA neutron coincidence counting (INCC) and the DEMING least-squares fitting programs

    SciTech Connect (OSTI)

    Krick, M.S.; Harker, W.C.; Rinard, P.M.; Wenz, T.R.; Lewis, W.; Pham, P.; Ridder, P. de

    1998-12-01

Two computer programs are described: (1) the INCC (IAEA or International Neutron Coincidence Counting) program and (2) the DEMING curve-fitting program. The INCC program is an IAEA version of the Los Alamos NCC (Neutron Coincidence Counting) code. The DEMING program is an upgrade of earlier Windows® and DOS codes with the same name. The versions described are INCC 3.00 and DEMING 1.11. The INCC and DEMING codes provide inspectors with the software support needed to perform calibration and verification measurements with all of the neutron coincidence counting systems used in IAEA inspections for the nondestructive assay of plutonium and uranium.

  11. Refinery Outages: First-Half 2016

    U.S. Energy Information Administration (EIA) Indexed Site


  12. U.S. oil production forecast update reflects lower rig count

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    U.S. oil production forecast update reflects lower rig count Lower oil prices and fewer rigs drilling for crude oil are expected to slow U.S. oil production growth this year and in ...

  13. Self-Calibrated Cluster Counts as a Probe of Primordial Non-Gaussianity

    SciTech Connect (OSTI)

    Oguri, Masamune; /KIPAC, Menlo Park

    2009-05-07

We show that the ability to probe primordial non-Gaussianity with cluster counts is drastically improved by adding the excess variance of counts, which contains information on the clustering. The conflicting dependences of changing the mass threshold and including primordial non-Gaussianity on the mass function and biasing indicate that the self-calibrated cluster counts effectively break the degeneracy between primordial non-Gaussianity and the observable-mass relation. Based on the Fisher matrix analysis, we show that the count variance improves constraints on f_NL by more than an order of magnitude. It exhibits little degeneracy with the dark energy equation of state. We forecast that upcoming Hyper Suprime-Cam cluster surveys and the Dark Energy Survey will constrain primordial non-Gaussianity at the level σ(f_NL) ≈ 8, which is competitive with forecasted constraints from next-generation cosmic microwave background experiments.

  14. The LANL C-NR counting room and fission product yields

    SciTech Connect (OSTI)

    Jackman, Kevin Richard

    2015-09-21

This PowerPoint presentation focused on the following areas: LANL C-NR counting room; Fission product yields; Los Alamos Neutron wheel experiments; Recent experiments at NCERC; and Post-detonation nuclear forensics

  15. Don't Count Your Ions Before They Dissociate | U.S. DOE Office...

    Office of Science (SC) Website

Don't Count Your Ions Before They Dissociate ... The green shading represents the 99.98% of the molecules that exist in a neutral, or ...

  16. Cosmic ray neutron background reduction using localized coincidence veto neutron counting

    DOE Patents [OSTI]

    Menlove, Howard O.; Bourret, Steven C.; Krick, Merlyn S.

    2002-01-01

This invention relates to both the apparatus and method for increasing the sensitivity of measuring the amount of radioactive material in waste by reducing the interference caused by cosmic-ray generated neutrons. The apparatus includes: (a) a plurality of neutron detectors, each of the detectors including means for generating a pulse in response to the detection of a neutron; and (b) means, coupled to each of the neutron detectors, for counting only some of the pulses from each of the detectors, whether cosmic ray or fission generated. The means for counting includes a means that, after counting one of the pulses, vetoes the counting of additional pulses for a prescribed period of time. The prescribed period of time is between 50 and 200 µs. In the preferred embodiment the prescribed period of time is 128 µs. The veto means can be an electronic circuit which includes a leading-edge pulse generator which passes a pulse but blocks any subsequent pulse for a period of between 50 and 200 µs. Alternately, the veto means is a software program which includes means for tagging each of the pulses from each of the detectors for both time and position, means for counting one of the pulses from a particular position, and means for rejecting those of the pulses which originate from the particular position and in a time interval on the order of the neutron die-away time in polyethylene or other shield material. The neutron detectors are grouped in pods, preferably at least 10. The apparatus also includes means for vetoing the counting of coincidence pulses from all of the detectors included in each of the pods which are adjacent to the pod which includes the detector which produced the pulse which was counted.
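The software veto described above (tag each pulse with a time and a pod position, count it, then reject later pulses from the same pod within roughly a die-away time) can be sketched in a few lines. The pulse list and window below are illustrative values only; the logic follows the abstract, but the adjacent-pod veto and the full coincidence electronics are not modeled.

```python
from collections import defaultdict

def localized_veto_count(pulses, veto_window=128e-6):
    """Count pulses subject to a localized veto: a pulse is counted, and any
    further pulses from the *same* pod within the veto window are rejected.

    pulses      : iterable of (time_seconds, pod_id), assumed time-sorted
    veto_window : veto length; 50-200 us in the abstract, 128 us preferred
    """
    last_counted = defaultdict(lambda: -float("inf"))
    accepted = 0
    for t, pod in pulses:
        if t - last_counted[pod] >= veto_window:
            accepted += 1
            last_counted[pod] = t
    return accepted

# Tiny illustration: a cosmic-ray burst puts three pulses in pod 3 within 10 us;
# only the first survives the veto, while the pulse in pod 7 is unaffected.
events = [(0.0, 3), (4e-6, 3), (9e-6, 3), (12e-6, 7), (300e-6, 3)]
print(localized_veto_count(events))   # -> 3
```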

  17. FLOP Counts for "Small" Single-Node Miniapplication Tests

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

FLOP Counts for "Small" Single-Node Miniapplication Tests. These data, obtained using the NERSC Hopper system, are provided for reference.
Code       MPI Tasks   Threads   Reference TFLOP Count   Benchmark Time (seconds)     Iterations
miniFE     144         1         5.05435E+12             130.2 (total program time)
miniGhost  96          1         6.55500E+12             76.5
AMG        96          1         1.30418E+12             66.95                        18
UMT        96          1         1.30211E+13             416.99                       49
SNAP       96          1         5.84246E+11             15.37                        3059
miniDFT    40          1         (entry truncated in source)

  18. Apparatus and method for temperature correction and expanded count rate of inorganic scintillation detectors

    DOE Patents [OSTI]

    Ianakiev, Kiril D.; Hsue, Sin Tao; Browne, Michael C.; Audia, Jeffrey M.

    2006-07-25

The present invention includes an apparatus and corresponding method for temperature correction and count rate expansion of inorganic scintillation detectors. A temperature sensor is attached to an inorganic scintillation detector. The inorganic scintillation detector, due to interaction with incident radiation, creates light pulse signals. A photoreceiver converts the light pulse signals to current signals. Temperature correction circuitry uses a fast light component signal, a slow light component signal, and the temperature signal from the temperature sensor to correct the inorganic scintillation detector signal output and expand the count rate.

  19. Powered by NERSC, a Database of Billions of Genes and Counting!

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

Powered by NERSC, a Database of Billions of Genes and Counting! With More than a Billion Microbial genes, IMG/M Breaks a Record January 26, 2012 Linda Vu, lvu@lbl.gov, +1 510 495 2402 IMG/M team celebrates the recording of 1 billionth gene. Microbes are microscopic organisms that live in every nook and cranny of our planet. Without them, plants wouldn't grow, garbage wouldn't decay, humans wouldn't digest food, and there would

  20. ULTRAVIOLET NUMBER COUNTS OF GALAXIES FROM SWIFT ULTRAVIOLET/OPTICAL TELESCOPE DEEP IMAGING OF THE CHANDRA DEEP FIELD SOUTH

    SciTech Connect (OSTI)

    Hoversten, E. A.; Gronwall, C.; Koch, T. S.; Roming, P. W. A.; Siegel, M. H.; Berk, D. E. Vanden; Breeveld, A. A.; Curran, P. A.; Still, M.

    2009-11-10

Deep Swift UV/Optical Telescope (UVOT) imaging of the Chandra Deep Field South is used to measure galaxy number counts in three near-ultraviolet (NUV) filters (uvw2: 1928 Å, uvm2: 2246 Å, and uvw1: 2600 Å) and the u band (3645 Å). UVOT observations cover the break in the slope of the NUV number counts with greater precision than the number counts by the Hubble Space Telescope Space Telescope Imaging Spectrograph and the Galaxy Evolution Explorer, spanning a range 21 ≲ m_AB ≲ 25. Model number counts confirm earlier investigations in favoring models with an evolving galaxy luminosity function.

  1. Historical review of lung counting efficiencies for low energy photon emitters

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Jeffers, Karen L.; Hickman, David P.

    2014-03-01

    This publication reviews the measured efficiency and variability over time of a high purity planar germanium in vivo lung count system for multiple photon energies using increasingly thick overlays with the Lawrence Livermore Torso Phantom. Furthermore, the measured variations in efficiency are compared with the current requirement for in vivo bioassay performance as defined by the American National Standards Institute Standard.

  2. Calibration of the Accuscan II In Vivo System for Whole Body Counting

    SciTech Connect (OSTI)

    Orval R. Perry; David L. Georgeson

    2011-08-01

    This report describes the April 2011 calibration of the Accuscan II HpGe In Vivo system for whole body counting. The source used for the calibration was a NIST traceable BOMAB manufactured by DOE as INL2006 BOMAB containing Eu-154, Eu-155, Eu-152, Sb-125 and Y-88 with energies from 27 keV to 1836 keV with a reference date of 11/29/2006. The actual usable energy range was 86.5 keV to 1597 keV on 4/21/2011. The BOMAB was constructed inside the Accuscan II counting 'tub' in the order of legs, thighs, abdomen, thorax/arms, neck, and head. Each piece was taped to the backwall of the counter. The arms were taped to the thorax. The phantom was constructed between the v-ridges on the backwall of the Accuscan II counter. The energy and efficiency calibrations were performed using the INL2006 BOMAB. The calibrations were performed with the detectors in the scanning mode. This report includes an overview introduction and records for the energy/FWHM and efficiency calibration including performance verification and validation counting. The Accuscan II system was successfully calibrated for whole body counting and verified in accordance with ANSI/HPS N13.30-1996 criteria.
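As background on what an efficiency calibration of this kind produces, the sketch below computes full-energy-peak efficiencies for a set of calibration lines and fits a log-log efficiency curve. The line energies, net counts, activities, and yields are placeholders, not the INL2006 BOMAB certificate values, and the Accuscan II scanning-geometry corrections are not modeled.

```python
import numpy as np

def fep_efficiency(net_counts, live_time_s, activity_bq, gamma_yield):
    """Full-energy-peak efficiency for one line: counts divided by the number
    of gamma rays of that energy emitted during the count."""
    return net_counts / (activity_bq * gamma_yield * live_time_s)

# Placeholder calibration lines (keV) with made-up net counts, activities, and
# yields; these are NOT the certificate values for the phantom described above.
energy = np.array([86.5, 105.3, 723.3, 898.0, 1274.4, 1596.5])
lines = [(52000, 4.1e4, 0.307), (31000, 4.1e4, 0.211), (28000, 3.6e4, 0.202),
         (30100, 2.5e4, 0.937), (21000, 3.3e4, 0.348), (12000, 3.6e4, 0.181)]
eff = np.array([fep_efficiency(nc, 3600.0, act, y) for nc, act, y in lines])

# Fit ln(efficiency) as a polynomial in ln(energy), a common way to build an
# efficiency curve that can then be interpolated to other gamma energies.
coeffs = np.polyfit(np.log(energy), np.log(eff), deg=3)
print(np.exp(np.polyval(coeffs, np.log(662.0))))   # efficiency estimate at 662 keV
```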

  3. A New Technique for Studying the Fano Factor And the Mean Energy Per Ion Pair in Counting Gases

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Panksky, A.; Breskin, A.; Chechik, R.

    1996-04-01

    A new method is presented for deriving the Fano factor and the mean energy per ion pair in the ultrasoft x-ray energy range. It is based on counting electrons deposited by a photon in a low-pressure gas, and is applicable for all counting gases. The energy dependence of these parameters for several hydrocarbons and gas mixtures is presented.

  4. Wedge sampling for computing clustering coefficients and triangle counts on large graphs

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Seshadhri, C.; Pinar, Ali; Kolda, Tamara G.

    2014-05-08

Graphs are used to model interactions in a variety of contexts, and there is a growing need to quickly assess the structure of such graphs. Some of the most useful graph metrics are based on triangles, such as those measuring social cohesion. Despite the importance of these triadic measures, algorithms to compute them can be extremely expensive. We discuss the method of wedge sampling. This versatile technique allows for the fast and accurate approximation of various types of clustering coefficients and triangle counts. Furthermore, these techniques are extensible to counting directed triangles in digraphs. Our methods come with provable and practical time-approximation tradeoffs for all computations. We provide extensive results that show our methods are orders of magnitude faster than the state of the art, while providing nearly the accuracy of full enumeration.
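The core of the wedge-sampling idea can be sketched compactly: sample wedge centers in proportion to their wedge counts and report the fraction of sampled wedges that are closed, which estimates the global clustering coefficient. The sketch omits the paper's weighting variants, error bounds, and directed-triangle extensions.

```python
import random
from collections import defaultdict

def sample_clustering_coefficient(edges, n_samples=100_000, seed=0):
    """Estimate the global clustering coefficient by wedge sampling: pick
    wedges (paths u-v-w centered at v) uniformly at random and report the
    fraction that are closed by an edge u-w."""
    rng = random.Random(seed)
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes = [v for v in adj if len(adj[v]) >= 2]
    # number of wedges centered at v is C(deg(v), 2); sample centers accordingly
    weights = [len(adj[v]) * (len(adj[v]) - 1) // 2 for v in nodes]
    closed = 0
    for _ in range(n_samples):
        v = rng.choices(nodes, weights=weights, k=1)[0]
        u, w = rng.sample(sorted(adj[v]), 2)
        if w in adj[u]:
            closed += 1
    return closed / n_samples

# Example: a triangle plus a pendant edge has 5 wedges, 3 of them closed (0.6).
print(sample_clustering_coefficient([(1, 2), (2, 3), (1, 3), (3, 4)]))
```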

  5. Improving Neutron Measurement Capabilities; Expanding the Limits of Correlated Neutron Counting

    SciTech Connect (OSTI)

    Santi, Peter Angelo; Geist, William H.; Dougan, Arden

    2015-11-05

    A number of technical and practical limitations exist within the neutron correlated counting techniques used in safeguards, especially within the algorithms that are used to process and analyze the detected neutron signals. A multi-laboratory effort is underway to develop new and improved analysis and data processing algorithms based on fundamental physics principles to extract additional or more accurate information about nuclear material bearing items.

  6. $598,890 raised for northern New Mexico students, and counting...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

$598,890 raised for northern New Mexico students, and counting... LAESF donations give scholars of all income levels access to higher education July 1, 2015 Celebrating a new record-breaking total for the Los Alamos Employees' Scholarship Fund are, from left, scholarship program chair Steven Girrens (Associate Director for

  7. Particle count monitoring of reverse osmosis water treatment for removal of low-level radionuclides

    SciTech Connect (OSTI)

    Moritz, E.J.; Hoffman, C.R.; Hergert, T.R.

    1995-03-01

Laser diode particle counting technology and analytical measurements were used to evaluate a pilot-scale reverse osmosis (RO) water treatment system for removal of particulate matter and sub-picocurie low-level radionuclides. Stormwater mixed with Waste Water Treatment Plant (WWTP) effluent from the Rocky Flats Environmental Technology Site (RFETS), formerly a Department of Energy (DOE) nuclear weapons production facility, was treated. No chemical pretreatment of the water was utilized during this study. The treatment system was staged as follows: multimedia filtration, granular activated carbon adsorption, hollow tube ultrafiltration, and reverse osmosis membrane filtration. Various recovery rates and two RO membrane models were tested. Analytical measurements included total suspended solids (TSS), total dissolved solids (TDS), gross alpha (α) and gross beta (β) activity, uranium isotopes 233/234U and 238U, plutonium 239/240Pu, and americium 241Am. Particle measurements between 1 and 150 microns (µm) included differential particle counts (DPC) and total particle counts (TPC) before and after treatment at various sampling points throughout the test. Performance testing showed this treatment system produced a high quality effluent in clarity and purity. Compared to raw water levels, TSS was reduced to below the detection limit of 5 milligrams per liter (mg/L) and TDS was reduced by 98%. Gross α was essentially removed 100%, and gross β was reduced an average of 94%. Uranium activity was reduced by 99%. TPC between 1 and 150 µm were reduced by an average 99.8% to less than 1,000 counts per milliliter (mL), similar in purity to a good drinking water treatment plant. Raw water levels of 239/240Pu and 241Am were below reliable quantitation limits and thus no removal efficiencies could be determined for these species.

  8. Counts-in-Cylinders in the Sloan Digital Sky Survey with Comparisons to N-Body

    SciTech Connect (OSTI)

    Berrier, Heather D.; Barton, Elizabeth J.; Berrier, Joel C.; Bullock, James S.; Zentner, Andrew R.; Wechsler, Risa H. /KIPAC, Menlo Park /SLAC

    2010-12-16

Environmental statistics provide a necessary means of comparing the properties of galaxies in different environments and a vital test of models of galaxy formation within the prevailing, hierarchical cosmological model. We explore counts-in-cylinders, a common statistic defined as the number of companions of a particular galaxy found within a given projected radius and redshift interval. Galaxy distributions with the same two-point correlation functions do not necessarily have the same companion count distributions. We use this statistic to examine the environments of galaxies in the Sloan Digital Sky Survey, Data Release 4. We also make preliminary comparisons to four models for the spatial distributions of galaxies, based on N-body simulations, and data from SDSS DR4 to study the utility of the counts-in-cylinders statistic. There is a very large scatter between the number of companions a galaxy has and the mass of its parent dark matter halo and the halo occupation, limiting the utility of this statistic for certain kinds of environmental studies. We also show that prevalent, empirical models of galaxy clustering that match observed two- and three-point clustering statistics well fail to reproduce some aspects of the observed distribution of counts-in-cylinders on 1, 3, and 6 h⁻¹ Mpc scales. All models that we explore underpredict the fraction of galaxies with few or no companions in 3 and 6 h⁻¹ Mpc cylinders. Roughly 7% of galaxies in the real universe are significantly more isolated within a 6 h⁻¹ Mpc cylinder than the galaxies in any of the models we use. Simple, phenomenological models that map galaxies to dark matter halos fail to reproduce high-order clustering statistics in low-density environments.
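A bare-bones version of the counts-in-cylinders statistic is easy to write down. The sketch below works in comoving Cartesian coordinates rather than the survey's RA/Dec/redshift geometry and uses a brute-force O(N²) pair search, so it is illustrative only, not the analysis pipeline used in the paper.

```python
import numpy as np

def counts_in_cylinders(x, y, z_los, r_proj, dz):
    """For each galaxy, count the other galaxies within a projected radius
    r_proj and within dz along the line of sight (here the z_los axis)."""
    x, y, z_los = (np.asarray(a, dtype=float) for a in (x, y, z_los))
    dx = x[:, None] - x[None, :]
    dy = y[:, None] - y[None, :]
    dzv = np.abs(z_los[:, None] - z_los[None, :])
    in_cyl = (dx**2 + dy**2 <= r_proj**2) & (dzv <= dz)
    np.fill_diagonal(in_cyl, False)   # a galaxy is not its own companion
    return in_cyl.sum(axis=1)

# Toy example: the first two galaxies are companions of each other; the third is isolated.
print(counts_in_cylinders([0, 1, 50], [0, 1, 0], [0, 2, 0], r_proj=3.0, dz=6.0))  # [1 1 0]
```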

  9. Cosmology Constraints from the Weak Lensing Peak Counts and the Power Spectrum in CFHTLenS

    SciTech Connect (OSTI)

    Liu, Jia; May, Morgan; Petri, Andrea; Haiman, Zoltan; Hui, Lam; Kratochvil, Jan M.

    2015-03-04

Lensing peaks have been proposed as a useful statistic, containing cosmological information from non-Gaussianities that is inaccessible from traditional two-point statistics such as the power spectrum or two-point correlation functions. Here we examine constraints on cosmological parameters from weak lensing peak counts, using the publicly available data from the 154 deg² CFHTLenS survey. We utilize a new suite of ray-tracing N-body simulations on a grid of 91 cosmological models, covering broad ranges of the three parameters Ωm, σ8, and w, and replicating the galaxy sky positions, redshifts, and shape noise in the CFHTLenS observations. We then build an emulator that interpolates the power spectrum and the peak counts to an accuracy of ≤ 5%, and compute the likelihood in the three-dimensional parameter space (Ωm, σ8, w) from both observables. We find that constraints from peak counts are comparable to those from the power spectrum, and somewhat tighter when different smoothing scales are combined. Neither observable can constrain w without external data. When the power spectrum and peak counts are combined, the area of the error "banana" in the (Ωm, σ8) plane reduces by a factor of ≈ two, compared to using the power spectrum alone. For a flat Λ cold dark matter model, combining both statistics, we obtain the constraint σ8(Ωm/0.27)^0.63 = 0.85 ± 0.03.
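As an illustration of the peak-counts statistic itself (not the CFHTLenS pipeline or the emulator described above), the sketch below smooths a convergence map, flags pixels that exceed their eight neighbours, and histograms them by signal-to-noise. The smoothing scale, map size, and noise normalization are arbitrary toy choices.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def peak_counts(kappa_map, smoothing_pix, snr_bins):
    """Count local maxima of a smoothed convergence map, binned by S/N.

    A pixel is called a peak when it is the maximum of its 3x3 neighbourhood.
    For this toy, S/N is defined against the standard deviation of the
    smoothed map itself; a real analysis would use the shape-noise level.
    """
    smoothed = gaussian_filter(kappa_map, smoothing_pix)
    snr = smoothed / smoothed.std()
    is_peak = smoothed == maximum_filter(smoothed, size=3)
    counts, _ = np.histogram(snr[is_peak], bins=snr_bins)
    return counts

# Toy example: peaks of a pure-noise map (no lensing signal), per S/N bin.
rng = np.random.default_rng(3)
noise_map = rng.normal(0.0, 0.02, size=(512, 512))
print(peak_counts(noise_map, smoothing_pix=2.0, snr_bins=np.arange(-2, 6.5, 0.5)))
```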

  10. Cosmology Constraints from the Weak Lensing Peak Counts and the Power Spectrum in CFHTLenS

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Liu, Jia; May, Morgan; Petri, Andrea; Haiman, Zoltan; Hui, Lam; Kratochvil, Jan M.

    2015-03-04

    Lensing peaks have been proposed as a useful statistic, containing cosmological information from non-Gaussianities that is inaccessible from traditional two-point statistics such as the power spectrum or two-point correlation functions. Here we examine constraints on cosmological parameters from weak lensing peak counts, using the publicly available data from the 154 deg² CFHTLenS survey. We utilize a new suite of ray-tracing N-body simulations on a grid of 91 cosmological models, covering broad ranges of the three parameters Ωm, σ8, and w, and replicating the galaxy sky positions, redshifts, and shape noise in the CFHTLenS observations. We then build an emulator that interpolates the power spectrum and the peak counts to an accuracy of ≤ 5%, and compute the likelihood in the three-dimensional parameter space (Ωm, σ8, w) from both observables. We find that constraints from peak counts are comparable to those from the power spectrum, and somewhat tighter when different smoothing scales are combined. Neither observable can constrain w without external data. When the power spectrum and peak counts are combined, the area of the error “banana” in the (Ωm, σ8) plane reduces by a factor of ≈ two, compared to using the power spectrum alone. For a flat Λ cold dark matter model, combining both statistics, we obtain the constraint σ8(Ωm/0.27)^0.63 = 0.85 ± 0.03.

  11. The effect of deadtime and electronic transients on the predelay bias in neutron coincidence counting

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Croft, Stephen; Favalli, Andrea; Swinhoe, Martyn T.; Goddard, Braden; Stewart, Scott

    2016-01-13

    In neutron coincidence counting using the shift register autocorrelation technique, a predelay is inserted before the opening of the (R+A)-gate. Operationally the purpose of the predelay is to ensure that the (R+A)- and A-gates have matched effectiveness, otherwise a bias will result when the difference between the gates is used to calculate the accidentals corrected net reals coincidence rate. The necessity for the predelay was established experimentally in the early practical development and deployment of the coincidence counting method. The choice of predelay for a given detection system is usually made experimentally, but even today long standing traditional values (e.g., 4.5 µs) are often used. This, at least in part, reflects the fact that a deep understanding of why a finite predelay setting is needed and how to control the underlying influences has not been fully worked out. We attempt, in this paper, to gain some insight into the problem. One aspect we consider is the slowing down, thermalization, and diffusion of neutrons in the detector moderator. The other is the influence of deadtime and electronic transients. These may be classified as non-ideal detector behaviors because they are not included in the conventional model used to interpret measurement data. From improved understanding of the effect of deadtime and electronic transients on the predelay bias in neutron coincidence counting, the performance of both future and current coincidence counters may be improved.
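    A minimal sketch of the gate arithmetic behind the (R+A) and A gates, assuming a triggered-gate analysis of a time-stamped pulse train: each detected pulse opens a coincidence gate after the predelay and an accidentals gate after a long delay, and the net reals rate is their difference. The predelay, gate width, and long-delay values are common conventions used here for illustration, not settings from the paper.

```python
import numpy as np

def shift_register_rates(times, predelay=4.5e-6, gate=64e-6, long_delay=4e-3):
    """Per-trigger (R+A), A, and net reals counts for a list of neutron
    pulse arrival times (seconds)."""
    times = np.sort(np.asarray(times))

    def gate_counts(delay):
        lo = np.searchsorted(times, times + delay)
        hi = np.searchsorted(times, times + delay + gate)
        return hi - lo

    ra = gate_counts(predelay).mean()     # (R+A) gate, opened after the predelay
    a = gate_counts(long_delay).mean()    # A gate, opened long after the trigger
    return ra, a, ra - a                  # accidentals-corrected net reals

# Toy usage: an uncorrelated (Poisson) pulse train should give reals ~ 0
rng = np.random.default_rng(2)
t = np.cumsum(rng.exponential(1.0 / 5e4, size=200_000))   # ~50 kcps train
print(shift_register_rates(t))
```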

  12. Calibration of the Accuscan II In Vivo System for I-125 Thyroid Counting

    SciTech Connect (OSTI)

    Ovard R. Perry; David L. Georgeson

    2011-07-01

    This report describes the March 2011 calibration of the Accuscan II HpGe In Vivo system for I-125 thyroid counting. The source used for the calibration was a DOE manufactured Am-241/Eu-152 source contained in a 22 ml vial BEA Am-241/Eu-152 RMC II-1 with energies from 26 keV to 344 keV. The center of the detector housing was positioned 64 inches from the vault floor. This position places the approximate center line of the detector housing at the center line of the source in the phantom thyroid tube. The energy and efficiency calibration were performed using an RMC II phantom (Appendix J). Performance testing was conducted using source BEA Am-241/Eu-152 RMC II-1 and Validation testing was performed using an I-125 source in a 30 ml vial (I-125 BEA Thyroid 002) and an ANSI N44.3 phantom (Appendix I). This report includes an overview introduction and records for the energy/FWHM and efficiency calibration including performance verification and validation counting. The Accuscan II system was successfully calibrated for counting the thyroid for I-125 and verified in accordance with ANSI/HPS N13.30-1996 criteria.

  13. Calibration of the Accuscan II In Vivo System for I-131 Thyroid Counting

    SciTech Connect (OSTI)

    Orval R. Perry; David L. Georgeson

    2011-07-01

    This report describes the March 2011 calibration of the Accuscan II HpGe In Vivo system for I-131 thyroid counting. The source used for the calibration was an Analytics mixed gamma source 82834-121 distributed in an epoxy matrix in a Wheaton Liquid Scintillation Vial with energies from 88.0 keV to 1836.1 keV. The center of the detectors was positioned 64 inches from the vault floor. This position places the approximate center line of the detectors at the center line of the source in the thyroid tube. The calibration was performed using an RMC II phantom (Appendix J). Validation testing was performed using a Ba-133 source and an ANSI N44.3 Phantom (Appendix I). This report includes an overview introduction and records for the energy/FWHM and efficiency calibrations including verification counting. The Accuscan II system was successfully calibrated for counting the thyroid for I-131 and verified in accordance with ANSI/HPS N13.30-1996 criteria.

  14. The number counts and infrared backgrounds from infrared-bright galaxies

    SciTech Connect (OSTI)

    Hacking, P.B.; Soifer, B.T. (California Institute of Technology, Pasadena)

    1991-02-01

    Extragalactic number counts and diffuse backgrounds at 25, 60, and 100 microns are predicted using new luminosity functions and improved spectral-energy distribution density functions derived from IRAS observations of nearby galaxies. Galaxies at redshifts z less than 3 that are like those in the local universe should produce a minimum diffuse background of 0.0085, 0.038, and 0.13 MJy/sr at 25, 60, and 100 microns, respectively. Models with significant luminosity evolution predict backgrounds about a factor of 4 greater than this minimum. 22 refs.

  15. Characterization of energy response for photon-counting detectors using x-ray fluorescence

    SciTech Connect (OSTI)

    Ding, Huanjun; Cho, Hyo-Min; Molloi, Sabee; Barber, William C.; Iwanczyk, Jan S.

    2014-12-15

    Purpose: To investigate the feasibility of characterizing a Si strip photon-counting detector using x-ray fluorescence. Methods: X-ray fluorescence was generated by using a pencil beam from a tungsten anode x-ray tube with 2 mm Al filtration. Spectra were acquired at 90° from the primary beam direction with an energy-resolved photon-counting detector based on an edge illuminated Si strip detector. The distances from the source to target and the target to detector were approximately 19 and 11 cm, respectively. Four different materials, containing silver (Ag), iodine (I), barium (Ba), and gadolinium (Gd), were placed in small plastic containers with a diameter of approximately 0.7 cm for x-ray fluorescence measurements. Linear regression analysis was performed to derive the gain and offset values for the correlation between the measured fluorescence peak center and the known fluorescence energies. The energy resolutions and charge-sharing fractions were also obtained from analytical fittings of the recorded fluorescence spectra. An analytical model, which employed four parameters that can be determined from the fluorescence calibration, was used to estimate the detector response function. Results: Strong fluorescence signals of all four target materials were recorded with the investigated geometry for the Si strip detector. The average gain and offset of all pixels for detector energy calibration were determined to be 6.95 mV/keV and −66.33 mV, respectively. The detector's energy resolution remained at approximately 2.7 keV for low energies, and increased slightly at 45 keV. The average charge-sharing fraction was estimated to be 36% within the investigated energy range of 20–45 keV. The simulated detector output based on the proposed response function agreed well with the experimental measurement. Conclusions: The performance of a spectral imaging system using energy-resolved photon-counting detectors is very dependent on the energy calibration of the detector. The

  16. Longitudinal Bunch Pattern Measurements through Single Photon Counting at SPEAR3

    SciTech Connect (OSTI)

    Wang, Hongyi; /UC, San Diego

    2012-09-07

    The Stanford Synchrotron Radiation Lightsource (SSRL), a division of SLAC National Accelerator Laboratory, is a synchrotron light source that provides x-rays for experimental use. As electrons are bent in the storage ring, they emit electromagnetic radiation. There are 372 different buckets which electrons can be loaded into. Different filling patterns produce different types of x-rays. What is the bunch pattern at a given time? Which filling pattern is better? Are there any flaws to the current injection system? These questions can be answered with this single photon counting experiment.

  17. Laboratory adds a sixth R&D 100 award to its 2009 count

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    This year's awards bring the Los Alamos total to 113 since the Laboratory first entered the competition in 1978. November 4, 2009. Los Alamos National Laboratory sits on top of a once-remote mesa in northern New Mexico with the Jemez Mountains as a backdrop to research and innovation covering multiple disciplines, from bioscience and sustainable energy sources to plasma physics and new materials.

  18. Liquid scintillation counting methodology for 99Tc analysis. A remedy for radiopharmaceutical waste

    SciTech Connect (OSTI)

    Khan, Mumtaz; Um, Wooyong

    2015-08-13

    This paper presents a new approach for liquid scintillation counting (LSC) analysis of single-radionuclide samples containing appreciable organic or inorganic quench. This work offers better analytical results than existing LSC methods for technetium-99 (99gTc) analysis with significant savings in analysis cost and time. The method was developed to quantify 99gTc in environmental liquid and urine samples using LSC. Method efficiency was measured in the presence of 1.9 to 11,900 ppm total dissolved solids. The quench curve proved effective for calculating spiked 99gTc activity in deionized water, tap water, groundwater, seawater, and urine samples. Counting efficiency was found to be 91.66% for Ultima Gold LLT (ULG-LLT) and Ultima Gold (ULG). Relative error in spiked 99gTc samples was ±3.98% in ULG and ULG-LLT cocktails. Minimum detectable activity was determined to be 25.3 mBq and 22.7 mBq for ULG-LLT and ULG cocktails, respectively. A pre-concentration factor of 1000 was achieved at 100°C with 100% chemical recovery.
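    As a generic illustration of how a quench curve is applied in LSC work like this, the sketch below fits efficiency versus a quench-indicating parameter and converts a net count rate to activity. The curve coefficients, QIP values, and sample volume are invented placeholders, not the calibration reported here.

```python
import numpy as np

# Hypothetical quench-curve points: counting efficiency versus the
# quench indicating parameter (QIP) of a set of quenched standards.
quench_qip = np.array([600.0, 650.0, 700.0, 750.0, 800.0])
quench_eff = np.array([0.80, 0.85, 0.89, 0.91, 0.92])
coeffs = np.polyfit(quench_qip, quench_eff, 2)     # quadratic quench curve

def activity_bq_per_ml(gross_cpm, bkg_cpm, qip, volume_ml=10.0):
    """Activity concentration from a net LSC count rate, using the
    efficiency read off the fitted quench curve at the sample's QIP."""
    eff = np.polyval(coeffs, qip)
    net_cps = (gross_cpm - bkg_cpm) / 60.0
    return net_cps / eff / volume_ml

print(activity_bq_per_ml(gross_cpm=350.0, bkg_cpm=20.0, qip=720.0))
```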

  19. Sequential data assimilation for single-molecule FRET photon-counting data

    SciTech Connect (OSTI)

    Matsunaga, Yasuhiro; Kidera, Akinori; Sugita, Yuji

    2015-06-07

    Data assimilation is a statistical method designed to improve the quality of numerical simulations in combination with real observations. Here, we develop a sequential data assimilation method that incorporates one-dimensional time-series data of smFRET (single-molecule Förster resonance energy transfer) photon-counting into conformational ensembles of biomolecules derived from replicated molecular dynamics (MD) simulations. A particle filter using a large number of replicated MD simulations with a likelihood function for smFRET photon-counting data is employed to screen the conformational ensembles that match the experimental data. We examine the performance of the method using emulated smFRET data and coarse-grained (CG) MD simulations of a dye-labeled polyproline-20. The method estimates the dynamics of the end-to-end distance from smFRET data as well as revealing that of latent conformational variables. The particle filter is also able to correct model parameter dependence in CG MD simulations. We discuss the applicability of the method to real experimental data for conformational dynamics of biomolecules.

  20. Association Between White Blood Cell Count Following Radiation Therapy With Radiation Pneumonitis in Non-Small Cell Lung Cancer

    SciTech Connect (OSTI)

    Tang, Chad; Gomez, Daniel R. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Wang, Hongmei [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Department of Radiation Oncology, Nanfang Hospital, Southern Medical University, Guangzhou (China); Levy, Lawrence B. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhuang, Yan [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Xu, Ting; Nguyen, Quynh; Komaki, Ritsuko [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Liao, Zhongxing, E-mail: zliao@mdanderson.org [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States)

    2014-02-01

    Purpose: Radiation pneumonitis (RP) is an inflammatory response to radiation therapy (RT). We assessed the association between RP and white blood cell (WBC) count, an established metric of systemic inflammation, after RT for non-small cell lung cancer. Methods and Materials: We retrospectively analyzed 366 patients with non-small cell lung cancer who received ≥60 Gy as definitive therapy. The primary endpoint was whether WBC count after RT (defined as 2 weeks through 3 months after RT completion) was associated with grade ≥3 or grade ≥2 RP. Median lung volume receiving ≥20 Gy (V20) was 31%, and post-RT WBC counts ranged from 1.7 to 21.2 × 10³ WBCs/µL. Odds ratios (ORs) associating clinical variables and post-RT WBC counts with RP were calculated via logistic regression. A recursive-partitioning algorithm was used to define optimal post-RT WBC count cut points. Results: Post-RT WBC counts were significantly higher in patients with grade ≥3 RP than without (P<.05). Optimal cut points for post-RT WBC count were found to be 7.4 and 8.0 × 10³/µL for grade ≥3 and ≥2 RP, respectively. Univariate analysis revealed significant associations between post-RT WBC count and grade ≥3 (n=46, OR=2.6, 95% confidence interval [CI] 1.4–4.9, P=.003) and grade ≥2 RP (n=164, OR=2.0, 95% CI 1.2–3.4, P=.01). This association held in a stepwise multivariate regression. Of note, V20 was found to be significantly associated with grade ≥2 RP (OR=2.2, 95% CI 1.2–3.4, P=.01) and trended toward significance for grade ≥3 RP (OR=1.9, 95% CI 1.0–3.5, P=.06). Conclusions: Post-RT WBC counts were significantly and independently associated with RP and have potential utility as a diagnostic or predictive marker for this toxicity.
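    The odds-ratio analysis described above can be illustrated with a short, generic sketch: fit a logistic regression of the pneumonitis outcome on post-RT WBC count and V20, then exponentiate the coefficients. The data below are synthetic and the variable names are assumptions for illustration only; this is not the study dataset or analysis code.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-ins: post-RT WBC count (10^3/uL), V20 (%), and a binary
# radiation pneumonitis outcome generated from an assumed logistic model.
rng = np.random.default_rng(3)
n = 366
wbc = rng.normal(7.5, 2.0, n)
v20 = rng.normal(31.0, 8.0, n)
p_rp = 1.0 / (1.0 + np.exp(-(-6.0 + 0.4 * wbc + 0.05 * v20)))
rp = rng.binomial(1, p_rp)

# Multivariate logistic regression; exponentiated coefficients are the
# odds ratios associating each covariate with pneumonitis.
X = sm.add_constant(np.column_stack([wbc, v20]))
fit = sm.Logit(rp, X).fit(disp=0)
print("OR (WBC, V20):", np.exp(fit.params[1:]))
print("95% CI:", np.exp(fit.conf_int()[1:]))
```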

  1. Full counting statistics of energy fluctuations in a driven quantum resonator

    SciTech Connect (OSTI)

    Clerk, A. A.

    2011-10-15

    We consider the statistics of time-integrated energy fluctuations of a driven bosonic single-mode resonator, as measured by a quantum nondemolition (QND) detector, using the standard Keldysh prescription to define higher moments. We find that, due to an effective cascading of fluctuations, these statistics are surprisingly nonclassical: the low-temperature, quantum probability distribution is not equivalent to the high-temperature classical distribution evaluated at some effective temperature. Moreover, for a sufficiently large drive detuning and low temperatures, the Keldysh-ordered quasiprobability distribution characterizing these fluctuations fails to be positive-definite; this is similar to the full counting statistics of charge in superconducting systems. We argue that this indicates a kind of nonclassical behavior akin to that tested by Leggett-Garg inequalities.

  2. On the single-photon-counting (SPC) modes of imaging using an XFEL source

    SciTech Connect (OSTI)

    Wang, Zhehui

    2015-12-14

    In this study, the requirements to achieve high detection efficiency (above 50%) and gigahertz (GHz) frame rate for the proposed 42-keV X-ray free-electron laser (XFEL) at Los Alamos are summarized. Direct detection scenarios using C (diamond), Si, Ge and GaAs semiconductor sensors are analyzed. Single-photon counting (SPC) mode and weak SPC mode using Si can potentially meet the efficiency and frame rate requirements and be useful to both photoelectric absorption and Compton physics as the photon energy increases. Multilayer three-dimensional (3D) detector architecture, as a possible means to realize SPC modes, is compared with the widely used two-dimensional (2D) hybrid planar electrode structure and 3D deeply entrenched electrode architecture. Demonstration of thin film cameras less than 100-μm thick with onboard thin ASICs could be an initial step to realize multilayer 3D detectors and SPC modes for XFELs.

  3. Analysis of 161Tb by radiochemical separation and liquid scintillation counting

    SciTech Connect (OSTI)

    Jiang, J.; Davies, A.; Arrigo, L.; Friese, J.; Seiner, B. N.; Greenwood, L.; Finch, Z.

    2015-12-05

    The determination of 161Tb activity is problematic due to its very low fission yield, short half-life, and the complication of its gamma spectrum. At AWE, radiochemically purified 161Tb solution was measured on a PerkinElmer 1220 QuantulusTM Liquid Scintillation Spectrometer. Since there was no 161Tb certified standard solution available commercially, the counting efficiency was determined by the CIEMAT/NIST Efficiency Tracing method. The method was validated during a recent inter-laboratory comparison exercise involving the analysis of a uranium sample irradiated with thermal neutrons. Lastly, the measured 161Tb result was in excellent agreement with the result using gamma spectrometry and the result obtained by Pacific Northwest National Laboratory.

  4. Analysis of 161Tb by radiochemical separation and liquid scintillation counting

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Jiang, J.; Davies, A.; Arrigo, L.; Friese, J.; Seiner, B. N.; Greenwood, L.; Finch, Z.

    2015-12-05

    The determination of 161Tb activity is problematic due to its very low fission yield, short half-life, and the complication of its gamma spectrum. At AWE, radiochemically purified 161Tb solution was measured on a PerkinElmer 1220 QuantulusTM Liquid Scintillation Spectrometer. Since there was no 161Tb certified standard solution available commercially, the counting efficiency was determined by the CIEMAT/NIST Efficiency Tracing method. The method was validated during a recent inter-laboratory comparison exercise involving the analysis of a uranium sample irradiated with thermal neutrons. Lastly, the measured 161Tb result was in excellent agreement with the result using gamma spectrometry and the result obtained by Pacific Northwest National Laboratory.

  5. Measurement of uranium and plutonium in solid waste by passive photon or neutron counting and isotopic neutron source interrogation

    SciTech Connect (OSTI)

    Crane, T.W.

    1980-03-01

    A summary of the status and applicability of nondestructive assay (NDA) techniques for the measurement of uranium and plutonium in 55-gal barrels of solid waste is reported. The NDA techniques reviewed include passive gamma-ray and x-ray counting with scintillator, solid state, and proportional gas photon detectors, passive neutron counting, and active neutron interrogation with neutron and gamma-ray counting. The active neutron interrogation methods are limited to those employing isotopic neutron sources. Three generic neutron sources (alpha-n, photoneutron, and ²⁵²Cf) are considered. The neutron detectors reviewed for both prompt and delayed fission neutron detection with the above sources include thermal (³He, ¹⁰BF₃) and recoil (⁴He, CH₄) proportional gas detectors and liquid and plastic scintillator detectors. The instrument found to be best suited for low-level measurements (< 10 nCi/g) is the ²⁵²Cf Shuffler. The measurement technique consists of passive neutron counting followed by cyclic activation using a ²⁵²Cf source and delayed neutron counting with the source withdrawn. It is recommended that a waste assay station composed of a ²⁵²Cf Shuffler, a gamma-ray scanner, and a screening station be tested and evaluated at a nuclear waste site. 34 figures, 15 tables.

  6. Task-based weights for photon counting spectral x-ray imaging

    SciTech Connect (OSTI)

    Bornefalk, Hans

    2011-11-15

    Purpose: To develop a framework for taking the spatial frequency composition of an imaging task into account when determining optimal bin weight factors for photon counting energy sensitive x-ray systems. A second purpose of the investigation is to evaluate the possible improvement compared to using pixel based weights. Methods: The Fourier based approach of imaging performance and detectability index d' is applied to pulse height discriminating photon counting systems. The dependency of d' on the bin weight factors is made explicit, taking into account both differences in signal and noise transfer characteristics across bins and the spatial frequency dependency of interbin correlations from reabsorbed scatter. Using a simplified model of a specific silicon detector, d' values for a high and a low frequency imaging task are determined for optimal weights and compared to pixel based weights. Results: The method successfully identifies bins where a large point spread function degrades detection of high spatial frequency targets. The method is also successful in determining how to downweight highly correlated bins. Quantitative predictions for the simplified silicon detector model indicate that improvements in the detectability index when applying task-based weights instead of pixel based weights are small for high frequency targets, but could be in excess of 10% for low frequency tasks where scatter-induced correlation otherwise degrades detectability. Conclusions: The proposed method makes the spatial frequency dependency of complex correlation structures between bins and their effect on the system detective quantum efficiency easier to analyze and allows optimizing bin weights for given imaging tasks. A potential increase in detectability of double digit percents in silicon detector systems operated at typical CT energies (100 kVp) merits further evaluation on a real system. The method is noted to be of higher relevance for silicon detectors than for cadmium (zinc
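    A generic sketch of a frequency-domain detectability index for a weighted sum of energy bins, in the spirit of the approach described above: the weighted signal transfer and the weighted (bin-correlated) NPS enter d' through an integral over spatial frequency. The toy MTFs, NPS matrix, and task function are assumptions, not the paper's silicon detector model.

```python
import numpy as np

def detectability(weights, task_ft, mtf_bins, nps_bins, df):
    """d' for bin weights w: signal = sum_b w_b*MTF_b(f), noise =
    sum_bc w_b*w_c*NPS_bc(f); d'^2 = integral of (signal*task)^2/noise."""
    w = np.asarray(weights, dtype=float)
    signal = np.einsum('b,bf->f', w, mtf_bins)
    noise = np.einsum('b,c,bcf->f', w, w, nps_bins)
    return np.sqrt(np.sum((signal * task_ft) ** 2 / noise) * df)

# Toy setup: two bins with different blur and noise, low-frequency task
f = np.linspace(0.01, 5.0, 500)                   # cycles/mm
df = f[1] - f[0]
mtf = np.vstack([np.sinc(f / 5.0), np.sinc(f / 3.0)])
nps = np.zeros((2, 2, f.size))
nps[0, 0], nps[1, 1] = 1.0, 2.0                   # uncorrelated toy bins
task = np.exp(-(f / 0.5) ** 2)
print(detectability([0.7, 0.3], task, mtf, nps, df))
```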

  7. Hydrogen cluster/network in tobermorite as studied by multiple-quantum spin counting ¹H NMR

    SciTech Connect (OSTI)

    Mogami, Yuuki; Yamazaki, Satoru; Matsuno, Shinya; Matsui, Kunio; Noda, Yasuto; Takegoshi, K.

    2014-12-15

    A proton multiple-quantum (MQ) spin-counting experiment has been employed to study the arrangement of hydrogen atoms in 9 Å/11 Å natural/synthetic tobermorites. Even though all tobermorite samples give similar characterless, broad static-powder ¹H NMR spectra, their MQ spin-counting spectra are markedly different; higher quanta in 11 Å tobermorite do not grow with the MQ excitation time, while those in the 9 Å one do. A statistical analysis of the MQ results recently proposed [26] is applied to show that hydrogens align in 9 Å tobermorite one dimensionally, while in 11 Å tobermorite they exist as a cluster of 5–8 hydrogen atoms.

  8. 500-MHz x-ray counting with a Si-APD and a fast-pulse processing system

    SciTech Connect (OSTI)

    Kishimoto, Shunji; Taniguchi, Takashi; Tanaka, Manobu

    2010-06-23

    We introduce a counting system of up to 500 MHz for synchrotron x-ray high-rate measurements. A silicon avalanche photodiode detector was used in the counting system. The fast-pulse circuit of the amplifier was designed with hybrid ICs to prepare an ASIC system for a large-scale pixel array detector in the near future. The fast amplifier consists of two cascading emitter-followers using 10-GHz band transistors. A count rate of 3.25×10⁸ s⁻¹ was then achieved using the system for 8-keV x-rays. However, a baseline shift caused by the AC coupling in the amplifier prevented us from observing the maximum count of 4.49×10⁸ s⁻¹, determined by electron-bunch filling into a ring accelerator. We also report that an amplifier with a baseline restorer was tested in order to keep the baseline level at 0 V even at high input rates.

  9. Degree of polarization and source counts of faint radio sources from Stacking Polarized intensity

    SciTech Connect (OSTI)

    Stil, J. M.; George, S. J.; Keller, B. W.; Taylor, A. R.

    2014-06-01

    We present stacking polarized intensity as a means to study the polarization of sources that are too faint to be detected individually in surveys of polarized radio sources. Stacking offers not only high sensitivity to the median signal of a class of radio sources, but also avoids a detection threshold in polarized intensity, and therefore an arbitrary exclusion of sources with a low percentage of polarization. Correction for polarization bias is done through a Monte Carlo analysis and tested on a simulated survey. We show that the nonlinear relation between the real polarized signal and the detected signal requires knowledge of the shape of the distribution of fractional polarization, which we constrain using the ratio of the upper quartile to the lower quartile of the distribution of stacked polarized intensities. Stacking polarized intensity for NRAO VLA Sky Survey (NVSS) sources down to the detection limit in Stokes I, we find a gradual increase in median fractional polarization that is consistent with a trend that was noticed before for bright NVSS sources, but is much more gradual than found by previous deep surveys of radio polarization. Consequently, the polarized radio source counts derived from our stacking experiment predict fewer polarized radio sources for future surveys with the Square Kilometre Array and its pathfinders.
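    A schematic of the stacking step described above, assuming a catalog of total-intensity and polarized-intensity measurements: sources are binned by Stokes I, the median polarized signal is stacked in each bin, and the upper-to-lower quartile ratio summarizes the spread used to constrain the fractional-polarization distribution. The synthetic catalog and the simple median-ratio estimate of fractional polarization are illustrative assumptions; the paper's Monte Carlo polarization-bias correction is not reproduced here.

```python
import numpy as np

def stack_polarization(stokes_i, pol_p, n_bins=10):
    """Median-stack polarized intensity in bins of total intensity and
    return (bin floor, median P / median I, quartile ratio) per bin."""
    edges = np.quantile(stokes_i, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.digitize(stokes_i, edges) - 1, 0, n_bins - 1)
    rows = []
    for b in range(n_bins):
        p = pol_p[idx == b]
        q1, med, q3 = np.percentile(p, [25, 50, 75])
        rows.append((edges[b], med / np.median(stokes_i[idx == b]), q3 / q1))
    return rows

# Synthetic catalog: lognormal fluxes, weak fractional polarization, noise
rng = np.random.default_rng(4)
I = rng.lognormal(1.0, 1.0, 50_000)                    # total intensity (mJy)
p_frac = rng.lognormal(np.log(0.02), 0.8, I.size)      # fractional polarization
P = np.hypot(p_frac * I, rng.rayleigh(0.3, I.size))    # noisy polarized intensity
for row in stack_polarization(I, P):
    print(np.round(row, 3))
```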

  10. Implementation and Initial Testing of Advanced Processing and Analysis Algorithms for Correlated Neutron Counting

    SciTech Connect (OSTI)

    Santi, Peter Angelo; Cutler, Theresa Elizabeth; Favalli, Andrea; Koehler, Katrina Elizabeth; Henzl, Vladimir; Henzlova, Daniela; Parker, Robert Francis; Croft, Stephen

    2015-12-01

    In order to improve the accuracy and capabilities of neutron multiplicity counting, additional quantifiable information is needed in order to address the assumptions that are present in the point model. Extracting and utilizing higher order moments (Quads and Pents) from the neutron pulse train represents the most direct way of extracting additional information from the measurement data to allow for an improved determination of the physical properties of the item of interest. The extraction of higher order moments from a neutron pulse train required the development of advanced dead time correction algorithms which could correct for dead time effects in all of the measurement moments in a self-consistent manner. In addition, advanced analysis algorithms have been developed to address specific assumptions that are made within the current analysis model, namely that all neutrons are created at a single point within the item of interest, and that all neutrons that are produced within an item are created with the same energy distribution. This report will discuss the current status of implementation and initial testing of the advanced dead time correction and analysis algorithms that have been developed in an attempt to utilize higher order moments to improve the capabilities of correlated neutron measurement techniques.

  11. On the single-photon-counting (SPC) modes of imaging using an XFEL source

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Wang, Zhehui

    2015-12-14

    In this study, the requirements to achieve high detection efficiency (above 50%) and gigahertz (GHz) frame rate for the proposed 42-keV X-ray free-electron laser (XFEL) at Los Alamos are summarized. Direct detection scenarios using C (diamond), Si, Ge and GaAs semiconductor sensors are analyzed. Single-photon counting (SPC) mode and weak SPC mode using Si can potentially meet the efficiency and frame rate requirements and be useful to both photoelectric absorption and Compton physics as the photon energy increases. Multilayer three-dimensional (3D) detector architecture, as a possible means to realize SPC modes, is compared with the widely used two-dimensional (2D) hybrid planar electrode structure and 3D deeply entrenched electrode architecture. Demonstration of thin film cameras less than 100-μm thick with onboard thin ASICs could be an initial step to realize multilayer 3D detectors and SPC modes for XFELs.

  12. Risk communication with Fukushima residents affected by the Fukushima Daiichi accident at whole-body counting

    SciTech Connect (OSTI)

    Gunji, I.; Furuno, A.; Yonezawa, R.; Sugiyama, K.

    2013-07-01

    After the Tokyo Electric Power Company (TEPCO) Fukushima Daiichi nuclear power plant accident, the Tokai Research and Development Center of the Japan Atomic Energy Agency (JAEA) has had direct dialogue as risk communication with Fukushima residents who underwent whole-body counting examination (WBC). The purpose of the risk communication was to exchange information and opinions about radiation in order to mitigate Fukushima residents' anxiety and stress. Two kinds of opinion surveys were performed: one survey evaluated residents' views of the nuclear accident itself and the second survey evaluated the management of WBC examination as well as the quality of JAEA's communication skills on risks. Most Fukushima residents appear to have reduced their anxiety level after the direct dialogue. The results of the surveys show that Fukushima residents have the deepest anxiety and concern about their long-term health issues and that they harbor anger toward the government and TEPCO. On the other hand, many WBC patients and patients' relatives have expressed gratitude for help in reducing their feelings of anxiety.

  13. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    SciTech Connect (OSTI)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L.; Arendt, R.; Barmby, P.; Barro, G.; Faber, S.; Guhathakurta, P.; Bouwens, R.; Cattaneo, A.; Croton, D.; Dave, R.; Dunlop, J. S.; Egami, E.; Finlator, K.; Grogin, N. A.; and others

    2013-05-20

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 µm. Because of its uniform depth of coverage in so many widely-separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly-available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 µm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  14. Recent Developments In Fast Neutron Detection And Multiplicity Counting With Verification With Liquid Scintillator

    SciTech Connect (OSTI)

    Nakae, L; Chapline, G; Glenn, A; Kerr, P; Kim, K; Ouedraogo, S; Prasad, M; Sheets, S; Snyderman, N; Verbeke, J; Wurtz, R

    2011-09-30

    For many years at LLNL, we have been developing time-correlated neutron detection techniques and algorithms for applications such as Arms Control, Threat Detection and Nuclear Material Assay. Many of our techniques have been developed specifically for the relatively low efficiency (a few percent) attainable by detector systems limited to man-portability. Historically, we used thermal neutron detectors (mainly ³He), taking advantage of the high thermal neutron interaction cross-sections. More recently, we have been investigating the use of fast neutron detection with liquid scintillators, inorganic crystals, and in the near future, pulse-shape discriminating plastics which respond over 1000 times faster (nanoseconds versus tens of microseconds) than thermal neutron detectors. Fast neutron detection offers considerable advantages, since the inherent nanosecond production time-scales of spontaneous fission and neutron-induced fission are preserved and measured instead of being lost by thermalization required for thermal neutron detectors. We are now applying fast neutron technology to the safeguards regime in the form of fast portable digital electronics as well as faster and less hazardous scintillator formulations. Faster detector response times and sensitivity to neutron momentum show promise for measuring, differentiating, and assaying samples that have modest to very high count rates, as well as mixed fission sources like Cm and Pu. We report on measured results with our existing liquid scintillator array, and progress on the design of a nuclear material assay system that incorporates fast neutron detection, including the surprising result that fast liquid scintillator detectors become competitive and even surpass the precision of ³He-based counters measuring correlated pairs in modest (kg) samples of plutonium.

  15. Market Assessment of Refinery Outages Planned for October 2010...

    Gasoline and Diesel Fuel Update (EIA)

    2011 November 2010 Energy Information Administration Office of Petroleum, Gas, and Biofuels Analysis U.S. Department of Energy Washington, DC 20585 This report was prepared by...

  16. Market Assessment of Refinery Outages Planned for October 2010...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    average values for 2002-2009 excluding months in 2005, 2006, and 2008 affected by hurricanes & refinery closures. Similarly, typical historical values are average planned...

  17. May 2016 Planned Outages Archive | The Ames Laboratory

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    May 2016 National Idling Reduction Network News May 2016 National Idling Reduction Network News July 13, 2016 - 10:07am Addthis The National Idling Reduction Network brings together trucking and transit companies; railroads; ports; equipment manufacturers; Federal, state, and local government agencies (including regulators); nonprofit organizations; and national research laboratories to identify consistent, workable solutions to heavy-vehicle idling for the entire United States. Below is the May

  18. Market Assessment of Refinery Outages Planned for March 2011...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    fluctuating between $70 and $85 per barrel, but by the beginning of 2011, Brent crude oil was at $95 per barrel. Recent instability in the Middle East and North Africa added...

  19. Microsoft Word - 112706 Final Outage Letter PUBLIC.doc

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    CRITICAL ENERGY INFRASTRUCTURE INFORMATION REMOVED FOR PRIVILEGED TREATMENT November 27, 2006 Lawrence Mansueti Office of Electricity Delivery and Energy Reliability U.S. Department of Energy Rm. 8H-033 1000 Independence Avenue Washington, D.C. 20585 Re: Potomac River Generating Station Department of Energy Case No. EO-05-01 Dear Mr. Mansueti: Potomac Electric Power Company ("Pepco"), on behalf of itself and PJM Interconnection, L.L.C. ("PJM"), is providing you with

  20. Power Outages Update: Post-Tropical Cyclone Sandy

    Office of Energy Efficiency and Renewable Energy (EERE)

    Hurricane Sandy has landed and the Energy Department is working closely to support state and local officials who are responsible for working with utilities.

  1. Survey of Tools for Risk Assessment of Cascading Outages

    SciTech Connect (OSTI)

    Papic, Milorad; Bell, Keith; Chen, Yousu; Dobson, Ian; Fonte, Louis; Haq, Enamul; Hines, Paul; Kirschen, Daniel; Luo, Xiaochuan; Miller, Stephen; Samaan, Nader A.; Vaiman, Marianna; Varghese, Matthew; Zhang, Pei

    2011-10-17

    Cascading failure can cause large blackouts, and a variety of methods are emerging to study this challenging topic. In parts 1 and 2 of this paper, the IEEE task force on cascading failure seeks to consolidate and review the progress of the field towards methods and tools of assessing the risk of cascading failure. Part 2 summarizes and discusses the state of the art in the available cascading failure modeling tools. The discussion integrates industry and research perspectives from a variety of institutions. Strengths, weaknesses, and gaps in current approaches are indicated.

  2. Refinery Outages: Description and Potential Impact on Petroleum Product Prices

    Reports and Publications (EIA)

    2007-01-01

    This report responds to a July 13, 2006 request from Chairman Jeff Bingaman of the Senate Committee on Energy and Natural Resources that the Energy Information Administration conduct a study of the impact that refinery shutdowns have had on the price of oil and gasoline.

  3. HPSS Outage Tue Mar 19 - Fri Mar 22

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    of the HPSS client software tools hsi and htar. Note that if you use a "regular" ftp client, your current client will work with the new version of the HPSS server software....

  4. Market Assessment of Refinery Outages Planned for March 2011...

    Gasoline and Diesel Fuel Update (EIA)

    Consumption: 20.68, 19.50, 18.77, 19.13, 19.28 (0.8%). Note: Gasoline consumption includes ethanol. Source: 2007-2010, EIA Petroleum Supply Monthly; 2011, February 2011 Short Term...

  5. Separation and counting of single molecules through nanofluidics, programmable electrophoresis, and nanoelectrode-gated tunneling and dielectric detection

    DOE Patents [OSTI]

    Lee, James W.; Thundat, Thomas G.

    2006-04-25

    An apparatus for carrying out the separation, detection, and/or counting of single molecules at nanometer scale. Molecular separation is achieved by driving single molecules through a microfluidic or nanofluidic medium using programmable and coordinated electric fields. In various embodiments, the fluidic medium is a strip of hydrophilic material on nonconductive hydrophobic surface, a trough produced by parallel strips of hydrophobic nonconductive material on a hydrophilic base, or a covered passageway produced by parallel strips of hydrophobic nonconductive material on a hydrophilic base together with a nonconductive cover on the parallel strips of hydrophobic nonconductive material. The molecules are detected and counted using nanoelectrode-gated electron tunneling methods, dielectric monitoring, and other methods.

  6. Impact of sensitivity and throughput on optimum selection of a low-background alpha/beta gross counting system

    SciTech Connect (OSTI)

    Seymour, R.; Sergent, F.; Knight, K.; Kyker, B.

    1992-12-31

    Selection of the appropriate low-background counting system is determined by the laboratory's measurement requirements including the radionuclide activities being measured, required sensitivity, sample volume, sample throughput, operator skill, automation, reporting requirements, budget, reliability, service, and upgrade capability. These requirements are ranked differently by each user. Nevertheless, any selection requires that the sensitivity and sample throughput be evaluated first because these parameters are instrument-specific, cannot be changed after the equipment is purchased and are easily quantified beforehand. Many of the other criteria are also related to sensitivity and affect the choice of instrument. Mathematical expressions, useful in evaluating sensitivity and throughput, are reviewed, extended, and applied to selecting a low-background alpha/beta counting system.
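    One standard expression of the sensitivity/throughput trade-off discussed above is a Currie-style minimum detectable activity; the sketch below is a textbook formula offered only as an illustration, with invented background, efficiency, and count-time values rather than anything from this paper.

```python
import numpy as np

def currie_mda_bq_per_g(bkg_cps, count_time_s, efficiency, sample_mass_g=1.0):
    """Currie-style minimum detectable activity (roughly 95% confidence)
    for a gross alpha/beta counter."""
    b = bkg_cps * count_time_s                      # expected background counts
    ld = 2.71 + 4.65 * np.sqrt(b)                   # detection limit in counts
    return ld / (efficiency * count_time_s * sample_mass_g)

# Longer counts lower the MDA but cut sample throughput:
for t in (600, 3600, 14400):
    print(t, "s ->", round(currie_mda_bq_per_g(0.01, t, 0.30), 4), "Bq/g")
```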

  7. SU-E-I-77: A Noise Reduction Technique for Energy-Resolved Photon-Counting Detectors

    SciTech Connect (OSTI)

    Lam Ng, A; Ding, H; Cho, H; Molloi, S

    2014-06-01

    Purpose: Finding the optimal energy threshold setting for an energy-resolved photon-counting detector has an important impact on the maximization of the contrast-to-noise ratio (CNR). We introduce a noise reduction method to enhance CNR by reducing the noise in each energy bin without altering the average gray levels in the projection and image domains. Methods: We simulated a four-bin energy-resolved photon-counting detector based on Si with a 10 mm depth of interaction. The TASMIP algorithm was used to simulate a spectrum of 65 kVp with a 2.7 mm Al filter. A 13 mm PMMA phantom with hydroxyapatite and iodine at different concentrations (100, 200 and 300 mg/ml for HA, and 2, 4, and 8 mg/ml for iodine) was used. Projection-based and image-based energy weighting methods were used to generate weighted images. A reference low-noise image was used for noise reduction purposes. A Gaussian-like weighting function which computes the similarity between pixels of interest was calculated from the reference image and implemented on a pixel-by-pixel basis for the noisy images. Results: CNR improvement compared to different methods (Charge-Integrated, Photon-Counting and Energy-Weighting) and after noise reduction was highly task-dependent. The CNR improvements with respect to the Charge-Integrated CNR for hydroxyapatite and iodine were 1.8 and 1.5, respectively. In each of the energy bins, the noise was reduced by approximately a factor of two without altering their respective average gray levels. Conclusion: The proposed noise reduction technique for energy-resolved photon-counting detectors can significantly reduce image noise. This technique can be used as a complement to the current energy-weighting methods in CNR optimization.
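    The reference-guided Gaussian weighting idea described above can be sketched as a joint-bilateral-style filter: similarity weights computed from a low-noise reference image are used to average the noisy bin image. This is a generic sketch of the concept, not the authors' algorithm; the window radius and similarity width are arbitrary assumptions.

```python
import numpy as np

def guided_gaussian_filter(noisy, reference, sigma_r=0.1, radius=2):
    """Average each pixel of `noisy` over a local window, weighting
    neighbors by a Gaussian of their similarity in the `reference`
    image (which carries the low-noise structure)."""
    out = np.zeros_like(noisy, dtype=float)
    pad_n = np.pad(noisy, radius, mode='reflect')
    pad_r = np.pad(reference, radius, mode='reflect')
    h, w = noisy.shape
    for i in range(h):
        for j in range(w):
            win_n = pad_n[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            win_r = pad_r[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            wgt = np.exp(-(win_r - reference[i, j]) ** 2 / (2 * sigma_r ** 2))
            out[i, j] = np.sum(wgt * win_n) / np.sum(wgt)
    return out

# Toy usage: denoise one simulated energy-bin image against a clean reference
rng = np.random.default_rng(5)
ref = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))      # low-noise reference
bin_img = ref + rng.normal(0.0, 0.1, ref.shape)        # noisy energy bin
den = guided_gaussian_filter(bin_img, ref)
print((bin_img - ref).std(), (den - ref).std())        # noise is reduced
```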

  8. MEASUREMENT OF RADIONUCLIDES USING ION CHROMATOGRAPHY AND FLOW-CELL SCINTILLATION COUNTING WITH PULSE SHAPE DISCRIMINATION

    SciTech Connect (OSTI)

    R. A. Fjeld; T.A. DeVol; J.D. Leyba

    2000-03-30

    Radiological characterization and monitoring is an important component of environmental management activities throughout the Department of Energy complex. Gamma-ray spectroscopy is the technology most often used for the detection of radionuclides. However, radionuclides which cannot easily be detected by gamma-ray spectroscopy, such as pure beta emitters and transuranics, pose special problems because their quantification generally requires labor intensive radiochemical separations procedures that are time consuming and impractical for field applications. This project focused on a technology for measuring transuranics and pure beta emitters relatively quickly and has the potential of being field deployable. The technology combines ion exchange liquid chromatography and on-line alpha/beta pulse shape discriminating scintillation counting to produce simultaneous alpha and beta chromatograms. The basic instrumentation upon which the project was based was purchased in the early 1990's. In its original commercial form, the instrumentation was capable of separating select activation/fission products in ionic forms from relatively pure aqueous samples. We subsequently developed the capability of separating and detecting actinides (thorium, uranium, neptunium, plutonium, americium, and curium) in less than 30 minutes (Reboul, 1993) and realized that the potential time savings over traditional radiochemical methods for isolating some of these radionuclides was significant. However, at that time, the technique had only been used for radionuclide concentrations that were considerably above environmental levels and for aqueous samples of relatively high chemical purity. For the technique to be useful in environmental applications, development work was needed in lowering detection limits; to be useful in applications involving non-aqueous matrices such as soils and sludges or complex aqueous matrices such as those encountered in waste samples, development work was needed in

  9. The Design, Construction, and Initial Characterization of an Ultra-Low-Background Gas-Proportional Counting System

    SciTech Connect (OSTI)

    Seifert, Allen; Aalseth, Craig E.; Day, Anthony R.; Fuller, Erin S.; Hoppe, Eric W.; Keillor, Martin E.; Mace, Emily K.; Overman, Cory T.; Warren, Glen A.

    2013-05-01

    Over the past several years, the Pacific Northwest National Laboratory (PNNL) has developed an ultra-low background proportional counter (ULBPC) technology. The resulting detector is the product of an effort to produce a low-background, physically robust gas proportional counter for applications like radon emanation measurements, groundwater tritium, and 37Ar. In order to fully take advantage of the inherent low-background properties designed into the ULBPC, a comparably low-background dedicated counting system is required. An ultra-low-background counting system (ULBCS) was recently built in the new shallow underground laboratory at PNNL. With a design depth of 30 meters water-equivalent, the shallow underground laboratory provides approximately 100x fewer fast neutrons and 6x fewer muons than a surface location. The ULBCS itself provides additional shielding in the form of an active anti-cosmic veto (via 2-in. thick plastic scintillator paddles) and passive borated poly (1 in.), lead (6 in.), and copper (~3 in.) shielding. This work will provide details on PNNL's new shallow underground laboratory, examine the motivation for the design of the counting system, and provide results from the characterization of the ULBCS, including initial detector background.

  10. 235U Determination using In-Beam Delayed Neutron Counting Technique at the NRU Reactor

    SciTech Connect (OSTI)

    Andrews, M. T.; Bentoumi, G.; Corcoran, E. C.; Dimayuga, I.; Kelly, D. G.; Li, L.; Sur, B.; Rogge, R. B.

    2015-11-17

    This paper describes a collaborative effort that saw the Royal Military College of Canada (RMC)’s delayed neutron and gamma counting apparatus transported to Canadian Nuclear Laboratories (CNL) for use in the neutron beamline at the National Research Universal (NRU) reactor. Samples containing mg quantities of fissile material were re-interrogated, and their delayed neutron emissions measured. This collaboration offers significant advantages to previous delayed neutron research at both CNL and RMC. This paper details the determination of 235U content in enriched uranium via the assay of in-beam delayed neutron magnitudes and temporal behavior. 235U mass was determined with an average absolute error of ± 2.7 %. This error is lower than that obtained at RMCC for the assay of 235U content in aqueous solutions (3.6 %) using delayed neutron counting. Delayed neutron counting has been demonstrated to be a rapid, accurate, and precise method for special nuclear material detection and identification.

  11. High quantum efficiency and low dark count rate in multi-layer superconducting nanowire single-photon detectors

    SciTech Connect (OSTI)

    Jafari Salim, A.; Eftekharian, A.; Hamed Majedi, A.

    2014-02-07

    In this paper, we theoretically show that a multi-layer superconducting nanowire single-photon detector (SNSPD) is capable of approaching characteristics of an ideal SNSPD in terms of the quantum efficiency, dark count, and band-width. A multi-layer structure improves the performance in two ways. First, the potential barrier for thermally activated vortex crossing, which is the major source of dark counts and the reduction of the critical current in SNSPDs is elevated. In a multi-layer SNSPD, a vortex is made of 2D-pancake vortices that form a stack. It will be shown that the stack of pancake vortices effectively experiences a larger potential barrier compared to a vortex in a single-layer SNSPD. This leads to an increase in the experimental critical current as well as significant decrease in the dark count rate. In consequence, an increase in the quantum efficiency for photons of the same energy or an increase in the sensitivity to photons of lower energy is achieved. Second, a multi-layer structure improves the efficiency of single-photon absorption by increasing the effective optical thickness without compromising the single-photon sensitivity.

  12. Comparison of MCNP6 and experimental results for neutron counts, Rossi-α, and Feynman-α distributions

    SciTech Connect (OSTI)

    Talamo, A.; Gohar, Y.; Sadovich, S.; Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.

    2013-07-01

    MCNP6, the general-purpose Monte Carlo N-Particle code, has the capability to perform time-dependent calculations by tracking the time interval between successive events of the neutron random walk. In fixed-source calculations for a subcritical assembly, the zero time value is assigned at the moment the neutron is emitted by the external neutron source. The PTRAC and F8 cards of MCNP allow tallying the time when a neutron is captured by ³He(n, p) reactions in the neutron detector. From this information, it is possible to build three different time distributions: neutron counts, Rossi-α, and Feynman-α. The neutron counts time distribution represents the number of neutrons captured as a function of time. The Rossi-α distribution represents the number of neutron pairs captured as a function of the time interval between two capture events. The Feynman-α distribution represents the variance-to-mean ratio, minus one, of the neutron counts array as a function of a fixed time interval. The MCNP6 results for these three time distributions have been compared with the experimental data of the YALINA Thermal facility and have been found to be in quite good agreement. (authors)
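    The three time distributions defined above can be formed directly from a list of capture time stamps; the sketch below does this for the Rossi-α and Feynman-α cases, using a synthetic uncorrelated pulse train (for which the Feynman-Y value should be near zero). The gate widths, window, and toy event rate are arbitrary assumptions, not YALINA settings.

```python
import numpy as np

def feynman_alpha(capture_times, gate_widths):
    """Feynman-Y (variance-to-mean minus one) of the neutron counts
    array for each gate width."""
    t = np.sort(np.asarray(capture_times))
    y = []
    for tau in gate_widths:
        counts, _ = np.histogram(t, bins=np.arange(t[0], t[-1], tau))
        y.append(counts.var() / counts.mean() - 1.0)
    return np.array(y)

def rossi_alpha(capture_times, window, bin_width):
    """Histogram of time intervals between capture pairs closer than
    `window` (the Rossi-alpha distribution)."""
    t = np.sort(np.asarray(capture_times))
    dts = []
    for i, ti in enumerate(t):
        j = np.searchsorted(t, ti + window)
        dts.extend(t[i + 1:j] - ti)
    return np.histogram(dts, bins=np.arange(0.0, window, bin_width))

rng = np.random.default_rng(6)
times = np.cumsum(rng.exponential(1e-4, 100_000))      # Poisson train, 10 kcps
print(feynman_alpha(times, gate_widths=[1e-3, 1e-2]))  # ~[0, 0]
hist, edges = rossi_alpha(times[:20_000], window=1e-3, bin_width=5e-5)
print(hist[:5])                                        # ~flat for a Poisson train
```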

  13. Characteristic performance evaluation of a photon counting Si strip detector for low dose spectral breast CT imaging

    SciTech Connect (OSTI)

    Cho, Hyo-Min; Ding, Huanjun; Molloi, Sabee; Barber, William C.; Iwanczyk, Jan S.

    2014-09-15

    Purpose: The possible clinical applications which can be performed using a newly developed detector depend on the detector's characteristic performance in a number of metrics including the dynamic range, resolution, uniformity, and stability. The authors have evaluated a prototype energy resolved fast photon counting x-ray detector based on a silicon (Si) strip sensor used in an edge-on geometry with an application specific integrated circuit to record the number of x-rays and their energies at high flux and fast frame rates. The investigated detector was integrated with a dedicated breast spectral computed tomography (CT) system to make use of the detector's high spatial and energy resolution and low noise performance under conditions suitable for clinical breast imaging. The aim of this article is to investigate the intrinsic characteristics of the detector, in terms of maximum output count rate, spatial and energy resolution, and noise performance of the imaging system. Methods: The maximum output count rate was obtained with a 50 W x-ray tube with a maximum continuous output of 50 kVp at 1.0 mA. A ¹⁰⁹Cd source, with a characteristic x-ray peak at 22 keV from Ag, was used to measure the energy resolution of the detector. The axial plane modulation transfer function (MTF) was measured using a 67 µm diameter tungsten wire. The two-dimensional (2D) noise power spectrum (NPS) was measured using flat field images and noise equivalent quanta (NEQ) were calculated using the MTF and NPS results. The image quality parameters were studied as a function of various radiation doses and reconstruction filters. The one-dimensional (1D) NPS was used to investigate the effect of electronic noise elimination by varying the minimum energy threshold. Results: A maximum output count rate of 100 million counts per second per square millimeter (cps/mm²) has been obtained (1 million cps per 100 × 100 µm pixel). The electrical noise floor was less than 4 keV. The energy

  14. Low-noise low-jitter 32-pixels CMOS single-photon avalanche diodes array for single-photon counting from 300 nm to 900 nm

    SciTech Connect (OSTI)

    Scarcella, Carmelo; Tosi, Alberto; Villa, Federica; Tisa, Simone; Zappa, Franco

    2013-12-15

    We developed a single-photon counting multichannel detection system, based on a monolithic linear array of 32 CMOS SPADs (Complementary Metal-Oxide-Semiconductor Single-Photon Avalanche Diodes). All channels achieve a timing resolution of 100 ps (full-width at half maximum) and a photon detection efficiency of 50% at 400 nm. Dark count rate is very low even at room temperature, being about 125 counts/s for 50 µm active area diameter SPADs. Detection performance and microelectronic compactness of this CMOS SPAD array make it the best candidate for ultra-compact time-resolved spectrometers with single-photon sensitivity from 300 nm to 900 nm.

  15. A comparative analysis of OTF, NPS, and DQE in energy integrating and photon counting digital x-ray detectors

    SciTech Connect (OSTI)

    Acciavatti, Raymond J.; Maidment, Andrew D. A.

    2010-12-15

    Purpose: One of the benefits of photon counting (PC) detectors over energy integrating (EI) detectors is the absence of many additive noise sources, such as electronic noise and secondary quantum noise. The purpose of this work is to demonstrate that thresholding voltage gains to detect individual x rays actually generates an unexpected source of white noise in photon counters. Methods: To distinguish the two detector types, their point spread function (PSF) is interpreted differently. The PSF of the energy integrating detector is treated as a weighting function for counting x rays, while the PSF of the photon counting detector is interpreted as a probability. Although this model ignores some subtleties of real imaging systems, such as scatter and the energy-dependent amplification of secondary quanta in indirect-converting detectors, it is useful for demonstrating fundamental differences between the two detector types. From first principles, the optical transfer function (OTF) is calculated as the continuous Fourier transform of the PSF, the noise power spectrum (NPS) is determined by the discrete space Fourier transform (DSFT) of the autocovariance of signal intensity, and the detective quantum efficiency (DQE) is found from combined knowledge of the OTF and NPS. To illustrate the calculation of the transfer functions, the PSF is modeled as the convolution of a Gaussian with the product of rect functions. The Gaussian reflects the blurring of the x-ray converter, while the rect functions model the sampling of the detector. Results: The transfer functions are first calculated assuming outside noise sources such as electronic noise and secondary quantum noise are negligible. It is demonstrated that while the OTF is the same for two detector types possessing an equivalent PSF, a frequency-independent (i.e., "white") difference in their NPS exists such that NPS_PC ≥ NPS_EI and hence DQE_PC ≤ DQE_EI. The necessary and sufficient condition for
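    A small 1-D numerical illustration of the transfer-function chain laid out above: the MTF is the magnitude of the Fourier transform of the PSF, the NPS is estimated from the DSFT of flat-field fluctuations, and the DQE combines the two. The Gaussian PSF, pixel pitch, fluence, and the deterministic-blur (energy-integrating-like, noiseless-gain) detector model are toy assumptions, so the DQE comes out near unity by construction.

```python
import numpy as np

rng = np.random.default_rng(7)
n_pix, pitch, q_bar = 256, 0.1, 1000.0        # pixels, mm, mean quanta per pixel

x = np.arange(-16, 17) * pitch                # sample the PSF on a 1-D grid
psf = np.exp(-x**2 / (2 * 0.08**2))
psf /= psf.sum()

freqs = np.fft.rfftfreq(n_pix, d=pitch)       # cycles/mm
mtf = np.abs(np.fft.rfft(psf, n_pix))         # |OTF|, equal to 1 at f = 0

# Flat-field realizations: Poisson quanta blurred deterministically by the PSF
flats = np.stack([np.convolve(rng.poisson(q_bar, n_pix), psf, mode='same')
                  for _ in range(200)])
flats = flats - flats.mean(axis=0)            # keep only the fluctuations
nps = (np.abs(np.fft.rfft(flats, axis=1))**2).mean(axis=0) * pitch / n_pix

q_per_mm = q_bar / pitch                      # incident quanta per unit length
dqe = (q_bar * mtf)**2 / (q_per_mm * nps)
print(freqs[1:6], np.round(dqe[1:6], 2))      # DQE ~ 1 for this ideal toy
```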

  16. A new method of passive counting of nuclear missile warheads -a white paper for the Defense Threat Reduction Agency

    SciTech Connect (OSTI)

    Morris, Christopher; Durham, J. Matthew; Guardincerri, Elena; Bacon, Jeffrey Darnell; Wang, Zhehui; Fellows, Shelby; Poulson, Daniel Cris; Plaud-Ramos, Kenie Omar; Daughton, Tess Marie; Johnson, Olivia Ruth

    2015-07-31

    Cosmic ray muon imaging has been studied for the past several years as a possible technique for nuclear warhead inspection and verification as part of the New Strategic Arms Reduction Treaty between the United States and the Russian Federation. The Los Alamos team has studied two different muon imaging methods for this application, using detectors on two sides and one side of the object of interest. In this report we present results obtained on single sided imaging of configurations aimed at demonstrating the potential of this technique for counting nuclear warheads in place with detectors above the closed hatch of a ballistic missile submarine.

  17. Characterizing energy dependence and count rate performance of a dual scintillator fiber-optic detector for computed tomography

    SciTech Connect (OSTI)

    Hoerner, Matthew R. Stepusin, Elliott J.; Hyer, Daniel E.; Hintenlang, David E.

    2015-03-15

    Purpose: Kilovoltage (kV) x-rays pose a significant challenge for radiation dosimetry. In the kV energy range, even small differences in material composition can result in significant variations in the absorbed energy between soft tissue and the detector. In addition, the use of electronic systems in light detection has demonstrated measurement losses at high photon fluence rates incident on the detector. This study investigated the feasibility of using a novel dual scintillator detector and whether its response to changes in beam energy from scatter and hardening is readily quantified. The detector incorporates a tissue-equivalent plastic scintillator and a gadolinium oxysulfide scintillator, which has a higher sensitivity to scatter x-rays. Methods: The detector was constructed by coupling two scintillators: (1) a small cylindrical plastic scintillator, 500 μm in diameter and 2 mm in length, and (2) a 100 μm sheet of gadolinium oxysulfide, 500 μm in diameter, each to a 2 m long optical fiber, which acts as a light guide to transmit scintillation photons from the sensitive element to a photomultiplier tube. Count rate linearity data were obtained from a wide range of exposure rates delivered from a radiological x-ray tube by adjusting the tube current. The data were fitted to a nonparalyzable dead time model to characterize the time response. The true counting rate was related to the reference free air dose rate measured with a 0.6 cm{sup 3} Radcal{sup ®} thimble chamber as described in AAPM Report No. 111. Secondary electron and photon spectra were evaluated using Monte Carlo techniques to analyze ionization quenching and photon energy-absorption characteristics from free-in-air and in-phantom measurements. The depth/energy dependence of the detector was characterized using a computed tomography dose index QA phantom consisting of nested adult head and body segments. The phantom provided up to 32 cm of acrylic with a compatible 0.6 cm{sup 3} calibrated
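
    The nonparalyzable dead-time model mentioned above relates the measured rate m to the true rate n through m = n/(1 + nτ). The short Python sketch below illustrates that relation and its inversion; it is not the authors' analysis code, and the dead time τ used is an assumed example value.

```python
# Minimal sketch of the nonparalyzable dead-time model:
#   measured m = n / (1 + n * tau),  inverse n = m / (1 - m * tau).
def measured_rate(n_true, tau):
    """Measured count rate (cps) of a nonparalyzable detector with dead time tau (s)."""
    return n_true / (1.0 + n_true * tau)

def corrected_rate(m_meas, tau):
    """Invert the nonparalyzable model to recover the true rate from a measured one."""
    return m_meas / (1.0 - m_meas * tau)

tau = 2.0e-6  # assumed dead time of 2 microseconds (illustrative, not a measured value)
for n in (1e4, 1e5, 5e5):
    m = measured_rate(n, tau)
    print(f"true {n:.0f} cps -> measured {m:.0f} cps -> recovered {corrected_rate(m, tau):.0f} cps")
```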

  18. Faint submillimeter galaxies revealed by multifield deep ALMA observations: number counts, spatial clustering, and a dark submillimeter line emitter

    SciTech Connect (OSTI)

    Ono, Yoshiaki; Ouchi, Masami; Momose, Rieko; Kurono, Yasutaka

    2014-11-01

    We present the statistics of faint submillimeter/millimeter galaxies (SMGs) and serendipitous detections of a submillimeter/millimeter line emitter (SLE) with no multi-wavelength continuum counterpart revealed by the deep ALMA observations. We identify faint SMGs with flux densities of 0.1-1.0 mJy in the deep Band-6 and Band-7 maps of 10 independent fields that reduce cosmic variance effects. The differential number counts at 1.2 mm are found to increase with decreasing flux density down to 0.1 mJy. Our number counts indicate that the faint (0.1-1.0 mJy, or SFR{sub IR} {approx} 30-300 M{sub Sun} yr{sup -1}) SMGs contribute nearly half of the extragalactic background light (EBL), while the remaining half of the EBL is mostly contributed by very faint sources with flux densities of <0.1 mJy (SFR{sub IR} {approx}< 30 M{sub Sun} yr{sup -1}). We conduct counts-in-cells analysis with multifield ALMA data for the faint SMGs, and obtain a coarse estimate of galaxy bias, b{sub g} < 4. The galaxy bias suggests that the dark halo masses of the faint SMGs are {approx}< 7 × 10{sup 12} M{sub Sun}, which is smaller than those of bright (>1 mJy) SMGs, but consistent with abundant high-z star-forming populations, such as sBzKs, LBGs, and LAEs. Finally, we report the serendipitous detection of SLE-1, which has no continuum counterparts in our 1.2 mm-band or multi-wavelength images, including ultra deep HST/WFC3 and Spitzer data. The SLE has a significant line at 249.9 GHz with a signal-to-noise ratio of 7.1. If the SLE is not a spurious source made by the unknown systematic noise of ALMA, the strong upper limits of our multi-wavelength data suggest that the SLE would be a faint galaxy at z {approx} 6.
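
    As an editorial aside, differential number counts of the kind reported above are binned source tallies normalized by bin width and survey area. The Python sketch below illustrates the bookkeeping with made-up flux densities and an assumed survey area; it is not the survey pipeline.

```python
# Hedged sketch (not the survey pipeline) of differential number counts dN/dS
# from a list of source flux densities and a survey area.  All values are
# illustrative placeholders.
import numpy as np

def differential_counts(fluxes_mjy, area_arcmin2, bin_edges_mjy):
    """Return dN/dS in sources per mJy per arcmin^2 for the given flux bins."""
    hist, edges = np.histogram(fluxes_mjy, bins=bin_edges_mjy)
    widths = np.diff(edges)
    return hist / (widths * area_arcmin2)

fluxes = np.array([0.12, 0.25, 0.3, 0.45, 0.7, 0.95])   # mJy (placeholder sources)
edges = np.array([0.1, 0.3, 0.6, 1.0])                   # mJy bin edges
print(differential_counts(fluxes, area_arcmin2=5.0, bin_edges_mjy=edges))
```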

  19. A cascaded model of spectral distortions due to spectral response effects and pulse pileup effects in a photon-counting x-ray detector for CT

    SciTech Connect (OSTI)

    Cammin, Jochen E-mail: ktaguchi@jhmi.edu; Taguchi, Katsuyuki E-mail: ktaguchi@jhmi.edu; Xu, Jennifer; Barber, William C.; Iwanczyk, Jan S.; Hartsough, Neal E.

    2014-04-15

    Purpose: Energy discriminating, photon-counting detectors (PCDs) are an emerging technology for computed tomography (CT) with various potential benefits for clinical CT. The photon energies measured by PCDs can be distorted due to the interactions of a photon with the detector and the interaction of multiple coincident photons. These effects result in distorted recorded x-ray spectra, which may lead to artifacts in reconstructed CT images and inaccuracies in tissue identification. Model-based compensation techniques have the potential to account for the distortion effects. This approach requires only a small number of parameters and is applicable to a wide range of spectra and count rates, but it needs an accurate model of the spectral distortions occurring in PCDs. The purpose of this study was to develop a model of those spectral distortions and to evaluate the model using a PCD (model DXMCT-1; DxRay, Inc., Northridge, CA) and various x-ray spectra in a wide range of count rates. Methods: The authors hypothesize that the complex phenomena of spectral distortions can be modeled by: (1) separating them into count-rate independent factors that we call the spectral response effects (SRE), and count-rate dependent factors that we call the pulse pileup effects (PPE), (2) developing separate models for SRE and PPE, and (3) cascading the SRE and PPE models into a combined SRE+PPE model that describes PCD distortions at both low and high count rates. The SRE model describes the probability distribution of the recorded spectrum, with a photo peak and a continuum tail, given the incident photon energy. Model parameters were obtained from calibration measurements with three radioisotopes and then interpolated linearly for other energies. The PPE model used was developed in the authors' previous work [K. Taguchi et al., "Modeling the performance of a photon counting x-ray detector for CT: Energy response and pulse pileup effects," Med. Phys. 38(2), 1089-1102 (2011)]. The
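
    The cascaded SRE+PPE idea described above can be pictured as a count-rate-independent response matrix acting on the incident spectrum, with a pileup model applied afterwards. The Python sketch below shows only that first, matrix step with a toy photopeak-plus-tail response; it is not the authors' model, and all parameters are illustrative assumptions.

```python
# Illustrative sketch (not the authors' model) of a count-rate-independent
# spectral response step applied as a matrix to an incident spectrum.  The
# response (Gaussian photopeak plus a flat low-energy tail) uses made-up
# parameters purely to show the structure of the computation.
import numpy as np

n_bins = 120                 # 1 keV bins, 0-120 keV (assumed)
energies = np.arange(n_bins)

def response_matrix(peak_sigma=2.0, tail_fraction=0.2):
    """R[j, i]: probability that a photon of energy i is recorded in bin j."""
    R = np.zeros((n_bins, n_bins))
    for i in range(1, n_bins):
        peak = np.exp(-0.5 * ((energies - i) / peak_sigma) ** 2)
        tail = np.where(energies < i, 1.0, 0.0)        # flat continuum below E
        kernel = (1.0 - tail_fraction) * peak / peak.sum()
        kernel += tail_fraction * tail / tail.sum()
        R[:, i] = kernel
    return R

incident = np.exp(-0.5 * ((energies - 60) / 10.0) ** 2)   # toy incident spectrum
recorded = response_matrix() @ incident
print(recorded.argmax())    # the distorted spectrum still peaks near 60 keV
```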

  20. Free-running InGaAs single photon detector with 1 dark count per second at 10% efficiency

    SciTech Connect (OSTI)

    Korzh, B. Walenta, N.; Lunghi, T.; Gisin, N.; Zbinden, H.

    2014-02-24

    We present a free-running single photon detector for telecom wavelengths based on a negative feedback avalanche photodiode (NFAD). A dark count rate as low as 1 cps was obtained at a detection efficiency of 10%, with an afterpulse probability of 2.2% for 20 μs of deadtime. This was achieved by using an active hold-off circuit and cooling the NFAD with a free-piston Stirling cooler down to temperatures of -110 °C. We integrated two detectors into a practical, 625 MHz clocked quantum key distribution system. Stable, real-time key distribution in the presence of 30 dB channel loss was possible, yielding a secret key rate of 350 bps.

  1. Centroid Position as a Function of Total Counts in a Windowed CMOS Image of a Point Source

    SciTech Connect (OSTI)

    Wurtz, R E; Olivier, S; Riot, V; Hanold, B J; Figer, D F

    2010-05-27

    We obtained 960,200 22-by-22-pixel windowed images of a pinhole spot using the Teledyne H2RG CMOS detector with un-cooled SIDECAR readout. We performed an analysis to determine the precision we might expect in the position error signals to a telescope's guider system. We find that, under non-optimized operating conditions, the error in the computed centroid is strongly dependent on the total counts in the point image only below a certain threshold, approximately 50,000 photo-electrons. The LSST guider camera specification currently requires a 0.04 arcsecond error at 10 Hertz. Given the performance measured here, this specification can be delivered with a single star at 14th to 18th magnitude, depending on the passband.
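
    The centroid quantity whose count dependence is studied above is the intensity-weighted center of mass of the windowed image. The Python sketch below computes it for a synthetic 22-by-22 point-source window; it is an editorial illustration, not the instrument code, and the spot model and count level are assumptions.

```python
# Minimal sketch (not the instrument code) of a center-of-mass centroid on a
# small windowed image.  The synthetic 22x22 "point source" is an assumption
# used only for illustration.
import numpy as np

def centroid(img):
    """Intensity-weighted centroid (x, y) of a 2D array."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (xs * img).sum() / total, (ys * img).sum() / total

rng = np.random.default_rng(0)
ys, xs = np.indices((22, 22))
spot = np.exp(-0.5 * ((xs - 10.3) ** 2 + (ys - 11.7) ** 2) / 2.0 ** 2)
image = rng.poisson(5_000 * spot / spot.sum())   # ~5000 photo-electrons total
print(centroid(image))                           # close to (10.3, 11.7)
```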

  2. Evaluation of two-stage system for neutron measurement aiming at increase in count rate at Japan Atomic Energy Agency-Fusion Neutronics Source

    SciTech Connect (OSTI)

    Shinohara, K. Ochiai, K.; Sukegawa, A.; Ishii, K.; Kitajima, S.; Baba, M.; Sasao, M.

    2014-11-15

    In order to increase the count rate capability of a neutron detection system as a whole, we propose a multi-stage neutron detection system. Experiments to test the effectiveness of this concept were carried out on Fusion Neutronics Source. Comparing four configurations of alignment, it was found that the influence of an anterior stage on a posterior stage was negligible for the pulse height distribution. The two-stage system using 25 mm thickness scintillator was about 1.65 times the count rate capability of a single detector system for d-D neutrons and was about 1.8 times the count rate capability for d-T neutrons. The results suggested that the concept of a multi-stage detection system will work in practice.

  3. HERSCHEL-ATLAS GALAXY COUNTS AND HIGH-REDSHIFT LUMINOSITY FUNCTIONS: THE FORMATION OF MASSIVE EARLY-TYPE GALAXIES

    SciTech Connect (OSTI)

    Lapi, A.; Gonzalez-Nuevo, J.; Fan, L.; Bressan, A.; De Zotti, G.; Danese, L.; Negrello, M.; Dunne, L.; Maddox, S.; Eales, S.; Auld, R.; Dariush, A.; Dye, S.; Baes, M.; Fritz, J.; Bonfield, D. G.; Buttiglione, S.; Cava, A.; Clements, D. L.; Cooray, A.

    2011-11-20

    Exploiting the Herschel Astrophysical Terahertz Large Area Survey Science Demonstration Phase survey data, we have determined the luminosity functions (LFs) at rest-frame wavelengths of 100 and 250 {mu}m and at several redshifts z {approx}> 1, for bright submillimeter galaxies with star formation rates (SFRs) {approx}> 100 M{sub Sun} yr{sup -1}. We find that the evolution of the comoving LF is strong up to z {approx} 2.5, and slows down at higher redshifts. From the LFs and the information on halo masses inferred from clustering analysis, we derived an average relation between SFR and halo mass (and its scatter). We also infer that the timescale of the main episode of dust-enshrouded star formation in massive halos (M{sub H} {approx}> 3 × 10{sup 12} M{sub Sun}) amounts to {approx}7 × 10{sup 8} yr. Given the SFRs, which are in the range of 10{sup 2}-10{sup 3} M{sub Sun} yr{sup -1}, this timescale implies final stellar masses of the order of 10{sup 11}-10{sup 12} M{sub Sun}. The corresponding stellar mass function matches the observed mass function of passively evolving galaxies at z {approx}> 1. The comparison of the statistics for submillimeter and UV-selected galaxies suggests that the dust-free, UV bright phase is {approx}> 10{sup 2} times shorter than the submillimeter bright phase, implying that the dust must form soon after the onset of star formation. Using a single reference spectral energy distribution (SED; the one of the z {approx} 2.3 galaxy SMM J2135-0102), our simple physical model is able to reproduce not only the LFs at different redshifts >1 but also the counts at wavelengths ranging from 250 {mu}m to {approx}1 mm. Owing to the steepness of the counts and their relatively broad frequency range, this result suggests that the dispersion of submillimeter SEDs of z > 1 galaxies around the reference one is rather small.

  4. ELLIPTICAL WEIGHTED HOLICs FOR WEAK LENSING SHEAR MEASUREMENT. III. THE EFFECT OF RANDOM COUNT NOISE ON IMAGE MOMENTS IN WEAK LENSING ANALYSIS

    SciTech Connect (OSTI)

    Okura, Yuki; Futamase, Toshifumi E-mail: tof@astr.tohoku.ac.jp

    2013-07-01

    This is the third paper on the improvement of systematic errors in weak lensing analysis using an elliptical weight function, referred to as E-HOLICs. In previous papers, we succeeded in avoiding errors that depend on the ellipticity of the background image. In this paper, we investigate the systematic error that depends on the signal-to-noise ratio of the background image. We find that the origin of this error is the random count noise that comes from the Poisson noise of sky counts. The random count noise makes additional moments and centroid shift error, and those first-order effects are canceled in averaging, but the second-order effects are not canceled. We derive the formulae that correct this systematic error due to the random count noise in measuring the moments and ellipticity of the background image. The correction formulae obtained are expressed as combinations of complex moments of the image, and thus can correct the systematic errors caused by each object. We test their validity using a simulated image and find that the systematic error becomes less than 1% in the measured ellipticity for objects with an IMCAT significance threshold of {nu} {approx} 11.7.
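
    For context, the elliptical-weight formalism above operates on weighted quadrupole moments of the image, from which the ellipticity is formed. The Python sketch below shows a simplified circular-Gaussian-weighted version of those moments; it is not the E-HOLICs code, and the weight width and toy image are assumptions.

```python
# Hedged sketch (not the E-HOLICs code) of weighted second moments and the
# ellipticity they define, the quantities that sky-count noise perturbs.
# The Gaussian weight width and the toy image are illustrative assumptions.
import numpy as np

def weighted_ellipticity(img, xc, yc, sigma_w=3.0):
    """Ellipticity (e1, e2) from weighted quadrupole moments about (xc, yc)."""
    ys, xs = np.indices(img.shape)
    w = np.exp(-0.5 * ((xs - xc) ** 2 + (ys - yc) ** 2) / sigma_w ** 2)
    f = img * w
    norm = f.sum()
    qxx = (f * (xs - xc) ** 2).sum() / norm
    qyy = (f * (ys - yc) ** 2).sum() / norm
    qxy = (f * (xs - xc) * (ys - yc)).sum() / norm
    denom = qxx + qyy
    return (qxx - qyy) / denom, 2.0 * qxy / denom

ys, xs = np.indices((32, 32))
galaxy = np.exp(-0.5 * (((xs - 16) / 4.0) ** 2 + ((ys - 16) / 2.5) ** 2))
print(weighted_ellipticity(galaxy, 16, 16))
```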

  5. An efficient parallel sampling technique for Multivariate Poisson-Lognormal model: Analysis with two crash count datasets

    SciTech Connect (OSTI)

    Zhan, Xianyuan; Aziz, H. M. Abdul; Ukkusuri, Satish V.

    2015-11-19

    Our study investigates the Multivariate Poisson-lognormal (MVPLN) model that jointly models crash frequency and severity accounting for correlations. The ordinary univariate count models analyze crashes of different severity levels separately, ignoring the correlations among severity levels. The MVPLN model is capable of incorporating the general correlation structure and accounting for the overdispersion in the data, which leads to a superior data fit. However, the traditional estimation approach for the MVPLN model is computationally expensive, which often limits its use in practice. In this work, a parallel sampling scheme is introduced to improve the original Markov Chain Monte Carlo (MCMC) estimation approach of the MVPLN model, which significantly reduces the model estimation time. Two MVPLN models are developed using pedestrian-vehicle crash data collected in New York City from 2002 to 2006, and highway-injury data from Washington State (5-year data from 1990 to 1994). The Deviance Information Criterion (DIC) is used to evaluate model fit. The estimation results show that the MVPLN models provide a superior fit over univariate Poisson-lognormal (PLN), univariate Poisson, and Negative Binomial models. Moreover, the correlations among the latent effects of different severity levels are found to be significant in both datasets, which justifies the importance of jointly modeling crash frequency and severity accounting for correlations.
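
    To illustrate the structure being estimated above, the Python sketch below simulates multivariate Poisson-lognormal counts: Poisson counts whose log-rates share a multivariate-normal error term, which produces both overdispersion and cross-severity correlation. It is a simulation sketch, not the paper's MCMC sampler; the log-rates and covariance are assumed values.

```python
# Minimal simulation sketch (not the paper's MCMC sampler) of the multivariate
# Poisson-lognormal structure.  The log-rates and covariance are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_sites = 1_000
log_mean = np.array([1.0, 0.2, -1.0])           # assumed log-rates per severity level
cov = np.array([[0.30, 0.15, 0.10],
                [0.15, 0.25, 0.12],
                [0.10, 0.12, 0.40]])             # assumed latent covariance

eps = rng.multivariate_normal(np.zeros(3), cov, size=n_sites)
counts = rng.poisson(np.exp(log_mean + eps))     # shape (n_sites, 3)

# Overdispersion and positive cross-severity correlation emerge naturally.
print(counts.var(axis=0) / counts.mean(axis=0))  # > 1 for each severity level
print(np.corrcoef(counts, rowvar=False))
```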

  6. An efficient parallel sampling technique for Multivariate Poisson-Lognormal model: Analysis with two crash count datasets

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Zhan, Xianyuan; Aziz, H. M. Abdul; Ukkusuri, Satish V.

    2015-11-19

    Our study investigates the Multivariate Poisson-lognormal (MVPLN) model that jointly models crash frequency and severity accounting for correlations. The ordinary univariate count models analyze crashes of different severity levels separately, ignoring the correlations among severity levels. The MVPLN model is capable of incorporating the general correlation structure and accounting for the overdispersion in the data, which leads to a superior data fit. However, the traditional estimation approach for the MVPLN model is computationally expensive, which often limits its use in practice. In this work, a parallel sampling scheme is introduced to improve the original Markov Chain Monte Carlo (MCMC) estimation approach of the MVPLN model, which significantly reduces the model estimation time. Two MVPLN models are developed using pedestrian-vehicle crash data collected in New York City from 2002 to 2006, and highway-injury data from Washington State (5-year data from 1990 to 1994). The Deviance Information Criterion (DIC) is used to evaluate model fit. The estimation results show that the MVPLN models provide a superior fit over univariate Poisson-lognormal (PLN), univariate Poisson, and Negative Binomial models. Moreover, the correlations among the latent effects of different severity levels are found to be significant in both datasets, which justifies the importance of jointly modeling crash frequency and severity accounting for correlations.

  7. Low-Intrusion Techniques and Sensitive Information Management for Warhead Counting and Verification: FY2011 Annual Report

    SciTech Connect (OSTI)

    Jarman, Kenneth D.; Robinson, Sean M.; McDonald, Benjamin S.; Gilbert, Andrew J.; Misner, Alex C.; Pitts, W. Karl; White, Timothy A.; Seifert, Allen; Miller, Erin A.

    2011-09-01

    Future arms control treaties may push nuclear weapons limits to unprecedented low levels and may entail precise counting of warheads as well as distinguishing between strategic and tactical nuclear weapons. Such advances will require assessment of form and function to confidently verify the presence or absence of nuclear warheads and/or their components. Imaging with penetrating radiation can provide such an assessment and could thus play a unique role in inspection scenarios. Yet many imaging capabilities have been viewed as too intrusive from the perspective of revealing weapon design details, and the potential for the release of sensitive information poses challenges in verification settings. A widely held perception is that verification through radiography requires images of sufficient quality that an expert (e.g., a trained inspector or an image-matching algorithm) can verify the presence or absence of components of a device. The concept of information barriers (IBs) has been established to prevent access to relevant weapon-design information by inspectors (or algorithms), and has, to date, limited the usefulness of radiographic inspection. The challenge of this project is to demonstrate that radiographic information can be used behind an IB to improve the capabilities of treaty-verification weapons-inspection systems.

  8. Full counting statistics as a probe of quantum coherence in a side-coupled double quantum dot system

    SciTech Connect (OSTI)

    Xue, Hai-Bin

    2013-12-15

    We study theoretically the full counting statistics of electron transport through side-coupled double quantum dot (QD) based on an efficient particle-number-resolved master equation. It is demonstrated that the high-order cumulants of transport current are more sensitive to the quantum coherence than the average current, which can be used to probe the quantum coherence of the considered double QD system. Especially, quantum coherence plays a crucial role in determining whether the super-Poissonian noise occurs in the weak inter-dot hopping coupling regime depending on the corresponding QD-lead coupling, and the corresponding values of super-Poissonian noise can be relatively enhanced when considering the spins of conduction electrons. Moreover, this super-Poissonian noise bias range depends on the singly-occupied eigenstates of the system, which thus suggests a tunable super-Poissonian noise device. The occurrence-mechanism of super-Poissonian noise can be understood in terms of the interplay of quantum coherence and effective competition between fast-and-slow transport channels. -- Highlights: The FCS can be used to probe the quantum coherence of side-coupled double QD system. Probing quantum coherence using FCS may permit experimental tests in the near future. The current noise characteristics depend on the quantum coherence of this QD system. The super-Poissonian noise can be enhanced when considering conduction electron spin. The side-coupled double QD system suggests a tunable super-Poissonian noise device.
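
    The cumulants referred to above are read off the particle-number-resolved counting distribution P(n); a Fano factor c2/c1 greater than 1 marks super-Poissonian noise. The Python sketch below computes the first three cumulants for a toy bimodal P(n) mimicking fast and slow transport channels; it is an editorial illustration, not the paper's master-equation calculation.

```python
# Hedged sketch (not the paper's master-equation calculation) of reading
# low-order cumulants and the Fano factor off a counting distribution P(n).
# The bimodal P(n) used here is a made-up example.
import numpy as np
from scipy.stats import poisson

def cumulants(n_values, probabilities):
    """First three cumulants of a counting distribution P(n)."""
    n = np.asarray(n_values, dtype=float)
    p = np.asarray(probabilities, dtype=float)
    c1 = np.sum(n * p)               # mean count
    c2 = np.sum((n - c1) ** 2 * p)   # variance
    c3 = np.sum((n - c1) ** 3 * p)   # third central moment (= third cumulant)
    return c1, c2, c3

# Toy bimodal P(n): an equal mixture of a "slow" and a "fast" transport channel.
n = np.arange(0, 80)
p = 0.5 * poisson.pmf(n, 10.0) + 0.5 * poisson.pmf(n, 30.0)
p /= p.sum()

c1, c2, c3 = cumulants(n, p)
print("Fano factor c2/c1:", c2 / c1)   # > 1, i.e. super-Poissonian
```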

  9. Neutron Coincidence Counting Studies

    SciTech Connect (OSTI)

    Rogers, Jeremy L.; Ely, James H.; Kouzes, Richard T.; Lintereur, Azaree T.; Siciliano, Edward R.

    2012-08-31

    The efficiency comparison for measured and simulated responses of a {sup 10}B-lined proportional counter and a {sup 3}He proportional counter in a close, symmetrical geometry is presented. The measurement geometry was modeled in MCNPX to validate the methods used for simulating the response of both the {sup 3}He and {sup 10}B-lined tubes. The MCNPX models agree within 1% with the {sup 3}He tube measurements and within 3% for the {sup 10}B-lined tubes when a 0.75-µm boron-metal lining is used.

  10. Event counting alpha detector

    DOE Patents [OSTI]

    Bolton, R.D.; MacArthur, D.W.

    1996-08-27

    An electrostatic detector is disclosed for atmospheric radon or other weak sources of alpha radiation. In one embodiment, nested enclosures are insulated from one another, open at the top, and have a high voltage pin inside and insulated from the inside enclosure. An electric field is produced between the pin and the inside enclosure. Air ions produced by collision with alpha particles inside the decay volume defined by the inside enclosure are attracted to the pin and the inner enclosure. With low alpha concentrations, individual alpha events can be measured to indicate the presence of radon or other alpha radiation. In another embodiment, an electrical field is produced between parallel plates which are insulated from a single decay cavity enclosure. 6 figs.