OSTI.GOV title logo U.S. Department of Energy
Office of Scientific and Technical Information

Title: GeeWiz Integrated Visualization Interface for SCALE 5.1

Abstract

The KENO V.a and KENO-VI three-dimensional Monte Carlo criticality computer codes in the SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory (ORNL) are widely used and accepted around the world for criticality safety analyses. As part of current development efforts to improve SCALE's ease of use, the SCALE project team at ORNL has developed a new integrated graphical visualization package for KENO V.a and KENO-VI in SCALE 5.1. This package uses the SCALE Graphically Enhanced Editing Wizard (GeeWiz) as the visualization control center that provides users the capability to set up, execute, plot, and view results from KENO in a friendly, colorful, and interactive computing environment without ever using a text editor or a command prompt.

Authors:
 Bowman, Stephen M [1]; Rearden, Bradley T [1]; Horwedel, James E [1]
  1. ORNL
Publication Date:
2007
Research Org.:
Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
Sponsoring Org.:
USDOE National Nuclear Security Administration (NNSA)
OSTI Identifier:
931782
DOE Contract Number:
AC05-00OR22725
Resource Type:
Conference
Resource Relation:
Conference: The 8th International Conference on Nuclear Criticality Safety, St. Petersburg, Russian Federation, May 28 – June 1, 2007
Country of Publication:
United States
Language:
English
Subject:
GeeWiz; SCALE 5.1

Citation Formats

Bowman, Stephen M, Rearden, Bradley T, and Horwedel, James E. GeeWiz Integrated Visualization Interface for SCALE 5.1. United States: N. p., 2007. Web.
Bowman, Stephen M, Rearden, Bradley T, & Horwedel, James E. GeeWiz Integrated Visualization Interface for SCALE 5.1. United States.
Bowman, Stephen M, Rearden, Bradley T, and Horwedel, James E. 2007. "GeeWiz Integrated Visualization Interface for SCALE 5.1". United States.
@article{osti_931782,
title = {GeeWiz Integrated Visualization Interface for SCALE 5.1},
author = {Bowman, Stephen M and Rearden, Bradley T and Horwedel, James E},
abstractNote = {The KENO V.a and KENO-VI three-dimensional Monte Carlo criticality computer codes in the SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory (ORNL) are widely used and accepted around the world for criticality safety analyses. As part of current development efforts to improve SCALE's ease of use, the SCALE project team at ORNL has developed a new integrated graphical visualization package for KENO V.a and KENO-VI in SCALE 5.1. This package uses the SCALE Graphically Enhanced Editing Wizard (GeeWiz) as the visualization control center that provides users the capability to set up, execute, plot, and view results from KENO in a friendly, colorful, and interactive computing environment without ever using a text editor or a command prompt.},
place = {United States},
year = {2007}
}

Conference:
Other availability
Please see Document Availability for additional information on obtaining the full-text document. Library patrons may search WorldCat to identify libraries that hold this conference proceeding.
