OSTI.GOV - U.S. Department of Energy
Office of Scientific and Technical Information

Title: Designing interpolation kernels for SAR data resampling.

Research Org.: Sandia National Laboratories
Sponsoring Org.: USDOE National Nuclear Security Administration (NNSA)
Resource Relation: Conference: Proposed for presentation at the SPIE Defense, Security & Sensing Symposium 2012, held April 23-27, 2012, in Baltimore, MD.
Country of Publication: United States

Citation Formats

Doerry, Armin W., Small, Donald M., Bishop, Edward, Miller, John, and Horndt, Volker. Designing interpolation kernels for SAR data resampling. United States: N. p., 2012. Web.
Doerry, Armin W., Small, Donald M., Bishop, Edward, Miller, John, & Horndt, Volker. Designing interpolation kernels for SAR data resampling. United States.
Doerry, Armin W., Small, Donald M., Bishop, Edward, Miller, John, and Horndt, Volker. 2012. "Designing interpolation kernels for SAR data resampling". United States.
@conference{osti_2012_sar_resampling,
  title  = {Designing interpolation kernels for SAR data resampling},
  author = {Doerry, Armin W. and Small, Donald M. and Bishop, Edward and Miller, John and Horndt, Volker},
  place  = {United States},
  year   = {2012},
  month  = {3}
}

Other availability
Please see Document Availability for additional information on obtaining the full-text document. Library patrons may search WorldCat to identify libraries that hold this conference proceeding.

  • Two main problems must be solved in the geometric processing of satellite data: geometric registration and resampling. When the data must be geometrically registered to a reference map, and particularly when the output pixel size differs from the original pixel size, the quality of the resampling can determine the quality of the output, not only in the visual appearance of the image but also in the numerically interpolated values when used in multitemporal or multisensor studies. The "optimum" interpolation algorithm for AVHRR data is defined over a 6 x 6 window in order to account for overlapping effects among adjacent pixels. The optimum method, as mathematically defined, is highly expensive in CPU time, so considerable effort is necessary to implement the algorithms in a form suitable for operational use. Two approaches are considered: a general numerical method (assuming a realistic spatial response function) and a pseudo-analytical approximation (assuming a simplified Gaussian pulse as the spatial response function). The analytical method requires only 2% of the CPU time required by the fully numerical approach. Some examples are given comparing the optimum interpolation technique with other traditional methods. A Landsat TM image corresponding to the same date as the AVHRR image is used to test the quality of the radiometric interpolation procedure. The main advantage of the optimum interpolation is that the resulting interpolated image loses the "memory" of the original pixel spacing, which is not true for classical interpolation approaches.
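The abstract above describes interpolation over a 6 x 6 window using a Gaussian-pulse approximation of the sensor's spatial response. A minimal sketch of that idea follows; this is not the paper's optimum kernel, and the `sigma` value, the normalization, and the function names are assumptions made purely for illustration.

```python
import numpy as np

def gaussian_kernel_weights(dx, dy, sigma=0.6, size=6):
    """Weights for a size x size interpolation window around the target
    point; dx, dy are fractional-pixel offsets in [0, 1). A Gaussian
    pulse stands in for the sensor's spatial response (an assumption)."""
    half = size // 2
    ii = np.arange(-half + 1, half + 1)      # source-pixel offsets, e.g. -2..3
    gx = np.exp(-0.5 * ((ii - dx) / sigma) ** 2)
    gy = np.exp(-0.5 * ((ii - dy) / sigma) ** 2)
    w = np.outer(gy, gx)
    return w / w.sum()                       # normalize to unit gain

def resample_point(img, x, y, sigma=0.6):
    """Interpolate img at fractional coordinates (x, y) using the 6x6 window."""
    ix, iy = int(np.floor(x)), int(np.floor(y))
    w = gaussian_kernel_weights(x - ix, y - iy, sigma)
    patch = img[iy - 2:iy + 4, ix - 2:ix + 4]   # 6x6 neighbourhood
    return float((w * patch).sum())
```

Because the weights are normalized to sum to one, a constant image interpolates to the same constant, a basic sanity property any resampling kernel should preserve.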
  • Design of data structures for high performance computing (HPC) is one of the principal challenges facing researchers looking to utilize heterogeneous computing machinery. Heterogeneous systems derive cost, power, and speed efficiency by being composed of the appropriate hardware for the task. Yet each type of processor requires a specific organization of the application state in order to achieve peak performance. Discovering this organization and refactoring the code can be a challenging and time-consuming task for the researcher, as the data structures and the computational model must be co-designed. We present a methodology that uses Python as the environment in which to explore tradeoffs in both the data structure design and the code executing on the computational accelerator. Our method enables multi-dimensional arrays to be used effectively in any target environment. We have chosen to focus on OpenMP and CUDA environments, thus exploring the development of optimized kernels for the two most common classes of computing hardware available today: multi-core CPU and GPU. Python's large palette of file and network access routines, its associative indexing syntax, and its support for common HPC environments make it relevant for diverse hardware ranging from laptops through computing clusters to the highest performance supercomputers. Our work enables researchers to accelerate the development of their codes on the computing hardware of their choice.
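The layout/kernel co-design tradeoff described above can be illustrated by contrasting two organizations of the same data in NumPy: array-of-structures versus structure-of-arrays. The particle fields and the update kernel below are hypothetical examples, not taken from the referenced work.

```python
import numpy as np

n = 1000

# Array-of-structures: one record per particle; fields interleaved in memory.
aos = np.zeros(n, dtype=[("x", "f8"), ("y", "f8"), ("z", "f8")])

# Structure-of-arrays: each field contiguous; typically the better layout for
# vectorized CPU loops (OpenMP) and coalesced GPU memory access (CUDA).
soa = {axis: np.zeros(n) for axis in ("x", "y", "z")}

def push_aos(p, dt, vx):
    p["x"] += vx * dt        # strided access across interleaved records

def push_soa(p, dt, vx):
    p["x"] += vx * dt        # unit-stride access over a contiguous array

push_aos(aos, 0.1, 2.0)
push_soa(soa, 0.1, 2.0)
```

The two kernels compute identical results; the point of co-design is that which layout runs faster depends on the target hardware's memory system, not on the arithmetic.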
  • Neutron shielding design is an important part of the construction of nuclear reactors and high-energy accelerators. Neutron shielding design is also indispensable in the packaging and storage of isotopic neutron sources. Most efforts in the development of neutron shielding design have been concentrated on nuclear reactor shielding because of its huge mass and strict requirement of accuracy. Sophisticated computational tools, such as transport and Monte Carlo codes, and detailed data libraries have been developed. In principle, neutron shielding, in spite of its complexity, can now be designed in any detail and with fine accuracy. However, in most practical cases, neutron shielding design is accomplished with simplified methods. Unlike practical gamma-ray shielding design, where exponential attenuation coupled with buildup factors has been applied effectively and accurately, simplified neutron shielding design, either by using removal cross sections or by applying charts or tables of transmission factors such as the National Council on Radiation Protection and Measurements (NCRP) 38 (Ref. 1) for general neutron protection or NCRP 51 (Ref. 2) for accelerator neutron shielding, is still very primitive and not well established. The available data are limited in energy range, materials, and thicknesses, and the estimated results are only roughly accurate. It is the purpose of this work to establish a simple, convenient, and user-friendly general-purpose computational tool for practical preliminary neutron shielding design that is reasonably accurate. A wide-range (energy, material, and thickness) database of dose transmission factors has been generated by applying one-dimensional transport calculations in slab geometry.
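The gamma-ray practice that the abstract above contrasts against, exponential attenuation corrected by a buildup factor, can be sketched in a few lines. The function name and parameters are illustrative assumptions and are not part of the tool the abstract describes.

```python
import math

def transmitted_dose(d0, mu, thickness, buildup=1.0):
    """Dose behind a slab shield: D = B * D0 * exp(-mu * t).

    d0:        unshielded dose at the receptor
    mu:        effective attenuation (or removal) coefficient, 1/cm
    thickness: slab thickness, cm
    buildup:   buildup factor B >= 1 accounting for scattered radiation
    """
    return buildup * d0 * math.exp(-mu * thickness)
```

For neutrons, as the abstract notes, no comparably simple and well-established formula exists; that gap is what the tabulated transmission-factor database is meant to fill.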
  • For many geologic problems, subsurface observations are available only from a small number of irregularly distributed locations, for example from a handful of drill holes in the region of interest. These observations will be interpolated one way or another, for example by hand-drawn stratigraphic cross-sections, by trend-fitting techniques, or by simple averaging which ignores spatial correlation. In this paper we consider an interpolation technique for such situations which provides, in addition to point estimates, the error estimates which are lacking from other ad hoc methods. The proposed estimator is like a kriging estimator in form, but because direct estimation of the spatial covariance function is not possible, the parameters of the estimator are selected by cross-validation. Its use in estimating subsurface stratigraphy at a candidate site for a geologic waste repository provides an example.
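The cross-validated, kriging-form approach described above can be sketched roughly as follows. The Gaussian weight function, the leave-one-out scoring, and the candidate-range search are illustrative assumptions; the paper's actual estimator form is not reproduced here.

```python
import numpy as np

def weighted_estimate(x0, xs, zs, r):
    """Kriging-form weighted average at location x0 with a Gaussian
    weight of range r (an assumed stand-in for a fitted covariance)."""
    d2 = np.sum((xs - x0) ** 2, axis=1)
    w = np.exp(-d2 / r**2)
    return float(w @ zs / w.sum())

def loo_score(xs, zs, r):
    """Leave-one-out cross-validation: mean squared prediction error."""
    errs = []
    for i in range(len(zs)):
        mask = np.arange(len(zs)) != i
        pred = weighted_estimate(xs[i], xs[mask], zs[mask], r)
        errs.append((pred - zs[i]) ** 2)
    return float(np.mean(errs))

def select_range(xs, zs, candidates):
    """Pick the range parameter whose LOO error is smallest, as a
    substitute for direct covariance estimation."""
    return min(candidates, key=lambda r: loo_score(xs, zs, r))
```

The key idea matches the abstract: when too few observations exist to fit a spatial covariance directly, the estimator's free parameter is chosen by how well it predicts each held-out observation.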
  • A two-dimensional advection model is developed at KNMI to advect and assimilate total ozone using a windfield at a single pressure level. The Advection Model KNMI (AMK), with a resolution of 110 x 110 km², describes the transport of total ozone, considering ozone as a passive tracer, using a simple linear advection equation. Ozone data measured by the TOVS instrument on the NOAA polar satellites and windfields from the MARS archives of ECMWF are used. By means of the AMK model, the TOVS total ozone maps, which are hampered by missing data, can be replaced by global total ozone maps at a given time without gaps in the data. The windfield that must be used for transporting total ozone is, however, not obvious. In this paper it is shown that the 200 hPa windfield is the optimal windfield to choose for advecting total ozone. Ground-based data are often used for validation of satellite measurements. By advection of the satellite total ozone data to the same location and time as the ground-based measurements, validation can in principle be improved. In this paper, Brewer total ozone measured at De Bilt, TOVS total ozone, and assimilated TOVS total ozone, for the pixel closest to De Bilt, are compared and the results are discussed.
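The simple linear advection equation mentioned above, with ozone treated as a passive tracer, can be sketched with a first-order upwind scheme. The scheme choice, the uniform grid, the constant wind, and the periodic boundaries are all assumptions made for illustration and are not necessarily what AMK uses.

```python
import numpy as np

def advect_upwind(q, u, v, dx, dt, steps=1):
    """First-order upwind steps for the 2-D linear advection equation
    dq/dt + u dq/dx + v dq/dy = 0 on a uniform grid of spacing dx,
    with a constant wind (u, v) and periodic boundaries; q is the
    passive tracer field (here, a stand-in for total ozone)."""
    for _ in range(steps):
        if u >= 0:                                    # wind from the left
            dqdx = (q - np.roll(q, 1, axis=1)) / dx
        else:                                         # wind from the right
            dqdx = (np.roll(q, -1, axis=1) - q) / dx
        if v >= 0:
            dqdy = (q - np.roll(q, 1, axis=0)) / dx
        else:
            dqdy = (np.roll(q, -1, axis=0) - q) / dx
        q = q - dt * (u * dqdx + v * dqdy)
    return q
```

With periodic boundaries the scheme conserves the total tracer amount, which is the property that lets an advected field stand in for missing satellite pixels without inventing or destroying ozone.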