OSTI.GOV | U.S. Department of Energy
Office of Scientific and Technical Information

Title: Multi-scale saliency search in image analysis.

Abstract

Saliency detection in images is an important outstanding problem, both in the design of machine-vision systems and in the understanding of human vision mechanisms. Seminal work by Itti and Koch recently produced an effective saliency-detection algorithm. We reproduce the original algorithm in a software application, Vision, and explore its limitations. We propose extensions to the algorithm that promise to improve performance on difficult-to-detect objects.
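
As a quick orientation to the method being reproduced, below is a minimal sketch of the multi-scale center-surround idea behind Itti and Koch's model, restricted to a single intensity channel. The function name, scales, and normalization here are illustrative assumptions; this snippet is not the report's Vision application.

import numpy as np
from scipy.ndimage import gaussian_filter

def saliency_map(image, center_sigmas=(1, 2, 4), surround_ratio=4):
    # Center-surround feature maps: |G(sigma_c) - G(sigma_s)| with a
    # coarser surround scale sigma_s = surround_ratio * sigma_c,
    # mimicking the fine-vs-coarse pyramid differences of the model.
    image = image.astype(np.float64)
    maps = []
    for sigma_c in center_sigmas:
        center = gaussian_filter(image, sigma_c)
        surround = gaussian_filter(image, surround_ratio * sigma_c)
        fmap = np.abs(center - surround)
        rng = fmap.max() - fmap.min()
        if rng > 0:                       # normalize so no single scale dominates
            fmap = (fmap - fmap.min()) / rng
        maps.append(fmap)
    return np.mean(maps, axis=0)          # peaks mark salient candidates

# Example: a small bright patch in noise is the most salient point.
rng = np.random.default_rng(0)
img = rng.normal(size=(128, 128))
img[48:56, 48:56] += 4.0                  # hypothetical target
sal = saliency_map(img)
y, x = np.unravel_index(np.argmax(sal), sal.shape)
print(f"most salient point: ({y}, {x})")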

Authors:
Slepoy, Alexander; Campisi, Anthony; Backer, Alejandro
Publication Date:
2005-10-01
Research Org.:
Sandia National Laboratories
Sponsoring Org.:
USDOE
OSTI Identifier:
875629
Report Number(s):
SAND2005-6057
TRN: US200603%%274
DOE Contract Number:
AC04-94AL85000
Resource Type:
Technical Report
Country of Publication:
United States
Language:
English
Subject:
99 GENERAL AND MISCELLANEOUS//MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; ALGORITHMS; DESIGN; DETECTION; PERFORMANCE; VISION; IMAGES; DATA ANALYSIS; V CODES; Digital techniques; Image analysis

Citation Formats

Slepoy, Alexander, Campisi, Anthony, and Backer, Alejandro. Multi-scale saliency search in image analysis. United States: N. p., 2005. Web. doi:10.2172/875629.
Slepoy, Alexander, Campisi, Anthony, & Backer, Alejandro. Multi-scale saliency search in image analysis. United States. doi:10.2172/875629.
Slepoy, Alexander, Campisi, Anthony, and Backer, Alejandro. 2005. "Multi-scale saliency search in image analysis." United States. doi:10.2172/875629. https://www.osti.gov/servlets/purl/875629.
@techreport{osti_875629,
title = {Multi-scale saliency search in image analysis},
author = {Slepoy, Alexander and Campisi, Anthony and Backer, Alejandro},
abstractNote = {Saliency detection in images is an important outstanding problem, both in the design of machine-vision systems and in the understanding of human vision mechanisms. Seminal work by Itti and Koch recently produced an effective saliency-detection algorithm. We reproduce the original algorithm in a software application, Vision, and explore its limitations. We propose extensions to the algorithm that promise to improve performance on difficult-to-detect objects.},
doi = {10.2172/875629},
number = {SAND2005-6057},
institution = {Sandia National Laboratories},
place = {United States},
year = {2005},
month = oct
}

Technical Report: https://www.osti.gov/servlets/purl/875629

Similar Records:
  • MREG V1.1 is the sixth-generation SAR image registration algorithm developed by the Signal Processing & Technology Department for Synthetic Aperture Radar applications. Like its predecessor algorithm REGI, it employs a powerful iterative multi-scale paradigm to achieve the competing goals of sub-pixel registration accuracy and the ability to handle large initial offsets. Since it is not model based, it allows for high-fidelity tracking of spatially varying terrain-induced misregistration. Since it does not rely on image-domain phase, it is equally adept at coherent and noncoherent image registration. This document provides a brief history of the registration processors developed by Dept. 5962 leading up to MREG V1.1, a full description of the signal processing steps involved in the algorithm, and a user's manual with application-specific recommendations for CCD, TwoColor MultiView, and SAR stereoscopy. (A toy illustration of the coarse-to-fine registration idea appears after this list.)
  • This report describes the work performed at the Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy's Office of Nonproliferation and National Security, Office of Research and Development (NN-20). The work supports the NN-20 Broad Area Search and Analysis, a program initiated by NN-20 to improve the detection and classification of undeclared weapons facilities. Ongoing PNNL research activities are described in three main components: image collection, information processing, and change analysis. The Multispectral Airborne Imaging System, which was developed to collect georeferenced imagery in the visible through infrared regions of the spectrum and is flown on a light-aircraft platform, will supply current land-use conditions. The image information extraction software (dynamic clustering and end-member extraction) uses imagery, like the multispectral data collected by the PNNL multispectral system, to efficiently generate landcover information. The advanced change detection uses a priori (benchmark) information, current landcover conditions, and user-supplied rules to rank suspect areas by probable risk of undeclared facilities or proliferation activities. These components, both separately and combined, provide important tools for improving the detection of undeclared facilities.
  • In FY97 we completed work on the MELD code, a comprehensive, multiple-length-scale, Graphical User Interface (GUI)-driven photonics design tool. In 1997 MELD was rated one of the one hundred most technologically significant new products of the year by Research and Development magazine.
  • In both our past work and the work in progress we focused on understanding the physics and statistical patterns in earthquake faults and fault systems. Our approach had three key aspects. The first was to look for patterns of seismic activity in earthquake fault systems. The second was to understand the physics of a sequence of models for faults and fault systems that are increasingly more realistic. The third key element was to connect the two previous approaches by investigating specific properties found in models to see if they are indeed properties of real faults. A specific example of how this approach works can be seen in the following: In the papers discussed below, we demonstrated that the cellular automaton (CA) versions of the slider-block models with long-range stress transfer are ergodic and could be described by a Boltzmann-Gibbs distribution in the mean-field limit. The ergodicity follows from the fact that the long-range stress transfer makes the model mean-field. The mean-field nature of the CA models, generated by long-range stress transfer, also allows a description of the CA models by a Langevin equation. The Langevin equation indicates that the evolution of seismicity in the model over relatively short times is linear in time. This appears to be consistent with the success of a forecasting algorithm we have developed that is based on a linear evolution of seismicity patterns. This algorithm has had considerable success in that the regions of the Southern California fault system which have been predicted to have a higher probability of an event greater than magnitude 5 have consistently been the sites where such events occur. These two results have led to the question of whether the Southern California fault system is ergodic and can be described by a Langevin equation like the model. To answer this question we ran a series of tests for ergodicity very much like the ones run on the models (a generic form of one such ergodicity diagnostic is sketched after this list). Our results, which have been accepted for publication in Physical Review Letters (Tiampo et al., in press), demonstrate that the Southern California system is ergodic in the same way that is seen in the models. These results will be discussed in more detail below. However, the point that needs to be emphasized is that it was the combination of model investigation via theory and simulation, coupled with assimilation and classification of real data and applying the methods of statistical mechanics to real fault systems, that led to both a successful forecasting algorithm and a deeper understanding of the nature of earthquake fault systems. This paper describes in some detail the results obtained in the previous funding period. We present these in three groups: (A) investigation of statistical physics models and applications; (B) earthquake fault systems and Green's functions for complex sources; and (C) space-time patterns, data analysis, and forecasting.
  • This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models, which enable a broader array of stochastic analyses of model runs to be conducted. However, it has only been demonstrated on foreign animal diseases. This paper applied the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census tract level into school-bound children, work-bound workers, the elderly, and stay-at-home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social-distancing countermeasures. Analysis from the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally; (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people; or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak. There are several limitations of the methodology that should be explored in future work, including validating the model against reliable historical disease data; improving contact rates, spread methods, and disease parameters through discussions with epidemiological experts; and incorporating realistic behavioral assumptions.
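
The MREG record above describes an iterative multi-scale registration paradigm. As a rough illustration only (not the MREG algorithm, whose details are in the cited report), here is a toy coarse-to-fine translation estimator using phase correlation; the function names, pyramid depth, and NumPy/SciPy usage are our own assumptions:

import numpy as np
from scipy.ndimage import shift as nd_shift, zoom

def estimate_shift(ref, mov):
    # Integer-pixel shift of mov relative to ref via phase correlation.
    F = np.conj(np.fft.fft2(ref)) * np.fft.fft2(mov)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the midpoint wrap around to negative shifts.
    return np.array([p if p <= n // 2 else p - n
                     for p, n in zip(peak, corr.shape)])

def register(ref, mov, levels=3):
    # Coarse-to-fine: large offsets are caught at coarse resolution,
    # then the estimate is refined at successively finer levels.
    total = np.zeros(2)
    for lvl in reversed(range(levels)):
        scale = 2 ** lvl
        r = zoom(ref, 1.0 / scale, order=1)
        m = zoom(nd_shift(mov, -total), 1.0 / scale, order=1)
        total += estimate_shift(r, m) * scale
    return total  # (dy, dx) of mov relative to ref

A real SAR registration chain would add sub-pixel interpolation of the correlation peak and spatially varying warp models, as the MREG abstract indicates.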
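
The earthquake-fault record above mentions tests for ergodicity and a Langevin description with short-time linear evolution. As a hedged illustration (a standard diagnostic from the statistical-mechanics literature, not necessarily the exact quantity used in the cited work), the Thirumalai-Mountain fluctuation metric over N sites is

\Omega(t) = \frac{1}{N}\sum_{i=1}^{N}\left[\bar{e}_i(t) - \overline{\langle e \rangle}(t)\right]^2,
\qquad \bar{e}_i(t) = \frac{1}{t}\int_0^t e_i(t')\,dt',

where \bar{e}_i(t) is the time average of activity at site i and \overline{\langle e \rangle}(t) is its spatial mean; effective ergodicity corresponds to \Omega(t) decaying as 1/t. A generic overdamped Langevin form, \dot{s} = -\partial V/\partial s + \eta(t) with white noise \eta, has a mean that drifts linearly over short times, which is the sense in which seismicity patterns can evolve linearly.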