OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Parallel, self-organizing, hierarchical neural networks

Abstract

This paper presents a new neural network architecture called the parallel, self-organizing, hierarchical neural network (PSHNN). The new architecture involves a number of stages in which each stage can be a particular neural network (SNN). At the end of each stage, error detection is carried out, and a number of input vectors are rejected. Between two stages there is a nonlinear transformation of those input vectors rejected by the previous stage. The new architecture has many desirable properties such as optimized system complexity in the sense of minimized self-organizing number of stages, high classification accuracy, minimized learning and recall times, and truly parallel architectures in which all stages are operating simultaneously without waiting for data from each other during testing. The experiments performed in comparison to multilayered networks with back-propagation training indicated the superiority of the new architecture.
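The staged reject-and-transform scheme the abstract describes can be sketched in code. In this minimal sketch, each stage is a least-squares linear classifier standing in for an arbitrary stage network, the between-stage nonlinearity is a tanh, and rejection uses a fixed error threshold; all three are illustrative assumptions, not the choices made in the paper.

```python
import numpy as np

def nonlinear_transform(X):
    # Fixed pointwise nonlinearity applied to rejected vectors between
    # stages; tanh is an illustrative stand-in, not the paper's transform.
    return np.tanh(3.0 * X)

def fit_pshnn(X, Y, n_stages=3, reject_thresh=0.4):
    # Train stages one after another: each stage is a least-squares linear
    # classifier (a stand-in for any stage neural network); vectors the
    # stage gets badly wrong are rejected and forwarded, nonlinearly
    # transformed, to the next stage.
    stages = []
    for _ in range(n_stages):
        W, *_ = np.linalg.lstsq(X, Y, rcond=None)
        stages.append(W)
        err = np.abs(X @ W - Y).max(axis=1)   # per-vector error detection
        rejected = err > reject_thresh
        if not rejected.any() or rejected.all():
            break
        X, Y = nonlinear_transform(X[rejected]), Y[rejected]
    return stages

def predict_pshnn(stages, x, reject_thresh=0.4):
    # At test time every stage can evaluate simultaneously; the label is
    # taken from the first stage whose output lies close to a class code,
    # with the final stage accepting unconditionally.
    for W in stages[:-1]:
        y = x @ W
        code = np.eye(len(y))[np.argmax(y)]
        if np.abs(y - code).max() <= reject_thresh:
            return int(np.argmax(y))
        x = nonlinear_transform(x)
    return int(np.argmax(x @ stages[-1]))

# Toy two-class problem (a bias column is appended so the linear stages
# have an offset term).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
X = np.hstack([X, np.ones((40, 1))])
Y = np.vstack([np.tile([1.0, 0.0], (20, 1)), np.tile([0.0, 1.0], (20, 1))])
truth = [0] * 20 + [1] * 20

stages = fit_pshnn(X, Y)
preds = [predict_pshnn(stages, x) for x in X]
```

Because each stage trains only on the vectors earlier stages rejected, the number of stages grows only as far as the hard cases require, which is the self-organizing aspect the abstract highlights.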

Authors:
Ersoy, O K; Hong, D [1]
  1. Purdue Univ., Lafayette, IN (USA). School of Electrical Engineering
Publication Date:
June 1990
OSTI Identifier:
7052269
Resource Type:
Journal Article
Journal Name:
Neural Networks; (USA)
Additional Journal Information:
Journal Volume: 1; Journal Issue: 2
Country of Publication:
United States
Language:
English
Subject:
99 GENERAL AND MISCELLANEOUS//MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; NEURAL NETWORKS; COMPUTERIZED SIMULATION; PARALLEL PROCESSING; COMPUTER ARCHITECTURE; ACCURACY; NONLINEAR PROBLEMS; PROGRAMMING; SIMULATION; 990200* - Mathematics & Computers

Citation Formats

Ersoy, O K, and Hong, D. Parallel, self-organizing, hierarchical neural networks. United States: N. p., 1990. Web. doi:10.1109/72.80229.
Ersoy, O K, & Hong, D. Parallel, self-organizing, hierarchical neural networks. United States. https://doi.org/10.1109/72.80229
Ersoy, O K, and Hong, D. 1990. "Parallel, self-organizing, hierarchical neural networks". United States. https://doi.org/10.1109/72.80229.
@article{osti_7052269,
title = {Parallel, self-organizing, hierarchical neural networks},
author = {Ersoy, O K and Hong, D},
abstractNote = {This paper presents a new neural network architecture called the parallel, self-organizing, hierarchical neural network (PSHNN). The new architecture involves a number of stages in which each stage can be a particular neural network (SNN). At the end of each stage, error detection is carried out, and a number of input vectors are rejected. Between two stages there is a nonlinear transformation of those input vectors rejected by the previous stage. The new architecture has many desirable properties such as optimized system complexity in the sense of minimized self-organizing number of stages, high classification accuracy, minimized learning and recall times, and truly parallel architectures in which all stages are operating simultaneously without waiting for data from each other during testing. The experiments performed in comparison to multilayered networks with back-propagation training indicated the superiority of the new architecture.},
doi = {10.1109/72.80229},
url = {https://www.osti.gov/biblio/7052269}, journal = {Neural Networks; (USA)},
number = {2},
volume = {1},
place = {United States},
year = {1990},
month = {6}
}