Neural node network and model, and method of teaching same
Abstract
The present invention is a fully connected feed forward network that includes at least one hidden layer. The hidden layer includes nodes in which the output of the node is fed back to that node as an input with a unit delay produced by a delay device occurring in the feedback path (local feedback). Each node within each layer also receives a delayed output (crosstalk) produced by a delay unit from all the other nodes within the same layer. The node performs a transfer function operation based on the inputs from the previous layer and the delayed outputs. The network can be implemented as analog or digital or within a general purpose processor. Two teaching methods can be used: (1) back propagation of weight calculation that includes the local feedback and the crosstalk or (2) more preferably a feed forward gradient descent which immediately follows the output computations and which also includes the local feedback and the crosstalk. Subsequent to the gradient propagation, the weights can be normalized, thereby preventing convergence to a local optimum. Education of the network can be incremental both on and off-line. An educated network is suitable for modeling and controlling dynamic nonlinear systems and time series systems and predicting the outputs as well as hidden states and parameters. The educated network can also be further educated during on-line processing.
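The layer update described in the abstract (feedforward inputs from the previous layer, combined with the unit-delayed output of the node itself as local feedback and the unit-delayed outputs of its same-layer neighbors as crosstalk) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; all weights, dimensions, and the `tanh` transfer function are assumptions for demonstration.

```python
import math

def layer_step(x, h_prev, W_in, W_rec, b):
    """One time step of a hidden layer with local feedback and crosstalk.

    W_rec[i][i] weights node i's own unit-delayed output (local feedback);
    W_rec[i][j] for j != i weights the unit-delayed outputs of the other
    nodes in the same layer (crosstalk). All recurrent terms come from
    h_prev, the layer's outputs at the previous time step.
    """
    h = []
    for i in range(len(b)):
        s = b[i]
        # feedforward contribution from the previous layer's outputs
        s += sum(W_in[i][k] * x[k] for k in range(len(x)))
        # unit-delayed local feedback (j == i) and crosstalk (j != i)
        s += sum(W_rec[i][j] * h_prev[j] for j in range(len(h_prev)))
        # node transfer function (tanh chosen here for illustration)
        h.append(math.tanh(s))
    return h

# hypothetical 2-input, 3-node hidden layer
W_in = [[0.5, -0.2], [0.1, 0.3], [-0.4, 0.2]]
W_rec = [[0.20, 0.05, 0.05],   # diagonal: local feedback
         [0.05, 0.20, 0.05],   # off-diagonal: crosstalk
         [0.05, 0.05, 0.20]]
b = [0.0, 0.0, 0.0]

h = [0.0, 0.0, 0.0]            # delayed outputs start at zero
for x in ([1.0, 0.0], [0.0, 1.0]):
    h = layer_step(x, h, W_in, W_rec, b)
print(h)
```

Because the recurrent terms are delayed by one step, the layer's outputs can be computed in a single forward pass per time step, which is what makes the patent's second training method (a feedforward gradient computation immediately following the output computation) possible.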
- Inventors:
- Parlos, A G; Atiya, A F; Fernandez, B; Tsai, W K; Chong, K T
- Issue Date:
- 26 Dec 1995
- Research Org.:
- Texas A & M Univ., College Station, TX (United States). Texas A & M Engineering Experiment Station
- Sponsoring Org.:
- USDOE
- OSTI Identifier:
- 170476
- Patent Number(s):
- 5479571
- Application Number:
- PAN: 8-046,936; CNN: Grant NAG9-491
- Assignee:
- Texas A and M Univ. System, College Station, TX (United States)
- DOE Contract Number:
- FG07-89ER12893
- Resource Type:
- Patent
- Resource Relation:
- Other Information: PBD: 26 Dec 1995
- Country of Publication:
- United States
- Language:
- English
- Subject:
- 99 MATHEMATICS, COMPUTERS, INFORMATION SCIENCE, MANAGEMENT, LAW, MISCELLANEOUS; NEURAL NETWORKS; DESIGN; TIME DELAY; ANALOG SYSTEMS; DIGITAL SYSTEMS; LEARNING; USES; ON-LINE SYSTEMS
Citation Formats
Parlos, A G, Atiya, A F, Fernandez, B, Tsai, W K, and Chong, K T. Neural node network and model, and method of teaching same. United States: N. p., 1995. Web.
@article{osti_170476,
title = {Neural node network and model, and method of teaching same},
author = {Parlos, A G and Atiya, A F and Fernandez, B and Tsai, W K and Chong, K T},
abstractNote = {The present invention is a fully connected feed forward network that includes at least one hidden layer. The hidden layer includes nodes in which the output of the node is fed back to that node as an input with a unit delay produced by a delay device occurring in the feedback path (local feedback). Each node within each layer also receives a delayed output (crosstalk) produced by a delay unit from all the other nodes within the same layer. The node performs a transfer function operation based on the inputs from the previous layer and the delayed outputs. The network can be implemented as analog or digital or within a general purpose processor. Two teaching methods can be used: (1) back propagation of weight calculation that includes the local feedback and the crosstalk or (2) more preferably a feed forward gradient descent which immediately follows the output computations and which also includes the local feedback and the crosstalk. Subsequent to the gradient propagation, the weights can be normalized, thereby preventing convergence to a local optimum. Education of the network can be incremental both on and off-line. An educated network is suitable for modeling and controlling dynamic nonlinear systems and time series systems and predicting the outputs as well as hidden states and parameters. The educated network can also be further educated during on-line processing. 21 figs.},
place = {United States},
year = {1995},
month = {12}
}