
Back propagation parameter analysis on multiprocessors

Conference · Neural Networks; (United States)
OSTI ID: 6024227
To develop artificial neural network systems that can be scaled up to practical tasks such as pattern recognition or speech processing, powerful computing tools are essential. Multiprocessors are becoming increasingly popular for simulating and studying large networks, since the inherent parallelism of many neural architectures and learning algorithms lends itself quite naturally to implementation on concurrent processors. In this study, a multiprocessor system based on the Inmos transputer has been used to examine the stability and convergence rate of the back propagation algorithm as a function of changes in parameters such as activation values, number of hidden units, learning rate, momentum, and initial weight and bias configurations.

The Victor V32 is a prototype low-cost message-passing multiprocessor system, designed and implemented by the Victor project in the Microsystems and VLSI group at the IBM T.J. Watson Research Center. A sample topology for the system is 32 nodes in a fixed 4 x 8 mesh. A host processor, interfaced to a PC AT and connected to one corner of the mesh, provides screen and disk I/O. Each of the 32 nodes consists of an Inmos T414 transputer and 4 megabytes of local memory, and four high-speed (20 Mbit/s) serial links provide communication among the nodes.
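For context, the sketch below illustrates the kind of parameter sweep the abstract describes: a plain back propagation update with momentum for a one-hidden-layer network, where learning rate, momentum, number of hidden units, and the initial weight scale can be varied and the final error observed. It is a minimal, assumed illustration only; the XOR task, the hyperparameter values, and all function names are placeholders, not details taken from the paper or from the Victor V32 implementation.

```python
# Minimal back propagation sketch with the parameters named in the abstract:
# learning rate, momentum, number of hidden units, and initial weight scale.
# Task, values, and structure are illustrative assumptions, not the paper's setup.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(n_hidden=4, lr=0.5, momentum=0.9, init_scale=0.5,
              epochs=5000, seed=0):
    rng = np.random.default_rng(seed)
    # XOR as a stand-in benchmark task.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    # Initial weight and bias configuration (one of the studied parameters).
    W1 = rng.uniform(-init_scale, init_scale, (2, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.uniform(-init_scale, init_scale, (n_hidden, 1))
    b2 = np.zeros(1)

    # Previous weight changes, kept for the momentum term.
    dW1_prev = np.zeros_like(W1); db1_prev = np.zeros_like(b1)
    dW2_prev = np.zeros_like(W2); db2_prev = np.zeros_like(b2)

    for _ in range(epochs):
        # Forward pass.
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)

        # Backward pass (sigmoid derivative = y * (1 - y)).
        delta_out = (T - Y) * Y * (1 - Y)
        delta_hid = (delta_out @ W2.T) * H * (1 - H)

        # Gradient step with momentum: dw(t) = lr * grad + momentum * dw(t-1).
        dW2 = lr * H.T @ delta_out + momentum * dW2_prev
        db2 = lr * delta_out.sum(axis=0) + momentum * db2_prev
        dW1 = lr * X.T @ delta_hid + momentum * dW1_prev
        db1 = lr * delta_hid.sum(axis=0) + momentum * db1_prev

        W2 += dW2; b2 += db2; W1 += dW1; b1 += db1
        dW2_prev, db2_prev, dW1_prev, db1_prev = dW2, db2, dW1, db1

    return float(np.mean((T - Y) ** 2))

if __name__ == "__main__":
    # Sweep learning rate and momentum to observe convergence behaviour.
    for lr in (0.1, 0.5, 1.0):
        for momentum in (0.0, 0.5, 0.9):
            print(f"lr={lr:<4} momentum={momentum:<4} "
                  f"final MSE={train_xor(lr=lr, momentum=momentum):.4f}")
```

Sweeping two parameters at a time, as in the usage block above, mirrors the kind of stability and convergence-rate comparison the study performs, though here on a toy problem rather than on the transputer mesh.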
Research Organization:
Dept. of Electrical Engineering, Columbia Univ., New York, NY (US)
OSTI ID:
6024227
Report Number(s):
CONF-8809132-
Conference Information:
Journal Name: Neural Networks; (United States)
Journal Volume: 1:1
Country of Publication:
United States
Language:
English