Complexity theory of neural networks. Final technical report, 15 Sep-14 Apr 91
Significant progress has been made in laying the foundations of a complexity theory of neural networks. The fundamental complexity classes have been identified and studied. The class of problems solvable by small, shallow neural networks has been found to remain the same even if (1) probabilistic behaviour, (2) multi-valued logic, and (3) analog behaviour are allowed (subject to certain reasonable technical assumptions). Neural networks can be made provably fault-tolerant by physically separating the summation units from the thresholding units. New results have also been obtained on the complexity of approximation, communication complexity, the complexity of learning from examples and counterexamples, learning with multi-valued neurons, exponential lower bounds for restricted neural networks, and fault tolerance in distributed computation.
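The fault-tolerance result above hinges on splitting a linear threshold neuron into two physically separate stages. A minimal sketch of that decomposition, with illustrative names (`sum_unit`, `threshold_unit`) that are assumptions rather than terminology from the report:

```python
# Hedged sketch: a linear threshold neuron factored into a separate
# summation stage and a thresholding stage, the decomposition the
# report credits with enabling provable fault tolerance.

def sum_unit(weights, inputs):
    """Summation stage: computes only the weighted sum (inner product)."""
    return sum(w * x for w, x in zip(weights, inputs))

def threshold_unit(activation, theta):
    """Thresholding stage: outputs 1 iff the activation reaches theta."""
    return 1 if activation >= theta else 0

def threshold_neuron(weights, inputs, theta):
    """Full neuron: summation followed by thresholding."""
    return threshold_unit(sum_unit(weights, inputs), theta)

# Example: a 3-input majority gate (fires when at least two inputs are 1).
print(threshold_neuron([1, 1, 1], [1, 0, 1], 2))  # -> 1
print(threshold_neuron([1, 1, 1], [1, 0, 0], 2))  # -> 0
```

Keeping the two stages separate means a fault in one thresholding unit cannot corrupt the shared summation, which is the intuition behind the separation; the report's actual construction and proof are not reproduced here.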
- Research Organization: Pennsylvania State Univ., University Park, PA (United States). Dept. of Computer Science
- OSTI ID: 6090840
- Report Number(s): AD-A-241807/7/XAB; CNN: AFOSR-87-0400
- Country of Publication: United States
- Language: English
Related Subjects
NEURAL NETWORKS
ANALOG SYSTEMS
BEHAVIOR
COMMUNICATIONS
DISTRIBUTED DATA PROCESSING
DISTRIBUTION
FAULT TOLERANT COMPUTERS
LEARNING
NERVE CELLS
ANIMAL CELLS
COMPUTERS
DATA PROCESSING
DIGITAL COMPUTERS
PROCESSING
SOMATIC CELLS
990200* - Mathematics & Computers