# Learning rate and attractor size of the single-layer perceptron

## Abstract

We study the simplest possible order-one single-layer perceptron with two inputs, using the delta rule with online learning, in order to derive closed-form expressions for the mean convergence rates. We investigate the rate of convergence in weight space of the weight vectors corresponding to each of the 14 linearly separable rules out of the 16 possible Boolean rules on two inputs. These vectors follow zigzagging paths through the piecewise-constant vector field to their respective attractors. Based on our studies, we conclude that a single-layer perceptron with N inputs will converge in an average number of steps given by an Nth-order polynomial in (t/l), where t is the threshold and l is the size of the initial weight distribution. Exact values for these averages are provided for the five linearly separable classes with N=2. We also demonstrate that the learning rate is determined by the attractor size, and that the attractors of a single-layer perceptron with N inputs partition R^N + R^N.
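As a concrete illustration of the setting in the abstract, the sketch below (an assumption for illustration, not the authors' code) trains a two-input perceptron with the online delta rule on each of the 16 Boolean rules and counts how many it learns exactly; the function name `train_perceptron`, the learning rate, the epoch budget, and the uniform initial weight distribution are all illustrative choices.

```python
import itertools
import random

INPUTS = [(0, 0), (0, 1), (1, 0), (1, 1)]

def train_perceptron(rule, epochs=500, lr=0.1):
    """Online delta-rule training of a two-input perceptron.

    `rule` maps each input pair in {0,1}^2 to a target in {0,1}.
    Returns True if the trained perceptron reproduces the rule exactly,
    which happens precisely when the rule is linearly separable.
    """
    random.seed(0)  # weights drawn from a symmetric initial distribution
    w = [random.uniform(-1, 1), random.uniform(-1, 1)]
    b = random.uniform(-1, 1)  # bias term, playing the role of the threshold t

    def output(x):
        return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

    for _ in range(epochs):
        for x in INPUTS:
            err = rule[x] - output(x)
            # Delta rule: step the weight vector toward its attractor
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return all(output(x) == rule[x] for x in INPUTS)

# Enumerate all 16 Boolean rules on two inputs and count the learnable ones.
rules = [dict(zip(INPUTS, outs)) for outs in itertools.product([0, 1], repeat=4)]
separable = sum(train_perceptron(r) for r in rules)
print(separable)  # 14: only XOR and XNOR are not linearly separable
```

By the perceptron convergence theorem, every separable rule is learned after finitely many weight updates, so the count stabilizes at 14 regardless of the (finite) initial weights; only the two non-separable rules, XOR and XNOR, keep cycling.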

- Authors:

- Singleton, Martin S.; Huebler, Alfred W.
- Department of Physics, University of Illinois at Urbana-Champaign, Urbana, Illinois 61801 (United States)

- Publication Date:
- February 15, 2007

- OSTI Identifier:
- 21072391

- Resource Type:
- Journal Article

- Resource Relation:
- Journal Name: Physical Review. E, Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics; Journal Volume: 75; Journal Issue: 2; Other Information: DOI: 10.1103/PhysRevE.75.026704; (c) 2007 The American Physical Society; Country of input: International Atomic Energy Agency (IAEA)

- Country of Publication:
- United States

- Language:
- English

- Subject:
- 71 CLASSICAL AND QUANTUM MECHANICS, GENERAL PHYSICS; ATTRACTORS; CONVERGENCE; DISTRIBUTION; LAYERS; LEARNING; POLYNOMIALS; VECTOR FIELDS; VECTORS

### Citation Formats

```
Singleton, Martin S., Huebler, Alfred W., and Department of Physics, University of Illinois at Urbana-Champaign, Urbana, Illinois 61801. Learning rate and attractor size of the single-layer perceptron. United States: N. p., 2007. Web. doi:10.1103/PHYSREVE.75.026704.
```

```
Singleton, Martin S., Huebler, Alfred W., & Department of Physics, University of Illinois at Urbana-Champaign, Urbana, Illinois 61801. Learning rate and attractor size of the single-layer perceptron. United States. doi:10.1103/PHYSREVE.75.026704.
```

```
Singleton, Martin S., Huebler, Alfred W., and Department of Physics, University of Illinois at Urbana-Champaign, Urbana, Illinois 61801. Thu, Feb 15, 2007.
"Learning rate and attractor size of the single-layer perceptron". United States.
doi:10.1103/PHYSREVE.75.026704.
```

```
@article{osti_21072391,
  title = {Learning rate and attractor size of the single-layer perceptron},
  author = {Singleton, Martin S. and Huebler, Alfred W. and Department of Physics, University of Illinois at Urbana-Champaign, Urbana, Illinois 61801},
  abstractNote = {We study the simplest possible order one single-layer perceptron with two inputs, using the delta rule with online learning, in order to derive closed form expressions for the mean convergence rates. We investigate the rate of convergence in weight space of the weight vectors corresponding to each of the 14 out of 16 linearly separable rules. These vectors follow zigzagging lines through the piecewise constant vector field to their respective attractors. Based on our studies, we conclude that a single-layer perceptron with N inputs will converge in an average number of steps given by an Nth order polynomial in (t/l), where t is the threshold, and l is the size of the initial weight distribution. Exact values for these averages are provided for the five linearly separable classes with N=2. We also demonstrate that the learning rate is determined by the attractor size, and that the attractors of a single-layer perceptron with N inputs partition R^N + R^N.},
  doi = {10.1103/PHYSREVE.75.026704},
  journal = {Physical Review. E, Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics},
  number = 2,
  volume = 75,
  place = {United States},
  year = {2007},
  month = {feb}
}
```