Approximate gradient projection method and backpropagation algorithm
Conference · OSTI ID:36245
We analyze the convergence of an approximate gradient projection method for minimizing the sum of continuously differentiable functions over a nonempty closed convex set. In this method, the functions are aggregated and, at each iteration, a succession of gradient steps, one for each of the aggregate functions, is applied and the result is projected onto the convex set. We show that if the gradients of the functions are bounded and Lipschitz continuous over a certain level set, and the stepsizes are chosen either proportional to a certain residual squared or square summable, then every cluster point of the iterates is a stationary point. We apply these results to the backpropagation algorithm to obtain new deterministic convergence results for that algorithm. We also report numerical simulation results.
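A minimal sketch of the iteration the abstract describes, not the paper's exact algorithm: minimize a sum of smooth component functions over a closed convex set by taking one gradient step per component function and then projecting the result onto the set, with square-summable stepsizes. The component functions (simple quadratics), the box constraint set, and the stepsize constant are all illustrative assumptions.

```python
import numpy as np

def project_box(x, lo=-1.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n (elementwise clipping)."""
    return np.clip(x, lo, hi)

# Illustrative component functions f_i(x) = 0.5 * ||x - t_i||^2,
# whose gradients are x - t_i; the targets t_i are made up for this sketch.
targets = [np.array([2.0, -3.0]), np.array([0.5, 0.5]), np.array([-2.0, 1.0])]
grads = [lambda x, t=t: x - t for t in targets]

def incremental_gradient_projection(x0, n_iters=200, c=0.5):
    x = x0.copy()
    for k in range(n_iters):
        alpha = c / (k + 1)       # square-summable (but not summable) stepsizes
        y = x
        for g in grads:           # a succession of gradient steps,
            y = y - alpha * g(y)  # one per aggregate (component) function
        x = project_box(y)        # project the result onto the convex set
    return x

x = incremental_gradient_projection(np.array([0.0, 0.0]))
```

Here the unconstrained minimizer of the sum is the mean of the targets, which lies inside the box, so the iterates approach it; with a target (per-sample error) for each training example, the inner loop is exactly one backpropagation sweep over the data per outer iteration.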
- Report Number(s): CONF-9408161
- Country of Publication: United States
- Language: English
Similar Records
- A new interior point method for the variational inequality problem · Conference · 1994 · OSTI ID:36155
- A Regularization Newton Method for Solving Nonlinear Complementarity Problems · Journal Article · 1999 · Applied Mathematics and Optimization · OSTI ID:21067537
- Projected gradient methods for linearly constrained problems · Technical Report · 1986 · OSTI ID:5424690