On collinear scaling algorithms that extend quasi-Newton methods
Quasi-Newton methods for unconstrained minimization are based on local affine scalings of the domain and local quadratic approximations to the objective function that interpolate the gradient. Quasi-Newton algorithms have finite termination on quadratic functions and are invariant under affine scalings. In 1980, Davidon presented a new class of algorithms for unconstrained minimization based on local collinear scalings of the domain and local conic approximations to the objective function that interpolate both the gradient and the function value. We refer to these algorithms as Davidon's collinear scaling algorithms. Collinear scalings and conic functions generalize affine scalings and quadratic functions, respectively, so collinear scaling algorithms extend quasi-Newton algorithms. Davidon's collinear scaling algorithms have finite termination on conic functions and are invariant under collinear scalings. In this talk, we present a new derivation of Davidon's collinear scaling algorithms. It indicates that the collinear scaling algorithms extending quasi-Newton methods that have been studied to date differ from those of Davidon. It also explains why it has not been possible to demonstrate that these other collinear scaling algorithms have finite termination on conic functions and are invariant under collinear scalings.
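To make the comparison concrete, the following is a minimal sketch in LaTeX of the local models involved; the notation (step s, gradient g_k, model matrices B_k and A_k, gauge vector b_k) is assumed here for illustration rather than taken from the talk, and the conic model is written in one standard form rather than as Davidon's specific construction.

% Quadratic model used by quasi-Newton methods: interpolates f(x_k) and the gradient g_k.
\[
q_k(x_k + s) \;=\; f(x_k) \;+\; g_k^{\top} s \;+\; \tfrac{1}{2}\, s^{\top} B_k\, s
\]
% One standard conic model: interpolates f(x_k) and g_k, with an extra gauge vector b_k.
\[
c_k(x_k + s) \;=\; f(x_k) \;+\; \frac{g_k^{\top} s}{1 + b_k^{\top} s}
\;+\; \frac{s^{\top} A_k\, s}{2\,(1 + b_k^{\top} s)^{2}},
\qquad
w \;=\; \frac{s}{1 + b_k^{\top} s}.
\]

Under the collinear scaling s -> w the conic model becomes the quadratic f(x_k) + g_k^T w + (1/2) w^T A_k w, and setting b_k = 0 recovers the quadratic model and the affine case; this is the sense in which collinear scalings and conic functions generalize affine scalings and quadratic functions.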
- OSTI ID: 35777
- Report Number(s): CONF-9408161--
- Country of Publication: United States
- Language: English