OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Large-scale sequential quadratic programming algorithms

Abstract

The problem addressed is the general nonlinear programming problem: finding a local minimizer of a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information while maintaining the property of superlinear convergence. The main features are:
1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having the full Hessian approximation available is studied, and alternative estimates are constructed.
2. The use of a transformation matrix Q, which allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained.
3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven.
4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem.
An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
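The iteration structure described above can be illustrated with a small sketch. This is a didactic toy for the equality-constrained case only, not the MINOS-based implementation from the report: it uses an orthogonal QR null-space basis for clarity (the report advocates a cheaper reduced-gradient basis at scale), handles no inequalities or working set, and omits the line search and merit function. All function names here are illustrative assumptions.

```python
import numpy as np

def reduced_hessian_sqp(f, grad_f, c, jac_c, x0, max_iter=50, tol=1e-8):
    """Toy reduced-Hessian SQP for: minimize f(x) subject to c(x) = 0.

    A quasi-Newton (BFGS) approximation is maintained only for the
    reduced Hessian Z'HZ, of size (n-m) x (n-m), rather than the full
    n x n Hessian -- the key storage saving described in the abstract.
    """
    x = np.asarray(x0, dtype=float)
    n, m = x.size, c(x0).size
    Bz = np.eye(n - m)                     # reduced-Hessian approximation
    for _ in range(max_iter):
        g, J = grad_f(x), jac_c(x)
        # Orthogonal null-space basis of J via complete QR of J'.
        Q, _ = np.linalg.qr(J.T, mode='complete')
        Y, Z = Q[:, :m], Q[:, m:]
        # Range-space step restores feasibility: J (Y py) = -c(x).
        py = np.linalg.solve(J @ Y, -c(x))
        # Null-space step from the reduced QP: (Z'HZ) pz = -Z'g,
        # dropping the cross term H Y py since only Z'HZ is available.
        pz = np.linalg.solve(Bz, -Z.T @ g)
        p = Y @ py + Z @ pz
        if np.linalg.norm(p) < tol:
            break
        x_new = x + p
        # BFGS update of the reduced Hessian with reduced quantities.
        s = Z.T @ p
        y = Z.T @ (grad_f(x_new) - g)
        if s @ y > 1e-12:                  # curvature condition
            Bs = Bz @ s
            Bz = Bz - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
        x = x_new
    return x
```

For example, minimizing x1^2 + x2^2 subject to x1 + x2 = 1 from the origin converges to (0.5, 0.5) in two iterations, the first step restoring feasibility and the second confirming that the reduced gradient vanishes.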

Authors:
Eldersveld, S.K.
Publication Date:
September 1992
Research Org.:
Stanford Univ., CA (United States). Systems Optimization Lab.
Sponsoring Org.:
USDOE, Washington, DC (United States); Department of Defense, Washington, DC (United States); National Science Foundation, Washington, DC (United States)
OSTI Identifier:
10102731
Report Number(s):
SOL-92-4
ON: DE93002528; CNN: Grant DDM-9204208; Grant N00014-90-J-1242
DOE Contract Number:
FG03-92ER25117
Resource Type:
Technical Report
Resource Relation:
Other Information: PBD: Sep 1992
Country of Publication:
United States
Language:
English
Subject:
99 GENERAL AND MISCELLANEOUS//MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; NONLINEAR PROGRAMMING; ALGORITHMS; FUNCTIONS; NEWTON METHOD; LAGRANGIAN FUNCTION; CONVERGENCE; 990200; MATHEMATICS AND COMPUTERS

Citation Formats

Eldersveld, S.K. Large-scale sequential quadratic programming algorithms. United States: N. p., 1992. Web. doi:10.2172/10102731.
Eldersveld, S.K. Large-scale sequential quadratic programming algorithms. United States. doi:10.2172/10102731.
Eldersveld, S.K. 1992. "Large-scale sequential quadratic programming algorithms". United States. doi:10.2172/10102731. https://www.osti.gov/servlets/purl/10102731.
@techreport{osti_10102731,
title = {Large-scale sequential quadratic programming algorithms},
author = {Eldersveld, S. K.},
abstractNote = {The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.},
doi = {10.2172/10102731},
number = {SOL-92-4},
institution = {Stanford Univ., CA (United States). Systems Optimization Lab.},
place = {United States},
year = {1992},
month = {9}
}

Related records:
  • The problem considered is that of finding local minimizers for a function subject to general nonlinear inequality constraints, when first and perhaps second derivatives are available. The methods studied belong to the class of sequential quadratic programming (SQP) algorithms. In particular, the methods are based on the SQP algorithm embodied in the code NPSOL, which was developed at the Systems Optimization Laboratory, Stanford University. The goal of this paper is to develop SQP algorithms that allow some flexibility in their design. Specifically, we are interested in introducing modifications that enable the algorithms to solve large-scale problems efficiently. The following issues are considered in detail: instead of trying to obtain the search direction as a minimizer for the QP, the solution process is terminated after a limited number of iterations. Suitable termination criteria are defined that ensure convergence for an algorithm that uses a quasi-Newton approximation for the full Hessian. For many problems the reduced Hessian is considerably smaller than the full Hessian. Consequently, there are considerable practical benefits to be gained by only requiring an approximation to the reduced Hessian. Theorems are proved concerning the convergence and rate of convergence for an algorithm that uses a quasi-Newton approximation for the reduced Hessian when early termination of the QP subproblem is enforced. The use of second derivatives, while having significant practical advantages, introduces new difficulties; for example, the QP subproblems may be non-convex, and even a minimizer for the subproblem is no longer guaranteed to yield a suitable search direction. Also, theorems are proved for the convergence and rate of convergence of these algorithms. Finally, some numerical results, obtained from a modification of the code NPSOL, are presented. 43 refs., 4 tabs.
  • We analyze sequential quadratic programming (SQP) methods to solve nonlinear constrained optimization problems that are more flexible in their definition than standard SQP methods. The type of flexibility introduced is motivated by the necessity to deviate from the standard approach when solving large problems. Specifically, we no longer require a minimizer of the QP subproblem to be determined or particular Lagrange multiplier estimates to be used. Our main focus is on an SQP algorithm that uses a particular augmented Lagrangian merit function. New results are derived for this algorithm under weaker conditions than previously assumed; in particular, it is not assumed that the iterates lie on a compact set.
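The augmented Lagrangian merit function mentioned in the last record can be written down concretely for the equality-constrained case: M(x) = f(x) - λᵀc(x) + (ρ/2)‖c(x)‖². A minimal sketch, with the understanding that the record's exact merit function, multiplier updates, and penalty rules are specified in the cited work, not here:

```python
import numpy as np

def augmented_lagrangian_merit(f, c, x, lam, rho):
    """Augmented Lagrangian merit function for: minimize f(x) s.t. c(x) = 0.

        M(x; lam, rho) = f(x) - lam' c(x) + (rho/2) ||c(x)||^2

    At a feasible point (c(x) = 0) the merit value reduces to f(x); away
    from feasibility the quadratic penalty term dominates for large rho,
    so decreasing M balances optimality against constraint violation.
    """
    cx = c(x)
    return f(x) - lam @ cx + 0.5 * rho * (cx @ cx)
```

In a line-search SQP method, a step along the search direction is accepted only if it sufficiently decreases such a merit function, which is what ties the incomplete QP solutions discussed above to global convergence.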