OSTI.GOV
U.S. Department of Energy
Office of Scientific and Technical Information

Title: Nonlinearly-constrained optimization using asynchronous parallel generating set search.

Abstract

Many optimization problems in computational science and engineering (CS&E) are characterized by expensive objective and/or constraint function evaluations paired with a lack of derivative information. Direct search methods such as generating set search (GSS) are well understood and efficient for derivative-free optimization of unconstrained and linearly-constrained problems. This paper addresses the more difficult problem of general nonlinear programming where derivatives for objective or constraint functions are unavailable, which is the case for many CS&E applications. We focus on penalty methods that use GSS to solve the linearly-constrained problems, comparing different penalty functions. A classical choice for penalizing constraint violations is $\ell_2^2$, the squared $\ell_2$ norm, which has advantages for derivative-based optimization methods. In our numerical tests, however, we show that exact penalty functions based on the $\ell_1$, $\ell_2$, and $\ell_\infty$ norms converge to good approximate solutions more quickly and thus are attractive alternatives. Unfortunately, exact penalty functions are nondifferentiable and consequently introduce theoretical problems that degrade the final solution accuracy, so we also consider smoothed variants. Smoothed-exact penalty functions are theoretically attractive because they retain the differentiability of the original problem. Numerically, they are a compromise between exact and $\ell_2^2$, i.e., they converge to a good solution somewhat quickly without sacrificing much solution accuracy. Moreover, the smoothing is parameterized and can potentially be adjusted to balance the two considerations. Since many CS&E optimization problems are characterized by expensive function evaluations, reducing the number of function evaluations is paramount, and the results of this paper show that exact and smoothed-exact penalty functions are well-suited to this task.
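For concreteness, the penalty subproblems being compared can be written out as follows. This is a sketch in standard notation rather than the report's exact formulation: $f$ is the objective, $c(x) \le 0$ collects the nonlinear constraints, $\Omega$ is the linearly-constrained region that GSS handles directly, $\rho > 0$ is the penalty parameter, and $c(x)_+ = \max\{c(x), 0\}$ componentwise.

```latex
% Penalized subproblem solved by GSS (sketch):
\min_{x \in \Omega} \; f(x) + \rho \, P(x)

% Candidate penalty terms compared in the report:
P_{\ell_2^2}(x)    = \| c(x)_+ \|_2^2      % classical, smooth
P_{\ell_1}(x)      = \| c(x)_+ \|_1        % exact, nondifferentiable
P_{\ell_2}(x)      = \| c(x)_+ \|_2        % exact, nondifferentiable
P_{\ell_\infty}(x) = \| c(x)_+ \|_\infty   % exact, nondifferentiable
```

A smoothed-exact variant can be obtained by replacing $\max\{t, 0\}$ inside the norm with a differentiable approximation; one common choice (not necessarily the report's) is $\phi_\mu(t) = \mu \log(1 + e^{t/\mu})$, which converges to $\max\{t, 0\}$ as the smoothing parameter $\mu \downarrow 0$, so $\mu$ can be tuned to trade convergence speed against final accuracy.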

Authors:
Griffin, Joshua D.; Kolda, Tamara Gibson
Publication Date:
May 1, 2007
Research Org.:
Sandia National Laboratories
Sponsoring Org.:
USDOE
OSTI Identifier:
909393
Report Number(s):
SAND2007-3257
TRN: US200722%%1088
DOE Contract Number:
AC04-94AL85000
Resource Type:
Technical Report
Country of Publication:
United States
Language:
English
Subject:
99 GENERAL AND MISCELLANEOUS//MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; NONLINEAR PROGRAMMING; OPTIMIZATION; CALCULATION METHODS; NONLINEAR PROBLEMS; CONVERGENCE; Optimization methods; Mathematical optimization

Citation Formats

Griffin, Joshua D., and Kolda, Tamara Gibson. Nonlinearly-constrained optimization using asynchronous parallel generating set search. United States: N. p., 2007. Web. doi:10.2172/909393.
Griffin, Joshua D., & Kolda, Tamara Gibson. Nonlinearly-constrained optimization using asynchronous parallel generating set search. United States. doi:10.2172/909393.
Griffin, Joshua D., and Kolda, Tamara Gibson. 2007. "Nonlinearly-constrained optimization using asynchronous parallel generating set search." United States. doi:10.2172/909393. https://www.osti.gov/servlets/purl/909393.
@techreport{osti_909393,
title = {Nonlinearly-constrained optimization using asynchronous parallel generating set search},
author = {Griffin, Joshua D. and Kolda, Tamara Gibson},
abstractNote = {Many optimization problems in computational science and engineering (CS\&E) are characterized by expensive objective and/or constraint function evaluations paired with a lack of derivative information. Direct search methods such as generating set search (GSS) are well understood and efficient for derivative-free optimization of unconstrained and linearly-constrained problems. This paper addresses the more difficult problem of general nonlinear programming where derivatives for objective or constraint functions are unavailable, which is the case for many CS\&E applications. We focus on penalty methods that use GSS to solve the linearly-constrained problems, comparing different penalty functions. A classical choice for penalizing constraint violations is $\ell_2^2$, the squared $\ell_2$ norm, which has advantages for derivative-based optimization methods. In our numerical tests, however, we show that exact penalty functions based on the $\ell_1$, $\ell_2$, and $\ell_\infty$ norms converge to good approximate solutions more quickly and thus are attractive alternatives. Unfortunately, exact penalty functions are nondifferentiable and consequently introduce theoretical problems that degrade the final solution accuracy, so we also consider smoothed variants. Smoothed-exact penalty functions are theoretically attractive because they retain the differentiability of the original problem. Numerically, they are a compromise between exact and $\ell_2^2$, i.e., they converge to a good solution somewhat quickly without sacrificing much solution accuracy. Moreover, the smoothing is parameterized and can potentially be adjusted to balance the two considerations. Since many CS\&E optimization problems are characterized by expensive function evaluations, reducing the number of function evaluations is paramount, and the results of this paper show that exact and smoothed-exact penalty functions are well-suited to this task.},
doi = {10.2172/909393},
number = {SAND2007-3257},
institution = {Sandia National Laboratories},
place = {United States},
year = {2007},
month = may
}

Similar Records
  • Generating set search (GSS) is a family of direct search methods that encompasses generalized pattern search and related methods. We describe an algorithm for asynchronous linearly-constrained GSS, which has some complexities that make it different from both the asynchronous bound-constrained case and the synchronous linearly-constrained case. The algorithm has been implemented in the APPSPACK software framework and we present results from an extensive numerical study using CUTEr test problems. We discuss the results, both positive and negative, and conclude that GSS is a reliable method for solving small- to medium-sized linearly-constrained optimization problems without derivatives.
  • We describe an asynchronous parallel derivative-free algorithm for linearly-constrained optimization. Generating set search (GSS) is the basis of our method. At each iteration, a GSS algorithm computes a set of search directions and corresponding trial points and then evaluates the objective function value at each trial point. Asynchronous versions of the algorithm have been developed in the unconstrained and bound-constrained cases, which allow the iterations to continue (and new trial points to be generated and evaluated) as soon as any trial point completes. This enables better utilization of parallel resources and a reduction in overall runtime, especially for problems where the objective function takes minutes or hours to compute. For linearly-constrained GSS, the convergence theory requires that the set of search directions conform to the nearby boundary. The complexity of developing the asynchronous algorithm for the linearly-constrained case has to do with maintaining a suitable set of search directions as the search progresses, and this is the focus of this research. We describe our implementation in detail, including how to avoid function evaluations by caching function values and using approximate look-ups. We test our implementation on every CUTEr test problem with general linear constraints and up to 1000 variables. Without tuning to individual problems, our implementation was able to solve 95% of the test problems with 10 or fewer variables, 75% of the problems with 11-100 variables, and nearly half of the problems with 100-1000 variables. To the best of our knowledge, these are the best results that have ever been achieved with a derivative-free method. Our asynchronous parallel implementation is freely available as part of the APPSPACK software. (A minimal sketch of this kind of asynchronous loop with caching appears after this list.)
  • We derive new stationarity results for derivative-free, generating set search methods for linearly constrained optimization. We show that a particular measure of stationarity is of the same order as the step length at an identifiable subset of the iterations. Thus, even in the absence of explicit knowledge of the derivatives of the objective function, we still have information about stationarity. These results help both unify the convergence analysis of several classes of direct search algorithms and clarify the fundamental geometrical ideas that underlie them. In addition, these results validate a practical stopping criterion for such algorithms. (The stationarity measure in question is sketched after this list.)
  • Parallel pattern search (PPS) can be quite useful for engineering optimization problems characterized by a small number of variables (say 10--50) and by expensive objective function evaluations such as complex simulations that take from minutes to hours to run. However, PPS, which was originally designed for execution on homogeneous and tightly-coupled parallel machines, is not well suited to the more heterogeneous, loosely-coupled, and even fault-prone parallel systems available today. Specifically, PPS is hindered by synchronization penalties and cannot recover in the event of a failure. The authors introduce a new asynchronous and fault tolerant parallel pattern search (AAPS) method and demonstrate its effectiveness on both simple test problems and some engineering optimization problems.
  • APPSPACK is software for solving unconstrained and bound-constrained optimization problems. It implements an asynchronous parallel pattern search method that has been specifically designed for problems characterized by expensive function evaluations. Using APPSPACK to solve optimization problems has several advantages: no derivative information is needed; the procedure for evaluating the objective function can be executed via a separate program or script; the code can be run in serial or parallel, regardless of whether or not the function evaluation itself is parallel; and the software is freely available. We describe the underlying algorithm, data structures, and features of APPSPACK version 4.0, as well as how to use and customize the software. (A hypothetical example of such a separate evaluation script appears below.)
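The second summary above describes the core loop: generate trial points along search directions, evaluate them in parallel, and act on results as they arrive, caching values to avoid repeat evaluations. Below is a minimal, hypothetical Python sketch of such an asynchronous GSS loop with a cache. It is not the APPSPACK implementation: it covers only the unconstrained case with coordinate directions (the linearly-constrained algorithm must additionally keep the directions conforming to the nearby boundary), and all names in it are illustrative.

```python
# Minimal, hypothetical sketch of an asynchronous GSS loop with a
# function-value cache. Not the APPSPACK implementation: unconstrained
# case only, coordinate directions, illustrative names throughout.
from concurrent.futures import FIRST_COMPLETED, ProcessPoolExecutor, wait

import numpy as np


def expensive_objective(x):
    """Stand-in for a simulation that takes minutes to hours."""
    x = np.asarray(x, dtype=float)
    return float(np.sum((x - 1.0) ** 2))


def gss_async(f, x0, step=1.0, tol=1e-6, max_workers=4, max_evals=500):
    x_best = np.asarray(x0, dtype=float)
    f_best = f(x_best)
    n = x_best.size
    # Generating set for the unconstrained case: +/- coordinate directions.
    directions = [s * e for e in np.eye(n) for s in (1.0, -1.0)]
    cache = {}  # rounded point -> value (the "approximate look-up")
    evals = 0

    def key(x):
        # Round coordinates so numerically identical points match.
        return tuple(np.round(x, 10))

    def trials():
        return [x_best + step * d for d in directions]

    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        pending = {}  # future -> trial point
        queue = trials()
        while step > tol and evals < max_evals:
            # Keep all workers busy; skip points already in the cache.
            while queue and len(pending) < max_workers:
                x = queue.pop()
                if key(x) not in cache:
                    pending[pool.submit(f, x)] = x
            if not pending:   # every current trial point was cached
                step *= 0.5   # contract the step and regenerate trials
                queue = trials()
                continue
            # Asynchrony: act as soon as ANY evaluation completes.
            done, _ = wait(pending, return_when=FIRST_COMPLETED)
            for fut in done:
                x = pending.pop(fut)
                fx = fut.result()
                cache[key(x)] = fx
                evals += 1
                if fx < f_best:          # success: recenter immediately
                    x_best, f_best = x, fx
                    queue = trials()
            if not queue and not pending:
                step *= 0.5              # unsuccessful sweep: contract
                queue = trials()
    return x_best, f_best


if __name__ == "__main__":
    x, fx = gss_async(expensive_objective, np.zeros(3))
    print("solution:", x, "objective:", fx)
```

The essential asynchronous feature is the FIRST_COMPLETED wait: the loop recenters and generates new trial points as soon as any single evaluation finishes, rather than blocking on a whole synchronized batch, so slow evaluations never idle the other workers.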
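The stationarity measure referred to in the third summary is, in the GSS literature, typically the following quantity for a linearly-constrained feasible set $\Omega$ (a sketch; the report's exact notation may differ):

```latex
\chi(x) \;=\; \max_{\substack{x + w \,\in\, \Omega \\ \|w\| \,\le\, 1}} \; -\nabla f(x)^{\mathsf{T}} w
```

Here $\chi(x) \ge 0$, and $\chi(x) = 0$ exactly when $x$ satisfies the first-order optimality conditions on $\Omega$. The result described above then takes the form $\chi(x_k) = O(\Delta_k)$ at the identifiable subset of iterations, where $\Delta_k$ is the step length, which is why the step length itself can serve as a practical stopping criterion even though $\nabla f$ is never computed.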
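The last summary notes that APPSPACK evaluates the objective through a separate program or script. The sketch below shows what such a standalone evaluator can look like. The calling convention assumed here (program invoked with an input file of parameter values and an output file to receive the objective value) is an illustration, not quoted from the report; consult the APPSPACK documentation for the actual file formats.

```python
#!/usr/bin/env python3
# Hypothetical standalone evaluator for an asynchronous parallel pattern
# search driver. Assumed convention (NOT quoted from the report):
#   evaluator.py <input_file> <output_file>
# where <input_file> holds one parameter value per line and <output_file>
# receives either the objective value or an error marker.
import sys


def objective(x):
    """Stand-in for an expensive simulation run."""
    return sum((xi - 1.0) ** 2 for xi in x)


def main():
    input_file, output_file = sys.argv[1], sys.argv[2]
    with open(input_file) as fin:
        x = [float(line) for line in fin if line.strip()]
    try:
        result = repr(objective(x))
    except Exception:
        result = "EVALUATION_FAILED"  # error marker (assumed convention)
    with open(output_file, "w") as fout:
        fout.write(result + "\n")


if __name__ == "__main__":
    main()
```

Because the driver only needs to launch a program and read back a number, the simulation itself can be serial or parallel, written in any language, and even run on a different machine, which is exactly the decoupling the summary describes.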