Overview
The NLP procedure (NonLinear Programming)
offers a set of optimization techniques for minimizing or
maximizing a continuous nonlinear function f(x) of n
decision variables, x = (x1, ... ,xn)T, subject to lower and upper
bounds and to linear and nonlinear, equality and inequality
constraints. This can be expressed as solving

   minimize    f(x),   x = (x1, ... ,xn)T
   subject to  ci(x)  = 0,       i = 1, ... ,me
               ci(x) >= 0,       i = me+1, ... ,m
               li <= xi <= ui,   i = 1, ... ,n

where f is the objective function, the ci's are the constraint
functions, and the ui's and li's are the upper and lower bounds.
Problems of this type are found in many settings ranging from
optimal control to maximum likelihood estimation.
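For illustration, a small problem of this general form can be stated directly with PROC NLP statements. The following is a minimal sketch; the objective, constraints, and starting values are arbitrary choices made only to show the bound, linear, and nonlinear constraint statements.

   proc nlp tech=quanew;                  /* quasi-Newton optimization         */
      min f;                              /* variable holding the objective    */
      decvar x1 = .5, x2 = .5;            /* decision variables and starts     */
      bounds 0 <= x1 <= 2, 0 <= x2 <= 2;  /* lower and upper bounds            */
      lincon x1 + x2 <= 1.5;              /* linear inequality constraint      */
      nlincon c1 >= 0;                    /* nonlinear inequality constraint   */
      c1 = x1*x2 - .1;                    /* value of the nonlinear constraint */
      f  = (x1 - 1)**2 + (x2 - 2)**2;     /* value of the objective function   */
   run;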
The NLP procedure provides a number of algorithms for solving
this problem that take advantage of special structure on
the objective function and constraints.
One example is the quadratic programming problem:

   minimize    f(x) = (1/2) xT G x + gT x
   subject to  ci(x) <= bi,  ci(x) = bi,  or  ci(x) >= bi

where the ci(x)'s are linear functions; g = (g1, ... ,gn)T and
b = (b1, ... ,bn)T are vectors; and G is an n × n symmetric matrix.
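A small dense quadratic program of this form can be written out directly with programming statements (the INQUAD= data set, described below, is an alternative for large or sparse problems). The matrix, vector, and constraint values in this sketch are arbitrary illustrations:

   proc nlp tech=newrap;          /* Newton-Raphson with line search        */
      min f;
      decvar x1 = 0, x2 = 0;
      lincon x1 + x2 <= 10;       /* a linear constraint ci(x) <= bi        */
      /* f(x) = (1/2) xT G x + gT x  with  G = [4 1; 1 2],  g = (-1, -2)    */
      f = 2*x1**2 + x1*x2 + x2**2 - x1 - 2*x2;
   run;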
Another example is the least-squares problem:

   minimize    f(x) = (1/2) [ f1(x)^2 + ... + fm(x)^2 ]
   subject to  linear constraints ci(x)

where the ci(x)'s are linear functions and f1(x), ... ,fm(x) are
nonlinear functions of x.
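For example, the classic Rosenbrock function can be stated as an unconstrained least-squares problem with the LSQ statement, usually paired with the Levenberg-Marquardt technique; the starting point (-1.2, 1) is the customary one:

   proc nlp tech=levmar;          /* Levenberg-Marquardt for least squares  */
      lsq y1 y2;                  /* minimize .5*(y1**2 + y2**2)            */
      decvar x1 = -1.2, x2 = 1;
      y1 = 10 * (x2 - x1*x1);     /* f1(x)                                  */
      y2 = 1 - x1;                /* f2(x)                                  */
   run;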
The following problems are handled by PROC NLP.
- quadratic programming with an option for sparse problems
- unconstrained minimization/maximization
- constrained minimization/maximization
- linear complementarity problem
The following optimization techniques are supported in PROC NLP.
- Quadratic Active Set Technique
- Trust-Region Method
- Newton-Raphson Method With Line-Search
- Newton-Raphson Method With Ridging
- Quasi-Newton Methods
- Double-Dogleg Method
- Conjugate Gradient Methods
- Nelder-Mead Simplex Method
- Levenberg-Marquardt Method
- Hybrid Quasi-Newton Methods
These optimization techniques require a continuous objective
function f, and all but one (NMSIMP) require continuous first-order
derivatives of f. Some of the techniques also require continuous
second-order derivatives.
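For example, because the Nelder-Mead simplex technique uses no derivatives, it can be applied to a continuous but nondifferentiable objective. The objective and starting values in this sketch are chosen only to make that point:

   proc nlp tech=nmsimp;               /* derivative-free simplex search    */
      min f;
      decvar x1 = 2, x2 = -1;
      f = abs(x1 - 1) + abs(x2 + 3);   /* continuous but not differentiable */
   run;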
There are three ways to compute derivatives in PROC NLP:
- analytically (using a special derivative compiler), the default method
- via finite difference approximations
- via user-supplied exact or approximate numerical functions
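For instance, first-order derivatives can be supplied with a GRADIENT statement whose variables are computed in the programming statements; if it is omitted, the derivative compiler (or a finite-difference approximation, if requested) provides them instead. The objective below is only an illustration:

   proc nlp tech=congra;          /* conjugate gradient needs first derivatives */
      min f;
      decvar x1 = 0, x2 = 0;
      gradient g1 g2;             /* g1 = df/dx1, g2 = df/dx2                   */
      f  = (x1 - 1)**2 + (x2 - 3)**2 + x1*x2;
      g1 = 2*(x1 - 1) + x2;       /* user-supplied partial derivatives          */
      g2 = 2*(x2 - 3) + x1;
   run;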
Nonlinear programs can be input into the procedure
in various ways.
The objective, constraint, and derivative functions are
specified using the programming statements of PROC NLP.
In addition, information in SAS data sets can be used to define
the structure of objectives and constraints, and to specify
constants used in objectives, constraints, and derivatives.
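For instance, a DATA= input data set (described in the list that follows) can supply observations that the programming statements turn into one least-squares residual per observation. The data values, variable names, and model in this sketch are hypothetical:

   data points;               /* hypothetical observed (x, y) pairs          */
      input x y;
      datalines;
   1 2.1
   2 3.9
   3 6.2
   4 8.1
   ;

   proc nlp data=points tech=levmar;
      lsq r;                  /* sum of squared residuals over all obs       */
      decvar a = 1, b = 0;    /* slope and intercept to be estimated         */
      r = y - (a*x + b);      /* residual for the current observation        */
   run;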
PROC NLP uses data sets to input various pieces of information.
- The DATA= data set enables you to specify data shared by
all functions involved in a least-squares problem.
- The INQUAD= data set contains the arrays appearing in a
quadratic programming problem.
- The INVAR= data set
specifies initial values for the decision variables,
the values of constants that are referred
to in the program statements, and simple boundary and
general linear constraints.
- The MODEL= data set specifies a model (functions, constraints,
derivatives) saved at a previous execution of the NLP procedure.
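For illustration, starting values might be read from an INVAR= data set. The layout sketched below, a _TYPE_ variable with the value PARMS and one variable per decision variable, follows the usual convention for SAS estimate data sets and is an assumption here rather than a quotation from this documentation:

   data start;                         /* assumed layout of an INVAR= data set   */
      input _type_ $ x1 x2;            /* _TYPE_='PARMS' row of starting values  */
      datalines;
   PARMS 0.5 1.5
   ;

   proc nlp invar=start tech=trureg;   /* read starting values from INVAR=       */
      min f;
      decvar x1 x2;                    /* starts are taken from the data set     */
      f = (x1 - 2)**2 + (x2 + 1)**2;
   run;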
PROC NLP uses data sets to output various results.
- The OUTVAR= data set saves the values of the decision variables,
the derivatives, the solution, and the covariance matrix at the
solution.
- The OUT= output data set contains variables generated in the
program statements defining the objective function as well as
selected variables of the DATA= input data set, if available.
- The OUTMODEL= data set saves the programming statements. It can be
used as the MODEL= input data set in a subsequent execution of the
procedure.
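For instance, one run can save both the programming statements and the solution, and a later run can reuse them. This two-step sketch assumes work data set names mod1 and sol1 and restates the declaration statements in the second step:

   proc nlp outmodel=mod1 outvar=sol1 tech=newrap;  /* save statements and solution */
      min f;
      decvar x1 = 0, x2 = 0;
      f = (x1 - 1)**2 + (x2 - 2)**2 + x1*x2;
   run;

   proc nlp model=mod1 invar=sol1 tech=trureg;      /* reuse the saved statements   */
      min f;                                        /* and warm-start from sol1     */
      decvar x1 x2;
   run;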