conSolve solves general dense or sparse constrained nonlinear optimization problems, with explicit handling of
linear inequality and equality constraints,
nonlinear inequality and equality constraints,
and simple bounds on the variables.
The main algorithm is a sequential quadratic programming (SQP) method
by Schittkowski. If the QP subproblem fails to give a search step, an augmented
Lagrangian merit function gradient subspace step with line search is used.
conSolve uses second-order information for the objective function and
first-order information for the constraints, if available.
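As background, the augmented Lagrangian merit function referred to above combines the objective, a multiplier term, and a quadratic penalty on the constraint violation. The sketch below shows the classical equality-constrained form only; Schittkowski's actual merit function also treats inequalities and per-constraint penalty weights, so this is illustrative, not conSolve's internal code.

```python
import numpy as np

def augmented_lagrangian_merit(f, c, lam, rho):
    """Augmented Lagrangian merit value for equality constraints c(x) = 0.

    f   : objective value at the current point x
    c   : constraint residual vector c(x)
    lam : Lagrange multiplier estimates
    rho : scalar penalty parameter

    Larger rho penalizes constraint violation more strongly; when
    c(x) = 0 the merit value reduces to the objective value f.
    """
    c = np.asarray(c, dtype=float)
    return f - lam @ c + 0.5 * rho * (c @ c)
```

A line search on this function accepts steps that reduce a weighted combination of objective and infeasibility, rather than the objective alone.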
Two methods are implemented:

1. The Schittkowski SQP method (default). Reference: Klaus Schittkowski,
On the Convergence of a Sequential Quadratic Programming Method with an
Augmented Lagrangian Line Search Function, Systems Optimization Laboratory,
Dept. of Operations Research, Stanford University, Stanford, CA 94305,
January 1982.

2. The Han-Powell SQP method. See e.g. Luenberger, Linear and Nonlinear
Programming.
The QP subproblems are generally indefinite when explicit Hessians are
used. The QP solvers qpSolve or QPOPT, or the general
solver MINOS, may be used to solve these subproblems.
MINOS is more suitable for large problems.
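For intuition, when only equality constraints are active the QP subproblem can be written as minimizing 0.5 p'Hp + g'p subject to A p + c = 0, and solved directly through its KKT system. This is a simplified sketch of the subproblem structure, not the method used by qpSolve, QPOPT, or MINOS; note it works even for indefinite H as long as the KKT matrix is nonsingular.

```python
import numpy as np

def eq_qp_step(H, g, A, c):
    """Solve the equality-constrained QP subproblem
        min_p  0.5 p'Hp + g'p   s.t.  A p + c = 0
    via its KKT system:
        [H  A'] [p  ]   [-g]
        [A  0 ] [lam] = [-c]
    Returns the search step p and the multiplier estimates lam."""
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])
    rhs = -np.concatenate([g, c])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]
```

The multiplier estimates lam from the subproblem also feed the merit function used in the line search.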
The line search on the augmented Lagrangian merit
function is a modified version of an algorithm
by Fletcher (1987).
Safeguarded BFGS quasi-Newton
updates are used when Hessian information is missing.
The initial BFGS matrix may be supplied by the user if prior
knowledge about the Hessian is available.
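One common way to safeguard a BFGS update in SQP methods is Powell's damping, which keeps the Hessian approximation positive definite even when the raw curvature condition fails. The snippet below is a generic sketch of that technique; the exact safeguarding rule used by conSolve is not specified here.

```python
import numpy as np

def damped_bfgs_update(B, s, y, theta_min=0.2):
    """Powell-damped BFGS update of a Hessian approximation B.

    B : current symmetric positive definite approximation
    s : step, x_new - x_old
    y : gradient difference of the Lagrangian

    If the curvature s'y is too small relative to s'Bs, y is
    interpolated toward B s so the updated matrix stays positive
    definite.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    if sy < theta_min * sBs:
        theta = (1.0 - theta_min) * sBs / (sBs - sy)
        y = theta * y + (1.0 - theta) * Bs
        sy = s @ y
    return B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / sy
```

Even with negative curvature in y, the damped update returns a positive definite matrix, which keeps the QP subproblems well-behaved.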
Gradients and constraint Jacobians that are not supplied are
estimated numerically
using any of the Tomlab methods.
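The simplest of the available derivative-estimation schemes is forward differencing; the sketch below shows the idea for a constraint Jacobian. It is for illustration only, since Tomlab provides several estimation methods and its own step-length heuristics.

```python
import numpy as np

def fd_jacobian(c, x, h=1e-7):
    """Forward-difference estimate of the Jacobian of c at x.

    c : function mapping an n-vector to an m-vector of constraint values
    x : point at which to estimate the Jacobian
    h : finite-difference step (illustrative fixed step; production
        codes scale h per component)
    """
    x = np.asarray(x, dtype=float)
    c0 = np.asarray(c(x), dtype=float)
    J = np.empty((c0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (np.asarray(c(xp), dtype=float) - c0) / h
    return J
```

Each column costs one extra constraint evaluation, which is why analytic Jacobians are preferred when available.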
Either an analytic Hessian, a numerical Hessian, or a Hessian estimated by
BFGS updates may be used with both methods.
A numerical Hessian is generally less stable than the
other two choices, and if the gradients are also estimated numerically,
a numerical Hessian should be avoided.
If a rank problem occurs in the merit function minimization,
the solver uses subspace minimization techniques.
A rank tolerance parameter may be set by the user.
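To illustrate what the rank tolerance controls, a numerical rank can be determined from the singular values of the active constraint matrix, and minimization can then be restricted to the resulting null space. This is a generic SVD-based sketch under that interpretation; conSolve's internal subspace technique and the precise meaning of its tolerance are not spelled out here.

```python
import numpy as np

def subspace_basis(A, rank_tol=1e-10):
    """Numerical rank of A under a relative tolerance, plus an
    orthonormal basis of its null space.

    Singular values below rank_tol times the largest singular value
    are treated as zero; the columns of the returned basis span the
    subspace in which a restricted minimization step can be taken."""
    U, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > rank_tol * s[0]))
    null_basis = Vt[rank:].T  # columns span the null space of A
    return rank, null_basis
```

A larger tolerance treats nearly dependent constraint gradients as redundant, trading some accuracy for robustness when the constraint matrix is ill-conditioned.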