Mixed-Integer Nonlinear Programming
If the user can provide at least first derivatives for both the objective and the nonlinear constraints, or first derivatives can be obtained by automatic differentiation with TOMLAB /MAD, the minlpBB solver in TOMLAB /MINLP is the best choice. minlpBB uses an outer-approximation branch-and-bound algorithm and solves problems that are convex in the continuous variables, as well as many nonconvex problems. For nonconvex problems, the result from minlpBB depends on the initial guess of the unknowns.

Another approach for MINLP is the TOMLAB /OQNLP solver, which combines statistical sampling methods with intelligent choices of initial points for local searches, with the aim of finding the global minimum among the local minima found.

For low-dimensional black-box problems, the glcCluster solver in the TOMLAB Base Module offers a solution similar to TOMLAB /OQNLP. Here the sampling is deterministic, using the DIRECT algorithm as implemented in the glcFast solver. A clustering algorithm is then applied to the sampled points to find a good set of initial points from which to start local searches. By default glcCluster uses conSolve for the local searches, but any TOMLAB nonlinear programming solver can be used; NPSOL or SNOPT in TOMLAB /SOL are recommended.

Another approach for low-dimensional black-box problems, when the required accuracy is not that high, is to run glcFast for a larger number of iterations. This is especially appropriate if the function is noisy or nonsmooth, where derivative-based methods are likely to have trouble.
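The sample-cluster-search strategy described above for glcCluster and OQNLP can be sketched in generic terms. The following is not TOMLAB code; it is a minimal Python/SciPy illustration of the idea, assuming uniform random sampling instead of DIRECT and a plain k-means clustering step. All function names here (multistart_cluster, himmelblau) are illustrative, not part of any TOMLAB API.

```python
import numpy as np
from scipy.cluster.vq import kmeans
from scipy.optimize import minimize

def multistart_cluster(f, bounds, n_samples=200, keep_frac=0.2,
                       n_clusters=4, seed=0):
    """Sketch of a sample-cluster-local-search global strategy.

    1. Sample the box defined by `bounds` (random here; glcCluster
       samples deterministically with DIRECT).
    2. Keep the best fraction of the sampled points.
    3. Cluster the kept points and run a local solver from each
       cluster center, returning the best local minimum found.
    """
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    pts = rng.uniform(lo, hi, size=(n_samples, len(bounds)))
    vals = np.apply_along_axis(f, 1, pts)
    # Retain the lowest-valued sample points as candidate basins.
    best = pts[np.argsort(vals)[: int(keep_frac * n_samples)]]
    centers, _ = kmeans(best, n_clusters, seed=seed)
    # Local searches (L-BFGS-B by default when bounds are given);
    # glcCluster would call conSolve, NPSOL, or SNOPT here instead.
    results = [minimize(f, c, bounds=bounds) for c in centers]
    return min(results, key=lambda r: r.fun)

def himmelblau(x):
    # Classic multimodal test function with four global minima at f = 0.
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

res = multistart_cluster(himmelblau, [(-6, 6), (-6, 6)])
```

With several cluster centers feeding independent local searches, at least one search typically lands in a basin of a global minimizer, which is the same rationale behind running NPSOL or SNOPT from the clustered DIRECT sample in glcCluster.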
It is also possible to solve these problems using one of the global solvers included in the TOMLAB Base Module, such as glcFast and rbfSolve. Solver reference:
