rbfSolve
Solves costly box-bounded global optimization problems with additional linear, nonlinear and integer constraints
using a Radial Basis Function (RBF) interpolation algorithm.
The idea of the rbfSolve algorithm
is to first fit a response surface to data collected by evaluating the
objective function at a few points.
Then, rbfSolve balances local and global search by
cycling through a sequence of target values.
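The surface-fitting step can be sketched as follows. This is an illustrative Python reconstruction using a cubic basis with a linear polynomial tail, as is standard for RBF surrogates; the function and helper names are assumptions, not rbfSolve's actual code:

```python
import numpy as np

def fit_rbf(X, f, phi):
    # Fit s(x) = sum_i lam_i * phi(||x - x_i||) + b.x + a.
    # The linear tail (b, a), with the orthogonality conditions
    # P' lam = 0, makes the interpolation system uniquely solvable.
    n, d = X.shape
    Phi = phi(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1))
    P = np.hstack([X, np.ones((n, 1))])
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    sol = np.linalg.solve(A, np.concatenate([f, np.zeros(d + 1)]))
    return sol[:n], sol[n:]  # RBF weights lam, tail coefficients (b, a)

def eval_rbf(x, X, lam, tail, phi):
    r = np.linalg.norm(X - x, axis=-1)
    return phi(r) @ lam + tail[:-1] @ x + tail[-1]

cubic = lambda r: r ** 3

# Four samples of a cheap stand-in for a costly objective:
X = np.array([[0.0], [0.3], [0.6], [1.0]])
f = (X[:, 0] - 0.4) ** 2
lam, tail = fit_rbf(X, f, cubic)

# The surrogate interpolates the data at the sample points:
print(abs(eval_rbf(np.array([0.3]), X, lam, tail, cubic) - f[1]) < 1e-9)  # True
```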
Main features

rbfSolve implements the algorithm RBF in
Mattias Björkman, Kenneth Holmström:
Global Optimization of Costly Nonconvex Functions
Using Radial Basis Functions,
Optimization and Engineering 1,
373-397, 2000.

The original reference on the RBF algorithm is the paper
Hans-Martin Gutmann:
A radial basis function method for global
optimization,
Journal of Global Optimization 19,
201-227, 2001.

The algorithm in rbfSolve also includes several new algorithmic
features, to be published later on, as well as a major speedup
of the algorithm compared to the
Optimization and Engineering 1
paper.

rbfSolve is extended to handle
non-costly linear, integer and nonlinear constraints.

Costly constraints can be treated by adding them as penalty
terms to the objective function. This works well, as shown in the
Optimization and Engineering paper.
Ongoing algorithm development will treat the costly constraints explicitly.
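A minimal sketch of the penalty approach, assuming a simple quadratic penalty with a hypothetical weight mu (rbfSolve's own constraint handling may differ):

```python
def penalized(f, constraints, mu=1e3):
    # Wrap a costly objective so violations of c(x) <= 0 are
    # penalized quadratically; mu is a hypothetical penalty weight.
    def fp(x):
        viol = sum(max(0.0, c(x)) ** 2 for c in constraints)
        return f(x) + mu * viol
    return fp

f = lambda x: (x - 2.0) ** 2
g = lambda x: 1.0 - x          # feasible when x >= 1
fp = penalized(f, [g])

print(fp(2.0))           # feasible point, penalty term is zero: 0.0
print(fp(0.0) > f(0.0))  # infeasible point is penalized: True
```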

rbfSolve saves all information from the run on the file
cgoSave.mat.
This makes warm start possible.
The solver ego
uses the same format and file.
This makes it possible to run any number of iterations and combinations
using both solvers,
rbfSolve and ego.

rbfSolve can be started with:

1. the user's own set of starting points.

2. all corner points, optionally adding the mid point.

3. some corner points (Gutmann), optionally adding the mid point.

4. a Latin hypercube experimental design procedure (default).

5. a new ellipsoid experimental design algorithm (Holmström).

6. a warm start with points from file.
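The default Latin hypercube design (option 4) can be sketched as follows; `latin_hypercube` is an illustrative helper on the unit cube, not rbfSolve's interface:

```python
import numpy as np

def latin_hypercube(n, d, rng=None):
    # n points in [0, 1]^d: each dimension is split into n equal bins,
    # one sample is drawn per bin, and the bins are shuffled per dimension.
    rng = np.random.default_rng(rng)
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        rng.shuffle(u[:, j])
    return u

pts = latin_hypercube(8, 2, rng=0)
# Every one of the 8 bins is hit exactly once in each dimension:
print(sorted(np.floor(pts[:, 0] * 8).astype(int).tolist()))  # [0, 1, 2, 3, 4, 5, 6, 7]
```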

rbfSolve may use either
thin plate splines or
cubic splines
as the radial basis functions.
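The two choices are phi(r) = r^2 log r (thin plate spline, with phi(0) = 0) and phi(r) = r^3 (cubic). A quick numeric sketch:

```python
import numpy as np

def thin_plate(r):
    # phi(r) = r^2 * log(r), extended continuously with phi(0) = 0.
    r = np.asarray(r, dtype=float)
    out = np.zeros_like(r)
    nz = r > 0
    out[nz] = r[nz] ** 2 * np.log(r[nz])
    return out

def cubic(r):
    # phi(r) = r^3.
    return np.asarray(r, dtype=float) ** 3

print(cubic(np.array([2.0]))[0])       # 8.0
print(thin_plate(np.array([1.0]))[0])  # r^2 log r = 0 at r = 1, so 0.0
```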

There are four different strategies to determine target values
and search strategies on the surface.
The first two do a cycle of six points with different target values fnStar.
They both depend on the range of the previously sampled function values.
The third depends on the lowest function value obtained so far and
normally does a short cycle of three points, including an "infstep".
The "infstep" is a target value of infinity, searching the most unexplored
parts of the search space.
The fourth strategy (called idea 2) does a cycle of four points for different
constraint bounds alpha, which implicitly determine the target value.
The infstep is optionally added to strategies 1, 2 and 4.
The cycle lengths can be changed for the first three strategies.
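One cycle of target values can be sketched as follows, assuming the ((N - k)/N)^2 weight schedule used in the RBF literature; the helper name and the schedule choice are illustrative assumptions:

```python
def target_values(s_min, f_range, cycle_len=6):
    # One cycle of target values fnStar placed below the current
    # surface minimum s_min; the weight w = ((N - k)/N)^2 decreases
    # over the cycle, moving from global search (large w) to a
    # purely local step (w = 0, i.e. fnStar = s_min).
    N = cycle_len - 1
    return [s_min - ((N - k) / N) ** 2 * f_range for k in range(cycle_len)]

ts = target_values(s_min=1.0, f_range=10.0)
print(ts[0], ts[-1])  # -9.0 (most global step) and 1.0 (local step)
```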

There is an option to scale the problem to the unit hypercube,
which has proven advantageous for most test cases.
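The scaling is a simple affine map between the box bounds and [0, 1]^d, for example:

```python
import numpy as np

def to_unit(x, lo, hi):
    # Map a point from the box [lo, hi] to the unit hypercube [0, 1]^d.
    return (np.asarray(x) - lo) / (hi - lo)

def from_unit(u, lo, hi):
    # Inverse map back to the original bounds.
    return lo + np.asarray(u) * (hi - lo)

lo, hi = np.array([-5.0, 0.0]), np.array([5.0, 10.0])
print(to_unit([0.0, 5.0], lo, hi).tolist())    # [0.5, 0.5]
print(from_unit([0.5, 0.5], lo, hi).tolist())  # [0.0, 5.0]
```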

One option is to replace all large function values with the median
of all function values. This has been shown to stabilize the
behavior of the
radial basis function interpolation.
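A sketch of this transformation, assuming values above the median are simply capped at it:

```python
import numpy as np

def cap_at_median(f_values):
    # Replace every value above the median with the median, flattening
    # extreme function values before the RBF surface is fitted.
    f = np.asarray(f_values, dtype=float)
    return np.minimum(f, np.median(f))

print(cap_at_median([1.0, 2.0, 3.0, 1e6]).tolist())  # [1.0, 2.0, 2.5, 2.5]
```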

The solvers used for the global and local search on the response surface
can be changed.
The recommended choices are either
the combination of glcFast and
NPSOL, or glcCluster.

1. rbfSolve uses glcFast for a predefined number of iterations,
then tries to improve the best point found using a local search
with NPSOL (default).

2. rbfSolve uses glcCluster.
The default for glcCluster is to
use glcFast for the DIRECT algorithm search,
then cluster the output of the DIRECT algorithm into a set of
initial points, and use
NPSOL for a local search from each of the points in the set.
Thus, in both cases the same two solvers will in some way be used.
glcCluster will probably be more reliable, but it might spend
too much time in the search on the surface at each iteration.
When rbfSolve uses glcCluster, it skips the final local solution phase,
since glcCluster already performs it.

There is a natural way to parallelize the RBF algorithm by
doing all computations for a cycle at the same time.
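A sketch of this idea, evaluating the independent points of one cycle concurrently (Python threads are used here purely for illustration; the costly calls are assumed independent):

```python
from concurrent.futures import ThreadPoolExecutor
import math

def costly_objective(x):
    # Stand-in for an expensive black-box simulation.
    return math.sin(x) + x * x

def evaluate_cycle(points):
    # Evaluate all candidate points of one cycle concurrently; the
    # costly objective calls do not depend on one another, so they
    # can run in parallel.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(costly_objective, points))

# Parallel and sequential evaluation agree on the same cycle points:
print(evaluate_cycle([0.0, 1.0]) == [costly_objective(0.0), costly_objective(1.0)])  # True
```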
