Dakota Reference Manual  Version 6.16

Multi-Start Optimization Method


Alias: none

Argument(s): none

Child Keywords:

Required (Choose One): Sub-method Selection (Group 1)
  method_name              Specify sub-method by name
  method_pointer           Pointer to sub-method to run from each starting point

Optional:
  random_starts            Number of random starting points
  starting_points          List of user-specified starting points
  iterator_servers         Specify the number of iterator servers when Dakota is run in parallel
  iterator_scheduling      Specify the scheduling of concurrent iterators when Dakota is run in parallel
  processors_per_iterator  Specify the number of processors per iterator server when Dakota is run in parallel


In the multi-start iteration method (multi_start), a series of iterator runs is performed for different values of parameters in the model. A common use is multi-start optimization (i.e., different local optimization runs from different starting points for the design variables), but the concept and the code are more general. Multi-start iteration is implemented within the MetaIterator branch of the Iterator hierarchy, in the ConcurrentMetaIterator class. Additional information on the multi-start algorithm is available in the Users Manual [4].

The multi_start meta-iterator must specify a sub-iterator using either a method_pointer or a method_name plus optional model_pointer. This iterator is responsible for completing a series of iterative analyses from a set of different starting points. These starting points can be specified as follows: (1) using random_starts, for which the specified number of starting points are selected randomly within the variable bounds, (2) using starting_points, in which the starting values are provided in a list, or (3) using both random_starts and starting_points, for which the combined set of points will be used. In aggregate, at least one starting point must be specified. The most common example of a multi-start algorithm is multi-start optimization, in which a series of optimizations are performed from different starting values for the design variables. This can be an effective approach for problems with multiple minima.
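The specification pattern above can be sketched as a minimal Dakota input fragment. This is illustrative only: the method id 'local_opt', the choice of optimizer, the variable bounds, and the starting values are assumptions for the sketch, not part of this keyword's specification. It combines random_starts and starting_points, so the sub-iterator would run from the union of the three random points and the two listed points:

```
method
  multi_start
    method_pointer = 'local_opt'
    random_starts = 3 seed = 123    # 3 points drawn randomly within variable bounds
    starting_points = -0.8 -0.8     # first user-specified point (x1, x2)
                       0.8  0.8     # second user-specified point

method
  id_method = 'local_opt'           # sub-iterator run from each starting point
  conmin_frcg                       # any local optimizer could be used here

variables
  continuous_design = 2
    lower_bounds  -1.0  -1.0
    upper_bounds   1.0   1.0
```

With random_starts, the random points are selected within the variable bounds, so bounds must be meaningful for the continuous design variables.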

Expected HDF5 Output

If Dakota was built with HDF5 support and run with the hdf5 keyword (in the environment block's results_output specification), this method writes the starting points for each sub-iterator it runs, as well as the best parameters and responses returned by each sub-iterator. See the Multistart and Pareto Set documentation for details.