Rosenbrock

The Rosenbrock function [32] is a well-known test problem for optimization algorithms. The standard formulation has two design variables and computes a single objective function. Because the objective is a sum of squared terms, this problem can also be posed as a least-squares optimization problem with two residuals to be minimized.

Standard Formulation

The standard two-dimensional formulation can be stated as

\begin{equation} \texttt{minimize } f=100(x_2-x_1^2)^2+(1-x_1)^2 \tag{rosenstd} \end{equation}

Surface and contour plots for this function are shown in the Dakota User's Manual.

The optimal solution is:

\begin{eqnarray*} x_1 &=& 1.0 \\ x_2 &=& 1.0 \end{eqnarray*}

with

\begin{eqnarray*} f^{\ast} &=& 0.0 \end{eqnarray*}
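For reference, the following is a minimal C++ sketch of this formulation (illustrative only, not the Dakota driver itself) that evaluates the objective and its analytic gradient and confirms that both vanish at the optimum:

\begin{verbatim}
#include <cmath>
#include <cstdio>

// Standard two-dimensional Rosenbrock objective (Equation rosenstd).
double rosen(double x1, double x2) {
  return 100.0 * std::pow(x2 - x1 * x1, 2) + std::pow(1.0 - x1, 2);
}

// Analytic gradient of the objective with respect to (x1, x2).
void rosen_grad(double x1, double x2, double g[2]) {
  g[0] = -400.0 * x1 * (x2 - x1 * x1) - 2.0 * (1.0 - x1);
  g[1] =  200.0 * (x2 - x1 * x1);
}

int main() {
  double g[2];
  rosen_grad(1.0, 1.0, g);
  // At the optimum (1, 1) the objective and gradient are both zero.
  std::printf("f(1,1) = %g, grad = (%g, %g)\n",
              rosen(1.0, 1.0), g[0], g[1]);
  return 0;
}
\end{verbatim}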

A Least-Squares Optimization Formulation

This test problem may also be used to exercise least-squares solution methods by recasting the standard problem formulation into:

\begin{equation} \texttt{minimize } f = (f_1)^2+(f_2)^2 \tag{rosenls} \end{equation}

where

\begin{equation} f_1 = 10 (x_2 - x_1^2) \tag{rosenr1} \end{equation}

and

\begin{equation} f_2 = 1 - x_1 \tag{rosenr2} \end{equation}

are residual terms.
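
Since the residuals are scaled so that \( f = (f_1)^2 + (f_2)^2 \), the two formulations can be checked against each other numerically. The sketch below (an assumption-level illustration, not Dakota source; the helper names r1 and r2 are hypothetical) performs that check at the classic starting point \((-1.2, 1.0)\):

\begin{verbatim}
#include <cmath>
#include <cstdio>

// Residual terms from Equations rosenr1 and rosenr2.
double r1(double x1, double x2) { return 10.0 * (x2 - x1 * x1); }
double r2(double x1, double x2) { return 1.0 - x1; }

int main() {
  double x1 = -1.2, x2 = 1.0;  // classic Rosenbrock starting point
  // Sum of squared residuals should reproduce the standard
  // objective (Equation rosenstd), since 10^2 = 100.
  double f_ls  = std::pow(r1(x1, x2), 2) + std::pow(r2(x1, x2), 2);
  double f_std = 100.0 * std::pow(x2 - x1 * x1, 2)
               + std::pow(1.0 - x1, 2);
  std::printf("f_ls = %g, f_std = %g\n", f_ls, f_std);
  return 0;
}
\end{verbatim}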

The included analysis driver can handle both formulations. In the dakota/share/dakota/test directory, the rosenbrock executable (compiled from Dakota_Source/test/rosenbrock.cpp) checks the number of response functions passed in the parameters file. It returns either a single objective function (computed from Equation rosenstd) for use with optimization methods, or two least-squares terms (computed from Equations rosenr1 and rosenr2) for use with least-squares methods. Both cases support analytic gradients of the function set with respect to the design variables. See the User's Manual for examples of both cases (search for Rosenbrock).
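
The dispatch behavior described above can be summarized by the following sketch. It is not the actual rosenbrock.cpp implementation, and the function name evaluate is an assumption; it only illustrates branching on the number of requested response functions:

\begin{verbatim}
#include <vector>

// Illustrative dispatch mirroring the described driver behavior;
// not the actual Dakota_Source/test/rosenbrock.cpp source.
std::vector<double> evaluate(double x1, double x2, int num_fns) {
  double t1 = x2 - x1 * x1;
  double t2 = 1.0 - x1;
  if (num_fns == 2) {
    // Least-squares mode: two residuals (Equations rosenr1, rosenr2).
    return {10.0 * t1, t2};
  }
  // Optimization mode: single objective (Equation rosenstd).
  return {100.0 * t1 * t1 + t2 * t2};
}
\end{verbatim}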