Dakota Reference Manual  Version 6.9
Textbook

The two-variable version of the ``textbook'' test problem provides a nonlinearly constrained optimization test case. It is formulated as:

\begin{align} \texttt{minimize } & f = (x_1-1)^{4}+(x_2-1)^{4} \nonumber \\ \texttt{subject to } & g_1 = x_1^2-\frac{x_2}{2} \le 0 \tag{textbookform} \\ & g_2 = x_2^2-\frac{x_1}{2} \le 0 \nonumber \\ & 0.5 \le x_1 \le 5.8 \nonumber \\ & -2.9 \le x_2 \le 2.9 \nonumber \end{align}
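As a quick illustration (a standalone Python sketch, independent of Dakota's own `text_book` driver), the objective and both constraints can be evaluated directly:

```python
def textbook(x1, x2):
    """Two-variable textbook objective and nonlinear inequality constraints."""
    f = (x1 - 1)**4 + (x2 - 1)**4
    g1 = x1**2 - x2 / 2   # feasible when g1 <= 0
    g2 = x2**2 - x1 / 2   # feasible when g2 <= 0
    return f, g1, g2

# At the constrained optimum (0.5, 0.5) both constraints are active:
print(textbook(0.5, 0.5))  # (0.125, 0.0, 0.0)
```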

Contours of this test problem are illustrated in the next two figures.

[Figure: textbook_contours.png] Contours of the textbook problem on the $[-3,4] \times [-3,4]$ domain. The feasible region lies at the intersection of the two constraints $g_1$ (solid) and $g_2$ (dashed).

[Figure: textbook_closeup.png] Contours of the textbook problem zoomed into an area containing the constrained optimum point $(x_1,x_2) = (0.5,0.5)$. The feasible region lies at the intersection of the two constraints $g_1$ (solid) and $g_2$ (dashed).

For the textbook test problem, the unconstrained minimum occurs at $(x_1,x_2) = (1,1)$. However, the inclusion of the constraints moves the minimum to $(x_1,x_2) = (0.5,0.5)$. Equation textbookform presents the 2-dimensional form of the textbook problem. An extended formulation is stated as

\begin{align} \texttt{minimize } & f = \sum_{i=1}^{n}(x_i-1)^4 \nonumber \\ \texttt{subject to } & g_1 = x_1^2-\frac{x_2}{2} \leq 0 \tag{tbe} \\ & g_2=x_2^2-\frac{x_1}{2} \leq 0 \nonumber \\ & 0.5 \leq x_1 \leq 5.8 \nonumber \\ & -2.9 \leq x_2 \leq 2.9 \nonumber \end{align}

where $n$ is the number of design variables. The objective function accommodates an arbitrary number of design variables, allowing flexible testing across a variety of data sets. Contour plots for the $n=2$ case are shown in the figures above.
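The extended objective can be written as a short Python sketch (again standalone, not Dakota's built-in driver) that accepts any number of design variables:

```python
def textbook_objective(x):
    """n-dimensional textbook objective: sum of (x_i - 1)^4 over all variables."""
    return sum((xi - 1)**4 for xi in x)

# The unconstrained minimum lies at x_i = 1 for every i, where f = 0:
print(textbook_objective([1.0] * 5))   # 0.0
print(textbook_objective([0.5, 0.5]))  # 0.125
```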

For the optimization problem given in Equation tbe, the unconstrained solution (num_nonlinear_inequality_constraints set to zero) for two design variables is:

\begin{eqnarray*} x_1 &=& 1.0 \\ x_2 &=& 1.0 \end{eqnarray*}

with

\begin{eqnarray*} f^{\ast} &=& 0.0 \end{eqnarray*}

The solution for the optimization problem constrained by $g_1$ (num_nonlinear_inequality_constraints set to one) is:

\begin{eqnarray*} x_1 &=& 0.763 \\ x_2 &=& 1.16 \end{eqnarray*}

with

\begin{eqnarray*} f^{\ast} &=& 0.00388 \\ g_1^{\ast} &=& 0.0 ~~\mathrm{(active)} \end{eqnarray*}

The solution for the optimization problem constrained by $g_1$ and $g_2$ (num_nonlinear_inequality_constraints set to two) is:

\begin{eqnarray*} x_1 &=& 0.500 \\ x_2 &=& 0.500 \end{eqnarray*}

with

\begin{eqnarray*} f^{\ast} &=& 0.125 \\ g_1^{\ast} &=& 0.0 ~~\mathrm{(active)} \\ g_2^{\ast} &=& 0.0 ~~\mathrm{(active)} \end{eqnarray*}

Note that as constraints are added, the design freedom is progressively restricted (each additional constraint is active at the solution), and the optimal objective function value increases accordingly.
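The three solutions above can be checked directly with a short standalone Python sketch. Since the $g_1$-constrained solution is quoted to three significant figures, a loose tolerance is used for that case:

```python
def textbook(x1, x2):
    """Two-variable textbook objective and constraints (feasible when g <= 0)."""
    f = (x1 - 1)**4 + (x2 - 1)**4
    g1 = x1**2 - x2 / 2
    g2 = x2**2 - x1 / 2
    return f, g1, g2

# Unconstrained: minimum at (1, 1) with f* = 0.
f, g1, g2 = textbook(1.0, 1.0)
assert f == 0.0

# g1 only: reported solution (0.763, 1.16), f* = 0.00388, g1 active.
# Loose tolerances, since the reported values are rounded.
f, g1, g2 = textbook(0.763, 1.16)
assert abs(f - 0.00388) < 1e-3
assert abs(g1) < 5e-3

# g1 and g2: solution (0.5, 0.5), f* = 0.125, both constraints active.
f, g1, g2 = textbook(0.5, 0.5)
assert f == 0.125 and g1 == 0.0 and g2 == 0.0
print("all three solutions verified")
```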