A simple example

Figure 1: 2D and 1D slices of the function that is minimized throughout this tutorial. Although not obvious at first sight, it has a unique minimum.

We will use a call of the type

[x_best, best_value, niter] = minimize (func, x_init)
to find the minimum of

\begin{displaymath}
f\, : \; \left( x_{1},x_{2},x_{3}\right) \mapsto
\frac{\left( x_{1}-1\right) ^{2}+\left( x_{2}-1\right) ^{2}+\left( x_{3}-1\right) ^{2}}{9}
-\cos \left( x_{1}-1\right) -\cos \left( x_{2}-1\right) -\cos \left( x_{3}-1\right) .
\end{displaymath}
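
Before calling minimize, it can help to look at \( f() \) directly. The sketch below is not part of the original tutorial: it writes \( f() \) as an anonymous function straight from the formula above and plots a 1D slice similar to the right panel of Figure 1, varying \( x_{1} \) while keeping \( x_{2}=x_{3}=1 \) (the sampling range is an arbitrary choice).

f = @(x) sum ((x - 1).^2 / 9 - cos (x - 1));   # f written directly from the formula above
t = linspace (-10, 10, 401);                   # arbitrary sampling range for the slice
s = arrayfun (@(x1) f ([x1, 1, 1]), t);        # vary x1, keep x2 = x3 = 1
plot (t, s);
xlabel ("x_1");
ylabel ("f(x_1, 1, 1)");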

The following commands should find a local minimum of \( f() \), using the Nelder-Mead (aka ``downhill simplex'') algorithm and starting from an arbitrarily chosen point x0:

function cost = foo (xx)
  xx--;                              # equivalent to xx = xx - 1, so the minimum lands at (1,1,1)
  cost = sum (-cos (xx) + xx.^2/9);
endfunction

x0 = [-1, 3, -2];

[x,v,n] = minimize ("foo", x0)

The output should look like:

x =

  1.00000 1.00000 1.00000

v = -3.0000

n = 248

This means that a minimum has been found at \( \left( 1,1,1\right) \) and that the value at that point is \( -3 \). This is correct: the cosine terms alone would be minimized at every point of the form \( x_{1}=1+2i\pi ,\, x_{2}=1+2j\pi ,\, x_{3}=1+2k\pi \), for integers \( i,j,k \), but the quadratic term penalizes all of them except \( \left( 1,1,1\right) \), which is the unique minimum of \( f() \) announced in Figure 1. The number of function evaluations, 248, is also returned. Note that this number depends on the starting point; you will most likely obtain a different count if you change x0.
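
Two quick checks may be reassuring (they are not part of the original example, and the alternative starting point below is an arbitrary choice): evaluating foo at the reported minimizer should reproduce the value \( -3 \), and restarting minimize from another point should reach the same minimizer, generally after a different number of function evaluations:

foo ([1, 1, 1])                      # should return -3, matching v above

x0_alt = [4, -2, 0];                 # arbitrary alternative starting point
[x_alt, v_alt, n_alt] = minimize ("foo", x0_alt)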

The Nelder-Mead algorithm is quite robust, but unfortunately it is not very efficient. For high-dimensional problems, its execution time may become prohibitive.
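
To get a feel for this, one can time minimize on a larger problem. The experiment below is my addition (the dimension and the use of tic/toc are arbitrary choices), but it works as written because foo accepts a vector of any length:

x0_big = zeros (1, 20);              # a 20-dimensional starting point, chosen arbitrarily
tic;
[x_big, v_big, n_big] = minimize ("foo", x0_big);
toc                                  # print the elapsed time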

Søren Hauberg 2008-04-29