*A quick note for those who have been regular readers here. I’ve been out of commission for the last month mostly because of getting sick (twice) and then having to do quadruple time for catching up on work, etc.*

Now on to some writing …

I recently taught a genetic algorithms lesson as part of a larger lecture series I gave on Data Science and Python. This was a broad survey course that touched on statistical analysis, regression, neural nets, and genetic algorithms.

In the genetic algorithms course, I wanted to show how they can be used as generic solvers (with a little bit of work). But I also wanted to show some off-the-shelf solvers that already exist. One example is the `minimize` function in Python's `scipy.optimize` module. If we let \(h(x) = \sin(x) - e^{\cos(x^{2})} - x\) and provide an initial guess of \(x = 0.5\) with the constraint that we wish to minimize \(h(x)\) on \([0,1]\):

>>> scipy.optimize.minimize(h, [0.5,], bounds=((0, 1),))

`minimize` gives \(x = 0\) and \(h(x) = -e\).
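For reference, here is a self-contained version of that call. The NumPy definition of `h` is my own translation of the formula above; it wasn't part of the original lesson code:

```python
import numpy as np
from scipy.optimize import minimize

# h(x) = sin(x) - e^{cos(x^2)} - x, translated to NumPy
def h(x):
    return np.sin(x) - np.exp(np.cos(x**2)) - x

# Minimize h on [0, 1] starting from the initial guess x0 = 0.5
result = minimize(h, [0.5], bounds=((0, 1),))
print(result.x, result.fun)
```

With bounds supplied and no `method` argument, `minimize` defaults to L-BFGS-B, so the exact stopping point can vary slightly across scipy versions.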

Now what’s fun about this is what `minimize` does with different initial guesses. With an initial guess of \(x_{0} = 0.48\), we have \(h(x) = -2.71831419\) at \(x = 0.09182462\):

>>> scipy.optimize.minimize(h, [0.48,], bounds=((0, 1),))

Testing this out on Wolfram Alpha gives \(x \approx 0.0919095\).

And more hack work with the initial guesses: for an initial guess of \(x_{0}=0.01\), `scipy.optimize.minimize` tells me the minimum is at \(x=0.01004454\) with a value of \(-2.71828198\). However, an initial guess of \(x_{0}=0.02\) gives \((0.09188086, -2.71831419)\).
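Putting these experiments together, here is a minimal sketch that sweeps the initial guesses mentioned in the post and records where the solver lands. The NumPy definition of `h` is my own translation of the formula; the results at your machine may differ in the last few digits depending on the scipy version:

```python
import numpy as np
from scipy.optimize import minimize

# h(x) = sin(x) - e^{cos(x^2)} - x, translated to NumPy
def h(x):
    return np.sin(x) - np.exp(np.cos(x**2)) - x

# Sweep the initial guesses discussed above and record where the solver lands.
results = []
for x0 in (0.5, 0.48, 0.02, 0.01):
    res = minimize(h, [x0], bounds=((0, 1),))
    results.append((x0, float(res.x[0]), float(res.fun)))
    print(f"x0 = {x0:<4} -> x = {float(res.x[0]):.8f}, h(x) = {float(res.fun):.8f}")
```

Every run lands somewhere in the flat region near \(-e\), but at noticeably different \(x\) values.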

Interestingly, if we graph \(h(x)\) on \([0,1]\) we see this:

And if we zoom in to \([0,0.1]\) we see this:

So, a simple word to the wise: even when running (effectively) black-box optimization algorithms, it may be helpful to provide multiple initial guesses.

As an aside, all of the results given by `scipy.optimize.minimize` agreed with each other to 4 decimal places when minimizing \(h(x)\). So, depending on the context, the solutions given by `scipy.optimize.minimize` are either all good enough or all not good enough. Recall that \(\sin(x) \approx x\) and \(\cos(x^{2}) \approx 1\) for small \(x\). Thus, \(h(x) \approx x - e - x = -e\) for small \(x\).
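As a quick sanity check on this approximation, here is a short sketch (using my own NumPy translation of \(h\)) showing \(h(x) + e\) shrinking toward zero as \(x \to 0\):

```python
import numpy as np

# h(x) = sin(x) - e^{cos(x^2)} - x, translated to NumPy
def h(x):
    return np.sin(x) - np.exp(np.cos(x**2)) - x

# For small x: sin(x) ~ x and cos(x^2) ~ 1, so h(x) ~ x - e - x = -e.
for x in (0.1, 0.01, 0.001):
    print(f"h({x}) = {h(x):.8f}, h({x}) + e = {h(x) + np.e:.2e}")
```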

New here? Check out the "About The Blog" page and say "Hello" to @shahlock!

Your Thoughts Are Welcome. Leave A Comment!