Tags: python, scipy, mathematical-optimization

Restricting magnitude of change in guess in fmin_bfgs


I'm trying to estimate a statistical model by maximum likelihood (MLE) in Python, using the fmin_bfgs function in scipy.optimize together with a numerically computed Hessian.

It is currently giving me the following warning: Desired error not necessarily achieved due to precision loss.

When I print the results of each evaluation, I see that the starting guess yields a reasonable log-likelihood, but after a few iterations the cost function jumps from ~230,000 to 9.5e+179.

Then it gives a runtime warning: RuntimeWarning: overflow encountered in double_scalars when trying to compute radical = B * B - 3 * A * C in the linesearch part of the routine.

I suspect that the algorithm is trying to evaluate the cost function at a point where it overflows. Is there a way to reduce the rate at which the algorithm changes the parameter values, so that the function stays in a well-behaved region? (I would use the constrained BFGS routine, but I don't have good priors on what the parameter values should be.)
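For concreteness, here is a minimal sketch of that kind of per-evaluation logging; the quadratic neg_loglik is a trivial placeholder for the actual model:

    import numpy as np
    from scipy.optimize import fmin_bfgs

    # Placeholder objective; the real negative log-likelihood goes here.
    def neg_loglik(params):
        return np.sum(params ** 2)

    def logged_objective(params):
        # Print every evaluation so jumps like ~230,000 -> 9.5e+179
        # are visible as they happen.
        value = neg_loglik(params)
        print(params, value)
        return value

    x0 = np.full(2, 5.0)
    fmin_bfgs(logged_objective, x0)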


Solution

  • I recently ran into the same problem with fmin_bfgs.

    As far as I could see, the answer is no: fmin_bfgs does not expose a way to limit the step size.

    My workaround was to first run the Nelder-Mead fmin for a limited number of iterations and then switch to fmin_bfgs from the point it reached (see the sketch after this list). Once I was close enough to the optimum, the curvature of my function was much better behaved and fmin_bfgs had no further problems.

    In my case the problem was that the gradient of my function was very large at points further away from the optimum.

    fmin_l_bfgs_b also works without constraints (just omit the bounds argument), and several users have reported reliable performance with it.

    Aside: if you can reduce your problem to a relatively simple test case, post it to the scipy issue tracker so that a developer or contributor can look into it.
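
To make the two-stage workaround concrete, here is a minimal sketch. The normal-distribution objective, the generated data, and the iteration caps are illustrative stand-ins, not the original poster's model:

    import numpy as np
    from scipy.optimize import fmin, fmin_bfgs

    # Toy stand-in model: MLE for the mean and log-std of normal data
    # (optimizing log_sigma keeps sigma positive without constraints).
    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=3.0, size=500)

    def neg_loglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        # Negative log-likelihood up to an additive constant.
        return data.size * log_sigma + 0.5 * np.sum(((data - mu) / sigma) ** 2)

    x0 = np.array([0.0, 0.0])

    # Stage 1: a capped number of Nelder-Mead steps. The simplex method
    # never looks at gradients, so it tolerates the huge slopes far from
    # the optimum that break the fmin_bfgs line search.
    x_nm = fmin(neg_loglik, x0, maxiter=200, maxfun=400, disp=False)

    # Stage 2: polish with BFGS, starting from the Nelder-Mead result.
    x_hat = fmin_bfgs(neg_loglik, x_nm)
    print("estimate:", x_hat)

    # Alternative mentioned above: fmin_l_bfgs_b runs without bounds, and
    # with approx_grad=True it finite-differences the gradient itself.
    # from scipy.optimize import fmin_l_bfgs_b
    # x_hat, f_hat, info = fmin_l_bfgs_b(neg_loglik, x_nm, approx_grad=True)

The key design choice is capping the Nelder-Mead stage (maxiter / maxfun) so it only does the rough positioning and leaves the precise convergence to fmin_bfgs.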