Tags: python, matlab, scipy, minimization

Why am I not getting the same results from BFGS minimize in Python as in MATLAB?


I'm trying to find the minimum of a function with the BFGS method (Minimum of a function with BFGS method, page 29 of the PDF document).

I'm not getting the same results as the ones reported in the link. I have already tried with and without the Jacobian, with no luck. Any help will be appreciated.

The code so far:

import numpy as np
from scipy.optimize import minimize

def objective(x):
    x1 = x[0]
    x2 = x[1]
    print("x1: ", x1, "    ", "x2: ", x2)
    return pow(x1, 4.0) - 2*x2*pow(x1, 2.0) + pow(x2, 2.0) + pow(x1, 2.0) - 2.0*x1 + 5.0

def jacobiano(x):
    x1 = x[0]
    x2 = x[1]
    jaco = np.zeros(2)
    jaco[0] = 4.0*x1 - 4.0*x2*x1 + 2.0*x1 - 2.0
    jaco[1] = -2.0*pow(x1, 2.0) + 2.0*x2
    print("dx1: ", jaco[0], "    ", "dx2: ", jaco[1])
    return jaco

x0 = np.array([1.0, 2.0], dtype=np.double)
print(objective(x0))
sol = minimize(objective, x0, method='BFGS', jac=jacobiano, options={'disp': True})
print(sol)

Solution

  • The problem arises because you calculated the Jacobian incorrectly: in your code df/dx1 is wrong (the cubic term 4.0*x1**3 was written as 4.0*x1).

    if f = x1**4 - 2*x2*x1**2 + x2**2 + x1**2 - 2.0*x1 + 5.0

    then df/dx1 = 4.0*x1**3 - 4.0*x2*x1 + 2.0*x1 - 2.0
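
    A quick way to double-check a hand-derived gradient is to differentiate symbolically. A minimal sketch, assuming sympy is installed (it is not needed for the fix itself):

    import sympy as sp

    x1, x2 = sp.symbols('x1 x2')
    f = x1**4 - 2*x2*x1**2 + x2**2 + x1**2 - 2*x1 + 5

    # Expected: [4*x1**3 - 4*x1*x2 + 2*x1 - 2, -2*x1**2 + 2*x2]
    print([sp.diff(f, v) for v in (x1, x2)])

    The corrected script then becomes: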


    import numpy as np
    from scipy.optimize import minimize

    def objective(x):
        x1, x2 = x
        print("x1: ", x1, "    ", "x2: ", x2)
        return x1**4 - 2*x2*x1**2 + x2**2 + x1**2 - 2.0*x1 + 5.0

    def jacobiano(x):
        x1, x2 = x
        jaco = np.zeros(2)
        jaco[0] = 4.0*x1**3 - 4.0*x2*x1 + 2.0*x1 - 2.0   # corrected: cubic term, not 4.0*x1
        jaco[1] = -2.0*x1**2 + 2.0*x2
        print("dx1: ", jaco[0], "    ", "dx2: ", jaco[1])
        return jaco

    x0 = np.array([1.0, 2.0], dtype=np.double)

    sol = minimize(objective, x0, method='BFGS', jac=jacobiano, options={'disp': True})
    print(sol)
    

    Output:

    Optimization terminated successfully.
             Current function value: 4.000000
             Iterations: 7
             Function evaluations: 9
             Gradient evaluations: 9
          fun: 4.000000000002963
     hess_inv: array([[ 0.50324351,  1.0154575 ],
           [ 1.0154575 ,  2.55695728]])
          jac: array([  7.65547714e-06,  -2.90129716e-06])
      message: 'Optimization terminated successfully.'
         nfev: 9
          nit: 7
         njev: 9
       status: 0
      success: True
            x: array([ 1.00000093,  1.0000004 ])
    

    MATLAB:

    x1=1.00863, x2=1.01932, f=4.00008
    

    Python:

    x1=1.00000093, x2=1.0000004, f=4.000000000002963
    

    Optimal solution:

    x1=1.0, x2=1.0, f=4.0
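
    A cheap way to catch this kind of mistake before calling minimize is to compare the analytic Jacobian against SciPy's finite-difference approximation with scipy.optimize.check_grad. A minimal sketch (the test point [2.0, 1.0] is an arbitrary choice; at the starting point [1.0, 2.0] the bug happens to be invisible because x1**3 == x1 when x1 = 1):

    import numpy as np
    from scipy.optimize import check_grad

    def f(x):
        x1, x2 = x
        return x1**4 - 2*x2*x1**2 + x2**2 + x1**2 - 2.0*x1 + 5.0

    def grad(x):
        x1, x2 = x
        return np.array([4.0*x1**3 - 4.0*x2*x1 + 2.0*x1 - 2.0,
                         -2.0*x1**2 + 2.0*x2])

    # Close to zero for a correct gradient; plugging in the original
    # (wrong) jacobiano instead would give an error of roughly 24 here.
    print(check_grad(f, grad, np.array([2.0, 1.0])))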