Tags: python-3.x, ode, numerical-integration, scipy-optimize, method-signature

How can I fix this TypeError when fitting ODE with scipy?


I would like to adjust the parameters of a simple ODE using the scipy package. I have the feeling it should be feasible. I am aware of this post, but I think my question is different.

First we import required packages:

import numpy as np
from scipy import integrate, optimize

We define the ODE with a signature compliant with the newer scipy.integrate.solve_ivp method:

def GGM_ODE(t, C, r, p):
    # Right-hand side of the ODE: dC/dt = r * C**p
    return r*np.power(C, p)

We define the integrated ODE solution with a signature compliant with the classic scipy.optimize.curve_fit:

def GGM_sol(t, C, r, p):
    return integrate.solve_ivp(GGM_ODE, (t[0], t[-1]), [C], t_eval=t, args=(r, p))

We create a synthetic dataset by solving the initial value problem for a given set of parameters:

t = np.arange(0, 21)
sol = GGM_sol(t, 1, 0.5, 0.7)

This works perfectly.
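
A quick inspection of the result shows that solve_ivp does not return a plain array but an OdeResult object whose data live in the .t and .y attributes (the shapes below assume the 21-point t grid defined above):

print(type(sol).__name__)  # OdeResult
print(sol.t.shape)         # (21,)
print(sol.y.shape)         # (1, 21) -- one state variable evaluated at 21 times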

Finally, we try to adjust parameters by fitting the integrated solution:

popt, pcov = optimize.curve_fit(GGM_sol, t, sol.y)

Unfortunately, this last step fails with a cryptic error (at least cryptic to me, because I do not have enough insight into how scipy is built):

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-45-22b0c3097986> in <module>
----> 1 popt, pcov = optimize.curve_fit(GGM_sol, t, sol.y)

~\AppData\Local\Continuum\anaconda3\lib\site-packages\scipy\optimize\minpack.py in curve_fit(f, xdata, ydata, p0, sigma, absolute_sigma, check_finite, bounds, method, jac, **kwargs)
    761         # Remove full_output from kwargs, otherwise we're passing it in twice.
    762         return_full = kwargs.pop('full_output', False)
--> 763         res = leastsq(func, p0, Dfun=jac, full_output=1, **kwargs)
    764         popt, pcov, infodict, errmsg, ier = res
    765         ysize = len(infodict['fvec'])

~\AppData\Local\Continuum\anaconda3\lib\site-packages\scipy\optimize\minpack.py in leastsq(func, x0, args, Dfun, full_output, col_deriv, ftol, xtol, gtol, maxfev, epsfcn, factor, diag)
    386     if not isinstance(args, tuple):
    387         args = (args,)
--> 388     shape, dtype = _check_func('leastsq', 'func', func, x0, args, n)
    389     m = shape[0]
    390 

~\AppData\Local\Continuum\anaconda3\lib\site-packages\scipy\optimize\minpack.py in _check_func(checker, argname, thefunc, x0, args, numinputs, output_shape)
     24 def _check_func(checker, argname, thefunc, x0, args, numinputs,
     25                 output_shape=None):
---> 26     res = atleast_1d(thefunc(*((x0[:numinputs],) + args)))
     27     if (output_shape is not None) and (shape(res) != output_shape):
     28         if (output_shape[0] != 1):

~\AppData\Local\Continuum\anaconda3\lib\site-packages\scipy\optimize\minpack.py in func_wrapped(params)
    461     if transform is None:
    462         def func_wrapped(params):
--> 463             return func(xdata, *params) - ydata
    464     elif transform.ndim == 1:
    465         def func_wrapped(params):

TypeError: unsupported operand type(s) for -: 'OdeResult' and 'float'

I can see this error is a classic TypeError about incompatible operands for the subtraction operator. It claims it cannot subtract a float from an OdeResult object. It also only concerns the optimize package, not integrate.
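
Indeed, the same error can be reproduced outside curve_fit by subtracting a plain number from whatever GGM_sol returns (a minimal check using the objects defined above):

GGM_sol(t, 1, 0.5, 0.7) - 1.0
# TypeError: unsupported operand type(s) for -: 'OdeResult' and 'float'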

What I do not understand is why I am getting this error.

What must I change in my function signature or function call to make curve_fit work? Or is there something else I have missed?


Solution

  • It is exactly as the error message says: solve_ivp returns a solution object (an OdeResult) which contains the solution data. Try

    def GGM_sol(t, C, r, p):
        res = integrate.solve_ivp(GGM_ODE, (t[0], t[-1]), [C], t_eval=t, args=(r, p))
        return res.y[0]
    

    to get only the solution values.
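
    For completeness, here is a sketch of the full fit using this corrected wrapper (the starting guess p0=[1.0, 0.5, 0.5] is an arbitrary choice, and the data is the noise-free synthetic curve from the question):

    ydata = GGM_sol(t, 1, 0.5, 0.7)   # now a plain 1-D array of C(t) values
    popt, pcov = optimize.curve_fit(GGM_sol, t, ydata, p0=[1.0, 0.5, 0.5])
    print(popt)   # expected to land near the true parameters [1, 0.5, 0.7]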