I was trying to integrate a square wave using Python 3.5 and the scipy.integrate.odeint
function, but the results don't make any sense and vary wildly with the array of time points selected.
The square wave has a period of 10 s and the simulation runs for 100 s. Since the array of time points has size 500, there should be about 50 time points in each period of the square wave, but that doesn't seem to be happening.
Using the optional parameter hmax=0.02 fixes it, but shouldn't it be inferred automatically?
Here's the code:
import numpy as np
import matplotlib.pyplot as plt
import scipy.integrate as integrate
# dx/dt = f(t), where f(t) is a square wave
def f(x, t):
    return float(t % 10.0 < 5.0) * 0.3
T = 100
tt = np.linspace(0, T, 500)
xx = integrate.odeint(f, 0, tt, hmax=0.2)
plt.figure()
plt.subplot(2,1,1)
plt.plot(tt, xx)
plt.axis([0,T,0,16])
plt.subplot(2,1,2)
plt.plot(tt, [f(None,t) for t in tt])
plt.axis([0, T, 0, 1])
plt.show()
I'm hoping someone can shed some light on what is happening here. Try changing T (the simulation time) between 80 and 100.
I think your problem is that odeint expects a continuous (ideally smooth) right-hand side: the underlying adaptive solver picks its internal step sizes assuming the ODE is well behaved, and a square wave is discontinuous. With nothing to cap the step size, the solver can step right over entire half-periods of the wave, which is why setting hmax fixes it.
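To see why the step size matters, here's a minimal sketch using a fixed-step forward Euler integrator (an illustration only, not scipy's actual adaptive solver): with steps wider than the 5 s pulses, the integrator lands on an essentially arbitrary subset of the "on" samples, while a small step recovers something close to the exact integral, 0.3 * 50 s = 15.

```python
def g(t):
    # square wave: 0.3 for the first half of each 10 s period, else 0.0
    return 0.3 if t % 10.0 < 5.0 else 0.0

def euler_integral(step, t_end=100.0):
    # forward Euler with a fixed step (left Riemann sum)
    x, t = 0.0, 0.0
    while t < t_end:
        x += g(t) * step
        t += step
    return x

coarse = euler_integral(8.0)   # steps wider than the 5 s pulses
fine = euler_integral(0.02)    # steps much smaller than the pulses
# exact integral is 0.3 * 50 = 15; the coarse result lands far from it
```

The same effect hits any solver whose step size isn't forced below the pulse width, which is exactly what hmax does for odeint.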
I'd start by redefining your square-wave function to take only t:
def g(t):
    return float(t % 10.0 < 5.0) * 0.3
then define a function to calculate the integral step by step:
def get_integral(tt):
    intarray = np.zeros_like(tt)
    step_size = tt[1] - tt[0]
    intarray[0] = g(tt[0]) * step_size
    for i in range(1, len(tt)):
        intarray[i] = intarray[i - 1] + g(tt[i]) * step_size
    return intarray
(Note the i = 0 case is handled explicitly; starting the loop at i = 0 and indexing intarray[i - 1] would silently read the last element of the array.)
Then:
xx = get_integral(tt)
should give you the result you're looking for.
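As a quick sanity check (restating the pieces above so the snippet is self-contained): the exact integral over 100 s is 0.3 * 50 = 15, so the last element of the returned array should land close to 15.

```python
import numpy as np

def g(t):
    return float(t % 10.0 < 5.0) * 0.3

def get_integral(tt):
    # cumulative left-style Riemann sum of g over the points in tt
    intarray = np.zeros_like(tt)
    step_size = tt[1] - tt[0]
    intarray[0] = g(tt[0]) * step_size
    for i in range(1, len(tt)):
        intarray[i] = intarray[i - 1] + g(tt[i]) * step_size
    return intarray

tt = np.linspace(0, 100, 500)
xx = get_integral(tt)
# xx[-1] should be close to the exact value 15
```

The small remaining error comes from the Riemann-sum approximation at the wave's edges; it shrinks as you add more time points.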