python, linear-algebra, calculus, non-linear-regression

Evolve a dynamic system without an analytic expression for the time derivative by solving for all time-changing values at once


I have a dynamic system where I have an expression for the residual of the time derivative, but not an analytic expression for the time derivative itself: sum(da_i/dt) ** 2 = f(a). I can solve this by performing an optimization at every time step, but I think I can recast it as a least-squares problem and solve for every value at every time step in a single solve. I think this casting is linear, but I'm not 100% sure. How can I form this system of equations to efficiently solve for the values across all times in one solve?

import numpy as np

'''
My situation is that I know this relationship is true:
(sum_{i=1}^n da_i/dt)^2 = f(a).
In this example I assume a is 2-dimensional, f(a) = a1**2 - a1 * a2
(with a1 = a[0], a2 = a[1]), and I want to evolve the equation over
300 time steps.
'''
def residual(dadt, current_a):
    # squared residual of (sum_i da_i/dt)^2 - f(a)
    return (np.sum(dadt) ** 2 - (current_a[0] ** 2 - current_a[0] * current_a[1])) ** 2

'''
My real problem is a little more complicated than this, but the
basic elements are here. My strategy is to form something like
an Ax = b formulation.
'''

'''
Here is an approach that works: I find a values that drive the residual to
zero. It is time-consuming because I have to solve an optimization problem
at every time step, and a little goofy because I use a general-purpose
optimization engine. I think it would be better to cast this as a
least-squares regression problem.
'''
from scipy.optimize import minimize as mini

n_time_steps = 300
a0 = np.ones(2) * .5
a = a0.copy()  # copy so the in-place update below doesn't modify a0
dt = 1e-3
a0_record = []
for _ in range(n_time_steps):
    a0_record.append(a[0])
    # find the dadt that zeroes the residual at the current state
    dadt = mini(residual, np.ones(a.size), args=(a,)).x
    a += dadt * dt

# Plot the time history as a baseline to verify the new approach against.
import matplotlib.pyplot as plt
plt.plot(a0_record)
plt.savefig('a0baseline.pdf')
plt.clf()


'''
I want to cast this into a single least-squares problem. In this case I would
have 2 * 300 = 600 free variables, but only 299 step-to-step residuals to
drive to zero, so I think my system is underdetermined. I don't quite know
how to do this. I sketched my best shot so far below, but I am sure there is
much room for improvement.
'''
a = np.ones((n_time_steps, 2))  # one row per time step
A = np.stack([residual((a[i] - a[i - 1]) / dt, a[i - 1]) for i in range(1, n_time_steps)])
xx = np.linalg.lstsq(A, np.zeros(n_time_steps - 1))  # fails: A is 1-D and the residual is nonlinear in a
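
'''
Quick sanity check (illustrative, not part of my solver): the residual does
not scale linearly with its inputs, so an exact Ax = b cast cannot exist.
'''
x = np.ones(2) * .5
# a linear map would double when its inputs double; this residual grows 16x
print(residual(x, x), residual(2 * x, 2 * x))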

Solution

  • It's not a linear problem. All you have to do is solve it with least_squares or some other nonlinear solver, or use an implicit method. Beware: this explicit method takes a very long time to run.

    from scipy.optimize import least_squares

    def fitness(A):
        # total squared residual over all step transitions, as a function of
        # all 600 unknowns at once
        mysum = 0.
        A = A.reshape(n_time_steps, 2)
        for ii in range(n_time_steps - 1):
            mysum += residual((A[ii + 1] - A[ii]) / dt, A[ii]) ** 2
        return np.array(mysum)

    least_squares(fitness, np.random.random(600))  # beware of the trivial solution a=0
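
  • One way around the trivial solution (a sketch, not part of the original answer): hold the known initial condition fixed and return one residual per step transition instead of a single summed scalar. The names fitness_pinned and A_opt below are illustrative.

    def fitness_pinned(free_a, a_init):
        # the free variables are the 299 states after the first step; the
        # initial condition a_init is held fixed rather than optimized
        A = np.vstack([a_init, free_a.reshape(n_time_steps - 1, 2)])
        # one residual per step transition, so the solver sees 299 equations
        return np.array([residual((A[ii + 1] - A[ii]) / dt, A[ii])
                         for ii in range(n_time_steps - 1)])

    sol = least_squares(fitness_pinned, np.random.random((n_time_steps - 1) * 2),
                        args=(a0,))
    A_opt = np.vstack([a0, sol.x.reshape(n_time_steps - 1, 2)])
    # A_opt[:, 0] can be compared against a0_record from the step-by-step baseline

    Pinning a0 removes the degenerate a = 0 solution whenever a0 is nonzero, and the per-transition residual vector gives least_squares structure to exploit rather than one aggregated number.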