I'm trying to solve a delayed differential equation in Matlab:
mRNA'(t) = k0 + k*Activator(t - delta_t) - gamma*mRNA(t)
In this equation:
- k0 is a constant representing basal transcription (production) of mRNA;
- k is a constant rate of Activator-stimulated mRNA production, which depends on the amount of Activator at time t - delta_t;
- gamma is a constant rate of mRNA degradation;
- mRNA(t) is the amount of mRNA at time t.
I'm trying to simulate this equation so that I can mess around and see how it behaves with different parameters (i.e. different time delays, comparisons with ODEs, etc.). I'm following the code example here with limited success.
My code so far is:
function General_mRNA_DDE
delta_t = 2;                                      % time delay
sol = dde23(@General_mRNA_DDE2, delta_t, @input_function, [0, 5]);
figure;
plot(sol.x, sol.y)
xlabel('t'), ylabel('mRNA')

    function dydt = General_mRNA_DDE2(t, y, z)
        k0 = 1;       % basal transcription rate
        k = 10;       % Activator-stimulated production rate
        gamma = 0.1;  % degradation rate
        dydt = k0 + k*z(1) - gamma*y(1);
    end

    function hist = input_function(t)
        hist = 1;     % history: mRNA level for t <= 0
    end
end
But what I have essentially looks like a very steep exponential curve. Here is what I'm trying to reproduce:
From this paper, doi: 10.15252/msb.20177554 (http://msb.embopress.org/content/msb/13/5/928.full.pdf)
Does anybody have any advice for me to accurately reproduce the figure?
Thanks in advance
This is not really a delay differential equation: the derivative and the value of the unknown mRNA are taken at the same time t. That the forcing term is evaluated at a delayed time does not matter, as long as the Activator does not depend on the mRNA value at that earlier time. Note also that in your dde23 code z is the delayed mRNA value, not an Activator, so you are actually solving mRNA'(t) = k0 + k*mRNA(t - delta_t) - gamma*mRNA(t); with k = 10 much larger than gamma = 0.1, that solution grows roughly exponentially, which explains the steep curve you see.
You can apply an integrating factor exp(gamma*t), so that the differential equation becomes

( exp(gamma*t) * mRNA(t) )' = exp(gamma*t) * ( k0 + k*Activator(t - delta_t) )

which can be solved by simple integration, especially if the Activator function is piecewise constant.
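Because the Activator is an external input, you can also just integrate the equation numerically as a plain ODE with a time-shifted forcing term, no dde23 needed. A minimal sketch (the step-pulse Activator below is an assumed placeholder, not the profile from the paper):

```matlab
% Parameters as in the question
k0 = 1;        % basal transcription rate
k = 10;        % Activator-stimulated production rate
gamma = 0.1;   % degradation rate
delta_t = 2;   % transcriptional time delay
mRNA0 = 1;     % initial mRNA level

% Assumed Activator profile: a unit pulse between t = 1 and t = 3
A = @(t) double(t >= 1 & t <= 3);

% mRNA'(t) = k0 + k*A(t - delta_t) - gamma*mRNA(t)
rhs = @(t, m) k0 + k*A(t - delta_t) - gamma*m;

[t, m] = ode45(rhs, [0, 10], mRNA0);
plot(t, m), xlabel('t'), ylabel('mRNA')
```

Because A is piecewise constant, you could equally well write the exact solution on each interval using the integrating factor above; ode45 just saves the bookkeeping.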