I have a strange problem. Suppose there is an illness that lasts exactly 3 days, and at the beginning there are m infectees. On the 1st day of infection an infectee infects d1 others, on the 2nd day d2 others, on the 3rd day d3 others, and on the 4th day they recover (they are no longer considered ill). The task is to compute how many people get infected during the n-th day.
Example 1:
n = 4, m = 2
d1 = 1, d2 = 2, d3 = 3
result = 34

Example 2:
n = 1, m = 5
d1 = 100, d2 = 200, d3 = 5
result = 500

Example 3:
n = 3, m = 5
d1 = 5, d2 = 5, d3 = 5
result = 900
I believe it is similar to the Fibonacci sequence, which can be calculated in O(log n) time using matrix exponentiation...
Let f(n) denote the number of people infected during day n. To calculate f(n+1), we only need to know f(n-2), f(n-1), and f(n), using the formula from Tomer Shetah's answer:

f(n+1) = d1 * f(n) + d2 * f(n-1) + d3 * f(n-2)

Since this equation is a linear recurrence, we can calculate f(n) using exponentiation by squaring on matrices. Here's an implementation in Python using NumPy:
import numpy as np

def solve(n, m, d1, d2, d3):
    # Companion matrix of the recurrence; dtype='object' makes NumPy use
    # exact Python integers, so large results don't overflow.
    transform = np.array([[d1, 1, 0], [d2, 0, 1], [d3, 0, 0]], dtype='object')
    # State row vector [f(0), f(-1), f(-2)] = [m, 0, 0].
    initial_state = np.array([[m, 0, 0]], dtype='object')
    # Use @ (matrix product), not elementwise *.
    final = initial_state @ np.linalg.matrix_power(transform, n)
    return final[0][0]

assert solve(4, 2, 1, 2, 3) == 34
assert solve(1, 5, 100, 200, 5) == 500
assert solve(3, 5, 5, 5, 5) == 900
This solution is faster than the naïve approach: its time complexity is O(log n), provided we assume the cost of each multiplication and addition to be constant.
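If you'd rather avoid the NumPy dependency, the same O(log n) idea can be written with plain Python integers, hand-rolling the 3x3 matrix product and exponentiation by squaring (the function names below are my own):

```python
def mat_mul(A, B):
    # Product of two 3x3 matrices.
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_pow(A, e):
    # Exponentiation by squaring, starting from the identity matrix.
    result = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    while e:
        if e & 1:
            result = mat_mul(result, A)
        A = mat_mul(A, A)
        e >>= 1
    return result

def solve_pure(n, m, d1, d2, d3):
    T = [[d1, 1, 0], [d2, 0, 1], [d3, 0, 0]]
    Tn = mat_pow(T, n)
    # With the initial state [f(0), f(-1), f(-2)] = [m, 0, 0],
    # f(n) is the first component of [m, 0, 0] times T^n, i.e. m * Tn[0][0].
    return m * Tn[0][0]

assert solve_pure(4, 2, 1, 2, 3) == 34
assert solve_pure(1, 5, 100, 200, 5) == 500
assert solve_pure(3, 5, 5, 5, 5) == 900
```

Python's arbitrary-precision integers keep the results exact for large n, at the cost of multiplications that are no longer constant-time once the numbers grow.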