I'm trying to implement a difference equation in Python. These take the form y_(n+1) = a * y_n + b, where y_0 is the initial value, and the relation is applied over and over, meaning
y_1 = a * y_0 + b,
y_2 = a * y_1 + b,
...
An example problem (from my calculus class) would be this: say you take out a loan of $60,000 and plan on paying back $700 a month at an interest rate of 1.2%. How much would be left after 5 years? This would be set up as y_(n+1) = 1.1 * y_n - 700, y_0 = 60,000.
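For instance, using the recurrence as written, the first couple of steps work out to y_1 = 1.1 * 60,000 - 700 = 65,300 and y_2 = 1.1 * 65,300 - 700 = 71,130, and so on for 60 months.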
I understand looping in Python in the sense that you can write, for example,
i = 0
while i < 20:
    i = i + 1
but I'm unsure how to approach it when each iteration needs the value produced by the previous one.
Typically you'd do this by keeping a single variable that holds the most recent value, seeded with your starting value, e.g.:
y = 60000
while True:
    y = 1.1 * y - 700
Of course, you have to decide when to stop and what to do with the values. At the very least you'll want to print them:
y = 60000
while True:
    y = 1.1 * y - 700
    print(y)
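If the natural stopping point is a condition on the value itself rather than a fixed count, you can test for it in the loop. This is only a sketch, and the 1000-month cap is an assumption added so the loop still terminates even if the balance never reaches zero:
y = 60000
month = 0
while y > 0 and month < 1000:  # stop when paid off, or at the (assumed) safety cap
    y = 1.1 * y - 700
    month = month + 1
    print(month, y)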
But you probably only want to run it a fixed number of times (say, 60 iterations for 5 years of monthly payments) instead of forever:
y = 60000
for i in range(12 * 5):
    y = 1.1 * y - 700
    print("%d: %f" % (i, y))
And you may want to keep the values for later use as well, so append them to a list:
y = 60000
results = []
for i in range(12 * 5):
    y = 1.1 * y - 700
    results.append(y)
print(results)
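If you want to reuse this for other values of a, b, and y_0, one way to generalize it is to wrap the loop in a function. This is just a sketch; the function name and parameter names are made up for illustration:
def iterate_difference_equation(a, b, y0, steps):
    # Sketch of a general helper: returns [y_1, ..., y_steps] for y_(n+1) = a * y_n + b.
    y = y0
    results = []
    for _ in range(steps):
        y = a * y + b        # apply the recurrence once
        results.append(y)    # keep every value for later use
    return results

# The loan example from the question: a = 1.1, b = -700, y0 = 60000, 60 months
balances = iterate_difference_equation(1.1, -700, 60000, 12 * 5)
print(balances[-1])  # value after 5 years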