I want to create a column whose value comes from a linear interpolation/extrapolation of other columns in my df.
For example, I have a row where column a is 500, column b is 0, column c is 400, and column d is -100. I want to find the value extrapolated from columns b and d at the point where the a/c value equals 300. In other words, treating columns a and c as y-values and columns b and d as the corresponding x-values, what would x have to be for y = 300 given the linear relationship between these columns?
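For this row I'd expect the result to be -200: the line through (x, y) = (0, 500) and (-100, 400) has a slope of 1, so getting from y = 500 down to y = 300 means moving 200 to the left of x = 0 (assuming I've paired the columns correctly, b with a and d with c).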
I have tried and looked at other snippets of code but cannot find anything that does this.
Thanks!
You can treat (b, a) and (d, c) as two points on a straight line and solve that line for x at y = 300. Nothing is actually missing from the frame, so DataFrame.interpolate isn't needed here; it's a direct column-wise formula:
import pandas as pd

data = {'a': [500], 'b': [0], 'c': [400], 'd': [-100]}
df = pd.DataFrame(data)

# (b, a) and (d, c) are two points on a line: x-values in b/d, y-values in a/c.
# Solve that line for x at y = 300; the arithmetic is vectorised, so it works
# row-wise for any number of rows.
y_target = 300
df['x_at_300'] = df['b'] + (y_target - df['a']) * (df['d'] - df['b']) / (df['c'] - df['a'])

print(df)
# For this row: 0 + (300 - 500) * (-100 - 0) / (400 - 500) = -200
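If you want the same calculation for other target values, you could wrap the formula in a small helper along these lines. This is just a sketch: the function name extrapolate_x, its y_target argument, and the x_at_300 column are my own naming, not anything built into pandas.

import pandas as pd

def extrapolate_x(df, y_target):
    # Line through (b, a) and (d, c); return the x value at y == y_target for each row.
    return df['b'] + (y_target - df['a']) * (df['d'] - df['b']) / (df['c'] - df['a'])

df = pd.DataFrame({'a': [500], 'b': [0], 'c': [400], 'd': [-100]})
df['x_at_300'] = extrapolate_x(df, 300)
print(df)

Because the arithmetic is done column-wise on pandas Series, the same call works unchanged on a frame with many rows.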