I have a program where I want to transform one variable into another in an equation. The idea is to get an equation as input from the user and transform its symbolic variables into other ones. Here is an example:
from sympy import symbols, limit, diff
x_1, x_2, X_1, X_2 = symbols('x_1 x_2 X_1 X_2')
def afosm_nl_ls(x_1_mean, x_2_mean, x_1_std, x_2_std, g):
    x_1 = X_1*x_1_std + x_1_mean
    x_2 = X_2*x_2_std + x_2_mean
    g = g.subs((x_1, X_1), (x_2, X_2))
    print(g)
afosm_nl_ls(50, 30, 3, 2, x_1**2 + 1 - x_2)
This gives me the following output:
x_1**2 - x_2 + 1
Then, I want each x_i to turn into X_i*x_i_std + x_i_mean. I managed to do this easily without the function, just using this:
from sympy import limit, diff, symbols
x_1, x_2, X_1, X_2 = symbols('x_1 x_2 X_1 X_2')
x_1 = X_1*3 + 50
x_2 = X_2*2 + 30
g = x_1**2 + 1 - x_2
print(g)
Where the result is:
-2*X_2 + (3*X_1 + 50)**2 - 29
So, how should I write the code in the function to still get the equation from the user and get the right transformation?
You redefined x_1 and x_2 as expressions inside the function, so subs could not find the original symbols. You also passed the wrong arguments to subs: the call should be subs(old, new) or subs(list of (old, new) pairs) -- compare what you have to this:
from sympy import symbols, limit, diff
x_1, x_2 = symbols('x_1 x_2')
def afosm_nl_ls(x_1_mean, x_2_mean, x_1_std, x_2_std, g):
    v1 = x_1*x_1_std + x_1_mean
    v2 = x_2*x_2_std + x_2_mean
    return g, g.subs([(x_1, v1), (x_2, v2)])
afosm_nl_ls(50, 30, 3, 2, x_1**2 + 1 - x_2)
which gives
(x_1**2 - x_2 + 1, -2*x_2 + (3*x_1 + 50)**2 - 29)
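For reference, subs accepts the replacement pairs in a few equivalent forms. Here is a minimal sketch using the numbers from your example (the replacement expressions are just illustrations, not part of your program):

from sympy import symbols

x_1, x_2 = symbols('x_1 x_2')
g = x_1**2 + 1 - x_2

# single pair: subs(old, new) -- replaces x_1 only, x_2 is untouched
g.subs(x_1, 3*x_1 + 50)

# list of pairs: subs([(old1, new1), (old2, new2)])
g.subs([(x_1, 3*x_1 + 50), (x_2, 2*x_2 + 30)])
# -> -2*x_2 + (3*x_1 + 50)**2 - 29

# a dict of {old: new} works as well and gives the same result
g.subs({x_1: 3*x_1 + 50, x_2: 2*x_2 + 30})

Any of these forms avoids the original problem, because the symbols x_1 and x_2 are never rebound to expressions.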