Tags: python, python-3.x, numpy, evaluation, built-in

Python eval() fails to recognize numpy and math symbols if used together with a dictionary


I have the following formula that I would like to evaluate:

import math
import numpy as np
formula  = 'np.e**x + math.erf(x) + np.pi  +  math.erf(u)'

I can then easily evaluate the formula for given float values of x and u, and eval() recognizes math.erf, np.pi, and np.e. For example:

x=1.0; u=0.3; eval(formula)

yields 7.03. But I want x and u to be arrays. I tried to use eval() with a dictionary, following this post:

var = {'x':np.array([1,1]),'u':np.array([0.1,0.2])}
eval(formula, var)

which yields error messages saying that 'np' and 'math' are not defined, which was not the case above when eval() was used without a dict. The same errors appear when 'x' and 'u' in the var dictionary are floats instead of arrays. There is also no problem if the dictionary is used with a formula that contains no np. or math. references, e.g.

formula = 'x + u'

Does anybody have an idea how I can evaluate a formula containing np.e, math.erf, etc., when x and u are arrays?

P.S. I am using Python 3.8.


Solution

  • Passing var as the globals argument of eval() means the global names np and math are no longer visible to the expression (formula). Simply pass var as locals instead:

    var = {'x': 1.0, 'u': 0.3}
    eval(formula, None, var)  # -> 7.031202034457681
    

    Note that this still doesn't work when x and u are arrays, since math.erf only accepts scalars; one possible way around that is sketched below.
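
    If the goal is to evaluate the original formula on arrays, one possible workaround (a sketch, not part of the original answer) is to build the namespace yourself: include np and an array-aware stand-in for math, for example by wrapping math.erf with np.vectorize (scipy.special.erf would do the same job). The names array_math and namespace below are only illustrative:

    import math
    import types

    import numpy as np

    formula = 'np.e**x + math.erf(x) + np.pi + math.erf(u)'

    # math.erf only accepts scalars, so expose a stand-in "math" whose erf
    # is vectorised over arrays (scipy.special.erf would be an alternative).
    array_math = types.SimpleNamespace(erf=np.vectorize(math.erf))

    namespace = {
        'np': np,
        'math': array_math,  # shadows the real math module inside eval()
        'x': np.array([1.0, 1.0]),
        'u': np.array([0.1, 0.2]),
    }

    # Passing the mapping as globals is fine here, because np and math are in it.
    result = eval(formula, namespace)
    print(result)  # roughly [6.81503819 6.92527786]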