I'm trying to do something extremely simple: pass a string variable into my eval
statement. However, my string is being treated as the name of an undefined variable.
Here is my code:
condition = 'hi'
print(eval("2 + 4 * len(%s)" % (condition)))
Output:
Traceback (most recent call last):
File "C:\test.py", line 3, in <module>
print(eval("4 + 3 * len(%s)" % (condition)))
File "<string>", line 1, in <module>
NameError: name 'hi' is not defined
However, when I define hi as if it were a variable, all of a sudden the code runs:
condition = 'hi'
hi = 'hi'
print(eval("2 + 4 * len(%s)" % (condition)))
Output:
10
What in the world? This seems totally unintuitive to me. Could someone help me define condition in a way that Python does not ask for 'hi' to be defined as well?
You need quotes around %s, like so:
condition = 'hi'
print(eval("2 + 4 * len('%s')" % (condition)))
This way you are passing len() the string 'hi' instead of the variable hi. Without the quotes, the formatted string becomes "2 + 4 * len(hi)", so eval looks up a variable named hi and raises NameError.
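If you'd rather not quote by hand, here are two sketches of common alternatives (assuming you still want to use eval at all): let %r insert the repr, which includes the quotes for you, or keep the expression string fixed and pass the value in through eval's globals mapping:

condition = 'hi'
# %r expands to repr(condition), producing the string "2 + 4 * len('hi')"
print(eval("2 + 4 * len(%r)" % (condition,)))
# Or leave the expression untouched and supply the variable via eval's namespace
print(eval("2 + 4 * len(condition)", {"condition": condition}))

Both print 10, the same as the quoted-%s version.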