In the following:
a = 3
def b():
    # global a  # will error without this statement
    a = a + 1
    print(a)
It will error with an UnboundLocalError unless I add a global statement. It seems in this sense that Python evaluates the LHS of the assignment first (creating a new local for a) rather than the RHS first (which equals 4) and then assigning the result to the local a. In other words, something like:
local a <-- local a + 1
            ^
            doesn't exist, so look up in the parent environment
local a <-- global a + 1
local a <-- 3 + 1
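For what it's worth, the actual behavior is not really LHS-first evaluation: Python scans the whole function body when it compiles it, and any name assigned anywhere in the body is treated as local throughout the entire function. A minimal sketch of that (the names follow the question's code; the introspection at the end is my addition):

```python
a = 3

def b():
    a = a + 1  # 'a' is local for the whole function, because it is assigned here
    print(a)

try:
    b()
except UnboundLocalError as err:
    print(type(err).__name__)  # prints: UnboundLocalError

# The compiler has already classified 'a' as a local variable of b:
print('a' in b.__code__.co_varnames)  # prints: True
```

So the failure happens on the read of a, before any "new local" would be created by the assignment; the name was local from the moment the function was compiled.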
I'm just curious why that approach isn't used in Python as the default. For example, C uses this pattern:
// file.c
#include <stdio.h>

int a = 3;

int main(void)
{
    a = a + 1;
    printf("%d\n", a);
}
$ gcc file.c -o file; ./file
4
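For completeness, the C behavior can be reproduced in Python by uncommenting the global declaration from the snippet above; a quick check:

```python
a = 3

def b():
    global a  # without this line, a = a + 1 raises UnboundLocalError
    a = a + 1
    print(a)

b()  # prints: 4
```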
There is no official explanation, but I can think of two reasons:

1. Since a = a + 1 is an assignment, it refers to the local variable a, not the global one (unless otherwise specified). Since you have not declared a local a, it is implicitly defined but not initialized (something similar happens in JavaScript too, and is also a common source of confusion). In C you would not have that misunderstanding: it's a static language, so you would only refer to a local int a if you had declared one.

2. Closures. Imagine a nested function c() inside function b(): an assignment to a there would bind to the a variable inside b, not the global a. C doesn't have closures, so this is not useful there.
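A sketch of the closure point (the nested c() inside b() follows the answer; the nonlocal statement is my addition, as the enclosing-scope counterpart of global):

```python
a = 3  # global

def b():
    a = 10  # local to b

    def c():
        nonlocal a  # binds to b's 'a'; 'global a' would bind to the module-level one
        a = a + 1

    c()
    return a

print(b())  # prints: 11
print(a)    # prints: 3 (the global is untouched)
```

Without the nonlocal line, the assignment inside c() would create yet another local a, and the same UnboundLocalError would occur one level down.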