In Python it is possible for functions to access module-level variables, as follows:
def pow(a):
    return a**b

a = 3
b = 2
print(pow(a))
Is this considered bad practice?
The obvious alternative would be to explicitly pass all arguments to functions:
def pow(a, b):
    return a**b

a = 3
b = 2
print(pow(a, b))
I am using the first form as a way to limit the number of parameters passed to a function, and I was wondering whether this is considered bad practice.
It is possible, but use the global or nonlocal keywords if you do this.
import functools
import logging

logging.basicConfig()  # attach a default stderr handler so the INFO lines below are printed
LOGGER = logging.getLogger('FooLogger')
LOGGER.setLevel(logging.INFO)

def log_this(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        global LOGGER   # this is in the global namespace
        nonlocal func   # this is from the parent function
        retval = func(*args, **kwargs)
        LOGGER.info('%s(*%s, **%s)=%s', func.__name__, args, kwargs, retval)
        return retval
    return wrapper
@log_this
def cube(base, *, mod=None):
    return pow(base, 3, mod)   # pow accepts mod=None, meaning no modulus
>>> cube(4)
INFO:FooLogger:cube(*(4,), **{})=64
64
>>> cube(4, mod=7)
INFO:FooLogger:cube(*(4,), **{'mod': 7})=1
1
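To be precise, the keyword is only strictly required when the function rebinds the module-level name; a plain read works without it, so in the decorator above the two statements mostly serve as documentation of intent. A minimal sketch of the difference (the COUNTER name is just illustrative):

COUNTER = 0

def read_counter():
    return COUNTER      # plain read: works without the global keyword

def bump_counter():
    global COUNTER      # required, because the function rebinds the name
    COUNTER += 1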
But it's generally better to pass a variable explicitly to a function. That improves readability and reduces ambiguity about the function's purpose. Consider the following:
MIN_SIZE = 5

def check_size(collection):
    global MIN_SIZE
    return len(collection) > MIN_SIZE
>>> check_size([1, 2, 3])
False
At first glance, I see that check_size is called, but what size is it checking against? If I didn't know, I'd have to search for it. This is a readability problem that will only get more confusing as the program grows.
MIN_SIZE = 5

def check_size(collection, size):
    return len(collection) > size
>>> check_size(range(10), MIN_SIZE)
True
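If the goal is mainly to keep call sites short, one possible middle ground (my own sketch, not from the original answer) is a default parameter value: the dependency stays visible in the signature, but callers may omit it.

MIN_SIZE = 5

def check_size(collection, min_size=MIN_SIZE):
    # min_size is explicit in the signature, yet optional at the call site
    return len(collection) > min_size

>>> check_size(range(10))
True
>>> check_size(range(10), min_size=20)
False

Note that the default is captured when the function is defined, so rebinding MIN_SIZE afterwards will not change it.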