I maintain a few Python packages, and on systems where the dependencies are not fully satisfied, functions that would otherwise work fail because of module-level imports:
import numpy as np

def lala(values):
    out = max(values)      # does not need numpy
    return out

def fufu(values):
    out = np.mean(values)  # needs numpy
    return out
So, for instance, here I cannot use lala() if I do not have numpy, even though lala does not use numpy.
Of course, ideally the dependencies would be managed correctly; still, the package would be more robust if functions failed only when they actually had to fail.
Is there any reason why imports are almost never done in function scope? Is the reason just to reduce the number of lines?
Is there any reason why imports are almost never done in function scope?
The reason is that you generally want errors to be detected sooner rather than later. A function may only be invoked after a while, so with imports inside functions you could have a program that appears to work, only to fail at some point at run-time. Such failures are especially bad news when they crash the program after a long calculation, or in production at a customer's site, and they are best avoided.
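To illustrate that failure mode, here is a minimal sketch (long_calculation() is just a hypothetical stand-in for real work): with the import moved inside fufu(), a missing numpy is only detected when fufu() is finally called, after the rest of the program has apparently run fine.

def long_calculation():
    # stand-in for hours of actual work
    return [1.0, 2.0, 3.0]

def fufu(values):
    import numpy as np       # only fails here, at call time, if numpy is absent
    return np.mean(values)

data = long_calculation()    # the expensive part completes...
print(fufu(data))            # ...and only then does the ImportError surface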
The above does not apply if the function is designed to be optional. In that case importing inside the function is quite appropriate, possibly catching the ImportError and re-raising it as a business exception, as sketched below.
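A minimal sketch of that pattern (FeatureUnavailableError is a hypothetical exception name; use whatever fits your package):

class FeatureUnavailableError(RuntimeError):
    # hypothetical application-level ("business") exception
    pass

def fufu(values):
    try:
        import numpy as np   # deferred import: only this optional function needs it
    except ImportError as exc:
        raise FeatureUnavailableError(
            "fufu() requires numpy, which is not installed") from exc
    return np.mean(values)

With this, importing the package and calling lala() works without numpy, and callers of fufu() get a clear, package-specific error instead of a bare ImportError.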