
In Python, is it a bad idea for a directory module's __init__.py to do this?


I have a package with several subdirectories containing __init__.py files. These files perform checks and initialization jobs.

Some of these folders also contain a file that isn't meant to be imported directly, because it relies on the sanity checks performed in its respective __init__.py. Let's call these "hidden modules"; by convention I prefix their names with an underscore to make that obvious.

Is it a bad idea to do the following inside my __init__.py (with an _implementation.py located in the same folder):

import sys
if sanity_check_successful:
    from ._implementation import *
    __all__ = sys.modules[__name__ + "._implementation"].__all__

The idea should be clear: I am trying to provide meaningful error information at each module level in the package whenever a sanity check fails.
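
To illustrate that goal, the failure branch can raise an import-time error with a clear, package-level message. A minimal, self-contained sketch follows; the version check is only a stand-in for whatever the real __init__.py validates:

```python
import sys

# Placeholder sanity check: stands in for whatever the real
# __init__.py validates (dependencies, environment, etc.).
sanity_check_successful = sys.version_info >= (2, 0)

if not sanity_check_successful:
    # Fail loudly at import time with a meaningful message instead of
    # letting the hidden module fail later with an obscure traceback.
    raise ImportError("this package requires Python 2.0+, "
                      "found %d.%d" % sys.version_info[:2])
```

Anyone who imports the package then sees the explanatory ImportError rather than a confusing error from deep inside _implementation.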

Is it a bad idea to copy the __all__ list over from the "hidden module" like this? If so, why, and are there better alternatives?

Bonus points: is there a more concise way of writing these two lines?

    from ._implementation import *
    __all__ = sys.modules[__name__ + "._implementation"].__all__

In particular, it bothers me that I have to reference _implementation as a string in one place and as a module name in the other.


Solution

  • There is a simpler way to set __all__ from the ._implementation submodule; setting __all__ this way is otherwise fine:

    from ._implementation import *
    from ._implementation import __all__
    

    The second import simply fetches __all__ from the submodule and binds it as __all__ in the package's namespace, with no need to spell the module name as a string.
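
A runnable sketch of the whole pattern, building a throwaway package on disk (the names pkgdemo, _implementation, and greet are illustrative, not from the original post):

```python
import os
import sys
import tempfile

# Create a temporary package directory: pkgdemo/ with a "hidden"
# _implementation.py and an __init__.py that re-exports it.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "pkgdemo")
os.mkdir(pkg)

with open(os.path.join(pkg, "_implementation.py"), "w") as f:
    f.write(
        "__all__ = ['greet']\n"
        "def greet(name):\n"
        "    return 'hello ' + name\n"
        "def _internal():\n"
        "    return 'not exported'\n"
    )

with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write(
        "from ._implementation import *\n"
        "from ._implementation import __all__\n"
    )

sys.path.insert(0, root)
import pkgdemo

print(pkgdemo.__all__)         # ['greet']
print(pkgdemo.greet("world"))  # hello world
```

Because the wildcard import respects the submodule's __all__, only greet is bound in the package namespace; _internal stays hidden, and the package's own __all__ matches the submodule's.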