I've created a Python package that contains both pure modules and scripts, all within the same folder. The scripts need to make use of functionality within the modules, so they simply import them:
import *module*
I now wish to share this package with others, so I've turned it into a distributable package using distutils. As instructed by the distutils documentation, I've declared my package's script files in 'setup.py' as follows:
setup(...,
      scripts=['path/to/script/a', 'path/to/script/b']
)
After installing this package, I notice that distutils has installed a copy of my command-line scripts into the 'Scripts' folder of my Python installation. All well and good. However, if I now try to run any one of these scripts, it fails with:
ImportError: No module named *module*
Presumably this is because the relative path between the scripts and the modules changed during installation, so the scripts can no longer find the modules. My question, then: how are you supposed to import modules from scripts within the same package so that it works both pre-bundling with distutils and post-installation?
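One way to confirm this diagnosis is to inspect the module search path from inside the failing script; a minimal diagnostic sketch (nothing here is specific to any particular package):

```python
import sys

# Print the module search path. When the script is run from the
# source tree, its own folder (which also contains the modules) is
# on sys.path; after installation, the corresponding entry is the
# 'Scripts' folder, which does not contain the modules.
for entry in sys.path:
    print(entry)
```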
Now I could easily solve this by modifying my module import like so:
try:
    import *module*
except ImportError:
    from *package* import *module*
This seems like a bit of a hack. Am I missing a trick here? I would have expected distutils to take care of this for me. Is there a better and more robust way of handling this?
Maybe try setuptools; it has a nice way to automatically create main scripts. A short example:
setup(
    # other arguments here...
    entry_points={
        'console_scripts': [
            'foo = my_package.some_module:main_func',
            'bar = other_module:some_func',
        ],
        'gui_scripts': [
            'baz = my_package_gui:start_func',
        ],
    },
)
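For each entry point, setuptools generates a small wrapper script that imports the named module absolutely and calls the named function, so the relative location of the installed script never matters. As a sketch, the module behind the `foo` entry point might look like this (the names `my_package`, `some_module`, and `main_func` are just the example names from above, not a real API):

```python
# my_package/some_module.py -- hypothetical module that the
# 'foo = my_package.some_module:main_func' entry point resolves to.

def main_func():
    """Entry point for the generated 'foo' console script."""
    print("Hello from my_package.some_module")
    return 0  # exit status handed back to the wrapper script

if __name__ == "__main__":
    # Also allows running the module directly during development.
    raise SystemExit(main_func())
```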