I'm new to using Python for larger projects. I came up with the following folder structure for my project:
project
├── doc
├── src
│   ├── hardware
│   ├── devices
│   │   ├── device1
│   │   └── device2
│   ├── measurement
│   ├── ui
│   └── util
└── tests
    ├── hardware
    ├── devices
    │   ├── device1
    │   └── device2
    ├── measurement
    ├── ui
    └── util
The tests folder contains unit tests for the modules in the src folder. Is this the right approach for such a program?
How should I do the importing? I thought about adding the root folder to the Python path and importing every module with an absolute path from the root. Is that a good approach?
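For example, with the structure above I would write something like:

    # Absolute imports from the project root; this assumes each folder has
    # an __init__.py and that the project root is on sys.path.
    import src.devices.device1
    from src.util import helpers   # "helpers" is a made-up module name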
This program is used on several computers. How should I deploy it? Currently I use SVN: every computer checks out the whole project and starts it from there. One disadvantage is that I have to add the project path to the PYTHONPATH on every computer.
The overall structure looks good to me. You should write a setup.py script, using a packaging module such as setuptools, to install your package. This allows the package to be installed and distributed. See also PyPI for distributing the package.
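Here is a minimal sketch of what such a setup.py could look like; the name, version, and layout details are placeholders, not taken from your actual project:

    # setup.py -- minimal sketch using setuptools; all values are placeholders.
    from setuptools import setup, find_packages

    setup(
        name='myproject',               # hypothetical package name
        version='0.1',
        packages=find_packages('src'),  # look for packages under src/
        package_dir={'': 'src'},        # map the root package to src/
    )

Note that with this src layout, the installed packages are imported without the src prefix (import devices.device1 rather than import src.devices.device1). With that in place, you can typically run: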
    sudo python setup.py install

to install your package at the system level. Or you can run:

    python setup.py install --prefix=~/local

to install at the user level (no sudo needed, since everything goes under your home directory). You then have to add ~/local/lib/python/site-packages to your PYTHONPATH, but that only needs to be done once for all packages installed under this prefix.
If your package is published on PyPI, you can run, on any machine:

    easy_install mypackage

to auto-magically install the package.
Even better, you can make your packages visible only inside an isolated Python "virtual environment" using virtualenv. This makes testing different packages and versions on the same computer a breeze.
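A typical session looks something like this (the environment name myenv is arbitrary):

    virtualenv myenv              # create an isolated environment in ./myenv
    source myenv/bin/activate     # switch this shell to the environment
    easy_install mypackage        # installs into myenv, not the system Python

Anything installed while the environment is active stays inside myenv, so different experiments or versions can each get their own set of packages.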
You can also add README files and such at the package root level.