Some of my Python projects are tested under continuous integration with a setup like the one described here: "Pretty" Continuous Integration for Python.
I currently use `easy_install` to install the project, its dependencies, and the test tools (`nose`, `coverage`). Sometimes my builds are reported as failed because `easy_install` could not download the dependencies due to networking problems: the internet connection, PyPI, or one of the package download servers is down or not responding.
I would like to prevent my builds from failing in such cases by using a local cache of packages: when a fresh dependency cannot be downloaded, we fall back to the local copy (which should be updated whenever possible). It is important to me to try downloading a fresh dependency first, because I want to be alerted as soon as possible when my project breaks because of an API change in a dependency.
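Something like the following shell sketch captures the fallback behaviour I'm after (the cache directory and package names are placeholders; `--allow-hosts=None` keeps the fallback step from touching the network):

    # Try a fresh install from PyPI first, so API changes surface early;
    # on network failure, fall back to a local directory of previously
    # downloaded eggs, without contacting any remote host.
    easy_install --upgrade MyProject nose coverage \
        || easy_install --allow-hosts=None --find-links=/var/cache/eggs \
               MyProject nose coverage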
My question is: how can I set up such a cache so that it doesn't break on networking problems? I first tried to use `collective.eggproxy` for this, but as far as I know it doesn't catch all errors.
I ended up using `collective.eggproxy` to cache the downloads, but added a startup delay after launching `collective.eggproxy` as a daemon, to prevent errors when `easy_install` runs before `collective.eggproxy` has fully started.
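For reference, this is roughly what the build script does now; a hedged sketch, where `eggproxy_run` and port 8888 are the startup script and default port of my `collective.eggproxy` install (check yours), and where the fixed delay is replaced by polling until the proxy answers:

    # Start the proxy in the background (eggproxy_run and port 8888 are
    # assumptions from my install; adjust them to your setup).
    eggproxy_run &

    # Startup delay: instead of a fixed `sleep`, poll until the proxy
    # actually answers, giving up after 30 seconds.
    for i in $(seq 1 30); do
        curl --silent --output /dev/null http://localhost:8888/ && break
        sleep 1
    done

    # Point easy_install at the proxy instead of PyPI.
    easy_install --index-url=http://localhost:8888/ MyProject nose coverage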
However, the answers suggesting `pip` seem equally valid to me; it's just that, since I already use `easy_install` and `collective.eggproxy`, it's easier for me to stick with them.
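For anyone who prefers the `pip` route, a minimal sketch of the equivalent setup with a modern `pip`, assuming a local package directory at /var/cache/pip-packages (the path and package names are placeholders):

    # Refresh the local cache when the network is up; ignore failures so
    # that a PyPI outage doesn't abort the build at this step.
    pip download --dest=/var/cache/pip-packages MyProject nose coverage || true

    # Try a fresh install first, then fall back to the local cache only.
    pip install --upgrade MyProject nose coverage \
        || pip install --no-index --find-links=/var/cache/pip-packages \
               MyProject nose coverage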