Tags: python, python-requests, python-standalone

Python script "standalone"


I've recently started writing Python scripts and I'm still a newbie to the language.

I'm stuck with a problem: my script requires the 'requests' library (and the other packages that come with it when installed via pip), as well as some folders like 'database', where I store a sqlite3 file. I need to install the script on a lot of machines with different Ubuntu versions, and therefore different Python versions, and I want the script to run 'standalone' so I don't have to install/update Python, pip and the 'requests' package every time I set it up on a new machine. I'm developing in a virtualenv on my machine, which is currently set up with all the packages needed to run the script.

Can I make a 'copy' of my virtualenv so it can be moved with my Python script to other computers, including my database folder, without having to install/update Python and pip on every machine, and instead use this standalone version of Python? All the machines run Linux.

I already tried copying my virtualenv into my project folder, but the virtualenv crashed when I ran my script with the interpreter inside it referenced in the shebang line, even when using the --relocatable argument, so I guess that's not the way.

I've also tried using PyInstaller, no success.


Solution

  • Welcome to the world of deployment! The answer you seek is far from trivial.

    First off, Python is an interpreted language that isn't really supposed to be distributed as a desktop application. If you would like to create executables, then there are some libraries for that, such as py2exe. However, these are ad-hoc solutions at best. They "freeze" the whole of Python along with your code, and then you ship everything together.
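
    On Linux, the equivalent of that freezing approach is usually PyInstaller (which you mention having tried). As a minimal sketch, assuming a hypothetical entry point called myscript.py, the one-file build looks like this:

        pip install pyinstaller
        pyinstaller --onefile myscript.py
        # the bundled executable ends up in dist/myscript

    Note that the resulting binary is built against the machine it was created on, so it may still fail on older Ubuntu releases with different system libraries.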

    The best practice way to stipulate your dependencies is in a requirements.txt file. You can create one with this command:

    pip freeze > requirements.txt
    

    This checks all the libraries that are currently installed in whatever environment you're working in and saves them to a file called requirements.txt. That file will then list all of your required libraries, and anyone who receives your code can just run

    pip install -r requirements.txt 
    

    and it will install all the dependencies.
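
    For reference, the resulting requirements.txt is just a plain-text list of pinned packages, one per line. For a script that only needs requests, it might look something like this (the version numbers below are placeholders; pip freeze will write whatever versions are actually installed, including transitive dependencies):

        certifi==2023.7.22
        charset-normalizer==3.2.0
        idna==3.4
        requests==2.31.0
        urllib3==2.0.4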

    However, that just takes care of library dependencies. What about the version of python itself, OS environment etc... So this is where you may need to start looking at solutions like Docker. With Docker, you can specify the full environment in a Dockerfile. Then anyone on another machine can run the docker images, with all of the dependencies included. This is fast become the de-facto way of shipping code (in all languages, but especially useful in Python).