
How to install local packages using pip as part of a docker build?


I've got a package that I want to build into a Docker image, and it depends on an adjacent package on my system.

My requirements.txt looks something like this:

-e ../other_module
numpy==1.0.0
flask==0.12.5

When I call pip install -r requirements.txt in a virtualenv, this works fine. However, if I run the same command from a Dockerfile, e.g.:

ADD requirements.txt /app
RUN pip install -r requirements.txt

and build with docker build ., I get the following error:

../other_module should either be a path to a local project or a VCS url beginning with svn+, git+, hg+, or bzr+

What, if anything, am I doing wrong here?


Solution

  • First of all, you need to add other_module to your Docker image. Without it, the pip install command will not be able to find the package. However, you cannot ADD a directory that lies outside the build context (the directory of the Dockerfile), according to the documentation:

    The path must be inside the context of the build; you cannot ADD ../something /something, because the first step of a docker build is to send the context directory (and subdirectories) to the docker daemon.

    So you have to move the other_module directory into the same directory as your Dockerfile, i.e. your structure should look something like this:

    .
    ├── Dockerfile
    ├── requirements.txt
    ├── other_module
    |   ├── module_file.xyz
    |   └── another_module_file.xyz
    

    Then add the following to the Dockerfile:

    ADD /other_module /other_module
    ADD requirements.txt /app/
    WORKDIR /app
    RUN pip install -r requirements.txt
    

    The WORKDIR command moves you into /app, so the next step, RUN pip install ..., is executed inside the /app directory. From /app, the relative path ../other_module in requirements.txt resolves to the /other_module directory you just added, so pip can now find it.
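
    Putting it all together, a minimal complete Dockerfile could look like the sketch below. Note that the python:3 base image and the final ADD of the application code are assumptions for illustration, not part of the original setup:

    # Sketch of a complete Dockerfile; python:3 and the final ADD . are assumed
    FROM python:3

    # other_module must sit inside the build context, next to the Dockerfile
    ADD /other_module /other_module

    # Install dependencies; from /app, -e ../other_module resolves to /other_module
    ADD requirements.txt /app/
    WORKDIR /app
    RUN pip install -r requirements.txt

    # Copy the rest of the application code (layout assumed)
    ADD . /app

    Build the image from the directory that contains the Dockerfile, e.g. with docker build -t myapp . (the tag name is just an example).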