
Python libraries in Dockerfile for Node.js Project


I am trying to deploy my Node.js project to the Docker platform on AWS Elastic Beanstalk, but I am running into problems adding Python libraries in the Dockerfile. With this file the deployment succeeds:

FROM node:8.16
WORKDIR /opt/app
COPY package.json package-lock.json* ./
RUN npm cache clean --force && npm install
COPY . /opt/app
ENV PORT 80
EXPOSE 80
CMD [ "npm", "start" ]

But once I add the Python libraries to the Dockerfile:

FROM node:8.16
WORKDIR /opt/app
COPY package.json package-lock.json* ./
RUN npm cache clean --force && npm install
COPY . /opt/app

FROM python:3.7
COPY requirements.txt /tmp/
RUN pip install --requirement /tmp/requirements.txt
COPY . /tmp/

ENV PORT 80
EXPOSE 80
CMD [ "npm", "start" ]

I get an error while deploying:

Failed to run Docker container: a46e6adbe0fee8d3 docker: Error response from daemon: OCI runtime create failed: container_linux.go:348: starting container process caused "exec: \"npm\": executable file not found in $PATH": unknown.. Check snapshot logs for details.

Thanks in advance for any help


Solution

  • As mentioned in the Docker reference for FROM, while having multiple FROM instructions in a single Dockerfile is allowed (this is a multi-stage build):

    Each FROM instruction clears any state created by previous instructions.

    In your Dockerfile the final stage is based on python:3.7, so everything set up in the node:8.16 stage — including npm — is absent from the final image. That is why the container fails with "npm": executable file not found in $PATH.

    If you are looking to build your application on top of an environment containing both Node.js and Python 3, I suggest you look around Docker Hub. Perhaps this one will help.
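
    Alternatively, since the official node images are Debian-based, a single-stage sketch that installs Python 3 into the Node.js image with apt may work for your case (assuming your requirements.txt packages install cleanly with the distro's Python 3; package names and versions here are illustrative, not tested against your project):

    ```dockerfile
    # Single-stage build: keep node:8.16 as the only base so npm stays on $PATH
    FROM node:8.16
    WORKDIR /opt/app

    # Install Python 3 and pip from the Debian repositories
    RUN apt-get update \
        && apt-get install -y --no-install-recommends python3 python3-pip \
        && rm -rf /var/lib/apt/lists/*

    # Install the Python dependencies
    COPY requirements.txt /tmp/
    RUN pip3 install --requirement /tmp/requirements.txt

    # Install the Node.js dependencies
    COPY package.json package-lock.json* ./
    RUN npm cache clean --force && npm install

    COPY . /opt/app
    ENV PORT 80
    EXPOSE 80
    CMD [ "npm", "start" ]
    ```

    The key point is that there is only one FROM, so nothing gets discarded between stages; both `npm` and `python3` end up in the final image.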