When creating a production build with Docker, what strategy do people use for compiling and bundling the code?
Outside the Docker world, I would create a build (using some kind of npm command) that produces a dist folder (no source code, just uglified and compressed JavaScript, for example), and then I would point a web server at that dist folder.
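For example, something along these lines (the exact script names depend on the project):

```sh
# install dependencies and build the app into ./dist
# ("build" is whatever script the project defines)
npm ci
npm run build

# then point a web server (e.g. nginx) at ./dist
```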
In the Docker world, where would you build the code: inside the Docker image, or on the host OS, copying only the dist folder into the Docker image? Basically, I do not want the whole node_modules directory and all of the source files in the Docker image/container.
Any idea how to achieve this?
Thanks
It sounds like you're worried about two different, valid problems:

1. You don't want source code and development dependencies (like node_modules) in the image you run in production.
2. You don't want the build itself to depend on whatever toolchain happens to be installed on the host machine.
You can address both with the approach you suggested: have the build steps run as part of your Dockerfile. But this has the disadvantage you mention - you're left with all of your source and development artifacts at runtime, unless you take explicit steps to remove them all.
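A minimal sketch of that single-image approach (the base image, script names, and the `serve` static server are assumptions, not taken from your project):

```dockerfile
# Build and serve from one image: simple, but node_modules and the
# source tree remain in the final image unless explicitly removed.
FROM node:20
WORKDIR /app

COPY package*.json ./
RUN npm ci

COPY . .
RUN npm run build

# serve the built output with a static file server
RUN npm install -g serve
EXPOSE 3000
CMD ["serve", "-s", "dist", "-l", "3000"]
```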
Docker introduced multi-stage builds to alleviate this issue - you run the build in one stage and copy only the artifacts you need into the final stage, so the earlier layers never make it into the image you ship. But it still requires you to be explicit about what ends up in that final stage.
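A hedged sketch of the multi-stage variant (base images, script names, and paths are assumptions):

```dockerfile
# Stage 1: build with the full Node toolchain
FROM node:20 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: only the built files go into the slim runtime image
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
```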
So in my experience, the most common solution is indeed to build your artifact externally, and then COPY it into your production image.
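For example, after running something like `npm ci && npm run build` on the host (or a CI machine), the production Dockerfile can be reduced to just the runtime pieces (image name and paths are assumptions):

```dockerfile
# The image contains only the prebuilt dist folder - no toolchain,
# no node_modules, no source files.
FROM nginx:alpine
COPY dist /usr/share/nginx/html
```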
That solves problem #1, but not #2. So go one step further - build your Docker image inside a Docker container! CI platforms are increasingly supporting this approach as a first-class concept - see e.g. Circle CI's Docker executor.
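Outside of a CI platform, one generic way to do the same thing is to run `docker build` from inside a throwaway container that talks to the host's Docker daemon; a sketch, assuming the official docker CLI image and a Dockerfile in the current directory:

```sh
# run the image build itself inside a container, sharing the host's
# Docker socket so the resulting image ends up on the host daemon
docker run --rm \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v "$PWD":/workspace \
  -w /workspace \
  docker:cli \
  docker build -t myapp:prod .
```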