I'm packing a Python application into a Docker image with Nix's dockerTools, and everything is fine except the image size. Python itself is about 40 MB, and once you add numpy and pandas it grows to a few hundred megabytes, while the application code is only ~100 KB.

The only solution I see is to pack the dependencies into a separate image and then inherit the main one from it. It won't reduce the total size, but at least I won't need to transfer huge images on every commit. I also don't know how to do this: should I use some base image with Nix, or build an environment with pythonPackages.buildEnv and then attach my app to it?

A generic solution would be great, but a Python-specific one would also do. Even if your solution is imperfect, please share.
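For the buildEnv route, this is roughly what I have in mind, untested; `myapp`, the package list, and the entrypoint path are placeholders:

```nix
{ pkgs ? import <nixpkgs> {} }:

let
  # Python environment bundling the heavy dependencies.
  pythonEnv = pkgs.python3.withPackages (ps: [ ps.numpy ps.pandas ]);
in
pkgs.dockerTools.buildImage {
  name = "myapp"; # placeholder image name
  contents = [ pythonEnv ];
  config.Cmd = [ "${pythonEnv}/bin/python" "/app/main.py" ]; # placeholder path
}
```

But this still produces one image with everything in it, which is exactly the transfer problem I'm trying to avoid.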
Ok, with the fromImage attr for buildImage I split the one huge layer into a huge dependency layer and a small app-code layer. I wonder if there is any way to move this fat dependency layer into a separate image, so I could share it among my other projects?
After googling a bit and reading the dockerTools code, I ended up with this solution:

```nix
let
  deps = pkgs.dockerTools.buildImage {
    name = "deps";
    contents = [ /* list of all deps here */ ];
  };
in pkgs.dockerTools.buildImage {
  name = "app";
  fromImage = deps;
}
```

This builds a two-layer Docker image: one layer is the dependencies, the other is the app. It also seems that the value for fromImage could be the result of pullImage, which should give the same result (if I understood the code correctly), but I wasn't able to verify it.