I am currently in the process of creating a Docker container for my C++ project. My setup uses CPM to look for local libraries first and download them if they are not found. Most C++ Docker examples I have seen use some combination of apt-get, git, and wget to install libraries, but if I did that I would essentially end up with two dependency management systems.
If I were only using Docker for building and deployment, this would not be an issue, since I could let CPM handle the dependencies at build time. However, I also plan to use Docker for development, so it would be ideal to have the dependencies cached as a layer in the development image.
So should I switch to another package manager (Conan, vcpkg) that works better with Docker? Find some way to invoke the CPM CMake code from the Dockerfile without running a full build? Use a persistent volume to store the libraries in? Or just use apt-get for my development image and let CPM handle the build and deployment images?
Yes, I would encourage you to use a mature C++ package manager such as Conan or vcpkg for this. The packages they provide have been tested on all major platforms, which will save you quite a bit of potential frustration, and both also offer prebuilt binaries when your OS and compiler configuration matches.
To get started with Conan, for example, you can create a basic `conanfile.txt` in your repo:
```ini
[requires]
zlib/1.3.1

[options]
zlib/*:shared=True

[layout]
cmake_layout

[generators]
CMakeDeps
CMakeToolchain
```
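With the `CMakeDeps` generator, your project consumes the dependency through the standard `find_package` mechanism; the `CMakeToolchain` preset takes care of pointing CMake at the generated config files. A minimal sketch of the corresponding `CMakeLists.txt` (the project and target names here are placeholders, not from your setup):

```cmake
cmake_minimum_required(VERSION 3.23)
project(my_app CXX)  # hypothetical project name

# CMakeDeps generated a ZLIB config package during `conan install`;
# the conan-release preset makes it visible via CMAKE_PREFIX_PATH.
find_package(ZLIB REQUIRED)

add_executable(my_app main.cpp)
target_link_libraries(my_app PRIVATE ZLIB::ZLIB)
```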
and then include the following commands in your Dockerfile:
```dockerfile
RUN pip install conan
RUN conan profile detect
RUN conan install . -s build_type=Release --build=missing
RUN cmake --preset conan-release
RUN cmake --build --preset conan-release
```
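To get the dependencies cached as their own layer in the development image, as you wanted, copy only `conanfile.txt` into the image before running `conan install`: Docker then re-resolves packages only when that file changes, not on every source edit. A sketch, assuming an Ubuntu base image and an `/app` working directory of your choosing:

```dockerfile
# Hypothetical base image; adjust to your toolchain.
FROM ubuntu:24.04
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential cmake python3-pip \
    && rm -rf /var/lib/apt/lists/*
# On newer distros you may need pipx or a virtualenv instead of pip.
RUN pip install conan && conan profile detect

WORKDIR /app
# Copy only the dependency manifest first: this layer and the
# `conan install` below are rebuilt only when conanfile.txt changes.
COPY conanfile.txt .
RUN conan install . -s build_type=Release --build=missing

# Source changes invalidate only the layers from here on.
COPY . .
RUN cmake --preset conan-release && cmake --build --preset conan-release
```

This gives you a stable dependency layer for development while the final build steps stay cheap to re-run.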