I work in a large organisation, and we are considering using Bazel as the build tool for our large repo.
Bazel has a number of dependencies (such as specific build rules) that in turn have their own dependencies, and those dependencies are hosted on GitHub.
Our company would like to avoid risks such as GitHub being down while our CI needs to fetch dependencies, so some say we should host all those dependencies internally. The problem is that the dependency URLs are hardcoded into the rules_<something> code, and it's not clear how to mirror them with reasonable effort.
The question is: does this really make sense? Doesn't it go against the approach that the developers of Bazel chose? Or is there a good way to create those local caches?
Yes, there is a good way: --experimental_downloader_config.
See also: https://bazel.build/reference/command-line-reference#flag--experimental_downloader_config
Essentially, you provide a config file that blocks downloads from everywhere except your company-internal cache and rewrites all URLs to point at that cache.
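As a minimal sketch (the mirror hostname and repository paths below are hypothetical, substitute your own), the config file supports `block`, `allow`, and regex-based `rewrite` directives:

```
# bazel_downloader.cfg
# Block direct access to the upstream hosts.
block github.com
block raw.githubusercontent.com

# Rewrite their URLs to the internal mirror ($1 is the captured path).
rewrite github.com/(.*) artifactory.example.com/github-remote/$1
rewrite raw.githubusercontent.com/(.*) artifactory.example.com/github-raw/$1

# Allow the mirror itself.
allow artifactory.example.com
```

Note that the patterns match against the URL without its scheme, so `github.com/(.*)` covers both http and https downloads.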
As for how to set up the cache: any caching proxy should do. Artifactory or Nexus come to mind.
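To apply the config to every invocation without each developer passing the flag manually, you can put it in your checked-in `.bazelrc` (the file name `bazel_downloader.cfg` is just the example name used above):

```
# .bazelrc
common --experimental_downloader_config=bazel_downloader.cfg
```

This pairs well with `--repository_cache`, which keeps already-downloaded archives on disk so even the internal mirror is only hit once per artifact.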