
Is there an easier way to keep local Go packages updated?


I am using multiple packages that I import into different projects; these range from custom adapters for my business logic, shared by Lambda and Google Cloud Functions, to other public packages. The way I do this right now is that I vendor them and include them for cloud functions. For applications that can be compiled and deployed on a VM, I compile them separately. This works fine for me; however, it's a pain developing these modules.

If I update the method signatures and names in a package, I have to push my changes to github / gitlab (my package path is something like gitlab.com/groupName/projectName/pkg/packageName) and then run go get -u <packageName> to update the package.

Even that does not really update it; sometimes I am stuck with an older version with no idea how to update it. Is there an easier way of working with this, I wonder?
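
For illustration, the round-trip for a single change currently looks roughly like this (the module path is the placeholder from above):

git commit -am "rename method"   # in the package repository
git push

go get -u gitlab.com/groupName/projectName/pkg/packageName   # in every consuming project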


For the sake of clarity:

Exported package 1. Path: gitlab.com/some/name/group/pkg/clients/psql

psql-client
    |
    |_ pkg
        |
        |_psql.go

Application 1 uses psql-client. Path: gitlab.com/some/name/app1

Application 2 uses psql-client. Path: gitlab.com/some/name/app2


Solution

  • My understanding is that (a) you are using the new Go modules system, and that (b) part of the problem is that you don't want to keep pushing changes to github or gitlab across different repositories when you are doing local development.

    In other words, if you have your changes locally, it sounds like you don't want to round-trip those changes through github/gitlab in order for those changes to be visible across your related repositories that you are working on locally.

    Most important advice

    It is greatly complicating your workflow to have > 1 module in a single repository.

    As your example illustrates, having more than one module in a single repository is almost always more work on an ongoing basis, and it is very hard to get right. For most people the cost is not worth it, and the benefit is often not what they expect; in some cases there is no practical benefit at all.

    I would definitely recommend following the common rule of "1 repo == 1 module", at least for now. This answer has more details about why.

    Working with multiple repos

    Given that you are using Go modules, one approach is to add a replace directive to a module's go.mod file that tells the go command the on-disk location of the other Go modules it depends on.

    Example structure

    For example, if you had three repos repo1, repo2, repo3, you could clone them so that they all sit next to each other on your local disk:

    myproject/
    ├── repo1
    ├── repo2
    └── repo3
    

    Then, if repo1 depends on repo2 and repo3, you could set the go.mod file for repo1 to know the relative on-disk location of the other two modules:

    repo1 go.mod:

    replace github.com/me/repo2 => ../repo2
    replace github.com/me/repo3 => ../repo3
    

    When you are inside the repo1 directory or any of its child directories, a go command like go build or go test ./... will use the on-disk versions of repo2 and repo3.
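
    Note that a replace directive only takes effect for a module that is also listed in a require directive, so a more complete repo1 go.mod would look roughly like this (a sketch; the module paths, versions, and go directive are placeholders):

    module github.com/me/repo1

    go 1.16

    require (
        github.com/me/repo2 v0.0.0
        github.com/me/repo3 v0.0.0
    )

    replace github.com/me/repo2 => ../repo2

    replace github.com/me/repo3 => ../repo3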

    repo2 go.mod:

    If repo2 depends on repo3, you could also set:

    replace github.com/me/repo3 => ../repo3
    

    repo3 go.mod:

    If for example repo3 does not depend on either of repo1 or repo2, then you would not need to add a replace to its go.mod.
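
    Applied to your example, and assuming you also adopt the 1-repo-==-1-module advice so that psql-client is a single module (say with module path gitlab.com/some/name/group/pkg/clients/psql) cloned next to app1 on disk, app1's go.mod might look roughly like this:

    module gitlab.com/some/name/app1

    go 1.16

    require gitlab.com/some/name/group/pkg/clients/psql v0.0.0

    replace gitlab.com/some/name/group/pkg/clients/psql => ../psql-client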

    Additional details

    The replace directive is covered in more detail in the replace FAQ on the Modules wiki.

    Finally, it depends on your exact use case, but a common solution at this point is to use gohack, which automates some of this process. In particular, it creates a mutable copy of a dependency (by default in $HOME/gohack, though the location is controlled by the $GOHACK environment variable) and sets your current go.mod file to have a replace directive pointing to that mutable copy.
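
    For example, the basic gohack workflow looks roughly like this (a sketch; github.com/me/repo2 is a placeholder module path):

    # create a mutable copy of the module and add a replace
    # directive pointing at it to the current go.mod
    gohack get github.com/me/repo2

    # ...edit the copy under $HOME/gohack/github.com/me/repo2...

    # when done, remove the replace directives again
    gohack undo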