How to keep up with remote modules' versions?

(I also asked this question at SO but gave up hope of getting a complete answer)


I am coming from the Python/JavaScript world, where importing a module normally means loading a previously installed package. Therefore if I start a Python program twice and it imports a module, I will get the same module each time (unless it has been specifically updated on the host), even if a newer version has become available in the meantime.

I dockerize(*) all my code: I start from an empty base image, install the modules and then run the program with the latest version of the modules (I am aware of and understand the implications of not pinning versions, but there are reasons for that).

I am now preparing (for a home project) a build pipeline for a Go project (namely the Caddy web server). I will be rebuilding it every night into a Docker image. There are two main Go commands:

go mod init caddy
go build
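As a sketch, the nightly build step can be wrapped in a small shell script; the guard around go mod init is my addition (go mod init fails if a go.mod already exists, which matters when the step is re-run):

```shell
#!/bin/sh
set -e

# Create go.mod only on the very first run; later runs reuse it.
[ -f go.mod ] || go mod init caddy

# Resolve and download modules as needed, then compile the binary.
go build
```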

The sources import several modules from GitHub:

import (

        _ ""
        _ ""
        _ ""
        _ ""
        _ ""
)
When running the build for the first time, I saw a lot of messages telling me that the plugins were being processed, a few example lines being

go: finding v1.1.4
go: finding latest

I do not see these messages anymore during subsequent docker builds; the cached version of the layer is used instead. This means that during that phase nothing changed in the artifact.

The fact that the resulting binary is the same is normal: no source code has changed in the meantime. What worries me is that I did not see the build process at all.


Based on the comments in the SO question, my questions are:

  • does go mod check for the current version of remote modules, or does it just ensure that a cache is available at the level defined in go.mod? In other words, if there is a new version, will go mod fetch it on a subsequent call when it has already created the go.* files in previous runs?
  • if I remove the go.* files created by go mod, will go mod fetch the current versions of the modules despite having a cache (if the versions between remote and local differ)? Or will it just look at what is in import, see that something is in the cache, and say 'fine, I have the module locally so I do not need to check with GitHub at all'?
  • how does go get play with the above? Would go get -u force an update of the cached modules (if the versions between local and remote differ)?
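For what it's worth, the question "is there anything newer than what my go.mod pins?" can be made explicit with commands like these (a sketch, run from the module root, not taken from the thread):

```shell
# List every module in the build and flag those for which a newer
# version exists upstream; this queries the module proxy/origin even
# when a local copy is cached.
go list -m -u all

# Explicitly upgrade direct and indirect dependencies to their latest
# versions, rewriting go.mod and go.sum accordingly.
go get -u ./...
```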

And generally: how should I build to make sure that I am building with the latest versions of the modules in import? ← I asked this several times on SO (now I realize I should probably have worded my question like that) and got answers which were interesting but never got to the point (run go <this> and go <that> and the build will use the latest versions of the modules). Is there such a way?
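One possible "always build with the latest modules" sequence, sketched under the assumption that unpinned latest versions are really what is wanted, would be:

```shell
# Run from the module root before every nightly build.
go get -u ./...   # bump every direct and indirect dependency to latest
go mod tidy       # prune go.mod/go.sum of anything no longer needed
go build          # compile against the freshly resolved versions
```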

(*) I build my code via GitLab's CI/CD using runners I host on my server. They run with a shell executor, which means that, practically speaking, the code is built on the host, using that host's resources. This also means that the go command is common to all executions (and therefore whatever that command caches is reused too).
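Since the shell executor shares one module cache across all jobs, it can be useful to know where that cache lives and how to wipe it when a truly fresh download is wanted (a sketch; both commands are standard Go tooling):

```shell
# Print the directory where downloaded modules are cached and
# shared between builds on this host.
go env GOMODCACHE

# Remove the entire module cache, forcing the next build to
# re-download every dependency.
go clean -modcache
```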

go get -u ./... from your module root upgrades all the direct and indirect dependencies of your module, and now excludes test dependencies.

