(I also asked this question on SO but have given up hope of getting a complete answer.)
Introduction
I am coming from the Python/JavaScript world, where importing a module normally means loading previously installed code. If I start a Python program twice and it imports a module, I get the same module both times (unless it has been explicitly updated on the host), even if a newer version has become available in the meantime.
I dockerize(*) all my code: I start from an empty base image, install the modules, and then run the program with the latest version of those modules (I am aware of and understand the implications of not pinning versions, but there are reasons for that).
I am now preparing (for a home project) a build pipeline for a Go project (namely the Caddy web server). I will be rebuilding it every night into a Docker image. There are two main Go commands:
go mod init caddy
go build
The sources import several modules from GitHub:
import (
"github.com/caddyserver/caddy/caddy/caddymain"
_ "github.com/lucaslorentz/caddy-docker-proxy/plugin"
_ "github.com/pyed/ipfilter"
_ "github.com/caddyserver/dnsproviders/ovh"
_ "github.com/aablinov/caddy-geoip"
_ "github.com/abiosoft/caddy-git"
)
When running the build for the first time, I saw a lot of messages telling me that the plugins were being processed, a few example lines being:
go: finding github.com/pyed/ipfilter v1.1.4
go: finding github.com/aablinov/caddy-geoip latest
I do not see these messages during subsequent docker builds; the cached layer is reused instead, which means that nothing in that layer's artifact changed. That the resulting binary is the same is normal: no source code has changed in the meantime. What worries me is that I did not see the build process run at all.
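One way I could check whether the fetch step really re-runs (my own workaround idea, not something from the SO answers) is to bypass Docker's layer cache entirely, so the `go: finding ...` messages reappear if modules are actually re-resolved:

```shell
# My own workaround idea (not from the SO answers): bypass Docker's layer
# cache so the fetch step must re-run; the "go: finding ..." messages then
# reappear if modules are actually re-resolved.
msg="rebuilding without layer cache"
echo "$msg"
if command -v docker >/dev/null 2>&1; then
  docker build --no-cache -t caddy-nightly . || true  # tag name is illustrative
fi
```

But this only forces the Docker step to re-run; it does not tell me whether `go` itself would have checked GitHub for newer versions, which is the heart of my question.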
Questions
Based on the comments in the SO question, my questions are:
- does `go mod` check for the current version of remote modules, or does it just ensure that a cache is available at the level defined in `go.mod`? In other words: if there is a new version, will `go mod` fetch it on a subsequent call when it has already created the `go.*` files in previous runs?
- if I remove the `go.*` files created by `go mod`, will `go mod` fetch the current versions of the modules despite having a cache (if the versions between remote and local differ)? Or will it just look at what is in `import`, see that something is in the cache and say "fine, I have the module locally so I do not need to check with GitHub at all"?
- how does `go get` play with the above? Would `go get -u` force an update of the cached modules (if the versions between local and remote differ)?
And generally: how should I build to make sure that the build uses the latest versions of the modules in `import`? ← I asked this several times on SO (I now realize I should probably have worded my question like that from the start) and got answers which were interesting but never got to the point ("run `go <this>` and `go <that>` and the build will use the latest version of the modules"). Is there such a way?
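For reference, the kind of two-liner answer I was hoping for would look like the sketch below. This is my own guess at a possible answer (the commands are standard Go tooling, but whether this is actually sufficient is exactly what I am asking):

```shell
# Hypothetical "latest versions" build: upgrade every module recorded in
# go.mod, then compile. Whether this is sufficient is exactly what the
# question above asks to confirm.
msg="upgrading modules to latest"
echo "$msg"
if command -v go >/dev/null 2>&1; then
  ( go get -u ./... && go mod tidy && go build ) || true
fi
```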
(*) I build my code via GitLab's CI/CD using runners I host on my server. They run with a `shell` executor, which means that, practically speaking, the code is built on the host itself, using that host's resources. This also means that the `go` command is common to all executions (and therefore whatever that command caches is reused too).
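To illustrate that shared state: with the shell executor, every job reuses the same module cache on the host, which standard Go tooling can locate (the `$GOPATH/pkg/mod` layout is standard; the exact path on my runner is just an example):

```shell
# Standard Go tooling: the module cache lives under $GOPATH/pkg/mod and is
# shared by every job the shell executor runs on this host.
msg="module cache lives under:"
echo "$msg"
if command -v go >/dev/null 2>&1; then
  echo "$(go env GOPATH)/pkg/mod"   # e.g. /home/runner/go/pkg/mod
fi
```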