How To Fix The Go Package Management Problem

We need to fix the package management situation in Go. With that in mind, I put my first stake in the ground by writing “How To Fix The Go Package Management Problem”.


Full disclosure: I just went through this process myself, with a large update to a toolkit used by our engineering team. I disagree that Go’s package management is a problem, for the reasons below.

  1. If one is updating libraries, then one must be aware of the potential API changes regardless of whether it’s a buildfix/minor/major semantic version change or a git commit change. The work is identical: instead of looking at tags and short commit hashes, one looks for changes to the code based on hints from a version number.

  2. Go supports vendoring now, as you stated here. Checking out the exact code used for a specific project resolves dependency issues completely. Updates happen if and when a developer chooses, with the developer resolving the resulting build errors before a release.
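Vendoring in point 2 boils down to a directory convention. A minimal sketch, assuming a hypothetical project `myapp` that pins `github.com/pkg/errors` (both names are illustrative): the go tool (1.6 and later) resolves imports from the nearest `vendor/` directory before consulting GOPATH.

```shell
# Hypothetical project "myapp" vendoring github.com/pkg/errors.
# The go tool resolves imports from vendor/ before GOPATH, so the
# checked-in copy is exactly what gets built.
top=$(mktemp -d)
mkdir -p "$top/myapp/vendor/github.com/pkg/errors"
# In practice you would copy the exact, reviewed checkout here, e.g.:
#   cp -r "$GOPATH/src/github.com/pkg/errors" "$top/myapp/vendor/github.com/pkg/"
(cd "$top" && find myapp -type d | sort)
```

Checking `vendor/` into the VCS means a build today and a build next year compile the same dependency code, which is the reproducibility being argued for here.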

Can you imagine that every time a library needs to increment a major version it needs to create a new repo on GitHub? Yeah, no one does that.

That’s an artifact of private GitHub accounts and GitHub’s ridiculous restrictions – every other repository hosting company (Assembla, etc.) charges per-user instead of per-repo, making this a non-issue.

And if we’re talking about public repos, why wouldn’t you fork/copy and make major API changes as a new repo?

Are we attached to the name of a library/repo? goimports and other tools can detect that name/path change and update all code accordingly, if it’s really that inconvenient.

@tydavis Thanks for your response and reasoning. It does raise some questions for me:

  1. How do you know whether a revision is an intended release that you should use, or just a point along the development path that you shouldn’t?

This is as much about communication as it is technical. It’s about enabling people who already have process, tooling, and automation. You may want to resolve everything by hand and choose the revisions you use; many people want, and can use, more.

  2. Go’s vendoring highlights the complication if you are one of the many who want automation to resolve the dependency tree. That same level of automation is available in so many other languages.

You may not want that, and that’s fine, but others do. Are we going to build the system for the 80% or the one you want? I’m shooting for the one that enables the 80%.

Go is running into adoption issues because of package management. For every person who shares your opinion of package management, I’ve heard many more talk about the pain, about not trying Go, or about walking away from it entirely.

  3. Making major API changes in a new repo isn’t typical. It doesn’t matter if it’s on GitHub, Bitbucket, a GitLab instance, or somewhere else; this isn’t how developers tend to operate.

This isn’t about what’s possible or how you can do things. It’s about what enables the 80% to be successful. Or the 90%, or the 99%.

@mattfarina – I don’t believe I thanked you for speaking out nor for bringing up utilities like Glide. Please permit me to do so now.

As for your comments, please permit me to respond both to each point and in whole:

  1. If everyone follows the community rule of “don’t break master”, then development versions won’t make it into the mainstream “go get” path; using development versions would require deliberate effort. If this is followed, then one should always be able to run “go get -u ./...” to update every dependency without issue.
  2. If you need a tool to automatically resolve dependency needs, that sounds like a very large project. Can it be broken down into more manageable conceptual chunks? Can those chunks be audited? Also see below regarding “what others want.”
  3. Developers don’t want to change how they do things. That doesn’t mean they shouldn’t or can’t; they just dislike it. That’s human, and they’ll complain until it becomes the new normal (e.g. changes to Facebook and user reactions).
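The “don’t break master” convention in point 1 can be sketched with plain git; the branch names, file, and tag below are made up for illustration. Day-to-day work lands on a development branch and is promoted to master only when the build is green, so `go get` (which fetches master) always sees working code:

```shell
# Hypothetical release discipline: commit on develop, promote to master
# only when green, and (optionally) tag the promoted commit with semver.
repo=$(mktemp -d); cd "$repo"; git init -q
git checkout -qb develop                       # day-to-day work happens here
echo 'package lib' > lib.go
git add lib.go
git -c user.name=dev -c user.email=dev@example.com commit -qm 'add lib'
git checkout -qb master                        # promote: the build was green
git tag v1.0.0                                 # optional tag for humans and tools
git tag                                        # prints: v1.0.0
```

Under this discipline “go get -u ./...” stays safe, because master never holds a known-broken revision.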

Now, to respond in whole…

There’s an adage that “you code in what you started with, even if it’s a different language”: if you learned Python first, you’ll write your Java code like Python, except where Java forces you to do something specific.

In the case of Go, you’re talking about people coming from JVM/Ruby/Python/NodeJS shops with all this expectant baggage. They wonder why there isn’t a class inspector, where their favorite debugging tools are, where the build manager is, etc. They don’t look at how they should work with the language and environment, they look at how the language can be adapted to work with them. That is not a perspective that promotes learning of any kind. That is the mindset the “I tried Go for four days and left” crowd is deliberately not adopting.

I have heard engineers complain about getting into any language they don’t already know. I’ve heard developers complain loudly about tools they already use (and continue to choose for new projects) daily. When they really try to learn, you get this (emphasis mine):

Having worked on the C# language for more than a decade, I was super excited about the design of Go when I looked at the language with a fresh and pragmatic mindset.

In contrast,

It’s about enabling people who already have process, tooling, and automation.
[. . .]
That same level of automation is available in so many other languages.
[. . .]
It’s about what enables the 80% to be successful.

Your arguments sound like the intent is to bring the same old processes and design into a new language that is explicitly not following the same process as other languages developed over the last thirty years.


@tydavis Thanks for engaging in this conversation in such a cordial manner.

I wish everyone followed the rule of “don’t break master”. I wish that solved the communication problem. I wish using commit ids were easy when you have to do things like automatically resolve diamond dependencies.

Part of this is to aid automation instead of people doing it manually. Look at Kubernetes: we pass it configuration and let it run instances of our application. The new way engages automation rather than manually standing up VMs, installing applications, and managing the lifecycle. When automation is engaged you can make things simpler for people.

The same goes for package management. We can manually resolve things, but that takes work and time. Sometimes we stop testing with the tip of master and pin down to really old versions for testing. When it’s manual there’s a pain point, so you do it less often. When you can automate things, you can leave it to your automation more often.

I entirely get not doing the same old thing, breaking from paradigms. But Go doesn’t do that completely. Go is a C-based language; it’s not a whole new paradigm. It uses the same or similar keywords to other languages. Multiple return values were even around before Go (though maybe not popular). A lot about Go is familiar, which makes it easy to learn. They did tried-and-true things with the language while still innovating in some areas.

Package management in its current state is neither a tried-and-tested solution nor something innovative. I’m open to truly innovating in this space. Short of that, I believe it better to go with something tried and true.

Leaving a hole in the feature space here is frustrating people. I’m only vocalizing what I’ve been hearing in back channels for some time.

@mattfarina – I hear you. I have been working with the Kubernetes team for a while and use it daily, so I understand wanting automation to take over and handle things for us. I completely understand that manual resolution is a pain point. I also concede your points about Go being C-based language and only innovating in certain areas.

I wish everyone followed the rule of “don’t break master”. I wish that solved the communication problem. I wish using commit ids were easy when you have to do things like automatically resolve diamond dependencies.
. . .
Package management in its current state is neither a tried-and-tested solution nor something innovative. I’m open to truly innovating in this space. Short of that, I believe it better to go with something tried and true.
. . .
Leaving a hole in the feature space here is frustrating people. I’m only vocalizing what I’ve been hearing in back channels for some time.

From what I can tell, the Go designers side-stepped the package / dependency management issue and said “let’s let the community handle it.” Then someone (community? designers? maintainers?) pushed for vendoring as a solution and it got enough traction that it’s now part of the official release.

Similarly, setting up multiple major API versions with different repos / names solves the diamond dependency problem – both can coexist in the same project and still be correct!
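The coexistence argument can be made concrete with a sketch; the import paths `example.com/mylib` and `example.com/mylib.v2` are hypothetical. Because Go identifies a package by its import path, two major versions published under different paths are simply two different packages:

```shell
# Hypothetical GOPATH layout with two major versions side by side.
# Package A can import "example.com/mylib" (v1 API) while package B
# imports "example.com/mylib.v2" (breaking v2 API) in the same binary:
# there is no version conflict because the paths differ.
top=$(mktemp -d)
mkdir -p "$top/src/example.com/mylib" "$top/src/example.com/mylib.v2"
ls "$top/src/example.com"
```

This is why the diamond problem dissolves under path-per-major-version: the dependency graph never has to pick one winner for a single path.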

This isn’t a matter of “we need a new tool here,” it’s a matter of educating developers on how to do things in Go. The frustration is because they expect it to be their way, and it isn’t – it works the Go way.


The flipside is that separate repositories don’t aid discoverability, make the canonical version less obvious, and make it harder to consolidate issues/documentation as a whole.

As I’ve mentioned on this forum before: library authors have it tough. You can’t (don’t want to) break master because go get users who don’t use third-party vendoring tools will suffer breakage, and you can’t version import paths outside of something like

As a (sometimes) library author I’ve decided to ignore this problem and just use version tags and branches like everyone else. The people who care about versions and compatibility will either use vendoring or a tool that understands versions. If “go get and hope for the best” is your deployment strategy in any circumstance where the result actually matters, you already have worse problems. :wink: (Edit: That’s the general/hypothetical “you”, not the elithrar-you, despite this technically being in reply to your post above.)

I don’t really agree the community pushed for vendoring. The community used vendoring as a hack to work around the lack of tooling and the idea that the “go” tool should be the only tool. The path of minimal resistance was then to bless this behavior.


From a pkg consumer point of view I’m totally fine with the way things are. Git and vendoring have always been enough for my personal projects and thinking about my experiences with npm, gem and composer leaves a sour taste in my mouth.


Would you be able to quantify your complaints about the things you dislike about npm, gem, and composer? I think it would be germane to this discussion.

Sure, I’ll try, though I have to admit that in most cases it was probably caused by my own lack of experience; first experiences unfortunately tend to stick.

About composer:
I’ve been working on a small Laravel project with about 2 or 3 third-party dependencies declared in the composer file. I kept the vendor files out of the VCS at first. I then deployed the project to a cheap webhost and ran composer install, which exceeded the max execution time. After increasing the setting it ran through, but took really, really long, and so did every update after that. And I remember running into package requirement conflicts at one point. I then vendored the dependencies and checked them into the VCS, using composer only when necessary.

About gem:
That was quite some time ago and I can’t really recall the exact problems I ran into. I believe much of it was related to me using Windows back then and issues with Ruby 1.x / 2.x. Using gem (together with RVM) works pretty OK nowadays, but is pretty slow.

For npm:
I had issues with registry downtime. Having local npm packages for many projects really grows in size. There are security concerns regarding post-install scripts, and package requirement conflicts as well.

I definitely admit that it’s a hard problem, but for my personal use cases git, submodules, and vendoring have been enough, so I’d prefer to avoid the extra complexity of another code management tool.


Jens, thank you for your perspective.

I can relate to the problems you’ve run into and have even had to personally deal with them. For example, I’ve worked on projects with Composer where we built the site before deploying to hosting, because hosting was slow. And the npm scripts issue: I laughed when someone created a post-install script that did an “rm -rf /” to showcase the security issues. Then I reviewed the install scripts in the project I was working on.

I know that when I work on Glide or recommend solutions I try to keep these things in mind. We may not agree on approach, but I, for one, am listening to issues like this. Thanks for sharing the details.

I’m not as experienced a developer as my peers here. But the lack of dependency management in Golang at least forces me to do two things:

  1. Think deeper about which dependencies I pull in.
  2. Use the standard library more often.

The problems with them, as far as I understand, are:

  1. Even after choosing a good project as a dependency, it has its own dependencies in turn.
  2. This may not be time-efficient, and may not be easily reproducible by contributors, co-workers, etc.

The vendoring support was a great start toward getting projects more isolated and reproducible. But pulling from master with go get sometimes doesn’t solve the issue, because of the problems already discussed above: developers sometimes break master and usually don’t create a new repo when a major (potentially breaking) change happens.

I like the way tools like glide approach these problems, and I’d love for it to stay simple like it is, grounded in Go’s pragmatic philosophy.

I think code version control is essential in project organization, and that it should be integrated into project development. By that I mean that I think that, in Go, we should be able to pull from hand-picked versions of version controlled dependencies in a more automated manner.

That’s why I like tools like glide. I get dependency management without losing control over my dependencies. It builds on Go’s new vendoring support and enables easy reproducibility.

For me the gold standard has always been CPAN. I do not recall having the same problems I’ve had with rubygems, npm, or pip, which basically boil down to poor handling of dependencies, and specifically, poor handling of multiple versions of the same thing in the dependency graph. And it’s not as if there’s something inherent to Perl that solves this issue; for me it’s more that the Perl developer community figured out how to avoid the issue in the first place.

In that regard, Go has been much smoother. I have not worked on a project so large, with so many imports, that it ends up importing the same thing twice but requiring different versions. Probably because of past experience, what I have done from the start is keep multiple GOPATHs, each holding the actual code I’m working on and its dependencies. In that way what I have done, without intending to, is build large projects out of many smaller ones.
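The multiple-GOPATH approach described above can be sketched as follows; the project names are invented. Each project gets its own workspace, so its dependency tree is isolated from every other project’s:

```shell
# Hypothetical per-project workspaces: switching GOPATH switches the
# whole dependency tree, so projA and projB can pin different versions
# of the same library without ever conflicting.
work=$(mktemp -d)
mkdir -p "$work/projA/src" "$work/projB/src"
export GOPATH="$work/projA"        # while working on projA
# later: export GOPATH="$work/projB" to switch projects
basename "$GOPATH"                 # prints: projA
```

The cost is that nothing is shared, and the benefit is exactly the isolation the thread keeps circling around.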

Going back to CPAN, since CPAN didn’t offer a simple way to do the equivalent of having multiple GOPATHs, it kind of forced you to figure out how to work happily with what we would today call a monorepo. And yes, looking back, I realize today that CPAN did offer all the bits and pieces to do this, but again, the developer community is such that it favors (favored?) the other solution.

@mem, how did they do this? Can you elaborate? Was it a technical solution? Was there something in the process of publishing an artefact to CPAN that discouraged people from doing this? Was it a policy decision, enforced with social pressure, like gofmt?

I’m not sure I understand why this is such a contentious area. Clearly some people want automated dependency resolution. Given that we have the vendoring mechanism, they can write a tool which resolves version dependencies into the vendor directory. Those who want to use the tool can do so, those who don’t want to can skip it. If the tool is easy to use and doesn’t involve YAML I might even use it myself.

There’s no reason why the tool has to be an official part of Go. Bundler is the de facto standard tool for doing something similar in the Ruby world, and it’s not part of the main Ruby distribution.


Without a standard or de facto standard (Gemfiles are it in Ruby) it becomes extremely frustrating to parse dependencies.

As a library author, do I forsake users of tool X and only support tool Y? Do I include manifest files for all of the tools I can think of? That’s a pretty poor experience for all involved.


Yes, clearly there will need to be some sort of de facto standard for specifying the version of a package and what versions of other packages it depends on. But there might be multiple different package management tools that use that information.

