Bringing the BERT NLP Model to Go

Hey all,

I’ve had a few weeks of free time and put together a Go library for interfacing with the state-of-the-art BERT NLP model via the TensorFlow C bindings. The project is very much a WIP, but I think it’s in a place where I can start sharing it with folks. The gist is that BERT turns natural language into sentence vectors (embeddings). Those embeddings can be fed into downstream learning tasks (fine-tuning) such as classification, or used directly to compare sentences for semantic similarity. The mantra of the project is: build your BERT models in Python, then run them in Go.
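To give a feel for the “build in Python, run in Go” flow, here’s a minimal sketch using the official TensorFlow Go bindings to run a model exported from Python as a SavedModel. The model path and the operation names (`input_ids`, `pooled_output`) are placeholders for whatever your export produces, and this isn’t gobert’s API, just the underlying idea:

```go
package main

import (
	"fmt"
	"log"

	tf "github.com/tensorflow/tensorflow/tensorflow/go"
)

func main() {
	// Load a BERT model exported from Python as a TensorFlow SavedModel.
	// The path and the "serve" tag are assumptions; adjust to your export.
	model, err := tf.LoadSavedModel("path/to/bert_savedmodel", []string{"serve"}, nil)
	if err != nil {
		log.Fatalf("load model: %v", err)
	}
	defer model.Session.Close()

	// Token IDs from a WordPiece tokenizer (a made-up batch of one sentence).
	ids := [][]int32{{101, 7592, 2088, 102}} // [CLS] hello world [SEP]
	input, err := tf.NewTensor(ids)
	if err != nil {
		log.Fatalf("new tensor: %v", err)
	}

	// "input_ids" and "pooled_output" are placeholder names; inspect your
	// exported graph to find the real input and output operations.
	out, err := model.Session.Run(
		map[tf.Output]*tf.Tensor{
			model.Graph.Operation("input_ids").Output(0): input,
		},
		[]tf.Output{
			model.Graph.Operation("pooled_output").Output(0),
		},
		nil,
	)
	if err != nil {
		log.Fatalf("run: %v", err)
	}

	embedding := out[0].Value().([][]float32)
	fmt.Println("sentence vector length:", len(embedding[0]))
}
```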

The tokenize package should be pretty stable, but the model package has a somewhat experimental API that may need to be honed, and it’s missing some key functionality such as converting token vectors into a sentence vector (pooling). The semantic search demo flexes the library the most, but the classifier and similarity examples are probably the best place to get the general idea.
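For anyone wondering what pooling and similarity comparison actually mean here, this is a rough pure-Go sketch of mean pooling over token vectors plus cosine similarity between the resulting sentence vectors. The numbers are made up and the function names are mine, not gobert’s:

```go
package main

import (
	"fmt"
	"math"
)

// meanPool collapses per-token vectors into one sentence vector by averaging
// each dimension. It's one common pooling strategy, shown here only as an
// illustration of the missing piece mentioned above.
func meanPool(tokens [][]float32) []float32 {
	if len(tokens) == 0 {
		return nil
	}
	pooled := make([]float32, len(tokens[0]))
	for _, tok := range tokens {
		for i, v := range tok {
			pooled[i] += v
		}
	}
	for i := range pooled {
		pooled[i] /= float32(len(tokens))
	}
	return pooled
}

// cosine scores how similar two sentence vectors are: closer to 1.0 means
// the sentences point in the same semantic direction.
func cosine(a, b []float32) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += float64(a[i]) * float64(b[i])
		na += float64(a[i]) * float64(a[i])
		nb += float64(b[i]) * float64(b[i])
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

func main() {
	// Tiny fake embeddings just to show the flow; real BERT vectors are 768-dimensional.
	s1 := meanPool([][]float32{{0.2, 0.9, 0.1}, {0.3, 0.8, 0.0}})
	s2 := meanPool([][]float32{{0.1, 0.7, 0.2}, {0.2, 0.9, 0.1}})
	fmt.Printf("similarity: %.3f\n", cosine(s1, s2))
}
```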

Hooking Go up to TensorFlow models was a really interesting exercise, and it opens up some useful capabilities for Go services.

Take a peek: https://github.com/buckhx/gobert


Welcome to the forum @buckhx! Thanks for sharing your work. I have been looking into TensorFlow a bit more lately to see what I can do with it. I’ll play around with your repo when I get some free time.

Keep up the good work!

