How to execute multiple SQL queries using goroutines


This seems like a pretty broad question about concurrency in Go. See the Tour of Go concurrency section:

And the Go by Example page on goroutines:

It depends on which database you want to use.

Here’s a tutorial:
There’s also this tutorial:
Both tutorials use this MySQL driver:

You could also play with sqlite.
Then you need this driver: but it depends on a C compiler (cgo), so it doesn’t always compile on every system (Linux probably works, but macOS or Windows is a bit harder).

Another popular database is PostgreSQL; for that I would recommend this driver: which is also usable with the standard import "database/sql".
That library also provides lower-level access to Postgres databases, which makes it a bit faster and supports more PostgreSQL-specific features, but then you aren’t using database/sql.

In case you want SQLite but don’t want the dependency on a C compiler (cgo), you can use this driver, which uses modernc to translate the original SQLite into pure Go.

You could also use libraries written specifically for SQLite that support all (or most of) SQLite’s features, but then you are again outside Go’s standard database/sql package. There’s one that depends on cgo again, and there’s another, inspired by crawshaw’s, that also uses modernc, so there’s no need for cgo.


Goroutine 1: select * from emp
Goroutine 2: select * from dep

It should stop here until both 1 and 2 have executed.

Goroutine 3: I will process the data

Having multiple goroutines running stuff and synchronizing them is a relatively easy problem to solve. Since you mentioned blocking until your two jobs are complete, check out sync.WaitGroup:

From the docs:

Other than the Once and WaitGroup types, most are intended for use by low-level library routines. Higher-level synchronization is better done via channels and communication.

So to that end, refer to the Tour of Go concurrency section I linked to above, which demonstrates channels. That said, I’d be willing to bet you are prematurely optimizing. If I were you, I’d just run the queries and start thinking about concurrency only if you have a specific problem you want to solve.
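A minimal sketch of that sync.WaitGroup pattern, with the two queries stubbed out as plain functions (in a real program each would call db.Query against your database and scan the rows):

```go
package main

import (
	"fmt"
	"sync"
)

// Stand-ins for the real queries; in practice each would run
// db.Query("select * from emp") / db.Query("select * from dep")
// and scan the rows into these slices.
func queryEmp() []string { return []string{"alice", "bob"} }
func queryDep() []string { return []string{"sales", "hr"} }

func main() {
	var (
		wg   sync.WaitGroup
		emps []string
		deps []string
	)

	wg.Add(2)
	go func() {
		defer wg.Done()
		emps = queryEmp() // goroutine 1
	}()
	go func() {
		defer wg.Done()
		deps = queryDep() // goroutine 2
	}()

	wg.Wait() // blocks here until both goroutines have called Done

	// "Goroutine 3": process the combined data.
	fmt.Println(len(emps), "employees,", len(deps), "departments")
	// prints: 2 employees, 2 departments
}
```

Each goroutine writes only to its own variable, and main reads them only after Wait returns, so there is no data race.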


Arguably the right answer would (as @Dean_Davidson said) indeed be to just:

  • use the main routine

but also:

  • do the data processing with an appropriate SQL query (only using Go for the final steps)

After all, SQL basically is a domain-specific data-processing language.

Also, database systems have their own concurrency mechanisms (transactions, ACID, …), so you might not need to depend on the Go mechanics for this.

Nowadays every laptop and server has multiple cores and threads.

If we run independent queries on different threads, it will be faster. That is my thought process.

It’s not that simple.

  1. bottlenecks (e.g. maybe your database is waiting for the previous request to finish, then your goroutine is just waiting)
  2. race conditions: whenever multiple threads access the same data, there is a big risk of corrupting the data (better a CORRECT program than a FAST but WRONG program)

Go, with its channels (and other tools like WaitGroup, …), makes it easier to avoid race conditions than most other languages, but that does not mean you should always write multi-threaded programs. It’s still more difficult than single-threaded code because you have to be mindful of the synchronisation between the threads/goroutines.
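One way to keep that synchronisation manageable is the channel style the sync docs recommend: each goroutine owns its own data and only communicates results, so no two goroutines ever write to the same variable. A sketch, with the queries faked as strings:

```go
package main

import "fmt"

func main() {
	results := make(chan string)

	// Each goroutine owns its own query and sends its result over the
	// channel; nothing is shared, so there is nothing to corrupt.
	for _, q := range []string{"select * from emp", "select * from dep"} {
		go func(query string) {
			// Pretend to run the query and summarise the outcome.
			results <- "done: " + query
		}(q)
	}

	// Only main receives, so no locks are needed. Note that the order
	// of the two lines is not guaranteed.
	for i := 0; i < 2; i++ {
		fmt.Println(<-results)
	}
}
```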


This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.