Efficient way to connect to DB

Hello,

I have been working on a Go HTTP-based project for the last 6 months, and I am looking for the best way to connect to the DB using a persistent connection.

The reason I am looking for an efficient method is that I am getting a “too many open files” error and it is crashing the system.

Could anyone please give me feedback on whether my DB code is efficient or not:

//Package-level pool handle, shared by every function
var DbConn *sql.DB

func DbOpenConn() (*sql.DB, error) {
    connstr := "conn_string"
    var err error
    DbConn, err = sql.Open("mysql", connstr) //plain = so the package-level DbConn is set, not a shadowed local
    CheckErr(err)

    DbConn.SetMaxIdleConns(10)
    DbConn.SetMaxOpenConns(50)
    return DbConn, err
}

//Check if the conn is still alive
func FixConn() *sql.DB {
    var err error
    if DbConn.Ping() != nil { //reconnect on error
        DbConn, err = DbOpenConn()
        CheckErr(err)
    }
    return DbConn
}

// function to execute the statements
func DbExecute(db *sql.DB, query string, params ...interface{}) (sql.Result, error) {
    db = FixConn()
    stmt, err := db.Prepare(query)
    CheckErr(err)
    defer stmt.Close()
    res, err := stmt.Exec(params...)
    CheckErr(err)
    return res, err
}

func DbQueryGetRowsParams(db *sql.DB, query string, params ...interface{}) (*sql.Rows, error) {
    db = FixConn()
    stmt, err := db.Prepare(query)
    CheckErr(err)
    defer stmt.Close()
    rows, err := stmt.Query(params...)
    CheckErr(err)
    return rows, err
}

Now I am calling it as:

func MyFunc(w http.ResponseWriter, r *http.Request) {
    db := funcs.DbConn
    selDB, err := funcs.DbQueryGetRowsParams(db, query, params)
    if err != nil {
        return
    }
    defer selDB.Close() //I have tried this both with and without defer
    for selDB.Next() {
        //Some code
    }
}

I think my code is fine, but I would still like expert advice on whether I need to improve something.

Actually, I am literally pulling my hair out over the “too many open files” error. I have been analyzing my code for the last two days, but still without success.

Thanks

EDIT: Better look at @CurtGreen’s reply first (last two paragraphs behind the link) before trying the steps outlined below. I guess that’s more likely the cause.


Based on this comment on GitHub, I would first try calling SetMaxIdleConns(0) and see if this improves anything.

Second thing I’d try is described in this comment further down the thread: Find out the delay after which your database closes idle connections, and set SetConnMaxLifetime to this delay minus one second.

These are just two wild guesses, though. I found the GH issue by googling for golang "database/sql" "sql.DB" "too many files open", so it might just be a textual match with no relevance to your issue. (But still worth a try.)


http://go-database-sql.org/accessing.html


Hello,

I analyzed the code and found out that it was Viper (which was reading a .toml file) that was generating the “too many open files” error.

Viper was generating approximately 4000 threads under the Go process. And I was testing the web page using:
https://www.uptrends.com/tools/uptime

So for each request/country, it was generating 4000 threads (TIDs) in Ubuntu.

I was able to find the open files with the following commands:
lsof -n | grep <Process_ID> | wc -l
lsof -p <Process_ID> | wc -l

Thanks for your help.

You are opening and closing a database connection on every request, which can lead to the problems mentioned in Curtis’ link if you have many concurrent requests. I guess Viper is also involved in getting the connection string?
Instead, you could pass a connection pool pointer to your HTTP handlers. Some ways to do this are shown here, for example: http://www.alexedwards.net/blog/organising-database-access


In general, you can tune the MySQL server configuration too (connection limits and idle timeouts on the DB host). You can google it. This should be the last option, after all the code changes needed for proper connection handling.
