Not able to read 1 million records from CSV file

I am not able to read 1 million records from a CSV file, and while updating the MySQL database I get a nil pointer exception.

We can’t help you unless you share some code. You are basically telling us “My car is not working” and expecting us to tell you what is wrong.

start := time.Now()
csvFile, _ := os.Open("data/data_dump.csv")
reader := csv.NewReader(bufio.NewReader(csvFile))
var status []models.EntryStatus
var errorCount = 0
var count = 0
var successCount = 0
for {
	line, error := reader.Read()
	if error == io.EOF {
		break
	} else {
		count++
		validation := utils.Validation(line)
		if validation {
			errorCount++
		} else {
			insert, err := datasource.Db.Prepare("INSERT INTO geo_location(ip_address,country_code,country,city,latitude,longitude,mystery_value) VALUES(?,?,?,?,?,?,?)")
			if error != nil {
				errorCount++
				panic(err.Error())
			} else {
				insert.Exec(line[0], utils.TrimQuotes(line[1]), utils.TrimQuotes(line[2]), utils.TrimQuotes(line[3]), utils.TrimQuotes(line[4]), utils.TrimQuotes(line[5]), utils.TrimQuotes(line[6]))
				defer datasource.Db.Close()
				successCount++
			}
		}
	}
}
elapsed := time.Since(start)
status = append(status, models.EntryStatus{
		Error:   errorCount,
		Success: successCount,
		Count:   count,
		Elapsed: elapsed,
	})
statusJson, _ := json.Marshal(status)
fmt.Fprintf(w, string(statusJson))

I have 1 million records in my CSV file. How should I read them and update my database?

Could you provide the error message you get there?

Error 1461: Can’t create more than max_prepared_stmt_count statements (current value: 16382)

(the same error is repeated for every insert attempt)

Looks like you should create your “insert” statement outside the “for” loop and reuse it. Also, you assign the return value to “err” but then test “error”.

Okay, Charles, but how do I read a file in chunks in Go?


Let me check and implement this in my code. Anyway, thanks.

I suggest using a direct insert instead of the prepare+insert combination.


Please have a look at my code and correct it. The file should be read in chunks and it should update the database, because my CSV file has 1 million records.

You can use something like this (a made-up example):

if _, err := db.Exec("INSERT table SET a=?,b=?", a,b); err != nil {
    log.Println(err)
}

Preparing the statement beforehand should be more efficient, but as someone said above, it should be done before the for loop, so just once. That implies that closing it should also be done after the loop ends.
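Beyond preparing the statement once, a further speed-up (not mentioned in the thread, just a common MySQL technique) is batching many rows into a single multi-row INSERT, which cuts the number of round-trips. A minimal sketch of building the placeholder SQL for one batch; the `geo_location` table is taken from the code above, and `batchInsertSQL` is a hypothetical helper:

```go
package main

import (
	"fmt"
	"strings"
)

// batchInsertSQL builds a multi-row INSERT statement with ?-placeholders,
// e.g. "INSERT INTO t(a,b) VALUES(?,?),(?,?)" for rows == 2.
func batchInsertSQL(table string, cols []string, rows int) string {
	// One "(?,?,...)" group per row, with one ? per column.
	row := "(" + strings.TrimRight(strings.Repeat("?,", len(cols)), ",") + ")"
	all := make([]string, rows)
	for i := range all {
		all[i] = row
	}
	return fmt.Sprintf("INSERT INTO %s(%s) VALUES%s",
		table, strings.Join(cols, ","), strings.Join(all, ","))
}

func main() {
	sql := batchInsertSQL("geo_location", []string{"ip_address", "country_code"}, 2)
	fmt.Println(sql)
	// The statement would then be run once per batch via db.Exec(sql, args...),
	// where args holds rows*len(cols) flattened values.
}
```

Because the statement text depends only on the batch size, you can prepare it once per batch size and reuse it, staying well under max_prepared_stmt_count.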

However, as I understand it, this is the secondary problem; the main one is how to read a huge CSV. I pasted a way above, so he can either use that code or just look at how it is done.

Use bufio.NewReaderSize(). Make it as big as you want.

Okay.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.