I want to create a utility that reads a large number of JSON documents (approx. 1 lakh, i.e. 100,000) from a CSV/flat file and bulk-inserts them into Couchbase in a single pass. Since the size of the ops array a single call can take is limited, I currently have to divide the data into smaller batches and push them with multiple `bucket.Do()` operations. This has made the insertion very slow.
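To make the problem concrete, here is a simplified sketch of the batching pattern I mean. The chunking helper and the batch size are mine; the real insert would build a `[]gocb.BulkOp` of `*gocb.InsertOp` and call `bucket.Do(ops)` (gocb 1.x), which is stubbed here so the snippet runs standalone:

```go
package main

import "fmt"

// doc is one key/JSON pair read from the flat file.
type doc struct {
	ID   string
	JSON string
}

// chunk splits docs into batches of at most n, since a single
// bucket.Do() call can only handle a limited number of ops.
func chunk(docs []doc, n int) [][]doc {
	var batches [][]doc
	for len(docs) > 0 {
		end := n
		if len(docs) < n {
			end = len(docs)
		}
		batches = append(batches, docs[:end])
		docs = docs[end:]
	}
	return batches
}

// insertBatch stands in for the real bulk insert; with gocb 1.x it
// would be roughly: ops := make([]gocb.BulkOp, 0, len(batch));
// append &gocb.InsertOp{Key: d.ID, Value: d.JSON} per doc; bucket.Do(ops).
func insertBatch(batch []doc) error {
	return nil // stub: no Couchbase connection in this sketch
}

func main() {
	docs := make([]doc, 100000) // ~1 lakh documents
	batches := chunk(docs, 2100)
	for _, b := range batches {
		if err := insertBatch(b); err != nil {
			fmt.Println("batch failed:", err)
		}
	}
	fmt.Println(len(batches)) // 48 batches of at most 2100 docs
}
```

Each `bucket.Do()` here runs sequentially, which is why the whole load is slow; the batches are independent, so in principle they could be pushed from several goroutines instead.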
Can anyone suggest an alternative approach?
@lutzhorn, I'm a newbie in Go. My aim is to generate load on a Couchbase server by inserting documents at a rate of 0.1 M (100,000) writes/sec. I was planning to split the data into multiple files (0.1 M documents per file) and push them. Reading a file document by document would make this really slow, and the bulk insert operation can only insert approximately 2,100 records at once.
Each line of the files contains a tab-separated ID and JSON document.