I am using ioutil.ReadFile() and ioutil.WriteFile() to replace some text in a set of files, but it causes many issues: we have to edit 30+ files, and the process runs into problems.
Here is my code:
It is similar to sed in bash. When we run this function once over 30+ files, it gets stuck on some of them, and when we then try to edit those files with the nano command, the file won't open; it just shows a blank screen, as if too many processes were holding the file open. This causes many issues and increases the server load. Does anyone have a solution for this?
run the file replacements in parallel, in batches of (CPU core count) at a time
run the file replacement one at a time (if the point above is still not enough)
write your own version of the Rabin-Karp algorithm, if you know beforehand what your "old" and "new" strings look like and what your files look like (their content is likely not random)
split each file into many smaller files (one 100 GB file into one hundred 1 GB files), run the replacement on the smaller files, then join them back together (and run the replacement across the joints)
run your app on 2 instances (machines), one replacing 15 of the files while the other handles the rest
Before considering these, do some checks:
if you comment out the replace part, is reading the file contents and rewriting them (without replacing) still enough to cause the issue?
how often does this run? Could it be that one run (for 30 files) does not finish before the next run starts?
what does the garbage collection look like? Could it be that GC is contributing too much to the CPU consumption? (Add a runtime.GC() after each write, though this has only a small chance of helping.)