I have a program that is getting a bit large (currently ~300 lines). One thing that is a bit of an eyesore for me is that my program opens, defers closing, and creates bufio.NewScanner instances for three different files. It also creates a separate slice for each set of data. Altogether this takes up about 30-40 lines of repetitive code. Is there a way I can do this work inside of a loop, or what is the preferred way of handling this data from a programming perspective?
I tried creating an array of filenames and looping over it, but it did not look right, and I noticed other posts saying that Go does not support dynamic variable names, so any help, thoughts, or comments would be useful to me.
I have been trying to think of ways to either clean it up or re-use code. I do not have a programming background, but I feel like there is a better way. Basically, I have a block of code that looks like this:
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	iplist, err := read("file.txt")
	if err != nil {
		panic(err)
	}
	subnet, err := read("file2.txt")
	if err != nil {
		panic(err)
	}
	ngdomain, err := read("file3.txt")
	if err != nil {
		panic(err)
	}
	fmt.Println(iplist, subnet, ngdomain)
}

func read(file string) ([]string, error) {
	f, err := os.Open(file)
	if err != nil {
		return nil, err
	}
	defer f.Close()
	var res []string
	s := bufio.NewScanner(f)
	for s.Scan() {
		res = append(res, strings.Split(s.Text(), "\n")[0])
		// If you just want to remove the newline at the end of the text, you should use the following code
		// res = append(res, strings.TrimRight(s.Text(), "\n"))
	}
	return res, nil
}
The three file reads are not identical. The last file read doesn’t remove the newline at the end of the lines. Not sure if it’s deliberate or a mistake.
Apart from this, your suggested solution is of course fine. Using TrimRight is indeed preferable, or the following, which I assume would be more efficient:
txt := s.Text()
res = append(res, txt[:len(txt)-1])