Process Multiple Files Dynamically

Hello,

I have a program that is starting to get a bit large (currently ~300 lines). One thing that is a bit of an eyesore for me is that my program opens, defers closing, and creates a bufio.NewScanner for three different files. It then also creates a separate slice for each set of data. I am guessing this all together takes up about 30-40 lines of repetitive code. Is there a way I can do this work inside of a loop, or what is the preferred way of handling this data from a programming perspective?

I tried creating an array of filenames and going through a loop, but it did not look right, and I noticed other posts saying that Go does not support dynamic variables, so any help, thoughts, or comments would be useful to me.
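
To make that more concrete, the attempt looked roughly like the sketch below (rebuilt from memory, so the file names and variable names are only placeholders):

package main

import (
	"bufio"
	"fmt"
	"os"
)

func main() {
	// Placeholder file names -- not my real ones.
	files := []string{"file.txt", "file2.txt", "file3.txt"}

	// One inner slice per file, in the same order as files.
	var results [][]string
	for _, name := range files {
		f, err := os.Open(name)
		if err != nil {
			panic(err)
		}
		var lines []string
		s := bufio.NewScanner(f)
		for s.Scan() {
			lines = append(lines, s.Text())
		}
		f.Close()
		results = append(results, lines)
	}

	// results[0], results[1], results[2] -- but which index was which file?
	fmt.Println(results)
}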

Thanks,
Joe

Please post some of the code you are using so we can get a better idea of what you are trying to accomplish.

Cheers,
Yamil

Basically, I have a block of code that looks like the one below.
Currently, I have just been trying to think of ways to either clean it up or re-use code. I do not have a programming background, but I feel like there is a better way.

hostlist, _ := os.Open("file.txt")
//hostlist, _ := os.Open("temp")
filesubnetblacklist, _ := os.Open("file2.txt")
gooddomainlist, _ := os.Open("file3.txt")
defer hostlist.Close()
defer filesubnetblacklist.Close()
defer gooddomainlist.Close()
hostbuff := bufio.NewScanner(hostlist)
fileScanner := bufio.NewScanner(filesubnetblacklist)

domainbuff := bufio.NewScanner(gooddomainlist)

for domainbuff.Scan() {
    s := strings.Split(domainbuff.Text(), "\n")
    ngdomain = append(ngdomain, s[0])
}
for fileScanner.Scan() {
    s := strings.Split(fileScanner.Text(), "\n")
    subnet = append(subnet, s[0])
}
for hostbuff.Scan() {
    iplist = append(iplist, hostbuff.Text())
}
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	iplist, err := read("file.txt")
	if err != nil {
		panic(err)
	}
	subnet, err := read("file2.txt")
	if err != nil {
		panic(err)
	}
	ngdomain, err := read("file3.txt")
	if err != nil {
		panic(err)
	}

	fmt.Println(iplist, subnet, ngdomain)
}

func read(file string) ([]string, error) {
	f, err := os.Open(file)
	if err != nil {
		return nil, err
	}

	defer f.Close()

	var res []string
	s := bufio.NewScanner(f)
	for s.Scan() {
		res = append(res, strings.Split(s.Text(), "\n")[0])
		// If you just want to remove the newline at the end of the text, you should use the following code
		// res = append(res, strings.TrimRight(s.Text(), "\n"))
	}

	return res, nil
}

The three file reads are not identical. The last file read doesn’t remove the newline at the end of the lines. Not sure if that is deliberate or a mistake.
Apart from that, your suggested solution is of course perfect. Using TrimRight is indeed preferable, or the following, which I assume would be more efficient:

txt := s.Text()
res = append(res, txt[:len(txt)-1])

Perfect!

I am kicking myself a bit for not thinking of this myself… but thanks a lot for the feedback.

Cheer up! The conventional approach is to separate out the shared logic, which improves code reuse and maintainability.
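
If you also want to avoid the three nearly identical calls in main, one possible sketch is to loop over the filenames, reusing the read helper from the reply above (the file names are just the example ones):

// A possible replacement for main in the program above: loop over the
// filenames instead of calling read three times by hand.
func main() {
	files := []string{"file.txt", "file2.txt", "file3.txt"}

	// Collect each file's lines, keyed by file name.
	lists := make(map[string][]string, len(files))
	for _, name := range files {
		lines, err := read(name)
		if err != nil {
			panic(err)
		}
		lists[name] = lines
	}

	fmt.Println(lists["file.txt"], lists["file2.txt"], lists["file3.txt"])
}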

The bufio.Scanner (with its default ScanLines split function) returns lines without their line endings, so there is nothing to remove or split by.
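
A quick way to see this (a minimal, self-contained example using a strings.Reader in place of a file):

package main

import (
	"bufio"
	"fmt"
	"strings"
)

func main() {
	// The default split function, bufio.ScanLines, strips the trailing
	// newline (and a preceding carriage return, if any) from each token.
	s := bufio.NewScanner(strings.NewReader("example.com\n10.0.0.0/8\r\n"))
	for s.Scan() {
		fmt.Printf("%q\n", s.Text()) // prints "example.com", then "10.0.0.0/8"
	}
}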

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.