Limit goroutines

Hi all,
The code below parses the input string of URLs into a slice of strings:

inputString := flag.String("input", "http://golang.org\nhttp://golang.org", "url's")
flag.Parse()
k := strings.Split(*inputString, "\n")

k =
[http://golang.org
http://golang.org]

How can I run only 5 goroutines that use the GET method to do something with the addresses in the slice, and handle each address only once - if one goroutine works with string A, no other goroutine should ever use that string?
Thanks!

Use a buffered channel, for example like this:

http://jmoiron.net/blog/limiting-concurrency-in-go/
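
A minimal sketch of that idea (assuming the URLs are already in a slice k, with the actual GET request left as a TODO): a buffered channel with capacity 5 acts as a counting semaphore, so at most 5 goroutines do the work at any one time, and the range loop hands out each URL exactly once.

package main

import (
	"fmt"
	"sync"
)

func main() {
	k := []string{"http://golang.org", "http://golang.org"}

	sem := make(chan struct{}, 5) // capacity 5 = at most 5 goroutines working at once
	var wg sync.WaitGroup

	for _, u := range k {
		wg.Add(1)
		go func(u string) {
			defer wg.Done()
			sem <- struct{}{}        // take a slot; blocks while all 5 slots are in use
			defer func() { <-sem }() // give the slot back
			// TODO: do the GET request for u here.
			fmt.Println("processed", u)
		}(u)
	}

	wg.Wait()
}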


Your code does not include how you currently use goroutines to handle the URLs in k. To help you, we need to

  • see your code
  • know how large k can be
  • know how many goroutines you want to run to handle the URLs.

I do something like this, but I need some way to enforce that only 5 goroutines work in parallel:

package main

import (
	"flag"
	"fmt"
	"strings"
	"net/http"
	"io/ioutil"
	"sync"
	"strconv"
)

func main() {

	var wg sync.WaitGroup
	inputString := flag.String("input", "http://golang.org\nhttp://golang.org", "url's")
	flag.Parse()

	k := strings.Split(*inputString, "\n")
	wg.Add(len(k)) // wrong way: this starts one goroutine per URL instead of limiting to 5

	var result = 0
	for _, v := range k {
		go func(s string) {
			defer wg.Done()
			url, total := callGet(s)
			result = result + total // note: unsynchronized update of result is a data race
			fmt.Printf("Count for " + url + ":" + strconv.Itoa(total) + "\n")
		}(v)
	}

	wg.Wait()
	fmt.Println("Total ", result)
}


func callGet(s string) (name string, value int) {
	r, err := http.Get(s)
	if err != nil {
		return s, 0 // the request failed, count nothing for this URL
	}
	defer r.Body.Close()

	body, _ := ioutil.ReadAll(r.Body)
	name = s
	value = strings.Count(string(body), "Go")

	return
}

If you want only five goroutines, you must not start len(k) goroutines. Only start five, and use channels to pass URLs to them and results back from them.

Try something like this:

package main

import (
	"fmt"
	"strings"
	"sync"
)

func main() {
	input := `http://www.example.com/1
http://www.example.com/2
http://www.example.com/3
http://www.example.com/4
http://www.example.com/5
http://www.example.com/6
http://www.example.com/7
http://www.example.com/8
http://www.example.com/9
http://www.example.com/10`
	urls := strings.Split(input, "\n")

	// Channel to pass URLs to.
	work := make(chan string)

	// Channel to pass results to.
	results := make(chan string)

	// Start three worker goroutines.
	var done sync.WaitGroup
	for i := 0; i < 3; i++ {
		done.Add(1)
		go func() {
			defer done.Done()
			for w := range work {
				// TODO: Do some real work on the task w.
				results <- fmt.Sprintf("my result for '%s': '%s'", w, strings.ToUpper(w))
			}
		}()
	}

	// Wait for the WaitGroup, then close the results channel.
	go func() {
		done.Wait()
		// All worker goroutines have finished, no more results will come.
		close(results)
	}()

	// Pass all URLs as tasks to the work channel.
	go func() {
		for _, url := range urls {
			work <- url
		}
		// All URLs passed, we can close the work channel.
		close(work)
	}()

	// Wait for the results.
	var combinedResult string
	for result := range results {
		combinedResult += result + "\n"
	}

	// We are done.
	fmt.Println(combinedResult)
}

See also https://goplay.space/#o9nqqSCswq1


Check out the semaphore package and examples. https://godoc.org/golang.org/x/sync/semaphore#example-package--WorkerPool

This is basically what the example in those docs is doing - limiting the number of goroutines in a worker pool.
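
For reference, here is a condensed sketch of that worker-pool pattern (the URL slice and the work inside the goroutine are placeholders): Acquire(ctx, 1) blocks once 5 goroutines are running, and acquiring the full weight at the end waits for all of them to finish.

package main

import (
	"context"
	"fmt"

	"golang.org/x/sync/semaphore"
)

func main() {
	ctx := context.Background()
	const limit = 5
	sem := semaphore.NewWeighted(limit)

	urls := []string{"http://golang.org", "http://golang.org"}
	for _, u := range urls {
		// Blocks until one of the 5 slots is free.
		if err := sem.Acquire(ctx, 1); err != nil {
			break
		}
		go func(u string) {
			defer sem.Release(1)
			// TODO: do the GET request for u here.
			fmt.Println("processed", u)
		}(u)
	}

	// Acquiring the full weight waits until every running goroutine has released its slot.
	if err := sem.Acquire(ctx, limit); err == nil {
		fmt.Println("all done")
	}
}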


Here is an example with worker pools:

https://gobyexample.com/worker-pools


The use of buffered channels

jobs := make(chan int, 100)
results := make(chan int, 100)

in this example only works if the number of jobs is not larger than the buffer size. The example uses the ridiculous buffer size of 100 for five jobs, which is bad practice.

This is not a very flexible approach: it can easily lead to blocked goroutines, or to goroutines that can't finish their work before the program terminates. Synchronisation using a semaphore or a waitgroup should be used instead.
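
As a rough sketch (not taken from the linked example), moving the sending loop into its own goroutine removes the dependency on the buffer size; the channels can then even be unbuffered:

package main

import "fmt"

func main() {
	const numJobs = 5
	jobs := make(chan int)    // unbuffered: the buffer size no longer matters
	results := make(chan int)

	// Three workers.
	for w := 0; w < 3; w++ {
		go func() {
			for j := range jobs {
				results <- j * 2 // stand-in for real work
			}
		}()
	}

	// Sending from its own goroutine, so main never blocks on a full buffer.
	go func() {
		for j := 1; j <= numJobs; j++ {
			jobs <- j
		}
		close(jobs)
	}()

	// We know exactly how many results to expect.
	for i := 0; i < numJobs; i++ {
		fmt.Println(<-results)
	}
}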


I wrote an article that describes different ways to limit concurrency. It includes working code and uses the same example to illustrate all the different methods shown.

https://pocketgophers.com/limit-concurrent-use/

