Retrying HTTP requests

I need to make quite a few HTTP GET/POST requests in my script, so I moved the request logic into a package and added a simple retry in case the first request fails.
This is my code:

func Request(req *http.Request, client *http.Client, counter int) ([]byte, error) {
	resp, httperr := client.Do(req)
	if httperr != nil {
		switch counter {
		case 0:
			fmt.Println("Retrying second time...")
			counter += 1
			return Request(req, client, counter)
		case 1:
			fmt.Println("Retrying third time...")
			counter += 1
			return Request(req, client, counter)
		case 2:
			return nil, fmt.Errorf("Couldn't get a response from remote server: %v", httperr)
		}
	}
	defer resp.Body.Close()
	body, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		return nil, fmt.Errorf("Error while reading the response body: %v", err)
	}
	return body, nil
}

Since I’m a noob to Go and programming in general, could anybody review my function?
Any pointers on making it more robust or cleaner? For example, should I call time.Sleep for 2 seconds between each recursive call?


I understand my code is noob level, but any help regarding pitfalls in this function, or tips to improve it, is really appreciated.

Sure! Here are some off-the-cuff thoughts:

  • This would be clearer as a loop instead of recursion, so the caller doesn’t have to pass an initial zero counter.
  • You should probably look closer at the returned error to determine whether the request should be retried at all.
  • If you do retry, sleeping a bit between attempts is nice.
  • Don’t capitalize error strings, and wrap errors with something like fmt.Errorf’s %w verb, which retains the original error for inspection.
  • If you want retries, maybe also handle errors from ReadAll, as that’s where you’ll see some of the network errors.
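Putting those suggestions together, a loop-based version might look something like this. The retry count, the fixed 2-second delay, and the argument order are illustrative choices, not something from your original code:

package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// Request performs req up to maxRetries+1 times, sleeping between attempts.
// Note: retrying a request that has a body would need req.GetBody to
// recreate the body for each attempt; this sketch assumes GET-style requests.
func Request(client *http.Client, req *http.Request) ([]byte, error) {
	const maxRetries = 2          // illustrative limit: three attempts total
	const delay = 2 * time.Second // illustrative fixed backoff

	var lastErr error
	for attempt := 0; attempt <= maxRetries; attempt++ {
		if attempt > 0 {
			time.Sleep(delay)
		}
		resp, err := client.Do(req)
		if err != nil {
			// Wrap with %w so callers can inspect the original error
			// via errors.Is / errors.As.
			lastErr = fmt.Errorf("request failed: %w", err)
			continue
		}
		body, err := io.ReadAll(resp.Body)
		resp.Body.Close()
		if err != nil {
			// Reading the body can also fail on network errors, so it
			// counts as a retryable failure here.
			lastErr = fmt.Errorf("reading response body: %w", err)
			continue
		}
		return body, nil
	}
	return nil, fmt.Errorf("giving up after %d attempts: %w", maxRetries+1, lastErr)
}

In a real version you’d probably also inspect the error (or the status code) to decide whether a retry makes sense, rather than retrying unconditionally.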

Thank you.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.