Running a background process with logs

Hello, I want to run a Perl script as a background process and redirect its output to a file.

Here is the print script (the shebang line is needed so os.StartProcess can execute the file directly):

```perl
#!/usr/bin/perl
$i = 10; while ($i--) { print time() . "\n"; sleep 5; }
```

Here is how I run it:

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// Open the log file that the child process will write to.
	file, err := os.OpenFile("/tmp/out.log", os.O_RDWR|os.O_CREATE, 0755)
	if err != nil {
		fmt.Printf("failed to open log file: %v\n", err)
		return
	}

	attr := os.ProcAttr{
		Env: os.Environ(),
		// stdin, stdout, stderr of the child: send its output to the log file.
		Files: []*os.File{nil, file, file},
	}
	process, err := os.StartProcess("/tmp/print", []string{"/tmp/print"}, &attr)
	if err != nil {
		fmt.Printf("failed to start process: %v\n", err)
		return
	}

	// Detach so the script keeps running after this program exits.
	if err := process.Release(); err != nil {
		fmt.Printf("failed to release: %v\n", err)
	}
}
```
This works, but the out.log file is not filled until the script has finished. If I use os.Stdout instead of a file, then the output is printed immediately line by line.

Since the script is supposed to run for a long time, I would like to see the actual output in the log file. Is there a way to do this?

There is nothing to fix in the Go code; it looks good to me. The actual problem is with your Perl script. I'm no expert in this language, but after a bit of searching my guess is that print does not actually write the data out immediately. While the handle stays open, Perl buffers the output and writes it in large chunks so it can use the fewest possible syscalls; the buffered lines are only guaranteed to reach the file when the buffer fills up or the handle is closed. You can flush the buffer manually so your prints show up in the file in real time, or reopen the file in append mode every time you want to write something. You can check this article and try adding that to your Perl code.
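The same buffering behaviour is easy to reproduce in Go itself: a bufio.Writer holds data in memory until you call Flush, so the file stays empty even after a "successful" write. A minimal sketch (the function name demoBufferedSizes is mine):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
)

// demoBufferedSizes writes one line through a bufio.Writer and reports the
// file size before and after Flush, showing that buffered data does not
// reach the file until it is flushed.
func demoBufferedSizes() (before, after int64) {
	f, err := os.CreateTemp("", "buffered-*.log")
	if err != nil {
		panic(err)
	}
	defer os.Remove(f.Name())
	defer f.Close()

	w := bufio.NewWriter(f)
	fmt.Fprintln(w, "hello") // 6 bytes sit in the in-memory buffer

	info, _ := f.Stat()
	before = info.Size() // still 0: nothing has reached the file yet

	w.Flush() // push the buffered bytes out in a single write
	info, _ = f.Stat()
	after = info.Size() // now 6
	return before, after
}

func main() {
	before, after := demoBufferedSizes()
	fmt.Printf("before flush: %d bytes, after flush: %d bytes\n", before, after)
}
```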

You are right.
I've also tested with Bash and Python scripts that flush stdout, and the Go code works as expected.
Thank you very much!

In Perl, switch on autoflush by setting $| = 1; at the top of the script.
See perlvar (Perl predefined variables) in the Perldoc Browser.
