Redirect output from file to stdout
If the program can only write to a file, then you could use a named pipe:
pipename=/tmp/mypipe.$$
mkfifo "$pipename"
./myprog -o "$pipename" &
while read line
do
    echo "output from myprog: $line"
done < "$pipename"
rm "$pipename"
First we create the pipe. We put it in /tmp to keep it out of the way of backup programs; the $$ is our PID, which makes the name unique at runtime.
We run the program in the background, and it will block trying to write to the pipe. Some programs use a technique called "memory mapping", in which case this will fail, because a pipe cannot be memory-mapped (a good program would check for this).
Then we read the pipe in the script as we would any other file.
Finally we delete the pipe.
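To try the steps above without a real myprog, you can substitute any command that writes to the pipe; here seq plays that role (purely illustrative, it is not part of the original answer):

```shell
# Create a uniquely named pipe, kept in /tmp
pipename=/tmp/mypipe.$$
mkfifo "$pipename"

# Stand-in for: ./myprog -o "$pipename" &
# It blocks until the reader below opens the pipe.
seq 1 3 > "$pipename" &

# Read the pipe as we would any other file
while read line
do
    echo "output: $line"
done < "$pipename"

# Clean up the pipe
rm "$pipename"
```

Running this prints output: 1 through output: 3, exactly as if the writer and reader were using a regular file, but with no data ever touching the disk.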
You can cat the contents of the file written by myprog:
myprog -o tmpfile input_file && cat tmpfile
This would have the described effect -- allowing you to pipe the output of myprog to some subsequent command -- although it is a different approach than you had envisioned. If the output of myprog (perhaps more aptly notmyprog) is too big to write to disk, this approach would not be a good one.
A solution that cleans up the temp file on the same line and still pipes the contents out at the end would be:
myprog -o tmpfile input_file && contents=`cat tmpfile` && rm tmpfile && echo "$contents"
This stores the contents of the file in a variable so that it can be accessed after the file is deleted. Note the quotes around the argument to echo: they are important for preserving newlines in the file contents.
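A quick way to see why the quotes matter: the sketch below (using a throwaway tmpfile, not the original myprog) compares quoted and unquoted expansion of the variable. Without quotes, the shell performs word splitting and the newlines are collapsed to spaces.

```shell
# Write two lines to a temp file, capture them, then delete the file
printf 'line one\nline two\n' > tmpfile
contents=`cat tmpfile`
rm tmpfile

echo "$contents" | wc -l   # quoted: newlines preserved, counts 2 lines
echo $contents | wc -l     # unquoted: collapsed to one line, counts 1
```

Note that command substitution strips any trailing newline from the captured output; echo adds one back, which is why the quoted form reproduces the file's line count.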