Logging stderr and stdout to log file and handling errors in bash script [duplicate]



Try this:

mylogger() { printf "Log: %s\n" "$(</dev/stdin)"; }

mysqldump ... 2>&1 >dumpfilename.sql | mylogger


Both Cyrus's answer and Oleg Vaskevich's answer offer viable solutions for redirecting stderr to a shell function.

What they both imply is that it makes sense for your function to accept stdin input rather than expecting input as an argument.

To explain the idiom they use:

mysqldump ... 2>&1 > stdout-file | log-func-that-receives-stderr-via-stdin
  • 2>&1 redirects stderr to the original stdout
    • from that point on, any stderr output is sent to stdout
  • > stdout-file then redirects the original stdout to file stdout-file
    • from that point on, any stdout output is redirected to the file.

Since > stdout-file comes after 2>&1, the net result is:

  • stdout output is redirected to file stdout-file
  • stderr output is sent to [the original] stdout

Thus, log-func-that-receives-stderr-via-stdin only receives the previous command's stderr output through the pipe, via its stdin.
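As a quick check of this idiom, the following sketch uses a hypothetical stand-in command (emit_both) in place of mysqldump, and an illustrative file path; only the stderr line reaches the pipe:

```shell
#!/usr/bin/env bash
# Sketch of `cmd 2>&1 >stdout-file | log-func`: stderr goes to the pipe,
# stdout goes to the file. emit_both and /tmp/dump.out are illustrative.
emit_both() {
  echo "normal output"       # written to stdout
  echo "error output" >&2    # written to stderr
}

mylogger() { sed 's/^/Log: /'; }   # prefixes every stdin line

emit_both 2>&1 >/tmp/dump.out | mylogger
# prints:  Log: error output
# /tmp/dump.out contains:  normal output
```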

Similarly, your original approach - command 2> >(logFunction) - works in principle, but requires that your log() function read from stdin rather than expect arguments:

The following illustrates the principle:

ls / nosuchfile 2> >(sed 's/^/Log: /') > stdout-file
  • ls / nosuchfile produces both stdout and stderr output.
  • stdout output is redirected to file stdout-file.
  • 2> >(...) uses an [output] process substitution to redirect stderr output to the command enclosed in >(...) - that command receives input via its stdin.
    • sed 's/^/Log: /' reads from its stdin and prepends string Log: to each input line.

Thus, your log() function should be rewritten to process stdin:

  • either: by implicitly passing the input to another stdin-processing utility such as sed or awk (as above).
  • or: by using a while read ... loop to process each input line in a shell loop:
log() {
  # `read` reads from stdin by default
  while IFS= read -r line; do
    printf 'STDERR line: %s\n' "$line"
  done
}

mysqldump ... 2> >(log) > stdout-file
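The same pattern can be exercised end to end with a hypothetical stand-in command (emit_both) and illustrative file paths; note that the process substitution runs asynchronously, so a script that immediately reads the log file may need to wait for it:

```shell
#!/usr/bin/env bash
# End-to-end sketch of `command 2> >(log) > stdout-file`.
# emit_both stands in for mysqldump; the /tmp paths are illustrative.
emit_both() {
  echo "to stdout"          # regular output
  echo "to stderr" >&2      # diagnostic output
}

log() {
  while IFS= read -r line; do
    printf 'STDERR line: %s\n' "$line"
  done
}

emit_both 2> >(log >/tmp/err.log) >/tmp/out.log
wait "$!"   # bash >= 4.4: wait for the process substitution to drain
```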


Let's suppose that your log function looks like this (it just echoes the first argument):

log() { echo "$1"; }

To save the stdout of mysqldump to some file and call your log() function for every line in stderr, do this:

mysqldump 2>&1 >/your/sql_dump_file.dat | while IFS= read -r line; do log "$line"; done
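To see this loop in action without a real database, here is a sketch with a hypothetical stand-in command (make_noise) and an illustrative dump path:

```shell
#!/usr/bin/env bash
# Sketch: stdout goes to the dump file, each stderr line is passed to log().
# make_noise and /tmp/sql_dump.dat are illustrative stand-ins.
make_noise() {
  echo "INSERT INTO t VALUES (1);"      # stdout: the dump itself
  echo "Warning: table t is large" >&2  # stderr: a diagnostic
}

log() { echo "logged: $1"; }

make_noise 2>&1 >/tmp/sql_dump.dat | while IFS= read -r line; do
  log "$line"
done
# prints:  logged: Warning: table t is large
```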

If you wanted to use xargs, you could do it this way. However, you'd be starting a new shell every time.

export -f log

mysqldump 2>&1 >/your/sql_dump_file.dat | xargs -L1 bash -c 'log "$@"' _
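To see the xargs variant in isolation, here is a minimal sketch; the input lines (alpha, beta) are purely illustrative, and each line spawns a fresh bash that can see the exported function:

```shell
#!/usr/bin/env bash
# Sketch of the xargs variant: one new bash per input line, each
# calling the exported log function. Input lines are illustrative.
log() { echo "LOG: $1"; }
export -f log   # make log visible to the child bash processes

printf 'alpha\nbeta\n' | xargs -L1 bash -c 'log "$@"' _
# prints:
#   LOG: alpha
#   LOG: beta
```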