bash: set -x logs to file

Solution 1

After more than a year I've found the right solution to get both the "normal" output (stdout + stderr, without the bash trace) on the screen and everything together (stdout + stderr + bash trace) in a file (bash.log):

# Duplicate stdout: keep it on the screen and append it to bash.log
exec   > >(tee -ia bash.log)
# Duplicate stderr: keep it on the screen and append it to bash.log
exec  2> >(tee -ia bash.log >&2)
# Open fd 19 on bash.log for the trace output (log file only, not the screen)
exec 19> bash.log

# Send the set -x trace to fd 19
export BASH_XTRACEFD="19"
set -x

command1
command2
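
As a quick sanity check (a sketch, not part of the original answer; the echo and the failing ls are just placeholder commands), this is where each kind of output ends up once the redirections above are in place:

echo "regular stdout"   # appears on the screen and in bash.log
ls /nonexistent         # the error message appears on the screen and in bash.log
                        # the '+ echo ...' and '+ ls ...' trace lines go to bash.log only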

Solution 2

Based on this ServerFault answer, Send bash -x output to logfile without interrupting standard output, modern versions of bash provide a BASH_XTRACEFD variable specifically for specifying an alternate file descriptor for the output of set -x.

So for example you can do

#!/bin/bash

# Open fd 19 on logfile and point the set -x trace output at it
exec 19>logfile
BASH_XTRACEFD=19

set -x
command1
command2
...

to send the output of set -x to the file logfile while preserving the regular standard output and standard error streams of the following commands.

Note that the use of fd 19 is arbitrary - it just needs to be an available descriptor (i.e. not 0, 1, 2, or another number that you have already allocated).
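
BASH_XTRACEFD was added in bash 4.1, so if the script might also run under an older bash, a guard along these lines is one option (a sketch; logfile is just the example name used above):

#!/bin/bash

# Divert the trace only if this bash understands BASH_XTRACEFD (4.1 or newer);
# on older shells the trace simply stays on stderr.
if (( BASH_VERSINFO[0] > 4 || (BASH_VERSINFO[0] == 4 && BASH_VERSINFO[1] >= 1) )); then
    exec 19>logfile
    BASH_XTRACEFD=19
fi

set -x
command1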

Solution 3

Steeldriver gave you one approach. Alternatively, you can simply redirect STDERR to a file:

script.sh 2> logfile

That, however, means that both the output created by the set -x option and any other error messages produced will go to the file. Steeldriver's solution redirects only the set -x output, which is probably what you want.
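
If you would rather keep the errors (and the trace) visible on the terminal while still capturing them, a tee in a process substitution is one option (a sketch, not part of this answer; script.sh and logfile are the placeholder names used above):

script.sh 2> >(tee logfile >&2)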

Solution 4

Automatic File Descriptor

Just to improve slightly on the accepted answer (which I'll otherwise keep intact): you can use the Bash 4.1+ automatic file descriptor allocation, exec {any_var}>file, which lets the shell pick a free file descriptor (numbered 10 or above) for you - no need to hard-code it.

exec  1> >(tee -ia bash.log)
exec  2> >(tee -ia bash.log >& 2)

# Notice no leading $ - bash picks a free fd and stores its number in FD
exec {FD}> bash.log

# Or, if you want to append instead of wiping previous logs, use this instead:
# exec {FD}>> bash.log

export BASH_XTRACEFD="$FD"
set -x

# Just for fun, add this to show the open file descriptors in this context
# and see the fd pointing at your bash.log file (filan ships with socat)
filan -s

command1
command2
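
If the script continues after the traced section, you can switch tracing off and let bash release the descriptor again (a sketch; per the bash manual, the fd referenced by BASH_XTRACEFD is closed when the variable is unset):

set +x                # stop tracing
unset BASH_XTRACEFD   # bash closes the allocated fd when the variable is unset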

Comments

  • redseven, almost 2 years ago

    I have a shell script with set -x to have verbose/debug output:

    #!/bin/bash
    
    set -x
    command1
    command2
    ...
    

    The output looks like this:

    + command1
    whatever output from command1
    + command2
    whatever output from command2
    

    My problem is that the shell output (caused by set -x) goes to stderr, mixed with the output of the commands (command1, command2, ...). I would be happy to have the "normal" output on the screen (as if the script were run without set -x) and the "extra" output of bash separately in a file.

    So I would like to have this on the screen:

    whatever output from command1
    whatever output from command2
    

    and this in a log file:

    + command1
    + command2
    

    (also fine if the log file has everything together)

    The set -x 2> file approach obviously doesn't have the desired effect, because the trace is not the output of the set command itself; set -x only changes the behaviour of the shell.

    Using bash 2> file for the entire script also doesn't do the right thing, because it redirects the stderr of every command that runs in this shell as well, so I no longer see the error messages of the commands.

  • redseven, over 7 years ago
    "...both the output created by the set -x option and any other error messages produced will go to the file." And that's why it doesn't work for me. My main issue is that I don't easily see the "real errors", because all this bash output goes to stderr. Redirecting the commands' error messages would also hide the "real errors" from me, just in a different way.
  • terdon, over 7 years ago
    @redseven I'm afraid what you're asking for isn't very clear then. Could you edit your question and clarify? Try avoiding the use of the term "output" for anything that isn't going to stdout. Do you want to separate i) normal output; ii) any errors thrown by your command and iii) set -x? Isn't steeldriver's answer enough then? If not, show us a simple script that we can copy and tell us how you'd want it to behave.
  • redseven, over 7 years ago
    @steeldriver has already perfectly answered the question.
  • terdon, over 7 years ago
    @redseven ah, cool then. Since your comment came so much later, I thought you still needed something and I couldn't see how steeldriver's answer failed to solve your issue. Glad it's all sorted then.
  • redseven, over 6 years ago
    It indeed saves the bash trace log separately, however it makes it really hard to read the two outputs (stdout + stderr on the screen and the bash trace in the log file) as they are completely out of sync. See the solution I've just posted.
  • jarno, over 4 years ago
    That is just the combination of steeldriver's answer and this one.
  • Liso, over 3 years ago
    Doesn't work for me: the bash trace is not displayed on the screen, but the logfile contains everything needed.
  • redseven, over 3 years ago
    @Liso That's exactly how it is supposed to work. On the screen you have stdout and stderr just like before, and in the log file you have both of them plus the trace.
  • Liso, over 3 years ago
    @redseven Well, I originally wanted them all to appear both on the screen and in the log.
  • redseven, over 3 years ago
    @Liso That's very easy and doesn't require any of the tricks discussed here. You simply turn on tracing and then redirect all the output of your script to a tee...