Echo commands and output to file from interactive shell
Solution 1
The `set -x` option puts your script in debugging mode, where each command is echoed to stderr before it is run. So you need to redirect both stdout and stderr to the same file. Try this:
```shell
your_script.sh &> outputfile.txt
```
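As a minimal sketch of what that captures (script name, contents, and file paths here are hypothetical):

```shell
# Create a small hypothetical script that traces itself with set -x.
cat > /tmp/your_script.sh <<'EOF'
#!/bin/bash
set -x            # echo each command to stderr before running it
echo hello
EOF

# Redirect both stdout (the command output) and stderr (the trace) to one file.
bash /tmp/your_script.sh &> /tmp/outputfile.txt

# The file holds the trace line "+ echo hello" followed by "hello".
cat /tmp/outputfile.txt
```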
Solution 2
The `xtrace` output goes to stderr before the command is evaluated, and in particular before its redirections are performed. So in:

```shell
set -x # or set -o xtrace
echo test 2> file
```

the `+ echo test` goes to wherever stderr was going, then the shell opens `file` on fd 2 and runs `echo test`. If you want the `+ echo test` to go to `file`, you'd need:
```shell
{ echo test; } 2> file
```
This time, the redirection is performed for the command group, and then the `echo test` command is evaluated (and the `+ echo test` written to stderr, which at that point goes to `file`).
Note that some shells like AT&T `ksh` will also output a `+ 2> file`. It's also bogus in some versions of `mksh`.
That also means that the stderr of any command run within that command group will go to `file`. To work around that, you'd need:
```shell
set -x
{
  cmd 2>&3 3>&-
} 3>&2 2> file
```
That is, keep a copy of the original stderr (on fd 3) and restore it for the commands inside the command group.
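A runnable sketch of that fd juggling (paths hypothetical), using `ls` on a missing path so the command has some stderr of its own:

```shell
(
  set -x
  {
    ls /nonexistent 2>&3 3>&-   # the command's own stderr escapes to fd 3
  } 3>&2 2> /tmp/xtrace_only
) 2> /tmp/real_stderr || true   # ls fails; ignore its exit status here

# /tmp/xtrace_only holds only the trace; /tmp/real_stderr holds ls's error.
cat /tmp/xtrace_only
```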
With `bash`, an alternative is to use the `$BASH_XTRACEFD` special variable:
```shell
exec 7> file
BASH_XTRACEFD=7
set -x
echo test
```
If you want nicer `xtrace` output, try `zsh`. Also note that the trace prefix can be customized with the `$PS4` special variable.
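For example (the prefix string here is an arbitrary choice), `$PS4` can be made to include the line number of each traced command:

```shell
(
  PS4='+ line ${LINENO}: '   # expanded each time a trace line is printed
  set -x
  echo hello
) 2> /tmp/ps4_trace

# The trace line now looks like "+ line N: echo hello".
cat /tmp/ps4_trace
```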
JimTek
Updated on September 18, 2022

Comments
- JimTek almost 2 years:
I am looking for a graceful way to have bash write both the command I executed, followed by the output of that command to a file from an interactive prompt. Such that running a command like this:
```shell
$ ls -alh > list_dir
```
would write something resembling this to list_dir:
```
$ ls -alh > list_dir
total 12K
drwxr-xr-x 2 root root 4.0K Dec 21 13:30 .
drwx------ 5 root root 4.0K Dec 21 13:30 ..
-rw-r--r-- 1 root root  842 Dec 21 13:09 file
-rw-r--r-- 1 root root    0 Dec 21 13:29 file1
-rw-r--r-- 1 root root    0 Dec 21 13:29 file2
```
I have found many articles with solutions that claim to work within a script, but nothing that has worked from an interactive shell. For example, many people suggest adding this toward the top of a script:
```shell
set -x
```
If I enter this in my shell, the commands are printed (although not nicely) to standard output, but I have not managed to figure out how to include them in output redirection.
I also saw many people suggest using `script` to do this. This is the closest I've come to finding what I'm looking for, but I don't always want EVERYTHING to be recorded. I want to be in control of which commands are redirected to a file.
Does anyone know of a good solution to this? I'm open to more out-of-the-box ideas so long as they're not too clunky for regular use.
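One out-of-the-box sketch along those lines (function and file names hypothetical): a wrapper that logs only the commands you choose. Note it only handles plain commands, not shell operators like pipes or redirections inside the logged command.

```shell
# Hypothetical helper: append the command line, then its output, to a log file.
logcmd() {
  local logfile=$1; shift
  printf '$ %s\n' "$*" >> "$logfile"   # record the command itself
  "$@" >> "$logfile" 2>&1              # then its stdout and stderr
}

# Usage from an interactive prompt:
rm -f /tmp/list_dir
logcmd /tmp/list_dir ls -alh
head -n 1 /tmp/list_dir   # first line reads: $ ls -alh
```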
- Paul H. over 7 years: Do you require the redirection to be part of the command listed in the file? For instance, in your example, would having `ls -ahl` instead of `ls -ahl > list_dir` as the first line of the file be acceptable?
- icarus over 7 years: Would you take an `expect` script which does the same as `script` except it allows you to type something like `~l` to turn logging on and off?