Bash scripting, checking for errors, logging


Solution 1

For logging, you can wrap sections of your script in curly braces and redirect the group's standard output to a log file:

{
    script_command_1
    script_command_2
    script_command_3
} >> /path/to/log_file
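If you also want error messages in the log, add `2>&1` so standard error follows standard output into the same file. A minimal sketch, with an illustrative log path:

```shell
LOG=/tmp/backup.log    # illustrative path; pick a real location for a cron job

{
    echo "backup started"
    echo "something went wrong" 1>&2    # stderr is captured too
    echo "backup finished"
} >> "$LOG" 2>&1                        # 2>&1 sends stderr to the same file
```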

Solution 2

For logging, you can arrange for all output written on standard output and/or standard error to go to a file. That way, you don't need to redirect the output of each command:

# Save standard output and standard error
exec 3>&1 4>&2
# Redirect standard output to a log file
exec 1>/tmp/stdout.log
# Redirect standard error to a log file
exec 2>/tmp/stderr.log

# Now the output of all commands goes to the log files
echo "This goes to /tmp/stdout.log"
echo "This goes to /tmp/stderr.log" 1>&2
...

# Print a message to the original standard output (e.g. terminal)
echo "This goes to the original stdout" 1>&3

# Restore original stdout/stderr
exec 1>&3 2>&4
# Close the unused descriptors
exec 3>&- 4>&-

# Now the output of all commands goes to the original standard output & error
...

To execute a command only if a previous one succeeds, you can chain them with conditionals:

# Execute command2 only if command1 succeeds, and command3 only if both succeed:
command1 && command2 && command3

# Execute command2 only if command1 fails
command1 || command2
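The same check can also be written as an explicit `if`, which tends to read better once the error handling grows beyond one command. A sketch with hypothetical demo paths:

```shell
# Create a file to archive (hypothetical demo paths).
echo "data" > /tmp/demo.txt

# `if` tests the exit status of tar directly.
if tar -cf /tmp/demo.tar -C /tmp demo.txt; then
    echo "archive created"
else
    echo "tar failed" 1>&2
    exit 1
fi
```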

so you can do things like

{ find . -mtime +7 -type f -print0 | xargs -0 tar -cf "${TAR}" &&
  gzip "${TAR}" &&
  find . -mtime +7 -type f -print0 | xargs -0 rm -f; } ||
    { echo "Something failed" 1>&2; exit 1; }

or provide details in the log output:

find . -mtime +7 -type f -print0 | xargs -0 tar -cf "${TAR}" ||
  { echo "find failed!!" 1>&2; exit 1; }
gzip "${TAR}" ||
  { echo "gzip failed!!" 1>&2; exit 1; }
find . -mtime +7 -type f -print0 | xargs -0 rm -f ||
  { echo "cleanup failed!!" 1>&2; exit 1; }
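One caveat with the pipelines above: `||` only sees the exit status of the last command in the pipeline (`xargs`), so a failure in `find` itself would go unnoticed. In bash, `set -o pipefail` makes the whole pipeline fail if any stage fails. A sketch with a deliberately failing first stage:

```shell
#!/bin/bash
set -o pipefail    # pipeline status = last non-zero exit (bash option)

# `false` fails, `true` succeeds; without pipefail this pipeline would
# report success, because only the last command's status is checked.
false | true ||
  echo "a pipeline stage failed" > /tmp/pipefail_demo.log
```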

Solution 3

The easy way out, though with no explicit error message: add -e to the shebang, i.e. #!/bin/sh -e, which will cause the shell to exit as soon as a command fails.

Cron should mail you the error output in that case, I'd guess.

If you want a full-blown backup scheme, though, I'd suggest using something that has already been made. There are plenty out there, and most work very well.

Solution 4

GNU tar has --remove-files, which removes files once they've been added to the archive. The v flag makes it list files as it adds them, and z pipes the tar through gzip on the fly.

Your find solution is racy; a file may come to match the criteria between the two find invocations, and thus get deleted without having been backed up.
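Putting that together, the archive-then-delete steps collapse into a single pass: find emits the file list exactly once, and tar gzips and removes as it goes, so nothing can be deleted that wasn't archived. A sketch with hypothetical demo paths (the `-mtime +7` filter is dropped for brevity); `--null`, `-T`, and `--remove-files` are GNU tar extensions:

```shell
mkdir -p /tmp/tar_demo
cd /tmp/tar_demo
echo "old data" > old.txt

# find produces the file list once; tar reads it on stdin (--null -T -),
# gzips on the fly (-z), and deletes each file after archiving it.
find . -type f -name '*.txt' -print0 |
  tar --null -T - -czf /tmp/tar_demo.tar.gz --remove-files
```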

Author by

dr Hannibal Lecter

At the time of writing this, I have 6+ years of work experience, mostly web projects but also other types. I've used various languages and frameworks: PHP, ASP.NET, C#, SQL Server, Crystal Reports, MySQL, Informix, jQuery, CakePHP, Subversion, Git, and unfortunately even C. I've also played with SQLite, RoR, Python, GTK (etc.) but I have never done a real project with them to call myself an expert. I am of a curious nature and like to learn new things: languages, tools, and project-management-related stuff. Lately I've been educating myself about all things Agile, which are non-existent in the company I work in (far fowls have fair feathers?).

Updated on April 03, 2020

Comments

  • dr Hannibal Lecter about 4 years

Here's one for the bash-fu wizards. No, actually, I'm just kidding; you'll all probably know this except me...

I'm trying to create a backup shell script. The idea is fairly simple: find files in a certain folder older than 7 days, tar/gzip them to another directory, and remove them. The problem is, I'm not sure I'll have enough permissions to create the tar/gzip file in the target dir. Is there any (proper) way to check whether the file has been created successfully, and if so, delete the source files? Otherwise, skip that part and don't destroy the customers' data. I hear they're not very fond of that.

    Here's what I have so far:

    01: #!/bin/bash
    02: 
    03: ROOTDIR="/data/www"
    04: 
    05: TAR="${ROOTDIR}/log/svg_out_xml/export_out_ack_$(date +%Y-%m-%d).tar"
    06: cd ${ROOTDIR}/exchange/export/out_ack/
    07: find . -mtime +7 -type f -print0 | xargs -0 tar -cf "${TAR}"
    08: gzip ${TAR}
    09: find . -mtime +7 -type f -print0 | xargs -0 rm -f
    

    Basically, I'd need to check if everything went fine on lines 7 and 8, and if so execute 9.

    Additionally, I'd like to make a log file of these operations so I know everything went fine (this is a nightly cron job).