Delete files older than X days on remote server with SCP/SFTP

Solution 1

This question is very old, but I still wanted to add my bash-only solution, as I was searching for one myself when I came here. The grep tar in the listing command is just there to pick out tar files for my own purposes; it can of course be adapted.

# List the remote backups newest-first in one sftp session, keeping only tar files.
# user@host, the key path and the directory are placeholders.
RESULT=$(echo "ls -t path/to/old_backups/" | sftp -i ~/.ssh/your_ssh_key user@host | grep tar)

i=0
max=7
while read -r line; do
    (( i++ ))
    if (( i > max )); then
        # Everything beyond the newest $max files gets deleted,
        # one sftp connection per file.
        echo "DELETE $i...$line"
        echo "rm $line" | sftp -i ~/.ssh/your_ssh_key user@host
    fi
done <<< "$RESULT"

This deletes all tar files in the given directory except the newest 7. It does not take the actual date into account, but if you only have one backup per day it is good enough.

Solution 2

"Sure I can write some script in Perl etc. but it's overkill."

You don't need a script to achieve the intended effect - a one-liner will do if you have shell access to send a command:

ssh user@host 'find /path/to/old_backups/* -mtime +7 -exec rm {} \;'

-mtime +7 matches files whose contents were last modified more than 7 complete 24-hour periods ago (roughly, more than a week); it looks at the modification time, not the creation time.
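
If shell access is available, the comments below suggest running this from a daily cron job. A minimal sketch, assuming key-based authentication is already set up for user@host and that the path is adjusted to your server:

# Crontab entry on the machine that initiates the cleanup; runs every night at 02:30.
30 2 * * * ssh user@host 'find /path/to/old_backups/ -type f -mtime +7 -exec rm {} \;'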

Solution 3

If you insist on SCP/SFTP, you can list the files, parse them with a simple script and delete the old backup files.

The batch mode ("-b") switch should help you out: it reads sftp commands from a file. http://linux.die.net/man/1/sftp
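
A rough sketch of that approach (the host, key path and file names are placeholders, not taken from the question): produce a listing as in Solution 1, decide locally which files are stale, then write one rm command per file into a batch file and run them all in a single non-interactive session:

sftp -b delete_old.batch -i ~/.ssh/your_ssh_key user@host

where delete_old.batch contains one command per line, for example:

rm path/to/old_backups/backup1.tar
rm path/to/old_backups/backup2.tar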

Author: Mike, who works as a programmer and systems administrator.

Updated on September 17, 2022

Comments

  • Mike
    Mike almost 2 years

    Does anyone know a good way to delete files on a remote server that are older than X days using just SCP/SFTP? Sure, I can write some script in Perl etc., but I feel it's overkill.
    Any UNIX way?
    A one-liner?
    A separate utility?

    Thanks

    P.S. The task is to delete some outdated backup files.

  • Mike
    Mike almost 14 years
    Sure it's possible, but I'm looking for a more elegant UNIX way, if one exists.
  • M_1
    M_1 over 13 years
    Do you have any idea? Maybe a suggestion from your side would help? Then we can improve the idea a bit.
  • Mike
    Mike over 13 years
    Sadly, this uses SSH and a remote one-liner. There is no shell access, just SCP/SFTP.
  • danlefree
    danlefree over 13 years
    @Mike - Well that one-liner can save you some time over writing a perl script, if that is the case - you could use atime instead of mtime to match the last access time (i.e. when your files were last downloaded) and run a daily cron job.
  • Mike
    Mike over 13 years
    There is no shell access to the remote machine.
  • danlefree
    danlefree over 13 years
    @Mike I was under the impression that you could negotiate adding a cron job with the administrator of the server hosting your backup files - my apologies if this is not possible.
  • chicks
    chicks over 7 years
    While this is interesting and all, doesn't it make more sense to make one connection which completes the entire task instead of a connection for each file to be deleted plus one more to get the list of files?
  • dirkaholic
    dirkaholic over 7 years
    I'm not sure it is possible to run a sequence of commands like this over one connection, and I think it would be over-optimizing as well. For the use case of deleting old backup files from a backup that runs once a day, it means you effectively make 2 ssh connections per day: one for the list and one for the single file that now exceeds max. I think that is quite an acceptable tradeoff. (A sketch of a variant that batches the deletions into a single second session follows these comments.)
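
For reference, a hedged sketch of how Solution 1 could batch all deletions into one additional sftp session instead of opening a connection per file. The host, key path and directory are placeholders; it relies on sftp accepting a batch file of "-", which means standard input:

# List the remote backups newest-first and keep only tar files.
RESULT=$(echo "ls -t path/to/old_backups/" | sftp -i ~/.ssh/your_ssh_key user@host | grep tar)

i=0
max=7
batch=""
while read -r line; do
    (( i++ ))
    if (( i > max )); then
        # Collect one rm command per stale file instead of deleting immediately.
        batch+="rm $line"$'\n'
    fi
done <<< "$RESULT"

# Send all collected rm commands to the server in a single sftp session.
if [ -n "$batch" ]; then
    printf '%s' "$batch" | sftp -b - -i ~/.ssh/your_ssh_key user@host
fi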