rsync from unmounted webdav source


As an alternative to rsync, one way you could accomplish this task is by using wget with the -m (--mirror) switch. This applies -r (recursion), -N (time-stamping, so a file is only re-fetched if the remote copy is newer than the local one) and an infinite recursion depth, which you may or may not wish to curtail using the -l switch; it also applies --no-remove-listing (an FTP transfer option that I don't think matters when transferring via HTTP).
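
A minimal sketch with just that switch against your camera's address (the -l 2 depth limit is only an illustration; -m alone implies infinite depth):

    wget -m -l 2 http://192.168.0.22/webdav/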

We will also need to target the directory for storage explicitly to avoid dumping everything into the directory in which wget was launched (the default behaviour, as I recall).

We can do this with the -P switch, as in -P target or --directory-prefix=target.
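
For instance, to drop everything under a directory named target (a placeholder) instead of the current directory:

    wget -m -P target http://192.168.0.22/webdav/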

You will also likely want to avoid climbing above the source directory you are targeting; the -np (--no-parent) switch takes care of that:

Do not ever ascend to the parent directory when retrieving recursively.
This is a useful option, since it guarantees that only the files below 
a certain hierarchy will be downloaded.
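
With -np added, a sketch of the command so far might look like:

    wget -m -np http://192.168.0.22/webdav/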

You can specify credentials as follows:

--user=user
--password=password

           Specifies the username user and password password for both FTP and
           HTTP file retrieval.  These parameters can be overridden using the
           --ftp-user and --ftp-password options for FTP connections and the
           --http-user and --http-password options for HTTP connections.

Putting all this together, I end up with a command that looks like this:

wget -m -np -P /media/martin/internal-large/CCTV-TWO/ 192.168.0.22/webdav/

In your case you may or may not have to add the --user=admin and --password=whateveritis switches.
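
If they are needed, the assembled command might look something like this (the password is, of course, a placeholder):

    wget -m -np --user=admin --password=whateveritis -P /media/martin/internal-large/CCTV-TWO/ 192.168.0.22/webdav/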

There may be other useful switches for your particular use case; there's a detailed man page available via the command man wget. One switch in particular you might wish to review is --no-clobber, and I've included an excerpt from the man page below:

--no-clobber
           If a file is downloaded more than once in the same directory,
           Wget's behavior depends on a few options, including -nc.  In
           certain cases, the local file will be clobbered, or overwritten,
           upon repeated download.  In other cases it will be preserved.

           When running Wget without -N, -nc, -r, or -p, downloading the same
           file in the same directory will result in the original copy of file
           being preserved and the second copy being named file.1.  If that
           file is downloaded yet again, the third copy will be named file.2,
           and so on.

           [...]

           Therefore, "no-clobber" is actually a misnomer in this
           mode---it's not clobbering that's prevented (as the numeric
           suffixes were already preventing clobbering), but rather the
           multiple version saving that's prevented.

           When running Wget with -r or -p, but without -N, -nd, or -nc, re-
           downloading a file will result in the new copy simply overwriting
           the old.  Adding -nc will prevent this behavior, instead causing
           the original version to be preserved and any newer copies on the
           server to be ignored.
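
In other words, re-running the same command should be safe for incremental backups: the -N behaviour implied by -m compares timestamps, so files already downloaded are skipped rather than fetched again, e.g.:

    # second pass only fetches new recordings; unchanged files are reported as
    # "Server file no newer than local file ... not retrieving"
    wget -m -np -P /media/martin/internal-large/CCTV-TWO/ 192.168.0.22/webdav/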

Sources:

man wget

https://stackoverflow.com/questions/1078524/how-to-specify-the-location-with-wget

https://stackoverflow.com/questions/17282915/download-an-entire-directory-using-wget

Comments

  • Martin KS almost 2 years

    I've got a white-labeled, unbranded CCTV camera that exposes all of its recordings via a webdav server. For the past month or so I've been mounting the drive with davfs2, but it's not a particularly elegant solution:

    • The file list has to be regenerated for each (cp or tar) command, which takes a while
    • cp seems to slow down slightly after each successive file transfer
    • rsync --progress seems to measure only the local transfer speed from temp file to final destination

    I've seen a few people using rsync to backup to various types of webdav based cloud servers, but when I try:

    rsync -a admin@http://192.168.0.22/webdav/ /media/martin/internal-large/CCTV-TWO/
    

    all I get back is the error:

    ssh: Could not resolve hostname http: Name or service not known
    

    Is there a way of doing the transfer this way round?

    Edit from comment: I suspect that my question was too poorly worded to be understood. Firstly, the webdav share is the source. Secondly, if you just use an IP address then rsync assumes you want an SSH-based transfer.

    If I issue the command: rsync --progress -a admin@192.168.0.24:/webdav /media/martin/internal-large/CCTV-TWO/

    I get the following errors:

    ssh: connect to host 192.168.0.24 port 22: Connection refused 
    rsync: connection unexpectedly closed (0 bytes received so far)
    [Receiver] rsync error: unexplained error (code 255) at io.c(226)
    
    • Martin KS over 7 years
      While vaguely related, the other question is based on setting up SSH on the remote computer to allow rsync. The camera doesn't and can't support SSH, and so I'm forced to copy from a webdav share. In reality, I'm hoping for the exact opposite of the related question - a method that doesn't require any SSH setup.
    • Elder Geek over 7 years
      Have you considered trying to accomplish this task with wget or curl? Perhaps if you were to edit in the model of the camera in question, that would help us help you! Thank you!
    • bistoco over 7 years
      What about this answer, where some flags improve the speed of the rsync command? Also, this is intended to be executed against a locally mounted dir, as explained in the 2nd reply to that answer and in this tutorial.
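
      A rough sketch of that locally-mounted approach, assuming the share is still mounted via davfs2 at a hypothetical mount point such as /mnt/webdav:

          rsync -a --progress /mnt/webdav/ /media/martin/internal-large/CCTV-TWO/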
    • Elder Geek over 7 years
      Other options include cadaver and nd.
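
      For example, cadaver can be pointed straight at the share for an interactive session (ls, get and mget are then available at its prompt):

          cadaver http://192.168.0.22/webdav/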
    • Martin KS over 7 years
      I'll try wget and cadaver and post results back here, thank you.
    • Elder Geek over 7 years
      Please add an @ symbol prior to my name when you report back so that I get notified, thank you!
  • Martin KS over 7 years
    This works brilliantly so far - the full backup will take a couple of days. Do you know if this method will skip files it's already downloaded when run for the second time?
  • Elder Geek over 7 years
    @MartinKS I'm glad it's working for you. I edited the answer in response to your comment.
  • Martin KS over 7 years
    Thanks, the additional information is very handy. The command as it stood originally seems to have been exactly right; I've just run the third backup and managed to spot one line where the output was Server file no newer than local file (...) -- not retrieving.