Download entire site for offline usage with wget (including external image servers)

The wget manual tells us:

Actually, to download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to ‘-p’:

wget -E -H -k -K -p http://the.site.com

You'll have to combine that with some of the Recursive Download options. You should also use --wait=xx, --limit-rate=xxK and -U agent-string to avoid being blacklisted by the server…
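Putting the pieces above together, a full invocation might look like the following. The wait interval, rate limit, recursion depth, and user-agent string are illustrative values, not from the original answer — tune them to the target site:

```shell
# -E  adjust extensions (save as .html where appropriate)
# -H  span hosts, needed for requisites on external image servers
# -k  convert links for local viewing
# -K  keep original files alongside converted ones (.orig)
# -p  download all page requisites (images, CSS, ...)
# -r --level=1      recurse one level deep (example depth)
# --wait / --limit-rate / -U   politeness settings to avoid blacklisting
wget -E -H -k -K -p \
     -r --level=1 \
     --wait=2 --limit-rate=100k \
     -U "Mozilla/5.0 (compatible)" \
     http://the.site.com
```

Note that -H without a --domains restriction can pull requisites from any host the pages reference, so on large sites you may want to add --domains=the.site.com,images.example.com to keep the crawl bounded.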


Author: avenas8808

Learning about PHP, graphic design etc.

Updated on September 18, 2022

Comments

  • avenas8808 over 1 year

    OK, so I've got wget 1.12 on Windows 7, and I can do basic downloads with it.

    The site I'm trying to download: http://www.minsterfm.co.uk

    and all images on it are stored externally at http://cml.sad.ukrd.com/image/

    How can I download the site and the external images, and possibly allow all files to keep their original extensions, without converting .php files to .htm?

    I would appreciate any help, since I'm new to wget.

  • slhck over 12 years
    Can you maybe explain what these switches do?