Download an entire site for offline use with wget (including external image servers)
The wget manual tells us:
Actually, to download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to ‘-p’:
wget -E -H -k -K -p http://the.site.com
You'll have to combine that with some of wget's recursive download options, such as -r and -l. You may also want to use --wait=xx, --limit-rate=xxK and -U agent-string so that the server doesn't blacklist you.
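Putting those pieces together (to spell out the switches: -p fetches each page's requisites, -E adjusts the extensions of HTML files, -H allows spanning to other hosts, -k converts links for local viewing, and -K backs up the originals with a .orig suffix), a polite full-site mirror could look like the sketch below. The depth, wait time, rate limit, user-agent string and the external image host images.example.com are placeholders, not values from the question:

wget -r -l 2 -p -E -H -k -K \
     -D the.site.com,images.example.com \
     --wait=2 --limit-rate=50K \
     -U "Mozilla/5.0 (offline-mirror)" \
     http://the.site.com/

Here -D limits host spanning to the listed domains, so the crawl can reach the external image server without wandering off to the rest of the web.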
Comments
- avenas8808 over 1 year: OK, so I've got wget 1.12 on Windows 7, and I can do basic downloads with it. The site I'm trying to download is http://www.minsterfm.co.uk, and all of its images are stored externally at http://cml.sad.ukrd.com/image/. How can I download the site together with the external images, ideally letting all files keep their original extensions instead of converting .php files to .htm? I would appreciate any help, since I'm new to wget. (A sketch addressing this follows the comments below.)
- Kvisle over 12 years: What have you managed to do so far?
- HikeMike over 12 years: Possible duplicate of Make wget download page resources on a different domain.
- Henke over 3 years: Does this answer your question? How can I download an entire website?
- slhck over 12 years: Can you maybe explain what these switches do?
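For avenas8808's question above, a sketch (untested; the domain list simply echoes the two hosts named in the comment): leave out -E, since -E is exactly the switch that appends .html to pages such as .php files, and whitelist both hosts with -H and -D so the external images are fetched as well:

wget -r -l 2 -p -k -K -H \
     -D minsterfm.co.uk,cml.sad.ukrd.com \
     --wait=1 --limit-rate=100K \
     http://www.minsterfm.co.uk/

Without -E the pages keep their .php names, while -k still rewrites the links between the downloaded files so the copy browses correctly offline. -D matches domain suffixes, so minsterfm.co.uk also covers www.minsterfm.co.uk.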