How to download all images from a website (not webpage) using the terminal?


wget can do it:

wget -A.png,.jpg,.gif,.jpeg -e robots=off -m -k -nv -np -p \
--user-agent="Mozilla/5.0 (compatible; Konqueror/3.0.0/10; Linux)" \
http://site.url/

The one catch is sites that generate their content with JavaScript: wget does not execute JavaScript, so images loaded that way will be missed.
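For reference, here is the same command built up flag by flag, with each option annotated. This is a sketch that only prints the command rather than running it, since `http://site.url/` is a placeholder; the flag descriptions follow the wget manual.

```shell
#!/bin/sh
URL="http://site.url/"   # placeholder -- substitute the real site

CMD="wget"
CMD="$CMD -A .png,.jpg,.gif,.jpeg"   # accept list: keep only files with these extensions
CMD="$CMD -e robots=off"             # ignore robots.txt exclusions
CMD="$CMD -m"                        # mirror: recursive, infinite depth, timestamping
CMD="$CMD -k"                        # convert links so the local copy is browsable
CMD="$CMD -nv"                       # non-verbose output
CMD="$CMD -np"                       # no-parent: never ascend above the start directory
CMD="$CMD -p"                        # also fetch page requisites (inline images, CSS)
# Some sites refuse wget's default user agent, hence the spoofed one:
CMD="$CMD --user-agent=\"Mozilla/5.0 (compatible; Konqueror/3.0.0/10; Linux)\""

echo "$CMD $URL"
```

Dropping `-nv` gives full progress output, which is handy the first time you run it against a new site.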



Author: Zignd

Updated on September 18, 2022

Comments

  • Zignd, almost 2 years:

    I want a command where I type a URL, for example photos.tumblr.com, and it downloads all photos from that site into a folder, not only the images on the site's homepage. The command needs to download images from all parts of the site, such as photos.tumblr.com/ph1/1.png and photos.tumblr.com/ph3/4.jpg.

    Please show me an example using this URL: http://neverending-fairytale.tumblr.com/ and test it before answering.

  • kadamwolfe, almost 12 years:

    Note for anyone trying this over HTTPS: you might need --no-check-certificate to get it to connect.