Recursively download from a website
Solution 1
wget -nd -r -l1 -P /save/location -A jpeg,jpg http://www.example.com/products
Explanation:
-nd
prevents the creation of a directory hierarchy (i.e. no directories).
-r
enables recursive retrieval. See Recursive Download for more information.
-l1
specifies the maximum recursion depth; 1 fetches just the starting directory (in your case, products).
-P
sets the directory prefix where all files and directories are saved to.
-A
sets a whitelist for retrieving only certain file types. Strings and patterns are accepted, and both can be used in a comma separated list (as seen above). See Types of Files for more information.
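Since the images in the question live one level deeper, in subfolders of products, a depth of 1 may stop short. A hedged variant of the command above (the depth of 2 and the --no-parent flag are additions, not part of the original answer) that reaches the subfolder images:

```shell
# -l2 recurses two levels deep so images inside the subfolders are reached;
# --no-parent keeps wget from climbing above /products/ into the rest of the site.
# URL and save path are the example values from the question.
wget -nd -r -l2 --no-parent -P /save/location -A jpeg,jpg \
    http://www.example.com/products/
```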
Solution 2
Try httrack(1), a web spider that is most useful for creating local mirrors of entire web sites.
Homepage: https://www.httrack.com/
The examples in the linked manpage should get you started.
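As a sketch (the filter patterns and output directory below are assumptions, not taken from the manpage), a minimal httrack invocation mirroring the same products path might look like:

```shell
# Mirror the products section into /save/location.
# httrack filters: "+pattern" allows URLs, so the first filter restricts the
# crawl to the products subtree and the second admits JPEG images.
httrack "http://www.example.com/products/" \
    -O /save/location \
    "+www.example.com/products/*" "+*.jpg"
```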
Gireesh T
Updated on September 18, 2022

Comments
- Gireesh T, over 1 year ago:
I am trying to get images from the website URL "www.example.com/products". This products folder contains lots of subfolders, and I need to download the whole products folder. Under www.example.com/products, the images are in subfolders:
- www.example.com/products/subfolder1/image.jpg,
- www.example.com/products/subfolder2/image.jpg,
- www.example.com/products/subfolder3/image.jpg
How can I download the products folder along with its subfolders and their data?