Downloading all images from a website
I'm trying to download the images from a website.
Here is the website:
https://wall.alphacoders.com/by_sub_category.php?id=173173&name=naruto+wallpapers
I tried:
wget -nd -r -p /home/pictures/ -a jpeg,jpg,bmp,gif,png https://wall.alphacoders.com/by_sub_category.php?id=173173&name=naruto+wallpapers
but it doesn't download the images.
The result:
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
/home/pictures: Permission denied
/home/pictures/by_sub_category.php?id=173173: No such file or directory
Cannot write to ‘/home/pictures/by_sub_category.php?id=173173’ (No such file or directory).
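Before getting to a working approach, note why that attempt fails: wget's options are case-sensitive, so -p (download page requisites) is not -P (set the directory prefix), and -a (append to a log file) is not -A (comma-separated list of accepted extensions). Also, the unquoted & makes the shell split the command and run the first part in the background, and the "Permission denied" error means your user cannot write to /home/pictures. A corrected version of that attempt, assuming a directory you can actually write to (e.g. ~/pictures), would look like this:

# -P sets the download directory, -A filters by extension,
# and quoting the URL keeps the shell from interpreting '&'
wget -nd -r -P ~/pictures/ -A jpeg,jpg,bmp,gif,png 'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=naruto+wallpapers'

Even so, recursive wget may not reach the full-size images on this site, which is why the answer below extracts the image URLs directly.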
To download the images from the given page, you can use this wget command:
wget -qO- https://wall.alphacoders.com/by_sub_category.php\?id\=173173\&name\=naruto+wallpapers | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p' | wget -i -
In this example, the HTML page is downloaded by wget to stdout, parsed by sed so that only the img src URLs remain, and the resulting list is piped to wget -i - as the input list for downloading.
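For illustration, here is the sed expression applied to a single made-up line of HTML (the tag and URL below are hypothetical; the real page's markup may differ):

# extracts the value of the src attribute from lines containing <img
echo '<img src="https://example.com/thumb-350-12345.jpg" alt="thumb">' | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p'
# prints: https://example.com/thumb-350-12345.jpg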
Note that this downloads the images as they appear on that page, which are thumbnails (350px wide).
If you want to download the full-size images, you should go one step further and change the parsed img URLs so that they correspond to the hi-res images. You can do that with sed or awk:
wget -qO- https://wall.alphacoders.com/by_sub_category.php\?id\=173173\&name\=naruto+wallpapers | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p' | awk '{gsub("thumb-350-", ""); print}' | wget -i -
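As a hypothetical illustration of what the awk stage does (the exact thumbnail URL format is assumed here, based on the "thumb-350-" marker the command removes):

# gsub deletes every occurrence of "thumb-350-" from the URL
echo 'https://images.alphacoders.com/thumb-350-123456.jpg' | awk '{gsub("thumb-350-", ""); print}'
# prints: https://images.alphacoders.com/123456.jpg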
The result of running this last command is a pack of HD wallpapers on your disk.