Downloading all images from website - Ask Ubuntu


I'm trying to download the images from a website.

Here is the website:

https://wall.alphacoders.com/by_sub_category.php?id=173173&name=naruto+wallpapers

I tried:

wget -nd -r -P /home/pictures/ -A jpeg,jpg,bmp,gif,png 'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=naruto+wallpapers'

but it doesn't download the images.

The result:

HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
/home/pictures/: Permission denied
/home/pictures/by_sub_category.php?id=173173: No such file or directory

Cannot write to ‘/home/pictures/by_sub_category.php?id=173173’ (No such file or directory).
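The immediate failure above is a write permission problem: the user cannot write under /home/pictures/. A minimal sketch of a retry into a user-writable directory (the ~/Pictures/wallpapers path is my assumption, not from the question) could look like:

```shell
# Create a directory the current user can write to; the 'Permission
# denied' error came from writing directly under /home/pictures/.
mkdir -p ~/Pictures/wallpapers

# -P sets the download directory, -A restricts accepted extensions,
# and the URL is quoted so the shell does not treat '&' as a
# background operator.
wget -nd -r -P ~/Pictures/wallpapers -A jpeg,jpg,bmp,gif,png \
    'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=naruto+wallpapers'
```

Even with the permissions fixed, recursive wget may still miss the pictures if they are served from a different host than the page itself, which is why the answer instead parses the image URLs out of the page.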

To download the images from the specified page with wget, you can use this command:

wget -qO- 'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=naruto+wallpapers' | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p' | wget -i -

In this example, the HTML page is downloaded by wget to stdout, sed parses it so that only the img URLs remain, and those URLs are passed to wget -i as an input list for downloading.
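The sed step of that pipeline can be tried in isolation on a tiny stand-in for the downloaded page (the URLs below are made up for illustration):

```shell
# Feed two sample <img> lines through the same sed program: for lines
# containing <img>, keep only the value of the src attribute.
printf '%s\n' \
  '<div><img src="https://images.example.com/thumb-350-111.jpg"></div>' \
  '<div><img src="https://images.example.com/thumb-350-222.jpg"></div>' |
sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p'
# prints:
# https://images.example.com/thumb-350-111.jpg
# https://images.example.com/thumb-350-222.jpg
```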

Note that this downloads the images found on the page, which are only thumbnails (350px wide).

If you'd like to download the full-size images, you should go one step further and change the parsed img URLs so that they correspond to the hi-res images. You can do it with sed or awk:

wget -qO- 'https://wall.alphacoders.com/by_sub_category.php?id=173173&name=naruto+wallpapers' | sed -n '/<img/s/.*src="\([^"]*\)".*/\1/p' | awk '{gsub("thumb-350-", ""); print}' | wget -i -

Running this last command leaves a pack of HD wallpapers on your disk.
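The awk step's thumbnail-to-full-size rewrite can also be checked on its own; here on a single sample URL (hypothetical host):

```shell
# gsub deletes every occurrence of "thumb-350-" from the URL,
# turning a thumbnail link into the full-size image link.
printf 'https://images.example.com/thumb-350-12345.jpg\n' |
awk '{gsub("thumb-350-", ""); print}'
# prints: https://images.example.com/12345.jpg
```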


