Picture grabber
I once had a program that let me download all the pictures from a site simply by typing in the URL, but I have forgotten its name. Could you please tell me what it was called? I think it was posted somewhere in the Erogenous Zone.
Try This
There are many out there. Here are a couple, or enter "picture download" into your search engine.
http://www.picture-finder.net/
http://www.vowsoft.com/
I'd recommend either of these:

HTTrack - http://www.httrack.com/ - freeware, available for both Linux and Windows, uses a GUI.

Wget - Linux [well, also for Windows, but I couldn't get the Windows binary to work] - GNU [obviously] and free. EXTREMELY configurable, but it has a steeper learning curve than HTTrack; you'll want to get used to the command line if you use this.

Good luck,
keyshawn
I agree that wget is a bit on the steep side of the learning curve, but you can't beat the convenience of its --mirror switch, or the sheer usefulness of a one-liner like:

$ for num in `seq -w 1 20`; do wget http://some.site.with.pics/pics/pic_$num.jpg; done

/Linux geek :-P
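The loop above can be sketched as a dry run that prints each wget command instead of fetching anything; `seq -w` zero-pads the numbers to the width of the largest one, so you get pic_01.jpg through pic_20.jpg, matching typical gallery filenames (the URL is the poster's placeholder, not a real site):

```shell
#!/bin/sh
# Dry-run sketch of the numbered-picture loop above.
# some.site.with.pics is a placeholder host from the post, not a real site.
# seq -w pads every number to the width of the largest (here: 2 digits).
for num in $(seq -w 1 20); do
    # Drop the leading "echo" to actually run wget.
    echo wget "http://some.site.with.pics/pics/pic_${num}.jpg"
done
```

Remove the `echo` to run the real downloads; adding wget's -c flag lets it resume partially fetched files.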
Quote:
Is the command above a better way of doing this?
Bukster is free and can scan HTML pages for links. It also lets you specify which extensions are downloaded, and it doesn't download the HTML pages themselves.
http://www.snapfiles.com/get/bukster.html