Quote:
Originally Posted by mangle
Looking for a computer program...
...which is why we have a computer forum ...
Quote:
Originally Posted by mangle
Does anyone know of a computer program that will automatically search through a folder on a website for all the files inside and download them all? For example, if I want to download all the images at http://www.sample.com/whatever, is there something that could download all the .jpgs in that folder without me knowing what the filenames are?
Quote:
Originally Posted by mangle
With that program, the file has to be linked off of one of the websites. What if I need to gather all the files, regardless of if they are linked or not?
ok, I'm a bit confused. what do you mean by 'linked'?
(*) according to your first post, you would want all of the JPGs physically contained in the 'whatever' dir of the
www.sample.com domain.
in other words,
http://www.sample.com/whatever/picture.jpg
....
so, by 'linked off of' do you mean referenced from a page? as in, a picture with a URL of, say,
http://www.whatever.com/sample/picture.jpg, linked from an html file contained within the 'whatever' dir of the
www.sample.com domain?
am I confusing you yet?
just trying to straighten this out ...
because if you're asking what I think you are(*), httrack *will* work, as would wget.
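for the wget route, something along these lines should do it (www.sample.com is the placeholder URL from the question, not a real site) -- keeping in mind wget can only fetch files that are actually linked from a page or a directory listing:

```shell
# Recursively grab every linked .jpg under the 'whatever' directory.
# The URL is the placeholder from the question, not a real site.
#   -r          recurse into linked pages
#   -np         don't ascend into the parent directory
#   -nd         save everything flat into the current directory
#   -A jpg,JPG  keep only files matching these suffixes
wget -r -np -nd -A jpg,JPG http://www.sample.com/whatever/
```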
or, for the expensive option, and what I use, offline explorer.
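and just to make the 'linked' point concrete: all of these tools work by parsing pages for links, so a file nobody links to is invisible to them unless the server exposes a directory listing. a rough sketch of the link-extraction step (page contents and filenames here are made up for illustration):

```python
# Minimal sketch of what crawlers like wget/httrack do at each step:
# parse a fetched HTML page and collect the .jpg links to download next.
from html.parser import HTMLParser

class ImageLinkExtractor(HTMLParser):
    """Collect href/src values ending in .jpg from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value and value.lower().endswith(".jpg"):
                self.links.append(value)

# An Apache-style "Index of /whatever" page is itself just HTML full of
# <a href> links -- which is why open directory listings are crawlable.
sample_page = """
<html><body><h1>Index of /whatever</h1>
<a href="picture.jpg">picture.jpg</a>
<a href="holiday.JPG">holiday.JPG</a>
<a href="notes.txt">notes.txt</a>
</body></html>
"""

parser = ImageLinkExtractor()
parser.feed(sample_page)
print(parser.links)  # only the .jpg entries, not notes.txt
```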
please try to more clearly explain what you're trying to do.
ps - if he's asking what I'm thinking, CS and cataklysm are a bit off.