Looking for a computer program...
Does anyone know of a computer program that will automatically search through a folder on a website for all the files inside and download them all? For example, if I want to download all the images at http://www.sample.com/whatever, is there something that could download all the .jpgs in that folder without me knowing what the filenames are?
Not quite, but thanks. With that program, the file has to be linked from one of the site's pages. What if I need to gather all the files, whether they are linked or not?
I might be wrong, but I don't think such a program is possible, as web servers generally do not expose folder listings (except when there is no default HTML file to display).
The only way it could work would be by 'brute force' guessing of filenames, but that would be incredibly slow and not very ethical (or possibly even legal?).
CSflim mostly said it. You need read and execute permissions on the directory itself. Most websites don't leave things this open, though, so you're limited to reading the files linked to in HTML pages. It isn't always locked down, either; sometimes it's just obscured by a default HTML file that makes it less obvious. Give leaching/ripping programs a try.
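Just to illustrate what those leaching/ripping programs are doing under the hood, here's a rough sketch: they fetch a page and collect every .jpg that an `<img>` or `<a>` tag points to. This is only a sketch with names I made up (`extract_jpg_urls`, and www.sample.com is the hypothetical site from the first post); real rippers also recurse into linked pages.

```python
# Sketch of the link-harvesting step a ripper performs on one HTML page.
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImageLinkParser(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.jpg_urls = []

    def handle_starttag(self, tag, attrs):
        # Images show up as <img src=...> or as plain <a href=...> links.
        for name, value in attrs:
            if value and (tag, name) in (("img", "src"), ("a", "href")):
                if value.lower().endswith(".jpg"):
                    # Resolve relative paths against the page's own URL.
                    self.jpg_urls.append(urljoin(self.base_url, value))

def extract_jpg_urls(html, base_url):
    """Return every .jpg URL linked from the given HTML text."""
    parser = ImageLinkParser(base_url)
    parser.feed(html)
    return parser.jpg_urls
```

The point being: this only ever finds files that some page *links to*, which is exactly the limitation being discussed.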
A rare alternative these days is FTP. If you can connect to ftp://www.sample.com/, or substitute ftp.sample.com, or just sample.com, look around for the www and whatever directories. FTP is rarely left open by careful admins these days, but it's something to try.
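If you want to try the FTP route programmatically, a minimal sketch with Python's standard ftplib looks like this. ftp.sample.com and the "www/whatever" path are the hypothetical names from this thread, and the function names are mine; a real server would need its actual hostname and probably a login rather than anonymous access.

```python
# Sketch: connect over FTP, change into the directory, list .jpg files.
from ftplib import FTP

def jpg_names(listing):
    """Filter a directory listing down to .jpg filenames."""
    return [name for name in listing if name.lower().endswith(".jpg")]

def list_remote_jpgs(host, directory):
    with FTP(host) as ftp:
        ftp.login()          # anonymous login, if the server permits it
        ftp.cwd(directory)   # e.g. "www/whatever"
        return jpg_names(ftp.nlst())

# usage (contacts the network, so only if the server is really open):
# list_remote_jpgs("ftp.sample.com", "www/whatever")
```

Unlike HTTP, an FTP listing shows everything in the directory, linked or not, which is why it's worth checking even though most admins have it locked down.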
I'm not entirely sure it does this, but you might want to look into GetRight.
http://www.getright.com/
(*) according to your first post, you would want all of the jpg's physically contained in the 'whatever' dir of the www.sample.com domain; in other words, http://www.sample.com/whatever/picture.jpg.

so, by 'linked off of' do you mean, linked off of? as in, a picture with a URL of, say, http://www.whatever.com/sample/picture.jpg, in an html file contained within the 'whatever' dir of the www.sample.com domain? am I confusing you yet? :p just trying to straighten this out, because if you're asking what I think you are(*), httrack *will* work, as would wget. or, for the expensive option, and what I use, offline explorer. please try to more clearly explain what you're trying to do.

ps - if he's asking what I'm thinking, CS and cataklysm are a bit off.
as someone who runs a site, with assorted stuff in folders that I only want certain people to see (and that therefore isn't linked on a webpage anywhere; I provide people with the specific URL to the image, .exe, or whatever), I think what you're trying to accomplish is unethical. if you want everything in the folder, contact whoever runs the site and ask them.
...and if mikec's right on what you're trying to do ... then google may be your best option ...
just came upon this... if it's an open directory, it'll list everything in it:
http://www.opendirviewer.nl/
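For the curious, here's roughly what a viewer like that checks for: servers with listings enabled return an auto-generated index page (Apache titles these "Index of /..."), and every file in the folder appears as a plain link in it. This is a sketch with invented names (`list_open_directory`), assuming an Apache-style index; other servers format their listings differently.

```python
# Sketch: detect an auto-generated "Index of" page and pull out its entries.
from html.parser import HTMLParser

class IndexParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.entries = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                # Skip the parent-directory and column-sorting links.
                if name == "href" and value and not value.startswith(("?", "../", "/")):
                    self.entries.append(value)

def list_open_directory(html):
    """Return the filenames in an open listing, or None if it isn't one."""
    if "Index of" not in html:
        return None   # not an open listing; the folder is hidden
    parser = IndexParser()
    parser.feed(html)
    return parser.entries
```

So whether this works on http://www.sample.com/whatever comes down to the same point made earlier: the admin has to have left directory listings on.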