05-04-2004, 10:47 PM | #1 (permalink) |
Professor of Drinkology
|
FTP Data Sync question (eek!) Help!!
I run reports every week for my SEO company via Web Position Gold. The directory into which the reports are generated is "c:\webpositiongold\reports".
Under that directory are three dozen or so subdirectories, one for each of the websites I operate (e.g., c:\webpositiongold\reports\dreisner_com). Every week, Web Position generates a few dozen new HTML files in each of the subdirectories (updates) and archives the old pages (the files remain as HTML files, but are linked to from an archive section of the main HTML menu). So, I have about two dozen new files among several thousand old files.

I need a program that will *FTP* these new files to a remote computer (my webserver), and *only* the new files. FTPing the entire directory would be an upload of several hundred megabytes (876MB, to be precise), so that is inefficient and slow. I upload manually now, but that process is taking longer and longer as my client list grows. I need to automate.

Any ideas? I've gone through downloads.com and I haven't found a single program that'll work. Most of the programs I've found are for syncing via a LAN... which is out of the question. The FTP sync programs I've located all try to transfer the entire 876MB... Duh!

I'm running Windows XP Home on a cable line. I don't have access to install anything on the webserver, so I have to use the existing FTP server and a client-side program.
__________________
Blah. Last edited by tritium; 05-04-2004 at 10:49 PM.. |
05-05-2004, 01:58 AM | #2 (permalink) |
Wah
Location: NZ
|
I'm not an expert on this area...
my instinctive response would be "Perl script". Let me get this straight: you can install things on the computer where the files are generated, but not on the webserver they're supposed to be FTP'd to?
__________________
pain is inevitable but misery is optional - stick a geranium in your hat and be happy Last edited by apeman; 05-05-2004 at 02:05 AM.. |
05-05-2004, 08:41 AM | #4 (permalink) |
Wah
Location: NZ
|
oh, that should be pretty easy. I shall muse for a while and get back to you.
I take it you'd be happy to have new files copied to a local directory and just FTP the contents of it over? That would be a lot less work, right?
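That staging idea could look something like this as a shell script. This is only a sketch: the `stage_new_files` name and the example paths are made up, and "changed in the last 2 days" is an assumption about how often the reports run.

```shell
#!/bin/sh
# Sketch of the staging idea: copy every file modified in the last
# 2 days from the reports tree ($1) into a staging dir ($2),
# preserving the per-site subfolders, so the staging dir can then
# be FTP'd over in one shot.
stage_new_files() {
    SRC=$1
    DEST=$2
    # -type f: plain files only; -mtime -2: modified within 2 days
    find "$SRC" -type f -mtime -2 | while read -r F; do
        REL=${F#"$SRC"/}                    # path relative to SRC
        mkdir -p "$DEST/$(dirname "$REL")"  # recreate the subfolder
        cp "$F" "$DEST/$REL"
    done
}

# Example (path is hypothetical):
# stage_new_files "/cygdrive/c/webpositiongold/reports" "$HOME/staging"
```

After that, one recursive FTP upload of the staging directory moves only the new files.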
__________________
pain is inevitable but misery is optional - stick a geranium in your hat and be happy |
05-11-2004, 10:14 AM | #6 (permalink) |
Professor of Drinkology
|
On my computer the directory structure is as follows:
C:\
[Program Files]
--[WebPosition]
----[Reports]
---------[Website A]
---------[Website B]
---------[Website C]

The remote FTP has the following structure at root:

[Website A]
[Website B]
[Website C]

Every Tuesday morning, new files are generated in each of the "Website X" folders on my local computer. These files must be uploaded to their respective folders on the FTP server (e.g., c:\..\Website_A\file1.htm must be copied to "ftp://website.com/reports/Website_A"), and so on...
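To make that mapping concrete, here is a small shell sketch (the `map_remote` name is hypothetical, and `website.com` is just the placeholder host from the example above) that prints where each new local file would land remotely, without uploading anything:

```shell
#!/bin/sh
# Dry-run sketch: for each file changed in the last 2 days under the
# local reports tree ($1), print the matching remote path, so the
# local -> remote mapping can be eyeballed before any real transfer.
map_remote() {
    SRC=$1
    find "$SRC" -type f -mtime -2 | while read -r F; do
        REL=${F#"$SRC"/}   # e.g. Website_A/file1.htm
        echo "$F -> ftp://website.com/reports/$REL"
    done
}

# Example (path is hypothetical):
# map_remote "/cygdrive/c/webpositiongold/reports"
```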
__________________
Blah. |
05-11-2004, 09:41 PM | #8 (permalink) |
Crazy
Location: Salt Town, UT
|
Shell scripting to the rescue
This looks like a perfect chance to learn the wonderful world of shell scripting.
Don't worry, you're gonna hate it at the beginning, but after a while you will hardly be able to live without it.

So, for starters, install Cygwin, and give yourself a nice full install, because getting components later is typically a pain. One caution: it's gonna take the better part of a day to download and install, it's pretty huge.

[taps fingers on desk while cygwin installs]

Now you have a wonderful unix-ish environment, which will almost work as well as a Linux command line... almost.

Since I guess you are probably new at this, I'm gonna try and take it a bit slowly, so you can actually (maybe, if I explain it well enough) learn what is going on and it won't be quite so much magic. Any line with a pound sign in front of it means you should type it in at the Cygwin bash shell. Beware, I might do something malicious, so be sure you know what these things are doing before you do them, and at least make a backup.

First off, let's get a list of all of the new files (changed in the last two days):

# cd c:/blah/blah/blah/
# find . -ctime -2 -type f

(Note the minus sign in -ctime -2: it means "changed less than two days ago"; plain -ctime 2 would match only files changed exactly two days ago.) Hopefully that gives you the list of what you want. I will assume it spits out a lot of entries that look like this:

./site_a/file1.html

Well, that is half the problem, finding all of the new files, but you could have done that with a simple "Find..." command. Now for the hard part: uploading them to the right remote directories! We want the filenames stored so we can go through them one by one, so here we go:

# FILELIST=`find . -ctime -2 -type f`

Now we've got that list of files and thrown it into $FILELIST, yay. What do you say we list what is in $FILELIST? This next part will be typed in a little differently, because after the first line, the shell will prompt you for the other lines until you give it the 'done' keyword.
# for FILE in $FILELIST
> do
> echo "Looking at file $FILE"
> done

(Note it's "for FILE in $FILELIST", not the other way round: FILE is the loop variable that takes each value from the list in turn.) Okay, look at the output from that. If that looks good, then do this next one:

# for FILE in $FILELIST
> do
> echo "Uploading $FILE"
> ncftpput -u myuser -p mypassword ftpsite /reports/`dirname $FILE` $FILE
> done

The `dirname $FILE` bit keeps each file in its matching remote subdirectory; ncftpput expects the remote directory and then the local file, so passing /reports/$FILE directly would have treated the whole file path as a directory name. That should do it. I'll leave it up to you to tweak it to what you really need, and to put it into a file so it's easy for you to run.
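Putting those pieces into one file, a sketch of the whole thing might look like this. The host, user, and password are placeholders, and the `upload_new` name and `DRYRUN` switch are made up here; with DRYRUN=1 it only prints the ncftpput commands so you can sanity-check before uploading anything.

```shell
#!/bin/sh
# Sketch: find files changed in the last 2 days under the reports
# tree ($1) and upload each to its matching remote subdirectory.
HOST=ftpsite
USER=myuser
PASS=mypassword

upload_new() {
    SRC=$1
    find "$SRC" -type f -ctime -2 | while read -r F; do
        REL=${F#"$SRC"/}        # path relative to the reports dir
        DIR=$(dirname "$REL")   # e.g. site_a
        if [ "${DRYRUN:-0}" = "1" ]; then
            echo "ncftpput -m -u $USER -p $PASS $HOST /reports/$DIR $F"
        else
            # -m asks ncftpput to create the remote dir if it's missing
            ncftpput -m -u "$USER" -p "$PASS" "$HOST" "/reports/$DIR" "$F"
        fi
    done
}

# Example (path is hypothetical):
# upload_new "/cygdrive/c/webpositiongold/reports"
```

Dropped into a file and pointed at the real reports folder, this could then be run from Windows Task Scheduler every Tuesday.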
05-17-2004, 06:18 AM | #9 (permalink) |
Wah
Location: NZ
|
apologies I didn't get back sooner, I've been stressed out
the above is a good idea and much better than I'd have done. I also tried a web search with the keywords "ftp incremental web sync" and there seems to be stuff out there
__________________
pain is inevitable but misery is optional - stick a geranium in your hat and be happy |