Tilted Forum Project Discussion Community  

Old 05-04-2004, 10:47 PM   #1 (permalink)
Professor of Drinkology
 
FTP Data Sync question (eek!) Help!!

I run reports every week for my SEO company via Web Position Gold. The reports are generated into "c:\webpositiongold\reports".

Under that directory are three dozen or so subdirectories, one for each of the websites I operate (e.g., c:\webpositiongold\reports\dreisner_com).

Every week, Web Position generates a few dozen more HTML files in each of the subdirectories (updates) and archives the old pages (the files remain HTML files, but are linked from an archive section of the main HTML menu). So, I have about two dozen new files amongst several thousand old ones.

I need a program that will *FTP* these new files to a remote computer (my webserver), and *only* the new files. FTPing the entire directory would mean uploading several hundred megabytes (876MB, to be precise), which is inefficient and slow. I upload manually now, but that process is taking longer and longer as my client list grows.

I need to automate. Any ideas? I've gone through downloads.com and I haven't found a single program that'll work. Most of the programs I've found are for syncing via a LAN... which is out of the question. The FTP sync programs I've located all try to transfer the entire 876MB... Duh!

I'm running Windows XP Home on a cable line. I don't have access to install anything on the webserver, so I have to use the existing FTP server and a client-side program.
__________________
Blah.

Last edited by tritium; 05-04-2004 at 10:49 PM..
tritium is offline  
Old 05-05-2004, 01:58 AM   #2 (permalink)
Wah
 
Location: NZ
I'm not an expert on this area...

my instinctive response would be "PERL script"

let me get this straight - can you install on the computer where the files are generated, but not on the webserver where they're supposed to be FTP'd to?
__________________
pain is inevitable but misery is optional - stick a geranium in your hat and be happy

Last edited by apeman; 05-05-2004 at 02:05 AM..
apeman is offline  
Old 05-05-2004, 07:06 AM   #3 (permalink)
Professor of Drinkology
 
My personal PC (the one at which I am writing this) generates the reports, but the webserver to which the reports are sent is inaccessible to me.
__________________
Blah.
tritium is offline  
Old 05-05-2004, 08:41 AM   #4 (permalink)
Wah
 
Location: NZ
oh, that should be pretty easy. I shall muse for a while and get back to you.

I take it you'd be happy to have new files copied to a local directory and just FTP the contents of it over? That would be a lot less work, right?
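
Something like this is what I have in mind - just a rough, untested sketch, assuming you end up with a unix-ish shell (cygwin or similar) plus the ncftp client on the XP box; the 7-day window, the staging directory and the ftp.example.com login are all placeholders:

#!/bin/bash
# gather the past week's new report files into a local staging area,
# keeping the per-site subdirectory layout intact
cd c:/webpositiongold/reports
rm -rf /tmp/staging && mkdir -p /tmp/staging
find . -type f -ctime -7 -exec cp --parents {} /tmp/staging \;

# then push just the staging tree to the server in one go (-R = recursive)
ncftpput -R -u myuser -p mypassword ftp.example.com /reports /tmp/staging/*

That way the weekly upload is only the handful of new files, not the whole 876MB tree.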
__________________
pain is inevitable but misery is optional - stick a geranium in your hat and be happy
apeman is offline  
Old 05-07-2004, 06:53 AM   #5 (permalink)
Wah
 
Location: NZ
i know how to do it, it's simple enough. I'll just see if I can find a program already written to do it, cos I'm busy at the moment.
__________________
pain is inevitable but misery is optional - stick a geranium in your hat and be happy
apeman is offline  
Old 05-11-2004, 10:14 AM   #6 (permalink)
Professor of Drinkology
 
On my computer the directory structure is as follows:

C:\
[Program Files]
--[WebPosition]
----[Reports]
---------[Website A]
---------[Website B]
---------[Website C]

The remote FTP has the following structure as root:
[Website A]
[Website B]
[Website C]

Every Tuesday morning, new files are generated for each of the "Website X" folders on my local computer. These files must be uploaded to their respective folders on the FTP server (e.g., c:\..\Website_A\file1.htm must be copied to "ftp://website.com/reports/Website_A", and so on).
__________________
Blah.
tritium is offline  
Old 05-11-2004, 07:00 PM   #7 (permalink)
Professor of Drinkology
 
bump
__________________
Blah.
tritium is offline  
Old 05-11-2004, 09:41 PM   #8 (permalink)
Crazy
 
Location: Salt Town, UT
Shell scripting to the rescue

This looks like a perfect chance to learn the wonderful world of shell scripting.

Don't worry, you're gonna hate it at the beginning, but after a while, you will hardly be able to live without it. So, for starters, install cygwin, give yourself a nice full install, because getting components later is typically a pain. One caution, it's gonna take the better part of the day to download and install, it's pretty huge.

[taps fingers on desk while cygwin installs]

Now you have a wonderful unix-ish environment, which will almost work as well as a linux command line... almost. Since I guess you are probably new at this, I'm gonna try and take it a bit slowly, so you can actually (maybe, if I explain it well enough) learn what is going on and it won't be quite so much magic. Any line with a pound sign in front of it means you should type it in with the cygwin BASH shell. Beware, I might do something malicious so be sure you know what these things are doing before you do them, and at least make a backup.

First off, let's get a file list of all of the new files (changed in the last two days):
# cd c:/blah/blah/blah/
# find . -ctime -2 -type f
Hopefully that gives you the list of what you want, and I will assume that it spits out a lot of junk that looks like this:
./site_a/file1.html
Well, that is half the problem, finding all of the new files, but you could have done that with a simple "Find..." command. Now for the hard part, uploading them into the matching directories on the server!
Well, we want the filenames to be stored, so we can go through them one by one, so here we go:
# FILELIST=`find . -ctime -2 -type f`
Now we got that list of files, and threw it into $FILELIST, yay. What do you say we list what is in $FILELIST
This will be typed in a little differently, because after the first line, it will prompt you for the other lines until you give it the 'done' command.
# for FILE in $FILELIST; do
> echo "Looking at file $FILE"
> done
Okay, look at the output from that, if that looks good, then do this next one:
# for FILE in $FILELIST; do
> echo "Uploading $FILE"
> ncftpput -u myuser -p mypassword ftpsite /reports/`dirname $FILE` $FILE
> done

That should do it. I'll leave it up to you to tweak it to what you really need, and to put it into a file so it's easy for you to run.
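
Pulled together into one file, it might look roughly like this (myuser/mypassword/ftpsite are placeholders, and /reports is a guess at your server layout - adjust both to what you actually have; it also assumes the report filenames don't contain spaces):

#!/bin/bash
# upload this week's new WebPosition report files, one at a time
cd c:/webpositiongold/reports

# files created or changed within the last 2 days
FILELIST=`find . -ctime -2 -type f`

for FILE in $FILELIST
do
    DIR=`dirname $FILE`               # e.g. ./site_a
    REMOTEDIR="/reports/${DIR#./}"    # e.g. /reports/site_a
    echo "Uploading $FILE to $REMOTEDIR"
    ncftpput -u myuser -p mypassword ftpsite "$REMOTEDIR" "$FILE"
done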
Rawb is offline  
Old 05-17-2004, 06:18 AM   #9 (permalink)
Wah
 
Location: NZ
apologies I didn't get back sooner, I've been stressed out

the above is a good idea, and better than anything I'd have come up with

i tried a web search with the keywords "ftp incremental web sync" and there seems to be stuff out there
__________________
pain is inevitable but misery is optional - stick a geranium in your hat and be happy
apeman is offline  
 
