Started using sitecopy. Had some initial configuration problems, but a mail to the users mailing list sorted them out. After all, it was ported to win32 – I hardly expected it to run out of the box. So what does it do?
From the site:
sitecopy is for easily maintaining remote web sites. The program will upload files to the server which have changed locally, and delete files from the server which have been removed locally, to keep the remote site synchronized with the local site with a single command.
I was a bit troubled when I saw my bandwidth usage go vertical – I do have 3 GB of it, but hey, we all are suckers for bandwidth. A look at the stats showed two things.
- The wiki functionality, useful as it is, was guzzling the b/w.
- FTP transfers – the upload, check, correct, re-upload cycle – were also eating into it.
I first locked all pages and set ACLs on those that were set up for collaborative work (that’s what a wiki is for). Next I set up Apache on my machine so I could make and test modifications locally; only the final version would be uploaded. This caused another problem. Being impulsive by nature, I upload to my remote site after every trivial change. Using an FTP client became an agonizing experience – waiting for the client to connect, navigating to matching directories in the local and remote panes, uploading and deleting files as necessary without missing any edits… and watching the COMMAND-RESPONSE sequence scroll by. As the saying goes, I was ready for the next level.
Sitecopy fit the bill exactly. I just keep editing my local site, turning it into what I want the world to see. When I feel like it, sitecopy -u mysite updates the remote site. It works in reverse too, with the -s switch, so I can download any pages that visitors to the wiki may have edited. It may just be initial euphoria, but sitecopy also feels faster than a manual FTP session.
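For anyone curious about the setup, it amounts to a few lines in ~/.sitecopyrc. This is only a sketch – the server, username, and paths below are placeholders, not my real details:

```
site mysite
  server ftp.example.com
  protocol ftp
  username myuser
  password secret
  local /home/me/www/
  remote /public_html/
```

With that in place, a one-time sitecopy --init mysite records the starting state, plain sitecopy mysite lists what has changed, and sitecopy -u mysite pushes the changes up.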