Image

Imagetonytraductor wrote in Imagelinuxsupport perplexed

ftp mput recursive and/or curl question

I was trying to move some stuff up onto my server, and, I thought I'd like to try and move some stuff without using a gui ftp program or my host's online cpanel tools ( (blecch) for once.
Mostly, because I want to be able to script what I'm doing for future use.
Now, I've used gui ftp clients (and even wrote a little tcl one for quick jobs), but the one I wrote
will only move one file at time...I recall not being able to figure out how to send a dirfull, recursively, in fact, when making the little guy.

Now, I know it's possible to move a whole directory at a time, because thousands of existing gui ftp clients do it.
But I don't seem to be succeeding.
First, I can't find an ftp command (this using bash ftp on debian lenny, not using the tcl/ftp I used to write my little thingy) to move a whole directory, recursively.
I can give it a wildcard, and it will load up all the distinct files in a dir, but it won't send another dir within the dir and files therein, recursively, as I want.

I also thought I'd try curl.

I did something like this:
###########3
#!/bin/bash

# sending up the nanoblog

echo "writing nblist"
find nanoblog/* > nblist
echo "list written \nnow sending files..."
for i in $(cat nblist)
do

curl -v -u me:pwd -T --ftp-create-dirs $i ftp://ftp.baldwinsoftware.com/nb/

done
echo "files sent"
exit
################



It appears to function in the command line, but when I checked the remote files (html pages) on the server with a normal browser,
it apperas as though something has gone amiss.

I was expecting it to overwrite existing files on the remote server.
But it seems to hose the files, or something.
If I could get that to work, I'd like to make it only send/overwrite stuff to the remote server when the local file is new (not on the remote),
or newer (exists on remote, but was edited more recently, locally), and I don't seem to have found how to do that.
-Z seems to have something to do with download/getting files and establishing some time parameter, but I don't see how to
read the timestamp from the server, and if the local file is newer, go, and if not, abort, so, what I have, to my knowledge,
will overwrite all files, whether they've changed or not. That's not efficient.


Also, I wonder if there is a better way to "glob" the file list, so I'm not sending 35 separate logins and curl requests, but
one for a whole bunch of files.
Rather than find and for i in $(cat list) loop, I thought of the following
(didn't try it yet, because I'm falling a asleep at the keyboard, now)
#############
#!/bin/bash

# sending up the nanoblog

echo "writing new file list"

for i in $(ls -1 nanoblog/)
do echo $i,\ >> nblist
done
# that gives me a list, one item per line followed by comma and space

# but to glob it in the curl command, I believe I need one line,
# so let's remove the line breaks:

perl -pi -e 'tr/[\012\015]//d' nblist
# there might be an easier way to do that with tr or sed or something...

echo "file list written, now sending files..."

flist = $(cat nblist)
# didn't think curl -T {$(cat list)} would be smart

curl -# -u me:pwd -T {$flist} ftp://ftp.baldwinsoftware.com/nb/

done
echo "files sent"
exit
###############
aargh...my brain hurts...



So, my questions are:

Is there a way to send a dir and it's contents, including sub/dirs, recursively, via ftp in command line?
And, if so, what is it?
(I'm not finding that in the man page, and, I did some googling before coming to ask, but
I only found info about recursive mget, and when I tried to apply it to mput, did not achieve the desired result).

and/or

Why do my curl efforts not give me the desired result?

One more:
I found wput, which will send whole dirs, and recursively, but I don't see in the man where it authenticates on the server.
I need to login to ftp up (no anonymous on my server, no way).

-
I did end up sending everything up with gftp for today, incidentally, but, I will be updating
these pages frequently, and would rather be able to script it and do it from the command line,
in fact, nanoblog, to my knowledge, can call a script to publish the darned thing, if I make a suitable
script and put it in the conf, so, yes, I do, very much, want to learn how to accomplish recursive putting of files to my server,
via the command line, for future use.

I'd like to have a script, really, that will send everything up, only overwriting existing files on the remotes server
when the local file has been touched more recently.

Thanks,
Tony