09-02-2014
Quote:
read again the man pages of ftp.
Exactly!
Quote:
Why aren't you using sftp or scp?
or use tar to stuff the whole tree into a single file, transfer that, and untar it on the other side. This will preserve all the user information, file modes, etc., and might be, depending on your exact requirements (of which you gave us only a cursory impression), less effort.
I hope this helps.
bakunin
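The tar pipe bakunin suggests can be sketched as below; run locally, it shows the file modes surviving the round trip. The /tmp paths are made up for the demonstration, and the comment shows the networked variant.

```shell
# Local sketch of the tar pipe; the network form would be e.g.
#   tar -cf - -C /path/src . | ssh user@host 'tar -xpf - -C /path/dest'
mkdir -p /tmp/tardemo/src/sub /tmp/tardemo/dest
echo "payload" > /tmp/tardemo/src/sub/file.txt
chmod 640 /tmp/tardemo/src/sub/file.txt
# -C enters the tree first so archive paths are relative; -p restores modes
tar -cf - -C /tmp/tardemo/src . | tar -xpf - -C /tmp/tardemo/dest
```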
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
Another Unix question. How would I copy multiple directories at the same time? Right now I do:
cp -r -f /directory1/ ../backup/directory1/
I do that for each directory one at a time. But there are multiple directories I'd like to copy. So instead of sitting there and doing one at a time, is... (9 Replies)
Discussion started by: JPigford
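For the record, cp already takes several sources in one call when the final argument is a directory; a minimal sketch with made-up /tmp paths:

```shell
mkdir -p /tmp/cpdemo/directory1 /tmp/cpdemo/directory2 /tmp/cpdemo/backup
echo one > /tmp/cpdemo/directory1/a.txt
echo two > /tmp/cpdemo/directory2/b.txt
# Every argument before the last is a source; all are copied into the target
cp -r /tmp/cpdemo/directory1 /tmp/cpdemo/directory2 /tmp/cpdemo/backup/
```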
2. UNIX for Dummies Questions & Answers
Can we copy a file to multiple directories using a single command line? I tried with * but it didn't work for me:
cp /tmp/a.kool /tmp/folder/*/keys/
I am trying to copy the a.kool file to all keys folders in /tmp. Is there something I am missing? (4 Replies)
Discussion started by: logic0
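The glob cannot work as written because cp takes exactly one destination; a loop over the matching directories, one copy per match, behaves as intended (paths mirror the post):

```shell
mkdir -p /tmp/folder/one/keys /tmp/folder/two/keys
echo data > /tmp/a.kool
# The shell expands the glob to each keys/ directory; cp runs once per match
for d in /tmp/folder/*/keys/; do
    cp /tmp/a.kool "$d"
done
```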
3. UNIX for Dummies Questions & Answers
I need to copy around 30 directories (each directory includes one or more text files) from an NT server to a Unix server in one go. For doing this, what privileges should I have on both the NT and Unix servers?
Please let me know which command i can use in shell prompt.
TIA. (4 Replies)
Discussion started by: jhmr7
4. UNIX for Dummies Questions & Answers
Hi again All :)
After posting my first thread just a few weeks ago and having such a great response (Thank You once again :) ), I thought I'd perhaps ask the experts again. In short, I'm trying to achieve a "find" and "copy" where the find needs to find directories:
find -d -name outbox
and... (6 Replies)
Discussion started by: Dean Rotherham
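A portable form of that find-and-copy, with a made-up destination directory; since every hit is named outbox, copies of multiple hits merge into the same target subdirectory:

```shell
mkdir -p /tmp/finddemo/projA/outbox /tmp/finddemo/projB/outbox /tmp/finddemo/collected
echo msg > /tmp/finddemo/projA/outbox/m1
# -type d limits matches to directories; each match is copied recursively
find /tmp/finddemo/projA /tmp/finddemo/projB -type d -name outbox \
    -exec cp -r {} /tmp/finddemo/collected/ \;
```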
5. Shell Programming and Scripting
I am writing a simple backup script, but I cannot figure out how to remove directories that are found in a list. For example:
DONT_COPY="
.adobe/
.bin/google-earth
"
tar -zcvf - * --exclude=$DONT_COPY | openssl des3 -salt -k $1 | dd of=$(hostname)-$(date +%Y%m%d).tbz > COPIED
Note that... (4 Replies)
Discussion started by: dotancohen
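--exclude=$DONT_COPY cannot work as written: each --exclude carries exactly one pattern, and the unquoted multi-line variable splits into stray arguments. With GNU tar, one pattern per line in a file handed to --exclude-from is the usual fix (the /tmp paths here are invented):

```shell
cd /tmp && mkdir -p exdemo/.adobe exdemo/.bin && cd exdemo
echo keep > keep.txt
echo skip > .adobe/prefs
printf '%s\n' '.adobe' '.bin/google-earth' > /tmp/dont_copy.txt
# Each line of the file behaves like its own --exclude=PATTERN
tar -zcf /tmp/exdemo.tgz --exclude-from=/tmp/dont_copy.txt .
tar -ztf /tmp/exdemo.tgz
```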
6. Shell Programming and Scripting
Hello all,
I'm trying to copy all files within a specified directory to another location based on a find filter of mtime -1 (Solaris OS). The issue that I'm having is that in the destination directory, I want to retain the source directory structure while copying over only the files that have... (4 Replies)
Discussion started by: hunter55
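One way to carry only the recent files across while keeping their relative paths is to feed find's file list to tar. This sketch assumes GNU tar for -T - (on Solaris the traditional equivalent is find ... | cpio -pdm dest) and invents the /tmp paths:

```shell
mkdir -p /tmp/mtdemo/src/deep /tmp/mtdemo/dst
echo new > /tmp/mtdemo/src/deep/new.txt
echo old > /tmp/mtdemo/src/deep/old.txt
touch -t 202001010000 /tmp/mtdemo/src/deep/old.txt  # backdated: excluded by -mtime -1
cd /tmp/mtdemo/src
# -mtime -1 keeps files modified in the last 24h; tar -T - archives that list,
# so the deep/ structure is recreated on extraction
find . -type f -mtime -1 | tar -cf - -T - | tar -xf - -C /tmp/mtdemo/dst
```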
7. Shell Programming and Scripting
I want to write a script that copies a complete folder, including its subdirectories, to another
location.
However, in the process I want to ignore several filetypes that SHOULD NOT get copied over.
I know Global Ignore is capable of making the copy command ignore one file type, however
I don't know how... (8 Replies)
Discussion started by: pasc
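A tar-based tree copy can skip whole file types with one --exclude flag per pattern, repeated for each type (GNU tar assumed; '*.tmp' and the paths are stand-ins for the real types and locations):

```shell
mkdir -p /tmp/igdemo/src/sub /tmp/igdemo/dst
echo code > /tmp/igdemo/src/sub/app.sh
echo junk > /tmp/igdemo/src/sub/cache.tmp
# Quote the pattern so tar, not the shell, expands it; repeat --exclude per type
tar -cf - --exclude='*.tmp' -C /tmp/igdemo/src . | tar -xf - -C /tmp/igdemo/dst
```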
8. Shell Programming and Scripting
I have the following that I'd like to do:
1. I have split a file into separate files that I placed into the /tmp directory. These files are named F1 F2 F3 F4.
2. In addition, I have several directories which are alphabetized as dira dirb dirc dird.
3. I'd like to be able to copy F1 F2 F3 F4... (2 Replies)
Discussion started by: newbie2010
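A loop sketch using the names from the post (the /tmp layout is an assumption): cp takes all four files at once, so only the directories need iterating.

```shell
mkdir -p /tmp/fandemo/dira /tmp/fandemo/dirb /tmp/fandemo/dirc /tmp/fandemo/dird
for f in F1 F2 F3 F4; do echo "$f" > "/tmp/fandemo/$f"; done
# One cp per target directory copies all four split files in a single call
for d in /tmp/fandemo/dir?; do
    cp /tmp/fandemo/F1 /tmp/fandemo/F2 /tmp/fandemo/F3 /tmp/fandemo/F4 "$d"
done
```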
9. Shell Programming and Scripting
I have a simple script which copies a directory from one place to another and deletes the source.
I am facing a situation where new files get added after the script has started running. It's resulting in data loss.
Please suggest a way to avoid data loss. I googled a lot but most are perl... (11 Replies)
Discussion started by: ningy
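One way to close that window is to snapshot the file list first, then copy and delete only the listed names, so anything arriving mid-run is simply left for the next pass (paths are invented; filenames are assumed newline-free):

```shell
mkdir -p /tmp/racedemo/src /tmp/racedemo/dst
echo a > /tmp/racedemo/src/a.log
# 1. Freeze the set of names; files added later are not in the list
find /tmp/racedemo/src -type f > /tmp/racedemo/list
# 2. Copy then delete each listed file; never "rm -rf" the whole source
while IFS= read -r f; do
    cp "$f" /tmp/racedemo/dst/ && rm "$f"
done < /tmp/racedemo/list
```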
10. Shell Programming and Scripting
Guys, I did create a script but it's too long, though it functions the same.
# cat nightlyscan.sh
#!/usr/bin/ksh
deyt=`date +"%Y-%m-%d"`
for i in `ls -lrt|grep $deyt|awk '{print $9}'`
do
cp -f $i /S1/Sophos/logger/
done
#
but I did not paste it all.
This is the desired. (9 Replies)
Discussion started by: kenshinhimura
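The ls|grep|awk pipeline breaks on filenames with spaces and depends on ls's date format; with GNU find, -daystart -mtime 0 selects files modified since midnight directly (the logger path follows the script, the rest is made up; -daystart is a GNU extension):

```shell
mkdir -p /tmp/scandemo/logger
echo fresh > /tmp/scandemo/today.log
echo stale > /tmp/scandemo/old.log
touch -t 202001010000 /tmp/scandemo/old.log  # backdated so it is skipped
# -daystart -mtime 0 = modified since 00:00 today, replacing the ls/grep/awk trio
find /tmp/scandemo -maxdepth 1 -type f -daystart -mtime 0 \
    -exec cp -f {} /tmp/scandemo/logger/ \;
```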
LEARN ABOUT DEBIAN
backup-manager-upload
BACKUP-MANAGER-UPLOAD(8) backup-manager-upload BACKUP-MANAGER-UPLOAD(8)
NAME
backup-manager-upload - Multiprotocol uploader for backup-manager.
SYNOPSIS
backup-manager-upload [options] date
DESCRIPTION
backup-manager-upload will upload all the archives generated on the given date to the specified host with either ftp or scp. It's also
possible to use this program for uploading data to an Amazon S3 account. Some meta-dates are available, like "today" or "yesterday".
REQUIRED ARGS
--mode=transfer-mode
Select the transfer mode to use: ftp, scp, or s3.
--host=hostname1,hostname2,...,hostnameN
Select a list of remote hosts to connect to.
--user=username
Select the user to use for connection.
OPTIONAL ARGS
--password=password
Select the ftp user's password (only needed for ftp transfers).
--key=path_to_private_key
Select the ssh private key file to use when opening the ssh session for scp transfer. Obviously, this is only needed for scp transfer
mode. If you don't specify a key file, the user's default private key will be used.
--directory=directory
Select the location on the remote host where files will be uploaded. Default is /backup/uploads.
--bucket=bucket
Sets the name of the Amazon S3 bucket to back up into.
--root=directory
Select the local directory where the files are. Default is /var/archives.
--gpg-recipient=gpg-recipient
Select the gpg public key for encrypting the archives when uploading with the ssh-gpg method. This can be a short or long key id or a
descriptive name. The precise syntax is described in the gpg man page.
--list
Just list the files to upload.
--ftp-purge
Purge the remote directory before uploading files in FTP mode.
--s3-purge
Purge the remote directory before uploading files in S3 mode.
--ssh-purge
Purge the remote directory before uploading files in SSH mode.
--verbose
Flag to enable verbose mode.
date
Date pattern to select some files to upload, can be a valid date (YYYYMMDD) or 'today' or 'yesterday'.
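For illustration, an scp-mode invocation built from the options above might look like this (host, user, and key path are placeholders, not defaults of the tool):

```
backup-manager-upload --mode=scp --host=backup.example.com \
    --user=backup --key=/root/.ssh/id_rsa \
    --directory=/backup/uploads today
```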
ERROR CODES
If something goes wrong during an upload, backup-manager-upload will exit with a non-zero value. In such a case, all error messages are
sent to STDERR.
Here are the possible error codes:
bad command line (wrong arguments): 10
FTP transfer failure: 20
SCP transfer failure: 21
S3 transfer failure: 22
Unknown upload method: 23
SEE ALSO
backup-manager(3)
AUTHORS
Alexis Sukrieh - main code and design
Brad Dixon - Amazon S3 upload method
Jan Metzger - ssh-gpg upload method
perl v5.14.2 2012-05-09 BACKUP-MANAGER-UPLOAD(8)