Full Discussion: SFTP large number of files
Posted by Corona688 on Thursday, 25 April 2013 at 12:37 PM
rsync, or 'remote sync', is an extremely common utility for duplicating entire trees of files. It is intelligent enough to avoid recopying files that are already up to date while still copying ones that are out of date, corrupted, or missing. I've seen it used to manage hundreds of thousands of files in one go.

At its most basic, you give it a source directory and a destination directory, and it updates the contents of the destination to match the source. It has many options to control how it works. The manual page is a long read but useful.

Code:
$ mkdir a b
$ touch a/1 a/2 a/3 a/4
$ rsync -r a/ b          # Note the trailing slash: a/ copies the contents of a; plain a would create b/a
$ ls b
1 2 3 4
$ rm a/1                 # a now only contains 2 3 4
$ rsync -r a/ b          # without --delete, extra files in b are left alone
$ ls b
1 2 3 4
$ rsync -r --delete a/ b # --delete removes files from b that no longer exist in a
$ ls b
2 3 4
$

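By default rsync decides whether a file needs copying from its size and modification time, which is what makes it fast. If you are worried about the "corrupted" case mentioned above, where size and timestamp match but the contents don't, the -c (--checksum) option makes it compare file contents as well, at the cost of reading every file on both sides. A minimal sketch, reusing the a and b directories from above:

Code:
$ rsync -rc a/ b   # -c / --checksum: compare file contents, not just size and modification time
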
You don't need to run an rsync daemon on the destination to copy to a remote machine; as long as rsync is installed on both ends, you can use it over ssh much like scp: rsync -r /path/to/source/ user@host:/path/to/dest ...with the difference that rsync only copies files that are new or have changed, instead of blindly recopying everything. (A dedicated rsync daemon, reached with the host::module syntax, is also an option, but plain ssh access is enough.)
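For example, a typical push of a local tree to a remote host over ssh might look like the sketch below; the hostname and paths are placeholders, not something from the original post. -a (archive) implies -r and preserves permissions and timestamps, -z compresses data in transit, and --partial keeps partially transferred files so an interrupted run can pick up where it left off:

Code:
$ rsync -az --partial --progress /path/to/source/ user@host:/path/to/dest/
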

For pushing the same tree to many hosts at once, there is parallel-rsync from the pssh suite; its man page follows.

PARALLEL-RSYNC(1)                                            PARALLEL-RSYNC(1)

NAME
       parallel-rsync - deploy files to listed hosts

SYNOPSIS
       parallel-rsync [OPTIONS] -h hosts.txt local remote

DESCRIPTION
       pssh provides a number of commands for executing against a group of
       computers, using SSH. It's most useful for operating on clusters of
       homogeneously-configured hosts. parallel-rsync deploys files to all
       the hosts you list.

OPTIONS
       -r --recursive   recursively copy directories (OPTIONAL)
       -a --archive     use rsync -a (archive mode) (OPTIONAL)
       -z --compress    use rsync compression (OPTIONAL)
       -h --hosts       hosts file (each line "host[:port] [user]")
       -l --user        username (OPTIONAL)
       -p --par         max number of parallel threads (OPTIONAL)
       -o --outdir      output directory for stdout files (OPTIONAL)
       -e --errdir      output directory for stderr files (OPTIONAL)
       -t --timeout     timeout (secs) (-1 = no timeout) per host (OPTIONAL)
       -O --options     SSH options (OPTIONAL)
       -v --verbose     turn on warning and diagnostic messages (OPTIONAL)

EXAMPLE
       # parallel-rsync -r -h hosts.txt -l irb2 foo /home/irb2/foo

ENVIRONMENT
       All four programs take similar sets of options. All of these options
       can be set using the following environment variables: PSSH_HOSTS,
       PSSH_USER, PSSH_PAR, PSSH_OUTDIR, PSSH_VERBOSE, PSSH_OPTIONS

SEE ALSO
       parallel-ssh(1), parallel-scp(1), parallel-slurp(1), parallel-nuke(1),
       ssh(1), rsync(1)

AUTHOR
       Brent N. Chun <bnc@theether.org>

COPYING
       Copyright: 2003, 2004, 2005, 2006, 2007 Brent N. Chun

03/30/2009                                                   PARALLEL-RSYNC(1)
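A quick sketch of the hosts-file format described in the OPTIONS section and an invocation of parallel-rsync; the hostnames, user, and paths here are made up for illustration:

Code:
$ cat hosts.txt
web01:22 deploy
web02:22 deploy
web03
$ parallel-rsync -r -z -h hosts.txt -l deploy wwwroot /var/www/wwwroot
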