Operating Systems > SCO
Backup of files using NFS: a faster way
Post 302967894 by trolley on Monday, 29 February 2016, 09:37 PM
I started playing around with rsync and it shows promise; it is much faster than the way I am doing it now. I will post the final shell script when it is done.
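In the meantime, here is a minimal sketch of the approach I am testing, assuming the data lives in /data and the NFS share is mounted at /mnt/nfsbackup (both paths, and the log location, are placeholders, not final values):

    #!/bin/sh
    # Minimal sketch: mirror a local tree to an NFS mount with rsync.
    # /data, /mnt/nfsbackup and the log path are placeholder assumptions.
    SRC=/data
    MNT=/mnt/nfsbackup
    LOG=/var/log/nfs_backup.log

    # Refuse to run if the share is not mounted, so we never write
    # into an empty local mount point.
    mount | grep -q " $MNT " || { echo "$MNT not mounted" >&2; exit 1; }

    # -a preserves permissions and timestamps, --delete mirrors
    # removals, and the trailing slash copies the contents of $SRC
    # rather than the directory itself. Repeat runs transfer only
    # changed files, which is where the speed-up comes from.
    rsync -a --delete "$SRC"/ "$MNT"/data/ >> "$LOG" 2>&1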
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Backup system to NFS Appliance device

I have been tasked with getting an AIX 4.3.3 box to back up to a NAS appliance which provides NFS service. It is an intermediary repository, so that other tools can transport the resulting backup file to another NAS appliance at a remote site on a secondary frame connection. Anyone have... (10 Replies)
Discussion started by: sirhisss

2. UNIX for Advanced & Expert Users

Faster Concatenation of files

Dear all, I am using the cat command to concatenate multiple files. Sometimes I also use the append operator when there are only a few files. Is there a faster way of concatenating multiple files (60 to 70 files), each around 156 MB? Thanks. (1 Reply)
Discussion started by: tkbharani

3. UNIX for Dummies Questions & Answers

Backup to NFS mount, Red Hat to Solaris

Hi guys, I have a Red Hat laptop and a Sun Solaris 8 server networked together. I created an NFS share on the Sun server and backed up an image of the Red Hat laptop to it. The hard disk size of the laptop is 40 GB, but I have only about 38 GB of free space on the Sun server, so I compressed the image... (9 Replies)
Discussion started by: Stin

4. Shell Programming and Scripting

Running the rename command on large numbers of files and making it faster

Hi all, I have some 80,000 files in a directory which I need to rename. Below is the command I am currently running, and it seems to take forever. Is there any way to speed it up? I have GNU Parallel installed on my... (6 Replies)
Discussion started by: shoaibjameel123

5. Shell Programming and Scripting

rsync backup mode (--backup): are there any options to remove backup folders on successful deployment?

Hi everyone, we are running rsync in --backup mode. Are there any rsync options to remove the backup folders on successful deployment? Thanks in advance. (0 Replies)
Discussion started by: MVEERA

6. Shell Programming and Scripting

Faster Line by Line String/Date Comparison of 2 Files

Hello, does anyone know a faster way to search and compare strings and dates from 2 files? I'm currently using a for loop, but it seems sluggish, as I have to cycle through 10 directories with 10 files each, every file containing thousands of lines. Given: -10 directories -10 files... (4 Replies)
Discussion started by: agentgrecko

7. Shell Programming and Scripting

Faster command to remove headers from files in a directory

Good evening. I'm new to Unix shell scripting, and I'm planning to write a shell script that removes the headers from about 120 files in a directory, each file containing about 200,000 lines on average. I know I will loop over the files to process each one, and I've found different solutions in this great forum... (5 Replies)
Discussion started by: alexcol

8. AIX

GTAR - new ways for faster backup - help required

We take a backup of our application data (COBOL file system, AIX/Unix) before and after the EOD job runs. The data size is approximately 260 GB at the biggest branch. To reduce the backup time, 5 parallel executions are scheduled through Control-M, which back the files up into 5 different *.gz archives. The job... (2 Replies)
Discussion started by: Bharath_79

9. AIX

GTAR - new ways for faster backup - help required

We take a backup of our application data (COBOL file system, AIX/Unix) before and after the EOD job runs. The data size is approximately 260 GB at the biggest branch. To reduce the backup time, 5 parallel executions are scheduled through Control-M, which back the files up into 5 different *.gz archives. The job... (8 Replies)
Discussion started by: Bharath_79

10. UNIX for Advanced & Expert Users

Need help with faster file read and grep in big files

I have a very big input file, inputFile1.txt, which holds a list of mobile numbers (around 1 million records): 3434343 3434323 0970978 85233 ... I have another, bigger file, inputFile2.txt, which holds log detail such as: afjhjdhfkjdhfkd df h8983 3434343 | 3483 | myout1 |... (3 Replies)
Discussion started by: reldb
PARALLEL-RSYNC(1)                                            PARALLEL-RSYNC(1)

NAME
       parallel-rsync - deploy files to listed hosts

SYNOPSIS
       parallel-rsync [OPTIONS] -h hosts.txt local remote

DESCRIPTION
       pssh provides a number of commands for executing against a group of
       computers, using SSH. It is most useful for operating on clusters of
       homogeneously-configured hosts. parallel-rsync deploys files to all
       the hosts you listed.

OPTIONS
       -r --recursive    recursively copy directories (OPTIONAL)
       -a --archive      use rsync -a (archive mode) (OPTIONAL)
       -z --compress     use rsync compression (OPTIONAL)
       -h --hosts        hosts file (each line "host[:port] [user]")
       -l --user         username (OPTIONAL)
       -p --par          max number of parallel threads (OPTIONAL)
       -o --outdir       output directory for stdout files (OPTIONAL)
       -e --errdir       output directory for stderr files (OPTIONAL)
       -t --timeout      timeout (secs) (-1 = no timeout) per host (OPTIONAL)
       -O --options      SSH options (OPTIONAL)
       -v --verbose      turn on warning and diagnostic messages (OPTIONAL)

EXAMPLE
       # parallel-rsync -r -h hosts.txt -l irb2 foo /home/irb2/foo

ENVIRONMENT
       All four programs take similar sets of options. All of these options
       can be set using the following environment variables:

       o PSSH_HOSTS
       o PSSH_USER
       o PSSH_PAR
       o PSSH_OUTDIR
       o PSSH_VERBOSE
       o PSSH_OPTIONS

SEE ALSO
       parallel-ssh(1), parallel-scp(1), parallel-slurp(1), parallel-nuke(1),
       ssh(1), rsync(1)

AUTHOR
       Brent N. Chun <bnc@theether.org>

COPYING
       Copyright: 2003, 2004, 2005, 2006, 2007 Brent N. Chun

NOTES
       1. bnc@theether.org
          mailto:bnc@theether.org

03/30/2009                                                   PARALLEL-RSYNC(1)
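To tie the man page back to this thread: a hypothetical run for pushing a backup tree to several machines in parallel might look like the following. The hosts file contents, the user name "backup", and both paths are placeholder assumptions for illustration, not values from the thread or the man page.

    # hosts.txt holds one target per line ("host[:port] [user]")
    $ cat hosts.txt
    sco1.example.com
    sco2.example.com

    # Copy /data/backup to /mnt/nfsbackup on every listed host, two
    # hosts at a time, using rsync archive mode plus compression.
    $ parallel-rsync -r -a -z -p 2 -h hosts.txt -l backup /data/backup /mnt/nfsbackup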