Operating Systems > SCO: Backup of files using NFS - a faster way
Post 302967774 by Scrutinizer on Sunday 28th of February 2016 05:15:08 AM
Also, the script posted in post #1 appears to use a single NAS backup volume (defined in /etc/fstab). IMO an approach like this should only be used if there are multiple backup NAS volumes, whether these are taken from the host itself or on the NAS side.

Otherwise this is not a safe way of doing it: the script first wipes the old backup before making a new one, so if anything goes wrong during that backup you end up with nothing.
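
For illustration, here is a minimal sketch of such a rotation scheme on a single NAS volume. The mount point /backup_nas, the source directory /data and keeping 3 generations are assumptions, not taken from the script in post #1:

Code:
#!/bin/sh
# Sketch: write every backup into its own dated directory on the NAS and only
# prune old generations after the new backup has completed successfully.
NAS=/backup_nas                          # assumed NFS mount point
SRC=/data                                # assumed data directory
DEST=$NAS/backup_`date +%Y%m%d_%H%M%S`

mkdir "$DEST" || exit 1

# Note: in a plain Bourne shell this 'if' only checks gzip's exit status,
# not tar's, so this is a sketch rather than a bullet-proof solution.
if tar cf - "$SRC" | gzip > "$DEST/data.tar.gz"
then
    # The new backup is on disk; keep the 3 newest generations, remove the rest.
    ls -dt "$NAS"/backup_* | tail -n +4 | while read old
    do
        rm -rf "$old"
    done
else
    echo "backup failed, older generations left untouched" >&2
    rm -rf "$DEST"
    exit 1
fi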
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Backup system to NFS Appliance device

I have been tasked with getting an AIX 4.3.3 box to back up to a NAS appliance device which provides NFS service. It is an intermediary repository so that other tools can transport the resulting backup file to another NAS appliance at a remote site on a secondary frame connection. Anyone have... (10 Replies)
Discussion started by: sirhisss
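
For reference, a rough sketch of that kind of setup from the AIX side; the NAS hostname, export path, mount point and file names below are made-up examples, not anything from the original thread:

Code:
# Mount the NFS export provided by the NAS appliance (all names are assumptions):
mkdir -p /mnt/nasbackup
mount nas01:/export/backups /mnt/nasbackup

# Write a bootable system image straight onto the NFS mount
# (-i regenerates /image.data before the backup is taken):
mksysb -i /mnt/nasbackup/aixhost.mksysb

# Or, for plain application data, a compressed tar archive:
tar cf - /appdata | gzip > /mnt/nasbackup/appdata.tar.gz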

2. UNIX for Advanced & Expert Users

Faster Concatenation of files

Dear All, I am using the cat command for concatenating multiple files. Sometimes I also use the append command when there are only a few files. Is there a faster way of concatenating multiple files (60 to 70 files), each of 156 MB or less/more? :) Thanks (1 Reply)
Discussion started by: tkbharani
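
For what it's worth, a single cat invocation writing through one redirection is usually about as fast as this gets, since the job is I/O-bound rather than CPU-bound; a sketch with assumed file names:

Code:
# One cat invocation, one output stream:
cat part_*.dat > combined.dat

# Appending a few extra files later instead of rebuilding the whole thing:
cat extra1.dat extra2.dat >> combined.dat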

3. UNIX for Dummies Questions & Answers

backup to NFS mount Redhat-Solaris

Hi guys, I have a Red Hat laptop and a Sun Solaris 8 server networked together. I created an NFS share on the Sun server and backed up an image of the Red Hat laptop to it. The hard disk size of the laptop is 40 GB but I have about 38 GB of free space on the Sun server. So I compressed the image... (9 Replies)
Discussion started by: Stin
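
A common trick when the target is smaller than the source is to compress the image on the fly while writing it to the NFS share; a sketch in which the device name and paths are assumptions:

Code:
# Image the laptop disk and compress it as it crosses the wire:
dd if=/dev/hda bs=64k | gzip -c > /mnt/sunshare/laptop.img.gz

# Restoring it later:
gzip -dc /mnt/sunshare/laptop.img.gz | dd of=/dev/hda bs=64k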

4. Shell Programming and Scripting

Running rename command on large files and make it faster

Hi All, I have some 80,000 files in a directory which I need to rename. Below is the command which I am currently running, and it seems it is taking forever to run. This command seems too slow. Is there any way to speed up the command? I have GNU Parallel installed on my... (6 Replies)
Discussion started by: shoaibjameel123
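
Since GNU Parallel is already installed, something along these lines may help; the job count and the .bak suffix are assumptions, and note that most of the per-file cost is simply the fork/exec of mv:

Code:
# Rename every file in the current directory to name.bak, 8 mv processes at a time
# (reading names from ls avoids expanding 80,000 names on one command line):
ls | parallel -j 8 mv {} {}.bak

# If the Perl rename utility is available, it renames many files per process,
# which is usually faster still (assumes no whitespace in the file names):
ls | xargs rename 's/$/.bak/'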

5. Shell Programming and Scripting

rsync backup mode (--backup): Are there any options to remove backup folders on successful deployment?

Hi Everyone, we are running rsync in --backup mode. Are there any rsync options to remove the backup folders on successful deployment? Thanks in adv. (0 Replies)
Discussion started by: MVEERA
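
I am not aware of an rsync option that removes its own backup directory afterwards, but a small wrapper around the exit status can do it; a sketch with made-up paths, assuming a local destination (for a remote destination the backup directory lives on the receiving side and would have to be removed there, e.g. via ssh):

Code:
#!/bin/sh
# Deploy with a dated backup directory; delete it only if rsync reported success.
BK=/deploy/backups/`date +%Y%m%d_%H%M%S`

if rsync -a --backup --backup-dir="$BK" --delete /stage/app/ /var/www/app/
then
    rm -rf "$BK"        # deployment OK, drop the safety copy
else
    echo "rsync failed, keeping backup in $BK" >&2
    exit 1
fi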

6. Shell Programming and Scripting

Faster Line by Line String/Date Comparison of 2 Files

Hello, I was wondering if anyone knows a faster way to search and compare strings and dates from 2 files? I'm currently using a "for loop" but it seems sluggish, as I have to cycle through 10 directories with 10 files each, containing thousands of lines. Given: -10 directories -10 files... (4 Replies)
Discussion started by: agentgrecko
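
A single awk pass per file pair is usually much faster than a shell for loop that compares line by line; a sketch that assumes the string is in field 1, the date in field 2, and made-up file names:

Code:
# Build a lookup table from the first file, then stream the second file once:
awk 'NR == FNR { ref[$1] = $2; next }
     ($1 in ref) && ref[$1] != $2 { print $1, "date changed:", ref[$1], "->", $2 }' old.txt new.txt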

7. Shell Programming and Scripting

Faster command to remove headers for files in a directory

Good evening. I'm new at Unix shell scripting and I'm planning to write a shell script that removes the headers from about 120 files in a directory, each file containing about 200,000 lines on average. I know I will loop over the files to process each one, and I've found in this great forum different solutions... (5 Replies)
Discussion started by: alexcol
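
One pass per file is normally enough for this; a sketch that strips the first line of every file in place, where the *.txt pattern and the single-line header are assumptions:

Code:
# Portable version: rewrite each file without its first line.
for f in *.txt
do
    tail -n +2 "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done

# With GNU sed the temporary-file handling is done for you:
sed -i '1d' *.txt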

8. AIX

GTAR - new ways for faster backup - help required

We are taking a backup of our application data (COBOL file system, AIX/Unix) before and after the EOD job runs. The data size is approximately 260 GB in the biggest branch. To reduce the backup time, 5 parallel executions are scheduled through Control-M, which back up the files into 5 different *.gz files. The job... (2 Replies)
Discussion started by: Bharath_79
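
For reference, the same idea can be driven directly from the shell, with one gtar|gzip stream per data area running in the background; the directory names are made up, and gzip -1 trades compression ratio for speed (a parallel gzip such as pigz can be dropped in where it is installed):

Code:
#!/bin/sh
for d in /appdata/area1 /appdata/area2 /appdata/area3 /appdata/area4 /appdata/area5
do
    ( gtar cf - "$d" | gzip -1 > /backup/`basename "$d"`.tar.gz ) &
done
wait        # do not report success until all five pipelines have finished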

9. AIX

GTAR - new ways to faster backup - help required

We are taking a backup of our application data (COBOL file system, AIX/Unix) before and after the EOD job runs. The data size is approximately 260 GB in the biggest branch. To reduce the backup time, 5 parallel executions are scheduled through Control-M, which back up the files into 5 different *.gz files. The job... (8 Replies)
Discussion started by: Bharath_79

10. UNIX for Advanced & Expert Users

Need help for faster file read and grep in big files

I have a very big input file <inputFile1.txt> which has a list of mobile numbers: inputFile1.txt: 3434343 3434323 0970978 85233 ... around 1 million records. I have another file, inputFile2.txt, which has some log detail (a big file): inputFile2.txt: afjhjdhfkjdhfkd df h8983 3434343 | 3483 | myout1 |... (3 Replies)
Discussion started by: reldb
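
For a pattern list of that size, a single pass with fixed-string matching is usually the biggest win over any per-number loop; a sketch in which only the output file name is an assumption:

Code:
# Treat every number in inputFile1.txt as a fixed string (-F) and match whole
# words (-w) so that 85233 does not also hit 7852334:
grep -Fw -f inputFile1.txt inputFile2.txt > matched.txt
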
rdiff(1)                                              General Commands Manual                                              rdiff(1)

NAME
       rdiff - compute and apply signature-based file differences

SYNOPSIS
       rdiff [options] signature old-file signature-file
       rdiff [options] delta signature-file new-file delta-file
       rdiff [options] patch basis-file delta-file new-file

USAGE
       You can use rdiff to update files, much like rsync does. However, unlike rsync, rdiff puts you in control. There are
       three steps to updating a file: signature, delta, and patch.

DESCRIPTION
       In every case where a filename must be specified, - may be used instead to mean either standard input or standard
       output as appropriate. Be aware that if you do this, you'll need to terminate your options with -- or rdiff will think
       you are passing it an empty option.

RETURN VALUE
       0 for successful completion, 1 for environmental problems (file not found, invalid options, IO error, etc.), 2 for a
       corrupt file and 3 for an internal error or unhandled situation in librsync or rdiff.

SEE ALSO
       librsync(3)

AUTHOR
       Martin Pool <mbp@samba.org>

       The original rsync algorithm was discovered by Andrew Tridgell. rdiff development has been supported by Linuxcare, Inc
       and VA Linux Systems.

$Date: 2002/01/25 21:25:34 $                                                                                               rdiff(1)
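
As a quick usage illustration of the three steps listed in the SYNOPSIS (the file names are just examples):

Code:
# 1. On the machine holding the old copy, summarise it into a small signature:
rdiff signature old-file old-file.sig

# 2. On the machine holding the new copy, compute a delta against that signature:
rdiff delta old-file.sig new-file new-file.delta

# 3. Back on the first machine, rebuild the new file from the old file plus the delta:
rdiff patch old-file new-file.delta new-file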