Backup of files using NFS a faster way (Operating Systems > SCO)
Post 302971563 by trolley on Thursday, 21 April 2016, 08:44:12 PM
I have added the -t and -p qualifiers to my script.
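The thread does not show the underlying command, so purely for reference: pax is one standard tool where both letters apply, with -p choosing which attributes to preserve and -t leaving the source files' access times untouched. An illustrative sketch only, with invented paths:

    # copy the tree onto the NFS mount, preserving ownership, modes and times,
    # without disturbing the access times of the source files
    pax -rw -pe -t /data /nfs/backup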

Thanks to all that responded.
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Backup system to NFS Appliance device

I have been tasked with getting an AIX 4.3.3 box to back up to a NAS appliance device which provides NFS service. It is an intermediary repository, so that other tools can transport the resulting backup file to another NAS appliance at a remote site over a secondary frame connection. Anyone have... (10 Replies)
Discussion started by: sirhisss
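A hedged sketch of the usual approach, assuming the appliance's export is simply mounted and the backup written straight to it (host, export and mount point below are made up):

    # hypothetical host, export and mount point
    mount nas01:/export/backups /mnt/nasbackup
    mksysb -i /mnt/nasbackup/aixbox.mksysb      # bootable system image
    tar -cvf /mnt/nasbackup/appdata.tar /data   # plain tar for application data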

2. UNIX for Advanced & Expert Users

Faster Concatenation of files

Dear All, I am using the cat command to concatenate multiple files. Sometimes I also use append redirection when there are only a few files. Is there a faster way of concatenating multiple files (60 to 70 files), each around 156 MB, more or less? :) Thanks (1 Reply)
Discussion started by: tkbharani
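For what it's worth, concatenation is almost entirely disk-bound, so a single cat with one output redirection is close to optimal; appending in a loop mostly adds open/close overhead. A sketch with made-up file names:

    # one pass, one output file open
    cat part_*.dat > combined.dat

    # the loop-and-append variant re-opens combined.dat for every input file:
    # for f in part_*.dat; do cat "$f" >> combined.dat; done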

3. UNIX for Dummies Questions & Answers

backup to NFS mount Redhat-Solaris

Hi guys, I have a Red Hat laptop and a Sun Solaris 8 server networked together. I created an NFS share on the Sun server and backed up an image of the Red Hat laptop to it. The hard disk size of the laptop is 40 GB, but I have only about 38 GB of free space on the Sun server, so I compressed the image... (9 Replies)
Discussion started by: Stin
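One way to fit a 40 GB disk into 38 GB of free space is to compress the image while it is being written, so an uncompressed copy never lands on the share. The device name and mount point below are assumptions:

    # stream the disk image through gzip straight onto the NFS mount
    dd if=/dev/sda bs=1M | gzip -c > /mnt/sunshare/laptop.img.gz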

4. Shell Programming and Scripting

Running the rename command on large files and making it faster

Hi All, I have some 80,000 files in a directory which I need to rename. Below is the command which I am currently running and it seems it is taking forever to run. This command seems too slow. Is there any way to speed up the command? I have GNU Parallel installed on my... (6 Replies)
Discussion started by: shoaibjameel123
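Since the original command is not quoted in full, here is only a generic sketch of using GNU Parallel to run the renames concurrently (the pattern and suffix are invented):

    # rename *.log to *.log.bak, eight renames in flight at a time
    printf '%s\n' *.log | parallel -j 8 'mv {} {}.bak'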

5. Shell Programming and Scripting

rsync backup mode (--backup): Are there any options to remove backup folders on successful deployment?

Hi Everyone, we are running rsync in --backup mode. Are there any rsync options to remove the backup folders on successful deployment? Thanks in adv. (0 Replies)
Discussion started by: MVEERA
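As far as I know, rsync has no switch to delete its own backup directory afterwards; the usual workaround is to remove it from the calling script only when rsync exits successfully. Paths below are hypothetical:

    # keep the backup folder only if the transfer failed
    BK=/deploy/backup-$(date +%F)
    rsync -a --backup --backup-dir="$BK" src/ dest/ && rm -rf "$BK"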

6. Shell Programming and Scripting

Faster Line by Line String/Date Comparison of 2 Files

Hello, I was wondering if anyone knows a faster way to search and compare strings and dates from 2 files? I'm currently using a "for loop" but it seems sluggish, as I have to cycle through 10 directories with 10 files each containing thousands of lines. Given: -10 directories -10 files... (4 Replies)
Discussion started by: agentgrecko
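A loop that re-reads the second file for every line of the first is quadratic; loading the first file into an awk array and making a single pass over the second is usually far faster. The column layout below is an assumption (key in field 1, ISO date in field 2, tab-separated):

    # print lines of file2 whose key exists in file1 and whose date is not older
    awk -F'\t' 'NR==FNR { cutoff[$1] = $2; next }
                ($1 in cutoff) && $2 >= cutoff[$1]' file1 file2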

7. Shell Programming and Scripting

Faster command to remove headers for files in a directory

Good evening. I'm new to Unix shell scripting and I'm planning to write a shell script that removes the headers from about 120 files in a directory, each file containing about 200,000 lines on average. I know I will loop over the files to process each one, and I've found in this great forum different solutions... (5 Replies)
Discussion started by: alexcol
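A short sketch, assuming GNU sed is available and the header is exactly one line (directory and glob are made up):

    # delete the first line of each file in place
    for f in /data/incoming/*.txt; do
        sed -i '1d' "$f"
    done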

8. AIX

GTAR - new ways for faster backup - help required

We are taking a backup of our application data (COBOL file system, AIX/UNIX) before and after the EOD job runs. The data size is approximately 260 GB in the biggest branch. To reduce the backup time, 5 parallel executions are scheduled through Control-M, which back up the files into 5 different *.gz archives. The job... (2 Replies)
Discussion started by: Bharath_79
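With gzip, compression is usually the bottleneck rather than tar itself, so another option (assuming pigz is installed) is a single gtar stream fed through a parallel compressor instead of five separate jobs. Paths are hypothetical:

    # one archive, compressed on several CPUs at once
    gtar -cf - /app/data | pigz -p 8 > /backup/appdata.tar.gz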

9. AIX

GTAR - new ways to faster backup - help required

We are taking a backup of our application data (COBOL file system, AIX/UNIX) before and after the EOD job runs. The data size is approximately 260 GB in the biggest branch. To reduce the backup time, 5 parallel executions are scheduled through Control-M, which back up the files into 5 different *.gz archives. The job... (8 Replies)
Discussion started by: Bharath_79

10. UNIX for Advanced & Expert Users

Need help for faster file read and grep in big files

I have a very big input file, inputFile1.txt, which has a list of mobile numbers (around 1 million records), e.g.:
inputFile1.txt: 3434343 3434323 0970978 85233 ...
I have another big file, inputFile2.txt, which has log detail lines such as:
inputFile2.txt: afjhjdhfkjdhfkd df h8983 3434343 | 3483 | myout1 |... (3 Replies)
Discussion started by: reldb
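With a million patterns, the key is to make one pass with fixed-string matching rather than one grep per number; the input file names come from the post, the output name is invented:

    # treat every line of inputFile1.txt as a literal word to match in inputFile2.txt
    grep -Fw -f inputFile1.txt inputFile2.txt > matched.log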
svn-fast-backup(1)                    General Commands Manual                    svn-fast-backup(1)

NAME
       svn-fast-backup - very fast backup for Subversion fsfs repositories.

SYNOPSIS
       svn-fast-backup [-q] [-k{N|all}] [-f] [-t] [-s] repos_path backup_dir

DESCRIPTION
       svn-fast-backup uses rsync snapshots for very fast backup of a Subversion fsfs repository
       at repos_path to backup_dir/repos-rev, the latest revision number in the repository.
       Multiple fsfs backups share data via hardlinks, so old backups are almost free, since a
       newer revision of a repository is almost a complete superset of an older revision. This is
       good for replacing incremental log-dump+restore-style backups because it is just as
       space-conserving and even faster; there is no inter-backup state (old backups are
       essentially caches); each backup directory is self-contained. It has the same command-line
       interface as svn-hot-backup(1) (if you use --force), but only works for fsfs repositories.
       svn-fast-backup keeps 64 backups by default and deletes backups older than these; this can
       be adjusted with the -k option.

OPTIONS
       -h, --help       Show some brief help text.
       -q, --quiet      Quieter-than-usual operation.
       -k, --keep=N     Keep a specified number of backups; the default is to keep 64.
       -k, --keep=all   Do not delete any old backups at all.
       -f, --force      Make a new backup even if one with the current revision exists.
       -t, --trace      Show actions.
       -s, --simulate   Do not perform actions.

AUTHOR
       Voluntary contributions made by many individuals.
       Copyright (C) 2006 CollabNet.

2006-11-09                                                                       svn-fast-backup(1)
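A minimal invocation following the synopsis above (the repository and backup paths are hypothetical):

    # snapshot the repository into /backups/svn/<name>-<rev>, keeping the 30 most recent backups
    svn-fast-backup --keep=30 /var/svn/projects /backups/svn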