Operating Systems > SCO — Backup of files using NFS: a faster way
Post 302971549 by MadeInGermany, Thursday, 21 April 2016, 03:03 PM
Consider adding the -p and -t options to also preserve permissions and timestamps.
Or use rsync -auvH, which additionally preserves hard links and other special files (something that cp -rp cannot do).
-a means archive mode; see
Code:
man rsync
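A minimal sketch of both approaches. The SRC and DST paths here are hypothetical stand-ins (temporary directories so the script is self-contained); in practice SRC would be your data directory and DST the NFS mount point. The rsync step is skipped if rsync is not installed.

```shell
#!/bin/sh
# Hypothetical stand-in directories; replace with your data directory
# and the NFS mount point.
SRC=$(mktemp -d)
DST=$(mktemp -d)
echo "hello" > "$SRC/file.txt"

# cp: -r recurses, -p preserves mode, ownership, and timestamps
cp -rp "$SRC" "$DST/cp-copy"

# rsync: -a (archive) implies -rlptgoD, i.e. recurse and preserve
# symlinks, permissions, times, group, owner, and device/special files;
# -u skips files that are newer on the receiver, -v is verbose,
# -H additionally preserves hard links (which cp -rp does not)
if command -v rsync >/dev/null 2>&1; then
    rsync -auvH "$SRC"/ "$DST/rsync-copy"/
fi
```

Note the trailing slash on the rsync source: "$SRC"/ copies the directory's contents into the destination, rather than creating a nested copy of the directory itself.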

Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.