Modifying command for Tar.gz Files. Post 302230140 by era on August 28, 2008, 05:26:19 PM
Should work without any modification. At least here, zgrep doesn't complain if you give it files that are not gzipped (although restricting the find to -name '*.gz' might be a good idea for performance reasons).
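A minimal sketch of the idea (the file names and the search pattern are made up for illustration):

```shell
# zgrep reads gzip-compressed files transparently, so the usual find
# pipeline only needs a -name filter to skip the non-.gz files.
tmp=$(mktemp -d)
echo 'fatal error in module' | gzip > "$tmp/app.log.gz"
echo 'error in plain text'   > "$tmp/plain.log"

# -l prints the names of matching files; only *.gz files are searched.
matches=$(find "$tmp" -name '*.gz' -exec zgrep -l 'error' {} +)
echo "$matches"

rm -rf "$tmp"
```

As noted above, zgrep copes with uncompressed input too, so dropping the -name filter still works; the filter just avoids a wasted decompression attempt per file.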
 

DUFF(1)                   BSD General Commands Manual                  DUFF(1)

NAME
     duff -- duplicate file finder

SYNOPSIS
     duff [-0HLPaeqprtz] [-d function] [-f format] [-l limit] [file ...]
     duff [-h]
     duff [-v]

DESCRIPTION
     The duff utility reports clusters of duplicates in the specified files
     and/or directories.  In the default mode, duff prints a customizable
     header, followed by the names of all the files in the cluster.  In
     excess mode, duff does not print a header, but instead for each cluster
     prints the names of all but the first of the files it includes.

     If no files are specified as arguments, duff reads file names from
     stdin.

     Note that as of version 0.4, duff ignores symbolic links to files, as
     that behavior was conceptually broken.  Therefore, the -H, -L and -P
     options now apply only to directories.

     The following options are available:

     -0      If reading file names from stdin, assume they are
             null-terminated, instead of separated by newlines.  Also, when
             printing file names and cluster headers, terminate them with
             null characters instead of newlines.  This is useful for file
             names containing whitespace or other non-standard characters.

     -H      Follow symbolic links listed on the command line.  This
             overrides any previous -L or -P option.  Note that this only
             applies to directories, as symbolic links to files are never
             followed.

     -L      Follow all symbolic links.  This overrides any previous -H or
             -P option.  Note that this only applies to directories, as
             symbolic links to files are never followed.

     -P      Don't follow any symbolic links.  This overrides any previous
             -H or -L option.  This is the default.  Note that this only
             applies to directories, as symbolic links to files are never
             followed.

     -a      Include hidden files and directories when searching
             recursively.

     -d function
             The message digest function to use.  The supported functions
             are sha1, sha256, sha384 and sha512.  The default is sha1.

     -e      Excess mode.  List all but one file from each cluster of
             duplicates.  Also suppresses output of the cluster header.
             This is useful when you want to automate removal of duplicate
             files and don't care which duplicates are removed.

     -f format
             Set the format of the cluster header.  If the header is set to
             the empty string, no header line is printed.

             The following escape sequences are available:

             %n      The number of files in the cluster.
             %c      A legacy synonym for %d, for compatibility reasons.
             %d      The message digest of files in the cluster.  This may
                     not be combined with -t as no digest is calculated.
             %i      The one-based index of the file cluster.
             %s      The size, in bytes, of a file in the cluster.
             %%      A '%' character.

             The default format string when using -t is:

                   %n files in cluster %i (%s bytes)

             The default format string for other modes is:

                   %n files in cluster %i (%s bytes, digest %d)

     -h      Display help information and exit.

     -l limit
             The minimum size of files to be sampled.  If the size of files
             in a cluster is equal to or greater than the specified limit,
             duff will sample and compare a few bytes from the start of each
             file before calculating a full digest.  This is strictly an
             optimization and does not affect which files are considered by
             duff.  The default limit is zero bytes, i.e. sampling is used
             on all files.

     -q      Quiet mode.  Suppress warnings and error messages.

     -p      Physical mode.  Make duff consider physical files instead of
             hard links.  If specified, multiple hard links to the same
             physical file will not be reported as duplicates.

     -r      Recursively search into all specified directories.

     -t      Thorough mode.  Distrust digests as a guarantee for equality.
             In thorough mode, duff compares files byte by byte when their
             sizes match.

     -v      Display version information and exit.

     -z      Do not consider empty files to be equal.  This option prevents
             empty files from being reported as duplicates.

EXAMPLES
     The command:

           duff -r foo/

     lists all duplicate files in the directory foo and its subdirectories.

     The command:

           duff -e0 * | xargs -0 rm

     removes all duplicate files in the current directory.  Note that you
     have no control over which files in each cluster are selected by -e
     (excess mode).  Use with care.

     The command:

           find . -name '*.h' -type f | duff

     lists all duplicate header files in the current directory and its
     subdirectories.

     The command:

           find . -name '*.h' -type f -print0 | duff -0 | xargs -0 -n1 echo

     lists all duplicate header files in the current directory and its
     subdirectories, correctly handling file names containing whitespace.
     Note the use of xargs and echo to remove the null separators again
     before listing.

DIAGNOSTICS
     The duff utility exits 0 on success, and >0 if an error occurs.

SEE ALSO
     find(1), xargs(1)

AUTHORS
     Camilla Berglund <elmindreda@elmindreda.org>

BUGS
     duff doesn't check whether the same file has been specified twice on
     the command line.  This will lead it to report files listed multiple
     times as duplicates when not using -p (physical mode).  Note that this
     problem only affects files, not directories.

     duff no longer (as of version 0.4) reports symbolic links to files as
     duplicates, as they're by definition always duplicates.  This may break
     scripts relying on the previous behavior.

     If the underlying files are modified while duff is running, all bets
     are off.  This is not really a bug, but it can still bite you.

BSD                            January 18, 2012                            BSD
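For readers without duff installed, the core of its default mode -- group files by message digest and report clusters of two or more -- can be sketched in plain shell. The file names below are made up, and this ignores duff's size pre-check, sampling (-l), and thorough mode (-t):

```shell
# Sketch of digest-based duplicate clustering, as duff's default mode
# does it: hash every file, then report digests seen more than once.
tmp=$(mktemp -d)
printf 'same\n'      > "$tmp/a"
printf 'same\n'      > "$tmp/b"
printf 'different\n' > "$tmp/c"

clusters=$(
  find "$tmp" -type f -exec sha256sum {} + |
  sort |
  awk '{ files[$1] = files[$1] " " $2; count[$1]++ }
       END { for (d in count) if (count[d] > 1)
               print count[d] " files:" files[d] }'
)
echo "$clusters"

rm -rf "$tmp"
```

Here a and b land in one two-file cluster while c is not reported; a real duff would additionally compare sizes first, so files of different lengths are never even hashed.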
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.