Special Forums: UNIX Desktop Questions & Answers
limit number of sub-dirs searched for files
Post 302437643 by MJThom713 on Thursday, 15 July 2010, 04:19:54 PM
The date is an actual directory; there is one directory for each date of the year.
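A minimal sketch of limiting how many sub-directory levels are searched, using find's -maxdepth (the directory layout below is made up to mirror per-date directories):

```shell
# Build a small throwaway tree: one file at each depth.
mkdir -p top/2010-07-15/deep
touch top/a.log top/2010-07-15/b.log top/2010-07-15/deep/c.log

# -maxdepth 2 stops find from descending past the date directories,
# so deep/c.log is not listed.
find top -maxdepth 2 -type f
```

Raising or lowering the -maxdepth number controls how many levels of date directories are scanned.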
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

I need to ls all files in 4-6 deep dirs

I need to print to a file a listing of all files below a certain directory. Example: I need to print to a file a listing of all files below the etc dir (including the subdirectories) with their full paths. Any ideas on how to do this with one command? Or is this something I need to do on all... (4 Replies)
Discussion started by: gforty
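The full-path recursive listing asked for above is a single find command; a sketch on a small throwaway tree (substitute the real directory, e.g. find /etc -type f > listing.txt):

```shell
# Demonstration tree; in real use replace "$PWD/tree" with /etc or
# whatever directory you need listed.
mkdir -p tree/sub/subsub
touch tree/one tree/sub/two tree/sub/subsub/three

# Starting find from an absolute path makes every printed name a
# full path; redirect to capture the listing in a file.
find "$PWD/tree" -type f > listing.txt
```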

2. UNIX for Advanced & Expert Users

limit to number of files in a given directory

Everyone, we are on a UNIX AIX 4.3 platform and our application is written such that all configuration files must reside in a specific directory. Currently there are over 10,000 files in this directory (and it is growing at about 300 per month). My question is: is there a physical limit to the... (2 Replies)
Discussion started by: hedrict

3. Shell Programming and Scripting

How to print the files names that being searched

Hello all, I'm searching in jar files using this one-liner: find . -name "*.jar" -print -exec jar -tvf {} \; | grep -n \/someClassName.class but I would also like to see the names of the jar files where the grep succeeded. What do I need to add to this command so it will give the file names? (2 Replies)
Discussion started by: umen

4. UNIX for Dummies Questions & Answers

Limit on number of logins

Hi! I'm currently using AIX 5.2 and would like to know where I can set a restriction on the number of logins a user can have. For example, I want to give 2 logins per user, but some users should get 3 logins and some users should get unlimited logins (without logging off the... (2 Replies)
Discussion started by: herath

5. Solaris

How to limit number of Commands

Is there a way that I can limit the number of commands that one user can run during a period of time? For example, a maximum of 10 commands per second. :) (3 Replies)
Discussion started by: winger0608

6. Shell Programming and Scripting

Searching across multiple files if pattern is available in all files searched

I have a list of patterns in a file; I want each of these patterns searched for in 4 files. I was wondering if this can be done in sed/awk. Say my 4 files to be searched are: > cat f1 abc/x(12) 1 abc/x 3 cde 2 zzz 3 fdf 4 > cat f2 fdf 4 cde 3 abc 2... (6 Replies)
Discussion started by: novice_man

7. UNIX for Dummies Questions & Answers

Limit Number of files

Hi guys, how can I limit the number of files on a disk or partition? Or, how can I set a limit on the inode count for a disk or partition? The file system is ext3 or ext4. (1 Reply)
Discussion started by: mhs
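For ext3/ext4 the inode count is fixed when the filesystem is created (e.g. with mkfs.ext4 -N <count> or -i <bytes-per-inode>) and cannot be raised later; capping inodes effectively caps the number of files. Current inode usage can be inspected with df, a quick sketch:

```shell
# Show inode totals, used and free counts for the filesystem
# holding the root directory (any mount point works).
df -i /
```

The IFree column tells you how many more files the filesystem can hold regardless of remaining byte capacity.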

8. Debian

Problem with files/dirs deletion

Hi, the other day I installed a PHP-based CMS (MODX) on my shell account and noticed that I couldn't delete any of the files/dirs it created afterwards. Also, I noticed that all that stuff is owned by username-www instead of username. I tried chown, chmod and using a PHP script to do the same wti... (4 Replies)
Discussion started by: pentago

9. Shell Programming and Scripting

Moving files into dirs corresponding to dates

I am trying to find a way to move files into corresponding date directories. i=0 while read line do array="$line" (( i++ )) done < <(ls) cd $(echo ${array}) echo ${array}} pwd #cd "$(array}" ] || mkdir 2015 cd "2015" ] || mkdir 02-February ] || mkdir 03-March ] || mkdir... (10 Replies)
Discussion started by: newbie2010
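A compact sketch of the behavior the post is after, moving each file into a YYYY/MM-Month directory taken from its modification time (GNU date assumed; the directory scheme follows the post, the sample file name is made up):

```shell
mkdir -p inbox
touch -d '2015-02-10' inbox/report.txt   # sample file with a known mtime
cd inbox

# Move every regular file into a YYYY/MM-Month directory derived
# from its modification time (GNU date's -r reads a file's mtime).
for f in *; do
    [ -f "$f" ] || continue
    dir=$(date -r "$f" +%Y/%m-%B)        # e.g. 2015/02-February
    mkdir -p "$dir"
    mv -- "$f" "$dir/"
done
```

This avoids hard-coding year and month directories: mkdir -p creates whatever path the file's timestamp calls for.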

10. UNIX for Beginners Questions & Answers

Limit number of files transferred

I have a folder on a remote server and it has 50 files. I would like to transfer these files in batches: the first 10 files, then the next 10, and so on. I'm using the mget command to transfer the files. How can I limit the transfer to 10 files at a time instead of copying all 50 at once? Thanks, Janarthan (5 Replies)
Discussion started by: Janarthan
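mget itself has no count limit, so one workaround is to slice the remote listing into groups of ten and fetch each name individually. A local stand-in sketch of that batching logic (cp plays the role of the real scp/ftp get; host and paths would be your own):

```shell
# Fake "remote" directory with 25 files, plus a destination.
mkdir -p remote dest
for i in $(seq 1 25); do touch "remote/file$i"; done

batch=10
# Fetch only the first $batch names; rerun with the next slice
# (e.g. sed -n '11,20p') to pick up the following batch.
ls remote | head -n "$batch" | while IFS= read -r name; do
    cp "remote/$name" dest/   # stand-in for: scp user@host:"dir/$name" dest/
done
```

The same head/sed slicing works on any listing the remote side can produce, so it applies equally to sftp or ftp scripted sessions.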
DUFF(1) 						    BSD General Commands Manual 						   DUFF(1)

NAME
     duff -- duplicate file finder

SYNOPSIS
     duff [-0HLPaeqprtz] [-d function] [-f format] [-l limit] [file ...]
     duff [-h]
     duff [-v]

DESCRIPTION
     The duff utility reports clusters of duplicates in the specified files and/or directories. In the default mode, duff prints a customizable
     header, followed by the names of all the files in the cluster. In excess mode, duff does not print a header, but instead for each cluster
     prints the names of all but the first of the files it includes.

     If no files are specified as arguments, duff reads file names from stdin.

     Note that as of version 0.4, duff ignores symbolic links to files, as that behavior was conceptually broken. Therefore, the -H, -L and -P
     options now apply only to directories.

     The following options are available:

     -0      If reading file names from stdin, assume they are null-terminated, instead of separated by newlines. Also, when printing file names
             and cluster headers, terminate them with null characters instead of newlines. This is useful for file names containing whitespace
             or other non-standard characters.

     -H      Follow symbolic links listed on the command line. This overrides any previous -L or -P option. Note that this only applies to
             directories, as symbolic links to files are never followed.

     -L      Follow all symbolic links. This overrides any previous -H or -P option. Note that this only applies to directories, as symbolic
             links to files are never followed.

     -P      Don't follow any symbolic links. This overrides any previous -H or -L option. This is the default. Note that this only applies to
             directories, as symbolic links to files are never followed.

     -a      Include hidden files and directories when searching recursively.

     -d function
             The message digest function to use. The supported functions are sha1, sha256, sha384 and sha512. The default is sha1.

     -e      Excess mode. List all but one file from each cluster of duplicates. Also suppresses output of the cluster header. This is useful
             when you want to automate removal of duplicate files and don't care which duplicates are removed.

     -f format
             Set the format of the cluster header.
             If the header is set to the empty string, no header line is printed.

             The following escape sequences are available:

             %n      The number of files in the cluster.
             %c      A legacy synonym for %d, for compatibility reasons.
             %d      The message digest of files in the cluster. This may not be combined with -t, as no digest is calculated.
             %i      The one-based index of the file cluster.
             %s      The size, in bytes, of a file in the cluster.
             %%      A '%' character.

             The default format string when using -t is:

                   %n files in cluster %i (%s bytes)

             The default format string for other modes is:

                   %n files in cluster %i (%s bytes, digest %d)

     -h      Display help information and exit.

     -l limit
             The minimum size of files to be sampled. If the size of files in a cluster is equal to or greater than the specified limit, duff
             will sample and compare a few bytes from the start of each file before calculating a full digest. This is strictly an optimization
             and does not affect which files are considered by duff. The default limit is zero bytes, i.e. sampling is used on all files.

     -q      Quiet mode. Suppress warnings and error messages.

     -p      Physical mode. Make duff consider physical files instead of hard links. If specified, multiple hard links to the same physical file
             will not be reported as duplicates.

     -r      Recursively search into all specified directories.

     -t      Thorough mode. Distrust digests as a guarantee of equality. In thorough mode, duff compares files byte by byte when their sizes
             match.

     -v      Display version information and exit.

     -z      Do not consider empty files to be equal. This option prevents empty files from being reported as duplicates.

EXAMPLES
     The command:

           duff -r foo/

     lists all duplicate files in the directory foo and its subdirectories.

     The command:

           duff -e0 * | xargs -0 rm

     removes all duplicate files in the current directory. Note that you have no control over which files in each cluster are selected by -e
     (excess mode). Use with care.

     The command:

           find . -name '*.h' -type f | duff

     lists all duplicate header files in the current directory and its subdirectories.

     The command:

           find . -name '*.h' -type f -print0 | duff -0 | xargs -0 -n1 echo

     lists all duplicate header files in the current directory and its subdirectories, correctly handling file names containing whitespace. Note
     the use of xargs and echo to remove the null separators again before listing.

DIAGNOSTICS
     The duff utility exits 0 on success, and >0 if an error occurs.

SEE ALSO
     find(1), xargs(1)

AUTHORS
     Camilla Berglund <elmindreda@elmindreda.org>

BUGS
     duff doesn't check whether the same file has been specified twice on the command line. This will lead it to report files listed multiple
     times as duplicates when not using -p (physical mode). Note that this problem only affects files, not directories.

     duff no longer (as of version 0.4) reports symbolic links to files as duplicates, as they're by definition always duplicates. This may
     break scripts relying on the previous behavior.

     If the underlying files are modified while duff is running, all bets are off. This is not really a bug, but it can still bite you.

BSD
January 18, 2012 BSD
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.