Shell Programming and Scripting: disk space used for files within a directory structure
Post 60304 by Perderabo on Friday 14th of January 2005 12:47:22 PM
I too am lost, but if you cd to fileserver1, you can do:
find 123* -type f
to get a list of files under the 123* directory. Is this a list of files that you want? Next you can do:
find 123* -type f | xargs ls -s
to see the files with their sizes. Just want a total? Use:
find 123* -type f | xargs ls -s | awk '{x+=$1} END {print x}'
Since -s gives the size in blocks, you might want to use "print x * 512" to get the size in bytes. (Note that the block size ls -s reports varies by system; GNU ls defaults to 1024-byte blocks unless POSIXLY_CORRECT is set.) This should be enough ideas to get you started.
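In the same spirit, here is a small self-contained sketch of the idea. A scratch directory stands in for the poster's 123* tree (an assumption for illustration), and wc -c is used instead of ls -s so the sum is in exact bytes, sidestepping the block-size question:

```shell
# Create a scratch tree with two files of known size.
dir=$(mktemp -d)
printf 'aaaa' > "$dir/a"        # 4 bytes
printf 'bbbbbb' > "$dir/b"      # 6 bytes

# Same pipeline shape as above, but wc -c reports exact byte counts.
# wc prints a "total" line when given several files, so awk skips it
# while summing column 1.
total=$(find "$dir" -type f -exec wc -c {} + \
        | awk '$2 != "total" {x += $1} END {print x}')
echo "$total"

rm -rf "$dir"
```

With the two files above this prints 10. Using -exec ... + keeps the approach safe for odd file names, which a bare `find | xargs` pipeline is not.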
 

DUFF(1) 						    BSD General Commands Manual 						   DUFF(1)

NAME
     duff -- duplicate file finder

SYNOPSIS
     duff [-0HLPaeqprtz] [-d function] [-f format] [-l limit] [file ...]
     duff [-h]
     duff [-v]

DESCRIPTION
     The duff utility reports clusters of duplicates in the specified files
     and/or directories.  In the default mode, duff prints a customizable
     header, followed by the names of all the files in the cluster.  In
     excess mode, duff does not print a header, but instead for each cluster
     prints the names of all but the first of the files it includes.

     If no files are specified as arguments, duff reads file names from
     stdin.

     Note that as of version 0.4, duff ignores symbolic links to files, as
     that behavior was conceptually broken.  Therefore, the -H, -L and -P
     options now apply only to directories.

     The following options are available:

     -0      If reading file names from stdin, assume they are
             null-terminated, instead of separated by newlines.  Also, when
             printing file names and cluster headers, terminate them with
             null characters instead of newlines.  This is useful for file
             names containing whitespace or other non-standard characters.

     -H      Follow symbolic links listed on the command line.  This
             overrides any previous -L or -P option.  Note that this only
             applies to directories, as symbolic links to files are never
             followed.

     -L      Follow all symbolic links.  This overrides any previous -H or
             -P option.  Note that this only applies to directories, as
             symbolic links to files are never followed.

     -P      Don't follow any symbolic links.  This overrides any previous
             -H or -L option.  This is the default.  Note that this only
             applies to directories, as symbolic links to files are never
             followed.

     -a      Include hidden files and directories when searching
             recursively.

     -d function
             The message digest function to use.  The supported functions
             are sha1, sha256, sha384 and sha512.  The default is sha1.

     -e      Excess mode.  List all but one file from each cluster of
             duplicates.  Also suppresses output of the cluster header.
             This is useful when you want to automate removal of duplicate
             files and don't care which duplicates are removed.

     -f format
             Set the format of the cluster header.  If the header is set to
             the empty string, no header line is printed.

             The following escape sequences are available:

             %n      The number of files in the cluster.
             %c      A legacy synonym for %d, for compatibility reasons.
             %d      The message digest of files in the cluster.  This may
                     not be combined with -t as no digest is calculated.
             %i      The one-based index of the file cluster.
             %s      The size, in bytes, of a file in the cluster.
             %%      A '%' character.

             The default format string when using -t is:

                   %n files in cluster %i (%s bytes)

             The default format string for other modes is:

                   %n files in cluster %i (%s bytes, digest %d)

     -h      Display help information and exit.

     -l limit
             The minimum size of files to be sampled.  If the size of files
             in a cluster is equal to or greater than the specified limit,
             duff will sample and compare a few bytes from the start of each
             file before calculating a full digest.  This is strictly an
             optimization and does not affect which files are considered by
             duff.  The default limit is zero bytes, i.e. to use sampling on
             all files.

     -q      Quiet mode.  Suppress warnings and error messages.

     -p      Physical mode.  Make duff consider physical files instead of
             hard links.  If specified, multiple hard links to the same
             physical file will not be reported as duplicates.

     -r      Recursively search into all specified directories.

     -t      Thorough mode.  Distrust digests as a guarantee for equality.
             In thorough mode, duff compares files byte by byte when their
             sizes match.

     -v      Display version information and exit.

     -z      Do not consider empty files to be equal.  This option prevents
             empty files from being reported as duplicates.

EXAMPLES
     The command:

           duff -r foo/

     lists all duplicate files in the directory foo and its subdirectories.

     The command:

           duff -e0 * | xargs -0 rm

     removes all duplicate files in the current directory.  Note that you
     have no control over which files in each cluster are selected by -e
     (excess mode).  Use with care.

     The command:

           find . -name '*.h' -type f | duff

     lists all duplicate header files in the current directory and its
     subdirectories.

     The command:

           find . -name '*.h' -type f -print0 | duff -0 | xargs -0 -n1 echo

     lists all duplicate header files in the current directory and its
     subdirectories, correctly handling file names containing whitespace.
     Note the use of xargs and echo to remove the null separators again
     before listing.

DIAGNOSTICS
     The duff utility exits 0 on success, and >0 if an error occurs.

SEE ALSO
     find(1), xargs(1)

AUTHORS
     Camilla Berglund <elmindreda@elmindreda.org>

BUGS
     duff doesn't check whether the same file has been specified twice on
     the command line.  This will lead it to report files listed multiple
     times as duplicates when not using -p (physical mode).  Note that this
     problem only affects files, not directories.

     duff no longer (as of version 0.4) reports symbolic links to files as
     duplicates, as they're by definition always duplicates.  This may break
     scripts relying on the previous behavior.

     If the underlying files are modified while duff is running, all bets
     are off.  This is not really a bug, but it can still bite you.

BSD                            January 18, 2012                            BSD
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.