03-28-2009
Please clarify your question: what is the format of the names of the created subdirectories, and which directories should be deleted? Give some examples.
Regards
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
I would like to write a script that deletes all files older than 7 days in a directory and its subdirectories. Can anyone help me out with the magic command or script?
Thanks in advance,
Odogboly98:confused: (3 Replies)
Discussion started by: odogbolu98
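The usual answer here is a single find command; find descends into subdirectories on its own. A sketch, with /path/to/dir standing in for the real directory:

```shell
# Delete regular files modified more than 7 days ago under /path/to/dir
# (a placeholder path). -type f skips directories themselves;
# -mtime +7 means "more than 7 full 24-hour periods ago".
find /path/to/dir -type f -mtime +7 -exec rm -f {} +
```

It is worth previewing first by replacing `rm -f` with `ls -l` to see what would be removed.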
2. Shell Programming and Scripting
I'm new to shell scripting... can anyone help?
What is the shell script to delete files older than 2 days? (3 Replies)
Discussion started by: satishpabba
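Same pattern as above, with a shorter age. GNU and BSD find also offer -delete, which avoids spawning rm (a sketch; the path is a placeholder, and -delete is not in POSIX):

```shell
# GNU/BSD find: delete regular files older than 2 days.
find /path/to/dir -type f -mtime +2 -delete

# Portable equivalent for finds without -delete:
# find /path/to/dir -type f -mtime +2 -exec rm -f {} +
```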
3. Shell Programming and Scripting
Hi.
I need a script (either bash or perl) that can delete previous versions of files.
For instance, from our continuous build process I get directories such as
build5_dev_1.21
build5_dev_1.22
build5_dev_1.23
build5_dev_1.24
I need a script that I can run every night (using "at"... (6 Replies)
Discussion started by: jbsimon000
4. Solaris
Hi all,
I want to delete log files with extension .log which are older than 30
days. How to delete those files?
Operating system -- Sun solaris 10
Your input is highly appreciated.
Thanks in advance.
Regards,
Williams (2 Replies)
Discussion started by: William1482
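On Solaris 10, /usr/bin/find has no -delete, so -exec rm is the safe form. A sketch, with the log directory as a placeholder:

```shell
# Solaris-safe: delete *.log files older than 30 days.
find /path/to/logs -type f -name '*.log' -mtime +30 -exec rm -f {} \;
```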
5. Shell Programming and Scripting
Hi All ,
I want to delete files from the /tmp directory created by the "xxxx" id,
because the list I got says more than 60,000 files have been created by the "xxxx" id since 2002.
The /tmp directory has a lot of files created by different user ids like root, system, etc.
But I need a script to... (2 Replies)
Discussion started by: vparunkumar
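find can select by owner with -user, which leaves other users' files alone. A sketch ("xxxx" is the placeholder user from the question; -xdev keeps find on the /tmp filesystem):

```shell
# Delete regular files in /tmp owned by one user only.
owner=xxxx
find /tmp -xdev -type f -user "$owner" -exec rm -f {} +
```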
6. Shell Programming and Scripting
Is it -mtime +1? I need all files older than today to be deleted. (6 Replies)
Discussion started by: dinjo_jo
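Not quite: find counts whole 24-hour periods, so -mtime +1 matches files modified more than 48 hours ago. For "more than 24 hours old" the answer is -mtime +0; matching by calendar day needs GNU find's -newermt (an assumption that GNU find is available). A sketch with placeholder paths:

```shell
# More than 24 hours old:
find /path/to/dir -type f -mtime +0 -exec rm -f {} +

# GNU find: anything last modified before today's midnight:
find /path/to/dir -type f ! -newermt 'today 00:00' -exec rm -f {} +
```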
7. Shell Programming and Scripting
Hi All,
I am using the code below to delete files older than 2 days. If there are no files, I should log an error saying there are no files to delete.
Please let me know how I can achieve this.
find /path/*.xml -mtime +2
Thanks and Regards
Nagaraja. (3 Replies)
Discussion started by: Nagaraja Akkiva
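One way (a sketch): count the matches first, then either log or delete. Note that in `find /path/*.xml` the shell expands the glob before find runs, which fails if nothing matches; `-name '*.xml'` avoids that:

```shell
dir=/path/to/dir   # placeholder
count=$(find "$dir" -type f -name '*.xml' -mtime +2 | wc -l)
if [ "$count" -eq 0 ]; then
    echo "No files to delete" >&2
else
    find "$dir" -type f -name '*.xml' -mtime +2 -exec rm -f {} +
fi
```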
8. Shell Programming and Scripting
Hi, I need a script to delete files older than 2 days...
if my input is say in a folder versions
A_14122012.txt
A_15122012.txt
A_16122012.txt
A_17122012.txt
i want my output to be
A_16122012.txt
A_17122012.txt
Thanks in advance,
Hemanth Saikumar. (2 Replies)
Discussion started by: hemanthsaikumar
9. Shell Programming and Scripting
Hi ,
I am a newbie!!!
I want to develop a script for deleting files older than x days from multiple paths. So far I have this piece of code, which deletes files older than x days from a particular path. How do I enhance it to take its input from a .txt file or a .dat file? For eg:... (12 Replies)
Discussion started by: jhilmil
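A sketch of the multi-path version: read directories from a text file, one per line, and run the same find in each (paths.txt and the day count are placeholders):

```shell
days=5
while IFS= read -r dir; do
    # Skip lines that are not existing directories.
    [ -d "$dir" ] || { echo "skipping: $dir" >&2; continue; }
    find "$dir" -type f -mtime +"$days" -exec rm -f {} +
done < paths.txt
```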
10. UNIX for Dummies Questions & Answers
Hi,
I need a command for deleting all the compressed files (*.Z) that are older than the current date minus 5 days. Basically I have a directory where backup files arrive daily, and I want to automatically remove the ones 5 days (or more) older than the current date. How can I write a 'rm' command... (1 Reply)
Discussion started by: Francy
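Rather than a raw rm, the age test belongs in find. A sketch with a placeholder backup directory (-print lists each file as it goes, handy in a cron log):

```shell
# Delete compressed backups (*.Z) older than 5 days, printing each name.
find /path/to/backups -type f -name '*.Z' -mtime +5 -print -exec rm -f {} +
```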
LEARN ABOUT DEBIAN
fdupes
FDUPES(1) General Commands Manual FDUPES(1)
NAME
fdupes - finds duplicate files in a given set of directories
SYNOPSIS
fdupes [ options ] DIRECTORY ...
DESCRIPTION
Searches the given path for duplicate files. Such files are found by comparing file sizes and MD5 signatures, followed by a byte-by-byte
comparison.
OPTIONS
-r --recurse
for every directory given follow subdirectories encountered within
-R --recurse:
for each directory given after this option follow subdirectories encountered within (note the ':' at the end of option; see the
Examples section below for further explanation)
-s --symlinks
follow symlinked directories
-H --hardlinks
normally, when two or more files point to the same disk area they are treated as non-duplicates; this option will change this behavior
-n --noempty
exclude zero-length files from consideration
-f --omitfirst
omit the first file in each set of matches
-A --nohidden
exclude hidden files from consideration
-1 --sameline
list each set of matches on a single line
-S --size
show size of duplicate files
-m --summarize
summarize duplicate files information
-q --quiet
hide progress indicator
-d --delete
prompt user for files to preserve, deleting all others (see CAVEATS below)
-N --noprompt
when used together with --delete, preserve the first file in each set of duplicates and delete the others without prompting the user
-v --version
display fdupes version
-h --help
displays help
SEE ALSO
md5sum(1)
NOTES
Unless -1 or --sameline is specified, duplicate files are listed together in groups, each file displayed on a separate line. The groups are
then separated from each other by blank lines.
When -1 or --sameline is specified, spaces and backslash characters (\) appearing in a filename are preceded by a backslash character.
EXAMPLES
fdupes a --recurse: b
will follow subdirectories under b, but not those under a.
fdupes a --recurse b
will follow subdirectories under both a and b.
CAVEATS
If fdupes returns with an error message such as fdupes: error invoking md5sum it means the program has been compiled to use an external
program to calculate MD5 signatures (otherwise, fdupes uses internal routines for this purpose), and an error has occurred while attempting
to execute it. If this is the case, the specified program should be properly installed prior to running fdupes.
When using -d or --delete, care should be taken to insure against accidental data loss.
When used together with options -s or --symlink, a user could accidentally preserve a symlink while deleting the file it points to.
Furthermore, when specifying a particular directory more than once, all files within that directory will be listed as their own duplicates,
leading to data loss should a user preserve a file without its "duplicate" (the file itself!).
AUTHOR
Adrian Lopez <adrian2@caribe.net>