08-12-2008
rm -i `ls -lrt|awk '$5==0{print $9}'`
this will also do
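Note that parsing `ls` output breaks on filenames containing spaces or newlines. Where find supports -empty and -delete (GNU findutils and modern BSDs), a safer equivalent is:

```shell
# Build a sandbox with one empty and one non-empty file (illustration only)
tmp=$(mktemp -d)
touch "$tmp/empty file.log"          # zero bytes, name contains a space
printf 'data' > "$tmp/keep.log"      # non-empty

# Delete zero-length regular files in the directory itself (no recursion)
find "$tmp" -maxdepth 1 -type f -empty -delete
```

Unlike the ls|awk pipeline, this handles any filename and never passes an empty argument list to rm.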
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
I want to search for a file pattern in more than one directory. How do I do that?
Here is the scenario:
I have a directory structure like the following:
/log
...../20051001
..........ftp_server_20051001.log
..........ftp_down_server.log
..........ftp_up_server.log... (7 Replies)
Discussion started by: ravikirankethe
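One way to search a name pattern across all the dated subdirectories at once (a temporary stand-in for the /log layout above is built for illustration):

```shell
tmp=$(mktemp -d)                      # stand-in for /log
mkdir -p "$tmp/20051001"
touch "$tmp/20051001/ftp_server_20051001.log" \
      "$tmp/20051001/ftp_down_server.log" \
      "$tmp/20051001/other.txt"

# One find from the top level matches the pattern in every subdirectory
matches=$(find "$tmp" -type f -name 'ftp_*.log' | wc -l)
```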
2. UNIX for Dummies Questions & Answers
Hello friends,
I am compiling a set of SQL scripts in a set of subdirectories, demoed below. After compiling, log files are created.
Every time after compiling I have to go subdir by subdir to delete the log files. I am sure there should be a simple way to look for all log... (4 Replies)
Discussion started by: adurga
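Assuming the logs share a .log suffix, a single find can remove them from every subdirectory in one pass (sandbox paths here are illustrative):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/sub1" "$tmp/sub2"
touch "$tmp/sub1/a.log" "$tmp/sub2/b.log"
printf 'select 1;' > "$tmp/sub2/keep.sql"

# Recurse from the top and remove every .log file found
find "$tmp" -type f -name '*.log' -exec rm -f {} +
```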
3. Shell Programming and Scripting
Hi,
I have a shell script to find files older than 'X' days ($2) in directory path ($1) and delete them.
Like this:
my_file_remover.sh /usr/home/c 90
Now, I need to modify this script and add it in CRON, so that it checks other directories also.
Like:
my_file_remover.sh /usr/home/c... (3 Replies)
Discussion started by: guruparan18
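One way to cover several directories from a single cron entry is a small wrapper around the same age-based find; the cleanup function and paths below are hypothetical stand-ins for the poster's my_file_remover.sh:

```shell
# Hypothetical wrapper: apply the same age-based cleanup to several roots,
# so one crontab line replaces one entry per directory
cleanup() {
    dir=$1 days=$2
    find "$dir" -type f -mtime +"$days" -exec rm -f {} +
}

tmp=$(mktemp -d)
mkdir -p "$tmp/a" "$tmp/b"
touch -t 200001010000 "$tmp/a/old.txt"   # timestamp far in the past
touch "$tmp/b/new.txt"                   # modified now

for d in "$tmp/a" "$tmp/b"; do
    cleanup "$d" 90
done
```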
4. Shell Programming and Scripting
Can anyone please help me delete all the files over 7 days old from sub-directories A, B, C...
Top-Directory
Sub-Directory-A
File-1
File-2
.....
File-n
Sub-Directory-B
File-1
File-2
.....
File-n
Sub-Directory-C
File-1
... (1 Reply)
Discussion started by: sureshcisco
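Assuming "over 7 days" refers to modification time, a single find started at Top-Directory reaches every subdirectory, so no per-subdirectory commands are needed (temporary stand-in paths used for illustration):

```shell
tmp=$(mktemp -d)                         # stands in for Top-Directory
mkdir -p "$tmp/A" "$tmp/B"
touch -t 200001010000 "$tmp/A/File-1"    # much older than 7 days
touch "$tmp/B/File-1"                    # fresh

# -mtime +7 selects files last modified more than 7 days ago
find "$tmp" -type f -mtime +7 -exec rm -f {} +
```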
5. UNIX for Dummies Questions & Answers
Hello everyone,
Can anybody help me find a way to obtain a list of all the directories and their sizes?
I would like to be able to run this and obtain output like a tree structure, with each branch saying how much space it is taking up.
Hope you can point me in the right direction.... (1 Reply)
Discussion started by: gio001
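du(1) does not draw a literal tree, but its per-directory totals sorted by size answer the same question; a minimal sketch on a throwaway directory:

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/branch1" "$tmp/branch2"
printf '12345' > "$tmp/branch1/f"

# -k reports kilobytes per directory; sort -rn puts the biggest branches first
du -k "$tmp" | sort -rn > "$tmp/report.txt"
```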
6. Shell Programming and Scripting
Hi,
I want to access files from different directories (for example: /home/dir1/file1, /home/dir2/file2 ...). Like this, I have to access these files (file1, file2...). (3 Replies)
Discussion started by: bangarukannan
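If the paths follow a common pattern, the shell's own globbing can visit all of them in one loop; a sketch using a temporary stand-in for /home:

```shell
tmp=$(mktemp -d)                 # stands in for /home
mkdir -p "$tmp/dir1" "$tmp/dir2"
printf 'one' > "$tmp/dir1/file1"
printf 'two' > "$tmp/dir2/file2"

# The shell expands the pattern to every matching path before the loop runs
count=0
for f in "$tmp"/dir*/file*; do
    count=$((count + 1))
done
```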
7. Homework & Coursework Questions
Use and complete the template provided. The entire template must be completed. If you don't, your post may be deleted!
1. The problem statement, all variables and given/known data:
Need to make a script, to remove all empty files and folders from current category.
It also should show the name... (2 Replies)
Discussion started by: Itixop
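Since this is homework, only the general shape is sketched here: GNU/BSD find's -empty matches both zero-length files and empty directories, -print shows each name as it is removed, and -depth processes directories after their contents:

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/emptydir"
touch "$tmp/emptyfile"
printf 'x' > "$tmp/full"

# Remove empty files and directories below $tmp, printing each name;
# -mindepth 1 keeps the starting directory itself out of consideration
removed=$(find "$tmp" -mindepth 1 -depth -empty -print -delete)
```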
8. Shell Programming and Scripting
I have a task I usually do manually, but with growing responsibilities I tend to forget to do it weekly. I want to write a script that automates this, but I can't seem to work it out in my head. I have the shell of it, but need help, and you guys have helped me with EVERY problem I have... (5 Replies)
Discussion started by: gkelly1117
9. Shell Programming and Scripting
OS: SUNOS 5.10 i386
Hello guys I wrote a shell script in bash shell to delete the files less than 30 days old. The following is the script.
=======================================
#!/bin/bash
for dirs in `/clu04/oracle/directory_list.lst`
do
find $dirs -type f -mtime -30 -exec rm {} \;... (3 Replies)
Discussion started by: zacsparrow
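The backticks around /clu04/oracle/directory_list.lst try to execute the list file rather than read it. Assuming the .lst file holds one directory path per line (a temporary stand-in is used below), a while-read loop avoids both that and word-splitting; note -mtime -30 matches files modified within the last 30 days, which is what the post asks for:

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/logs"
touch "$tmp/logs/recent.log"                 # modified now (< 30 days old)
touch -t 200001010000 "$tmp/logs/old.log"    # far older than 30 days
printf '%s\n' "$tmp/logs" > "$tmp/directory_list.lst"

# Read each directory path from the list file, one per line,
# and delete files newer than 30 days under it
while IFS= read -r dirs; do
    find "$dirs" -type f -mtime -30 -exec rm -f {} +
done < "$tmp/directory_list.lst"
```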
10. Shell Programming and Scripting
Hello,
I have a dynamic apache_cache that I need to clean every day (files older than one day) with a Unix command:
find /usr/IBM/HTTPServer/apache_cache/ -name '*' -mtime +1 -print0|xargs -0 rm -r --
but it didn't work.
Could you explain to me why?
So I will put below all my script :... (13 Replies)
Discussion started by: steiner
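One likely problem with the command above: -name matches basenames, so '*' also matches apache_cache itself, and once that directory is older than a day `rm -r` removes the whole cache. A hedged sketch of a safer variant (GNU xargs's -r also skips running rm when nothing matches), shown on a throwaway directory:

```shell
tmp=$(mktemp -d)                          # stands in for apache_cache
mkdir -p "$tmp/deep"
touch -t 200001010000 "$tmp/deep/stale"   # older than one day
touch "$tmp/deep/fresh"

# -mindepth 1 protects the cache directory itself, -type f limits deletion
# to files, and -mtime +1 selects files older than one day
find "$tmp" -mindepth 1 -type f -mtime +1 -print0 | xargs -0 -r rm -f --
```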
LEARN ABOUT DEBIAN
fdupes
FDUPES(1) General Commands Manual FDUPES(1)
NAME
fdupes - finds duplicate files in a given set of directories
SYNOPSIS
fdupes [ options ] DIRECTORY ...
DESCRIPTION
Searches the given path for duplicate files. Such files are found by comparing file sizes and MD5 signatures, followed by a byte-by-byte
comparison.
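The MD5-grouping step described above can be approximated with coreutils alone (md5sum plus GNU uniq's -w/-D extensions); fdupes adds the size pre-filter and the final byte-by-byte confirmation:

```shell
tmp=$(mktemp -d)
printf 'same bytes' > "$tmp/a"
printf 'same bytes' > "$tmp/b"
printf 'different'  > "$tmp/c"

# Identical checksums land on adjacent lines after sort; uniq -w32 -D keeps
# only lines whose first 32 characters (the MD5 digest) are repeated
dupes=$(md5sum "$tmp"/* | sort | uniq -w32 -D)
```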
OPTIONS
-r --recurse
for every directory given follow subdirectories encountered within
-R --recurse:
for each directory given after this option follow subdirectories encountered within (note the ':' at the end of option; see the
Examples section below for further explanation)
-s --symlinks
follow symlinked directories
-H --hardlinks
normally, when two or more files point to the same disk area they are treated as non-duplicates; this option will change this behavior
-n --noempty
exclude zero-length files from consideration
-f --omitfirst
omit the first file in each set of matches
-A --nohidden
exclude hidden files from consideration
-1 --sameline
list each set of matches on a single line
-S --size
show size of duplicate files
-m --summarize
summarize duplicate files information
-q --quiet
hide progress indicator
-d --delete
prompt user for files to preserve, deleting all others (see CAVEATS below)
-N --noprompt
when used together with --delete, preserve the first file in each set of duplicates and delete the others without prompting the user
-v --version
display fdupes version
-h --help
displays help
SEE ALSO
md5sum(1)
NOTES
Unless -1 or --sameline is specified, duplicate files are listed together in groups, each file displayed on a separate line. The groups are
then separated from each other by blank lines.
When -1 or --sameline is specified, spaces and backslash characters (\) appearing in a filename are preceded by a backslash character.
EXAMPLES
fdupes a --recurse: b
will follow subdirectories under b, but not those under a.
fdupes a --recurse b
will follow subdirectories under both a and b.
CAVEATS
If fdupes returns with an error message such as fdupes: error invoking md5sum it means the program has been compiled to use an external
program to calculate MD5 signatures (otherwise, fdupes uses internal routines for this purpose), and an error has occurred while attempting
to execute it. If this is the case, the specified program should be properly installed prior to running fdupes.
When using -d or --delete, care should be taken to insure against accidental data loss.
When used together with options -s or --symlinks, a user could accidentally preserve a symlink while deleting the file it points to.
Furthermore, when specifying a particular directory more than once, all files within that directory will be listed as their own duplicates,
leading to data loss should a user preserve a file without its "duplicate" (the file itself!).
AUTHOR
Adrian Lopez <adrian2@caribe.net>