Top Forums > UNIX for Dummies Questions & Answers > Pls. help with script to remove million files
Post 302862999 by methyl, Friday 11th of October 2013, 06:32:33 PM
The belt-and-braces method is to not let the shell see the list of filenames at all.

This assumes that you want to remove qfr* (not dfr*) and that the directory does not contain subdirectories with files which you do not want to delete. If your "find" does not have "-print", omit that parameter.

Code:
find . -type f -name 'qfr*' -mtime +5 -print | while IFS= read -r FILENAME
do
       # Remove the echo once the listed commands look right
       echo rm "${FILENAME}"
done

Try it with the "echo" in place first to check that the list of commands is what you expect.
The "-mtime +5" parameter is there to avoid deleting anything modified in the last 5 days.
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Script to mv files and then remove

Hi guys, I'm looking for some help with creating a script. Background: I run a lot of testing which creates huge files. This fills up the disk space over 3 weeks, so it needs constant deleting, but I generally keep forgetting, so every 3 weeks it fails. Basically I can't format the drive... (2 Replies)
Discussion started by: defamer
2 Replies
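
A rough sketch of the kind of housekeeping job asked about in discussion 1 above; the directory, the 21-day age limit and the script idea are placeholders, not details from that thread.

Code:
#!/bin/sh
# Hypothetical cleanup: remove test output older than 21 days under /data/tests.
# Check the candidate list first with: find /data/tests -type f -mtime +21 -print
find /data/tests -type f -mtime +21 -exec rm -f {} +

Run from cron (for example, 0 2 * * 0 /usr/local/bin/cleanup.sh) it would fire weekly without anyone having to remember it.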

2. Solaris

Need to know command to delete more than 3 million files from /var/spool/clientmqueue

Hi, I need to delete more than 3 million files from /var/spool/clientmqueue. When I give the following command to delete the files, I get this error:
# pwd
/var/spool/clientmqueue
# rm -f *
/usr/bin/rm: arg list too long
Please tell me how I can delete the files. (5 Replies)
Discussion started by: sb200
5 Replies
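
For the "arg list too long" error in discussion 2, the usual workaround is to stop the shell from expanding * and let find pass the names to rm in batches. A sketch, assuming a POSIX find that supports -exec ... {} +:

Code:
# Delete every regular file under /var/spool/clientmqueue without a shell glob.
cd /var/spool/clientmqueue || exit 1
find . -type f -exec rm -f {} +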

3. Shell Programming and Scripting

Matching 10 Million file records with 10 Million in other file

Dear all, I have two files, each containing 10 million records separated by commas (CSV format). One file is input.txt, the other is status.txt. input.txt contains fields with one unique id field (a primary key, we could say); status.txt contains two fields only: 1. unique id and 2. status ... (8 Replies)
Discussion started by: vguleria
8 Replies
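
For a key lookup like the one in discussion 3, awk can hold one file's keys in memory and stream the other file past it. A sketch only; the field positions, the comma delimiter and the output file name are assumptions about the files described there.

Code:
# Load unique id -> status from status.txt, then append the status to every
# input.txt record whose first field has a match.
awk -F, 'NR==FNR { status[$1] = $2; next }
         ($1 in status) { print $0 "," status[$1] }' status.txt input.txt > matched.csv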

4. Shell Programming and Scripting

Fast processing(mv command) of 1 million+ files using find, mv and xargs

Hi, I'd like to ask if anybody can help improve my code to move 1 million+ files from one directory to another:
find /source/dir -name file* -type f | xargs -I '{}' mv {} /destination/dir
I learned this line of code from this forum as well and it works fine. However, file movement is kinda... (6 Replies)
Discussion started by: agentgrecko
6 Replies
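
The command quoted in discussion 4 starts one mv process per file, which is where most of the time goes with a million files. A sketch of a batched variant; it assumes GNU find and coreutils (-print0, xargs -0 and mv -t are GNU extensions) and reuses the paths from the question.

Code:
# Quote the pattern so the shell does not expand it, and move files in large
# batches: mv -t takes the target directory first, so xargs can append many names.
find /source/dir -name 'file*' -type f -print0 |
    xargs -0 mv -t /destination/dir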

5. UNIX for Dummies Questions & Answers

Deleting a million of files ..

Hi, which way is faster: rm -rf /path/ or find / -name -exec rm {} \; , and why? (7 Replies)
Discussion started by: cain82
7 Replies
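
On the question in discussion 5: rm -rf walks the tree and unlinks as it goes, while -exec rm {} \; forks a separate rm for every single file, which is why it is much slower on large trees. When find is still needed for its filtering, batching closes most of the gap. A sketch, with /path and the name pattern as placeholders:

Code:
# One rm invocation for many files instead of one per file.
find /path -type f -name 'somepattern*' -exec rm -f {} +

# GNU and BSD find can unlink matches themselves, skipping rm entirely.
find /path -type f -name 'somepattern*' -delete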

6. UNIX for Dummies Questions & Answers

Pls. help remove the static route

Hi, I am on Linux Redhat 5.3. I added this static route but now I can't seem to take it out. Can you help?
netstat -rn
Kernel IP routing table
Destination      Gateway         Genmask          Flags  MSS Window  irtt Iface
167.76.151.28    192.1.25.249    255.255.255.255  UGH      0 0... (1 Reply)
Discussion started by: samnyc
1 Reply
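
For the leftover host route in discussion 6, either the classic route command or iproute2 can remove it. The addresses are the ones shown in the netstat output above; treat this as a sketch of the usual commands rather than a tested recipe for that machine.

Code:
# net-tools syntax, present on RHEL 5:
route del -host 167.76.151.28 gw 192.1.25.249

# iproute2 equivalent:
ip route del 167.76.151.28/32 via 192.1.25.249

If the route was also made persistent (for example in a route-ethX file under /etc/sysconfig/network-scripts), it needs to be removed there as well or it will return after a network restart.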

7. AIX

Script to remove backup files

Hi, I want to remove my backup files, keeping the last 30 days. At the moment I am doing it manually. Does anyone have a script to automate this process? Thanks in advance. (5 Replies)
Discussion started by: ElizabethPJ
5 Replies
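
A sketch for the 30-day cleanup asked about in discussion 7; the /backup directory and the *.bak suffix are guesses, since the preview does not show them, and -exec ... \; is used because it works with any find, including AIX's.

Code:
# Remove backup files not modified in the last 30 days.
# Run with -print instead of -exec first to review the list.
find /backup -type f -name '*.bak' -mtime +30 -exec rm -f {} \;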

8. Shell Programming and Scripting

How to remove newline, tab, spaces in curly braces.. :( Pls Help?

Hi everyone, in the "xyz (Exception e)" part below, after the opening curly brace there is a newline and then a few tabs before the closing curly brace:
xyz (Exception e)
{
}
Note: there can be one or more newlines between the curly braces. My desired output should be ... (6 Replies)
Discussion started by: NY_777
6 Replies
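
For discussion 8, where newlines and tabs sit between an opening and a closing curly brace, a slurp-mode perl substitution handles the multi-line case that plain sed struggles with. The desired output is cut off in the preview, so this sketch assumes empty brace pairs should collapse to "{ }"; the file name is a placeholder.

Code:
# -0777 reads the whole file at once so the match can span newlines;
# \s* swallows any mix of newlines, tabs and spaces between the braces.
perl -0777 -pe 's/\{\s*\}/{ }/g' Sample.java > Sample.java.fixed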

9. UNIX for Advanced & Expert Users

Zip million files taking 12 hours or more

Hi, I have a task to zip files based on modified time, but there are millions of them and it is taking a lot of time, more than 12 hours, and also eating up a lot of CPU. Is there any other / better way to handle it quickly with less CPU consumption?
find . ! -name \"*.gz\" -mtime +7 -type f | grep -v '/.*/' | ... (2 Replies)
Discussion started by: reldb
2 Replies
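
The command quoted in discussion 9 is cut off, but the visible part selects non-.gz files older than 7 days in the top-level directory only (the grep -v '/.*/' drops anything in a subdirectory). A sketch of one way to speed it up; -maxdepth, -print0 and xargs -P are GNU extensions, and the batch size and process count are arbitrary.

Code:
# Compress top-level files older than 7 days that are not already gzipped,
# 200 files per gzip call and 4 gzip processes at a time.
find . -maxdepth 1 -type f -mtime +7 ! -name '*.gz' -print0 |
    xargs -0 -n 200 -P 4 gzip

Running gzip in parallel shortens the wall-clock time but not the total CPU; if CPU is the real constraint, gzip -1 (or nice gzip) trades some compression ratio for a lighter load.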
FIND(1)                          General Commands Manual                          FIND(1)

NAME
     find - find files meeting a given condition

SYNOPSIS
     find directory expression

EXAMPLES
     find / -name a.out -print                          # Print all a.out paths
     find /usr/ast ! -newer f -ok rm {} ;               # Ask before removing
     find /usr -size +20 -exec mv {} /big ;             # Move files > 20 blks
     find / -name a.out -o -name '*.o' -exec rm {} ;    # 2 conds

DESCRIPTION
     Find descends the file tree starting at the given directory, checking each file in
     that directory and its subdirectories against a predicate. If the predicate is true,
     an action is taken. The predicates may be connected by -a (Boolean and), -o (Boolean
     or) and ! (Boolean negation). Each predicate is true under the conditions specified
     below. The integer n may also be +n to mean any value greater than n, -n to mean any
     value less than n, or just n for exactly n.

     -name s    true if the current filename is s (shell wild cards included)
     -size n    true if the file size is n blocks
     -inum n    true if the current file's i-node number is n
     -mtime n   true if the modification time relative to today (in days) is n
     -links n   true if the number of links to the file is n
     -newer f   true if the file is newer than f
     -perm n    true if the file's permission bits = n (n is in octal)
     -user u    true if the uid = u (a numerical value, not a login name)
     -group g   true if the gid = g (a numerical value, not a group name)
     -type x    where x is bcdfug (block, char, dir, regular file, setuid, setgid)
     -xdev      do not cross devices to search mounted file systems

     Following the expression can be one of the following, telling what to do when a file
     is found:

     -print     print the file name on standard output
     -exec      execute a MINIX command, {} stands for the file name
     -ok        prompt before executing the command

SEE ALSO
     test(1), xargs(1).