04-20-2011
xargs -d '\n' rm -f < hugelistofiles ought to tell it to split on newlines alone without resorting to editing the input file.
Huge numbers of arguments shouldn't be a problem for xargs; handling them is exactly what it was made for. It knows the system's maximum argument-list size (ARG_MAX) and will split the input across multiple rm -f calls accordingly.
---------- Post updated at 04:56 PM ---------- Previous update was at 04:53 PM ----------
Your while loop would work, but it wouldn't be nearly as efficient as xargs, since it forks one rm process per file instead of passing thousands of filenames to each call.
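A minimal sketch of the approach on a scratch directory (assumes GNU xargs, since -d is a GNU extension; the filenames and the hugelistofiles name are just stand-ins):

```shell
# Create a scratch directory with a few files and a list of their names.
tmp=$(mktemp -d)
touch "$tmp/a.dat" "$tmp/b.dat" "$tmp/file with spaces.dat"
printf '%s\n' "$tmp"/*.dat > "$tmp/hugelistofiles"

# -d '\n' splits on newlines only, so names containing spaces survive
# intact; xargs batches as many names per rm call as ARG_MAX allows.
xargs -d '\n' rm -f < "$tmp/hugelistofiles"

ls "$tmp"    # only hugelistofiles itself remains
```

If the list were instead NUL-delimited (e.g. produced by find -print0), xargs -0 would do the same job and also cope with newlines embedded in filenames.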
UUQ(1C)
NAME
uuq - examine or manipulate the uucp queue
SYNOPSIS
uuq [ -l ] [ -h ] [ -ssystem ] [ -uuser ] [ -djobno ] [ -rsdir ] [ -bbaud ]
DESCRIPTION
Uuq is used to examine (and possibly delete) entries in the uucp queue.
When listing jobs, uuq uses a format reminiscent of ls. For the long format, information for each job listed includes job number, number
of files to transfer, user who spooled the job, number of bytes to send, type of command requested (S for sending files, R for receiving
files, X for remote uucp), and file or command desired.
Several options are available:
-h Print only the summary lines for each system. Summary lines give system name, number of jobs for the system, and total number of
bytes to send.
-l Specifies a long format listing. The default is to list only the job numbers sorted across the page.
-ssystem Limit output to jobs for systems whose system names begin with system.
-uuser Limit output to jobs for users whose login names begin with user.
-djobno Delete job number jobno (as obtained from a previous uuq command) from the uucp queue. Only the UUCP Administrator is permitted
to delete jobs.
-rsdir Look for files in the spooling directory sdir instead of the default directory.
-bbaud Use baud to compute the transfer time instead of the default 1200 baud.
FILES
/usr/spool/uucp/                 Default spool directory
/usr/spool/uucp/C./C.*           Control files
/usr/spool/uucp/Dhostname./D.*   Outgoing data files
/usr/spool/uucp/X./X.*           Outgoing execution files
SEE ALSO
uucp(1C), uux(1C), uulog(1C), uusnap(8C)
BUGS
No information is available on work requested by the remote machine.
The user who requests a remote uucp command is unknown.
Uuq -l can be horrendously slow.
AUTHOR
Lou Salkind, New York University
4.3 Berkeley Distribution April 24, 1986 UUQ(1C)