I assume that bipinajith's proposed solution will still fail because /app/folder1/* is expanding to a list of arguments too long to process. The following should avoid that problem and be MUCH more efficient:
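A minimal sketch of the approach I mean, using a throwaway demo directory in place of /app/folder1 (the file names, contents, and the '.sh' search string here are invented for illustration):

```shell
# Demo stand-in for /app/folder1 with a couple of .dat files
mkdir -p demo/folder1
printf 'run foo.sh now\n' > demo/folder1/a.dat
printf 'nothing here\n'   > demo/folder1/b.dat

# find prints the names; xargs runs grep in batches small enough to fit
# the argument-list limit, so the shell never expands a huge glob.
# /dev/null guarantees filenames are printed even if a batch holds one file.
find demo/folder1 -name '*.dat' -print | xargs grep -F '.sh' /dev/null
```

If any of your file names can contain spaces or newlines, `find ... -exec grep -F '.sh' /dev/null {} +` is the safer spelling of the same idea.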
Note, however, that if your .dat files are binary files (rather than text files) grep might not work. (The standards only specify the behavior of grep when its input files are text files.)
Also note that if grep is called with only one file operand, the name of the file in which the line is found will not be printed; only the contents of the matching line. If your .dat files are text files and you want the name of the file to be printed as well as the matched lines even if there is only one file operand, add /dev/null as another file operand. If you only want the names of matching files, and don't need to see the matched lines, use the -l (letter ell; not digit 1) option.
And note that the pattern specified by '.sh' is a basic regular expression that will match the two characters sh as long as they are not the first two characters on a line (the period matches any character). If you want to match the three characters .sh, you need to add the -F option, use the obsolescent fgrep utility instead of grep, or escape the period in the BRE ('\.sh').
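To make that concrete, a quick illustration (the sample word "fish" is made up; it contains sh preceded by another character but no literal .sh):

```shell
printf 'fish\n' > sample.txt
grep '.sh' sample.txt                                  # BRE: . matches the i in fish
grep -F '.sh' sample.txt || echo 'no literal .sh'      # fixed string: no match
grep '\.sh' sample.txt  || echo 'no escaped match'     # escaped BRE: no match
```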
Hello,
I'm trying to search through 30,000 files in 1 directory, and am getting the "arg list too long" error. I've searched this forum and have been playing around with xargs and can't get that to work either. I'm using ksh on Solaris.
Here's my original code:
nawk "/Nov 21/{_=2}_&&_--"... (14 Replies)
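For what it's worth, that nawk idiom prints each matching line plus the one after it, and feeding it through find + xargs sidesteps the arg-list limit. A runnable sketch on an invented log file (plain awk here; substitute nawk on Solaris):

```shell
printf 'Nov 20 ok\nNov 21 event\ndetail line\nNov 22 ok\n' > demo.log

# /Nov 21/{_=2} arms a two-line counter at each match;
# _&&_-- stays true (and prints) until the counter hits zero.
find . -name demo.log -print | xargs awk '/Nov 21/{_=2}_&&_--'
```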
Hi,
Help. I have a file that contains a list of users. I want to cat the contents of the file and feed them into sed to build a preformatted report. The error I got is "ksh: /usr/bin/sed: arg list too long". My method is below.
A=`cat FILE1.txt`
B=`echo $A`
sed "s#USERLIST#$B#" FILE2 >... (2 Replies)
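One way around this is to never build the shell variable at all and let sed read the file itself with its r command. A sketch with invented FILE1.txt/FILE2 contents, assuming USERLIST sits on its own line in the template (the original s#USERLIST#...# did an inline substitution, so this changes the layout slightly):

```shell
printf 'alice bob carol\n' > FILE1.txt
printf 'Users:\nUSERLIST\nEnd of report\n' > FILE2

# r queues the contents of FILE1.txt after the matching line;
# d then deletes the USERLIST placeholder line itself.
# No command-line argument ever holds the user list.
sed -e '/USERLIST/r FILE1.txt' -e '/USERLIST/d' FILE2 > report.txt
cat report.txt
```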
Hey guys. I have written a program in which I am trying to get files from one remote machine and transfer them to another remote machine using SCP.
It works fine for 50 or 60 files, but when the number of files grows to 250 I get an error message stating "Arg list too long".
#scp -p... (5 Replies)
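A common workaround is to hand scp a single archive instead of hundreds of file arguments. An untested sketch with invented file and host names (the scp line is commented out since the remote host is hypothetical):

```shell
mkdir -p outgoing
: > outgoing/file1.dat
: > outgoing/file2.dat

# Bundle everything into one archive -- scp then gets exactly one argument.
tar -cf batch.tar -C outgoing .

# scp -p batch.tar user@remotehost:/some/dest/   # hypothetical host and path
tar -tf batch.tar
```

On the receiving end, `tar -xf batch.tar -C /some/dest` unpacks the batch.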
Hi,
I am trying to perform this task:
tar -cvf tar.newfile ??????.bas
I got the error "arg list too long". Is there any way around it? I have about 1500 files that need to be tarred together.
Thanks in advance (5 Replies)
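One way around this is to let find build a name list and have tar read it from a file, so the shell never expands the ??????.bas glob. A sketch with invented file names (GNU tar shown; Solaris tar spells the option -I filelist instead of -T):

```shell
mkdir -p src
: > src/aaa111.bas
: > src/bbb222.bas

# The pattern is quoted, so find (not the shell) matches the names.
find src -name '*.bas' -print > filelist

# GNU tar reads member names from a file with -T (Solaris tar: -I filelist)
tar -cf tar.newfile -T filelist
tar -tf tar.newfile
```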
hello all
I need some help because I am a unix/linux dummy... I have the following:
DIR1> has 121437 files in it with varying dates going back to early April,
a sub dir DIR1/DIR2> has 55835 files in it
I need to move all files (T*.*) out of DIR1 into DIR2 that are older than today.
I've been... (2 Replies)
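One approach: touch a reference file stamped at today's midnight, then let find select and move anything not newer than it, one file per mv so no argument list can overflow. A sketch with invented file names (the 2020 timestamp just manufactures an "old" file for the demo):

```shell
mkdir -p DIR1/DIR2
: > DIR1/Tnew.dat                        # modified now: should stay put
touch -t 202001010000 DIR1/Told.dat      # old file: should move

# Reference file stamped at 00:00 today
touch -t "$(date +%Y%m%d)0000" stamp

# -prune keeps find out of DIR2 so it never rescans files just moved there;
# ! -newer stamp selects files last modified before today
find DIR1 -name DIR2 -prune -o -name 'T*.*' ! -newer stamp \
    -exec mv {} DIR1/DIR2/ \;
ls DIR1/DIR2
```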
echo dirname/filename* | xargs ls -t
used as a substitute, doesn't give the desired results when I exceed the buffer size. I still want the files listed in chronological order; unfortunately, xargs hands the names to ls in batches, so each batch is sorted separately... does anyone have any ideas? :( (4 Replies)
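If GNU find is available, one fix is to print each file's modification time next to its name and sort the whole set in one pass, so the ordering is global rather than per-batch. A sketch with invented log names (%T@ is GNU-specific and won't exist on older proprietary finds):

```shell
mkdir -p logs
touch -t 202001010000 logs/old.log   # manufactured old file for the demo
touch logs/new.log                   # modified now

# GNU find prints epoch-mtime + name; sort orders everything at once,
# then cut strips the timestamp, leaving names newest-first.
find logs -name '*.log' -printf '%T@ %p\n' | sort -rn | cut -d' ' -f2-
```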
Hi all
I have more than 1000 files in a folder and whenever I use a "compress" or "zcat" command it gives the error
/bin/zcat: Arg list too long.
any solution for this :o (3 Replies)
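One fix is to let find invoke the compressor itself, in batches sized to the argument limit. A sketch with invented file names (gzip shown because it's more widely installed; substitute compress on systems that still produce .Z files):

```shell
mkdir -p pack
printf 'x\n' > pack/a.dat
printf 'y\n' > pack/b.dat

# find hands the names to the compressor in arg-limit-sized batches,
# so the shell never expands one enormous glob.
find pack -name '*.dat' -exec gzip {} +
ls pack
```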
hi everyone,
We have a heck of a lot of files in a particular directory and I need to search through all of them to find a list of all files containing particular text strings... one being a date and the other being the name of the report that is printed in the files.
I've tried the... (6 Replies)
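One way to find files containing both strings without tripping the arg-list limit is to chain two grep -l passes through xargs: the first pass narrows to files holding the date, the second keeps only those that also hold the report name. A sketch with invented date and report-name strings:

```shell
mkdir -p reports
printf 'Nov 21 2013\nDaily Sales Report\n' > reports/r1.txt
printf 'Nov 22 2013\nInventory Report\n'   > reports/r2.txt

# -l prints matching file names only; the second xargs/grep pass
# filters the survivors, giving an AND of the two strings.
find reports -type f -name '*.txt' -print \
    | xargs grep -l 'Nov 21' \
    | xargs grep -l 'Sales Report'
```

For either-string-matches instead, a single pass with two -e patterns does it: `xargs grep -l -e 'Nov 21' -e 'Sales Report'`.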
When I do ls -l ABC*, I get an "arg list too long" message. This does not happen if ABC* matches a small number of files; I believe about 4000 files is the limit. Is there any way of avoiding this?
I even tried like this
for i in `ls -l ABC*`
do
echo $i
done
Same problem.
Any solution would be great.
I am on HP-UX... (5 Replies)
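The usual fix is to quote the pattern and let find match it, so the shell never expands the glob at all; find then batches the names into as many ls invocations as the argument limit allows. A sketch with invented file names:

```shell
mkdir -p bulk
: > bulk/ABC001
: > bulk/ABC002

# The quoted pattern is matched by find, not the shell,
# so no "arg list too long" regardless of how many files match.
find bulk -name 'ABC*' -exec ls -l {} +
```

Note the `for i in \`ls -l ABC*\`` loop fails for the same reason: the shell must expand ABC* before ls ever runs.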