grep limitation


 
# 1  
Old 02-10-2009
grep limitation

Hello,

I am looking for a way to get around an issue, as I am using the grep command in a very common situation:

Code:
grep ^50 File.*.txt | "some awk process"

My problem is that bash throws me an error on the grep command if the directory in question contains several thousands files.

Code:
bash: /usr/bin/grep: Arg list too long

Is there a way around that kind of limitation?
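For context: the shell expands File.*.txt before grep ever runs, and the kernel rejects the exec once the expanded argument list is too long. A quick way to check that limit (a sketch; getconf is POSIX, the exact value varies per system):

```shell
# The glob is expanded by the shell, not by grep; execve() fails with
# "Arg list too long" (E2BIG) once the combined argument and environment
# size exceeds the kernel's ARG_MAX limit.
getconf ARG_MAX    # prints the limit in bytes
```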

Thanks
# 2  
Old 02-10-2009
Code:
$ ls -1 File.*.txt | xargs grep ^50 | "some awk process"
or
$ find . -type f -name 'File.*.txt' -exec grep ^50 '{}' ';' | "some awk process"
or
$ awk '/^50/ "some awk process"'

Third one is probably the fastest.
# 3  
Old 02-10-2009
Thanks for the reply.

However, only #2 seems to work; #1 and #3 still throw "bash: /usr/bin/grep: Arg list too long".

#2 is also taking a lot of time.
# 4  
Old 02-10-2009
Rewrite #1 as
Code:
$ ls -1 File.*.txt | xargs -n 10 grep ^50 | awk...

which passes 10 files at a time to grep; adjust that as needed.
#3 can't throw "bash: /usr/bin/grep: Arg list too long", since grep is never invoked.
#2 is slow because find starts a new grep process for every file found.
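A minimal sketch of a middle ground between #1 and #2, assuming a find/xargs pair that supports -print0/-0 (GNU and modern BSD do): let find generate the names, so the shell never expands the glob, and let xargs batch them into as few grep invocations as the argument-list limit allows.

```shell
# find prints matching names itself, NUL-terminated so unusual
# filenames survive; xargs -0 packs as many names per grep run
# as the argument-list limit permits, instead of one process per file.
find . -maxdepth 1 -type f -name 'File.*.txt' -print0 \
  | xargs -0 grep '^50'
```

-maxdepth 1 mimics the original glob (current directory only) and is itself a GNU/BSD extension; drop it if you want to recurse.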
# 5  
Old 02-10-2009
OK, this is what I'm getting:

Code:
cscyabl@star:(prodbass)> ls -1 IP* | xargs -n 10 grep ^50 |awk 'BEGIN {FS="|"} {if ($2 != "") users++} END {print users}'  
bash: /usr/bin/ls: Arg list too long

cscyabl@star:(prodbass)> awk '/^50/ BEGIN {FS="|"} {if ($2 != "") users++} END {print users}' IP*
bash: /usr/bin/awk: Arg list too long

# 6  
Old 02-10-2009
Quote:
Originally Posted by Indalecio
OK, this is what I'm getting:

Code:
cscyabl@star:(prodbass)> ls -1 IP* | xargs -n 10 grep ^50 |awk 'BEGIN {FS="|"} {if ($2 != "") users++} END {print users}'  
bash: /usr/bin/ls: Arg list too long

cscyabl@star:(prodbass)> awk '/^50/ BEGIN {FS="|"} {if ($2 != "") users++} END {print users}' IP*
bash: /usr/bin/awk: Arg list too long

In both cases the expanded 'IP*' exceeds the maximum argument-list length the kernel allows, so any command the glob is handed to (ls, awk, grep) fails the same way.
Use the 'find' approach; you can also combine the 'grep' and the 'awk' into one 'awk' and execute it as part of either '-exec' or 'xargs'.
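As a sketch of that combination, using the same counting awk as in post #5 (assuming -print0/-0 and -maxdepth are available, which are GNU/BSD extensions): feed one concatenated stream into a single awk so the END block runs exactly once.

```shell
# find lists the files without any shell glob expansion; xargs may run
# cat several times, but the output is one stream, so a single awk both
# filters the ^50 records and counts non-empty second fields.
find . -maxdepth 1 -type f -name 'IP*' -print0 \
  | xargs -0 cat \
  | awk 'BEGIN { FS = "|" } /^50/ && $2 != "" { users++ } END { print users + 0 }'
```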
# 7  
Old 02-10-2009
Which is basically what pludi's solution #2 is about.
Works like a charm; performance is a bit poor, but I still buy it.

Thanks for all the help guys
 