Command very Slow


 
# 15  
Old 05-19-2015
If this still gives "too many arguments", then combine it with find:
Code:
find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} +

# 16  
Old 05-19-2015
Quote:
Originally Posted by Don Cragun
We are still waiting for mohtashims to tell us if:
Code:
cd /directory/containing/your/files
awk '$0 ~ /1175 876330/
{nextfile}' star_st*

works.
This find . -name star_st* -exec head -1 {} + | grep "1175 876330" gives output; however, this awk '$0 ~ /1175 876330/ {nextfile}' star_st* does not show any output.
# 17  
Old 05-19-2015
Put a ; or a <newline> after the /.../. This is about 50% faster on my machine.
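To spell out why that matters (a minimal sketch with made-up file names, just for illustration): without a terminator, {nextfile} becomes the action attached to the pattern, so matching lines are never printed; with the ; (or newline) the pattern stands alone and falls back to the default print action, and {nextfile} runs as a separate rule on every line.
Code:
# two rules: print line 1 if it matches, then (on every line) skip to the next file,
# so only the first line of each file is ever examined
awk '/1175 876330/; {nextfile}' star_st1 star_st2

# one rule: for matching lines, skip to the next file -- nothing is ever printed
awk '/1175 876330/ {nextfile}' star_st1 star_st2
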
# 18  
Old 05-19-2015
Quote:
Originally Posted by mohtashims
This find . -name star_st* -exec head -1 {} + | grep "1175 876330" gives output; however, this awk '$0 ~ /1175 876330/ {nextfile}' star_st* does not show any output.
That is strange, before you said that:
Code:
find . -name star_st* -exec head -1 {} + | grep "1175 876330"

gave you a syntax error (which it would unless you ran it in a directory where there is no more than one file with a name starting with star_st). We told you before that the -name primary's argument has to be quoted to work properly.
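To make the quoting point concrete (my own example, assuming two files star_st1 and star_st2 exist in the current directory):
Code:
# unquoted: the shell expands the pattern first, so find is actually run as
#   find . -name star_st1 star_st2 -exec head -1 {} +
# and complains about the stray operand star_st2
find . -name star_st* -exec head -1 {} +

# quoted: find receives the literal pattern and matches it against each file name itself
find . -name "star_st*" -exec head -1 {} +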

I see that RudiC has already explained that I didn't put that newline in the middle of my awk script just for the fun of it.
# 19  
Old 05-19-2015
Quote:
Originally Posted by MadeInGermany
If this still gives "too many arguments", then combine it with find:
Code:
find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} +

@MadeInGermany: I was testing your suggestion versus Don Cragun's find . -name "star_st*" -exec head -1 {} + | grep "1175 876330"

I check whether the result is found using if [ $? -eq 0 ]; then

I am not able to test and compare the performance of the two because MadeInGermany's command always passes the $? -eq 0 condition, even when no results are found. Maybe you [MadeInGermany] can provide a fix for that?

---------- Post updated at 09:18 AM ---------- Previous update was at 09:11 AM ----------

Quote:
Originally Posted by RudiC
Put a ; or a <newline> after the /.../. This is about 50% faster on my machine.
OK, I put a ; but it gives me an error:
Code:
awk '$0 ~ /1175 876330/; {nextfile}' star_st*
bash: /bin/awk: Argument list too long

---------- Post updated at 09:23 AM ---------- Previous update was at 09:18 AM ----------

Quote:
Originally Posted by Don Cragun
That is strange, before you said that:
Code:
find . -name star_st* -exec head -1 {} + | grep "1175 876330"

gave you a syntax error (which it would unless you ran it in a directory where there is no more than one file with a name starting with star_st). We told you before that the -name primary's argument has to be quoted to work properly.

I see that RudiC has already explained that I didn't put that newline in the middle of my awk script just of the fun of it.
I am sorry, Don, I meant "star_st*" and not star_st*.
# 20  
Old 05-19-2015
Pipe to grep; that gives an exit status:
Code:
find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} + | grep ^
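
Wrapped in the if [ $? -eq 0 ] test you are using, that could look like this (a minimal sketch; the -q option just suppresses the matched lines, drop it if you still want to see them):
Code:
if find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} + | grep -q ^
then
    echo "pattern found"
else
    echo "pattern not found"
fi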

# 21  
Old 05-19-2015
Just to be clear, if the command:
Code:
find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} +

yields exit status 0, all of the files with names starting with star_st were successfully processed by awk and the directory containing them was successfully processed by find, but it doesn't tell you if any lines were written to standard output while processing those files.

On the other hand, the commands:
Code:
find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} + | grep ^

and:
Code:
find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} + | fgrep 1

(which might or might not be slightly faster) will give you an exit status that indicates whether grep or fgrep printed a line read from the find/awk pipeline (exit status 0), hit end-of-file before reading any line from the pipeline (exit status 1), or had some internal error (exit status greater than 1). It will tell you absolutely nothing about whether find, or any of the awk commands it executed, completed successfully or failed.

Some versions of some shells generate an array that contains the exit status of all elements of the latest pipeline executed, but every shell that provides it does it differently and many shells don't do it at all.
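In bash, for example, that array is PIPESTATUS (bash-specific, shown here only as an illustration):
Code:
# bash only: PIPESTATUS[0] holds the exit status of find, PIPESTATUS[1] that of grep
find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} + | grep ^
echo "find/awk status: ${PIPESTATUS[0]}  grep status: ${PIPESTATUS[1]}"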

We can use subshells to save the exit status of each element of a pipeline that will work with any POSIX conforming shell, if that is what you need. So, please explain exactly what information you need to capture from the exit status of the pipeline elements.
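As a rough illustration of that subshell technique (only a sketch of the general idea, with made-up temporary file names; adapt it to whatever status information you actually need):
Code:
# each side of the pipe writes its own exit status to a temporary file,
# which the parent shell reads back after the pipeline completes
status_dir=${TMPDIR:-/tmp}/pipestatus.$$
mkdir "$status_dir" || exit 1

( find . -name "star_st*" -exec awk '/1175 876330/;{nextfile}' {} +
  echo $? > "$status_dir/find"
) | (
  grep ^
  echo $? > "$status_dir/grep"
)

read find_status < "$status_dir/find"
read grep_status < "$status_dir/grep"
rm -rf "$status_dir"
echo "find/awk exit status: $find_status, grep exit status: $grep_status"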