While loop problem taking too long
# 1  
Old 03-11-2013
While loop problem taking too long

Code:
while read myhosts
do
    while read discovered
    do
        echo "$discovered"
    done < $LOGFILE | grep -Pi "[a-z]|[A-Z]" | egrep "${myhosts}" | egrep "CRITICAL" | awk -F";" '{print $3}' | sort -n | uniq
done < $SERVERS | egrep -v "#"

Can the above be fixed? For some weird reason it's taking forever to complete.

The logfile is only 31MB, and the list of servers in "SERVERS" has only 10 entries.
# 2  
Old 03-11-2013
Whenever you have grep | grep | grep | awk | sed | cut, you might as well do it all in one awk.

Also, your inner 'while read' loop is pointless, you could have just given grep the filename.

Also, you don't need -P to do [a-z]|[A-Z], just do [a-zA-Z].

Also, half of the greps in that huge line seem redundant -- searching for [A-Z] when all lines must have CRITICAL in them, etc.

Also, you can feed grep multiple hosts instead of running it over and over and over and...

Code:
grep -v "^#" < $SERVERS > /tmp/myhosts

grep -F -f /tmp/myhosts $LOGFILE | awk -F";" '/CRITICAL/ { print $3 }' | sort -n | uniq

rm -f /tmp/myhosts
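Following the "do it all in one awk" advice, the whole pipeline can also be collapsed into a single awk pass with no temp file. A sketch, assuming (as in the thread) a semicolon-delimited log with the wanted value in field 3; the function name is made up for illustration:

```shell
# critical_fields <serverlist> <logfile>
# Reads the server list first (skipping comments and blank lines), then
# scans the log once, printing field 3 of CRITICAL lines that mention
# any listed host.
critical_fields() {
    awk -F";" '
        NR == FNR {                     # first file: the server list
            if ($0 !~ /^#/ && NF) hosts[$0]
            next
        }
        /CRITICAL/ {                    # second file: the log
            for (h in hosts)
                if (index($0, h)) { print $3; break }
        }
    ' "$1" "$2" | sort -n | uniq
}
```

Called as `critical_fields "$SERVERS" "$LOGFILE"`, this reads each input file exactly once instead of rescanning the log per host.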


Last edited by Corona688; 03-11-2013 at 12:31 PM..
# 3  
Old 03-11-2013
I guess this should be

Code:
done < $LOGFILE | grep -Pi "[a-z]|[A-Z]" | egrep "${myhosts}" | egrep "CRITICAL" | awk -F";" '{print $3}' | sort -n | uniq

changed to

Code:
done < <(grep -Pi "[a-z]|[A-Z]" "$LOGFILE") | egrep "${myhosts}" | egrep "CRITICAL" | awk -F";" '{print $3}' | sort -n | uniq

What say? :)
# 4  
Old 03-11-2013
Quote:
Originally Posted by Corona688
Whenever you have grep | grep | grep | awk | sed | cut, you might as well do it all in one awk.

Also, your inner 'while read' loop is pointless, you could have just given grep the filename.

Also, you don't need -P to do [a-z]|[A-Z], just do [a-zA-Z].

Also, half of the greps in that huge line seem redundant -- searching for [A-Z] when all lines must have CRITICAL in them, etc.

Also, you can feed grep multiple hosts instead of running it over and over and over and...

Code:
grep -v "^#" < $SERVERS > /tmp/myhosts

grep -F -f /tmp/myhosts $LOGFILE | awk -F";" '/CRITICAL/ { print $3 }' | sort -n | uniq

rm -f /tmp/myhosts


I want to avoid redirecting anything to files. I want this done directly in the script. Is this possible?
# 5  
Old 03-11-2013
What are you trying to do? Extract log lines for each of the 10 hosts? What kind of output do you want?

Almost everywhere a file is used, you can pipe or load a variable.
# 6  
Old 03-11-2013
Quote:
Originally Posted by SkySmart
I want to avoid redirecting anything to files. I want this done directly in the script. Is this possible?
Any particular reason, when using a file seems the obvious, efficient, and portable way to do it?

I suppose you could try this:

Code:
set -- `grep -v "^#" $SERVERS`
IFS="|"

egrep "$*" $LOGFILE | awk -F";" '/CRITICAL/ { print $3 }' | sort -n -u

But it will not work if the lines you match in $SERVERS have spaces in them, and it will obliterate your $1 $2 ... variables.
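If clobbering `$1 $2 ...` and `IFS` is a concern, the alternation can be built in an ordinary variable instead. A sketch with a made-up function name, assuming paste(1) is available and the same semicolon-delimited log as above:

```shell
# critical_by_pattern <serverlist> <logfile>
# Joins the non-comment server lines into an egrep alternation
# ("hostA|hostB|...") without touching IFS or the positional parameters.
critical_by_pattern() {
    pattern=$(grep -v "^#" "$1" | paste -sd'|' -)
    egrep "$pattern" "$2" | awk -F";" '/CRITICAL/ { print $3 }' | sort -n | uniq
}
```

This keeps everything in the script, still only scans the log once, and has the same caveat about host entries containing regex metacharacters.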
# 7  
Old 03-11-2013
Quote:
Originally Posted by DGPickett
What are you trying to do? Extract log lines for each of the 10 hosts? What kind of output do you want?

Almost everywhere a file is used, you can pipe or load a variable.
Yes, I want to extract log lines for each of the 10 hosts, and I just want the output to be whatever is found in the logs for each host in the list.
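For grouped per-host output like that, one cheap approach is to pre-filter the log for CRITICAL once per host and print a header line above each host's matches. A sketch (function name and header format are made up for illustration):

```shell
# per_host_report <serverlist> <logfile>
# For each non-comment host in the list, prints a "== host ==" header
# followed by that host's CRITICAL log lines.
per_host_report() {
    grep -v "^#" "$1" | while read -r host; do
        echo "== $host =="
        grep "CRITICAL" "$2" | grep "$host"
    done
}
```

This runs grep twice per host, which is fine for 10 hosts; for a long host list the single-pass awk approaches above would scale better.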

Last edited by SkySmart; 03-11-2013 at 05:46 PM..