Shell Programming and Scripting: While loop problem taking too long
Post 302778863 by SkySmart, 03-11-2013, 04:33 PM
Quote:
Originally Posted by DGPickett
What are you trying to do? Extract log lines for each of 10 hosts? What kind of output do you want?

Almost everywhere a file is used, you can pipe or load a variable.
Yes, I want to extract log lines for each of the 10 hosts, and I just want the output to be whatever is found in the logs for each host in the list.
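Since the loop in question re-reads the log once per host, a single pass over the log is usually the fix. Below is a minimal sketch, assuming the host list is in a file called hosts.txt (one name per line) and the log is app.log; both names are placeholders, not taken from this thread.

Code:
#!/usr/bin/env bash
# One awk pass over the log: any line containing a listed host is written
# to a per-host output file such as web01.matches (hosts.txt and app.log
# are hypothetical names).
awk 'NR == FNR { hosts[$1]; next }      # first file: load the host list
     {
         for (h in hosts)
             if (index($0, h))          # plain substring match, no regex
                 print > (h ".matches")
     }' hosts.txt app.log

If one combined dump is enough, a single grep -F -f hosts.txt app.log also reads the log only once.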

Last edited by SkySmart; 03-11-2013 at 05:46 PM.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Problem taking input from file with for loop

I am trying to take input from a file and feed it to a bash script. The script is meant to be a foreach loop. I would like the script to process each item in the list one by one and direct the output to a file. # cat 1loop #!/bin/bash # this 2>&1 to redirect STDERR & STDOUT to file... (4 Replies)
Discussion started by: bash_in_my_head
4 Replies
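A hedged sketch of the loop being described, with input.txt and results.out as placeholder names; the redirection on the done line captures both stdout and stderr of the whole loop in one file.

Code:
#!/bin/bash
# Process each line of input.txt; all output and errors go to results.out.
while IFS= read -r item; do
    printf 'processing %s\n' "$item"    # replace with the real per-item work
done < input.txt > results.out 2>&1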

2. Red Hat

login process taking a long time

I'm having a bit of a login performance issue and wondering if anyone has ideas about where I might look. Here's the scenario: Linux Red Hat ES 4 update 5. Regardless of where I log in from (ssh or the text console), after providing the password the system seems to pause for between 30... (4 Replies)
Discussion started by: retlaw
4 Replies
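The thread does not settle on a cause, but a common first check (an assumption, not something confirmed here) is slow name-service lookups, which can delay both ssh and console logins and are easy to time directly.

Code:
# Multi-second delays here usually point at DNS/NIS/LDAP settings in
# /etc/nsswitch.conf rather than at sshd itself (placeholder address).
time getent hosts 192.0.2.10

# Watch the ssh handshake to see exactly which phase stalls.
ssh -vvv user@server exit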

3. Shell Programming and Scripting

For Loop Taking Too Long

I'm new to UNIX scripting. Please help. I have about 10,000 files in the $ROOTDIR/scp/inbox/string1 directory to compare with the 50 files in the /$ROOTDIR/output/tma/pnt/bad/string1/ directory, and it takes about 2 hours plus to complete the for loop. Is there a better way to rewrite the... (5 Replies)
Discussion started by: hanie123
5 Replies
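If the comparison in that thread is by file name (the thread does not say), the nested loop can be replaced by a single comm(1) call over two sorted name lists; a hedged sketch using the thread's directories and placeholder temp files:

Code:
#!/bin/bash
ls "$ROOTDIR/scp/inbox/string1"          | sort > /tmp/inbox.list
ls "$ROOTDIR/output/tma/pnt/bad/string1" | sort > /tmp/bad.list
comm -12 /tmp/inbox.list /tmp/bad.list   # names present in both directories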

4. UNIX for Advanced & Expert Users

Comparison and For Loop Taking Too Long

I'd like to: 1. Check and compare the 10,000 pnt files, each containing a single record, from the /$ROOTDIR/scp/inbox/string1 directory against the 39 bad pnt files from the /$ROOTDIR/output/tma/pnt/bad/string1 directory, based on the fam_id column value starting at position 38 through 47 of the record below. Here is... (1 Reply)
Discussion started by: hanie123
1 Replies
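A hedged sketch of one way to avoid pairing all 10,000 files with all 39 bad files: pull the fam_id (characters 38-47) out of every bad file once, then test each inbox file against that small set. The paths follow the thread; the temp file name is a placeholder.

Code:
#!/bin/bash
# Collect the bad fam_ids once.
for f in "$ROOTDIR"/output/tma/pnt/bad/string1/*; do
    cut -c38-47 "$f"
done | sort -u > /tmp/bad_famids

# Check each inbox file against the small bad-id list.
for f in "$ROOTDIR"/scp/inbox/string1/*; do
    id=$(cut -c38-47 "$f")
    grep -qxF "$id" /tmp/bad_famids && echo "$f matches bad fam_id $id"
done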

5. Shell Programming and Scripting

<AIX>Problem in purge script, taking very very long time to complete 18.30hrs

Hi, I have a script here which is used to purge older files/directories based on a defined purge period. The script consists of 45 find commands, where each command needs to traverse more than a million directories, so a single find command takes around 22-25 mins... (7 Replies)
Discussion started by: sravicha
7 Replies
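One common way to cut that runtime (an assumption about the script, which is not shown) is to merge rules that walk the same tree into a single find invocation, so each directory is traversed once instead of 45 times. The path, patterns, and age below are placeholders.

Code:
#!/bin/sh
# One traversal applying several purge rules at once; verify the output
# before adding -delete or -exec rm.
find /path/to/data \
    \( -name '*.log' -o -name '*.tmp' -o -name 'core.*' \) \
    -type f -mtime +30 -print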

6. UNIX for Dummies Questions & Answers

grep -f taking long time for big file

grep -f is taking a long time to compare big files; is there a faster alternative? I am using grep -f file1 file2 to check for dups/common rows. But file1 is 5 GB and file2 is 50 MB, and it is taking such a long time to compare the files. Do we have any... (10 Replies)
Discussion started by: gkskumar
10 Replies
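Assuming whole lines are being compared (the thread is about duplicate/common rows), loading the smaller file into an awk hash and streaming the big file past it once is usually far faster than grep -f with a huge pattern file; the output file name is a placeholder.

Code:
# Print the rows of the 5 GB file that also appear in the 50 MB file.
awk 'NR == FNR { seen[$0]; next } $0 in seen' file2 file1 > common_rows

# If the rows can be sorted, comm(1) is another low-memory option:
#   sort file1 > f1.s; sort file2 > f2.s; comm -12 f1.s f2.s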

7. UNIX for Dummies Questions & Answers

Job is taking long time

Hi, we have 20 jobs scheduled, and one of them is taking a long time and not completing. If we do not terminate it, it runs indefinitely, even though the job's normal completion time is 5 minutes. The job deletes some records from the table and runs two insert statements and one select... (7 Replies)
Discussion started by: ajaykumarkona
7 Replies

8. Solaris

Re-sync Taking Extremely Long.

It's almost 3 days now and my resync/re-attach is only at 80%. Is there something I can check in Solaris 10 that would explain the degradation? It's only a standby machine; my live system completed in 6 hrs. (9 Replies)
Discussion started by: ravzter
9 Replies

9. UNIX for Dummies Questions & Answers

ls is taking long time to list

Hi, all the data is kept on a NetApp filer over NFS. Some directories are fast when doing ls, but a few of them are slow. After running it a few times, it becomes fast; then after a few minutes it becomes slow again. Can you advise what's going on? The one directory I am most interested in is giving... (3 Replies)
Discussion started by: samnyc
3 Replies
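A hedged way to see where the time goes (not a confirmed diagnosis for that directory): compare a bare name listing with a long listing, since ls -l has to fetch attributes for every entry over NFS while a plain unsorted listing does not.

Code:
cd /path/to/slow/dir            # placeholder path
time ls -f > /dev/null          # unsorted names only
time ls -l > /dev/null          # stats every entry over NFS

# On the client, nfsstat -c shows whether getattr/lookup counts spike
# during the slow run.
nfsstat -c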

10. Shell Programming and Scripting

Rm -rf is taking very long, will it timeout?

I have so many (hundreds of thousands of) files and directories within this one specific directory that my "rm -rf" command to delete them has been taking forever. I did this via SSH; my question is: if my SSH connection times out before rm -rf finishes, will it continue to delete all of those... (5 Replies)
Discussion started by: phpchick
5 Replies
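When the SSH session drops, the remote shell normally sends SIGHUP to the foreground rm, so the delete does not keep running on its own; detaching it first avoids the problem. The directory path below is a placeholder.

Code:
# Let the delete survive a dropped connection.
nohup rm -rf /path/to/huge/dir > /tmp/rm.log 2>&1 &

# Or run it inside a reattachable session:
#   screen -S cleanup      (run rm -rf there, detach with Ctrl-a d,
#   reattach later with: screen -r cleanup)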
TRIMHISTORY(8)                  System Manager's Manual                  TRIMHISTORY(8)

NAME
       trimhistory - Remove old Xymon history-log entries

SYNOPSIS
       trimhistory --cutoff=TIME [options]

DESCRIPTION
       The trimhistory tool is used to purge old entries from the Xymon history logs.
       These logfiles accumulate information about all status changes that have
       occurred for any given service, host, or the entire Xymon system, and are used
       to generate the event- and history-log webpages. Purging old entries can be
       done while Xymon is running, since the tool takes care not to commit updates
       to a file if it changes mid-way through the operation. In that case, the
       update is aborted and the existing logfile is left untouched.

       Optionally, this tool will also remove logfiles from hosts that are no longer
       defined in the Xymon bb-hosts(5) file. As an extension, even logfiles from
       services can be removed, if the service no longer has a valid status-report
       logged in the current Xymon status.

OPTIONS
       --cutoff=TIME
              This defines the cutoff-time when processing the history logs. Entries
              dated before this time are discarded. TIME is specified as the number
              of seconds since the beginning of the Epoch. This is easily generated
              by the GNU date(1) utility, e.g. the following command will trim
              history logs of all entries prior to Oct. 1st 2004:
              trimhistory --cutoff=`date +%s --date="1 Oct 2004"`

       --outdir=DIRECTORY
              Normally, files in the BBHIST directory are replaced. This option
              causes trimhistory to save the shortened history logfiles to another
              directory, so you can verify that the operation works as intended. The
              output directory must exist.

       --drop
              Causes trimhistory to delete files from hosts that are not listed in
              the bb-hosts(5) file.

       --dropsvcs
              Causes trimhistory to delete files from services that are not currently
              tracked by Xymon. Normally these files would be left untouched if only
              the host exists.

       --droplogs
              Process the BBHISTLOGS directory also, and delete status-logs from
              events prior to the cut-off time. Note that this can dramatically
              increase the processing time, since there are often lots and lots of
              files to process.

       --progress[=N]
              This will cause trimhistory to output a status line for every N history
              logs or status-log collections it processes, to indicate how far it has
              progressed. The default setting for N is 100.

       --env=FILENAME
              Loads the environment from FILENAME before executing trimhistory.

       --debug
              Enable debugging output.

FILES
       $BBHIST/allevents         The eventlog of all events that have happened in Xymon.
       $BBHIST/HOSTNAME          The per-host eventlogs.
       $BBHIST/HOSTNAME.SERVICE  The per-service eventlogs.
       $BBHISTLOGS/*/*           The historical status-logs.

ENVIRONMENT VARIABLES
       BBHIST       The directory holding all history logs.
       BBHISTLOGS   The top-level directory for the historical status-log collections.
       BBHOSTS      The location of the bb-hosts file, holding the list of currently
                    known hosts in Xymon.

SEE ALSO
       xymon(7), bb-hosts(5)

Xymon Version 4.2.3: 4 Feb 2009                                          TRIMHISTORY(8)
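For reference, a short usage sketch built only from the options documented above; the cutoff date and output directory are placeholders:

Code:
# Trial run: write the trimmed logfiles to a separate directory for inspection.
mkdir -p /tmp/trimcheck
trimhistory --cutoff=`date +%s --date="1 Jan 2013"` --outdir=/tmp/trimcheck --progress=500

# Once satisfied, trim in place and also drop logs for hosts no longer in bb-hosts.
trimhistory --cutoff=`date +%s --date="1 Jan 2013"` --drop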