My script reads data out of 144 files per day - one data file every ten minutes. The script works on files with the name structure YYYYMMDDHHMM, so at the end I get one combined file. Timing it gives:
real 0m41.111s
user 0m1.400s
sys 0m4.134s
So far it works properly, but I want to speed it up.
How can I change the script to make it faster?
1. The grep-and-awk combinations can be done with awk alone: instead of piping grep into awk, let awk match the pattern and print the fields in a single process, as sketched below. This will get rid of those 4 grep commands.
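A minimal sketch of the pattern; the search string and field number are made up here, since they depend on your actual script:

  # instead of a two-process pipeline:
  grep "SEARCHSTRING" "$file" | awk '{ print $2 }'

  # let awk do the matching itself in one process:
  awk '/SEARCHSTRING/ { print $2 }' "$file"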
2. The date command is run twice at the beginning, plus once for every file you have, whereas a single date call would do; the call inside the loop is not even needed, since its result never changes. Replace those commands with one call at the very beginning of the script, as sketched below. This step will get rid of your umpteen date commands.
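For example (the format string is an assumption about what your script needs):

  # run date exactly once, at the top of the script:
  today=$(date '+%Y%m%d')

  # then reuse the stored value instead of calling date again, e.g.:
  outfile="result_${today}.txt"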
3. Instead of the cut statements, extract the substrings in the shell itself; one way is the built-in parameter expansion sketched below. This will get rid of those 4 cut commands.
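A sketch assuming the cut commands slice fixed character positions out of the YYYYMMDDHHMM name (the variable names and positions are illustrative):

  name=201201142250    # illustrative YYYYMMDDHHMM filename

  # instead of: year=$(echo "$name" | cut -c1-4)  and so on,
  # use bash/ksh93 substring expansion - no extra process:
  year=${name:0:4}     # 2012
  month=${name:4:2}    # 01
  day=${name:6:2}      # 14
  hour=${name:8:2}     # 22
  minute=${name:10:2}  # 50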
4. In place of the ls command, let the shell expand the filename pattern itself, as in the sketch below: ls is an external command, so every call forks a new process, while the shell's own globbing does not.
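For example (the glob pattern and the loop body are only illustrative):

  # instead of forking an external ls:
  # for f in $(ls 201201*); do ...

  # let the shell expand the pattern itself:
  for f in 201201*; do
      printf '%s\n' "$f"    # stand-in for the real per-file work
  done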
Guru.
Thanks a lot!
With @guruprasadpr's improvements the script takes only half the time, but with @Scrutinizer's improvement it needs just 5.4 seconds.
Really great!!