Code:
awk '($1 ~ /Average/){next} NF==9{print A OFS $1 OFS $4 OFS $7; A++}' OFS="\t\t" Input_file
($1 ~ /Average/){next} ##### If $1 contains the string Average, skip the line and do nothing further with it.
NF==9{print A OFS $1 OFS $4 OFS $7; A++} ##### If the number of fields is 9, print the variable A (it prints NOTHING on the first matching line since it is null at first; I increment it afterwards), followed by $1, $4 and $7, each separated by OFS, the output field separator.
OFS="\t\t" Input_file ##### Setting OFS (the output field separator) to a double tab and mentioning the input file.
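Assembled on one line and run against a small hypothetical sample (standing in for real sar output with 9 fields per data line), the command behaves like this:

```shell
# Hypothetical 9-field sample standing in for real sar output.
cat > /tmp/sar_sample.txt <<'EOF'
12:00:01 AM all 1.23 0.00 0.45 0.02 0.00 98.30
12:10:01 AM all 2.10 0.00 0.80 0.05 0.00 97.05
Average: all 1.66 0.00 0.62 0.03 0.00 97.67
EOF

# Skip Average lines; on 9-field lines print the counter A (empty the
# first time, then 1, 2, ...), plus $1, $4 and $7, double-tab separated.
awk '($1 ~ /Average/){next} NF==9{print A OFS $1 OFS $4 OFS $7; A++}' OFS="\t\t" /tmp/sar_sample.txt
```

Note that the first output line starts with an empty counter field, exactly as described above.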
Thanks,
R. Singh
Could this be simplified in structure instead? I generally don't like cats, but this one might make sense:-
Code:
cat -n sarb.out | while read seq tim bre lre rca bwr lwr wca pre pwr
do
printf "$seq\t$tim\t$rca\t$wca\n"
done
Does that make more sense? I struggle with awk and sed, so this may be clearer to you; maybe not. It's just an alternative that might make it easier to maintain in future, although it will likely run slower than a single well-written awk.
By way of explanation:
The cat -n prefixes each line with a sequence number.
The while read loop reads each field from each line (including the sequence number).
The printf displays the fields you want, tab separated with \t and with a new-line \n thrown in at the end.
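On a small hypothetical stand-in for sarb.out, those three steps look like this (one caveat: variables placed inside the printf format string will misbehave if a field ever contains a % character, so the '%s' form below is the safer spelling of the same idea):

```shell
# Hypothetical stand-in for sarb.out: 9 whitespace-separated fields per line.
cat > /tmp/sarb.out <<'EOF'
00:10:01 1 2 3 4 5 6 7 8
00:20:01 9 8 7 6 5 4 3 2
EOF

# cat -n numbers each line; read splits the fields; printf prints a selection.
cat -n /tmp/sarb.out | while read seq tim bre lre rca bwr lwr wca pre pwr
do
        printf '%s\t%s\t%s\t%s\n' "$seq" "$tim" "$rca" "$wca"
done
```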
Thank you Robin for the nice code. Just to add: the code above will also give the empty line and the line containing the Average string, and the count (seq) should start from the 2nd line of the input file (sar_input in my case). So I edited your code a little, as follows.
Code:
cat -n sar_input | while read seq tim bre lre rca bwr lwr wca pre pwr
do
if [[ $seq -gt 1 && "$tim" != "Average" && -n "$tim" && -n "$rca" && -n "$wca" ]]
then
seq=$((seq - 1))
printf '%s\t%s\t%s\t%s\n' "$seq" "$tim" "$rca" "$wca"
fi
done
Thanks,
R. Singh
I would get rid of the cat (which only creates more work for your system and slows down your output), rewrite the if statements as a condition (to shorten your code), and get rid of the LINE and DATE variables (since they aren't needed) like this:
and it still produces exactly the same output. I would probably change Date in the heading output line to Time, but I like the text heading better than copying the timestamp from the input header. But, that choice is clearly up to you.
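The rewritten loop itself did not survive in this archive, but here is a minimal sketch of what the described changes might look like (no cat, one combined condition, a plain counter; sar_input and the field names are carried over from the earlier posts, and the Average test is widened to also catch "Average:"):

```shell
# Hypothetical sample standing in for sar_input (header, blank, data, summary).
cat > sar_input <<'EOF'
00:00:01 CPU %user %nice %system %iowait %idle

00:05:02 all 3.10 0.00 0.50 0.20 96.20
Average: all 3.10 0.00 0.50 0.20 96.20
EOF

lineno=0 seq=0
while read tim bre lre rca bwr lwr wca pre pwr
do
        lineno=$((lineno + 1))
        # One combined condition: skip the first (header) line, blank lines,
        # the Average/Average: summary line, and lines missing wanted fields.
        if [ "$lineno" -gt 1 ] && [ -n "$tim" ] && [ "${tim#Average}" = "$tim" ] \
                && [ -n "$rca" ] && [ -n "$wca" ]
        then
                seq=$((seq + 1))
                printf '%s\t%s\t%s\t%s\n' "$seq" "$tim" "$rca" "$wca"
        fi
done < sar_input
```

Reading the file with a redirection on done saves the extra cat process, and counting only the printed lines removes the need to subtract 1 afterwards.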
I am facing a situation where the sar -u command is showing 0 for all CPUs, so does that mean all the CPUs are fully utilized? The OS is Oracle Linux 6.8.
01:34:13 PM all 0 0 0 0 0.00 0 (2 Replies)
We're experiencing some intermittent freezes on one of our systems and I'm trying to figure out what is happening.
We're running Solaris 10 zones mounting shares from a NetApp filer over NFS.
On the zone that freezes we have sar running and are getting this output:
SunOS prodserver 5.10... (3 Replies)
I was reviewing yesterday's sar file and came across this strange output! What in the world? Any reason why there's output like that?
SunOS unixbox 5.10 Generic_144488-07 sun4v sparc SUNW,T5240 Solaris
00:00:58 device %busy avque r+w/s blks/s avwait avserv
11:20:01 ... (4 Replies)
Hi,
Does anyone know how to extract sar command output to Excel, or are there any free graphical tools to read this sar log file? Thanks, regards (2 Replies)
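One approach is to capture the text output and convert the whitespace columns to CSV, which Excel opens directly; a sketch on a hypothetical captured sar -u log (on Linux, the sysstat companion tool sadf -d against the binary daily file is another option if it's installed):

```shell
# Hypothetical captured `sar -u` text output.
cat > sar.log <<'EOF'
Linux 2.6.9-67.ELsmp (lrtp50) 02/28/09

00:00:01 CPU %user %nice %system %iowait %idle
00:05:02 all 3.10 0.25 0.50 0.20 95.95
00:10:01 all 2.00 0.00 0.40 0.10 97.50
EOF

# Skip the kernel banner and blank lines; join the remaining columns
# with commas so the result imports cleanly as CSV.
awk 'NR > 2 && NF {out = $1; for (i = 2; i <= NF; i++) out = out "," $i; print out}' sar.log > sar.csv
```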
Hi,
We have the 2 scripts below for reporting sar output, which are pretty much the same.
In the first script I want to add to the program whatever is given in the comments.
In the second script I want to use while true to run the program every hour, along with everything that is in the comments.
Finally I want to club... (0 Replies)
Hi All,
I tried the sar command and the output appears to cover several days.
I would like to see just today's sar output. Please advise.
$sar
Linux 2.6.9-67.ELsmp (lrtp50) 02/28/09
00:00:01 CPU %user %nice %system %iowait %idle
00:05:02 all 3.10... (4 Replies)
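With no arguments, sar on Linux reads the current day's data file, so output spanning several days usually means it is reading an old or appended file; pointing it at today's file explicitly avoids that. A sketch, assuming the common /var/log/sa/saDD layout (the directory is distribution-dependent; Debian-based systems use /var/log/sysstat instead):

```shell
# Build the path of today's daily data file; %d is the day of the month.
today_file=/var/log/sa/sa$(date +%d)
echo "$today_file"

# Then read just that file:
# sar -f "$today_file"
```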
I am trying to collect the sar output for around 90 minutes.
When i do
sar 1 5000 >> /tmp/sar.out
It's not updating the sar.out file. When we decrease the 5000 to a smaller number like 10, I can see the file sar.out updated after the 10 seconds. If I kill my sar while it is running, it's not... (1 Reply)
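A likely explanation is stdio buffering: with stdout redirected to a file, sar's text goes out in block-sized chunks, so a long run shows nothing until a buffer fills or the process exits, and killing it discards the unflushed buffer. Writing binary records with -o sidesteps that; a sketch (the interval and count are chosen to cover 90 minutes, and the file name is an assumption):

```shell
# Record one sample per minute for 90 minutes into a binary activity file.
sar -o /tmp/sar90.bin 60 90 > /dev/null 2>&1 &

# Later (or after the run finishes), render the collected data as text:
# sar -f /tmp/sar90.bin
```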