On top of what bakunin presented eloquently and exhaustively, some comments on your script:
- it doesn't make sense to evaluate just one single process from top's output: several processes can each use up considerable CPU power, especially on a multiuser system where every user may be running CPU-intensive software. Also, you seem to rely on the output being sorted by %CPU, which doesn't have to be the case - better enforce it explicitly (top's -o option).
- why run a handful of extra commands (head, sed, tail, cat, ...) when you are using the powerful awk tool anyhow? Let it do the entire job!
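For illustration, the two points above can usually be combined into one pipeline. This is just a sketch, assuming a procps-ng top (whose batch output has a 7-line header, so the busiest process is on line 8) - the line and field numbers may differ on your system, so check them against your own top output first:

```shell
# Sort explicitly by CPU usage (-o %CPU) instead of trusting the default
# order, run top once in batch mode, and let awk do all the text work:
# print PID ($1), %CPU ($9) and command ($12) of the busiest process.
LC_ALL=C top -b -n 1 -o %CPU | awk 'NR==8 {print $1, $9, $12}'
```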
- top offers the numbers that bakunin alludes to (and that you might want to use) off the shelf, in the first line of its output (man top):
Quote:
system load avg over the last 1, 5 and 15 minutes
So, one quite simple approach to your task might look like this:
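A minimal sketch of such a check (the threshold 0.8 is only a placeholder - choose your own; and since the exact field number of the load average in top's first line depends on how long the system has been up, this version picks the third-from-last field, $(NF-2), rather than a fixed index):

```shell
#!/bin/sh
# Run top once, in batch mode, under the C locale so the number
# format is predictable. awk reads only the first line (NR==1),
# extracts the 1-minute load average - the third-from-last field of
# "load average: X, Y, Z" - and exits 0 (below threshold) or 1
# (above). The "+0" forces numeric conversion, dropping the comma.
if LC_ALL=C top -b -n 1 | awk 'NR==1 {exit (($(NF-2)+0) > 0.8)}'
then
    echo "load is fine"      # awk exited 0
else
    echo "load too high"     # awk exited 1
fi
```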
To make sure that the system's locale doesn't interfere with the number format, we use the C locale to run top in batch mode for a single iteration. awk then checks the one-minute load average against a threshold - as we saw, the value to check against needs to be carefully chosen - and, depending on the comparison's result, exits with 0 or 1, which in turn can be evaluated by the shell to trigger the respective desired action. Be aware that shell and awk have reversed logical meanings of 0 and 1: to the shell, an exit status of 0 means true/success.