Hi all,
We have a setup where our application runs on two AIX servers (AIX 6.1, 16 CPUs, p5 570 boxes). The boxes act as disaster recovery servers for each other, i.e. if one box fails, the whole load will run out of the other box.
Average CPU utilization on each box is between 30% and 40%, with a maximum of around 65%.
The question has been raised whether one box's processing capacity is enough to run the whole load, and what the impact would be if CPU utilization reaches close to 100% (assuming memory and all other parameters are not a problem).
What would be the best way to estimate the average and worst-case impact on application performance if the whole load is run out of one box?
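For the "close to 100%" part, one rough way I have seen it reasoned about is a single-queue approximation, where mean response time grows roughly like 1/(1 - utilization). A minimal sketch of that curve is below; note this M/M/1-style formula is a generic assumption on my part, not AIX-specific, and a 16-CPU box should behave somewhat better than this at moderate utilization:

    # Rough queueing-style estimate of response-time inflation as CPU
    # utilization (rho) approaches 1.0. Service time is normalized to 1.
    # This is a generic single-queue (M/M/1-style) approximation, not an
    # AIX- or workload-specific model.
    for rho in (0.35, 0.65, 0.80, 0.90, 0.95, 0.99):
        slowdown = 1.0 / (1.0 - rho)  # mean response time relative to service time
        print(f"utilization {rho:4.0%} -> response time ~{slowdown:5.1f}x service time")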
What we have done so far:
- taken CPU utilization at one-minute intervals from both servers
- arithmetically added the two per-minute figures to arrive at a theoretical combined utilization, which can exceed 100%
- estimated how long the combined utilization would stay above 100% over a day
- then arrived at a performance hit based on those two factors (a rough sketch of this calculation follows below).
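Here is a minimal sketch of that calculation, assuming each server exports one sample per minute as simple "timestamp,utilization" CSV lines (the file names and format are just placeholders for whatever sar/nmon export we actually use):

    import csv

    def load_samples(path):
        """Read per-minute samples as {timestamp: cpu_percent} from a CSV
        with lines like '2024-01-01 00:00,37.5' (format is an assumption)."""
        with open(path) as f:
            return {row[0]: float(row[1]) for row in csv.reader(f) if row}

    a = load_samples("serverA_cpu.csv")  # hypothetical export from box A
    b = load_samples("serverB_cpu.csv")  # hypothetical export from box B

    # Step 2: arithmetically add the two per-minute utilizations.
    combined = {ts: a[ts] + b[ts] for ts in a.keys() & b.keys()}

    # Step 3: how long would the combined load sit above 100% of one box?
    overloaded = {ts: u for ts, u in combined.items() if u > 100.0}
    print(f"minutes above 100%: {len(overloaded)} of {len(combined)}")

    # Step 4 (very rough): in an overloaded minute, work arrives faster
    # than one box can serve it, so as a first cut the backlog in that
    # minute stretches by combined/100 (e.g. 130% -> ~1.3x).
    if overloaded:
        avg = sum(overloaded.values()) / len(overloaded)
        worst = max(overloaded.values())
        print(f"avg overload factor: {avg/100:.2f}x, worst: {worst/100:.2f}x")

The last step is the shakiest part: a minute at a theoretical 130% does not translate linearly into a 30% slowdown, which is part of why I would like a sanity check on the whole approach.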
I would like to check with the experts here: is this the right approach, and what other factors should be taken into consideration?
Thanks
Moderator's Comments:
edit by bakunin: please stay away from formatting your text. I'd be eternally grateful for not having to clear out half a ton of superfluous SIZE-, FONT- and whatnot-tags again. Thank you.