Posted by varungupta on 05-19-2009 at 02:24 AM
A variable and the sum of its values in huge data

Hi experts,

I have a question.

Running `ps -elf | grep DataFlow` gives the following output:

242001 A mqsiadm 2076676 1691742 0 60 20 26ad4f400 130164 * May 09 - 3:02 DataFlowEngine EAIDVBR1_BROKER 5e453de8-2001-0000-0080-fd142b9ce8cb VIPS_INQ1 0
242001 A mqsiadm 2081016 1691742 0 60 20 372c6c400 81248 * May 09 - 1:03 DataFlowEngine EAIDVBR1_BROKER 6fc4c4d3-2001-0000-0080-867a1d98f606 TEST2 1
242001 A mqsiadm 2088998 1691742 0 60 20 3facfd400 60924 * May 09 - 1:19 DataFlowEngine EAIDVBR1_BROKER c7072ee8-2001-0000-0080-ecd80b0acb20 EAI_CONTROL 0
242001 A mqsiadm 2093084 1691742 0 60 20 3fad7d400 59072 * May 09 - 1:12 DataFlowEngine EAIDVBR1_BROKER 18b140e8-2001-0000-0080-d0035678abd2 WEBSERVICES 0
242001 A mqsiadm 2117660 1691742 0 60 20 1e2d3e400 70436 * May 09 - 1:18 DataFlowEngine EAIDVBR1_BROKER 2fe72de8-2001-0000-0080-ecd80b0acb20 DISCOUNTING_CFC_APPID 0
242001 A mqsiadm 2121744 1691742 0 60 20 372d6c400 84244 * May 09 - 1:57 DataFlowEngine EAIDVBR1_BROKER a62a39e8-2001-0000-0080-ecd80b0acb20 UTILITY 0
242001 A mqsiadm 2125838 1691742 0 60 20 31ad61400 69952 * May 09 - 1:34 DataFlowEngine EAIDVBR1_BROKER 79922de8-2001-0000-0080-ecd80b0acb20 CTAX 0
242001 A mqsiadm 2134042 1691742 0 60 20 2c2dda400 93060 * May 09 - 2:38 DataFlowEngine EAIDVBR1_BROKER 318d32e8-2001-0000-0080-ecd80b0acb20 IDS_INQ1 0
242001 A mqsiadm 2138132 1691742 0 60 20 42d8a400 92372 * May 09 - 2:24 DataFlowEngine EAIDVBR1_BROKER d6d529e8-2001-0000-0080-bbfc9e717a98 CON_PAYOFFS2 0
242001 A mqsiadm 2146328 1691742 0 60 20 3cad7b400 80092 * May 09 - 1:34 DataFlowEngine EAIDVBR1_BROKER 3ed728e8-2001-0000-0080-bbfc9e717a98 CON_INQ 0

In the original post, the execution group names (the next-to-last field, e.g. VIPS_INQ1, TEST2) were bold-faced, and the memory consumed by each group (the tenth field, e.g. 130164) was red. I need to pick each execution group name and sum its memory values, sampled at regular subintervals over an hour (e.g. every 10 seconds).

Please tell me how to achieve this. I guess an array can be used here; a sketch follows below.
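
A minimal sketch of one approach, using an awk associative array keyed on the execution group name. The field positions ($10 for memory, $(NF-1) for the group name) and the 10-second/one-hour schedule are assumptions read off the sample output and the question above, not confirmed details:

#!/bin/sh
# Sample ps output every 10 seconds for an hour (360 iterations) and
# sum the memory column per execution group using an awk array.
# Assumed from the sample output above: field 10 is the memory value,
# and the next-to-last field is the execution group name.
i=0
while [ "$i" -lt 360 ]; do
    # The [D] in the pattern stops grep from matching its own process line
    ps -elf | grep '[D]ataFlowEngine'
    sleep 10
    i=$((i + 1))
done | awk '
    { mem[$(NF-1)] += $10 }                 # accumulate memory per group
    END { for (g in mem) print g, mem[g] }  # one total line per group
'

Printing in the END block gives one total per group once the hour is up; if you want a running total per sample instead, move the print into the main pattern-action block.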

Thanks in advance.
