Hi, I have a file that contains the following two columns.
518 _factorial
256 _main
73 _atol
52 ___do_global_ctors
170 ___main
52 ___do_g
How can I calculate the percentage of each value in the first column?
First I need to get the sum of the first column and... (3 Replies)
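One way to do what the post describes is a two-pass awk: the first pass over the file accumulates the column-1 total, the second prints each row's share of it. A sketch, assuming the data lives in a file named counts.txt (the file name is hypothetical):

```shell
# Hypothetical input file holding the rows from the post.
printf '518 _factorial\n256 _main\n73 _atol\n52 ___do_global_ctors\n170 ___main\n52 ___do_g\n' > counts.txt

# Pass 1 (NR==FNR): sum column 1.  Pass 2: print each row's percentage.
awk 'NR == FNR { sum += $1; next }
     { printf "%-20s %6.2f%%\n", $2, 100 * $1 / sum }' counts.txt counts.txt
```

Reading the file twice avoids buffering all rows in memory; with `NR==FNR` the first read only totals, and the second read formats.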
Hi all,
I want to calculate the standard deviation for a column (happens to be column 3).
Does anyone know of a simple awk script to do this?
Thanks (1 Reply)
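The post gives no sample data, so here is a sketch with a made-up data.txt whose third column holds the values; it computes the sample standard deviation in one pass using Welford's update, which avoids the cancellation problems of the naive sum-of-squares formula:

```shell
# Hypothetical sample input: three columns, values in column 3.
printf 'a b 2\na b 4\na b 4\na b 4\na b 5\na b 5\na b 7\na b 9\n' > data.txt

# Welford's one-pass update: maintain the running mean and the sum of
# squared deviations (m2); sample variance is m2/(n-1).
awk '{ n++; d = $3 - mean; mean += d / n; m2 += d * ($3 - mean) }
     END { if (n > 1) printf "%.4f\n", sqrt(m2 / (n - 1)) }' data.txt
```

Use `m2 / n` instead of `m2 / (n - 1)` in the END block if the population standard deviation is wanted rather than the sample one.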
A short excerpt of my .txt file looks like:
CXRA3Z2J9MQKR B
CXRA3Z2J9MQKR A
CXRA3Z2J9MQKR C
CXRA3Z2J9MQKR B
A162JX4ML69UIC C
A162JX4ML69UIC A
FZ9Z19TI2XOA5 A
FZ9Z19TI2XOA5 C
FZ9Z19TI2XOA5 B
FZ9Z19TI2XOA5 B
BRNTTJUB8GXE9 A
BRNTTJUB8GXE9 A
... (7 Replies)
So I have this input
1 10327 rs112750067 T C . PASS DP=65;AF=0.208;CB=BC,NCBI
1 10469 rs117577454 C G . PASS DP=2055;AF=0.020;CB=UM,BC,NCBI
1 10492 rs55998931 C T . PASS DP=231;AF=0.167;CB=BC,NCBI
1 10583 rs58108140 G A ... (3 Replies)
Hello All,
I am trying to create a script that will give me the processes that consume swap in %.
I am using the line below to get it done.
virtual=`echo "$virtual/$swp*100"|bc -l|sed -e "s/\(\.\).*/\1/g"`
but I get the following output after running it:
.039
.110
I want the... (3 Replies)
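The visible problem is that bc prints fractions below 1 without a leading zero (".039" instead of "0.039"). One way around it is to let awk's printf do both the arithmetic and the formatting; the values below are placeholders for whatever `$virtual` and `$swp` the original script computes:

```shell
# Placeholder inputs standing in for the script's real $virtual and $swp.
virtual=12
swp=30720

# awk's printf always emits the leading zero, so no bc/sed post-processing
# is needed.
virtual=$(awk -v v="$virtual" -v s="$swp" 'BEGIN { printf "%.3f", v / s * 100 }')
echo "$virtual"
```

Alternatively, keep bc and reformat afterwards with `printf '%.3f\n' "$virtual"`, which also restores the leading zero.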
Hello Gurus,
I need your kind help to solve the issue below.
I have a directory of flat files and have to calculate the sum of some columns from each flat file.
Say, for flat file 302, I need the summation of the 2nd and 3rd columns.
For flat file 303 I need the column summation of 5 and... (2 Replies)
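The post is cut off, but summing a pair of columns in one whitespace-delimited flat file is a one-liner; the sample rows below are made up, and the column numbers would change per file (e.g. column 5 for file 303), as the post describes:

```shell
# Made-up sample rows for hypothetical flat file "302".
printf '1 10 100\n2 20 200\n3 30 300\n' > 302

# Accumulate columns 2 and 3 and print the totals at EOF.
awk '{ s2 += $2; s3 += $3 } END { print "col2:", s2, "col3:", s3 }' 302
```

For a directory of files with differing column lists, a small wrapper loop (or a lookup table mapping file name to columns, read via `awk -v`) would drive the same pattern.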
Hi All,
I have following html code
<TR><TD>9</TD><TD>AR_TVR_TBS </TD><TD>85000</TD><TD>39938</TD><TD>54212</TD><TD>46</TD></TR>
<TR><TD>10</TD><TD>ASCV_SMY_TBS </TD><TD>69880</TD><TD>33316</TD><TD>45698</TD><TD>47</TD></TR>
<TR><TD>11</TD><TD>ARC_TBS ... (9 Replies)
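The actual question is cut off above; a common task with rows like these is extracting the cell values. Using the `<TD>`/`<TR>` tags themselves as the field separator puts the cell contents into the odd-numbered fields ($3 is the row number, $5 the tablespace name, $7/$9/$11/$13 the figures). A sketch on one of the quoted rows:

```shell
# One row from the post, written to a hypothetical file.
printf '<TR><TD>9</TD><TD>AR_TVR_TBS </TD><TD>85000</TD><TD>39938</TD><TD>54212</TD><TD>46</TD></TR>\n' > rows.html

# FS is a regex matching any of <TR>, </TR>, <TD>, </TD>, so the cell
# values land between the separators.
awk -F '</?T[DR]>' '{ print $5, $7, $13 }' rows.html
```

This only works for flat, one-row-per-line markup like the excerpt; for arbitrary HTML a real parser is the safer tool.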
Is there an awk script that can easily perform the following operation?
I have a data file that is in the format of
1944-12,5.6
1945-01,9.8
1945-02,6.7
1945-03,9.3
1945-04,5.9
1945-05,0.7
1945-06,0.0
1945-07,0.0
1945-08,0.0
1945-09,0.0
1945-10,0.2
1945-11,10.5
1945-12,22.3... (3 Replies)
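The requested operation is cut off above. As one common example for YYYY-MM,value data of this shape, the sketch below totals the values per year; splitting on both "-" and "," makes $1 the year, $2 the month, and $3 the value:

```shell
# A short excerpt of the post's data, written to a hypothetical file.
printf '1944-12,5.6\n1945-01,9.8\n1945-02,6.7\n' > monthly.csv

# Group by year ($1) and sum the monthly values ($3).
awk -F '[-,]' '{ sum[$1] += $3 } END { for (y in sum) printf "%s %.1f\n", y, sum[y] }' monthly.csv | sort
```

Swapping `sum[$1] += $3` for a count and dividing in the END block gives yearly averages instead of totals.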
Hi All,
I have the following time stamp data in 2 columns
Date TimeStamp(also with milliseconds)
05/23/2012 08:30:11.250
05/23/2012 08:30:15.500
05/23/2012 08:31:15.500
.
.
etc
From this data I need the following output.
0.00( row1-row1 in seconds)
04.25( row2-row1 in... (5 Replies)
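A sketch of the row-minus-first-row difference, assuming all rows share one date (crossing midnight would need the date column folded in too): split the HH:MM:SS.mmm timestamp in column 2 on ":" and ".", convert to seconds, and subtract the first row's value.

```shell
# The three timestamps quoted in the post, in a hypothetical file.
printf '05/23/2012 08:30:11.250\n05/23/2012 08:30:15.500\n05/23/2012 08:31:15.500\n' > times.txt

# t[1..4] = hours, minutes, seconds, milliseconds.
awk '{ split($2, t, /[:.]/)
       secs = t[1]*3600 + t[2]*60 + t[3] + t[4]/1000
       if (NR == 1) first = secs
       printf "%05.2f\n", secs - first }' times.txt
```

The `%05.2f` format reproduces the zero-padded "04.25" style shown in the post.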
Hi,
I want to write a script which will add the entries in all columns that share the same column ID. I can do it in Excel, but I need to do this for 384 columns, which will come down to 96 (384/4). How can I do this iteratively?
A A A A B B B B C C C C
1 0 1 0 2 1 4 5 3 4 5 6
2 0 0 2 3 5 70 100 1... (7 Replies)
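Since the IDs come in fixed groups of four (A A A A, B B B B, ...), one approach is to step through the fields four at a time and sum each group; a sketch on a cut-down version of the post's data:

```shell
# Cut-down sample: header row of IDs, then one data row per line.
printf 'A A A A B B B B C C C C\n1 0 1 0 2 1 4 5 3 4 5 6\n' > wide.txt

# Skip the header, then collapse every 4 adjacent columns into their sum
# (384 columns -> 96 outputs, per the post).
awk 'NR == 1 { next }
     { out = ""
       for (i = 1; i <= NF; i += 4)
           out = out (i > 1 ? " " : "") ($i + $(i+1) + $(i+2) + $(i+3))
       print out }' wide.txt
```

If the IDs were not guaranteed to be in contiguous blocks of four, the header row itself could be used as the grouping key (`sum[hdr[i]] += $i`) instead of the fixed stride.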
Discussion started by: Diya123
LEARN ABOUT OPENSOLARIS
igawk
IGAWK(1)                        Utility Commands                       IGAWK(1)
NAME
igawk - gawk with include files
SYNOPSIS
igawk [ all gawk options ] -f program-file [ -- ] file ...
igawk [ all gawk options ] [ -- ] program-text file ...
DESCRIPTION
Igawk is a simple shell script that adds the ability to have ``include files'' to gawk(1).
AWK programs for igawk are the same as for gawk, except that, in addition, you may have lines like
@include getopt.awk
in your program to include the file getopt.awk from either the current directory or one of the other directories in the search path.
OPTIONS
See gawk(1) for a full description of the AWK language and the options that gawk supports.
EXAMPLES
cat << EOF > test.awk
@include getopt.awk

BEGIN {
    while (getopt(ARGC, ARGV, "am:q") != -1)
        ...
}
EOF
igawk -f test.awk
SEE ALSO gawk(1)
Effective AWK Programming, Edition 1.0, published by the Free Software Foundation, 1995.
AUTHOR
Arnold Robbins (arnold@skeeve.com).
ATTRIBUTES
See attributes(5) for descriptions of the following attributes:
+--------------------+-----------------+
| ATTRIBUTE TYPE | ATTRIBUTE VALUE |
+--------------------+-----------------+
|Availability | SUNWgawk |
+--------------------+-----------------+
|Interface Stability | Volatile |
+--------------------+-----------------+
NOTES
Source for gawk is available on http://opensolaris.org.
Free Software Foundation Nov 3 1999 IGAWK(1)