Shell Programming and Scripting: Sum up the column values group by using some field
Post 302730537 by manas_ranjan on Tuesday, 13th of November 2012, 06:45:52 AM
Thanks Sky, but the issue is that I can't use Perl. Could I get the same suggestion in awk or something like that?
Also, the output doesn't match the requested output: the total var sum for the second date, 9 Nov, is not summed up, and some columns for 9 Nov are missing.

Last edited by manas_ranjan; 11-13-2012 at 07:59 AM..
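
Since the original data isn't quoted in this reply, here is a minimal awk sketch of the group-by-date summing being asked about; the field positions (date in field 1, value to total in field 3) and the file name are assumptions:

Code:
# assumed layout: date in field 1, value to sum in field 3 -- adjust to the real file
awk '{ total[$1] += $3 } END { for (d in total) print d, total[d] }' inputfile

The associative array keyed on the date does the grouping: every line adds its value to the bucket for its date, and the END block prints one line per date.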
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to sum column 1 values

I have a file like this and I want to sum all column 1 values. Input: A 2 A 3 A 4 B 4 B 2 Output: A 9 B 6 (3 Replies)
Discussion started by: suresh3566
3 Replies

2. Shell Programming and Scripting

Column sum group by uniq records

Dear All, I want to get help for the case below. I have a file like this: saman 1 gihan 2 saman 4 ravi 1 ravi 2 and I want to get the result saman 5 gihan 2 ravi 3. Please help me. (17 Replies)
Discussion started by: Nayanajith
17 Replies

3. Shell Programming and Scripting

print unique values of a column and sum up the corresponding values in next column

Hi All, I have a file with 3 columns (string string integer): a b 1 x y 2 p k 5 y y 4 ..... ..... Question: I want to get the unique values of column 2 in sorted order (on column 2) and the sum of the 3rd column of the corresponding rows, e.g. the above file should return the... (6 Replies)
Discussion started by: amigarus
6 Replies
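
A hedged sketch for the request in thread 3 above (unique, sorted values of column 2 with the summed column 3), assuming whitespace-separated fields:

Code:
# sum column 3 per unique value of column 2, then sort on that key
awk '{ sum[$2] += $3 } END { for (k in sum) print k, sum[k] }' file | sort -k1,1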

4. Shell Programming and Scripting

Sum of column by group wise

Hello All, I have a problem with summing a column by group. Input file - COL_1,COL_2,COL_3,COL_4,COL_5,COL_6,COL_7,COL_8,COL_9,COL_10,COL_11 3010,21,1923D ,6,0,0.26,0,0.26,-0.26,1,200807 3010,21,192BI ,6,24558.97,1943.94,0,1943.94,22615.03,1,200807 3010,21,192BI... (8 Replies)
Discussion started by: jambesh
8 Replies
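
For the CSV case in thread 4, a sketch that skips the header line and groups on a composite key; the choice of key fields (1, 2 and 11) and of the summed field (5) is an assumption, since the thread doesn't spell them out:

Code:
# skip the COL_1,... header, group on fields 1, 2 and 11, total field 5
awk -F, 'NR > 1 { sum[$1 FS $2 FS $11] += $5 }
         END    { for (k in sum) print k, sum[k] }' input.csv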

5. Shell Programming and Scripting

Getting a sum of column values

I have a file in the following layout: 201008005946873001846130058030701006131840000000000000000000 201008006784994001154259058033001009527844000000000000000000 201008007323067002418095058034801002418095000000000000000000 201008007697126001722141058029101002214158000000000000000000... (2 Replies)
Discussion started by: jclanc8
2 Replies
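
Thread 5 looks like fixed-width records with no delimiters, so the field has to be carved out with substr(); the offset and width below are purely hypothetical, because the real record layout isn't given:

Code:
# hypothetical layout: the amount occupies characters 31-45 of each record
awk '{ total += substr($0, 31, 15) + 0 } END { print total }' file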

6. UNIX for Dummies Questions & Answers

awk to sum column field from duplicate row/lines

Hello, I am new to the Linux environment. I am working on a Linux script which should send an automatic email based on a specific condition from a log file. Below is the sample log file: Name m/c usage abc xxx 10 abc xxx 20 abc xxx 5 xyz ... (6 Replies)
Discussion started by: asjaiswal
6 Replies
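
Thread 6 layers a condition on top of the group sum (act only when a total crosses some limit); a sketch with a hypothetical threshold of 30, assuming the name in field 1 and the usage in field 3:

Code:
# total usage per name (non-numeric header lines are skipped); report only names whose total exceeds 30
awk '$3 ~ /^[0-9]+$/ { use[$1] += $3 }
     END { for (n in use) if (use[n] > 30) print n, use[n] }' logfile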

7. Shell Programming and Scripting

Sum column values based in common identifier in 1st column.

Hi, I have a table to be imported into R as a matrix or data.frame, but I first need to edit it because I've got several lines with the same identifier (1st column), so I want to sum each column (2nd-nth) per identifier (1st column). The input, for example, after sorting: K00001 1 1 4 3... (8 Replies)
Discussion started by: sargotrons
8 Replies
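
For the matrix-style case in thread 7 (same identifier repeated, every numeric column summed per identifier), a sketch assuming whitespace-separated fields, integer values and the same column count on every line:

Code:
# accumulate columns 2..NF per identifier in column 1 (integer values assumed by the %d format)
awk '{ for (i = 2; i <= NF; i++) sum[$1,i] += $i; cols = NF; seen[$1] = 1 }
     END { for (id in seen) { printf "%s", id
           for (i = 2; i <= cols; i++) printf " %d", sum[id,i]
           print "" } }' file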

8. Shell Programming and Scripting

Sum column values matching other field

This is part of a KT I am going through. I am writing a bash shell script on Linux where I have 2 columns: the 1st signifies the nth hour (00, 01, 02...23) and the 2nd the file size. Sample data attached. The desired output is 3 columns giving the nth hour, the number of entries in that hour and... (3 Replies)
Discussion started by: alpha_1
3 Replies
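
For thread 8 (hour of day in the 1st column, file size in the 2nd, wanting hour / entry count / total size), a sketch assuming whitespace-separated columns:

Code:
# output: hour, number of entries in that hour, total size for that hour
awk '{ cnt[$1]++; sz[$1] += $2 } END { for (h in cnt) print h, cnt[h], sz[h] }' file | sort -k1,1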

9. UNIX for Dummies Questions & Answers

Match sum of values in each column with the corresponding column value present in trailer record

Hi All, I have a requirement where I need to find the sum of the values from columns D through O in a CSV file and check whether the sum of each individual column matches the value present for that column in the trailer record. For example, let's assume for column D... (9 Replies)
Discussion started by: tpk
9 Replies
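
Thread 9 is a checksum-style comparison; a sketch assuming a comma-separated file whose last record is the trailer and whose columns D through O are fields 4 through 15:

Code:
# keep running totals of fields 4..15; whatever record is read last is treated as the trailer
awk -F, '{ for (i = 4; i <= 15; i++) { sum[i] += $i; last[i] = $i } }
         END { for (i = 4; i <= 15; i++) {
                   data = sum[i] - last[i]        # back the trailer's own value out of the total
                   printf "column %d: sum=%s trailer=%s %s\n", i, data, last[i],
                          (data == last[i]) ? "OK" : "MISMATCH"
               } }' file.csv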

10. UNIX for Beginners Questions & Answers

Sum the values in the column using date column

I have a file which needs to be summed up using the date column. I/P: 2017/01/01 a 10 2017/01/01 b 20 2017/01/01 c 40 2017/01/01 a 60 2017/01/01 b 50 2017/01/01 c 40 2017/01/01 a 20 2017/01/01 b 30 2017/01/01 c 40 2017/02/01 a 10 2017/02/01 b 20 2017/02/01 c 30 2017/02/01 a 10... (6 Replies)
Discussion started by: Booo
6 Replies
GENSKYVEC(1)						      General Commands Manual						      GENSKYVEC(1)

NAME
       genskyvec - compute patch radiance averages for a specific sky

SYNOPSIS
       genskyvec [ -m N ][ -c r g b ]

DESCRIPTION
       Genskyvec samples the Radiance sky description given on the standard input to generate a list of average patch radiances. If there is a sun in the description, genskyvec will include its contribution in the three nearest sky patches, distributing energy according to centroid proximity.

       By default, genskyvec divides the sky into 2305 patches, plus one patch for the ground. This corresponds to Reinhart's extension of the Tregenza sky, where the original 145 patches are subdivided into 16 subpatches, except at the zenith. A different subdivision may be specified via the -m option. The value given will be used to subdivide each dimension, so the default of 4 yields almost 16 times as many patches as the original Tregenza sky, which can be specified with -m 1. A higher resolution sky is generally better for daylight coefficient analysis where solar position is important.

       The -c option may be used to specify a color for the sky. The gray value should equal 1 for proper energy balance. The default sky color is -c 0.960 1.004 1.118.

EXAMPLE
       To generate 578 patches corresponding to a 2x2 subdivision of the Tregenza sky on a sunny equinox noon:

              gensky 9 21 12 | genskyvec -m 2 > sky09_21_12.dat

AUTHOR
       Greg Ward

SEE ALSO
       dctimestep(1), genBSDF(1), gensky(1), gentregvec(1), rtcontrib(1), rtrace(1)

RADIANCE                                12/09/09                                GENSKYVEC(1)