Shell Programming and Scripting — Post 302882575 by greycells, Wednesday 8 January 2014, 02:23 AM
Column total

Input
Code:
 
`0B0A   RDF1+TDEV       45      BL_lmapm03
`0CE7   TDEV            59       BL_lmapm03
 
`0B09   RDF1+TDEV       70       BL_lmapm02
`0CE6   TDEV            59       BL_lmapm02
 
`0B08   RDF1+TDEV       70       BL_lmapm01
`0CE5   TDEV            59       BL_lmapm01

Output needed

Code:
 
`0B0A   RDF1+TDEV       45      BL_lmapm03
`0CE7   TDEV            59       BL_lmapm03
                       [104]
 
`0B09   RDF1+TDEV       70       BL_lmapm02
`0CE6   TDEV            59       BL_lmapm02
                       [129]
 
`0B08   RDF1+TDEV       70       BL_lmapm01
`0CE5   TDEV            59       BL_lmapm01
                       [129]

I tried this …
Code:
nawk '{a[$4]+=$3} END{for(k in a)print "["a[k]"]";}1'

but this prints all the totals only after every record has been read (the for loop runs in the END block), like this:

Code:
`0B0A RDF1+TDEV 45 BL_lmapm03
`0CE7 TDEV 59 BL_lmapm03
`0B09 RDF1+TDEV 70 BL_lmapm02
`0CE6 TDEV 59 BL_lmapm02
`0B08 RDF1+TDEV 70 BL_lmapm01
`0CE5 TDEV 59 BL_lmapm01
[129]
[129]
[104]

Please help me modify it so that the total of $3 is printed for each group of records, grouped by $NF, with the total placed in column 3 directly under each group, as shown in the desired output above.

Thanks

Last edited by Don Cragun; 01-08-2014 at 03:50 AM. Reason: Use code tags rather than font changes to show sample code. (And, for ALL sample output.)
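For reference, one possible approach (a sketch, not from this thread) is to key on the blank lines that separate the groups in the sample input: accumulate column 3 while reading data lines, and print the bracketed total whenever a blank line or end of input is reached. The width of 23 in the printf is only an assumption chosen to line the total up roughly under column 3; adjust it to match the real data.

Code:
nawk '
NF  { sum += $3; print; next }                 # data line: add column 3 and echo the line
sum { printf "%23s[%d]\n", "", sum; sum = 0 }  # blank line after a group: print its total
    { print }                                  # pass the blank separator line through
END { if (sum) printf "%23s[%d]\n", "", sum }  # total for the last group (no trailing blank line)
' file

Here file stands for your input file. If the real data has no blank separator lines, the same idea works by printing and resetting the total whenever $NF changes from one data line to the next.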
 
