Shell Programming and Scripting: Summing columns over group of lines
Posted by Don Cragun on Wednesday, 24th of April 2013, 08:38 AM
The short answer is yes.
But I'm not sure I understand your requirements. If you want help creating an awk script to perform this task, please answer the following questions (a tentative sketch based on one possible reading of your request follows the list):
  1. Do you want input fields 2 and 4 to be removed from every input line?
  2. If the value in input field 1 is not a constant in each set of four input lines, what happens?
    a. Is that set skipped? If so, should an error be printed?
    b. Should the value from the first line in the set be printed?
    c. Should the value from the last line in the set be printed?
  3. If the value in input field 3 is not a constant in each set of four input lines, what happens?
    a. Is that set skipped? If so, should an error be printed?
    b. Should the value from the first line in the set be printed?
    c. Should the value from the last line in the set be printed?
  4. Is the number of fields a constant for a given input file?
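
In the meantime, here is a minimal sketch based on one possible reading of your request. The assumptions are mine, not confirmed requirements: the input is whitespace separated and comes in groups of four lines, fields 1 and 3 are constant within each group, fields 2 and 4 are dropped, fields 5 onward are numeric and are summed across the group, and the number of fields does not change. The filename file is just a placeholder.
Code:
awk '
{       # Accumulate the numeric fields (5 onward) for the current group.
        for (i = 5; i <= NF; i++) sum[i] += $i
        # Remember fields 1 and 3 from the first line of the group
        # (assumed constant within the group).
        if (NR % 4 == 1) { f1 = $1; f3 = $3 }
        # On the last line of each 4-line group, print the saved fields and
        # the per-group sums, then reset the accumulators.
        if (NR % 4 == 0) {
                printf("%s %s", f1, f3)
                for (i = 5; i <= NF; i++) {
                        printf(" %s", sum[i])
                        sum[i] = 0
                }
                print ""
        }
}' file

If any of those assumptions is wrong, your answers to the questions above will determine how the script has to change.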
 
