[Solved] awk Column difference
Post 302761501 by tukuyomi on Friday 25th of January 2013 05:55:54 PM
Code:
awk '{for(f=3; f<=NF;f+=2){                                   # walk columns 3, 5, 7, ...
        if(NR==1)A[f]=$f;else{B[f]=A[f]-$f;A[f]=$f; $f=B[f]}  # line 1: remember the values; later lines: field becomes previous value minus current
}}1' file                                                     # the bare 1 prints every (possibly modified) record
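
In plain terms: the first record is printed unchanged while its values in columns 3, 5, 7, ... are remembered; on every later record each of those fields is replaced by the previous record's value minus the current one, and the trailing 1 prints each record. A quick run on a made-up input (not from the thread) shows the effect:

Code:
$ cat file
a b 10 x 5
a b 7 x 2
a b 3 x 2
$ awk '{for(f=3; f<=NF;f+=2){if(NR==1)A[f]=$f;else{B[f]=A[f]-$f;A[f]=$f;$f=B[f]}}}1' file
a b 10 x 5
a b 3 x 3
a b 4 x 0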

 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

[Solved] Using awk to obtain minimum of each column (ignoring zeros)

Hi, I have a wide and long dataset which looks as follows: 0 3 4 2 3 0 2 2 ... 3 2 4 0 2 2 2 3 ... 0 3 4 2 0 4 4 4 ... 3 0 4 2 2 4 2 4 ... .... I would like to obtain the minimum of each column (ignoring zero values) so the output would look like: 3 2 4 2 2 2 2 2 I have the... (3 Replies)
Discussion started by: kasan0
3 Replies
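
An untested sketch for that request, assuming whitespace-separated numeric columns and that every column holds at least one non-zero value:

Code:
awk '{ for(i=1; i<=NF; i++) {                                  # scan every column of every row
           if($i != 0 && (!(i in min) || $i < min[i]))
               min[i] = $i                                     # keep the smallest non-zero value seen
       }
     }
     END{ for(i=1; i<=NF; i++) printf "%s%s", min[i], (i<NF ? OFS : ORS) }' file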

2. UNIX for Dummies Questions & Answers

[Solved] Column manipulation

Hi Everyone, I was wondering if someone could help me to transform my data into a format I need. Here is an example of what my data looks like E F G H A 1 2 3 4 B 5 6 7 8 C 9 1 2 3 D 4 5 6 7 and this is what I would need it to look like: AE 1 BE 5 CE 9 DE 4 AF 2 BF 6 CF 1 (6 Replies)
Discussion started by: zajtat
6 Replies
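
One way to sketch that reshaping in awk, assuming the first line holds only the column headers (E F G H) and each data line starts with its row label:

Code:
awk 'NR==1 { cols=NF; for(i=1;i<=NF;i++) hdr[i]=$i; next }     # remember the column headers
     {
       rows++; lab[rows]=$1                                    # remember the row label
       for(i=2;i<=NF;i++) val[rows, i-1]=$i                    # and its cell values
     }
     END{
       for(c=1; c<=cols; c++) {                                # emit column by column: AE 1, BE 5, ...
           for(r=1; r<=rows; r++)
               print lab[r] hdr[c], val[r, c]
       }
     }' file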

3. Shell Programming and Scripting

[Solved] sum up third and second columns by 0 difference

Hi Friends, I have the following file chr1 1 2 chr1 2 3 chr1 3 4 chr1 4 5 chr1 5 6 chr1 19 20 chr1 20 21 chr1 21 22 I want to compare the third column of record 1 to second column of next record and if the difference is zero, consider its third column and match it to next record... (4 Replies)
Discussion started by: jacobs.smith
4 Replies
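
Reading the request as "merge consecutive intervals whenever the next record's second column equals the current third column", a hedged sketch would be the following, which turns the sample into chr1 1 6 and chr1 19 22; if the intended output differs, the thread's accepted answer may handle it differently:

Code:
awk 'NR==1          { c=$1; s=$2; e=$3; next }           # open the first interval
     $1==c && $2==e { e=$3; next }                       # zero gap: extend the current interval
                    { print c, s, e; c=$1; s=$2; e=$3 }  # gap found: emit and start a new interval
     END            { print c, s, e }' file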

4. UNIX for Dummies Questions & Answers

[Solved] Deleting all rows where the first column equals the second column

Hi, I have a tab delimited text file where the first two columns equal numbers. I want to delete all rows where the value in the first column equals the second column. How do I go about doing that? Thanks! Input: 1 1 ABC DEF 2 2 IJK LMN 1 2 ZYX OPW Output: 1 2 ZYX OPW (2 Replies)
Discussion started by: evelibertine
2 Replies
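
That one is a one-liner: on a tab-separated file, keep only the rows whose first two columns differ.

Code:
awk -F'\t' '$1 != $2' file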

5. Shell Programming and Scripting

[Solved] Sorting a column in a file based on a column in a second file

Hello, I have two files as the following: File1: F0100020 A G F0100030 A T F0100040 A G File2: F0100040 A G BTA-28763-no-rs 77.2692 F0100030 A T BTA-29334-no-rs 11.4989 F0100020 A G BTA-29515-no-rs 127.006 I want to sort the second file based on the... (6 Replies)
Discussion started by: Homa
6 Replies
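
A sketch of the usual two-pass awk approach: read File1 first to record the desired order of the keys in column 1, then replay File2 in that order. It assumes every key in File2 also appears in File1.

Code:
awk 'NR==FNR { order[$1]=FNR; total=FNR; next }          # pass 1: File1 fixes the key order
     { line[order[$1]] = $0 }                            # pass 2: park each File2 line at its slot
     END { for(i=1; i<=total; i++) if(i in line) print line[i] }' File1 File2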

6. Shell Programming and Scripting

[Solved] Sorting a column based on another column

hello, I have a file as follows: F0100010 A C F0100040 A G BTA-28763-no-rs 77.2692 F0100020 A G F0100030 A T BTA-29334-no-rs 11.4989 F0100030 A T F0100020 A G BTA-29515-no-rs 127.006 F0100040 A G F0100010 A C BTA-29644-no-rs 7.29827 F0100050 A... (9 Replies)
Discussion started by: Homa
9 Replies

7. Shell Programming and Scripting

awk - how to get difference of the same column when other column matches

I have a file like this : # cat list cucm, location,76,2 cucm1,location1,76,4 cucm,location,80,8 cucm1,location1,90,8 cucm1,location1,87,11 cucm,location,67,9 and I want output like this : cucm,location,76,2 cucm1,location1,76,4 cucm,location,80, 6 ===> (8-2 =6) cucm1,location1,90,4... (5 Replies)
Discussion started by: Lakshmikumari
5 Replies
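
A sketch of that running difference: group on the first two comma-separated fields, print the first record of each group as is, and afterwards replace the fourth field with current minus previous.

Code:
awk -F, -v OFS=, '{ k = $1 FS $2                          # group key: first two fields
                    if (k in prev) { d = $4 - prev[k]; prev[k] = $4; $4 = d }
                    else             prev[k] = $4         # first sighting: keep the value as is
                    print
                  }' file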

8. Shell Programming and Scripting

How to get difference of the same column between two files when other column matches?

File 1: 20130416,235800,10.78.25.104,BR2-loc,60.0,1624,50.0,0,50.0,0 20130416,235800,10.78.25.104,BR1-LOC,70.0,10,50.0,0,70.0,0 20130416,235800,10.78.25.104,Hub_None,60.0,15,60.0,0,50.0,0 File 2: 20130417,000200,10.78.25.104,BR2-loc,60.0,1626,50.0,0,50.0,0... (3 Replies)
Discussion started by: Lakshmikumari
3 Replies
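
The preview doesn't show the expected output, so this sketch assumes the key is the IP plus location (columns 3 and 4) and the value to diff is the count in column 6; adjust the field numbers to taste:

Code:
awk -F, -v OFS=, '
    NR==FNR           { old[$3 FS $4] = $6; next }             # pass 1: File1 counts by ip+location
    ($3 FS $4) in old { print $3, $4, $6 - old[$3 FS $4] }     # pass 2: File2 count minus File1 count
' File1 File2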

9. Shell Programming and Scripting

Difference of the same column when two other column matches and one column differs less than 1 hour

This is my input file : # cat list 20130430121600, cucm, location,76,2 20130430121600,cucm1,location1,76,4 20130430122000,cucm,location,80,8 20130430122000,cucm1,location1,90,8 20130430140000,cucm1,location1,87,11 20130430140000, cucm,location,67,9 This is the required output ... (1 Reply)
Discussion started by: Lakshmikumari
1 Replies
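
The required output is cut off in the preview, so this is only a guess at the intent, modelled on the earlier cucm/location thread: diff the last field within each $2/$3 group, but only when the timestamps in $1 are less than an hour apart. It needs gawk for mktime(), and it assumes the stray spaces after some commas in the sample are typos.

Code:
gawk -F, -v OFS=, '
    function epoch(ts) {    # turn 20130430121600 into epoch seconds
        return mktime(substr(ts,1,4) " " substr(ts,5,2) " " substr(ts,7,2) " " substr(ts,9,2) " " substr(ts,11,2) " " substr(ts,13,2))
    }
    {
        k = $2 FS $3; t = epoch($1)
        if (k in prev && t - when[k] < 3600) { d = $5 - prev[k]; prev[k] = $5; when[k] = t; $5 = d }
        else                                 { prev[k] = $5; when[k] = t }
        print
    }' file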

10. Shell Programming and Scripting

awk to calculate difference of split and sum the difference

In the awk I am trying to subtract the difference $3-$2 of each matching $4 before the first _ (underscore) and print that value in $13. I think the awk will do that, but added comments. What I am not sure off is how to add a line or lines that will add sum each matching $13 value and put it in... (2 Replies)
Discussion started by: cmccabe
2 Replies
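
Without the actual input it is only possible to sketch the two steps the post describes: put $3-$2 into $13, and total those values per $4 prefix (the part before the first underscore). Tab-separated input is assumed.

Code:
awk 'BEGIN { FS = OFS = "\t" }                           # tab-separated input assumed
     {
       split($4, part, "_")                              # group key: $4 up to the first "_"
       $13 = $3 - $2                                     # per-line difference goes into $13
       sum[part[1]] += $13
       print
     }
     END { for (k in sum) print "sum", k, sum[k] }' file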
IGAWK(1)                         Utility Commands                        IGAWK(1)

NAME
       igawk - gawk with include files

SYNOPSIS
       igawk [ all gawk options ] -f program-file [ -- ] file ...
       igawk [ all gawk options ] [ -- ] program-text file ...

DESCRIPTION
       Igawk is a simple shell script that adds the ability to have ``include
       files'' to gawk(1).

       AWK programs for igawk are the same as for gawk, except that, in
       addition, you may have lines like

              @include getopt.awk

       in your program to include the file getopt.awk from either the current
       directory or one of the other directories in the search path.

OPTIONS
       See gawk(1) for a full description of the AWK language and the options
       that gawk supports.

EXAMPLES
       cat << EOF > test.awk
       @include getopt.awk

       BEGIN {
           while (getopt(ARGC, ARGV, "am:q") != -1)
               ...
       }
       EOF

       igawk -f test.awk

SEE ALSO
       gawk(1)

       Effective AWK Programming, Edition 1.0, published by the Free Software
       Foundation, 1995.

AUTHOR
       Arnold Robbins (arnold@skeeve.com).

ATTRIBUTES
       See attributes(5) for descriptions of the following attributes:

       +--------------------+-----------------+
       |   ATTRIBUTE TYPE   | ATTRIBUTE VALUE |
       +--------------------+-----------------+
       |Availability        | SUNWgawk        |
       +--------------------+-----------------+
       |Interface Stability | Volatile        |
       +--------------------+-----------------+

NOTES
       Source for gawk is available on http://opensolaris.org.

Free Software Foundation              Nov 3 1999                         IGAWK(1)