Top Forums > Shell Programming and Scripting > Summation of Columns across multiple files
Post 302379196 by ghostdog74 on Wednesday 9th of December 2009 10:16:43 PM
Code:
# join the two input files on their first field, then sum every remaining field on each joined line
join file1 file2 | awk '{t=0; for(i=2;i<=NF;i++) t+=$i; print $1, t}'
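
In case it helps, here is a minimal sketch of what that pipeline does, assuming two small space-separated files (hypothetically named file1 and file2) that are already sorted on the key in column 1:

Code:
$ cat file1
a 1 2
b 3 4
$ cat file2
a 10
b 20
$ join file1 file2 | awk '{t=0; for(i=2;i<=NF;i++) t+=$i; print $1, t}'
a 13
b 27

join pairs the lines that share the same first field, and awk then adds up every remaining field on each joined line. For more than two files, the joins can be chained (e.g. join file1 file2 | join - file3 | awk ...), since the - operand makes join read its first input from standard input.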

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Combine multiple columns from multiple files

Hi there, I was wondering if someone can help me with this. I am trying to combine multiple columns from multiple files into one file. Example file 1: c0t0d0 c0t2d0 # hostname vgname c0t0d1 c0t2d1 # hostname vgname c0t0d2 c0t2d2 # hostname vgname c0t1d0 c0t3d0 # hostname vgname1... (5 Replies)
Discussion started by: martva

2. Shell Programming and Scripting

Need help with post: extract multiple columns from multiple files

Hello, I would be grateful if anyone can help me reply to my post "extract multiple columns from multiple files; skip rows and include filenames; awk". Please see this thread. Thanks, manishabh (0 Replies)
Discussion started by: manishabh

3. Shell Programming and Scripting

calculating column summation in a directory of flat files

Hello Gurus, I need your kind help to solve my issue below. I have a directory of flat files and have to calculate the sum of some columns from each flat file. Say for flat file 302 I need the summation of the 2nd and 3rd columns; for flat file 303 I need the summation of the 5th and... (2 Replies)
Discussion started by: Pratik4891

4. Shell Programming and Scripting

Adding Multiple Files via Columns

I have a number of files with multiple rows that I need to add together. Let's say I have 10 files: each file has a great number of rows and columns. I need to add these files together in the following way. In other words, if, for example, file A occupies columns 1 to 19, I want to add file B... (7 Replies)
Discussion started by: Ernst

5. Shell Programming and Scripting

List and Compare Files across different servers.

Hi all, This is my situation. First thing is I cannot use rsync to accomplish this. I don't have it on my systems and we can't put it on. I run HP-UX 11v3. I have a list of files generated every day which tells me which files are not in sync with the rest of the servers. I want to ls -l the... (1 Reply)
Discussion started by: zixzix01

6. UNIX for Dummies Questions & Answers

cutting multiple columns into multiple files

Hypothetically, suppose that file1 id v1 v2 v3 v4 v5 v6 v7..........v100 1 1 1 1 1 1 2 2 .....50 2 1 1 1 1 1 2 2 .....50 3 1 1 1 1 1 2 2 .....50 4 1 1 1 1 1 2 2 .....50 5 1 1 1 1 1 2 2 .....50 I want to write a loop such that I take the id# and the first 5 columns (v1-v5) into the... (3 Replies)
Discussion started by: johnkim0806

7. Shell Programming and Scripting

Compare multiple files with multiple number of columns

Hi, input file1 abcd 123 198 xyz1:0909090-0909091 ghij 234 999 xyz2:987654:987655 kilo 7890 7990 xyz3:12345-12357 prem 9 112 xyz5:97-1134 input file2 abcd 123 198 xyz1:0909090-0909091 -9.122 0 abed 88 98 xyz1:98989-090808 -1.234 1.345 ghij 234 999 xyz2:987654:987655 -10.87090909 5... (5 Replies)
Discussion started by: jacobs.smith

8. Shell Programming and Scripting

Merging multiple files from multiple columns

Hi guys, I have very basic linux experience so I need some help with a problem. I have 3 files from which I want to extract columns based on common fields between them. File1: --- rs74078040 NA 51288690 T G 461652 0.99223 0.53611 3 --- rs77209296 NA 51303525 T G 461843 0.98973 0.60837 3... (10 Replies)
Discussion started by: bartman2099

9. Shell Programming and Scripting

Merging Multiple Columns between two files

Hello guys, I have 2 CSV files which go like this: CSV1: Breaking.csv: UTF-8 "Name","Description","Occupation","Email" "Walter White","","Chemistry Teacher","w.w@bb.com" "Jessie Pinkman","","Junkie","j.p@bb.com" "Hank Schrader","","DEA Agent","h.s@bb.com" CSV2: Bad.csv... (7 Replies)
Discussion started by: jeffreybsu

10. Shell Programming and Scripting

Removing carriage returns from multiple lines in multiple files with different numbers of columns

Hello Gurus, I have multiple pipe-separated files which have records going over multiple lines. The end-of-line separator is \n and records going over multiple lines have <CR> as a separator. Below is an example from one file. 1|ABC DEF|100|10 2|PQ RS T|200|20 3| UVWXYZ|300|30 4| GHIJKL|400|40... (7 Replies)
Discussion started by: dJHa
join(1) 							   User Commands							   join(1)

NAME
     join - relational database operator

SYNOPSIS
     join [-a filenumber | -v filenumber] [-1 fieldnumber] [-2 fieldnumber]
          [-o list] [-e string] [-t char] file1 file2

     join [-a filenumber] [-j fieldnumber] [-j1 fieldnumber]
          [-j2 fieldnumber] [-o list] [-e string] [-t char] file1 file2

DESCRIPTION
     The join command forms, on the standard output, a join of the two
     relations specified by the lines of file1 and file2. There is one line
     in the output for each pair of lines in file1 and file2 that have
     identical join fields. The output line normally consists of the common
     field, then the rest of the line from file1, then the rest of the line
     from file2. This format can be changed by using the -o option (see
     below). The -a option can be used to add unmatched lines to the output.
     The -v option can be used to output only unmatched lines.

     The default input field separators are blank, tab, or new-line. In this
     case, multiple separators count as one field separator, and leading
     separators are ignored. The default output field separator is a blank.

     If the input files are not in the appropriate collating sequence, the
     results are unspecified.

OPTIONS
     Some of the options below use the argument filenumber. This argument
     should be a 1 or a 2 referring to either file1 or file2, respectively.

     -a filenumber    In addition to the normal output, produce a line for
                      each unpairable line in file filenumber, where
                      filenumber is 1 or 2. If both -a 1 and -a 2 are
                      specified, all unpairable lines will be output.

     -e string        Replace empty output fields in the list selected by
                      option -o with the string string.

     -j fieldnumber   Equivalent to -1 fieldnumber -2 fieldnumber.

     -j1 fieldnumber  Equivalent to -1 fieldnumber.

     -j2 fieldnumber  Equivalent to -2 fieldnumber. Fields are numbered
                      starting with 1.

     -o list          Each output line includes the fields specified in
                      list. Fields selected by list that do not appear in
                      the input will be treated as empty output fields (see
                      the -e option). Each element of list has either the
                      form filenumber.fieldnumber, or 0, which represents
                      the join field. The common field is not printed unless
                      specifically requested.

     -t char          Use character char as a separator. Every appearance of
                      char in a line is significant. The character char is
                      used as the field separator for both input and output.
                      With this option specified, the collating sequence
                      should be the same as sort without the -b option.

     -v filenumber    Instead of the default output, produce a line only for
                      each unpairable line in filenumber, where filenumber
                      is 1 or 2. If both -v 1 and -v 2 are specified, all
                      unpairable lines will be output.

     -1 fieldnumber   Join on the fieldnumberth field of file 1. Fields are
                      decimal integers starting with 1.

     -2 fieldnumber   Join on the fieldnumberth field of file 2. Fields are
                      decimal integers starting with 1.

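As a small illustration of -v, which the examples further below do not cover, here is a minimal sketch assuming two hypothetical sorted files named left and right:

Code:
$ cat left
a 1
b 2
c 3
$ cat right
a x
c y
$ join -v 1 left right
b 2

With -v 1, join prints only the lines of the first file that have no matching key in the second file, which is a quick way to spot records missing from one side.
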
OPERANDS
     The following operands are supported:

     file1
     file2    A path name of a file to be joined. If either of the file1 or
              file2 operands is -, the standard input is used in its place.
              file1 and file2 must be sorted in increasing collating
              sequence as determined by LC_COLLATE on the fields on which
              they are to be joined, normally the first in each line (see
              sort(1)).

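Because unsorted input leads to unspecified results, a pre-sort on the join field is the usual first step. A minimal sketch, assuming two hypothetical files named data1 and data2 keyed on their first column (per the NOTES section below, the default field separation expects the collating order of sort -b):

Code:
sort -b -k1,1 data1 > data1.sorted
sort -b -k1,1 data2 > data2.sorted
join data1.sorted data2.sorted
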
USAGE
     See largefile(5) for the description of the behavior of join when
     encountering files greater than or equal to 2 Gbyte (2**31 bytes).

EXAMPLES
     Example 1: Joining the password file and group file

     The following command line will join the password file and the group
     file, matching on the numeric group ID, and outputting the login name,
     the group name and the login directory. It is assumed that the files
     have been sorted in ASCII collating sequence on the group ID fields.

       example% join -j1 4 -j2 3 -o 1.1 2.1 1.6 -t: /etc/passwd /etc/group

     Example 2: Using the -o option

     The -o 0 field essentially selects the union of the join fields. For
     example, given file phone:

       !Name    Phone Number
       Don      +1 123-456-7890
       Hal      +1 234-567-8901
       Yasushi  +2 345-678-9012

     and file fax:

       !Name    Fax Number
       Don      +1 123-456-7899
       Keith    +1 456-789-0122
       Yasushi  +2 345-678-9011

     (where the large expanses of white space are meant to each represent a
     single tab character), the command:

       example% join -t"tab" -a 1 -a 2 -e '(unknown)' -o 0,1.2,2.2 phone fax

     would produce:

       !Name    Phone Number     Fax Number
       Don      +1 123-456-7890  +1 123-456-7899
       Hal      +1 234-567-8901  (unknown)
       Keith    (unknown)        +1 456-789-0122
       Yasushi  +2 345-678-9012  +2 345-678-9011

ENVIRONMENT VARIABLES
     See environ(5) for descriptions of the following environment variables
     that affect the execution of join: LANG, LC_ALL, LC_CTYPE, LC_MESSAGES,
     LC_COLLATE, and NLSPATH.

EXIT STATUS
     The following exit values are returned:

     0     All input files were output successfully.

     >0    An error occurred.

ATTRIBUTES
     See attributes(5) for descriptions of the following attributes:

     +-----------------------------+-----------------------------+
     |       ATTRIBUTE TYPE        |       ATTRIBUTE VALUE       |
     +-----------------------------+-----------------------------+
     | Availability                | SUNWcsu                     |
     +-----------------------------+-----------------------------+
     | CSI                         | Enabled                     |
     +-----------------------------+-----------------------------+
     | Interface Stability         | Standard                    |
     +-----------------------------+-----------------------------+

SEE ALSO
     awk(1), comm(1), sort(1), uniq(1), attributes(5), environ(5),
     largefile(5), standards(5)

NOTES
     With default field separation, the collating sequence is that of
     sort -b; with -t, the sequence is that of a plain sort.

     The conventions of the join, sort, comm, uniq, and awk commands are
     wildly incongruous.

SunOS 5.10                        8 Feb 2000                          join(1)