Shell Programming and Scripting: Merge columns from multiple files
Post 302903249 by ali.seifaddini, Monday 26 May 2014, 11:33 AM
Quote:
Originally Posted by RavinderSingh13
Hello,

You can use the following; just add a * to SriniShoo's code.

Code:
awk 'NR==FNR{a[$2 " " $3]=$6;next} {a[$2 " " $3]=a[$2 " " $3] OFS $6} END{for(i in a){print i OFS a[i]}}' File*


Output will be as follows.

Code:
3.61696 101.55112 0.000364633 0.000364633 0.000364633
3.62445 101.55112 0.0998519 0.000364633 0.000364633
3.63195 101.54361 0.107302 0.000364633 0.0696797
3.61696 101.54361 0.000364633 0.000364633 0.000364633
3.66192 101.54361 0.0804649 0.000364633 0.115307
3.62445 101.54361 0.0998519 0.000364633 0.000364633
3.63195 101.55112 0.107302 0.000364633 0.0696797



Thanks,
R. Singh
Hello sir,

Something is wrong with the script: some extra columns appear and the rows come out in the wrong order.
I have attached sample data for you to try.

Thank you so much.
Sample.rar
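
For what it's worth, the arbitrary row order is expected: awk's for (i in a) loop iterates in no guaranteed order, so the rows can come out shuffled. The extra columns are likely to appear when a key (columns 2 and 3) occurs more than once, or not at all, in one of the files, though without the attached sample that is only a guess. Below is a minimal sketch, not a tested answer, that keeps the keys in first-seen order; it assumes the same layout as above (whitespace-separated fields, key in columns 2 and 3, value in column 6).

Code:
awk '
{ key = $2 " " $3 }                                 # key on columns 2 and 3
!(key in seen) { seen[key] = 1; order[++n] = key }  # record keys in first-seen order
{ val[key] = (key in val) ? val[key] OFS $6 : $6 }  # append column 6 for this key
END { for (i = 1; i <= n; i++) print order[i], val[order[i]] }' File*

If a plain numeric sort on the first two columns is all that is needed, piping the output of the original one-liner through sort -k1,1n -k2,2n would work as well.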
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Merge two files which contain columns

Hi, may I ask how to accomplish this task: I have 2 files, each with multiple columns. First file: 1 a 2 b 3 c 4 d. Second file: 14 a 9 .... 13 b 10.... 12 c 11... 11 d 12... I want to merge the second file into the first so that it looks like this ... (2 Replies)
Discussion started by: jao_madn
2 Replies

2. UNIX for Dummies Questions & Answers

Merge two files with two columns being similar

Hi everyone. How can I merge two files, where each file has 2 columns and the first columns in both files are similar? I want all in a file of 4 columns; join command removes the duplicate columns. 1 Dave 2 Mark 3 Paul 1 Apple 2 Orange 3 Grapes to get it like this in the 3rd file:... (9 Replies)
Discussion started by: Atrisa
9 Replies

3. Shell Programming and Scripting

Merge columns of different files

Hi, I have tab-delimited file 1 and tab-delimited file 2. The output should contain the common first-column values and the corresponding 2nd-column values, and also any unique first-column value with the corresponding 2nd-column value from the file that contains it and 0 for the other file. The output should... (10 Replies)
Discussion started by: polsum
10 Replies

4. UNIX for Dummies Questions & Answers

How do I merge multiple columns into one column?

Hi all, I'm looking for a way to merge multiple columns (from one file) into a single column in an output file. The file I have looks somewhat like this: @HWI-ST212 1:N:0 AGTCCTACCGGGAGT + @@@DDDDDHHHHHII @HWI-ST212 1:N:0 CGTTTAAAAATTTCT + @;@B;DDDDH?:F;F... (4 Replies)
Discussion started by: Vnguyen
4 Replies

5. UNIX for Dummies Questions & Answers

Merge columns from multiple files

Hi all, I've searched the web for a long time trying to figure out how to merge columns from multiple files. I know paste will append columns like so: paste file1 file2 file3 file4 file5 ... But this becomes inconvenient when you want to append a large number of files into a single file. ... (2 Replies)
Discussion started by: torchij
2 Replies

6. Shell Programming and Scripting

Merge columns on different files

Hello, I have two files that have this format: file 1 86.82 0.00 86.82 43.61 86.84 0.00 86.84 43.61 86.86 0.00 86.86 43.61 86.88 0.00 86.88 43.61 file 2 86.82 0.22 86.84 0.22 86.86 0.22 86.88 0.22 I would like to merge these two files such that the final file looks like... (5 Replies)
Discussion started by: kayak
5 Replies

7. Shell Programming and Scripting

Merge 2 files with one reference columns

Hi All Source1 servername1,patchid1 servername1,patchid2 servername1,patchid3 servername2,patchid1 servername2,patchid2 servername3,patchid4 servername3,patchid5 Source2 servername1,appname1 servername1,appname2 servername1,appname3 servername2,appname1 servername2,appname2... (13 Replies)
Discussion started by: mv_mv
13 Replies

8. Shell Programming and Scripting

Merge records based on multiple columns

Hi, I have a file with 16 columns; of these, 14 are key columns, the 15th is an order column and the 16th column holds information. I need to concatenate the 16th column, using the values of columns 1-14 as the key, in the order of the 15th column. Here is the example file Input File (multiple... (3 Replies)
Discussion started by: Ravi Agrawal
3 Replies

9. UNIX for Beginners Questions & Answers

Merge multiple columns into one using cat

I would like to merge several files using 'cat', but the output is not consistent: the merge begins at the last line of the first file. file1.txt: 1234 1234 1234 file2.txt: aaaa bbbb cccc dddd cat file1.txt file2.txt > file3.txt file3.txt: 1234 1234 1234aaaa bbbb cccc... (13 Replies)
Discussion started by: geomarine
13 Replies

10. Shell Programming and Scripting

Join and merge multiple files with duplicate key and fill void columns

Join and merge multiple files with duplicate key and fill void columns Hi guys, I have many files that I want to merge: file1.csv: 1|abc 1|def 2|ghi 2|jkl 3|mno 3|pqr file2.csv: (5 Replies)
Discussion started by: yjacknewton
5 Replies