Full Discussion: Common records using AWK
Post 302594166 by bartus11 on Monday 30th of January 2012 05:05:31 PM
Try this script:
Code:
#!/bin/bash
FILE1="input1.txt"
FILE2="input2.txt"
# Transpose: every column becomes one line of space-separated values.
# Track the widest row (n) so a short last line does not drop columns.
TRANSPOSE='{for (i=1;i<=NF;i++) a[i]=a[i]" "$i; if (NF>n) n=NF} END{for (i=1;i<=n;i++) print a[i]}'
TR1=$(awk "$TRANSPOSE" "$FILE1")
TR2=$(awk "$TRANSPOSE" "$FILE2")
# comm -12: columns present in both files; -23: only in FILE1; -13: only in FILE2.
TR3=$(comm -12 <(echo "$TR1" | sort) <(echo "$TR2" | sort))
TR3="$TR3"$'\n'"$(comm -23 <(echo "$TR1" | sort) <(echo "$TR2" | sort))"
TR3="$TR3"$'\n'"$(comm -13 <(echo "$TR1" | sort) <(echo "$TR2" | sort))"
# Transpose back so the selected columns become columns again.
OUT=$(echo "$TR3" | awk "$TRANSPOSE")
echo "$OUT"

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

merge based on common, awk help

All,
$ cat x.txt
z 11 az
x 12 ax
y 13 ay
$ cat y.txt
ay TT
ax NN
Output required:
y 13 ay TT
x 12 ax NN
(3 Replies)
Discussion started by: jkl_jkl
3 Replies
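A minimal sketch of that kind of join (not necessarily the answer given in that thread): read y.txt first into an array keyed on its first field, then append the stored value to every x.txt record whose third field has a match. Output order follows x.txt rather than the order shown in the request:
Code:
awk 'NR==FNR {a[$1]=$2; next} $3 in a {print $0, a[$3]}' y.txt x.txt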

2. Shell Programming and Scripting

Common records after matching on different columns

Hi, I have the following files. cat 1.txt cat 2.txt output.txt The logic is as follows.... (10 Replies)
Discussion started by: jacobs.smith
10 Replies

3. Shell Programming and Scripting

Common records

Hi, I have the following files,
A M 2 3
B E 4 5
C I 5 6
D O 4 5
A M 3 4
B E 5 2
F U 7 9
J K 2 3
OUTPUT
A M 2 3 3 4
B E 4 5 5 2
thanks in advance, (7 Replies)
Discussion started by: jacobs.smith
7 Replies
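Assuming the snippet above shows two four-column files, with the second file starting at the "A M 3 4" record (a guess based on the expected output), one sketch is to index the second file on its first two fields and append its last two fields to the matching records of the first; file1 and file2 are placeholder names:
Code:
awk 'NR==FNR {a[$1,$2] = $3 " " $4; next} ($1,$2) in a {print $0, a[$1,$2]}' file2 file1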

4. UNIX for Dummies Questions & Answers

keeping last record among group of records with common fields (awk)

input:
ref.1;rack.1;1 #group1
ref.1;rack.1;2 #group1
ref.1;rack.2;1 #group2
ref.2;rack.3;1 #group3
ref.2;rack.3;2 #group3
ref.2;rack.3;3 #group3
Among records from the same group (i.e. with the same 1st and 2nd fields, separated by ";"), I would need to keep the last record... (5 Replies)
Discussion started by: beca123456
5 Replies
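A minimal sketch for that "keep the last record of each group" task, assuming the key is the first two ";"-separated fields and that output order does not matter (awk's for-in loop does not preserve input order):
Code:
awk -F';' '{last[$1 FS $2] = $0} END {for (k in last) print last[k]}' input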

5. Shell Programming and Scripting

Help in awk to read the common txt

Dear all, I have a small script which seems to be working but seems to have a bug. It is supposed to read commonTxt and then print the noOfLines in outputFile. It works for most of the txt but is unable to add some of the variable values. Can somebody please spend some time looking at the thread and... (3 Replies)
Discussion started by: emily
3 Replies

6. Shell Programming and Scripting

Two columns-Common records - 20 files

Hi Friends, I have an input file like this
cat input1
x 1
y 2
z 3
a 2
b 4
c 6
d 9
cat input2
x 7
h 8
k 9
l 5
m 9
d 12 (5 Replies)
Discussion started by: jacobs.smith
5 Replies
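One hedged sketch for that kind of problem, assuming the goal is to print the first-column keys that appear in every input file (only the two files shown are listed here; the same command takes all 20). Each key is counted at most once per file; carrying the per-file values along would need extra bookkeeping:
Code:
awk '!seen[FILENAME,$1]++ {count[$1]++} END {for (k in count) if (count[k] == ARGC-1) print k}' input1 input2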

7. UNIX for Dummies Questions & Answers

Values with common field in same line with awk

Hi all! I almost did it but got a small problem.
input:
cars red
cars blue
cars green
truck black
Wanted:
cars red-blue-green
truck black
Attempt:
gawk 'BEGIN{FS="\t"}{a = a (a?"-":"")$2; $2=a; print $1 FS $2}' input
But I also got the intermediate records... (2 Replies)
Discussion started by: beca123456
2 Replies
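The attempt above prints once per input line, which is why the intermediate records appear. A minimal fix is to accumulate per key and print only in the END block (this sketch indexes on $1 and does not preserve the original key order):
Code:
gawk 'BEGIN{FS=OFS="\t"} {a[$1] = a[$1] (a[$1] ? "-" : "") $2} END {for (k in a) print k, a[k]}' input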

8. Shell Programming and Scripting

Compare multiple files, identify common records and combine unique values into one file

Good morning all, I have a problem that is one step beyond a standard awk compare. I would like to compare three files which have several thousand records against a fourth file. All of them have a value in each row that is identical, and one value in each of those rows which may be duplicated... (1 Reply)
Discussion started by: nashton
1 Reply

9. UNIX for Beginners Questions & Answers

Comparing fastq files and outputting common records

I have two files:
File_1:
@M04961:22:000000000-B5VGJ:1:1101:9280:7106 1:N:0:86
GGCATGAAAACATACAAACCGTCTTTCCAGAAATTGTTCCAAGTATCGGCAACAGCTTTATCAATACCATGAAAAATATCAACCACACCAGAAGCAGCAT
+
GGGGGGGGGGGGGGGGGCCGGGGGF,EDFFGEDFG,@DGGCGGEGGG7DCGGGF68CGFFFGGGG@CGDGFFDFEFEFF:30CGAFFDFEFF8CAF;;8F
... (3 Replies)
Discussion started by: Xterra
3 Replies

10. Shell Programming and Scripting

awk common between files

Hello there: I want to find the records common among files. They all have one column. Format for data:
CEU_snp_CHR21.txt
21:10758305
21:10827533
21:10913441
21:10920098
21:10952160
21:10966322
21:10985991
NAT_CHR21_variants.txt
21:10971951 (3 Replies)
Discussion started by: genome
3 Replies
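With single-column files like those, the intersection can be taken without much awk at all; two sketches, assuming the file names shown above:
Code:
comm -12 <(sort CEU_snp_CHR21.txt) <(sort NAT_CHR21_variants.txt)
# or, keeping the order of the second file:
awk 'NR==FNR {a[$1]; next} $1 in a' CEU_snp_CHR21.txt NAT_CHR21_variants.txt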