Full Discussion: Common lines from files
Post 302433219 by jaysean on Tuesday 29th of June 2010, 03:19:03 AM
Thanks to both of you. Both solutions work fine. In case anyone needs it, here is a version that processes a whole directory:

Code:
# process every file in DirectoryA; assumes a file with the same name exists in DirectoryB
ls DirectoryA | while read -r FILE; do
  awk 'NR==FNR { a[$1" "$2] = $3; next }      # DirectoryA file: remember col 3, keyed on cols 1-2
       ($1" "$2 in a) { if (a[$1" "$2] > $3) print $1, $2, a[$1" "$2]; else print }' \
      DirectoryA/"$FILE" DirectoryB/"$FILE" | tr ' ' '\t' > DirectoryC/"$FILE"
done

The tr is there because my files are tab-separated, and awk's print joins fields with spaces by default, so the separators got messed up in the output (setting OFS avoids this; see the sketch below).
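
A minimal variant that sets awk's output field separator to a tab instead of piping through tr; the matching logic is unchanged:

Code:
awk 'BEGIN { OFS = "\t" }                     # print with tabs, no tr needed
     NR==FNR { a[$1" "$2] = $3; next }
     ($1" "$2 in a) { if (a[$1" "$2] > $3) print $1, $2, a[$1" "$2]; else print }' \
    DirectoryA/"$FILE" DirectoryB/"$FILE" > DirectoryC/"$FILE"

Note that the bare print still emits the original tab-separated line, while print $1, $2, ... now joins the fields with tabs.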
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

To find all common lines from 'n' no. of files

Hi, I have a situation: I have some 6-7 files in one directory, and I have to extract all the lines which exist in all of these files. That means I need to extract all the common lines from these files and put them in a separate file. Please help. I know it could be done with the help of... (11 Replies)
Discussion started by: The Observer
11 Replies
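
A hedged sketch for the thread above, assuming plain awk and that whole lines are compared: count each distinct line once per file and print those seen in every file.

Code:
# prints the lines that occur in every one of the given files
awk '!seen[FILENAME, $0]++ { cnt[$0]++ }              # count each distinct line once per file
     END { for (l in cnt) if (cnt[l] == ARGC - 1) print l }' *.txt   # ARGC-1 = number of files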

2. Shell Programming and Scripting

Drop common lines at head/tail of a large set of files

Hi! I have a large set of pairs of text files (each pair in its own subdirectory), and each pair shares a head/tail (a couple of first and last lines) but differs in the middle part. I need to delete the heads/tails and keep only the middle portions in which they differ. The lengths of heads/tails... (1 Reply)
Discussion started by: dobryden
1 Replies
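
A two-pass awk sketch for that one, assuming the head and tail lengths are known (H and T below are hypothetical values; adjust them to the actual files):

Code:
# pass 1 counts the lines; pass 2 prints everything after line H and before the last T lines
awk -v H=2 -v T=2 'NR==FNR { n = FNR; next }
                   FNR > H && FNR <= n - T' file file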

3. Shell Programming and Scripting

Common lines from files

Hello guys, I need a script to get the common lines from two files, with the criterion that if the first two columns match, then I keep the maximum value of the 5th column (tab-separated columns). The 3rd and 4th columns correspond to the row which has the highest value for the 5th column. Sample... (2 Replies)
Discussion started by: jaysean
2 Replies

4. Shell Programming and Scripting

Get common lines from multiple files

FileA
chr1 31237964 NP_001018494.1 PUM1 M340L
chr1 31237964 NP_055491.1 PUM1 M340L
chr1 33251518 NP_037543.1 AK2 H191D
chr1 33251518 NP_001616.1 AK2 H191D
chr1 57027345 NP_001004303.2 C1orf168 P270S
FileB
chr1 ... (9 Replies)
Discussion started by: genehunter
9 Replies

5. Shell Programming and Scripting

Find common lines between multiple files

Hello everyone, A few years ago the user radoulov posted a fancy solution to a problem about finding common lines (gene variation names) between multiple samples (files). The code was: awk 'END { for (R in rec) { n = split(rec, t, "/") if (n > 1) dup = dup ?... (5 Replies)
Discussion started by: bibb
5 Replies

6. UNIX for Dummies Questions & Answers

Filter lines common in two files

Thanks everyone. I got that problem solved. I need one more bit of help here. (Yes, UNIX definitely seems to be fun and useful, and I WILL eventually learn it for myself. But I am now on a different project and don't really have time to go through all the basics. So, I will really appreciate some... (6 Replies)
Discussion started by: latsyrc
6 Replies

7. Shell Programming and Scripting

Finding out the common lines in two files using 4 fields with the help of awk and UNIX

Dear All, I have 2 files. If fields 1, 2, 4 and 5 match in both file1 and file2, I want to print the whole line of file1 and file2, one after another, in my output file. File1:
sc2/80 20 . A T 86 F=5;U=4
sc2/60 55 . G T ... (1 Reply)
Discussion started by: NamS
1 Replies
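
A rough sketch for that request in plain awk; it prints the file1 line and then the file2 line whenever fields 1, 2, 4 and 5 agree (assumes each key occurs at most once in file1):

Code:
awk 'NR==FNR { k[$1,$2,$4,$5] = $0; next }            # file1: remember the whole line per key
     ($1,$2,$4,$5) in k { print k[$1,$2,$4,$5]; print }' file1 file2 > output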

8. Shell Programming and Scripting

Find common lines with one file and with all of the files in another folder

Hi! I would like to comm -12 one file with all of the files in another folder that has 100 files or more (that file is not in the folder) to find common text lines. I would like each case where they have common lines to be written to a different output file, and the names of the... (6 Replies)
Discussion started by: Eve
6 Replies
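
One possible shape for that loop, assuming bash (for process substitution) and remembering that comm needs sorted input; folder and file.txt are placeholder names:

Code:
for f in folder/*; do
  comm -12 <(sort file.txt) <(sort "$f") > "common_$(basename "$f")"   # one output per compared file
done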

9. Shell Programming and Scripting

Find common lines between all of the files in one folder

Would it be possible to find the common lines between all of the files in one folder, just like comm -12, so all of the files two at a time? I would like all of the outcomes to be written to different files, and the file names could be simply numbers: 1, 2, 3, etc. All of the file names contain... (19 Replies)
Discussion started by: Eve
19 Replies
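
A pairwise sketch in the same spirit, again assuming bash; each pair's common lines go to a numbered output file:

Code:
files=(folder/*); i=0
for ((a = 0; a < ${#files[@]}; a++)); do
  for ((b = a + 1; b < ${#files[@]}; b++)); do
    comm -12 <(sort "${files[a]}") <(sort "${files[b]}") > "$((++i))"  # outputs named 1, 2, 3, ...
  done
done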

10. UNIX for Beginners Questions & Answers

Awk: output lines with common field to separate files

Hi, a beginner one. My input.tab (tab-separated):

h1    h2    h3  h4  h5
item1 grpA  2   3   customer1
item2 grpB  4   6   customer1
item3 grpA  5   9   customer1
item4 grpA  0   0   customer2
item5 grpA  9   1   customer2

Objective: output a file for each customer ($5) with the item number ($1) only if $2 matches... (2 Replies)
Discussion started by: beca123456
2 Replies
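
A guess at the shape of the answer; the match condition was truncated above, so $2 == "grpA" below is purely an assumption:

Code:
# assumption: the (truncated) criterion is $2 == "grpA"; item1 goes to customer1.txt, etc.
awk -F '\t' 'NR > 1 && $2 == "grpA" { print $1 > ($5 ".txt") }' input.tab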