Getting non unique lines from concatenated files
Posted by pawannoel on Sunday, 20 March 2011
Hi Bartus,

Thank you very much for this powerful code ... it does exactly what I want and lets me compare two or more files just by changing $N. But sorry, I always have more questions! Is there a way I can choose which files to compare? Let me explain: at the moment, if I set $N=2 it compares file_1 and file_2, $N=3 compares file_1, file_2 and file_3, $N=4 compares file_1 through file_4, and so on.
What if I wanted to compare only file_1, file_3 and file_7, or only file_2 and file_10, or any other combination of files of my choice? Is that possible? I would greatly appreciate your help, and if you could add some comments to the code to make it understandable to me, that would be just awesome.
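For example, I imagine something like the following (just my rough sketch of the general idea, not your actual script and not tested; the file names are only examples). It prints lines that appear in more than one of the files named on the command line, so choosing which files to compare is simply a matter of listing them:

Code:
awk '
    # count each distinct line once per file it appears in
    !seen[FILENAME, $0]++ { count[$0]++ }
    END {
        for (line in count)
            if (count[line] > 1)   # keep only lines shared by 2 or more files
                print line
    }
' file_1 file_3 file_7

That way, comparing file_2 and file_10 would just mean running it with those two names instead.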

Thanks again and have a nice Sunday!

Cheers
 
