Shell Programming and Scripting: Remove duplicate lines which have been repeated 4 times
Post 303008320 by Scott on Thursday 30th of November 2017, 08:34:59 AM
If I understand you correctly, you want to keep only the first four occurrences of each line?

Code:
awk '++A[$0] < 5' file


Last edited by Scott; 11-30-2017 at 09:40 AM.. Reason: Changed "A[$0]++ <4" to "++A[$0] < 5"
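A quick illustration of how the one-liner behaves (the sample file below is made up for demonstration, not taken from the original post): the counter A[$0] tracks how many times each exact line has been seen so far, and a line is printed only while that running count is still below 5, i.e. for its first four occurrences.

Code:
$ printf 'x\nx\nx\nx\nx\nx\ny\ny\n' > sample.txt
$ awk '++A[$0] < 5' sample.txt
x
x
x
x
y
y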

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

how to remove duplicate lines

I have the following file content (3 fields on each line): 23 888 10.0.0.1 dfh 787 10.0.0.2 dssf dgfas 10.0.0.3 dsgas dg 10.0.0.4 df dasa 10.0.0.5 df dag 10.0.0.5 dfd dfdas 10.0.0.5 dfd dfd 10.0.0.6 daf nfd 10.0.0.6 ... as can be seen, the third field is an IP address and the file is sorted on it, but... (3 Replies)
Discussion started by: fredao
3 Replies
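A minimal sketch for a request like the one above, assuming either whole lines or only the IP in the third field decide what counts as a duplicate, and that the first occurrence should be kept:

Code:
awk '!seen[$0]++' file      # keep the first occurrence of each whole line
awk '!seen[$3]++' file      # or: keep the first line for each IP in field 3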

2. Shell Programming and Scripting

Remove duplicate lines

Hi, I have a huge file which is about 50GB, with many lines. The file format looks like: 21 rs885550 0 9887804 C C T C C C C C C C 21 rs210498 0 9928860 0 0 C C 0 0 0 0 0 0 21 rs303304 0 9941889 A A A A A A A A A A 22 rs303304 0 9941890 0 A A A A A A A A A The question is that there are a few... (4 Replies)
Discussion started by: zhshqzyc
4 Replies
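For a file of roughly 50GB, an in-memory awk hash of all unique lines may not fit; a disk-based sort is usually the safer route if the original line order does not matter. A sketch, assuming either whole-line duplicates or duplicates on the marker ID in field 2 should be collapsed (file names are placeholders):

Code:
sort -u hugefile > hugefile.dedup            # external merge sort; no need to hold every line in RAM
sort -k2,2 -u hugefile > hugefile.dedup      # or: treat lines as duplicates when field 2 matches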

3. Shell Programming and Scripting

AWK Duplicate lines multiple times based on a calculated value

Hi, I'm trying to create an XML sitemap of our dynamic ecommerce sites' SEO-friendly URLs and am trying to create the initial page listing. I have a CSV file that looks like the following and need to duplicate the lines based on a value which needs calculating. ... (2 Replies)
Discussion started by: jamesfx
2 Replies
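The exact calculation is not shown in the excerpt, so the following is only a hypothetical sketch of the mechanism: repeat each CSV line once per results page, assuming field 2 holds a product count and 20 products are listed per page (the file name categories.csv is also made up):

Code:
awk -F, '{
    pages = int(($2 + 19) / 20)          # ceiling of field 2 divided by 20
    for (i = 1; i <= pages; i++) print   # emit the line once per page
}' categories.csv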

4. Shell Programming and Scripting

Need to remove the duplicate lines from a log!!

Hello Folks, can someone help me remove duplicate lines from a log file and send the result to another log file? It's a bit complicated: two lines are the same and the only difference is the timestamp, but some lines are unique. Fields are separated by colons. Log file:... (5 Replies)
Discussion started by: sim_je
5 Replies
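A sketch for that kind of log, assuming the fields are colon-separated, the timestamp is the first field, and duplicates should be detected on everything after it (input and output file names are placeholders):

Code:
awk -F: '{
    key = ""
    for (i = 2; i <= NF; i++) key = key FS $i   # rebuild the line without field 1 (the timestamp)
    if (!seen[key]++) print                     # keep only the first line seen for each key
}' input.log > deduped.log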

5. Shell Programming and Scripting

Remove regularly repeated lines

How can i delete some regular repeated lines in a file? example: in_file EDGE 1 2 12 EDGE 2 3 23 EDGE 3 4 34 EDGE 5 6 56 EDGE 6 7 67 EDGE 7 8 78 EDGE 9 10 910 EDGE 10 11 1011 EDGE 11 12 1112 EDGE 13 14 1314 EDGE 14 15 1415 EDGE 15 16 1516 EDGE 17 18 1718 EDGE 18 19 1819 EDGE 19... (8 Replies)
Discussion started by: saeed.soltani
8 Replies

6. Shell Programming and Scripting

How to print the lines which are repeated 3 times in a file?

Hello All, I have a file which has repeated lines. I want to print the lines which are repeated three times. Please help. (3 Replies)
Discussion started by: ailnilanjan
3 Replies
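A sketch for printing the lines that occur exactly three times, reading the file twice so the original order is preserved:

Code:
awk 'NR == FNR { count[$0]++; next }      # first pass: count every line
     count[$0] == 3 && !printed[$0]++     # second pass: print each qualifying line once, in file order
' file file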

7. UNIX for Dummies Questions & Answers

Remove Duplicate Lines

Hi I need this output. Thanks. Input: TAZ YET FOO FOO VAK TAZ BAR Output: YET VAK BAR (10 Replies)
Discussion started by: tara123
10 Replies
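The requested output keeps only the lines that appear exactly once, in their original order; a two-pass awk sketch does that:

Code:
awk 'NR == FNR { count[$0]++; next }   # first pass: count occurrences
     count[$0] == 1                    # second pass: keep lines that occur exactly once
' file file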

8. Shell Programming and Scripting

Remove lines containing 2 or more duplicate strings

Within my text file I have several thousand lines of text, with some lines containing duplicate strings/words. I would like to entirely remove those lines which contain the duplicate strings. E.g.: One and a Two Unix.com is the Best This as a Line Line Example duplicate sentence with the word... (22 Replies)
Discussion started by: martinsmith
22 Replies
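A sketch that drops any line in which the same whitespace-separated word appears more than once, which seems to be what is being asked:

Code:
awk '{
    split("", seen)                                     # clear the per-line word counter
    dup = 0
    for (i = 1; i <= NF; i++) if (seen[$i]++) dup = 1
    if (!dup) print                                     # keep only lines with no repeated word
}' file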

9. UNIX for Beginners Questions & Answers

Export lines that have first entry repeated 5 times or above

Dears, I want to extract only lines that have the first entry repeated 3 times or more. Example data: -bash-3.00$ cat INTCONT-IS.CSV M205-00-106_AMDRN:1-0-6-22,12-662-4833,intContact,2016-11-15 02:32:16,50 M205-00-106_AMDRN:1-0-23-17,12-616-0462,intContact,2016-11-15 02:32:23,50... (5 Replies)
Discussion started by: is2_egypt
5 Replies
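A sketch, assuming the "first entry" is the first comma-separated field: read the file twice, count each first field, then keep the lines whose first field occurs three or more times:

Code:
awk -F, 'NR == FNR { count[$1]++; next }   # first pass: count each first field
         count[$1] >= 3                    # second pass: keep lines whose first field occurs 3+ times
' INTCONT-IS.CSV INTCONT-IS.CSV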

10. Shell Programming and Scripting

How to remove duplicate lines?

Hi All, I am storing the result in the variable result_text using the code below: result_text=$(printf "$result_text\t\n$name") The result_text contains the text below, which has duplicate lines: file and time for the interval 03:30 - 03:45 file and time for the interval 03:30 - 03:45 ... (4 Replies)
Discussion started by: nalu
4 Replies
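A sketch for deduplicating the lines held in a shell variable while preserving the order of first occurrences:

Code:
result_text=$(printf '%s\n' "$result_text" | awk '!seen[$0]++')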
COMM(1)                   BSD General Commands Manual                   COMM(1)

NAME
     comm -- select or reject lines common to two files

SYNOPSIS
     comm [-123f] file1 file2

DESCRIPTION
     The comm utility reads file1 and file2, which should be sorted lexically, and produces three text columns as output: lines only in file1; lines only in file2; and lines in both files.

     The filename ``-'' means the standard input.

     The following options are available:

     -1      Suppress printing of column 1.
     -2      Suppress printing of column 2.
     -3      Suppress printing of column 3.
     -f      Fold case in line comparisons.

     Each column will have a number of tab characters prepended to it equal to the number of lower numbered columns that are being printed. For example, if column number two is being suppressed, lines printed in column number one will not have any tabs preceding them, and lines printed in column number three will have one.

     comm assumes that the files are lexically sorted; all characters participate in line comparisons.

EXIT STATUS
     comm exits 0 on success, >0 if an error occurred.

SEE ALSO
     cmp(1), diff(1), sort(1), uniq(1)

STANDARDS
     The comm utility conforms to IEEE Std 1003.2-1992 (``POSIX.2'').

BSD                              June 6, 1993                               BSD
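Since comm is directly relevant to comparing files for duplicate or common lines, a short usage example (the two small files are made up for illustration):

Code:
$ printf 'apple\nbanana\ncherry\n' > a.txt
$ printf 'banana\ncherry\ndate\n' > b.txt
$ comm -12 a.txt b.txt    # suppress columns 1 and 2: print only lines common to both files
banana
cherry
$ comm -13 a.txt b.txt    # print only lines unique to b.txt
date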