Remove duplicate lines which have been repeated 4 times
Post 303008329 by Kalia on Thursday 30th of November 2017, 09:13:14 AM
Thanks. Lines repeated up to 4 times should be printed only once.
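A minimal sketch of one reading of that request (every line occurring up to four times appears exactly once in the output; infile is a placeholder name). uniq only counts adjacent lines, so the input is sorted first:

    sort infile | uniq -c | awk '$1 <= 4 { sub(/^[[:space:]]*[0-9]+[[:space:]]/, ""); print }'

The awk step keeps only the groups whose count is at most 4 and strips the count that uniq -c prepends.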
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

how to remove duplicate lines

I have the following file content (3 fields on each line): 23 888 10.0.0.1 dfh 787 10.0.0.2 dssf dgfas 10.0.0.3 dsgas dg 10.0.0.4 df dasa 10.0.0.5 df dag 10.0.0.5 dfd dfdas 10.0.0.5 dfd dfd 10.0.0.6 daf nfd 10.0.0.6 ... As can be seen, the third field is an IP address and the file is sorted on it, but... (3 Replies)
Discussion started by: fredao
3 Replies
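A sketch for this one, assuming the goal is to keep the first line seen for each IP address in the third field (data.txt is a placeholder; the "keep the first occurrence" rule is an assumption):

    awk '!seen[$3]++' data.txt

Because the file is already sorted on the IP, sort -u -k3,3 data.txt would also work, but the awk form preserves the original line order.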

2. Shell Programming and Scripting

Remove duplicate lines

Hi, I have a huge file which is about 50GB. There are many lines. The file format looks like: 21 rs885550 0 9887804 C C T C C C C C C C 21 rs210498 0 9928860 0 0 C C 0 0 0 0 0 0 21 rs303304 0 9941889 A A A A A A A A A A 22 rs303304 0 9941890 0 A A A A A A A A A The problem is that there are a few... (4 Replies)
Discussion started by: zhshqzyc
4 Replies
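At 50 GB an in-memory awk hash may not fit, so an external sort is the usual fallback; a hedged sketch (the file name and temporary directory are placeholders, and -T just points the temporary files at a filesystem with enough room):

    sort -u -T /big/tmp genotypes.txt > genotypes.dedup.txt

This drops exact duplicate lines but loses the original ordering; if order matters and memory allows, awk '!seen[$0]++' genotypes.txt keeps the first copy of each line in place.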

3. Shell Programming and Scripting

AWK Duplicate lines multiple times based on a calculated value

Hi, I'm trying to create an XML sitemap of our dynamic ecommerce site's SEO-friendly URLs and am trying to create the initial page listing. I have a CSV file that looks like the following and need to duplicate the lines based on a value which needs calculating. ... (2 Replies)
Discussion started by: jamesfx
2 Replies
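The preview does not show the calculation, so this sketch only illustrates the shape of the answer: repeat each CSV line a number of times derived from one of its fields (the field position and the block size of 10 are invented for illustration, as is products.csv):

    awk -F, '{
        copies = int(($2 + 9) / 10)        # hypothetical: ceil(field 2 / 10)
        if (copies < 1) copies = 1
        for (i = 1; i <= copies; i++) print
    }' products.csv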

4. Shell Programming and Scripting

Need to remove the duplicate lines from a log!!

Hello folks, can someone help me with removing duplicate lines from a log file and sending the result to another log file? It's a bit complicated: some lines are the same except for the timestamp, while other lines are unique. Each line is separated into fields by colons. Log file:... (5 Replies)
Discussion started by: sim_je
5 Replies
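A sketch under the assumption that the timestamp is the first colon-separated field (the log file names are placeholders): build the comparison key from everything after that field, so lines differing only in the timestamp count as duplicates and only the first is kept:

    awk '{ key = $0; sub(/^[^:]*:/, "", key) } !seen[key]++' app.log > app.dedup.log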

5. Shell Programming and Scripting

Remove regularly repeated lines

How can I delete some regularly repeated lines in a file? Example: in_file EDGE 1 2 12 EDGE 2 3 23 EDGE 3 4 34 EDGE 5 6 56 EDGE 6 7 67 EDGE 7 8 78 EDGE 9 10 910 EDGE 10 11 1011 EDGE 11 12 1112 EDGE 13 14 1314 EDGE 14 15 1415 EDGE 15 16 1516 EDGE 17 18 1718 EDGE 18 19 1819 EDGE 19... (8 Replies)
Discussion started by: saeed.soltani
8 Replies
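The preview is truncated, so the exact repetition pattern is not visible; if the intent is simply to drop lines that already appeared earlier in the file while keeping the first copy in place, the usual one-pass idiom is:

    awk '!seen[$0]++' in_file > out_file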

6. Shell Programming and Scripting

How to print the lines which are repeated 3 times in a file?

Hello All, I have a file which has repeated lines. I want to print the lines which are repeated three times. Please help. (3 Replies)
Discussion started by: ailnilanjan
3 Replies
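A sketch that reads the file twice (file is a placeholder name): the first pass counts every line, the second prints one copy of each line whose total count is exactly three:

    awk 'NR == FNR { cnt[$0]++; next } cnt[$0] == 3 && !printed[$0]++' file file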

7. UNIX for Dummies Questions & Answers

Remove Duplicate Lines

Hi I need this output. Thanks. Input: TAZ YET FOO FOO VAK TAZ BAR Output: YET VAK BAR (10 Replies)
Discussion started by: tara123
10 Replies
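The expected output keeps the original order, so a two-pass awk fits better than sort infile | uniq -u (which would print the same lines but sorted): count first, then print only the lines that occur exactly once:

    awk 'NR == FNR { cnt[$0]++; next } cnt[$0] == 1' infile infile

With the sample input this prints YET, VAK and BAR, matching the requested output.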

8. Shell Programming and Scripting

Remove lines containing 2 or more duplicate strings

Within my text file I have several thousand lines of text, and some lines contain duplicate strings/words. I would like to entirely remove those lines which contain the duplicate strings. E.g.: One and a Two Unix.com is the Best This as a Line Line Example duplicate sentence with the word... (22 Replies)
Discussion started by: martinsmith
22 Replies
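A sketch assuming "duplicate strings" means the same whitespace-separated word appearing more than once on a single line (matching is exact and case-sensitive; input.txt is a placeholder):

    awk '{
        split("", seen)                 # portable way to clear the per-line counter
        for (i = 1; i <= NF; i++)
            if (seen[$i]++) next        # a word repeats on this line: skip the line
        print
    }' input.txt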

9. UNIX for Beginners Questions & Answers

Export lines that have first entry repeated 5 times or above

Dears, I want to extract only the lines whose first entry is repeated 3 times or more. Example data: -bash-3.00$ cat INTCONT-IS.CSV M205-00-106_AMDRN:1-0-6-22,12-662-4833,intContact,2016-11-15 02:32:16,50 M205-00-106_AMDRN:1-0-23-17,12-616-0462,intContact,2016-11-15 02:32:23,50... (5 Replies)
Discussion started by: is2_egypt
5 Replies
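A two-pass sketch over the file shown in the post: the first pass counts how often each first comma-separated field occurs, the second prints every line whose first field reaches the threshold (3 here, per the post body; the title says 5, so adjust the number as needed):

    awk -F, 'NR == FNR { cnt[$1]++; next } cnt[$1] >= 3' INTCONT-IS.CSV INTCONT-IS.CSV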

10. Shell Programming and Scripting

How to remove duplicate lines?

Hi All, I am storing the result in the variable result_text using the code below: result_text=$(printf "$result_text\t\n$name") The result_text variable then contains the text below, which has duplicate lines: file and time for the interval 03:30 - 03:45 file and time for the interval 03:30 - 03:45 ... (4 Replies)
Discussion started by: nalu
4 Replies
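One way to clean the variable, assuming the duplicates come from appending the same interval text more than once: filter it through awk so only the first occurrence of each line survives (passing the text as an argument to printf '%s\n' also avoids treating its contents as a format string):

    result_text=$(printf '%s\n' "$result_text" | awk '!seen[$0]++')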
UNIQ(1)                   BSD General Commands Manual                  UNIQ(1)

NAME
     uniq -- report or filter out repeated lines in a file

SYNOPSIS
     uniq [-cdu] [-f fields] [-s chars] [input_file [output_file]]

DESCRIPTION
     The uniq utility reads the standard input comparing adjacent lines, and
     writes a copy of each unique input line to the standard output. The
     second and succeeding copies of identical adjacent input lines are not
     written. Repeated lines in the input will not be detected if they are
     not adjacent, so it may be necessary to sort the files first.

     The following options are available:

     -c        Precede each output line with the count of the number of
               times the line occurred in the input, followed by a single
               space.

     -d        Don't output lines that are not repeated in the input.

     -f fields Ignore the first fields in each input line when doing
               comparisons. A field is a string of non-blank characters
               separated from adjacent fields by blanks. Field numbers are
               one based, i.e. the first field is field one.

     -s chars  Ignore the first chars characters in each input line when
               doing comparisons. If specified in conjunction with the -f
               option, the first chars characters after the first fields
               fields will be ignored. Character numbers are one based, i.e.
               the first character is character one.

     -u        Don't output lines that are repeated in the input.

     If additional arguments are specified on the command line, the first
     such argument is used as the name of an input file, the second is used
     as the name of an output file.

     The uniq utility exits 0 on success, and >0 if an error occurs.

COMPATIBILITY
     The historic +number and -number options have been deprecated but are
     still supported in this implementation.

SEE ALSO
     sort(1)

STANDARDS
     The uniq utility is expected to be IEEE Std 1003.2 (``POSIX.2'')
     compatible.

BSD                             January 6, 2007                            BSD
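For quick reference against the threads above, the three options that come up most often; uniq only compares adjacent lines, so the input is sorted first (data.txt is a placeholder):

    sort data.txt | uniq -c    # prefix each distinct line with its occurrence count
    sort data.txt | uniq -d    # print one copy of each line that occurs more than once
    sort data.txt | uniq -u    # print only the lines that occur exactly once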