How to check duplicate entries in file ? (Solaris-9)
Post 302999173 by solaris_1977 on Wednesday 14th of June 2017 06:26:06 PM
Thanks. I didn't know that was a different awk.
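For the duplicate check itself on Solaris 9, where /usr/bin/awk is the old awk, a minimal sketch using nawk or plain sort/uniq (the file name is only a placeholder):

Code:
  # print each line the second and later times it appears
  nawk 'seen[$0]++' /path/to/file

  # or let sort and uniq do the work: -d shows only repeated lines
  sort /path/to/file | uniq -d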
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Check host file for duplicate entries

I need a KSH script that will check a hosts file for duplicate IPs and/or host names and report the errors. Anyone out there have one they would like to share? Something like: Hostname blahblah appears X times IP Address xxx.xxx.xxx.xxx appears X times TIA (4 Replies)
Discussion started by: ThreeDot
4 Replies
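One possible shape for the report described above, sketched with nawk under the assumption of the usual /etc/hosts layout (IP address first, then one or more names); this is not the script from that thread:

Code:
  nawk '!/^#/ && NF > 1 {
          ip[$1]++
          for (i = 2; i <= NF; i++) host[$i]++
  }
  END {
          for (h in host) if (host[h] > 1) print "Hostname " h " appears " host[h] " times"
          for (a in ip)   if (ip[a] > 1)   print "IP Address " a " appears " ip[a] " times"
  }' /etc/hosts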

2. Shell Programming and Scripting

Removal of Duplicate Entries from the file

I have a file which consists of 1000 entries. Out of the 1000 entries, 500 are duplicates. I want to remove the first duplicate entry (i.e., the entire line) in the file. An example of the file is shown below: 8244100010143276|MARISOL CARO||MORALES|HSD768|CARR 430 KM 1.7 ... (1 Reply)
Discussion started by: ravi_rn
1 Replies
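The wording "remove the first duplicate entry" is ambiguous, but the usual request is to keep one copy of each line and drop the repeats; a one-line sketch (infile and outfile are placeholder names):

Code:
  nawk '!seen[$0]++' infile > outfile          # keeps the first copy of each line
  nawk -F'|' '!seen[$1]++' infile > outfile    # or keep one line per value in the first pipe-delimited field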

3. Solaris

How to check file size in solaris?

Hi All, I have a file at path /usr/Image/test. How can I get the size of the "test" file? Is there a command for that? (3 Replies)
Discussion started by: sunray
3 Replies
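For a single file such as /usr/Image/test, the size is available from ls(1) or du(1); a quick sketch:

Code:
  ls -l /usr/Image/test | awk '{print $5}'    # size in bytes (5th field of ls -l)
  du -k /usr/Image/test                       # disk usage in kilobytes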

4. Shell Programming and Scripting

duplicate entries /.rhosts file

Hi All, I have one problem related to the /.rhosts file. According to my understanding, the /.rhosts file is used for "rsh". What will happen if I have duplicate entries in this file? E.g., my .rhosts file looks like
Code:
  wcars42g
  wcars89j
  wcars42g
  wcars42b
  wcars42b
Will duplicate entries... (1 Reply)
Discussion started by: akash_mahakode
1 Replies
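Duplicate .rhosts entries are generally just redundant, since the lookup succeeds as soon as any matching entry is found, but they are easy to spot and clean up; a sketch (the cleaned copy path is a placeholder):

Code:
  sort /.rhosts | uniq -d                          # entries listed more than once
  nawk '!seen[$0]++' /.rhosts > /tmp/rhosts.new    # deduplicated copy to review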

5. Shell Programming and Scripting

duplicate entries /.rhosts file

Hi, I forgot how to start a new thread. :( Can somebody please guide me? I have one problem related to the /.rhosts file. According to my understanding, the /.rhosts file is used for "rsh". What will happen if I have duplicate entries in this file? E.g., my .rhosts file looks like wcars42g... (2 Replies)
Discussion started by: akash_mahakode
2 Replies

6. Shell Programming and Scripting

check duplicate elements

Hi, if I have an array with lots of integers, how do I check if there are duplicate integers in Perl? Say I have @intergers=( 1,2,3,3,4,5,4,99,99,100) (2 Replies)
Discussion started by: james94538
2 Replies
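That thread asks for Perl; expressed instead with the shell tools used elsewhere on this page, the same duplicate check on a list of integers is a one-liner (sketch only, not the Perl answer given in the thread):

Code:
  echo "1 2 3 3 4 5 4 99 99 100" | tr ' ' '\n' | sort -n | uniq -d    # prints 3, 4 and 99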

7. Shell Programming and Scripting

Counting duplicate entries in a file using awk

Hi, I have a very big (around 1 million entries) txt file with IPv4 addresses in the standard format, i.e. a.b.c.d. The file looks like
  10.1.1.1
  10.1.1.1
  10.1.1.1
  10.1.2.4
  10.1.2.4
  12.1.5.6
  . . . .
and so on.... There are duplicate/multiple entries for some IP... (3 Replies)
Discussion started by: sajal.bhatia
3 Replies
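For a count per address over a large file, either a sort/uniq pipeline or a single awk pass works; a sketch (ips.txt is a placeholder name):

Code:
  sort ips.txt | uniq -c | awk '$1 > 1 {print $2 " appears " $1 " times"}'

  # single pass, at the cost of holding one counter per distinct address in memory
  nawk '{cnt[$0]++} END {for (ip in cnt) if (cnt[ip] > 1) print ip, cnt[ip]}' ips.txt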

8. Shell Programming and Scripting

Request to check: remove entries with duplicate numbers in first row

Hi I have a file
  1 xyz 456
  1 xyz 456
  1 xyz 456
  2 abc 8459
  3 gfd 657
  4 ghf 658
  4 ghf 658
I want the output
  1 xyz 456
  2 abc 8459
  3 gfd 657
  4 ghf 658
(3 Replies)
Discussion started by: manigrover
3 Replies
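Keeping the first line for each value in column one reproduces the requested output; a one-line sketch (infile is a placeholder):

Code:
  nawk '!seen[$1]++' infile    # prints a line only the first time its first field is seen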

9. Shell Programming and Scripting

Check to identify duplicate values at first column in csv file

Hello experts, I have a requirement where I have to implement two checks on a csv file: 1. Check if the value in the first column is a duplicate; if any value is duplicated, the script should exit. 2. Check that the value in the second column is either "yes" or "no"; if it is anything else... (4 Replies)
Discussion started by: avikaljain
4 Replies
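Both checks can be done in one awk pass; a sketch that assumes a simple comma-separated layout with no quoted fields and no header row (file.csv is a placeholder):

Code:
  nawk -F, '
      seen[$1]++                { print "Duplicate value in column 1: " $1; exit 1 }
      $2 != "yes" && $2 != "no" { print "Invalid value in column 2: " $2; exit 1 }
  ' file.csv || exit 1

The exit 1 inside awk makes its exit status nonzero, so the calling ksh script can stop right there.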

10. UNIX for Dummies Questions & Answers

On Solaris, without stat, how to check how old a file is?

Hi, how do I check how old a file is? That is, is it 1 day old, 1 year old, generated x hours ago? Currently, I receive a supposedly daily report, and the last few times it has not been recent; that is, instead of the one generated for the day, it is one that was created yesterday or... (3 Replies)
Discussion started by: newbie_01
3 Replies
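Without a stat(1) command, find(1) can answer "is this file recent?" directly; a sketch (the report path is a placeholder):

Code:
  # prints the file name only if it was modified within the last 24 hours
  find /path/to/report -mtime -1 -print

  # modification timestamp, readable by eye
  ls -l /path/to/report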
uniq(1) 						      General Commands Manual							   uniq(1)

Name
       uniq - report repeated lines in a file

Syntax
       uniq [-udc [+n] [-n]] [input [output]]

Description
       The uniq command reads the input file, comparing adjacent lines.  In the normal case, the second and succeeding copies of repeated lines are
       removed; the remainder is written to the output file.  Note that repeated lines must be adjacent in order to be found.  For further
       information, see sort(1).

Options
       The n arguments specify skipping an initial portion of each line in the comparison:

       -n Skips specified number of fields.  A field is defined as a string of non-space, non-tab characters separated by tabs and spaces from its
	  neighbors.

       +n Skips specified number of characters in addition to fields.  Fields are skipped before characters.

       -c Displays number of repetitions, if any, for each line.

       -d Displays only lines that were repeated.

       -u Displays only unique (nonrepeated) lines.

       If the -u flag is used, just the lines that are not repeated in the original file are output.  The -d option specifies that one copy of
       just the repeated lines is to be written.  The normal mode output is the union of the -u and -d mode outputs.

       The -c option supersedes -u and -d and generates an output report in default style but with each line preceded by a count of the number
       of times it occurred.
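       Since uniq only compares adjacent lines, the usual pattern for the duplicate-entry check discussed above is to sort first; a sketch
       (datafile is a placeholder name):

       Code:
         sort datafile | uniq -d                      # one copy of every repeated line
         sort datafile | uniq -c | sort -rn | head    # lines with counts, most repeated first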

See Also
       comm(1), sort(1)
