Shell Programming and Scripting: Trying to find the distinct lines using uniq command
Post 302896090 by shamrock on Friday 4th of April 2014 10:39:35 AM
Quote:
Originally Posted by kraljic
Platform :Oracle Linux 6.4
Shell : bash

The file below has 7 lines; some of them are duplicates. There are only 3 distinct lines, so why is the uniq command still showing 7?
I just want the distinct lines to be returned.
...
That's because uniq expects its input to be sorted, so that duplicate lines are adjacent to each other. In fact, uniq can be eliminated altogether by using the "-u" option of sort...
Code:
sort -u test.txt | grep -c UPDATE

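
A quick way to see why the sort matters (a toy stream built with printf, not data from the original thread):
Code:
printf 'a\nb\na\n' | uniq           # prints a, b, a - the duplicates are not adjacent
printf 'a\nb\na\n' | sort | uniq    # prints a, b   - sorting makes them adjacent first
sort test.txt | uniq -c             # also handy: count how often each distinct line occurs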
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

duplicated lines not recognized by sort and uniq

Hello all, I've got a strange behaviour of the sort and uniq commands: they do not recognise apparently duplicated lines in a file (already sorted). The lines are identical by eye, but they must differ in something, because when they are put in two files, those files have slightly different sizes. What can make... (8 Replies)
Discussion started by: roussine
8 Replies
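
A minimal sketch for this kind of problem (assuming GNU tools and a hypothetical file name data.txt): invisible differences such as trailing whitespace or DOS carriage returns are the usual culprits, and cat -A makes them visible.
Code:
# GNU cat -A shows tabs as ^I, DOS carriage returns as ^M, and line ends as $
cat -A data.txt | head

# hypothetical cleanup: strip trailing CRs and whitespace, then dedupe (GNU sed)
sed -e 's/\r$//' -e 's/[[:space:]]*$//' data.txt | sort | uniq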

2. UNIX for Dummies Questions & Answers

find uniq lines in file, using the first field of line

Hello all, new to unix and have just found the forum. I think I will be here quite often, and hope that in time I will be able to provide some help; roll on not being a newbie anymore :) I have a question which I am hoping someone could help me with. If I have a file with lines in it thus... (8 Replies)
Discussion started by: grom
8 Replies
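
A common awk idiom for this (a sketch, assuming whitespace-separated fields and a hypothetical file name input.txt): keep only the first line seen for each distinct value of field 1.
Code:
# print a line only the first time its first field appears
awk '!seen[$1]++' input.txt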

3. UNIX for Dummies Questions & Answers

deleting duplicate lines using uniq while ignoring a column

I have a data set that has 4 columns. I want to know if I can delete duplicate lines while ignoring one of the columns, for example
10 chr1 ASF 30
15 chr1 ASF 20
5 chr1 ASF 30
6 chr2 EBC 15
4 chr2 EBC 30
... I want to know if I can delete duplicate lines while ignoring column 1, so the... (5 Replies)
Discussion started by: japaneseguitars
5 Replies
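
A sketch under the assumption that the data is whitespace-separated in a file called data.txt: sort on everything after column 1 so duplicates become adjacent, then tell uniq to skip that first field when comparing (the -f fields option, the "-n" skip described in the man page below).
Code:
# sort by fields 2 onward, then compare lines ignoring field 1
sort -k2 data.txt | uniq -f 1

# awk alternative: keep the first line seen for each (col2, col3, col4) key
awk '!seen[$2, $3, $4]++' data.txt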

4. Shell Programming and Scripting

Find distinct values

Hi, I have two files of the following format
file1
chr1:345-456
chr2:123-456
chr2:455-678
chr3:456-789
chr3:444-555
file2
chr1:345-456
chr2:123-456
chr3:456-789
output (2 Replies)
Discussion started by: jacobs.smith
2 Replies
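
The expected output is cut off above, but comm(1) covers the usual readings of "distinct" between two sorted files; a sketch (bash process substitution assumed):
Code:
# column 1 = lines only in file1, column 2 = only in file2, column 3 = common to both
comm <(sort file1) <(sort file2)

# e.g. lines present in file1 but missing from file2
comm -23 <(sort file1) <(sort file2)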

5. Shell Programming and Scripting

Need the distinct of these lines

Red Hat Enterprise Linux 5.4. I have a text file (an error log file) which has occurrences of an error message like ORA-01652: unable to extend temp segment by 8 in tablespace xxxxx. There are around 3000 error messages like this in the error log file. But there are only 7 or 8 distinct... (3 Replies)
Discussion started by: John K
3 Replies
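
A sketch, assuming GNU grep and a hypothetical log name error.log: pull out just the ORA- codes, then count each distinct one.
Code:
# extract the ORA-NNNNN codes, then count how often each distinct code occurs
grep -o 'ORA-[0-9]*' error.log | sort | uniq -c | sort -rn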

6. Shell Programming and Scripting

Command to identify distinct field value

Could anyone help me with a command to identify distinct values from a particular column? My input file has a header, so I need a command to which we pass Column1 as a parameter. E.g. my input is like Column1 Column2 Column3 ... (4 Replies)
Discussion started by: krish123
4 Replies
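
One hedged sketch, assuming whitespace-separated columns and a hypothetical file name input.txt: pass the column number into awk and skip the header row.
Code:
col=1   # hypothetical: the column number supplied by the caller
# print column $col from every row after the header, then deduplicate
awk -v c="$col" 'NR > 1 { print $c }' input.txt | sort -u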

7. Shell Programming and Scripting

How to do a distinct from a file using sort uniq in bash?

I have an output file .dat. From this file I have to do a distinct of the ID using the sort/uniq commands in a bash script. How can I do it? I found sort -u ${FILEOUT_DAT} but I don't think that is my solution because the ID isn't specified. Is there another solution? (7 Replies)
Discussion started by: punticci
7 Replies
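
sort -u on its own deduplicates whole lines; to distinct on just the ID, extract that field first. A sketch that assumes a pipe-delimited file with the ID in field 1 (both assumptions):
Code:
# pull out field 1 (the assumed ID column) and deduplicate it
cut -d'|' -f1 "${FILEOUT_DAT}" | sort -u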

8. Shell Programming and Scripting

How to find DISTINCT rows and combine in one row?

Hi, I need to display only one of the duplicated values and merge them into one record, only when the tag starts with 3100.2.128.8:
3100.2.97.1=192.168.0.12
3100.2.128.8=418/66/03e9/0044801
3100.2.128.8=418/66/03ea/0044601
3100.2.128.8=418/66/03e9/0044801
3100.2.128.8=418/66/03ea/0044601... (5 Replies)
Discussion started by: OTNA
5 Replies
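
A sketch of one reading of the request, assuming the key=value layout shown and a hypothetical file name data.txt: deduplicate the 3100.2.128.8 lines by value and join them onto a single record (printed at the end, so the output order differs from the input).
Code:
# collect each distinct 3100.2.128.8=value once, join with spaces; pass other lines through
awk -F= '/^3100\.2\.128\.8=/ { if (!seen[$2]++) merged = merged (merged ? " " : "") $0; next }
         { print }
         END { if (merged != "") print merged }' data.txt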

9. Shell Programming and Scripting

Need distinct values from command in a script

Hello, I am using the command below:
srvctl config service -d cmdbut
cmdbut_01 (P):/devoragridcn_01/app/oracle> srvctl config service -d cmdbut
Service name: boms10.world
Service is enabled
Server pool: cmdbut_boms10.world
Cardinality: 1
Disconnect: false
Service role: PRIMARY
Management... (7 Replies)
Discussion started by: Vishal_dba
7 Replies
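
The rest of the question is cut off, but if the goal is to pull distinct values for one of those labels out of the srvctl output, a hedged sketch (the "Service name" label chosen here is an assumption):
Code:
# extract the value after "Service name:" and deduplicate it
srvctl config service -d cmdbut | awk -F': ' '/^Service name:/ { print $2 }' | sort -u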

10. Shell Programming and Scripting

Find distinct files in directory

Hi All, I am working on a script development task for my project, where I need to find the distinct types of files from a given directory based on the pattern provided. E.g. the directory listing is:
abc123.dat.20141212_021012
abc123.dat.20141312_041012
abc123.dat.20141112_031012... (5 Replies)
Discussion started by: freakabhi
5 Replies
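
A sketch assuming the names always end in a .YYYYMMDD_HHMMSS timestamp (as in the listing above) and a hypothetical directory /path/to/dir: strip the suffix and deduplicate what is left.
Code:
# drop the trailing .YYYYMMDD_HHMMSS suffix, then list each distinct base name once
ls /path/to/dir | sed 's/\.[0-9]\{8\}_[0-9]\{6\}$//' | sort -u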
UNIQ(1)                     General Commands Manual                    UNIQ(1)

NAME
       uniq - report repeated lines in a file

SYNOPSIS
       uniq [ -udc [ +n ] [ -n ] ] [ input [ output ] ]

DESCRIPTION
       Uniq reads the input file comparing adjacent lines. In the normal
       case, the second and succeeding copies of repeated lines are removed;
       the remainder is written on the output file. Note that repeated lines
       must be adjacent in order to be found; see sort(1).

       If the -u flag is used, just the lines that are not repeated in the
       original file are output.

       The -d option specifies that one copy of just the repeated lines is to
       be written. The normal mode output is the union of the -u and -d mode
       outputs.

       The -c option supersedes -u and -d and generates an output report in
       default style but with each line preceded by a count of the number of
       times it occurred.

       The n arguments specify skipping an initial portion of each line in
       the comparison:

       -n     The first n fields together with any blanks before each are
              ignored. A field is defined as a string of non-space, non-tab
              characters separated by tabs and spaces from its neighbors.

       +n     The first n characters are ignored. Fields are skipped before
              characters.

SEE ALSO
       sort(1), comm(1)

                                                                       UNIQ(1)
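
A quick illustration of the three reporting modes on already-adjacent input (printf used here just to fabricate a tiny sorted stream):
Code:
printf 'a\na\nb\n' | uniq -c    # counts: 2 a / 1 b
printf 'a\na\nb\n' | uniq -d    # only the repeated line: a
printf 'a\na\nb\n' | uniq -u    # only the never-repeated line: b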