Shell Programming and Scripting: Get a none duplicate list file
Post 23275 by peter.herlihy, 20 June 2002
You can just use the syntax

sort -u myfile -o myfile

cat is redundant in the statement suggested above, and sort (unlike sed and most other filters) allows the output file to be the same as the input file.
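
For example, a quick check (just a sketch; "names" is an arbitrary example file):

printf 'b\na\nb\na\n' > names
sort -u names -o names
cat names    # prints "a" then "b": duplicates gone, file sorted in place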

Do you want to keep the original order of the file and just remove the duplicates? Or do you want to sort the file and remove the duplicates?
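
If it is the first case (keep the original order, just drop repeats), sort on its own will not help because it reorders the file; an awk one-liner along these lines should do it (just a sketch: it keeps the first occurrence of each line, and "myfile.new" is only an example output name):

awk '!seen[$0]++' myfile > myfile.new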

sort -u -m myfile -o myfile

will remove only duplicates that sit directly next to each other: -m assumes the list is already sorted and simply merges it through, so -u only drops a line when it is identical to the line immediately before it. So a list of

B
B
A
A
B
B
A
A

Will come back as

B
A
B
A
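
That adjacent-only behavior is what uniq(1) gives you as well, so on an already-sorted list you could equally write (a sketch; uniq cannot write back onto its own input, so "myfile.new" is just an example output name):

uniq myfile > myfile.new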

SORT(1)                        General Commands Manual                        SORT(1)

NAME
     sort - sort a file of ASCII lines

SYNOPSIS
     sort [-bcdfimnru] [-tc] [-o name] [+pos1] [-pos2] file ...

OPTIONS
     -b   Skip leading blanks when making comparisons
     -c   Check to see if a file is sorted
     -d   Dictionary order: ignore punctuation
     -f   Fold upper case onto lower case
     -i   Ignore non-ASCII characters
     -m   Merge presorted files
     -n   Numeric sort order
     -o   Next argument is output file
     -r   Reverse the sort order
     -t   Following character is field separator
     -u   Unique mode (delete duplicate lines)

EXAMPLES
     sort -nr file        # Sort keys numerically, reversed
     sort +2 -4 file      # Sort using fields 2 and 3 as key
     sort +2 -t: -o out   # Field separator is :
     sort +.3 -.6         # Characters 3 through 5 form the key

DESCRIPTION
     Sort sorts one or more files.  If no files are specified, stdin is sorted.
     Output is written on standard output, unless -o is specified.  The options
     +pos1 -pos2 use only fields pos1 up to but not including pos2 as the sort
     key, where a field is a string of characters delimited by spaces and tabs,
     unless a different field delimiter is specified with -t.  Both pos1 and
     pos2 have the form m.n where m tells the number of fields and n tells the
     number of characters.  Either m or n may be omitted.

SEE ALSO
     comm(1), grep(1), uniq(1).

SORT(1)