fastest way to remove duplicates
Post 76050 by pixelbeat on Friday 24th of June 2005 09:23:26 AM
It's equivalent to uniq, so it won't help you.
If your data is in fact already sorted, then just use `uniq` instead of `sort -u`.
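A minimal sketch of the two cases, assuming one record per line in a hypothetical file data.txt:

Code:
# already sorted: uniq collapses adjacent duplicate lines
uniq data.txt > deduped.txt

# not sorted: sort -u sorts and removes duplicates in one pass
sort -u data.txt > deduped.txt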
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

how to delete/remove directory in fastest way

Hello, I need help removing a directory. The directory is not empty; it contains several subdirectories and files. The total number of files in one directory is 12,24,446. rm -rf doesn't work: it prompts for every file. I want to delete without prompting and... (6 Replies)
Discussion started by: getdpg
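A hedged sketch for this thread, assuming the directory is named bigdir (hypothetical) and that rm prompts because it is aliased to rm -i:

Code:
# a leading backslash bypasses a shell alias such as rm -i
\rm -rf bigdir

# or let GNU find delete everything depth-first; it never prompts
find bigdir -delete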

2. UNIX for Dummies Questions & Answers

How to remove duplicates without sorting

Hello, I can remove duplicate entries in a file with: sort File1 | uniq > File2, but how can I remove duplicates without sorting the file? I tried cat File1 | uniq > File2, but it doesn't work. Thanks. (4 Replies)
Discussion started by: orahi001
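The usual order-preserving answer is an awk one-liner; a sketch using the File1/File2 names from the post:

Code:
# keep only the first occurrence of each line, no sorting needed
awk '!seen[$0]++' File1 > File2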

3. Shell Programming and Scripting

Remove duplicates

Hello experts, I have two files named old and new. Below are my example files. I need to compare them and print the records that exist only in my new file. I tried the awk script below; it works perfectly well if the records match exactly, but the issue is that my old file has got extra... (4 Replies)
Discussion started by: forumthreads
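A minimal sketch assuming whole-line comparison between files named old and new (the post is truncated, so a field-based key may be needed instead):

Code:
# print records of "new" that do not appear anywhere in "old"
awk 'NR==FNR {old[$0]; next} !($0 in old)' old new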

4. Shell Programming and Scripting

Remove duplicates from a file

Hi, I need to remove duplicates from a file. The file will be like this: 0003 10101 20100120 abcdefghi 0003 10101 20100121 abcdefghi 0003 10101 20100122 abcdefghi 0003 10102 20100120 abcdefghi 0003 10103 20100120 abcdefghi 0003 10103 20100121 abcdefghi Here, if the first column and... (6 Replies)
Discussion started by: gpaulose
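A sketch assuming the duplicate key is the first two whitespace-separated columns (the post is truncated, so the exact key is an assumption) and the file is named file:

Code:
# keep the first record for each (column1, column2) pair
awk '!seen[$1,$2]++' file > file.dedup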

5. Shell Programming and Scripting

Script to remove duplicates

Hi, I need a script that removes the duplicate records and writes them to a new file. For example, I have a file named test.txt that looks like: abcd.23 abcd.24 abcd.25 qwer.25 qwer.26 qwer.98 I want to pick only $1, compare it with the next record, and the output should be abcd.23... (6 Replies)
Discussion started by: antointoronto
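A sketch assuming the key is the part before the dot (i.e. $1 with "." as the field separator) and that the first record per key should be kept; the output filename is hypothetical:

Code:
# keep only the first record for each prefix (abcd, qwer, ...)
awk -F. '!seen[$1]++' test.txt > new.txt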

6. Shell Programming and Scripting

remove duplicates and sort

Hi, I'm using the command below to sort and remove duplicates in a file, but I need to apply it to the same file instead of redirecting the output to another. Thanks. (6 Replies)
Discussion started by: dvah
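sort reads all of its input before writing, so the -o output file may safely be the input file; a sketch assuming the file is named file.txt:

Code:
# sort, drop duplicates, and write the result back to the same file
sort -u -o file.txt file.txt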

7. Shell Programming and Scripting

Fastest way to delete duplicates from a large filelist.....

OK, I have two file lists. The first is formatted like this: /path/to/the/actual/file/location/filename.jpg and has up to a million records. The second list shows filename.jpg where there is more than one instance, and has maybe up to 65,000 records. I want to copy files... (4 Replies)
Discussion started by: Bashingaway
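A hedged sketch, assuming hypothetical names paths.txt for the full-path list and dupes.txt for the duplicated basenames, and that the goal is to pull out the full paths of the duplicated files:

Code:
# fixed-string match of every duplicated filename against the path list
grep -F -f dupes.txt paths.txt > duplicate_paths.txt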

8. Shell Programming and Scripting

bash - remove duplicates

I need to use a bash script to remove duplicate files from a download list, but I cannot use uniq because the URLs are different. I need to go from this: http://***/fae78fe/file1.wmv http://***/39du7si/file1.wmv http://***/d8el2hd/file2.wmv http://***/h893js3/file2.wmv to this: ... (2 Replies)
Discussion started by: locoroco
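Since the duplicates share only the final path component, keying on the basename works; a sketch assuming the list is in a hypothetical urls.txt:

Code:
# keep the first URL seen for each filename (the last /-separated field)
awk -F/ '!seen[$NF]++' urls.txt > urls.dedup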

9. Shell Programming and Scripting

Remove duplicates

I have a file with the following format, fields separated by "|": title1|something class|long...content1|keys title2|somhing class|log...content1|kes title1|sothing class|lon...content1|kes title3|shing cls|log...content1|ks I want to remove all duplicates with the same "title" field (the... (3 Replies)
Discussion started by: dtdt
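A sketch assuming the title is the first |-separated field and that the first record per title should be kept; the filenames are hypothetical:

Code:
# keep one record per title field
awk -F'|' '!seen[$1]++' file > file.dedup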

10. Shell Programming and Scripting

Remove duplicates

Hi, I have the below file structure: 200,1245,E1,1,E1,,7611068,KWH,30, ,,,,,,,, 200,1245,E1,1,E1,,7611070,KWH,30, ,,,,,,,, 300,20140223,0.001,0.001,0.001,0.001,0.001 300,20140224,0.001,0.001,0.001,0.001,0.001 300,20140225,0.001,0.001,0.001,0.001,0.001 300,20140226,0.001,0.001,0.001,0.001,0.001... (1 Reply)
Discussion started by: tejashavele
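This thread is truncated, so the duplicate key is unclear; a conservative sketch that just drops exact repeated lines while keeping the original order, assuming a hypothetical data.csv:

Code:
# print each distinct line once, first occurrence wins
perl -ne 'print unless $seen{$_}++' data.csv > data.dedup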
COMBINE(1)

NAME
       combine - combine sets of lines from two files using boolean operations

SYNOPSIS
       combine file1 and file2
       combine file1 not file2
       combine file1 or file2
       combine file1 xor file2
       _ file1 and file2 _
       _ file1 not file2 _
       _ file1 or file2 _
       _ file1 xor file2 _

DESCRIPTION
       combine combines the lines in two files. Depending on the boolean operation specified, the contents will be combined in different ways:

       and    Outputs lines that are in file1 if they are also present in file2.
       not    Outputs lines that are in file1 but not in file2.
       or     Outputs lines that are in file1 or file2.
       xor    Outputs lines that are in either file1 or file2, but not in both files.

       "-" can be specified for either file to read stdin for that file.

       The input files need not be sorted, and the lines are output in the order they occur in file1 (followed by the order they occur in file2 for the two "or" operations). Bear in mind that this means that the operations are not commutative; "a and b" will not necessarily be the same as "b and a". To obtain commutative behavior, sort and uniq the result.

       Note that this program can be installed as "_" to allow for the syntactic sugar shown in the latter half of the synopsis (similar to the test/[ command). It is not currently installed as "_" by default, but you can alias it to that if you like.

SEE ALSO
       join(1)

AUTHOR
       Copyright 2006 by Joey Hess <joey@kitenet.net>
       Licensed under the GNU GPL.

moreutils                                2012-04-09                                COMBINE(1)
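A quick example of how combine relates to the deduplication threads above, assuming two hypothetical line-oriented files old.txt and new.txt:

Code:
# lines that appear in new.txt but not in old.txt, in new.txt's order
combine new.txt not old.txt

# lines common to both files, then sorted with duplicates removed
combine new.txt and old.txt | sort -u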