Top Forums / Shell Programming and Scripting / fastest way to remove duplicates. Post 76043 by amit_sapre on Friday, 24 June 2005, 08:19 AM
Hi Vino,

This command keeps the first occurrence of each entry and deletes the remaining duplicates, irrespective of whether the file is sorted.

It makes no prior assumptions about the input.
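The exact command was not preserved in this archived post. A widely used awk one-liner that matches the description above (keeps the first occurrence of each line, needs no sorting; the filename is illustrative) is:

```shell
# sample input
printf 'apple\nbanana\napple\ncherry\nbanana\n' > fruits.txt

# seen[$0]++ evaluates to 0 (false) the first time a line appears, so
# awk's default action (print) fires; on later occurrences the counter
# is non-zero and the line is skipped. Order of first occurrences is kept.
awk '!seen[$0]++' fruits.txt
# apple
# banana
# cherry
```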
 

aclsort(3C)

NAME
       aclsort() - sort an Access Control List (JFS File Systems only)

SYNOPSIS
       #include <sys/aclv.h>

       int aclsort(int nentries, int calclass, struct acl *aclbufp);

DESCRIPTION
       The aclsort() routine sorts JFS Access Control List (ACL) entries into
       the correct order to be accepted by the acl(2) system call.  aclbufp
       points to a buffer containing ACL entries; calclass, if non-zero,
       indicates that the CLASS_OBJ permissions should be recalculated; and
       nentries specifies the number of ACL entries in the buffer.

       aclsort() sorts the contents of the ACL buffer as follows:

       o  Entries will be in the order USER_OBJ, USER, GROUP_OBJ, GROUP,
          CLASS_OBJ, OTHER_OBJ.

       o  Entries of type USER and GROUP will be sorted in increasing order
          by numeric ID.

       The aclsort() call will succeed if all of the following are true:

       o  There is exactly one entry each of type USER_OBJ and GROUP_OBJ.

       o  There is at most one entry each of type CLASS_OBJ and OTHER_OBJ.

       o  Entries of type USER or GROUP may not contain duplicate entries.
          A duplicate entry is one of the same type containing the same
          numeric ID, irrespective of permission bits.

       o  If the calclass argument is zero and there are no entries of type
          USER and no entries of type GROUP, the permissions of the CLASS_OBJ
          and GROUP_OBJ entries must be the same.

       o  If there are no entries of type USER and no entries of type GROUP
          and the CLASS_OBJ entry is specified, then the OTHER_OBJ entry must
          also be specified, and the permissions of the CLASS_OBJ and
          GROUP_OBJ entries must be the same.

RETURN VALUE
       Upon successful completion, the return value is 0.  If there are
       duplicate entries, the return value is the position of the first
       duplicate entry.  If there is more than one entry of type USER_OBJ,
       GROUP_OBJ, CLASS_OBJ, or OTHER_OBJ, they are treated as duplicate
       entries, and the return value is the position of the first duplicate
       entry.  For all other errors, the return value is -1.

NOTICES
       The buffer is sorted by type and ID before checking for any failures;
       therefore the buffer is always sorted, even if there is a failure.
       The position of a duplicate entry returned on failure is not the byte
       offset of the duplicate entry from the base of the buffer; rather, it
       is the entry number of the duplicate entry within the sorted buffer.
       Checks are performed in order of entry type.  If there are multiple
       failures, the failure returned is the first encountered; for example,
       if the ACL buffer contains a duplicate entry and is also missing a
       required entry, the return value identifies the first duplicate
       entry.  ACLs do not have to be sorted with aclsort() prior to passing
       them to acl(2).

DEPENDENCIES
       aclsort() is supported only on JFS file systems on the standard HP-UX
       operating system.

AUTHOR
       aclsort() was developed by AT&T.

SEE ALSO
       acl(2), aclv(5).
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.