Remove duplicates from File from specific location
Posted by gopikgunda on 04-09-2008

How can I remove duplicate lines from a file? For example:

sample123456Sample
testing123456testing
XXXXX131323XXXXX
YYYYY423432YYYYY
fsdfdsf123456gsdfdsd

All the duplicates in columns 6-12 must be deleted. I want to keep the first occurrence; if the same value appears in that column range on a later line, that line should be deleted.
The output I am expecting is:
sample123456Sample
XXXXX131323XXXXX
YYYYY423432YYYYY

Thanks
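
One way to do this, as a minimal sketch: assuming the key really is the fixed character range 6-12 and awk is available (infile and outfile are placeholder names), an associative array can keep the first line seen for each key and drop later repeats:

# keep only the first line for each key in character positions 6-12
# (awk's substr(s, start, length) is 1-based, so this is 7 chars from position 6)
awk '!seen[substr($0, 6, 7)]++' infile > outfile

Note that in the sample above the digit run does not actually start in the same column on every line, so if the intent is "dedupe on the embedded number" rather than a strict column range, the key could be pulled out with a regex instead:

# key on the first run of digits in each line (lines without digits are skipped)
awk 'match($0, /[0-9]+/) && !seen[substr($0, RSTART, RLENGTH)]++' infile > outfile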
 

FDUPES(1)                   General Commands Manual                   FDUPES(1)

NAME
       fdupes - finds duplicate files in a given set of directories

SYNOPSIS
       fdupes [ options ] DIRECTORY ...

DESCRIPTION
       Searches the given path for duplicate files. Such files are found by
       comparing file sizes and MD5 signatures, followed by a byte-by-byte
       comparison.

OPTIONS
       -r --recurse
              include files residing in subdirectories

       -s --symlinks
              follow symlinked directories

       -H --hardlinks
              normally, when two or more files point to the same disk area
              they are treated as non-duplicates; this option will change
              this behavior

       -n --noempty
              exclude zero-length files from consideration

       -f --omitfirst
              omit the first file in each set of matches

       -1 --sameline
              list each set of matches on a single line

       -S --size
              show size of duplicate files

       -q --quiet
              hide progress indicator

       -d --delete
              prompt user for files to preserve, deleting all others (see
              CAVEATS below)

       -v --version
              display fdupes version

       -h --help
              displays help

SEE ALSO
       md5sum(1)

NOTES
       Unless -1 or --sameline is specified, duplicate files are listed
       together in groups, each file displayed on a separate line. The groups
       are then separated from each other by blank lines.

       When -1 or --sameline is specified, spaces and backslash characters (\)
       appearing in a filename are preceded by a backslash character.

CAVEATS
       If fdupes returns with an error message such as "fdupes: error invoking
       md5sum" it means the program has been compiled to use an external
       program to calculate MD5 signatures (otherwise, fdupes uses internal
       routines for this purpose), and an error has occurred while attempting
       to execute it. If this is the case, the specified program should be
       properly installed prior to running fdupes.

       When using -d or --delete, care should be taken to insure against
       accidental data loss. When used together with options -s or --symlinks,
       a user could accidentally preserve a symlink while deleting the file it
       points to. Furthermore, when specifying a particular directory more
       than once, all files within that directory will be listed as their own
       duplicates, leading to data loss should a user preserve a file without
       its "duplicate" (the file itself!).

AUTHOR
       Adrian Lopez <adrian2@caribe.net>

                                                                      FDUPES(1)
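
As a usage illustration of the options documented above (the directory name here is just an example):

# list duplicate sets under ~/photos recursively, skipping empty files,
# one set per line with sizes shown
fdupes -r -n -1 -S ~/photos

# interactively choose which copy in each set to keep (see CAVEATS above)
fdupes -r -d ~/photos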