UNIX for Dummies Questions & Answers: How to redirect duplicate lines from a file?
Post 302115316 by anbu23 on Tuesday 24th of April 2007 12:51:14 AM
Code:
awk ' ++arr[$0] > 1 ' file_dup.txt > file_nodup.txt
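This prints the second and every later occurrence of each line, so the redirect captures the duplicate lines themselves (despite the name file_nodup.txt, that file ends up holding the duplicates, not the unique lines). If the goal is the opposite, keeping only the first occurrence of each line, the usual negated form of the same idiom works; file_unique.txt below is just an illustrative name:
Code:
awk '!arr[$0]++' file_dup.txt > file_unique.txt    # keep the first occurrence of every line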

 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

Duplicate lines in the file

Hi, I have a file with duplicate lines in it. I want to keep only the duplicate lines and delete the non-duplicates. Can someone please help me? Regards, Narayana Gupta (3 Replies)
Discussion started by: guptan
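One common way to keep only the repeated lines (a sketch, not necessarily the solution posted in that thread): sort so identical lines group together and let uniq -d report them, or use awk when the original order must be preserved:
Code:
sort file | uniq -d       # one copy of each line that occurs more than once
awk 'seen[$0]++' file     # second and later occurrences, in input order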

2. UNIX for Dummies Questions & Answers

Remove Duplicate lines from File

I have a log file "logreport" that contains several lines as seen below: 04:20:00 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 06:38:08 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 07:11:05 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but... (18 Replies)
Discussion started by: Nysif Steve
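Because the timestamps differ, comparing whole lines would treat every entry as unique; if duplicates should instead be judged on the message text after the timestamp (an assumption about the intent), one option is to key the usual awk filter on the line with its leading time field stripped. logreport.nodup is an illustrative output name:
Code:
awk '{ key = $0; sub(/^[0-9:]+ /, "", key) } !seen[key]++' logreport > logreport.nodup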

3. Shell Programming and Scripting

removing the duplicate lines in a file

Hi, I need to concatenate three files into one destination file. In this, if some duplicate data occurs it should be deleted. eg: file1: ----- data1 value1 data2 value2 data3 value3 file2: ----- data1 value1 data4 value4 data5 value5 file3: ----- data1 value1 data4 value4 (3 Replies)
Discussion started by: Sharmila_P
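A sketch of one way to merge the three files while dropping repeated lines: sort -u is simplest when the final order does not matter, and awk keeps the first-seen order (destfile is an illustrative name):
Code:
sort -u file1 file2 file3 > destfile             # merged, sorted, duplicates removed
awk '!seen[$0]++' file1 file2 file3 > destfile   # merged, first-seen order preserved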

4. AIX

Grep multiple lines and redirect to file

I have a set of files with data and with the same fields multiple times in each of the files. for example: file 1 name = mary kate last name = kate address = 123 street = abc name = mary mark last name = mark address = 456 street = bcd file 2 name = mary kate last name = kate... (2 Replies)
Discussion started by: relearner
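If the records shown are separated by blank lines (an assumption about the layout), awk's paragraph mode (an empty record separator) can extract every multi-line record matching a name and redirect the result; matches.txt is an illustrative name:
Code:
awk -v RS= -v ORS='\n\n' '/mary kate/' file1 file2 > matches.txt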

5. Shell Programming and Scripting

Duplicate lines in a file

Hi All, I am trying to remove the duplicate entries in a file and print them just once. For example, if my input file has: 00:44,37,67,56,15,12 00:44,34,67,56,15,12 00:44,58,67,56,15,12 00:44,35,67,56,15,12 00:59,37,67,56,15,12 00:59,34,67,56,15,12 00:59,35,67,56,15,12... (7 Replies)
Discussion started by: faiz1985
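Read literally (identical lines printed once), the standard de-duplication idiom applies; if instead one line per leading time value is wanted, keying on the first comma-separated field is another possibility (both the file name and the interpretation are assumptions):
Code:
awk '!seen[$0]++' input.txt          # identical whole lines printed once
awk -F, '!seen[$1]++' input.txt      # first line seen for each leading time value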

6. Shell Programming and Scripting

Redirect lines to another file

Hi, I have a file with a size of 10 MB and I need to redirect some specific lines to a new file. For eg. Executed Restore for 11227.EDCS.551.01.201110 from /tmp/bk/restore/CR81500/content/S24U15VA2.2010-10-29.16:49.EDT/ArchiveFile_11227.EDCS.551.01.201110.zip Operation output: <U+FEFF>Oct... (4 Replies)
Discussion started by: gsiva
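If the lines of interest share a fixed lead-in such as "Executed Restore" (an assumption based on the excerpt), grep is enough to copy just those lines into a new file; the file names are illustrative:
Code:
grep '^Executed Restore' report.log > restores.txt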

7. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a very huge file (4GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. Sort, uniq, awk '!x++' are not working as they run out of buffer space. I don't know if this works: I want to read each line of the file in a For Loop, and want to... (16 Replies)
Discussion started by: krishnix
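For a file too large for in-memory de-duplication, GNU sort is one practical option: it performs an external, disk-based merge sort, so memory use stays bounded; -T points its temporary files at a filesystem with enough free space and -u drops the duplicates. The result is sorted rather than in the original order, and the file names are illustrative:
Code:
sort -u -T /var/tmp -o unique.txt huge_file.txt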

8. Shell Programming and Scripting

bash keep only duplicate lines in file

Hello all, in my bash script I have a file and I only want to keep the lines that appear twice in the file. Is there a way to do this? Thanks in advance! (4 Replies)
Discussion started by: vlm
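If "appear twice" means exactly twice, a two-pass awk can count occurrences on the first pass and print each qualifying line once on the second (a sketch, not necessarily the thread's answer); if it simply means "more than once", sort file | uniq -d is enough:
Code:
awk 'NR == FNR { count[$0]++; next } count[$0] == 2 && !printed[$0]++' file file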

9. UNIX for Dummies Questions & Answers

Duplicate lines in a file

I have a file with the following data A B C I would like to print it like this n times (for eg: 5 times) A B C A B C A B C A B C A (7 Replies)
Discussion started by: nsuresh316
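A minimal sketch of printing the whole file n times (here 5) by concatenating it in a loop; repeated.txt is an illustrative name:
Code:
for i in 1 2 3 4 5; do cat file; done > repeated.txt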

10. Shell Programming and Scripting

Remove duplicate lines from a file

Hi, I have a csv file which contains some millions of lines in it. The first line (Header) repeats at every 50000th line. I want to remove all the duplicate headers from the second occurrence onward (should not remove the first line). I don't want to use any pattern from the Header as I have some... (7 Replies)
Discussion started by: sudhakar T
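One way to do this without hard-coding anything from the header (a sketch, with illustrative file names): remember whatever the first line is, print it, and drop every later line that is identical to it:
Code:
awk 'NR == 1 { header = $0; print; next } $0 != header' input.csv > output.csv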
FDUPES(1)                      General Commands Manual                      FDUPES(1)

NAME
       fdupes - finds duplicate files in a given set of directories

SYNOPSIS
       fdupes [ options ] DIRECTORY ...

DESCRIPTION
       Searches the given path for duplicate files. Such files are found by comparing file sizes and MD5 signatures, followed by a byte-by-byte comparison.

OPTIONS
       -r --recurse
              for every directory given follow subdirectories encountered within
       -R --recurse:
              for each directory given after this option follow subdirectories encountered within (note the ':' at the end of the option; see the EXAMPLES section below for further explanation)
       -s --symlinks
              follow symlinked directories
       -H --hardlinks
              normally, when two or more files point to the same disk area they are treated as non-duplicates; this option will change this behavior
       -n --noempty
              exclude zero-length files from consideration
       -f --omitfirst
              omit the first file in each set of matches
       -A --nohidden
              exclude hidden files from consideration
       -1 --sameline
              list each set of matches on a single line
       -S --size
              show size of duplicate files
       -m --summarize
              summarize duplicate files information
       -q --quiet
              hide progress indicator
       -d --delete
              prompt user for files to preserve, deleting all others (see CAVEATS below)
       -N --noprompt
              when used together with --delete, preserve the first file in each set of duplicates and delete the others without prompting the user
       -v --version
              display fdupes version
       -h --help
              displays help

SEE ALSO
       md5sum(1)

NOTES
       Unless -1 or --sameline is specified, duplicate files are listed together in groups, each file displayed on a separate line. The groups are then separated from each other by blank lines.

       When -1 or --sameline is specified, spaces and backslash characters (\) appearing in a filename are preceded by a backslash character.

EXAMPLES
       fdupes a --recurse: b
              will follow subdirectories under b, but not those under a.

       fdupes a --recurse b
              will follow subdirectories under both a and b.

CAVEATS
       If fdupes returns with an error message such as "fdupes: error invoking md5sum", it means the program has been compiled to use an external program to calculate MD5 signatures (otherwise, fdupes uses internal routines for this purpose), and an error has occurred while attempting to execute it. If this is the case, the specified program should be properly installed prior to running fdupes.

       When using -d or --delete, care should be taken to insure against accidental data loss.

       When used together with options -s or --symlink, a user could accidentally preserve a symlink while deleting the file it points to.

       Furthermore, when specifying a particular directory more than once, all files within that directory will be listed as their own duplicates, leading to data loss should a user preserve a file without its "duplicate" (the file itself!).

AUTHOR
       Adrian Lopez <adrian2@caribe.net>
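As a usage sketch built only from the flags documented above (the directory is illustrative), a recursive scan that also reports the size of the duplicate files it finds:
Code:
fdupes -r -S ~/photos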