Full Discussion: report duplicate
UNIX for Dummies Questions & Answers: report duplicate
Post 302539481 by rdcwayx on Monday 18th of July 2011 12:48:01 AM
Quote:
Originally Posted by dennis.jacob
Try:

Code:
awk '!_[$1]++' filename 
The requester needs the duplicate records, not the file with the duplicates removed.
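For completeness, a minimal sketch of the opposite operation - printing only the records whose first field occurs more than once - could be a two-pass awk (filename is the same placeholder as in the quoted post):

Code:
# Pass 1 counts each first field; pass 2 prints only lines whose key occurs more than once.
awk 'NR==FNR {count[$1]++; next} count[$1] > 1' filename filename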
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicate

Hi all, I have a text file fileA.txt:
DXRV|02/28/2006 11:36:49.049|SAC||||CDxAcct=2420991350
DXRV|02/28/2006 11:37:06.404|SAC||||CDxAcct=6070970034
DXRV|02/28/2006 11:37:25.740|SAC||||CDxAcct=2420991350
DXRV|02/28/2006 11:38:32.633|SAC||||CDxAcct=6070970034
DXRV|02/28/2006... (2 Replies)
Discussion started by: sabercats
2 Replies
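Not part of that thread, but assuming the pipe-delimited layout shown and that the duplicate key is the CDxAcct value in the last field, one hedged sketch would be:

Code:
# Keep only the first line seen for each CDxAcct value (the last |-separated field).
awk -F'|' '!seen[$NF]++' fileA.txt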

2. HP-UX

Disk duplicate in 10.20

Hi: I know this topic already exists in this forum, but not exactly with my problem. I want to duplicate a disk; my source disk is about 2 GB, while the new disk is about 36 GB. The problem: when I use the dd command it fails, I think because of the disk sizes, and the sizes of the... (13 Replies)
Discussion started by: pmoren
13 Replies
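A hedged sketch of the usual approach, assuming both disks are visible on the same system and with the device names below as placeholders only (a 2 GB source copied onto a 36 GB target simply leaves the remaining space unused):

Code:
# Raw device-to-device copy; c0t0d0 and c1t0d0 are hypothetical device names.
# The block-size suffix may need adjusting for the local dd implementation.
dd if=/dev/rdsk/c0t0d0 of=/dev/rdsk/c1t0d0 bs=1024k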

3. UNIX for Dummies Questions & Answers

Report of duplicate files based on part of the filename

I have files logged in the file system with names in the format filename_ordernumber_date_time, e.g.:
file_1_12012007_1101.txt
file_2_12022007_1101.txt
file_1_12032007_1101.txt
I need to find all the files that are logged multiple times with the same order number. In the above example, I... (1 Reply)
Discussion started by: sudheshnaiyer
1 Reply
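Assuming every file follows the filename_ordernumber_date_time pattern, a hedged sketch that groups names by the second underscore-separated field is:

Code:
# Count each order number (field 2 when splitting on "_") and print those seen more than once.
ls | awk -F'_' '{count[$2]++; names[$2] = names[$2] " " $0}
    END {for (k in count) if (count[k] > 1) print k ":" names[k]}'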

4. Shell Programming and Scripting

duplicate directories

Hi, I have a file of user names, filename -> "readfile", with the following entries: peter john alaska abcd xyz, and I have the directory /var/. I want to cat "readfile" line by line, first reading peter into a variable, and also cross-check with /var/ how many directories are available... (8 Replies)
Discussion started by: learnbash
8 Replies
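Assuming readfile holds one name per line, a hedged sketch that counts matching directories under /var for each name could be:

Code:
# For each name in readfile, count directories directly under /var whose name contains it.
# The trailing / restricts the glob to directories; an unmatched pattern counts as 0.
while read name; do
    [ -z "$name" ] && continue
    n=$(ls -d /var/*"$name"*/ 2>/dev/null | wc -l)
    echo "$name: $n matching directories"
done < readfile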

5. Shell Programming and Scripting

Duplicate

I am looking for a way to delete duplicate entries in a VERY large file (approx 2 GB). However, I need to compare several fields before determining whether a record is a duplicate. I set up a hash in Perl but it does not seem to work correctly. Any help appreciated. Of the 19 comma-separated fields I... (2 Replies)
Discussion started by: Goyde
2 Replies
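Since the thread does not say which of the 19 fields define a duplicate, the field numbers below are placeholders; the same hash idea works in awk, and only the keys are held in memory, so a 2 GB file is normally manageable:

Code:
# Keep the first record for each combination of the chosen fields (2, 5 and 7 are placeholders).
awk -F',' '!seen[$2 FS $5 FS $7]++' bigfile.csv > deduped.csv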

6. Shell Programming and Scripting

Duplicate Line Report per Section

I've been working on a script (/bin/sh) for which I have requested and received help here (for which I am very grateful!). The client has modified their requirements (a tad), so without messing up the script too much, I come once again for assistance. Here are the file.dat contents: ABC1... (4 Replies)
Discussion started by: petersf
4 Replies
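The file layout is truncated above, so this is only a heavily hedged sketch: it assumes section headers start with "ABC" and that duplicates should be reported per section:

Code:
# Reset the "seen" table at each (assumed) section header, then report lines that repeat
# within the current section.
awk '/^ABC/ {for (k in seen) delete seen[k]; section = $0; next}
     seen[$0]++ {print "duplicate in " section ": " $0}' file.dat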

7. Shell Programming and Scripting

Find duplicate based on 'n' fields and mark the duplicate as 'D'

Hi, in a file, I have to mark duplicate records as 'D' and the latest record alone as 'C'. In the file below, I have to identify whether duplicate records exist based on Man_ID, Man_DT and Ship_ID, and I have to mark the record with the latest Ship_DT as "C" and the others as "D" (I have to create... (7 Replies)
Discussion started by: machomaddy
7 Replies
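A hedged sketch of the usual sort-then-flag approach; the field positions and the assumption that Ship_DT sorts lexicographically (e.g. YYYYMMDD) are placeholders, not the thread's actual layout:

Code:
# Assumed fields: 1=Man_ID, 2=Man_DT, 3=Ship_ID, 4=Ship_DT, pipe-delimited.
# Sort newest Ship_DT first within each key, then flag the first record "C" and the rest "D".
sort -t'|' -k1,1 -k2,2 -k3,3 -k4,4r file |
awk -F'|' '{key = $1 FS $2 FS $3; print $0 "|" (seen[key]++ ? "D" : "C")}'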

8. Shell Programming and Scripting

Duplicate value

Hi All, I have a file like:
ID|Indiv_ID
12345|10001
|10001
|10001
23456|10002
|10002
|10002
|10002
|10003
|10004
If Indiv_ID has duplicate values and the corresponding ID column is null, then copy the ID. I need output like:
ID|Indiv_ID
12345|10001... (11 Replies)
Discussion started by: bmk
11 Replies
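Assuming, as in the sample, that the populated ID always appears before the blank rows for the same Indiv_ID, a minimal sketch is:

Code:
# Remember the last non-empty ID seen for each Indiv_ID and fill it into rows where ID is blank.
awk -F'|' 'NR == 1 {print; next}
           {if ($1 != "") id[$2] = $1; print id[$2] "|" $2}' file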

9. Shell Programming and Scripting

Find duplicate values in specific column and delete all the duplicate values

Dear folks, I have a map file of around 54K lines, and some of the values in the second column are duplicated. I want to find them and delete every occurrence. I looked over duplicate-removal commands, but my case is not to keep one of the duplicate values; I want to remove all of the same... (4 Replies)
Discussion started by: sajmar
4 Replies
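Removing every line whose column-2 value is duplicated (keeping none of them) is a classic two-pass awk; mapfile is a placeholder name:

Code:
# Pass 1 counts each value in column 2; pass 2 keeps only lines whose value occurs exactly once.
awk 'NR==FNR {count[$2]++; next} count[$2] == 1' mapfile mapfile > mapfile.unique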

10. UNIX for Beginners Questions & Answers

Iterate through a list - checking for a duplicate then report it out

I have a job that produces a file of barcodes that gets added to every time the job runs. I want to check the list to see whether a barcode is already in the list and report it if it is. (3 Replies)
Discussion started by: worky
3 Replies
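A hedged sketch, with barcodes.txt standing in for the accumulated list:

Code:
# Report every barcode that appears more than once in the list.
sort barcodes.txt | uniq -d

# Or, streaming with awk: print a message the moment a barcode repeats.
awk 'seen[$0]++ {print "duplicate barcode: " $0}' barcodes.txt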
hardlink(1)						      General Commands Manual						       hardlink(1)

NAME
       hardlink - Consolidate duplicate files via hardlinks

SYNOPSIS
       hardlink [-c] [-n] [-v] [-vv] [-h] directory1 [ directory2 ... ]

DESCRIPTION
       This manual page documents hardlink, a program which consolidates duplicate files in one or more directories using hardlinks.

       hardlink traverses one or more directories searching for duplicate files. When it finds duplicate files, it uses one of them as the master. It then removes all other duplicates and places a hardlink for each one pointing to the master file. This allows for conservation of disk space where multiple directories on a single filesystem contain many duplicate files.

       Since hard links can only span a single filesystem, hardlink is only useful when all directories specified are on the same filesystem.

OPTIONS
       -c     Compare only the contents of the files being considered for consolidation. Disregards permission, ownership and other differences.

       -f     Force hardlinking across file systems.

       -n     Do not perform the consolidation; only print what would be changed.

       -v     Print summary after hardlinking.

       -vv    Print every hardlinked file and bytes saved. Also print summary after hardlinking.

       -h     Show help.

AUTHOR
       hardlink was written by Jakub Jelinek <jakub@redhat.com>. Man page written by Brian Long. Man page updated by Jindrich Novy <jnovy@redhat.com>.

BUGS
       hardlink assumes that its target directory trees do not change from under it. If a directory tree does change, this may result in hardlink accessing files and/or directories outside of the intended directory tree. Thus, you must avoid running hardlink on potentially changing directory trees, and especially on directory trees under control of another user.

                                                                                                                                       hardlink(1)
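The page above has no EXAMPLES section; a hedged pair of invocations (the /srv/backups path is only a placeholder) might look like this:

Code:
# Dry run: report what would be hardlinked without changing anything.
hardlink -n -vv /srv/backups

# Real pass: compare file contents only (ignore ownership/permission differences) and print a summary.
hardlink -c -v /srv/backups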