Top Forums Shell Programming and Scripting List Duplicate Post 302123852 by vakharia Mahesh on Wednesday 27th of June 2007 01:02:14 PM
List Duplicate

Hi All,
This is not a class assignment. I would like to know how to write an awk script that
lists all the duplicate names from a file. Have a look below:

Sl No   Name   Dt of birth   Location
1       aaa    1/01/1975     delhi
2       bbb    2/03/1977     mumbai
3       aaa    1/01/1976     mumbai
4       bbb    2/03/1975     chennai
5       aaa    1/01/1975     kolkatta
6       bbb    2/03/1977     bangalore

What I would like is: if the DOB is the same and the name is the same, then print all the details. I have tried the "uniq -D" command from within an awk script,
but could not succeed.
With thanks in advance for guidance!
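One possible approach (a sketch, not tested beyond the sample above; "data.txt" is a made-up file name for the table) is to read the file twice with awk: the first pass counts each name+DOB pair, the second prints every record whose pair occurs more than once.

```shell
# Sample data from the post ("data.txt" is a made-up file name).
cat > data.txt <<'EOF'
Sl No Name Dt of birth Location
1 aaa 1/01/1975 delhi
2 bbb 2/03/1977 mumbai
3 aaa 1/01/1976 mumbai
4 bbb 2/03/1975 chennai
5 aaa 1/01/1975 kolkatta
6 bbb 2/03/1977 bangalore
EOF

# Pass 1 (NR == FNR): count each name+DOB pair ($2 = Name, $3 = DOB).
# Pass 2: print data rows (FNR > 1 skips the header) whose pair repeats.
awk 'NR == FNR { count[$2, $3]++; next }
     FNR > 1 && count[$2, $3] > 1' data.txt data.txt
```

On the sample data this prints rows 1, 2, 5 and 6: the two aaa/1/01/1975 records and the two bbb/2/03/1977 records.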
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Get a non-duplicate list file

Dear sir, I got a file like the following format, with duplicate lines: AAA AAA AAA AAA AAA BBB BBB BBB BBB CCC CCC CCC CCC ... (5 Replies)
Discussion started by: trynew
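For this kind of file, one common way to get a duplicate-free copy without sorting (a sketch; "file.txt" is a placeholder name) is:

```shell
# Sample file with repeats ("file.txt" is a placeholder name).
printf 'AAA\nAAA\nBBB\nAAA\nCCC\nBBB\n' > file.txt

# Print each line only the first time it is seen: seen[$0]++ is 0
# (false) on the first occurrence, so !seen[$0]++ is true exactly once.
awk '!seen[$0]++' file.txt
```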

2. Shell Programming and Scripting

Removing duplicate files from list with different path

I have a list which contains all the jar files shipped with the product I am involved with. Now, in this list I have some jar files which appear again and again. But these jar files are present in different folders. My input file looks like this /path/1/to a.jar /path/2/to a.jar /path/1/to... (10 Replies)
Discussion started by: vino
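A sketch for a list like that one (assuming one path per line and that the jar name is the last "/"-separated component; "jars.list" and the paths are made-up names): split on "/" and keep only the first line seen for each base name.

```shell
# Hypothetical jar list, one path per line ("jars.list" is made up).
printf '/path/1/to/a.jar\n/path/2/to/a.jar\n/path/1/to/b.jar\n' > jars.list

# -F/ splits on "/", so $NF is the bare file name; print a path only
# the first time its base name is seen.
awk -F/ '!seen[$NF]++' jars.list
```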

3. Shell Programming and Scripting

Splitting a list @list by space delimiter so i can access it by using $list[0 ..1..2]

EDIT: This is for Perl. @data2 = grep(/$data/, @list_now); This gives me @data2 as: Printing data2 11 testzone1 running /zones/testzone1 ***-*****-****-*****-***** native shared. But I really can't access @data2 by its individual elements; $data2 is the entire list, while $data,2,3...... (1 Reply)
Discussion started by: shriyer

4. Shell Programming and Scripting

Find duplicate based on 'n' fields and mark the duplicate as 'D'

Hi, In a file, I have to mark duplicate records as 'D' and the latest record alone as 'C'. In the below file, I have to identify if duplicate records are there or not based on Man_ID, Man_DT, Ship_ID and I have to mark the record with latest Ship_DT as "C" and other as "D" (I have to create... (7 Replies)
Discussion started by: machomaddy

5. Shell Programming and Scripting

Duplicate value

Hi All, I have a file like: ID|Indiv_ID 12345|10001 |10001 |10001 23456|10002 |10002 |10002 |10002 |10003 |10004 If Indiv_ID has duplicate values and the corresponding ID column is null, then copy the ID. I need output like: ID|Indiv_ID 12345|10001... (11 Replies)
Discussion started by: bmk
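A fill-down sketch for that layout (untested against the real data; "file.txt" is a placeholder, and it assumes the non-blank ID always appears before its blank duplicates, as in the sample):

```shell
# Sample in the post's format ("file.txt" is a placeholder name).
cat > file.txt <<'EOF'
ID|Indiv_ID
12345|10001
|10001
23456|10002
|10002
|10003
EOF

# Remember the ID seen for each Indiv_ID; when the ID column is blank
# and a value is known for that Indiv_ID, fill it in.
awk -F'|' 'NR == 1 { print; next }
           $1 != "" { id[$2] = $1 }
           $1 == "" && ($2 in id) { $1 = id[$2] }
           { print $1 "|" $2 }' file.txt
```

Indiv_ID values that never carry an ID at all (10003 above) are left blank.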

6. Shell Programming and Scripting

Duplicate files and output list

Gents, I have a file like this: 1 1 1 2 2 3 2 4 2 5 3 6 3 7 4 8 5 9 I would like to get something like this: 1 1 2 2 3 4 5 3 6 7 Thanks in advance for your support. (8 Replies)
Discussion started by: jiam912
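Reading the sample as "group the column-2 values by column 1 and print only groups with more than one member", a sketch might be ("file.txt" is a placeholder name):

```shell
# The sample pairs ("file.txt" is a placeholder name).
printf '1 1\n1 2\n2 3\n2 4\n2 5\n3 6\n3 7\n4 8\n5 9\n' > file.txt

# Remember the first-seen order of keys, collect each key's values,
# then print only keys that occur more than once, with their values.
awk '!($1 in n) { order[++m] = $1 }
     { n[$1]++; vals[$1] = vals[$1] " " $2 }
     END { for (i = 1; i <= m; i++)
             if (n[order[i]] > 1) print order[i] vals[order[i]] }' file.txt
```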

7. Shell Programming and Scripting

Find and remove duplicate record and print list

Gents, I need to delete duplicate values and keep only unique values based on columns 2-27. We should always keep the last record found... I need to store one clean file and another with the duplicate values removed. Input : S3033.0 7305.01 0 420123.8... (18 Replies)
Discussion started by: jiam912

8. Shell Programming and Scripting

List duplicate files based on Name and size

Hello, I have a huge directory (with millions of files) and need to find out duplicates based on BOTH file name and File size. I know fdupes but it calculates MD5 which is very time-consuming and especially it takes forever as I have millions of files. Can anyone please suggest a script or... (7 Replies)
Discussion started by: prvnrk
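A checksum-free sketch for that case: first build a listing with one name/size/path line per file, for example with GNU find's -printf (GNU-specific; in practice something like find /huge/dir -type f -printf '%f\t%s\t%p\n' > listing.txt, where "/huge/dir" and "listing.txt" are made-up names), then keep only entries whose name+size pair repeats. The tab delimiter keeps file names containing spaces intact, though paths containing tabs or newlines would still break this.

```shell
# A tiny sample listing; in practice generate it with GNU find, e.g.
#   find /huge/dir -type f -printf '%f\t%s\t%p\n' > listing.txt
printf 'a.jar\t100\t/d1/a.jar\na.jar\t100\t/d2/a.jar\nb.jar\t100\t/d1/b.jar\na.jar\t200\t/d3/a.jar\n' > listing.txt

# Two passes over the listing: count each name+size pair, then print
# the entries whose pair occurred more than once.
awk -F'\t' 'NR == FNR { n[$1, $2]++; next } n[$1, $2] > 1' listing.txt listing.txt
```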

9. Shell Programming and Scripting

Find duplicate values in specific column and delete all the duplicate values

Dear folks, I have a map file of around 54K lines, and some of the values in the second column have the same value; I want to find them and delete all of the same values. I looked over deduplication commands, but my case is not to keep one of the duplicate values. I want to remove all of the same... (4 Replies)
Discussion started by: sajmar
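Since not even one copy of a duplicated value should survive, a two-pass sketch ("map.file" is a placeholder name) could count column-2 values first and then keep only rows whose value occurs exactly once:

```shell
# Toy map file; the duplicated column-2 value is 1 ("map.file" is made up).
printf 'a 1\nb 2\nc 1\nd 3\n' > map.file

# Pass 1: count each column-2 value. Pass 2: keep rows whose value
# appeared exactly once, so every copy of a duplicated value is removed.
awk 'NR == FNR { n[$2]++; next } n[$2] == 1' map.file map.file
```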

10. UNIX for Beginners Questions & Answers

Iterate through a list - checking for a duplicate then report it out

I have a job that produces a file of barcodes that gets added to every time the job runs. I want to check the list to see if the barcode is already in the list and report it out if it is. (3 Replies)
Discussion started by: worky
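A check-before-append sketch ("barcodes.txt", the $barcode variable and the sample codes are all made up for illustration):

```shell
# Existing list ("barcodes.txt" and the sample codes are made up).
printf 'ABC123\nXYZ789\n' > barcodes.txt

barcode="ABC123"
# grep -x matches the whole line only, -q just sets the exit status.
if grep -qx "$barcode" barcodes.txt; then
    echo "duplicate barcode: $barcode"
else
    echo "$barcode" >> barcodes.txt
fi
```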
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.