Posted by vijaykodukula on 04-23-2013
Remove Duplicates on multiple Key Columns and get the Latest Record from Date/Time Column

Hi Experts,

We have a CDC file from which we need to keep only the latest record for each combination of the key columns. The key columns are CDC_FLAG and SRC_PMTN_I, and "latest" is determined by the CDC_PRCS_TS timestamp.

Can we do this with a single awk command?
Please help.
Code:
CDC_PRCS_TS|CDC_SEQ_I|CDC_FLAG|CDC_UPDT_USER|SRC_PMTN_I|TGT_PMTN_I|PMTN_N|PMTN_DESC_T
2013-03-27 10:32:30|0|I|NOT SET   |124|215|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:31|0|I|NOT SET   |124|215|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:32|0|I|NOT SET   |124|215|PRMO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:33|0|I|NOT SET   |124|215|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:30|0|U|NOT SET   |125|216|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:31|0|U|NOT SET   |125|216|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:32|0|U|NOT SET   |125|216|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:33|0|U|NOT SET   |125|216|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:30|0|U|NOT SET   |125|216|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:30|0|I|NOT SET   |126|217|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:30|0|U|NOT SET   |127|768|MDS PROJECT|UAT PHASE
2013-03-27 10:32:30|0|U|NOT SET   |128|454|MDS PROJECT|UAT PHASE
2013-03-27 10:32:30|0|U|NOT SET   |129|234|PROJECT|UAT PHASE
2013-03-27 10:32:30|0|D|NOT SET   |130|123|PROMO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:30|0|D|NOT SET   |131|212|PROO-2:PMI PROJECT|INITIAL PHASE
2013-03-27 10:32:30|0|D|NOT SET   |132|213|PROO-2:PMI PROJECT|INITIAL PHASE
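
A minimal awk sketch of one way to do it (assuming the first line is a header, that the ISO timestamps in CDC_PRCS_TS sort correctly as plain strings, and that file.txt stands in for the real file; the output order of the surviving rows is not guaranteed):
Code:
awk -F'|' '
NR == 1 { print; next }                 # pass the header through
{
    k = $3 SUBSEP $5                    # composite key: CDC_FLAG + SRC_PMTN_I
    if (!(k in ts) || $1 > ts[k]) {     # keep the row with the newest CDC_PRCS_TS
        ts[k] = $1
        row[k] = $0
    }
}
END { for (k in row) print row[k] }
' file.txt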


 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicates based on the two key columns

Hi All, I need to fetch unique records based on a key column (i.e., the first column), keep the record with the maximum value in column2, in sorted order... and the duplicates have to be stored in another output file. Input: Input.txt 1234,0,x 1234,1,y 5678,10,z 9999,10,k... (7 Replies)
Discussion started by: kmsekhar
7 Replies
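
A rough awk sketch of that pattern (assuming comma-separated input; duplicates.txt and the plain sort of the winners are placeholders for whatever the poster actually wanted):
Code:
awk -F',' '
seen[$1]++ { print > "duplicates.txt" }        # every repeat of a key goes to a side file
!($1 in max) || $2 + 0 > max[$1] + 0 {         # first sighting, or a bigger column2
    max[$1] = $2
    best[$1] = $0
}
END { for (k in best) print best[k] | "sort" } # unique winners, sorted
' Input.txt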

2. Shell Programming and Scripting

Search based on 1,2,4,5 columns and remove duplicates in the same file.

Hi, I am unable to find the duplicates in a file based on the 1st, 2nd, 4th, and 5th columns and remove those duplicates in the same file. Source filename: Filename.csv "1","ccc","information","5000","temp","concept","new" "1","ddd","information","6000","temp","concept","new"... (2 Replies)
Discussion started by: onesuri
2 Replies
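
The classic awk idiom for this, keeping the first occurrence of each (col1, col2, col4, col5) combination (assuming the field-separating commas never occur inside the quoted values; deduped.csv is a placeholder, since awk cannot safely overwrite its own input):
Code:
awk -F',' '!seen[$1 FS $2 FS $4 FS $5]++' Filename.csv > deduped.csv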

3. Shell Programming and Scripting

need to remove duplicates based on key in first column and pattern in last column

Given a file such as this, I need to remove the duplicates. 00060011 PAUL BOWSTEIN ad_waq3_921_20100826_010517.txt 00060011 PAUL BOWSTEIN ad_waq3_921_20100827_010528.txt 0624-01 RUT CORPORATION ad_sade3_10_20100827_010528.txt 0624-01 RUT CORPORATION ... (13 Replies)
Discussion started by: script_op2a
13 Replies
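
A hedged sketch, on the assumption that the intent is to keep the newest file per first-column key and that the last field always ends in ..._YYYYMMDD_HHMMSS.txt:
Code:
awk '{
    n = split($NF, p, "_")              # e.g. ad_waq3_921_20100826_010517.txt
    stamp = p[n-1] p[n]                 # date + time pieces compare as strings
    if (stamp > latest[$1]) { latest[$1] = stamp; line[$1] = $0 }
}
END { for (k in line) print line[k] }' input.txt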

4. Shell Programming and Scripting

finding duplicates in csv based on key columns

Hi team, I have a 20-column CSV file. I want to find the duplicates in that file based on column1, column10, column4, column6, column8, and column2. If those columns have the same values, then it is a duplicate record. Can someone help me find the duplicates? Thanks in advance. ... (2 Replies)
Discussion started by: baskivs
2 Replies
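
A minimal sketch (assuming a simple comma delimiter with no quoted commas); it prints the second and later copies, i.e. the duplicate rows:
Code:
awk -F',' 'seen[$1 FS $10 FS $4 FS $6 FS $8 FS $2]++' file.csv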

5. Shell Programming and Scripting

Removing duplicates in fixed width file which has multiple key columns

Hi All, I have a requirement to remove duplicates from a fixed-width file which has multiple key columns, and I also need to capture the duplicate records in another file. The file has 8 columns. The key columns are col1 and col2; col1 has a length of 8 and col2 has a length of 3. ... (5 Replies)
Discussion started by: saj
5 Replies
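
A sketch for the fixed-width case, assuming col2 starts immediately after col1, so the key is the first 11 characters of each line (the file names are placeholders):
Code:
awk '{
    key = substr($0, 1, 11)                    # col1 (8 chars) + col2 (3 chars)
    if (seen[key]++) print > "duplicates.txt"  # repeats go to the capture file
    else print                                 # first occurrence is kept
}' fixedwidth.txt > unique.txt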

6. Shell Programming and Scripting

Remove the time from the date column

Hi, I have a file named file1.txt with the below contents: cat file1.txt 1/29/2014 0:00,706886 1/30/2014 0:00,791265 1/31/2014 0:00,987087 2/1/2014 0:00,1098572 2/2/2014 0:00,572477 2/3/2014 0:00,701715 I want to display it as below: 1/29/2014,706886 1/30/2014,791265 1/31/2014,987087... (5 Replies)
Discussion started by: villain41
5 Replies
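
Given that sample, a one-line awk sketch suffices (assuming the time always sits between a space and the comma):
Code:
awk '{ sub(/ [0-9]*:[0-9]*,/, ",") } 1' file1.txt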

7. UNIX for Dummies Questions & Answers

Display latest record from file based on multiple columns combination

I have a requirement to print the latest record from a file based on a combination of multiple columns. EWAPE EW1SLE0000 EW1SOMU01 ABORTED 03/16/2015 100004 03/16/2015 100005 001 EWAPE EW1SLE0000 EW1SOMU01 ABORTED 03/18/2015 140003 03/18/2015 140004 001 EWAPE EW1SLE0000 EW1SOMU01 ABORTED 03/18/2015 220006... (1 Reply)
Discussion started by: tmalik79
1 Reply
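
A sketch, under the assumptions that the first three columns form the key and that columns 5 and 6 hold the relevant MM/DD/YYYY date and time:
Code:
awk '{
    split($5, d, "/")                   # MM/DD/YYYY -> YYYYMMDD for comparing
    stamp = d[3] d[1] d[2] $6
    key = $1 FS $2 FS $3
    if (stamp > last[key]) { last[key] = stamp; line[key] = $0 }
}
END { for (k in line) print line[k] }' input.txt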

8. UNIX for Beginners Questions & Answers

Sort and remove duplicates in directory based on first 5 columns:

I have a /tmp dir with filenames such as: 010020001_S-FOR-Sort-SYEXC_20160229_2212101.marker 010020001_S-FOR-Sort-SYEXC_20160229_2212102.marker 010020001-S-XOR-Sort-SYEXC_20160229_2212104.marker 010020001-S-XOR-Sort-SYEXC_20160229_2212105.marker 010020001_S-ZOR-Sort-SYEXC_20160229_2212106.marker... (4 Replies)
Discussion started by: gnnsprapa
4 Replies
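
A guess at a sketch, assuming "columns" means the first five fields of each file name when split on either - or _, and that the first name in sorted order is the one to keep:
Code:
cd /tmp
printf '%s\n' *.marker | sort |
awk -F'[-_]' '!seen[$1 FS $2 FS $3 FS $4 FS $5]++'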

9. Shell Programming and Scripting

awk to Sum columns when other column has duplicates and append one column value to another with Care

Hi Experts, please bear with me, I need help. I am learning awk and am stuck on one issue. First point: I want to sum up the values of columns 7, 9, 11, 13, and 15 if the rows in column 5 are duplicates. No action is to be taken for rows where the value in column 5 is unique. Second point: For... (1 Reply)
Discussion started by: as7951
1 Reply
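
A two-pass awk sketch covering the first point only (the second point is cut off above); it reads the same file twice, first to collect counts and sums, then to emit one summed row per duplicate group:
Code:
awk '
NR == FNR {                             # pass 1: count keys and collect sums
    cnt[$5]++
    s7[$5] += $7; s9[$5] += $9; s11[$5] += $11
    s13[$5] += $13; s15[$5] += $15
    next
}
cnt[$5] == 1                            # pass 2: unique rows pass through untouched
cnt[$5] > 1 && !done[$5]++ {            # first of each duplicate group carries the sums
    $7 = s7[$5]; $9 = s9[$5]; $11 = s11[$5]
    $13 = s13[$5]; $15 = s15[$5]
    print
}' file file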

10. UNIX for Beginners Questions & Answers

Remove duplicates in a dataframe (table) keeping all the different cells of just one of the columns

Hello all, I need to filter a dataframe composed of several columns of data to remove the duplicates according to one of the columns. I did it with pandas. In the meantime, I need the last column, which contains all the different (not redundant) data, to be conserved in the output, like this: A ... (5 Replies)
Discussion started by: pedro88
5 Replies
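
Roughly the same effect can be sketched in awk (assuming tab-separated input and, hypothetically, that the first column is the dedup key; the distinct values of the last column are merged with commas):
Code:
awk 'BEGIN { FS = OFS = "\t" }
{
    k = $1
    if (!(k in row)) { order[++n] = k; row[k] = $0; vals[k] = $NF }
    else if (index("," vals[k] ",", "," $NF ",") == 0)
        vals[k] = vals[k] "," $NF       # collect each new distinct value
}
END {
    for (i = 1; i <= n; i++) {
        k = order[i]
        $0 = row[k]; $NF = vals[k]      # rebuild the kept row with merged values
        print
    }
}' table.txt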