Shell Programming and Scripting: Delete a row that has a duplicate column
Post 302321059 by ks_reddy, 05-30-2009, 01:55:20 AM
Hi Friends,

Can anybody modify the above script, awk 'a !~ $1; {a=$1}' test.log, so that it keeps only the last repeated entry and deletes all the previous duplicates?
For example, if the input file is
1 2 3 4
2 2 4 5
the value in column 2 repeats, so I want the output to be 2 2 4 5, not 1 2 3 4.
Thanks in advance.
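One way to do this (a minimal sketch, not a tested answer to the poster's full data): assuming the duplicates are judged on the second column, as in the example, and that GNU tac is available, reverse the file, keep the first occurrence of each key, then reverse again so only the last occurrence survives.

# keep only the last line for each value in column 2 (key column assumed from the example)
tac test.log | awk '!seen[$2]++' | tac

# pure-awk alternative: the first pass remembers the last line number per key,
# the second pass prints only those lines
awk 'NR == FNR { last[$2] = NR; next } FNR == last[$2]' test.log test.log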
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Delete first row last column

Hi All, I have the following file and I want to delete the last column of the first row. Current File Content: ================ procedure test421 put_line procedure test321 test421 procedure test521 test321 procedure test621 test521 Expected File Content: =========================== procedure... (3 Replies)
Discussion started by: susau_79
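If the intent here is to drop the last field of the first line only (my reading of the truncated excerpt), a hedged sed sketch:

# strip the last whitespace-separated field from line 1 only
sed '1 s/[[:space:]]*[^[:space:]]*$//' file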

2. Shell Programming and Scripting

Find and replace duplicate column values in a row

I have a file which has 12 columns and values like this: 1,2,3,4,5 a,b,c,d,e b,c,a,e,f a,b,e,a,h. If you see, the first column has duplicate values; I need to identify (print to the console) the duplicate value (which is 'a') and also remove duplicate values like below. I could be in two... (5 Replies)
Discussion started by: nuthalapati
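Assuming the goal is to report each value that repeats in the first comma-separated column and keep only its first occurrence (the excerpt is truncated, so this is a guess at the requirement), one sketch:

# report repeated first-column values on stderr, keep only their first occurrence
awk -F',' 'seen[$1]++ { print "duplicate: " $1 > "/dev/stderr"; next } 1' file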

3. Shell Programming and Scripting

duplicate row based on single column

I am a newbie to shell scripting. I have a .csv file with some 1000 rows and about 7 columns, but before I insert this data into a table I have to parse it and clean it, based on the value of the first column, which is a string of phone-number type... example below: column 1 ... (2 Replies)
Discussion started by: mitr

4. Shell Programming and Scripting

Delete row if a particular column has more than three characters in it

Hi, I have data like hw:dsfnsmdf:39843 chr2 76219829 51M atatata 51 872389 hw:dsfnsmdf:39853 chr2 76219839 51M65T atatata 51 872389 hw:dsfnsmdf:39863 chr2 76219849 51M atatata 51 872389 hw:dsfnsmdf:39873 chr2 ... (3 Replies)
Discussion started by: bhargavpbk88
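Assuming the field of interest is the fourth whitespace-separated column (51M versus 51M65T in the sample), a sketch that keeps only the rows where that field is at most three characters long:

# drop rows whose fourth column is longer than three characters
awk 'length($4) <= 3' file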

5. Shell Programming and Scripting

delete a row with a specific value at a certain column

Hi, I want to delete rows that have 0 at column 4. The file looks like this: chr01 13 61 2 chr01 65 153 0 chr01 157 309 1 chr01 313 309 0 chr01 317 469 1 chr01 473 557 0 I want to delete all rows with a 0 at column 4 chr01 13 61 2 chr01 157 309 1 chr01 ... (3 Replies)
Discussion started by: kylle345
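A minimal awk sketch for this one: print only the rows whose fourth column is not zero.

awk '$4 != 0' file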

6. Shell Programming and Scripting

Delete a row if either column value is zero

Hi, My input file is this way: 1.1 0.0 2.4 3.5 7.9 1.8 22.3 4.7 8.9 0.9 1.3 0.0 3.4 5.6 0.0 1.1 2.2 0.0 0.0 1.1 0.0 0.0 3.4 5.6 I would like to delete the entire row if either the 2nd or 3rd column is 0.0. Please note that my values are all decimal values. So, my output would... (4 Replies)
Discussion started by: jacobs.smith
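Since awk compares these fields numerically, 0.0 tests equal to 0, so a short sketch that keeps only the rows where both the second and third columns are non-zero:

awk '$2 != 0 && $3 != 0' file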

7. UNIX for Dummies Questions & Answers

awk to sum column field from duplicate row/lines

Hello, I am new to the Linux environment. I am working on a Linux script which should send an auto email based on a specific condition from a log file. Below is the sample log file: Name m/c usage abc xxx 10 abc xxx 20 abc xxx 5 xyz ... (6 Replies)
Discussion started by: asjaiswal
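Assuming the name is in column 1 and the usage in column 3, with a single header line to skip (the column layout is taken from the sample; the file name is a placeholder), a sketch that totals usage per name:

# sum column 3 per name, skipping the header line
awk 'NR > 1 { sum[$1] += $3 } END { for (name in sum) print name, sum[name] }' logfile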

8. Shell Programming and Scripting

Delete duplicate row

Hi all, how can I delete duplicate rows in a file? e.g. $cat file1 aaa 123 234 345 456 bbb 345 345 657 568 ccc 345 768 897 456 aaa 123 234 345 456 ddd 786 784 234 263 ccc 345 768 897 456 aaa 123 234 345 456 ccc 345 768 897 456 then I need the output file1 some, (4 Replies)
Discussion started by: aav1307
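The classic awk idiom for this keeps the first occurrence of every line and silently drops later repeats:

awk '!seen[$0]++' file1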

9. Shell Programming and Scripting

Delete duplicate row based on criteria

Hi, I have an input file as shown below: 20140102;13:30;FR-AUD-LIBOR-1W;2.495 20140103;13:30;FR-AUD-LIBOR-1W;2.475 20140106;13:30;FR-AUD-LIBOR-1W;2.495 20140107;13:30;FR-AUD-LIBOR-1W;2.475 20140108;13:30;FR-AUD-LIBOR-1W;2.475 20140109;13:30;FR-AUD-LIBOR-1W;2.475... (2 Replies)
Discussion started by: shash

10. Shell Programming and Scripting

Find duplicate values in specific column and delete all the duplicate values

Dear folks, I have a map file of around 54K lines, and some of the values in the second column are the same. I want to find them and delete all of them. I looked over duplicate commands, but in my case I do not want to keep one of the duplicate values. I want to remove all of the same... (4 Replies)
Discussion started by: sajmar
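Because no copy of a repeated value should survive, two passes are needed: the first counts occurrences of the second column, the second prints only the rows whose key occurred exactly once (mapfile stands for the actual file name; whitespace-separated fields assumed):

awk 'NR == FNR { count[$2]++; next } count[$2] == 1' mapfile mapfile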
MSGUNIQ(1)                              GNU                              MSGUNIQ(1)

NAME
       msguniq - unify duplicate translations in message catalog

SYNOPSIS
       msguniq [OPTION] [INPUTFILE]

DESCRIPTION
       Unifies duplicate translations in a translation catalog. Finds duplicate
       translations of the same message ID. Such duplicates are invalid input for
       other programs like msgfmt, msgmerge or msgcat. By default, duplicates are
       merged together. When using the --repeated option, only duplicates are
       output, and all other messages are discarded. Comments and extracted
       comments will be cumulated, except that if --use-first is specified, they
       will be taken from the first translation. File positions will be cumulated.
       When using the --unique option, duplicates are discarded.

       Mandatory arguments to long options are mandatory for short options too.

   Input file location:
       INPUTFILE                    input PO file
       -D, --directory=DIRECTORY    add DIRECTORY to list for input files search

       If no input file is given or if it is -, standard input is read.

   Output file location:
       -o, --output-file=FILE       write output to specified file

       The results are written to standard output if no output file is specified
       or if it is -.

   Message selection:
       -d, --repeated               print only duplicates
       -u, --unique                 print only unique messages, discard duplicates

   Input file syntax:
       -P, --properties-input       input file is in Java .properties syntax
       --stringtable-input          input file is in NeXTstep/GNUstep .strings syntax

   Output details:
       -t, --to-code=NAME           encoding for output
       --use-first                  use first available translation for each message,
                                    don't merge several translations
       -e, --no-escape              do not use C escapes in output (default)
       -E, --escape                 use C escapes in output, no extended chars
       --force-po                   write PO file even if empty
       -i, --indent                 write the .po file using indented style
       --no-location                do not write '#: filename:line' lines
       -n, --add-location           generate '#: filename:line' lines (default)
       --strict                     write out strict Uniforum conforming .po file
       -p, --properties-output      write out a Java .properties file
       --stringtable-output         write out a NeXTstep/GNUstep .strings file
       -w, --width=NUMBER           set output page width
       --no-wrap                    do not break long message lines, longer than the
                                    output page width, into several lines
       -s, --sort-output            generate sorted output
       -F, --sort-by-file           sort output by file location

   Informative output:
       -h, --help                   display this help and exit
       -V, --version                output version information and exit

AUTHOR
       Written by Bruno Haible.

REPORTING BUGS
       Report bugs to <bug-gnu-gettext@gnu.org>.

COPYRIGHT
       Copyright (C) 2001-2007 Free Software Foundation, Inc. License GPLv3+:
       GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
       This is free software: you are free to change and redistribute it.
       There is NO WARRANTY, to the extent permitted by law.

SEE ALSO
       The full documentation for msguniq is maintained as a Texinfo manual. If the
       info and msguniq programs are properly installed at your site, the command
       info msguniq should give you access to the complete manual.

GNU gettext-tools 0.17                November 2007                       MSGUNIQ(1)