07-23-2013
Thanks,
It's working... But just out of curiosity, is there any other way of doing it, comparing only the 2nd, 3rd and 4th fields as a key to find duplicates in a file? It would be great if there is one in Unix.
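One common sketch for exactly this (assuming whitespace-separated fields; pass -F',' or similar to awk if the real separator differs) keeps only the first line seen for each 2nd/3rd/4th-field key:

```shell
# Sample data: lines 1 and 2 share fields 2-4 and so count as duplicates.
printf '%s\n' 'a x y z' 'b x y z' 'c p q r' > key_data.txt

# Keep only the first line for each unique (2nd,3rd,4th-field) key.
awk '!seen[$2, $3, $4]++' key_data.txt
```

Dropping the `!` (i.e. `awk 'seen[$2, $3, $4]++'`) prints only the duplicate lines instead.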
10 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
I have a file with some 1000 entries; it will contain entries like:
1000,ram
2000,pankaj
1001,rahim
1000,ram
2532,govind
2000,pankaj
3000,venkat
2532,govind
What I want is to extract only the distinct rows from this file,
so my output should contain only
1000,ram... (2 Replies)
Discussion started by: trichyselva
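For the record, a single awk idiom covers this: print each line only the first time it is seen, preserving the original order.

```shell
# Reproduce the sample file from the post.
printf '%s\n' 1000,ram 2000,pankaj 1001,rahim 1000,ram 2532,govind 2000,pankaj 3000,venkat 2532,govind > entries.txt

# Print each line only on its first occurrence (keeps input order).
awk '!seen[$0]++' entries.txt
```

If the output order does not matter, `sort -u entries.txt` does the same job.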
2. Shell Programming and Scripting
I have data like this:
It's sorted by the 2nd field (TID).
envoy,90000000000000634600010001,04/11/2008,23:19:27,RB00266,0015,DETAIL,ERROR,
envoy,90000000000000634600010001,04/12/2008,04:23:45,RB00266,0015,DETAIL,ERROR,... (1 Reply)
Discussion started by: kinksville
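Since the data is already sorted by TID, one sketch (assuming the 2nd comma-separated field is the TID and that keeping the first record per TID is what is wanted) is:

```shell
# Two records with the same TID (2nd comma field); keep only the first.
printf '%s\n' \
  'envoy,90000000000000634600010001,04/11/2008,23:19:27,RB00266,0015,DETAIL,ERROR,' \
  'envoy,90000000000000634600010001,04/12/2008,04:23:45,RB00266,0015,DETAIL,ERROR,' > log.csv

# Print the first record seen for each TID value.
awk -F, '!seen[$2]++' log.csv
```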
3. UNIX for Dummies Questions & Answers
Hey all,
I need some help.
I have a text file with names in it.
My target is that if a particular pattern exists in that file more than once, then I want to rename all the occurrences of that pattern with alternate patterns.
For e.g. if I have PATTERN occurring 5 times then I want to... (3 Replies)
Discussion started by: ashisharora
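One way to sketch this in awk, where PATTERN and the PATTERN_n replacement names are placeholders for whatever the real pattern and alternates are: number each occurrence with a file-wide counter.

```shell
# Sample file: PATTERN occurs three times in total.
printf '%s\n' 'foo PATTERN bar' 'PATTERN and PATTERN again' > names.txt

# Replace the i-th occurrence of PATTERN (across the whole file)
# with PATTERN_i, consuming the line left to right so the
# replacement text is never rescanned.
awk '{
  out = ""
  while (match($0, /PATTERN/)) {
    out = out substr($0, 1, RSTART - 1) "PATTERN_" (++n)
    $0 = substr($0, RSTART + RLENGTH)
  }
  print out $0
}' names.txt
```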
4. Shell Programming and Scripting
I have a log file with posts looking like this:
--
Messages can be delivered by different systems at different times. The id number is used to sort out duplicate messages. What I need is to strip the arrival time from each post, sort posts by id number, and reattach arrival time to respective... (2 Replies)
Discussion started by: Ilja
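The post is truncated, so this is only a hedged sketch of the usual decorate-sort-undecorate idea, with an entirely made-up "time id message" layout standing in for the real log format:

```shell
# Hypothetical post format: "arrival_time id message", one post per line.
printf '%s\n' '23:19 102 hello' '23:20 101 world' '23:21 102 hello' > posts.txt

# Sort by the id column (field 2), breaking ties by arrival time, then
# keep only the first post per id; the arrival time rides along unchanged.
sort -k2,2 -k1,1 posts.txt | awk '!seen[$2]++'
```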
5. Shell Programming and Scripting
Hi Experts,
Please check the following new requirement. I got data like the following in a file.
FILE_HEADER
01cbbfde7898410| 3477945| home| 1
01cbc275d2c122| 3478234| WORK| 1
01cbbe4362743da| 3496386| Rich Spare| 1
01cbc275d2c122| 3478234| WORK| 1
This is a pipe-separated file with... (3 Replies)
Discussion started by: tinufarid
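A sketch for the sample shown: print the header line unconditionally, then drop exact duplicate records.

```shell
# Rebuild the sample: a header plus pipe-separated records, one duplicated.
{
  echo 'FILE_HEADER'
  echo '01cbbfde7898410| 3477945| home| 1'
  echo '01cbc275d2c122| 3478234| WORK| 1'
  echo '01cbbe4362743da| 3496386| Rich Spare| 1'
  echo '01cbc275d2c122| 3478234| WORK| 1'
} > data.psv

# Line 1 (the header) always prints; later lines print only once each.
awk 'NR == 1 || !seen[$0]++' data.psv
```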
6. Shell Programming and Scripting
Hi,
I have a file that I want to change the format of. It is a large file in rows but I want it to be comma separated (comma then a space).
The current file looks like this:
HI, Joe, Bob, Jack, Jack
After I would want to remove any duplicates so it would look like this:
HI, Joe,... (2 Replies)
Discussion started by: kylle345
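One possible sketch with standard tools: dedupe the rows first, then join what remains with a comma and a space.

```shell
# One word per row, with a duplicate at the end.
printf '%s\n' HI Joe Bob Jack Jack > words.txt

# Drop duplicate rows, join with commas, then pad each comma with a space.
awk '!seen[$0]++' words.txt | paste -sd, - | sed 's/,/, /g'
```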
7. Shell Programming and Scripting
Hi all,
I am working with a huge amount of files in a Linux environment and I was trying to filter my data. Here's what my data looks like
Name............................Size
OLUSDN.gf.gif-1.JPEG.......5 kb
LKJFDA01.gf.gif-1.JPEG.....3 kb
LKJFDA01.gf.gif-2.JPEG.....1 kb... (7 Replies)
Discussion started by: Error404
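The post is truncated, so this is only a guess at the goal: a two-pass awk that prints the names whose base (the part before the first dot) occurs more than once.

```shell
# Sample file names; the text before the first "." is the base name.
printf '%s\n' OLUSDN.gf.gif-1.JPEG LKJFDA01.gf.gif-1.JPEG LKJFDA01.gf.gif-2.JPEG > files.txt

# Pass 1 counts each base name; pass 2 prints only the repeated ones.
awk -F. 'NR == FNR { count[$1]++; next } count[$1] > 1' files.txt files.txt
```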
8. UNIX for Dummies Questions & Answers
I have been using grep to output whole lines using a pattern file with identifiers (fileA):
fig|562.2322.peg.1
fig|562.2322.peg.3
fig|562.2322.peg.3
fig|562.2322.peg.3
fig|562.2322.peg.7
From fileB with corresponding identifiers in the second column:
NODE_0 fig|562.2322.peg.1 peg ... (2 Replies)
Discussion started by: Mauve
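A sketch for this case: dedupe fileA before grepping, and pass -F so the "|" in the identifiers is matched literally rather than as regex alternation.

```shell
# fileA: identifiers with repeats; fileB: lines carrying identifiers.
printf '%s\n' 'fig|562.2322.peg.1' 'fig|562.2322.peg.3' 'fig|562.2322.peg.3' > fileA
printf '%s\n' 'NODE_0 fig|562.2322.peg.1 peg' 'NODE_1 fig|562.2322.peg.3 peg' 'NODE_2 fig|562.2322.peg.7 peg' > fileB

# sort -u removes the duplicate patterns; grep -F takes them as
# fixed strings (via -f -, i.e. from stdin) so "|" and "." stay literal.
sort -u fileA | grep -F -f - fileB
```

Note these are plain substring matches, so a pattern like peg.1 would also hit peg.10; `grep -w` (where supported) or anchored patterns can tighten that.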
9. Shell Programming and Scripting
I have two files like
I want to remove/delete all the duplicate lines in file2, viz. unix, unix2, unix3 (2 Replies)
Discussion started by: sagar_1986
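Assuming the duplicates are whole-line matches against file1, a common sketch is grep with fixed-string, exact-line, inverted matching:

```shell
printf '%s\n' unix unix2 unix3 > file1
printf '%s\n' unix unix2 unix3 linux solaris > file2

# -v invert match, -x whole-line match, -F fixed strings,
# -f read the patterns from file1.
grep -vxFf file1 file2
```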
10. Shell Programming and Scripting
I have two files like
I want to remove/delete all the duplicate lines in file2, viz. unix, unix2, unix3. I have tried the previous post also, but there the complete line must be similar. In this case I have to verify the first column only, regardless of the content in the succeeding columns. (3 Replies)
Discussion started by: sagar_1986
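A hedged sketch for the first-column-only variant (the separator is assumed to be a comma; adjust -F to match the real files): load file1's first column into an array, then print only the file2 lines whose first column is absent from it.

```shell
printf '%s\n' 'unix,1' 'unix2,2' 'unix3,3' > file1
printf '%s\n' 'unix,9' 'unix2,8' 'linux,7' 'solaris,6' > file2

# NR==FNR is true only while reading file1: record its first column.
# For file2, print lines whose first column was never seen in file1.
awk -F, 'NR == FNR { a[$1]; next } !($1 in a)' file1 file2
```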
LEARN ABOUT DEBIAN
vk_logmerge
VK_LOGMERGE(1) General Commands Manual VK_LOGMERGE(1)
NAME
vk_logmerge - a Valgrind XML log file merger
SYNOPSIS
vk_logmerge [flags and input files in any order]
DESCRIPTION
vk_logmerge is a valkyrie(1) helper. Given multiple log files (in XML format) generated by multiple runs on a parallel machine, or multiple
log files generated by sequential runs on a single-processor machine, for the same binary, vk_logmerge merges the log files together,
summing the counts of duplicates, and outputs the result to a single file. As input, vk_logmerge expects the log files to be merged and/or
a file containing the list of log files to be merged, with each entry on a separate line.
Log files can be merged from within valkyrie(1), or you can invoke vk_logmerge directly.
OPTIONS
-h Show help message
-v Be verbose (more -v's give more)
-t Output plain text (non-xml)
-f <log_list>
Obtain input files from <log_list> file (one per line)
-o <writefile>
File to write output to
At least 1 input file must be given.
If no '-o outfile' is given, writes to standard output.
EXAMPLES
vk_logmerge log1.xml -f loglist.fls -o merged.xml
SEE ALSO
valkyrie(1), valgrind(1).
AUTHOR
vk_logmerge was written by Donna Robinson, Cerion Armour-Brown and others.
This manual page was written by Hai Zaar <haizaar@haizaar.com>, for the Debian project (but may be used by others).
2009-05-02 VK_LOGMERGE(1)