Full Discussion: Filtering duplicate lines
UNIX for Advanced & Expert Users
Post 14991 by AreaMan, Friday 8 February 2002, 12:51:52 PM
Thanks, that does almost what I want. However, is there a way I can do it while preserving the original order of the data?
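For anyone landing on this thread later: the usual order-preserving answer is a one-pass awk filter that remembers which lines it has already printed. A minimal sketch, assuming the input is in data.txt:

    # seen[$0]++ is 0 (false) the first time a line appears, so the line is
    # printed exactly once, in its original position; later copies are skipped.
    awk '!seen[$0]++' data.txt > deduped.txt

Unlike sort | uniq, this keeps the original line order; the cost is holding one copy of every distinct line in memory.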
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Issues with filtering duplicate records using gawk script

Hi All, I have a huge trade file with millions of trades. I need to remove duplicate records (e.g., I have the following records) 30/10/2009,tradeId1,..,.. 26/10/2009,tradeId1,..,..,, 30/10/2009,tradeId2,.. In the above case I need to filter the duplicate records and should get the following output.... (2 Replies)
Discussion started by: nmumbarkar
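A sketch in the spirit of the replies in that thread: if the trade ID is the second comma-separated field, a gawk associative array can keep only the first record seen per ID (the filename trades.csv and the field number are assumptions here):

    # -F, splits on commas; seen[$2]++ is zero only on a trade ID's first appearance
    gawk -F, '!seen[$2]++' trades.csv > trades_dedup.csv

If "first" should actually mean "most recent date", sort the file on the date field (newest first) before piping it through the filter.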

2. UNIX for Dummies Questions & Answers

Filtering similar lines in a big list

I received this question for homework: we have to write our program in a .sh file, with "#!/bin/bash" as the first line, and we have a list of access logs in a file looking like this (it's nearly 10,000 lines long): 65.214.44.112 - - "GET /~user0/cgg/msg08400.html HTTP/1.0" 304 -... (1 Reply)
Discussion started by: Andrew9191
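The assignment text is cut off above, but a common first step with an access log like that is counting requests per client IP, which fits inside the required .sh wrapper. A sketch under that assumption, with access.log as a placeholder filename:

    #!/bin/bash
    # Field 1 is the client IP; sort groups the IPs so uniq -c can count them,
    # and the final sort -rn lists the busiest clients first.
    awk '{ print $1 }' access.log | sort | uniq -c | sort -rn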

3. Shell Programming and Scripting

filtering out duplicate substrings, regex string from a string

My input contains single-word lines in data.txt: prjtestBlaBlatestBlaBla prjthisBlaBlathisBlaBla prjthatBlaBladpthatBlaBla prjgoodBlaBladpgoodBlaBla prjgood1BlaBla123dpgood1BlaBla123 Desired output --> data_out.txt prjtestBlaBla prjthisBlaBla... (8 Replies)
Discussion started by: kchinnam
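One way to attack this: each line is prj, then a chunk, optionally dp, then the same chunk again, so a backreference can strip the repeat. A sketch using GNU sed (backreferences inside -E extended regexes are a GNU extension, so this is not portable sed):

    # ^prj(.+)(dp)?\1$ = "prj" + chunk + optional "dp" + the same chunk again;
    # the replacement keeps prj plus a single copy of the chunk.
    sed -E 's/^prj(.+)(dp)?\1$/prj\1/' data.txt > data_out.txt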

4. Homework & Coursework Questions

Filtering Unique Lines

Use and complete the template provided. The entire template must be completed. If you don't, your post may be deleted! 1. The problem statement, all variables and given/known data: The uniq command excludes consecutive duplicate lines. It has a -c option to display a count of the number... (1 Reply)
Discussion started by: billydeanmak
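Since it is a homework thread, just the idea rather than a finished answer: uniq only collapses adjacent duplicates, so the input is normally sorted first, and -c prefixes each surviving line with its count. A sketch:

    # sort groups identical lines together; uniq -c counts each group;
    # the final sort -rn puts the most frequent lines first.
    sort input.txt | uniq -c | sort -rn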

5. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a huge file (4 GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. sort, uniq, and awk '!x++' are not working, as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
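When the in-memory approaches die, an external sort can still get through, because sort(1) spills to temporary files rather than keeping the whole file in RAM. A sketch using GNU sort options (the -T and -S values are assumptions; point -T at a filesystem with a few GB free):

    # -u prints each distinct line once; -T chooses where the temporary spill
    # files go; -S caps the in-memory buffer so sort spills to disk early
    # instead of exhausting memory.
    sort -u -T /var/tmp -S 512M hugefile.txt > unique.txt

The trade-off is that the output comes back sorted, so the original line order is lost.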

6. Shell Programming and Scripting

Perl: filtering lines based on duplicate values in a column

Hi, I have a file like this. I need to eliminate lines whose first column has the same value 10 times. 13 18 1 + chromosome 1, 122638287 AGAGTATGGTCGCGGTTG 13 18 1 + chromosome 1, 128904080 AGAGTATGGTCGCGGTTG 13 18 1 - chromosome 14, 13627938 CAACCGCGACCATACTCT 13 18 1 + chromosome 1,... (5 Replies)
Discussion started by: polsum
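A two-pass awk rendering of what the Perl answers in that thread do: read the file once to count each column-1 value, then a second time to print only the lines whose value stayed under 10 occurrences. The filename hits.txt is an assumption:

    # Pass 1 (NR==FNR holds while reading the first copy): count column-1 values.
    # Pass 2: print a line only if its column-1 value occurred fewer than 10 times.
    awk 'NR == FNR { count[$1]++; next } count[$1] < 10' hits.txt hits.txt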

7. UNIX for Dummies Questions & Answers

Filtering data - extracting specific lines

I have a table of data in which one of the columns includes strings of text. From within that, I am searching to include some lines but not others; for example, I want to include some combination of the word address such as (address. | address? | the address | your address) but not (ip address | email... (17 Replies)
Discussion started by: A-V
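One way to express "these phrases but not those" is two chained greps. A sketch, assuming the text is in data.txt and matching should ignore case:

    # The first grep keeps lines containing a wanted "address" context;
    # the second grep -v then drops lines that also mention ip/email address.
    grep -Ei 'address\.|address\?|the address|your address' data.txt |
        grep -Evi 'ip address|email address'

One caveat: the -v pass discards a line even when it contains both a wanted and an unwanted phrase; if that matters, the test has to move into awk and check each match individually.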

8. Shell Programming and Scripting

Filtering out lines in a .csv file

Hi Guys, I would need your expert help with the following situation. I have a comma-separated .csv file, with a header row and data as follows: H1,H2,H3,H4,H5..... (header row) 0,0,0,0,0,1,2.... (data rows follow) 0,0,0,0,0,0,1 ......... ......... I need code... (10 Replies)
Discussion started by: dev.devil.1983
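The actual filtering rule is cut off above, but the usual shape of such a script keeps the header row unconditionally and applies the condition only to data rows. A sketch, with the column number and test ($6 != 0) as hypothetical placeholders for the real rule:

    # NR==1 is the header row: always print it.
    # Data rows are kept only when (hypothetical) column 6 is non-zero.
    awk -F, 'NR == 1 { print; next } $6 != 0' input.csv > filtered.csv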

9. Shell Programming and Scripting

Awk/sed: help on filtering multiple lines to one

Experts, good day. I want to reduce multiple lines of the same error on the same day to only one error per day, keeping the first line from the log. Here is the file: May 26 11:29:19 cmihpx02 vmunix: NFS write failed for server cmiauxe1: error 5 (RPC: Timed out) May 26 11:29:19 cmihpx02 vmunix: NFS... (4 Replies)
Discussion started by: rveri
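One way to keep just the first line per day and message: blank out the HH:MM:SS field when building the dedup key, so lines that differ only in time-of-day compare equal. A sketch, with messages.log as a placeholder filename:

    awk '{
        line = $0
        $3 = ""            # drop HH:MM:SS; month, day and message text remain
        if (!seen[$0]++)   # first occurrence of this (day, message) pair
            print line     # print the original, untouched line
    }' messages.log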

10. Shell Programming and Scripting

Filtering log file with lines older than 10 days.

Hi, I am trying to compare the epoch time in a huge log file (2 million lines) with today's date. I have to create two files: one with lines older than 10 days and another with lines less than 10 days old. I am using a while loop, but it takes forever for the script to complete. It would be helpful if you can... (12 Replies)
Discussion started by: shunya
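A while-read loop forks per line, which is why it crawls over 2 million lines; a single awk pass does the same comparison in-process. A sketch, assuming the epoch timestamp is the first whitespace-separated field (adjust $1 to the real column):

    # Epoch seconds for "10 days before now".
    cutoff=$(( $(date +%s) - 10*24*3600 ))
    # One pass over the log: route each line by comparing its timestamp field.
    awk -v cutoff="$cutoff" '{ print > ($1 < cutoff ? "older.log" : "recent.log") }' huge.log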
flow-merge(1)						      General Commands Manual						     flow-merge(1)

NAME
       flow-merge -- Merge flow files.

SYNOPSIS
       flow-merge [-aghm] [-b big|little] [-C comment] [-d debug_level]
                  [-o filename] [-z z_level] [file|directory ...]

DESCRIPTION
       The flow-merge utility processes files and/or directories of files in
       the flow-tools format.  The resulting merged data set is written to
       the standard output or to the file specified by -o.  If file is a
       single dash (`-') or absent, flow-merge will read from the standard
       input.  Unlike flow-cat, flow-merge interleaves flow records,
       preserving their relative chronological order.

OPTIONS
       -a              Do not ignore filenames that begin with tmp.

       -b big|little   Byte order of the output.

       -C comment      Add a comment.

       -d debug_level  Enable debugging.

       -g              Sort the file list by capture start time before
                       processing.

       -h              Display help.

       -m              Disable the use of mmap().

       -p              Preload headers.  Use to preserve meta information
                       such as lost flows.

       -o file         Write to file instead of the standard output.

       -z z_level      Configure compression level to z_level.  0 is
                       disabled (no compression), 9 is highest compression.

       file|directory ...
                       Process the named files and/or directories.

EXAMPLES
       Merge all flow files beginning with ft-v05.2001-05-01 and use
       flow-print to display the results:

             flow-merge ft-v05.2001-05-01.* | flow-print

BUGS
       None known.

AUTHOR
       Larry Lidz  ellidz@eridu.uchicago.edu

SEE ALSO
       flow-tools(1)