01-10-2013
Amazing
It worked! Thanks rdrtx1!
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Can someone provide me a shell script?
I have a file with many columns and many rows. I need to sort on the first column and then remove duplicate records if they exist, finally printing the full data with the first column unique.
Sort BASED ON FIRST FIELD and remove the duplicates if they exist... (2 Replies)
Discussion started by: tuffEnuff
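A minimal sketch of one common approach: sort on the first field, then let awk keep only the first record it sees for each key. The file name and whitespace-separated layout below are assumptions for illustration.

```shell
# Create a small sample file (the real file name and layout are assumptions)
printf 'b 2\na 1\na 3\n' > /tmp/input.txt

# Sort on field 1 only, then keep the first record seen for each key
sort -k1,1 /tmp/input.txt | awk '!seen[$1]++'
```

`sort -u -k1,1` would also collapse duplicate keys, but which duplicate survives is then up to sort; the awk filter makes "first record per key after sorting" explicit.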
2. UNIX for Dummies Questions & Answers
Hi,
How do I output duplicate records to another file? A record counts as a duplicate based on a key column that starts at position 2 and is 11 characters long.
The file is a fixed width file.
ex of Record:
DTYU12333567opert tjhi kkklTRG9012
The data in bold is the key on which... (1 Reply)
Discussion started by: Qwerty123
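For a fixed-width file, `substr($0, 2, 11)` extracts a key starting at position 2 with length 11. One sketch, with sample records and file names invented for illustration: the first occurrence of each key goes to stdout, repeats go to a duplicates file.

```shell
# Sample fixed-width records; the key starts at position 2, length 11
printf 'DTYU12333567opert\nXTYU12333567other\nAABC00000001recrd\n' > /tmp/fixed.txt

# First occurrence of each key goes to stdout; repeats go to a duplicates file
awk '{ key = substr($0, 2, 11) }
     seen[key]++ { print > "/tmp/dups.txt"; next }
     1' /tmp/fixed.txt
```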
3. Shell Programming and Scripting
Hi All,
I need to fetch unique records based on a key column (i.e., the first column) and keep the record with the maximum value in column 2, in sorted order; the duplicates have to be stored in another output file.
Input :
Input.txt
1234,0,x
1234,1,y
5678,10,z
9999,10,k... (7 Replies)
Discussion started by: kmsekhar
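One sketch: sort by key, then by column 2 numerically descending, so the first record per key is the maximum; every later record for that key is a duplicate. The output file names are assumptions.

```shell
# Sample input from the post
printf '1234,0,x\n1234,1,y\n5678,10,z\n9999,10,k\n' > /tmp/Input.txt

# Sort by key, then by column 2 numerically descending, so the first
# record per key is the max; remaining records go to a duplicates file
sort -t, -k1,1 -k2,2nr /tmp/Input.txt |
awk -F, 'seen[$1]++ { print > "/tmp/dup_keys.txt"; next } 1'
```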
4. Shell Programming and Scripting
Hi,
I am unable to find the duplicates in a file based on the 1st, 2nd, 4th and 5th columns, and to remove those duplicates in the same file.
Source filename: Filename.csv
"1","ccc","information","5000","temp","concept","new"
"1","ddd","information","6000","temp","concept","new"... (2 Replies)
Discussion started by: onesuri
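Assuming none of the quoted fields contain embedded commas (true of the sample shown), a composite key over columns 1, 2, 4 and 5 works; the third sample line below is added to demonstrate a real duplicate being dropped.

```shell
cat > /tmp/Filename.csv <<'EOF'
"1","ccc","information","5000","temp","concept","new"
"1","ddd","information","6000","temp","concept","new"
"1","ccc","information","5000","temp","concept","old"
EOF

# Keep the first record per combination of columns 1, 2, 4 and 5
awk -F, '!seen[$1 FS $2 FS $4 FS $5]++' /tmp/Filename.csv > /tmp/deduped.csv
cat /tmp/deduped.csv
```

awk cannot edit a file in place, so to "remove the duplicates in the same file" write to a temporary file and `mv` it over the original.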
5. Shell Programming and Scripting
Hello,
I am new to shell scripting. I have a huge file with multiple columns; for example, I have 5 columns below.
HWUSI-EAS000_29:1:105 + chr5 76654650 AATTGGAA HHHHG
HWUSI-EAS000_29:1:106 + chr5 76654650 AATTGGAA B@HYL
HWUSI-EAS000_29:1:108 + ... (4 Replies)
Discussion started by: Diya123
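The post is cut off, but in the sample the two records share the same chromosome, position and sequence (fields 3, 4 and 5) and differ only in read ID and quality. If the goal is one record per such combination, a sketch under that assumption:

```shell
printf '%s\n' \
  'HWUSI-EAS000_29:1:105 + chr5 76654650 AATTGGAA HHHHG' \
  'HWUSI-EAS000_29:1:106 + chr5 76654650 AATTGGAA B@HYL' > /tmp/reads.txt

# Keep the first record per (field 3, field 4, field 5) combination
awk '!seen[$3, $4, $5]++' /tmp/reads.txt
```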
6. Shell Programming and Scripting
Hi Folks -
I'm quite new to awk and haven't come across such issues before. The problem is that I have a file with duplicate records in the 3rd and 4th fields. A sample is below:
aaaaaa|a12|45|56
abbbbaaa|a12|45|56
bbaabb|b1|51|45
bbbbbabbb|b2|51|45
aaabbbaaaa|a11|45|56
... (3 Replies)
Discussion started by: asyed
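The excerpt ends before the desired output, but if the goal is to keep only the first record for each (field 3, field 4) pair, a pipe-delimited awk sketch:

```shell
cat > /tmp/pipes.txt <<'EOF'
aaaaaa|a12|45|56
abbbbaaa|a12|45|56
bbaabb|b1|51|45
bbbbbabbb|b2|51|45
aaabbbaaaa|a11|45|56
EOF

# Keep only the first record for each (field 3, field 4) pair
awk -F'|' '!seen[$3, $4]++' /tmp/pipes.txt
```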
7. Shell Programming and Scripting
Hi
Description of input file I have:
-------------------------
1) CSV with double quotes for string fields.
2) Some string fields have a comma as part of the field value.
3) Have Duplicate lines
4) Have 200 columns/fields
5) File size is more than 10GB
Description of output file I need:... (4 Replies)
Discussion started by: krishnix
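If the goal is to drop duplicate lines (the output description is cut off), whole-line de-duplication sidesteps CSV parsing entirely: commas inside quoted fields don't matter when the whole line is the key.

```shell
printf 'a,"x, y",1\nb,"z",2\na,"x, y",1\n' > /tmp/big.csv

# Whole-line de-duplication: embedded commas need no special parsing
awk '!seen[$0]++' /tmp/big.csv
```

Caveat for a 10GB file: awk's `seen[]` array holds every distinct line in memory. If output order doesn't matter, `sort -u` keeps memory bounded by spilling to temporary files.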
8. UNIX for Dummies Questions & Answers
Hi,
I have a file with fields like below:
A;XYZ;102345;222
B;XYZ;123243;333
C;ABC;234234;444
D;MNO;103345;222
E;DEF;124243;333
desired output:
C;ABC;234234;444
D;MNO;103345;222
E;DEF;124243;333
i.e., if the 4th field is a duplicate, I need only those records where... (5 Replies)
Discussion started by: wanderingmind16
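The question is truncated, but the desired output matches keeping only the last record for each value of the 4th field. A two-pass awk sketch, reading the file twice:

```shell
cat > /tmp/semi.txt <<'EOF'
A;XYZ;102345;222
B;XYZ;123243;333
C;ABC;234234;444
D;MNO;103345;222
E;DEF;124243;333
EOF

# Pass 1 records the last line number seen for each key;
# pass 2 prints a line only if it is that last occurrence
awk -F';' 'NR==FNR { last[$4] = FNR; next }
           FNR == last[$4]' /tmp/semi.txt /tmp/semi.txt
```

This reproduces the desired output in the post: C, D and E, each the final record carrying its 4th-field value.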
9. Shell Programming and Scripting
I am trying to see if I can use awk to remove duplicates from a file. This is the file:
==> Listvol <==
deleting /vol/eng_rmd_0941
deleting /vol/eng_rmd_0943
deleting /vol/eng_rmd_0943
deleting /vol/eng_rmd_1006
deleting /vol/eng_rmd_1012
rearrange /vol/eng_rmd_0943
... (6 Replies)
Discussion started by: newbie2010
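For this file, whole-line de-duplication preserves order and removes the repeated `deleting /vol/eng_rmd_0943` while keeping the `rearrange` line, which differs as a whole line:

```shell
cat > /tmp/listvol.txt <<'EOF'
deleting /vol/eng_rmd_0941
deleting /vol/eng_rmd_0943
deleting /vol/eng_rmd_0943
deleting /vol/eng_rmd_1006
deleting /vol/eng_rmd_1012
rearrange /vol/eng_rmd_0943
EOF

# Print each distinct line once, preserving the original order
awk '!seen[$0]++' /tmp/listvol.txt
```

If instead the volume path alone should be unique regardless of the action word, key on the second field: `awk '!seen[$2]++'`.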
10. UNIX for Beginners Questions & Answers
I have /tmp dir with filename as:
010020001_S-FOR-Sort-SYEXC_20160229_2212101.marker
010020001_S-FOR-Sort-SYEXC_20160229_2212102.marker
010020001-S-XOR-Sort-SYEXC_20160229_2212104.marker
010020001-S-XOR-Sort-SYEXC_20160229_2212105.marker
010020001_S-ZOR-Sort-SYEXC_20160229_2212106.marker... (4 Replies)
Discussion started by: gnnsprapa
LEARN ABOUT OSX
httpdstat.d
httpdstat.d(1m) USER COMMANDS httpdstat.d(1m)
NAME
httpdstat.d - realtime httpd statistics. Uses DTrace.
SYNOPSIS
httpdstat.d [interval [count]]
DESCRIPTION
This prints connection statistics for "httpd" processes, such as those from an Apache web server.
Since this uses DTrace, only users with root privileges can run this command.
EXAMPLES
Print statistics every second,
# httpdstat.d
Print every 5 seconds, 6 times,
# httpdstat.d 5 6
FIELDS
TIME time, string
NUM number of connections
GET number of GETs
POST number of POSTs
HEAD number of HEADs
TRACE number of TRACEs
NOTES
All statistics are printed as a value per interval.
This version does not process subsequent operations on keepalives.
IDEA
Ryan Matteson (who first wrote a solution to this).
DOCUMENTATION
See the DTraceToolkit for further documentation under the Docs directory. The DTraceToolkit docs may include full worked examples with verbose descriptions explaining the output.
EXIT
httpdstat.d will run until Ctrl-C is hit.
AUTHOR
Brendan Gregg [Sydney, Australia]
SEE ALSO
dtrace(1M)
version 0.70 Nov 20, 2005 httpdstat.d(1m)