Hello all,
This should really be easy for you... I need awk to print the maximum of each column for input like this:
Input:
1 2 3 1
2 1 1 3
2 1 1 2
Output should be:
2 2 3 3
This does the sum, but I need the max instead:
{ for(i=1; i<=NF; i++)
sum[i] += $i }
END {for(i=1; i in sum;... (3 Replies)
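Extending the per-column sum above, a minimal sketch that tracks per-column maxima instead (assuming whitespace-separated numeric input with the same number of fields on every row):

```shell
# Track the maximum seen in each column; the first row seeds the maxima,
# so negative values are handled correctly too.
printf '1 2 3 1\n2 1 1 3\n2 1 1 2\n' |
awk '{
    for (i = 1; i <= NF; i++)
        if (NR == 1 || $i > max[i]) max[i] = $i
}
END {
    for (i = 1; i <= NF; i++)
        printf "%s%s", max[i], (i < NF ? OFS : ORS)
}'
```

On the sample input this prints `2 2 3 3`, the requested output.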
I feel stupid for asking this because it seems that my MySQL code isn't working the way that I think it should work.
Basically I wrote code like this:
select * from `Test_DC_Trailer` HAVING max(DR_RefKey);
where DR_RefKey is a unique numeric field that is auto-incremented (like a primary key)... (7 Replies)
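On the MySQL snippet: without a GROUP BY, `HAVING max(DR_RefKey)` is evaluated once for the whole table and used only as a truth value, so it does not pick out the row holding the maximum. Two common rewrites (a sketch, not tested against the poster's actual schema):

```sql
-- Fetch the row with the largest DR_RefKey.
SELECT * FROM `Test_DC_Trailer`
ORDER BY DR_RefKey DESC
LIMIT 1;

-- Equivalent subquery form.
SELECT * FROM `Test_DC_Trailer`
WHERE DR_RefKey = (SELECT MAX(DR_RefKey) FROM `Test_DC_Trailer`);
```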
Hi,
I need an awk script (or whatever shell construct) that would take data like below and get the max value of the 3rd column when grouping by the 1st column.
clientname,day-of-month,max-users
-----------------------------------
client1,20120610,5
client2,20120610,2
client3,20120610,7... (3 Replies)
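A sketch for the group-by-max request (assuming comma-separated input with no header line, as in the sample; the extra rows below are made up so the grouping actually has something to do):

```shell
# Keep the largest column-3 value seen for each column-1 key.
printf 'client1,20120610,5\nclient1,20120611,9\nclient2,20120610,2\nclient3,20120610,7\n' |
awk -F, '{ if (!($1 in max) || $3 + 0 > max[$1]) max[$1] = $3 + 0 }
         END { for (k in max) print k "," max[k] }'
```

The `for (k in max)` order is unspecified in awk, so pipe through `sort` if you need the groups in order.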
Dear All,
I have a data file input.csv like below. (Only five columns are shown here as an example.)
Data1,StepNo,Data2,Data3,Data4
2,1,3,4,5
3,1,5,6,7
3,2,4,5,6
5,3,5,5,6
From this I want the below output
Data1,StepNo,Data2,Data3,Data4
2,1,3,4,5
3,1,5,6,7
where the second column... (4 Replies)
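The condition is cut off above, but the desired output keeps the header plus exactly the rows whose second column (StepNo) is 1; under that assumption, a sketch:

```shell
# Print the header row plus every row whose second field is 1
# (assumption: that is the truncated condition).
printf 'Data1,StepNo,Data2,Data3,Data4\n2,1,3,4,5\n3,1,5,6,7\n3,2,4,5,6\n5,3,5,5,6\n' |
awk -F, 'NR == 1 || $2 == 1'
```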
I have 2 files,
file01 = 7 columns, unknown number of rows (but few)
file02 = 7 columns, unknown number of rows (but many)
Now I want to create output keyed on the first field, which is shared by both files, and then subtract the remaining fields from each other and print the results.
e.g.
file 01
James|0|50|25|10|50|30... (1 Reply)
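The post is truncated, so the exact arithmetic is an assumption; below is a sketch that, for keys present in both files, subtracts file01's fields from file02's. The sample rows are made up for illustration, since the post only shows one line of file01:

```shell
cat > file01 <<'EOF'
James|0|50|25|10|50|30
EOF
cat > file02 <<'EOF'
James|5|60|30|20|55|35
Mary|1|2|3|4|5|6
EOF
# Pass 1 (NR==FNR) caches file01's fields per key; pass 2 subtracts them
# from file02's fields for keys seen in both files.
awk -F'|' -v OFS='|' '
    NR == FNR { for (i = 2; i <= NF; i++) a[$1, i] = $i; seen[$1]; next }
    $1 in seen { for (i = 2; i <= NF; i++) $i -= a[$1, i]; print }
' file01 file02
```

With the sample data this prints `James|5|10|5|10|5|5`; Mary is dropped because she only appears in file02.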
Hi,
I have a tab-delimited text file where the first two columns contain numbers. I want to delete all rows where the value in the first column equals the value in the second column. How do I go about doing that? Thanks!
Input:
1 1 ABC DEF
2 2 IJK LMN
1 2 ZYX OPW
Output:
1 2 ZYX OPW (2 Replies)
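A one-liner sketch for the tab-delimited case:

```shell
# Print only the rows where field 1 differs from field 2.
printf '1\t1\tABC\tDEF\n2\t2\tIJK\tLMN\n1\t2\tZYX\tOPW\n' |
awk -F'\t' '$1 != $2'
```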
I have the following data set. I would like to look at column 3 and use only the rows that have the maximum value in column 3.
Can awk or sed achieve this?
10 2 10 100
11 2 20 100
12 2 30 100
13 2 30 100
14 ... (7 Replies)
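One approach is two passes over the file: the first pass finds the maximum of column 3, the second prints every row that carries it (ties included, as in the sample where 30 appears twice):

```shell
# Pass 1 (NR==FNR) computes the maximum of column 3; pass 2 prints matching rows.
printf '10 2 10 100\n11 2 20 100\n12 2 30 100\n13 2 30 100\n' > data.txt
awk 'NR == FNR { if (FNR == 1 || $3 > m) m = $3; next }
     $3 == m' data.txt data.txt
```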
Dear fellows, I need your help.
I'm trying to write a script to convert a single column into multiple rows.
But it needs to recognize the beginning of the string and assign it to its specific column number.
Each line (loop) begins with a digit (RANGE).
At the moment it is kind of working, but it... (6 Replies)
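The spec is only partly visible, so the following is a guess: if the leading digit of each line is the target column number, and a new output row starts whenever that number stops increasing, a sketch would be:

```shell
# Hypothetical spec: leading digit = column number; flush the row when it resets.
printf '1a\n2b\n3c\n1d\n2e\n' |
awk '{
    col = substr($1, 1, 1) + 0
    if (col <= prev) { print row; row = "" }      # digit reset: finish the row
    row = row (row == "" ? "" : OFS) $1
    prev = col
}
END { if (row != "") print row }'
```

On the made-up input this emits `1a 2b 3c` then `1d 2e`; adapt the reset condition to the real RANGE rule.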
Hi all,
please help me with this. I want to extract the duplicate rows (column 1) in a file that repeat at least 4 times, then summarize them by computing the max, mean, median, and min. The file is sorted by column 1, so all the repeated rows appear together.
If number of elements is... (5 Replies)
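A sketch under some assumptions (two columns, `key value`; the file is sorted by key, and values are already sorted within each key so the median can be read off directly):

```shell
# For keys appearing at least 4 times, print: key min max mean median.
printf 'a 1\na 2\na 3\na 4\nb 9\n' |
awk '
function report(   mid, med) {
    if (n >= 4) {
        mid = int(n / 2)
        med = (n % 2) ? v[mid + 1] : (v[mid] + v[mid + 1]) / 2
        print key, v[1], v[n], sum / n, med
    }
}
$1 != key { report(); key = $1; n = 0; sum = 0 }   # new group: flush the old one
{ v[++n] = $2; sum += $2 }
END { report() }'
```

If the values are not pre-sorted within a group, sort them (or the whole file numerically on column 2 within column 1) before taking the median.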
Gents,
I have a big file like this.
5100010002
5100010004
5100010006
5100010008
5100010010
5100010012
5102010002
5102010004
5102010006
5102010008
5102010010
5102010012
The file is sorted and I would like to find the min and max value, taking into consideration key1... (3 Replies)
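Assuming "key1" means the first four digits of each number (5100 vs. 5102 in the sample), a sketch that prints the min and max per key:

```shell
# Group on the first 4 digits and track min/max within each group.
printf '5100010002\n5100010012\n5102010004\n5102010010\n' |
awk '{ k = substr($1, 1, 4)
       if (!(k in min) || $1 < min[k]) min[k] = $1
       if (!(k in max) || $1 > max[k]) max[k] = $1 }
     END { for (k in min) print k, min[k], max[k] }' | sort
```

Since the file is already sorted, `min` is simply the first line of each group and `max` the last, but the array form above does not depend on that.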
Discussion started by: jiam912