Here is my file, named countries:
USSR 8650 262 Asia
Canada 3852 24 North America
China 3692 866 Asia
USA 3615 219 North America
Brazil 3286 116 South America
India 1269 637 Asia
Argentina 1072 ... (8 Replies)
Friends,
On Linux I have to run the iostat command and, in each iteration, print the greatest value in each column.
e.g.
iostat -dt -kx 2 2 | awk ' !/sd/ &&!/%util/ && !/Time/ && !/Linux/ {print $12}'
4.38
0.00
0.00
0.00
What I would like to print is only the... (3 Replies)
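One common approach is to track the largest value seen so far and print it in an END block. A minimal sketch of that idea, with printf standing in for the live iostat pipeline (the sample numbers are the ones shown above):

```shell
# Sample numbers stand in for the live `iostat ... | awk ...` stream.
printf '4.38\n0.00\n0.00\n0.00\n' |
awk 'NR == 1 || $1 > max { max = $1 }   # keep the running maximum
     END { print max }'
# prints 4.38
```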
Hi,
I have two files formatted as following:
File 1: (user_num_ID , realID) (the NR here is 41671)
1 cust_034_60
2 cust_80_91
3 cust_406_4
..
..
File 2: (realID , clusterNumber) (total NR here is 1000)
cust_034_60 2
cust_406_4 3
..
.. (11 Replies)
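One way to chain the two mappings is the classic two-pass awk join. Since the thread's expected output is cut off, this sketch assumes the goal is to pair each user_num_ID with its clusterNumber:

```shell
# Small stand-ins for the two files described above
printf '1 cust_034_60\n2 cust_80_91\n3 cust_406_4\n' > file1
printf 'cust_034_60 2\ncust_406_4 3\n' > file2

# First pass (NR == FNR) loads File2's realID -> clusterNumber map;
# second pass prints each user_num_ID whose realID appears in the map.
awk 'NR == FNR { cluster[$1] = $2; next }
     $2 in cluster { print $1, cluster[$2] }' file2 file1
# prints:
# 1 2
# 3 3
```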
Hi All,
I want to remove rows from File1.csv by comparing the columns/fields against File2.csv. I only need the records whose first column is the same and whose second column is different for the same record in both files. Here is an example of what I need.
File1.csv:
RAJAK|ACTIVE|1... (2 Replies)
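A two-pass awk sketch of that comparison, on hypothetical pipe-delimited rows (the thread's real data is truncated): remember File2's second field per key, then keep only the File1 rows whose key exists in File2 with a different second field.

```shell
# Hypothetical rows; the thread's actual data was cut off
printf 'RAJAK|ACTIVE|1\nKUMAR|ACTIVE|2\n'   > File1.csv
printf 'RAJAK|INACTIVE|1\nKUMAR|ACTIVE|2\n' > File2.csv

awk -F'|' 'NR == FNR { f2[$1] = $2; next }   # File2: key -> 2nd field
           ($1 in f2) && f2[$1] != $2' File2.csv File1.csv
# prints: RAJAK|ACTIVE|1
```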
Hi all !
If there is only one single value in a column (e.g. column 1 below), then return this value in the same output column.
If there are several values in the same column (e.g. column 2 below), then return the different values separated by "," in the output.
pipe-separated input:
... (11 Replies)
Hi
I have 2 files as below
File 1
Chr Start End
chr1 120 130
chr1 140 150
chr2 130 140
File2
Chr Start End Value
chr1 121 128 ABC
chr1 144 149 XYZ
chr2 120 129 PQR
I would like to compare these files using awk; specifically if column 1 of file1 is equal to column 1 of file2... (7 Replies)
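Since the thread's full matching rule is cut off, this sketch assumes the goal is to print each File2 row whose interval overlaps a File1 interval on the same chromosome (which matches the sample: ABC and XYZ overlap, PQR does not):

```shell
cat > file1 <<'EOF'
Chr Start End
chr1 120 130
chr1 140 150
chr2 130 140
EOF
cat > file2 <<'EOF'
Chr Start End Value
chr1 121 128 ABC
chr1 144 149 XYZ
chr2 120 129 PQR
EOF

# Load File1's regions per chromosome, then test each File2 row
# for overlap: start <= region end AND end >= region start.
awk 'FNR == 1 { next }                                   # skip both headers
     NR == FNR { s[$1, ++n[$1]] = $2; e[$1, n[$1]] = $3; next }
     {
         for (i = 1; i <= n[$1]; i++)
             if ($2 <= e[$1, i] && $3 >= s[$1, i]) { print; break }
     }' file1 file2
# prints:
# chr1 121 128 ABC
# chr1 144 149 XYZ
```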
Hi All,
I am looking for an awk script to do the following:
join the lines together only if the first 4 fields are the same.
Can it be done with a join function in awk?
a,b,c,d,8,,,
a,b,c,d,,7,,
a,b,c,d,,,9,
a,b,p,e,8,,,
a.b,p,e,,9,,
a,b,p,z,,,,9
a,b,p,z,,8,,
desired output:
... (1 Reply)
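Awk has no built-in join function, so the grouping has to be done by hand. Since the desired output was cut off, this sketch assumes the merge keeps, for each group of lines sharing the first 4 fields, the non-empty value in each remaining field (note the row beginning a.b has a different key, so it stays in its own group):

```shell
printf '%s\n' 'a,b,c,d,8,,,' 'a,b,c,d,,7,,' 'a,b,c,d,,,9,' \
              'a,b,p,e,8,,,' 'a.b,p,e,,9,,' 'a,b,p,z,,,,9' 'a,b,p,z,,8,,' |
awk 'BEGIN { FS = OFS = "," }
     {
         key = $1 FS $2 FS $3 FS $4
         if (!seen[key]++) { order[++nkeys] = key; nf[key] = NF }
         for (i = 5; i <= NF; i++)
             if ($i != "") val[key, i] = $i        # keep non-empty values
     }
     END {
         for (k = 1; k <= nkeys; k++) {
             key = order[k]; line = key
             for (i = 5; i <= nf[key]; i++) line = line OFS val[key, i]
             print line
         }
     }'
# prints:
# a,b,c,d,8,7,9,
# a,b,p,e,8,,,
# a.b,p,e,,9,,
# a,b,p,z,,8,,9
```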
Dear All,
I have 2 files. If fields 1, 2, 4 and 5 match in both file1 and file2, I want to print the whole line of file1 and file2, one after another, in my output file.
File1:
sc2/80 20 . A T 86 F=5;U=4
sc2/60 55 . G T ... (1 Reply)
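A two-pass awk sketch of that match. File2's rows (and the tail of File1's second row) were truncated in the thread, so those values below are hypothetical, built so one File2 line matches File1's first line on fields 1, 2, 4 and 5:

```shell
cat > file1 <<'EOF'
sc2/80 20 . A T 86 F=5;U=4
sc2/60 55 . G T 80 F=2;U=6
EOF
cat > file2 <<'EOF'
sc2/80 20 . A T 90 F=7;U=1
EOF

# Key each File1 line on fields 1,2,4,5; on a File2 match, print both lines.
awk 'NR == FNR { line1[$1, $2, $4, $5] = $0; next }
     ($1, $2, $4, $5) in line1 { print line1[$1, $2, $4, $5]; print }' file1 file2
# prints:
# sc2/80 20 . A T 86 F=5;U=4
# sc2/80 20 . A T 90 F=7;U=1
```

If two File1 lines can share the same key, the array would need to hold a list per key instead of a single line.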
I have a csv dump from SQL Server that needs to be converted so it can be fed to another program. I have already sorted on field 1, but there are multiple rows with the same field 1; these need to be compared, and where field 1 is the same, field 5 appended.
i.e from
ANG SJ,0,B,LC22,LC22(0)
BAT... (2 Replies)
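Since the expected output was cut off, this sketch assumes the goal is one line per field-1 value, with each duplicate row's field 5 appended to the first row of its group (relying on the input already being sorted on field 1, as described; the second ANG SJ row is hypothetical):

```shell
printf '%s\n' 'ANG SJ,0,B,LC22,LC22(0)' 'ANG SJ,0,B,LC23,LC23(1)' \
              'BAT X,1,C,LD10,LD10(0)' |
awk 'BEGIN { FS = OFS = "," }
     $1 == prev { line = line OFS $5; next }       # same key: append field 5
     { if (NR > 1) print line; line = $0; prev = $1 }
     END { print line }'
# prints:
# ANG SJ,0,B,LC22,LC22(0),LC23(1)
# BAT X,1,C,LD10,LD10(0)
```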
Hi guys,
I have a problem appending new data at the end of each line of the files: my code takes the whole value of the nth column, but I want only a specific value. The new data is based on a substring of the 11th, 12th and 13th columns, which hold comma-separated values.
My code:
awk... (4 Replies)
Discussion started by: null7
LEARN ABOUT DEBIAN
snnewgroup
snnewgroup.v0.3.8(8) System Manager's Manual snnewgroup.v0.3.8(8)

NAME
snnewgroup - create a new sn newsgroup
SYNOPSIS
snnewgroup newsgroup [server] [port]
DESCRIPTION
snnewgroup creates newsgroup, assigning it an upstream NNTP server of server:port. If port is not specified, it defaults to 119. If server is
also not specified, newsgroup is created as a local group, which is fed only by articles POSTed to it.
You will need to be root or own /var/spool/sn in order to add new groups.
ENVIRONMENT VARIABLES
SNROOT If this is set and is not empty, the value is used in place of /var/spool/sn, the default news spool directory.
FILES CREATED
/var/spool/sn/newsgroup
Directory where articles will be stored.
/var/spool/sn/newsgroup/.created
Empty file for the newsgroup creation time.
/var/spool/sn/newsgroup/.serial
Where snget gets its idea of the new starting serial number for newsgroup on server server:port. The value is initialized to 0.
/var/spool/sn/newsgroup/.outgoing
If server is specified, is a symlink to /var/spool/sn/.outgoing/server:port, which is a directory, created if it does not already
exist. If server is not specified, no file /var/spool/sn/newsgroup/.outgoing will be created. See also snsend.
SEE ALSO
sndelgroup, snsend
AUTHOR
Harold Tay snnewgroup.v0.3.8(8)