Search Results

Search: Posts Made By: nikhil jain

Posted By RavinderSingh13
Hello nikhil jain,

Could you please try the following.


awk 'BEGIN{FS=OFS=",";s1="##"} FNR==NR{a[$2,$3,$4]=a[$2,$3,$4]?a[$2,$3,$4] FS $1:$1;next} (($2,$3,$4) in a){print a[$2,$3,$4] s1...
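
The preview cuts the command off at the print statement. A minimal sketch of the same two-file idiom, with made-up file names file1 and file2 and a guessed print at the end, collects column 1 per key of columns 2-4 from the first file and prints the collected list for matching keys in the second:

awk 'BEGIN{FS=OFS=","; s1="##"}
FNR==NR{                              # first file: append $1 under the key $2,$3,$4
        a[$2,$3,$4]=a[$2,$3,$4]?a[$2,$3,$4] FS $1:$1
        next
}
(($2,$3,$4) in a){                    # second file: print the collected list for known keys
        print a[$2,$3,$4] s1 $2,$3,$4
}' file1 file2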

Posted By Scrutinizer
Hi, try something like this:

awk -F, '
{
i=$2 FS $3 FS $4
A[i]=(i in A?A[i] FS:"") $1
}
END {
for(i in A)
print A[i] "##" i
}
' file
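
For illustration only (the input below is made up), with a file like

a1,x,y,z
a2,x,y,z
b1,p,q,r

the script groups column 1 by the key built from columns 2-4 and prints, in no particular order,

a1,a2##x,y,z
b1##p,q,r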

Posted By Yoda
Don't use uuencode, here is an example (https://www.unix.com/302940250-post2.html) using base64 and sendmail
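
The linked post shows the idea; as a rough sketch of that approach (addresses, subject, file name and the sendmail path below are placeholders, not taken from the linked example), a MIME message with a base64-encoded attachment can be built by hand and piped to sendmail:

to="user@example.com"
subject="Report"
attachment="report.csv"
boundary="=_part_$$"
{
        printf 'To: %s\nSubject: %s\nMIME-Version: 1.0\n' "$to" "$subject"
        printf 'Content-Type: multipart/mixed; boundary="%s"\n\n' "$boundary"
        printf -- '--%s\nContent-Type: text/plain\n\nFile attached.\n\n' "$boundary"
        printf -- '--%s\nContent-Type: application/octet-stream; name="%s"\n' "$boundary" "$attachment"
        printf 'Content-Transfer-Encoding: base64\nContent-Disposition: attachment; filename="%s"\n\n' "$attachment"
        base64 "$attachment"                  # the actual encoding step replacing uuencode
        printf -- '--%s--\n' "$boundary"
} | /usr/sbin/sendmail -t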

Posted By RudiC
No idea by yourself on how to go about it?

Building on what Don Cragun posted, try
awk -F, '
BEGIN {m = 9
}
$9 ~ /^(High|Critical)$/ &&
$7 == "availability" {c[$6]++
...

Posted By Don Cragun
Maybe you want something more like:
awk -F, '
BEGIN { m = 9
}
$9 ~ /^(High|Critical)$/ && $7 == "availability" {
c[$6]++
n[$6] += $11 == "NO"
y[$6] += $11 == "YES"
if(length($6) > m) m =...
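
The preview ends before the report is produced. A simplified sketch of the same counting logic (the printf in END is an assumption, not Don Cragun's actual output code, which also tracks the widest value in column 6 via m):

awk -F, '
$9 ~ /^(High|Critical)$/ && $7 == "availability" {
        c[$6]++                         # matching rows per value in column 6
        n[$6] += ($11 == "NO")          # how many of them have NO in column 11
        y[$6] += ($11 == "YES")         # how many have YES
}
END {   for (g in c) printf "%-12s total=%d yes=%d no=%d\n", g, c[g], y[g], n[g]
}' file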

Posted By RudiC
I'm glad I could (almost) help. For your required modifications, why don't you give it a try, with 168 posts and a six year membership?

Posted By RavinderSingh13
Hello nikhil jain,

If you want duplicates in your output, then the following may help.

awk '/Student/{A=$0;next} /bytes/ && A && $2 > 1000{print A;A=""}' Input_file
If you don't...
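
As a made-up illustration of what the one-liner does: it remembers the most recent Student line and prints it once a later bytes line carries a second field above 1000. Given

Student A
bytes 500
Student B
bytes 1500

only "Student B" is printed.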

Posted By stomp
If the patterns are always fixed strings, using fgrep or grep -F may result in a HUGE Performance Boost.

If possible, run fgrep without -i. That'll get you another Performance Boost and also...
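
A sketch of the difference, with made-up file names (patterns.txt holds one fixed string per line):

grep -F -f patterns.txt largefile       # fixed-string matching, usually much faster
grep -F -i -f patterns.txt largefile    # add -i only if the case really is unknown; it costs extra

fgrep is the traditional name for grep -F.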

Posted By RavinderSingh13
Hello Nikhil,

There seem to be trailing spaces at the end of your lines, so could you please try the following and let us know if this helps.

awk -F"|" 'NR==1{print $2 FS $3 FS "CNT";...

Posted By RudiC
You didn't answer my question, so I go on with my idea. Try

awk 'NR==1 {print $2, $3, "CNT"; next} {SUM[$2 FS $3]++} END {for (s in SUM) print s, SUM[s]}' FS="|" OFS="|" file
Col2|Col3|CNT
POSIX...
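
For illustration, with a made-up pipe-delimited file such as

Col1|Col2|Col3|Col4
1|POSIX|awk|x
2|POSIX|awk|y
3|GNU|sed|z

the one-liner prints the header of columns 2 and 3 plus CNT, then each distinct column 2/column 3 pair with its count (group order is not guaranteed):

Col2|Col3|CNT
POSIX|awk|2
GNU|sed|1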

Posted By Don Cragun
Here is a commented version of the script I suggested:
#!/bin/ksh
# Set IAm to the final component of the pathname used to invoke this script.
IAm=${0##*/}

# If the number of operands given to...
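
The rest of the script is cut off by the preview. A small sketch of the two idioms that are visible (the operand check and the message are assumptions, not Don Cragun's code):

#!/bin/ksh
IAm=${0##*/}            # ${0##*/} strips everything up to the last /, leaving the script name

if [ $# -ne 1 ]         # complain on stderr and exit if the operand count is wrong
then    printf '%s: exactly one operand expected\n' "$IAm" >&2
        exit 1
fi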

Posted By RavinderSingh13
Hello Nikhil,

In case you need only those lines that have a greater number of delimiters, the following may help. The code is taken from RudiC's suggestion with a minor edit, as follows.

awk -F...

Posted By RudiC
Try awk -F '|' 'NR==1 {n=NF} NF-gsub(/\\\|/, "&")!=n' file
1|Raj|null|

Posted By Scrutinizer
Try:
awk -F '|' 'NR==1{n=NF}NF>n' file
or if it is either less or more:
awk -F '|' 'NR==1{n=NF}NF!=n' file

Posted By rdrtx1
not sure. try:
awk '
NR==1 {for (cc=1; cc<=NF; cc++) n[$cc]=$cc; t=$0; next;}
{
if ($1 != "0") c[1]++;
for (i=2; i<=NF; i++) if ($i != "NA" && $i != "null" ) c[i]++;
}
END {
print...
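
The END block is cut off. A self-contained variant of the same idea (indexing the header by column position instead of by name, with an assumed output format) would be:

awk '
NR==1   {for (cc=1; cc<=NF; cc++) n[cc]=$cc; nf=NF; next}       # remember the header names
        {if ($1 != "0") c[1]++
         for (i=2; i<=NF; i++) if ($i != "NA" && $i != "null") c[i]++}
END     {for (i=1; i<=nf; i++) printf "%s=%d\n", n[i], c[i]+0}  # populated count per column
' file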

Posted By RudiC
Try this awk solution:
awk -F, '{for (i=1; i<=NF; i++) if ($i~/^"build":/) $i="\"build\":\"" PARM "\""}1' PARM="abcd release" OFS="," file
,"environment":"accent-release","build":"abcd...

Posted By anbu23
Removed first forward slash
sed 's/"build":"[^"]*"/"build":"sdk-1971_00_label_npr-release_releasevalue-1971_00_label_npr-release_releasevalue"/g' File

Posted By rbatte1
Is your requirement really to find all files called v.info below /o but only once per immediate subdirectory? You could manage that with something like:-
cd /o
for subdir in *
do
find $subdir...
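
A possible completion of that loop (the -print -quit trick is an assumption and is GNU find specific; on other systems pipe the find through head -1 instead):

cd /o
for subdir in *
do
        [ -d "$subdir" ] || continue               # only look inside directories
        find "$subdir" -name v.info -print -quit   # stop after the first v.info per subdir
done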

Posted By hanson44
I can't really test this, but I think it will work, if I understand the question:

pnos_save_file=$HOME/pnos_save_file.txt
num=1
dat10=`date "+%m/%d/%Y" -d "+10 days"`
dat=`date "+%m/%d/%Y"`...

Posted By PikK45
I tried

echo "1.04/1.05ELA=20000,POLLK=35000,RH=5000,MH=7000,WH=4359
1.7:ELA=2000,POLLK=2000,RH=2000,MH=2000,WH=607
1.9:ELA=2000,POLLK=2000,RH=2000,MH=2000,WH=396...

Posted By RudiC
I'd bet your file has windows control chars in it, at least a <CR> at the end. Get rid of those and it will fly.
And, in the regex, you don't need the \ in front of the $ sign representing the line...
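
Typical ways to strip those carriage returns before running the awk (the file name is a placeholder):

tr -d '\r' < file.txt > file.unix       # drop every CR
sed 's/\r$//' file.txt > file.unix      # GNU sed: drop a CR only at the end of a line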

Posted By Yoda
awk -F, '{ s=$1;for(i=2;i<=NF;i+=2) s=($i~/^[ ]*$/)?s FS 0:s FS $i; print s }' filename

Posted By Scrutinizer
Try:
awk -F'[ \t]+,[^,]+,' '$1=$1' OFS=, file
awk '{gsub(/[ \t]+,[^,]+,/,",")}1' file
perl -pe 's/\s+,.*?,/,/g' file

Posted By MadeInGermany
Your last requirement "undefined lines should have 0 value" applies for a matrix (two dimensional array).
With GNU/POSIX awk:
awk -F, '
FNR==1 { ++xm }
FNR>ym { ym=FNR }
{ a[FNR,xm]=$2 }
END {...
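
The END block is cut off; a sketch of how the matrix might be printed, with the "undefined lines should have 0 value" requirement handled by the (y,x) in a test (the output format and file names are assumptions):

awk -F, '
FNR==1 { ++xm }                         # each new input file opens a new column
FNR>ym { ym=FNR }                       # track the longest file, i.e. the row count
       { a[FNR,xm]=$2 }                 # store column 2 in the matrix
END    { for (y=1; y<=ym; y++)
             for (x=1; x<=xm; x++)
                 printf "%s%s", ((y,x) in a ? a[y,x] : 0), (x<xm ? "," : "\n")
}' file1 file2 file3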

Posted By RudiC
Building on Jotne's proposal, you may want to try:
$ awk -F, '{C5+=$5; C6+=$6; C7+=$7; C8+=$8; R=$5+$6+$7+$8; $1=$1; print $0, R}
END {print "total",x,x,x,C5, C6, C7, C8, C5+C6+C7+C8}
...