Search Results

Search: Posts Made By: ncwxpanther
Posted By RudiC
For exactly your above laid out problem, try
awk 'NR > 1 {print LEADIN, (sum+$3)/3} {LEADIN = $1 OFS $2; sum = $13 + $14} ' file
01001 271895 59.5933
01001 271896 60.9133
01001 271897 62.4867
The...
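The snippet above was cut off; here is a minimal, self-contained sketch of the same running-average pattern. The sample data and the $3/$4 field numbers are assumptions for illustration (the post as shown used $13/$14 against a wider file):

```shell
# For each line after the first, print the previous line's key columns
# ($1 and $2) together with the average of the previous line's two value
# columns and the current line's $3.
printf '%s\n' \
  '01001 271895 59.0 60.0' \
  '01001 271896 60.5 61.0' \
  '01001 271897 62.0 63.0' |
awk 'NR > 1 {print LEADIN, (sum + $3)/3}
     {LEADIN = $1 OFS $2; sum = $3 + $4}'
```

With this input the two averages printed are 59.8333 and 61.1667.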
Posted By Scrutinizer
Adaptation of post #7 for the new requirements:

awk -v c=4 '
!($1 in M) {
M[$1]=$c
}
{
if($c>-99.99 && $c<=M[$1]) {
M[$1]=$c
V[$1]=$2
}
}
END {
...
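The END block of the post above was truncated. A hedged completion of the same pattern (the final print is assumed, and c=3 with three-column sample data is used here instead of the original c=4): per key in $1, track the minimum of column c while skipping the -99.99 missing-data sentinel, remembering the $2 value where the minimum occurred.

```shell
printf '%s\n' 'a x1 5' 'a x2 3' 'a x3 -99.99' 'b y1 7' 'b y2 9' |
awk -v c=3 '
!($1 in M) { M[$1] = $c }          # first sighting initializes the minimum
{
  if ($c > -99.99 && $c <= M[$1]) { # ignore the sentinel value
    M[$1] = $c
    V[$1] = $2                      # remember where the minimum occurred
  }
}
END {
  for (k in M) print k, V[k], M[k]
}'
```

With this sample the -99.99 line for key a is skipped, so the output (in unspecified for-in order) is `a x2 3` and `b y1 7`.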
Posted By nezabudka
I get other data
awk '
!($1 in t) {t[$1] = $3}
!/-99.99/ {for(i = 3; i <= NF; i++) if (t[$1] > $i) {
t[$1] = $i; k[$1] = $2}
}
END { for...
Posted By vgersh99
a bit verbose, but... something along these lines.
awk -f ncw.awk myFile where ncw.awk is:

{
for(i=3;i<=NF;i++) {
key[$1]
if ( $i >0 && (!(($1, i) in f1) || f1[key[$1],i] > $i))...
Posted By RudiC
After six and a half years as a member and with more than 130 posts, some of which on average (https://www.unix.com/shell-programming-and-scripting/275793-average-select-rows.html#post303009064)...
Posted By Scrutinizer
This seems to work:
awk '{ print substr($1,62,11) " " $16}' file.txt
The third parameter in substr is the number of characters that you want to print, not the position.

Alternatives based on...
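A minimal demonstration of the point above, with a made-up string: the third argument of substr is a character count, not an end position.

```shell
# substr(s, start, length): print 4 characters beginning at position 3.
echo 'ABCDEFGHIJ' | awk '{ print substr($1, 3, 4) }'
```

This prints `CDEF` (characters 3 through 6), not characters 3 through 4.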
Posted By Don Cragun
You could also try using awk. If the output order doesn't matter, try:
awk '
FNR == NR {
k[$0]
next
}
$0 in k {
k[$0]++
}
END { for(key in k)
if(k[key] > 1)
print key, k[key]
}'...
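A runnable sketch of the approach above; the sample files are invented here just to make it self-contained. A line from the first file is printed when it occurs more than once in the second file:

```shell
printf '%s\n' apple banana cherry > f1.txt
printf '%s\n' apple apple banana  > f2.txt

awk '
FNR == NR {
    k[$0]          # remember every line of the first file
    next
}
$0 in k {
    k[$0]++        # count matches in the second file
}
END {
    for (key in k)
        if (k[key] > 1)
            print key, k[key]
}' f1.txt f2.txt

rm -f f1.txt f2.txt
```

With this sample only `apple 2` is printed: banana matches once (count 1) and cherry never matches.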
Posted By Padow1
Here is one way to do it. If you need the values put in to a variable for later use, follow the example below of i=`command` to have it for later use. Note that this script follows your logic of...
Posted By Don Cragun
The following seems to do what I think you want...
#!/bin/ksh
# Final component of script name.
IAm=${0##*/}

# Absolute pathname of control file.
CF='/some/dir/control.status'

# Absolute...
Posted By Don Cragun
Hi ncwxpanther,
Instead of moving all of your data files to a parent directory, I would just have executed the script in the child directory where the files were located. But, either way should...
Posted By Corona688
You can sort everything by value and just let awk decide which belongs to which:
REF="test/190005.pnt"

sort -k3 -n test/{1900..2016}05.pnt |
awk '
# Read the values you want to...
Posted By Corona688
$ is the operator for column in awk.

$ awk -v COL=3 '{ print $COL }' <<EOF
A B C D E
EOF

C

$

[edit] Okay, 3 people beat me to it. I still like my example anyway :p
Posted By MadeInGermany
Prefix it with $ in awk:
extract=$(awk '{print ($'$i')}' input.file)
or
extract=$(awk -v myvar=$i '{print ($myvar)}' input.file)
Posted By rdrtx1
extract=$(awk '{print $fld}' fld=$i input.file)
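The three replies above are the same idea expressed three ways. A side-by-side sketch, using a hypothetical COL variable and one line of piped sample data instead of input.file:

```shell
COL=2
# 1. command-line assignment after the program (rdrtx1's form)
printf 'a b c\n' | awk '{print $fld}' fld="$COL"
# 2. the -v option (evaluated before the program starts)
printf 'a b c\n' | awk -v fld="$COL" '{print $fld}'
# 3. splicing the shell variable into the program text
printf 'a b c\n' | awk '{print $'"$COL"'}'
```

All three print `b`. The -v form is usually preferred because the value is available in a BEGIN block, and nothing in it is re-parsed as awk source.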
Posted By jim mcnamara
I do not see your logic clearly. But what if we remove all the alphas and go from there?

awk '{
a=$0
gsub(/[A-Z]/, " ", a)
split(a,arr," ")
}
arr[1] ~ /123/...
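A runnable sketch of the idea above, with made-up input: blank out the alphabetic characters, then split what remains into number groups.

```shell
echo 'AB123CD456' | awk '{
    a = $0
    gsub(/[A-Z]/, " ", a)     # replace every capital letter with a space
    n = split(a, arr, " ")    # collect the remaining number groups
    print arr[1], arr[2]
}'
```

This prints `123 456`; the truncated post then matches arr[1] against a pattern.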
Posted By RudiC
How about
awk '
BEGIN {MAX=-1E100
}
{for (x=2; x<=NF; x++) if ($x>MAX) {MAX = $x
C1 = $1
...
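The post above was cut off; a completed, self-contained sketch of the same pattern (the END block and sample data are assumptions): scan fields 2..NF of every line, keep the overall maximum and the column-1 label of its row.

```shell
printf '%s\n' 'row1 3 9 2' 'row2 8 1 4' |
awk '
BEGIN { MAX = -1E100 }           # start below any plausible data value
{
    for (x = 2; x <= NF; x++)
        if ($x > MAX) { MAX = $x; C1 = $1 }
}
END { print C1, MAX }'
```

With this sample the maximum is 9 on row1, so it prints `row1 9`.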
Posted By Don Cragun
Since you're matching on the 1st column in both files, you need A[$1] instead of A[$NF]. It looks like you want something more like:
awk 'NR == FNR {A[$1]=$2;next};$1 in A {print...
Posted By MadeInGermany
Instead of
{ print C1, MAX }
You can do
{ print substr(C1,1,11), substr(C1,13,4), substr(C1,18,2), substr(C1,20,8), MAX }
Posted By MadeInGermany
The original code can be extended like this
awk '{ for(x=2;x<=NF;x++) {a[++y]=$x; b[$x]=$1} } END { c=asort(a); print "max:",b[a[c]],a[c] }' file
But it is more efficient to only remember the current...
Posted By RudiC
Why don't you start at x=2 if you're not interested in $1?
However, try
awk '
BEGIN {MAX=-1E100
}
{for (x=2; x<=NF; x++) if ($x>MAX) {MAX = $x
...
Posted By RudiC
Could you imagine somebody does not know what lat is nor lon? Assuming we're talking of fields 1 and 2 in either file, try
awk 'FNR==NR {T[$1,$2]++; next} ($1,$2) in T' file1 file2
24.5625 -81.8125 ...
Posted By Don Cragun
What operating system and shell are you using? If you're using a Solaris/SunOS system, change awk to /usr/xpg4/bin/awk or nawk.

Is your fileA in DOS format with <carriage-return><newline> line...
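If the file does turn out to have DOS line endings, one common fix (not from the original thread, just a standard technique) is to strip the carriage returns with tr before awk sees them:

```shell
# The trailing \r would otherwise become part of the last field.
printf 'a b\r\nc d\r\n' | tr -d '\r' | awk '{print $2}'
```

Without the tr step, $2 on each line would be "b\r" and "d\r" and string comparisons against "b" or "d" would silently fail.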
Posted By MadeInGermany
I think that file ABC has all the information.
while read c1 c2
do
fname="$c1.txt"
{
echo "> -Z$c2"
cat "$fname"
} >"$fname.new"
done <"ABC"
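A self-contained run of the loop above with throwaway sample files (the names p1 and 0.5 are invented for illustration): ABC maps a basename to a -Z value, and each <name>.txt gets a "> -Z..." header prepended into <name>.txt.new.

```shell
printf 'p1 0.5\n' > ABC
printf 'data line\n' > p1.txt

while read c1 c2
do
  fname="$c1.txt"
  {
    echo "> -Z$c2"      # header line built from the second column of ABC
    cat "$fname"        # followed by the original file contents
  } > "$fname.new"
done < ABC

cat p1.txt.new
rm -f ABC p1.txt p1.txt.new
```

The cat at the end shows the new file: the `> -Z0.5` header followed by `data line`.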
Posted By neutronscott
The problem with max is the > and comparing strings to numbers.
The problem with min is we start with min being 0 and none of those are below zero.

Let's skip lines that don't have 2 columns, and...
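A sketch of both fixes described above, with made-up data: NF == 2 skips the short lines, and adding 0 forces a numeric rather than string comparison; min is seeded from the first valid value instead of starting at 0.

```shell
printf '%s\n' '9' 'a 10' 'b 2' |
awk 'NF == 2 { v = $2 + 0                     # coerce to a number
               if (min == "" || v < min) min = v
               if (v > max) max = v }
     END { print min, max }'
```

The lone `9` line is skipped, so this prints `2 10`; with string comparison, "10" < "2" would have given the wrong maximum.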
Posted By disedorgue
Hi,
Try:
awk 'NR == FNR {A[$1]=$2;next};$NF in A {print $NF,A[$NF],$2,$3}' fileA fileB
Regards.
Showing results 1 to 25 of 48

Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.