Search Results

Search: Posts Made By: geomarine
Posted By RudiC
What do you mean by "I want to retain the structure of the table"?
Why do you run the for loop across all fields if you want to operate on $3 only?
What are the two ored branches in the if...
Posted By vgersh99
awk '$3 >= -2000 {$3="NaN"}1' myFile
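For illustration, here is that one-liner run on a small invented 3-column sample (the file name myFile is from the post; the data is made up):

```shell
# Invented sample: x y z triples; any z value >= -2000 gets masked as NaN.
printf '%s\n' '1 2 -3000' '1 3 -1500' > myFile

# The condition rewrites $3; the trailing 1 prints every record.
awk '$3 >= -2000 {$3="NaN"} 1' myFile
# 1 2 -3000
# 1 3 NaN
```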
Posted By Scrutinizer
I tried your script and I get your expected output. Do you have sample where the expected output is not produced?
Posted By vgersh99
a slightly simplified variation:

awk '{idx=$1 SUBSEP $2} FNR==NR{a[idx];next} idx in a' t.xyz a.xyz > out.xyz
Posted By RudiC
This works on my linux mawk 1.3.3:
awk 'FNR==NR {a[$1,$2]; next} ($1,$2) in a' t.xyz a.xyz
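A quick demo of this coordinate-matching idiom on invented sample data (the real t.xyz/a.xyz come from the thread): the first file's ($1,$2) pairs are stored as array keys, and only matching rows of the second file are printed.

```shell
printf '%s\n' '1 2' '3 4' > t.xyz
printf '%s\n' '1 2 9.5' '5 6 7.0' '3 4 1.1' > a.xyz

# FNR==NR is true only while reading the first file: store its ($1,$2) keys.
# For the second file, the bare condition prints rows whose pair was stored.
awk 'FNR==NR {a[$1,$2]; next} ($1,$2) in a' t.xyz a.xyz
# 1 2 9.5
# 3 4 1.1
```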
Posted By RudiC
Hmmm - repeating your unmodified problem description doesn't really help, especially when the error seems to have been tracked down by Scrutinizer - the (first?) file doesn't have the necessary line...
Posted By Don Cragun
You can use the following to directly process DOS format text files without manually stripping out the <carriage-return> characters from your input files...

Create a file named merge containing:...
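The merge script itself is cut off in the search preview. As a general illustration only (not Don Cragun's script), awk can drop the DOS carriage return as each line is read, so the fields compute correctly:

```shell
# Invented CRLF sample; real data would come from the DOS-format files.
printf '1 2\r\n3 4\r\n' > dos.txt

# sub(/\r$/, "") strips the trailing carriage return from $0
# (re-splitting the fields) before any further processing.
awk '{sub(/\r$/, "")} {print $1 + $2}' dos.txt
# 3
# 7
```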
Posted By wisecracker
Hi geomarine...

Think of your problem and how to get round it.
You want a single file in UNIX format.

Here is a DEMO of your two files:
#!/bin/sh
# All longhand to see the sequence of...
Posted By RudiC
Please don't forget the requestor is using Win 7 Pro, probably with a cygwin or busybox setup, both with a (limited?) set of not-necessarily-*nix-compatible tools. I'm not sure that s/he will really...
Posted By wisecracker
I was well aware of the situation, but he did stress in an earlier post that he wanted it in UNIX format.
Good point, though; it is just as easy to put the '\r\n's back if that becomes a necessity.
Posted By Don Cragun
You have shown us that you have two input files with <carriage-return><new-line> line separators and no final line terminators (i.e., DOS text file format).

You have shown us several outputs that...
Posted By RavinderSingh13
Hello geomarine,

It is working fine for me; moreover, each Input_file's output begins on a new line, as follows.


cat file1.TXT file2.TXT > file3.TXT
cat file3.TXT...
Posted By Scrutinizer
Probably the last line does not have a closing linefeed character. Proper Unix text files are required to have one.

See if this works:
echo | cat file1.txt - file2.txt > file3.txt
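A minimal reproduction of the problem and the fix (file contents invented): without a final newline in file1.txt, plain cat glues its last line to the first line of file2.txt; the echo supplies the missing newline through - (stdin):

```shell
# Both files lack a terminating newline on their last line.
printf 'a\nb' > file1.txt
printf 'c\nd' > file2.txt

cat file1.txt file2.txt            # 'b' and 'c' end up on one line: bc
echo | cat file1.txt - file2.txt   # the newline from echo separates them
# a
# b
# c
# d
```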
Posted By vgersh99
Updated the post with the MIA $.
Thanks, RudiC
Posted By Corona688
Use code tags for code please.

Try read N L <<< $(grdinfo data.grd | awk '{print $10,$5}')
echo $N $L
Posted By RudiC
@vgersh99: although I really love this creative approach, I'm afraid there's a leading $ sign missing in the "command substitution"?
Posted By vgersh99
eval $(grdinfo data.grd | awk '{printf("N=%s%sL=%s\n", $10,OFS,$5)}')
echo "N->[${N}] L->[${L}]"
Posted By RudiC
Welcome to the forum.


Depending on your bash version, there are different options:
- shopt -s lastpipe; set +m

- read N L <<< $(grdinfo data.grd | awk '{print $10,$5}')


- read N L...
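A runnable sketch of the lastpipe route, with an echo of invented fields standing in for the real grdinfo data.grd output (which isn't available here):

```shell
#!/bin/bash
# lastpipe runs the last stage of a pipeline in the current shell,
# so variables set by 'read' survive after the pipeline ends.
shopt -s lastpipe
set +m    # lastpipe needs job control off (relevant in interactive shells)

# Invented stand-in for: grdinfo data.grd
echo 'f1 f2 f3 f4 55 f6 f7 f8 f9 110' | awk '{print $10, $5}' | read N L
echo "N=$N L=$L"
# N=110 L=55
```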
Showing results 1 to 18 of 18

 
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.