Search Results

Search: Posts Made By: RudiC
Posted By Scrutinizer
Try: awk ' NR==FNR { ...
Try:

awk '
NR==FNR { # When reading the first file (then NR is equal to FNR)
A[$1]=$0 # Store the first file...
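The excerpt above is truncated; a minimal self-contained sketch of the same two-file idiom (file names and the key field are placeholders) might look like:

awk '
NR==FNR { A[$1] = $0; next }    # first file: index whole line by column 1
$1 in A { print A[$1], $0 }     # second file: print each line next to its match
' file1 file2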
Posted By MadeInGermany
The echo $a1 | ssh ... redirects stdin - but...
The echo $a1 | ssh ... redirects stdin - but nothing reads from it.
You can just redirect stdin with </dev/null ssh ... or close stdin with ssh -n ...; that should have the same effect and not...
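As a rough illustration of that point (host name and remote command are placeholders):

a1="some value"
echo "$a1" | ssh user@remotehost 'cat /etc/hostname'   # the pipe feeds ssh's stdin, but the remote command never reads it
</dev/null ssh user@remotehost 'cat /etc/hostname'     # same effect: stdin is redirected away
ssh -n user@remotehost 'cat /etc/hostname'             # -n closes stdin explicitly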
Posted By vgersh99
Strange... Given your sample input in post#1...
Strange...
Given your sample input in post#1 and using RudiC's code, I get:

{"NAME":"QLogic 570x/571x Gigabit Ethernet Driver",
"VERSION":"11.11,REV=2009.11.11",
"BASEDIR":"/"}
{"NAME":"QLogic...
Posted By Scrutinizer
Hi, try: join -t \| -a 1 -a 2 -e 'NULL' -o 0...
Hi, try:
join -t \| -a 1 -a 2 -e 'NULL' -o 0 1.2 1.3 1.4 2.2 2.3 2.4 \
<(join -t \| -a 1 -a 2 -e 'NULL' -o 0 1.2 2.2 2.3 file1 file2) file3
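Read inside out, the nested command is a full outer join of file1 and file2 followed by a full outer join of that result with file3. Rewritten with a temporary file purely for illustration:

# step 1: join file1 and file2 on field 1, keeping unpairable lines from both (-a 1 -a 2)
#         and padding missing fields with NULL (-e); output key, file1 field 2, file2 fields 2-3
join -t \| -a 1 -a 2 -e 'NULL' -o 0 1.2 2.2 2.3 file1 file2 > tmp12
# step 2: join that result with file3 the same way (equivalent to the process substitution above)
join -t \| -a 1 -a 2 -e 'NULL' -o 0 1.2 1.3 1.4 2.2 2.3 2.4 tmp12 file3
# note: join expects each input to be sorted on the join field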
Posted By wisecracker
Perhaps because the OP might mistake '1' as being...
Perhaps because the OP might mistake '1' as being a result or a '$?' as opposed to a 'true'.
(Also it does not work in POSIX but that is my reason not the OP's.)
Posted By MadeInGermany
Attention: ssh -qn is ssh -q -n where -n...
Attention:
ssh -qn is ssh -q -n where -n inhibits reading from stdin - good for ssh -q -n remotehost remotecommand. But it must be ssh -q remotehost < file and ssh -q remotehost << heredoc and ...
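A small sketch of the distinction (host, command, and file names are placeholders):

ssh -q -n remotehost uptime                   # remote command needs no stdin: -n is fine
ssh -q remotehost 'cat > /tmp/copy' < file    # stdin carries the file, so -n must be dropped
ssh -q remotehost /bin/sh <<'EOF'
echo "these commands arrive on the remote shell's stdin"
EOF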
Posted By rbatte1
If you have them in a file, how about:-cut -f-3...
If you have them in a file, how about:-
cut -f-3 -d"." input | sort | uniq -c
Would that do? It's not quite the pretty output, but you get the detail you need.



I hope that this helps,
Robin
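A tiny worked example, assuming dot-separated input such as IP addresses (the thread's actual data is not shown here):

printf '10.1.2.5\n10.1.2.9\n10.9.9.1\n' > input    # made-up sample
cut -f-3 -d"." input | sort | uniq -c
#   2 10.1.2
#   1 10.9.9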
Posted By Scrutinizer
Indeed \| is a GNU extension to Basic Regular...
Indeed \| is a GNU extension to Basic Regular Expressions (BRE), as are \? and \+. But I do not see a use for them, since GNU utilities also support at least Extended Regular Expressions (ERE)...
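A quick illustration of the equivalence being discussed (pattern and input are made up):

printf 'cat\ndog\nfox\n' | grep 'cat\|dog'      # GNU grep BRE with the \| extension
printf 'cat\ndog\nfox\n' | grep -E 'cat|dog'    # portable ERE form: the same two lines match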
Posted By MadeInGermany
${IARR[-1]} is bash-4. ksh (and bash-3) need...
${IARR[-1]} is bash-4.
ksh (and bash-3) need ${IARR[${#IARR[@]}-1]}

read -A and <<< require ksh93 - do not work with the older ksh88.
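Side by side, using bash/ksh93 compound assignment and an arbitrary array:

IARR=(10 20 30)
echo ${IARR[-1]}                  # bash 4: negative index counts from the end -> 30
echo ${IARR[${#IARR[@]}-1]}       # ksh and bash 3: compute the last index -> 30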
Posted By gull04
Hi, Using the data that you provided in the...
Hi,

Using the data that you provided in the previous post and the "grep" already supplied I get;

[e434069@fbakirpomd4 data]$ grep -v " [01] *[+-]" test_01.txt
chr10 85504 85558...
Posted By gull04
Hi, If by; You actually mean to output...
Hi,

If by;

You actually mean to output the first file to the printer and then to reposition the paper to the first line and print the second file on a line by line basis starting at the first...
Posted By MadeInGermany
The if (t ~ $1) is a RE match that is "fuzzy"...
The if (t ~ $1) is a RE match that is "fuzzy" unless it is anchored.
Should be if (t ~ ("^" $1)); the ^ anchor means the string $1 must occur at the beginning of string t.
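A small sketch of the difference, with literal strings standing in for t and $1:

awk 'BEGIN {
    t = "12345"
    if (t ~ "234")       print "fuzzy: 234 matches anywhere inside " t
    if (t ~ ("^" "234")) print "anchored: would only match at the start"   # not printed here
}'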
Posted By jim mcnamara
Do you need distinct awk functions for every line...
Do you need distinct awk functions for every line of text or just a few functions? Sample input, output would help a lot.

Edit: I don't understand the request either....
Posted By rdrtx1
here: 🥄
here: 🥄
Posted By wisecracker
Hi amar1208... Just a note reference...
Hi amar1208...

Just a note reference rdrtx1's post #2:
The first line should read something like #!/bin/bash, as 'sh' assumes POSIX compatibility, so ARRAYS are technically not possible.
Make...
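For instance, a hypothetical script along these lines runs under bash but fails under a strictly POSIX sh such as dash:

#!/bin/bash
# arr=( ... ) is a bash/ksh feature; a POSIX-only sh reports a syntax error on it
arr=(one two three)
echo "${arr[1]}"     # prints: two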
Posted By Chubler_XL
You could try using index and substr instead of...
You could try using index and substr instead of match to avoid regex overheads. This takes about 2mins for a 2GB file on my system:

awk '
BEGIN {
HDLN =...
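The quoted code is cut off above; as a generic illustration of the technique (the tag "ERROR:" and the file name are invented), index() and substr() can replace match() when searching for a fixed string:

awk '
{
    p = index($0, "ERROR:")            # fixed-string search: no regex engine involved
    if (p) print substr($0, p + 6)     # text after the 6-character tag
}' logfile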
Posted By Scrutinizer
I did some testing and RudiC's second method...
I did some testing and RudiC's second method turned out to be fastest.

I would suggest trying mawk

In tests I conducted with RudiC's approach mawk was several orders faster than regular awk...
Posted By joeyg
When you address RudiC, and your approach to...
When you address RudiC, and your approach to this, you should also include a reason for the request.
Your request does not seem to have any logical reason, and thus gives the impression that it is...
Posted By Don Cragun
Note that RudiC's and Scrutinizer's suggestions...
Note that RudiC's and Scrutinizer's suggestions both depend on the fact that the orgX and orgXX strings in file2 are distinct. Had file2 also contained the line:
org2=japan
both of those...
Posted By sandeepgoli53
Hi Guys, I am able to get desired output...
Hi Guys,

I am able to get desired output after changing my shell script as mentioned below.

#!/bin/bash
file="db_detail.txt"
. $file
rm /batch/corpplan/bin/dan.csv...
Posted By vgersh99
Just to quote RudiC: A desired output would...
Just to quote RudiC:

A desired output would be helpful as well....
Also, you've been asked to start using code tags when posting code/data sample...
Posted By wisecracker
This is the third time you have asked this and...
This is the third time you have asked this and you have been steered in the right direction - twice:
...
Posted By nezabudka
Maybe this will help? arr=(${arr[@]: -2}...
Maybe this will help?
arr=(${arr[@]: -2} ${arr[@]:0:$((${#arr[@]}-2))})
echo ${arr[@]}
Posted By MadeInGermany
Comment on the previous solution: it allows echo...
Comment on the previous solution: it allows echo ${array[-i]} because the =( ) splits on IFS, i.e. space and newline.

If there is a cyclic shift, the % operator (modulo) is nice!
The following...
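The quoted post is also cut off; a rough sketch of a modulo-based cyclic shift (the array and shift amount are placeholders, not the original code):

arr=(a b c d e)
n=${#arr[@]}
shift_by=2
new=()
for ((i = 0; i < n; i++)); do
  new[i]=${arr[(i + n - shift_by) % n]}   # rotate right by shift_by positions
done
echo "${new[@]}"    # d e a b c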
Posted By Don Cragun
Assuming that $OPTIONS does not expand to a...
Assuming that $OPTIONS does not expand to a string containing anything other than alphanumeric characters and <space>s, one could also use:
echo $(echo $OPTIONS|tr ' [:upper:]' '\n[:lower:]'|sort...
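The command is truncated after sort; a self-contained sketch of the idiom, with a made-up OPTIONS value and sort -u added as a guess at the de-duplication step:

OPTIONS="Alpha beta ALPHA Gamma beta"
echo $(echo $OPTIONS | tr ' [:upper:]' '\n[:lower:]' | sort -u)
# prints: alpha beta gamma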
Showing results 1 to 25 of 500

 
