Search Results

Search: Posts Made By: a_bahreini
Posted By vgersh99
awk -f ab.awk FDA_Approved_drugs_names_sectioned.txt Drug-Gene_match_dgidb.txt
where ab.awk is:

BEGIN {
    FS = OFS = "\t"
}
FNR == NR { a[$1] = $0; next }
{
    for (i in a) {
        one = $1
...
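The script body is cut off in this preview. A minimal sketch of the same two-file FNR==NR pattern, with the per-line matching filled in as an assumption (substring lookup via index(); the real ab.awk may do something different):

BEGIN {
    FS = OFS = "\t"
}
# first file: remember each line, keyed by its first field
FNR == NR { a[$1] = $0; next }
# second file: print any remembered line whose key occurs in field 1
{
    for (i in a)
        if (index($1, i))
            print a[i], $0
}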
Posted By alister
If you can afford to keep the data in memory, the following approach should be more efficient:

awk '
{
    for (i=1; i<=NF; i++)
        a[NR,i] = $i
}

END {
    tmp_file = "tmp"
    sort_cmd = "sort...
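The END block is truncated. A sketch of how the buffered rows might be handed to an external sort from awk; the sort options and per-column handling are assumptions:

awk '
{
    # buffer every field of every record in memory
    for (i = 1; i <= NF; i++)
        a[NR, i] = $i
    nf = NF
}
END {
    # feed one column at a time through an external sort
    for (i = 1; i <= nf; i++) {
        sort_cmd = "sort -n"
        for (r = 1; r <= NR; r++)
            print a[r, i] | sort_cmd
        close(sort_cmd)   # flush this column before starting the next
    }
}' file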
Posted By Don Cragun
Try this:
#!/bin/ksh
IAm=${0##*/}
base="${IAm}_$$_"
trap 'rm -f "$base"*' EXIT
file=${1:-file}
nf=1
while [ 1 ]
do tmpname=$(printf "%s%03d" "$base" $nf)
cut -f $nf "$file" > "$tmpname"...
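The loop is cut short. A hypothetical completion that writes each tab-separated column of "$file" to its own temporary file and stops when a requested column comes back empty:

#!/bin/ksh
IAm=${0##*/}
base="${IAm}_$$_"
trap 'rm -f "$base"*' EXIT
file=${1:-file}
nf=1
while [ 1 ]
do      tmpname=$(printf "%s%03d" "$base" $nf)
        cut -f $nf "$file" > "$tmpname"
        # assumed exit test: stop once the column holds no data
        grep -q . "$tmpname" || break
        nf=$((nf + 1))
done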
Posted By ahamed101
I have corrected the code.
It's kind of difficult to code and post via mobile/tablet ;)
Posted By ahamed101
awk 'NR==FNR{A[$4]=$0; next} $6 in A{ print A[$6]"\t"$0 }' file1 file2
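The same one-liner spread across lines with comments (behavior unchanged):

awk '
NR == FNR { A[$4] = $0; next }      # file1: index each line by its 4th field
$6 in A   { print A[$6] "\t" $0 }   # file2: 6th field matches -> print both
' file1 file2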
Posted By Yoda
awk 'NR==FNR{A[$1];next}!($2 in A)' file1 file2
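This keeps every line of file2 whose second field never appears as a first field in file1.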
Posted By Yoda
Read input file twice:
awk -F'\t' '
NR == FNR {
    v = $1 FS $5
    if ( ! ( v in A ) )
        C[$5]++
    A[v]
...
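The preview stops mid-script. One plausible completion, assuming the second pass appends the number of distinct $1/$5 combinations seen for each record's $5:

awk -F'\t' '
NR == FNR {                # pass 1: count distinct $1/$5 pairs per $5
    v = $1 FS $5
    if ( ! ( v in A ) )
        C[$5]++
    A[v]
    next
}
{ print $0 FS C[$5] }      # pass 2: append the count
' file file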
Posted By targzeta
cat list.txt | while read pattern; do find sourcedir -name "*$pattern*" -exec mv '{}' destdir/ \;; done

Emanuele
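A slightly more robust form of the same loop; it drops the unneeded cat and keeps backslashes and leading blanks in the patterns intact:

while IFS= read -r pattern
do
    find sourcedir -name "*$pattern*" -exec mv '{}' destdir/ \;
done < list.txt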
Posted By bartus11
Try:
xargs -I% find /source -name "*%*" -exec mv {} /destination \; < list.txt
Posted By MadeInGermany
The ~/RE/ searches for "chr6" being a substring:
awk '$1~/chr6/ && $2~/chr6/' file
The exact string match:
awk '$1=="chr6" && $2=="chr6"' file
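If a regular expression is preferred but the match must cover the whole field, anchoring gives the same effect as the string comparison:

awk '$1 ~ /^chr6$/ && $2 ~ /^chr6$/' file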
Posted By Yoda
Try this if all columns need to be compared:
awk 'NR==FNR{A[$1];next}$1 in A || $2 in A || $3 in A' file1 file2
If just column 2 needs to be compared:
awk 'NR==FNR{A[$1];next}$2 in A' file1 file2
Posted By Yoda
awk '{n=gsub(/0\/1|1\/1/,"&");print $0,n}' file
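Here gsub() replaces each match of 0/1 or 1/1 with itself ("&") and returns the number of replacements, so n is the per-line count of either pattern.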
Posted By Yoda
To display the count at the end of each record:
awk '{n=gsub("0/1","&");print $0,n}' file
To display total count:
awk '{n=gsub("0/1","&");c+=n}END{print c}' file
Posted By Don Cragun
OK. I can't really tell from your example, but it looks like you intend to have tabs as field separators. I will assume that the tabs were changed to spaces as part of a copy and paste process. ...
Posted By Don Cragun
When I invoke the script I provided in message #4 in this thread as follows:
cp mtDNA_GATK_reference_letters.txt in
tester 2 A C > out
then the only difference between files...
Posted By pamu
Try this.

Just pass the parameters:
NUM = position of the character on the first line.
Rpl = the letter expected at that position.
subs = the letter to replace it with.

awk -F "" -v NUM="2" -v Rpl="A" -v subs="C"...
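The one-liner is truncated. A hypothetical full version under the stated parameters; note that FS="" (one field per character) is a common awk extension rather than POSIX, and OFS="" keeps the rebuilt line from gaining separators:

awk -F "" -v OFS="" -v NUM="2" -v Rpl="A" -v subs="C" '
NR == 1 && $NUM == Rpl { $NUM = subs }   # first line: swap the letter at NUM
{ print }
' file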
Posted By Don Cragun
Here is a much longer alternative way to do this:
#!/bin/ksh
# Usage: tester [count [from [to]]]
# Change the "count"th occurrence of the character specified by "from"
# to the...
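The rest of the script is truncated. A much shorter sketch of the same interface; it filters standard input, and everything past the usage line is an assumption:

#!/bin/ksh
# Usage: tester [count [from [to]]] < input > output
count=${1:-1} from=${2:-A} to=${3:-C}
awk -v count="$count" -v from="$from" -v to="$to" '
{
    out = ""
    for (i = 1; i <= length($0); i++) {
        c = substr($0, i, 1)
        if (c == from && ++seen == count)   # the "count"th occurrence
            c = to
        out = out c
    }
    print out
}'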
Posted By rdrtx1
awk -v p=150 -v l="C" '{for (i=1; i<=length($0); i++) {++c; o=$0; if (c==p) o=substr($0,1,i-1) l substr($0,i+1);};print o;}' infile
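As posted, o=$0 runs on every pass through the loop, so a substitution made at column i is overwritten on the next iteration unless position p lands on a line's last character. A corrected, commented equivalent:

awk -v p=150 -v l="C" '
{
    o = $0                      # copy the line once, before scanning
    for (i = 1; i <= length($0); i++)
        if (++c == p)           # c counts characters across all lines
            o = substr($0, 1, i-1) l substr($0, i+1)
    print o
}' infile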
Posted By pamu
try this...

awk -F "" '{if((max+NF)>150){for(i=1;i<=NF;i++){if((max+i) == 150 && $i ~ /T/){$i = "C"}}}else{max+=NF}}1' file
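The same idea, expanded and with three small fixes, all assumptions about the intent: OFS="" so that assigning $i does not space-separate the rebuilt line, >= so a line ending exactly at position 150 is still caught, and a done flag so the substitution cannot fire again on later lines (FS="" itself is a widely supported extension, not POSIX):

awk -F "" -v OFS="" -v p=150 '
!done && max + NF >= p {        # the p-th character falls on this line
    i = p - max                 # its position within the line
    if ($i ~ /T/)
        $i = "C"
    done = 1                    # never fire again
}
{ max += NF }                   # running character count
1' file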
Showing results 1 to 19 of 19

 