Search Results

Search: Posts Made By: siquadri
5,374
Posted By siquadri
I think this cannot be implemented in unix
I think this cannot be implemented in unix
2,588
Posted By siquadri
Isql Data format
I am using the following script
#!/bin/ksh
/usr/sybase/bin/isql -SXYZ -UABC -PXYZ -b <<EOF >temp
select A.host,A.server,B.db,A.status from dbo.server as A,dbo.db as B where A.product in...
5,108
Posted By siquadri
ypmatch,ypcat,NIS
Hi
I am moving from plain shell scripting to network-related shell scripting.
Can someone give me a basic idea of the network-related commands such as
ypmatch, ypcat, NIS etc...

Documentation will be...
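For reference, these NIS client commands are typically used along these lines (a sketch only; the map names and the user name are placeholders, and your NIS maps may differ):

domainname                  # show the NIS domain this host is bound to
ypwhich                     # show which NIS server answers for this host
ypcat passwd                # dump an entire NIS map (here the passwd map)
ypmatch someuser passwd     # look up a single key in a map
ypmatch somehost hosts      # same idea against the hosts map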
3,728
Posted By siquadri
This is what I am getting $ awk -F","...
This is what I am getting

$ awk -F"," 'FILENAME="file1.txt" {array[$1]=$1} FILENAME="file2.txt" { if (array[$4]) {print $0}}' file1.txt file2.txt
1,2,ua,xyz.com
1,2,ua,eg.com
$
4,836
Posted By siquadri
(( ))
(( ))
3,728
Posted By siquadri
Did you change your input file names to file1 and...
Did you change your input file names to file1 and file2?
3,567
Posted By siquadri
This works nawk -F"," 'FILENAME=="File1"...
This works

nawk -F"," 'FILENAME=="File1" {array[$1]=$1} FILENAME=="File2" { if (array[$4]) {print $0}}' File1 File2
4,836
Posted By siquadri
The double parentheses (( )) are equivalent to the let command. And...
The double parentheses (( )) are ksh's arithmetic evaluation construct, equivalent to the let command.
And you can find a lot of documentation on the Internet.
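A minimal sketch of how the two forms relate in ksh:

#!/bin/ksh
# (( )) and let are equivalent ways of doing integer arithmetic;
# the exit status is 0 when the result is non-zero, so (( )) works
# directly as the test in if/while/until.
i=5
let "i = i + 1"                 # the let form
(( i = i + 1 ))                 # the equivalent (( )) form
if (( i > 0 && i <= 9 )); then
    echo "i is $i, between 1 and 9"
fi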
3,728
Posted By siquadri
This works nawk -F"," 'FILENAME=="file1" ...
This works

nawk -F"," 'FILENAME=="file1" {array[$1]=$1} FILENAME=="file2" { if (array[$4]) {print $0}}' file1 file2
2,553
Posted By siquadri
This works: #!/bin/ksh ...
This works:

#!/bin/ksh
#_____________________________________
i=11
until (( i > 0 && i <= 9 ))
do
echo Please enter number between 1 to 9
read number
if (( number > 0 && number <= 9 ))...
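The excerpt is cut off above; a minimal sketch of how such a validation loop is usually finished, assuming the valid input is copied into i to end the loop:

#!/bin/ksh
# Keep prompting until a number between 1 and 9 is entered.
i=11                                  # start outside the valid range
until (( i > 0 && i <= 9 ))
do
    echo "Please enter a number between 1 and 9"
    read number
    if (( number > 0 && number <= 9 )); then
        i=$number                     # valid input ends the loop
    fi
done
echo "You entered $i"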
1,898
Posted By siquadri
This should work
This works

nawk -F"," 'FILENAME=="File_1" {arr[$0]=$0} FILENAME=="File_2" {if (arr[$1] && $3 ~ /DATA=1/) {print $0}}' File_1 File_2
3,728
Posted By siquadri
Vidhyadhar, Can you please explain this...
Vidhyadhar,

Can you please explain this command?
I tried but in vain.
7,181
Posted By siquadri
[quote=TonyFullerMalv;302303867]Try: for...
[quote=TonyFullerMalv;302303867]Try:

for SFILE in `ls *-*`; do
TFILE=`echo "${SFILE}" | sed '/-/_/g'`
echo mv ${SFILE} ${TFILE}
mv ${SFILE} ${TFILE}
done
[/quote]


Use this it...
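That reply is cut off. A hedged sketch of the same rename loop, with the sed expression written as a full substitution (s/-/_/g) and the file names quoted:

#!/bin/ksh
# Rename every file whose name contains "-", turning dashes into underscores.
for SFILE in *-*; do
    TFILE=`echo "${SFILE}" | sed 's/-/_/g'`
    echo mv "${SFILE}" "${TFILE}"     # show what is about to happen
    mv "${SFILE}" "${TFILE}"
done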
14,663
Posted By siquadri
[quote=Dendany83;302303901]Dears, I'm new in...
[quote=Dendany83;302303901]Dears,

I'm new to shell scripting and I need your help; I would like to know how I can create a script to ftp to a certain unix/linux machine/server IP address and get a...
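The quoted question is cut off; one common approach is a non-interactive ftp session fed by a here-document, sketched below (the host, user, password and file paths are placeholders only):

#!/bin/ksh
# Fetch one file from a remote host with a scripted ftp session.
HOST=192.168.1.10        # placeholder server IP
USER=ftpuser             # placeholder account
PASS=secret              # placeholder password

ftp -n "$HOST" <<EOF
user $USER $PASS
binary
get /remote/path/file.dat /local/path/file.dat
bye
EOF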
36,272
Posted By siquadri
In place of findstring use I am the String
In place of findstring use I am the String
36,272
Posted By siquadri
Check this nawk '$0 ~/findstring/ {print...
Check this
nawk '$0 ~/findstring/ {print $0;getline;gsub(/replace string/,"replaced string");print}' datafile > datafile1
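A quick demonstration on a made-up datafile: the one-liner prints each line that matches findstring, then reads the following line, substitutes on it and prints that as well; lines it never touches do not appear in the output at all.

$ cat datafile
header
findstring here
replace string on this line
footer
$ nawk '$0 ~/findstring/ {print $0;getline;gsub(/replace string/,"replaced string");print}' datafile
findstring here
replaced string on this line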
121,894
Posted By siquadri
Explain cmd
Can someone please explain this command?
2,887
Posted By siquadri
chmod
chmod 755 dirname
1,968
Posted By siquadri
Inode
ls -i: the first column is the inode number
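For example (the inode number shown is made up; find -inum can then locate a file by that number):

$ ls -i myfile
  123456 myfile
$ find . -inum 123456
./myfile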
3,083
Posted By siquadri
Try this grep -w 500 access
Try this

grep -w 500 access
13,584
Posted By siquadri
Try this: awk -F';' 'BEGIN {OFS=";"}...
Try this:


awk -F';' 'BEGIN {OFS=";"} {$4=($4*100)+$5} END{print $1,$2,$3,$4,$6}'
2,197
Posted By siquadri
Try this: awk -F'/' '{print $2}'
Try this:
awk -F'/' '{print $2}'
Forum: Fedora 04-02-2009
3,553
Posted By siquadri
You have to execute like this . scriptname
You have to execute like this
. scriptname
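A sketch of why the leading dot matters, assuming the script exports environment variables (ORACLE_HOME here is just a made-up example): sourcing runs it in the current shell so the settings survive, while executing it normally runs a child shell and the settings are lost.

$ cat setenv.sh
export ORACLE_HOME=/opt/oracle    # hypothetical variable for this sketch
$ ./setenv.sh                     # executed: runs in a child shell
$ echo "$ORACLE_HOME"             # prints an empty line, the export is gone

$ . ./setenv.sh                   # sourced: runs in the current shell
$ echo "$ORACLE_HOME"
/opt/oracle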
13,584
Posted By siquadri
Hey, can you explain your requirement clearly? ...
Hey, can you explain your requirement clearly?
Will you have two different files, one comma-separated and the other semicolon-separated?

Or do you want to create the semicolon-separated file from the comma-separated one and then do...
9,072
Posted By siquadri
Try this some_command | sort -n +1 -3
Try this
some_command | sort -n +1 -3
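Note that +1 -3 is the old-style key notation (the key starts after field 1 and ends at field 3); on current systems the equivalent -k form is safer, something like:

some_command | sort -n +1 -3     # old syntax
some_command | sort -n -k2,3     # POSIX equivalent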
Showing results 1 to 25 of 44

 