Search Results

Search: Posts Made By: siquadri
5,087
Posted By siquadri
I think this cannot be implemented in Unix
I think this cannot be implemented in Unix.
2,522
Posted By siquadri
Isql Data format
I am using the following script
#!/bin/ksh
/usr/sybase/bin/isql -SXYZ -UABC -PXYZ -b <<EOF >temp
select A.host,A.server,B.db,A.status from dbo.server as A,dbo.db as B where A.product in...
4,913
Posted By siquadri
ypmatch,ypcat,NIS
Hi
I am moving from shell scripting to network shell scripting.
Can someone give me a basic idea of the network-related commands like
ypmatch, ypcat, NIS, etc...

Documentation will be...
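In case it helps as a starting point, a few of the common NIS client commands, with a placeholder user name:

ypwhich                # show which NIS server this client is bound to
ypcat passwd           # dump the whole passwd map
ypmatch alice passwd   # look up the single key "alice" (placeholder) in the passwd map
ypcat -x               # list the map nickname translations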
3,540
Posted By siquadri
This is what I am getting $ awk -F","...
This is what I am getting

$ awk -F"," 'FILENAME="file1.txt" {array[$1]=$1} FILENAME="file2.txt" { if (array[$4]) {print $0}}' file1.txt file2.txt
1,2,ua,xyz.com
1,2,ua,eg.com
$
4,722
Posted By siquadri
(( ))
(( ))
3,540
Posted By siquadri
Did you change your input file names to file1 and...
Did you change your input file names to file1 and file2?
3,324
Posted By siquadri
This works nawk -F"," 'FILENAME=="File1"...
This works

nawk -F"," 'FILENAME=="File1" {array[$1]=$1} FILENAME=="File2" { if (array[$4]) {print $0}}' File1 File2
4,722
Posted By siquadri
The double parentheses are the arithmetic evaluation...
The double parentheses (( )) are the shell's arithmetic evaluation construct, equivalent to the let command.
And you can find a lot of documentation on the Internet.
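A minimal ksh sketch of (( )) arithmetic, with the let form shown for comparison:

#!/bin/ksh
i=5
(( i = i + 1 ))          # arithmetic evaluation
let "i = i + 1"          # equivalent let form
if (( i > 5 ))
then
    echo "i is now $i"
fi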
3,540
Posted By siquadri
This works nawk -F"," 'FILENAME=="file1" ...
This works

nawk -F"," 'FILENAME=="file1" {array[$1]=$1} FILENAME=="file2" { if (array[$4]) {print $0}}' file1 file2
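For the record, a common alternative to comparing FILENAME is the NR==FNR idiom; a sketch assuming the same goal (print the lines of file2 whose fourth field appears as a first field in file1):

nawk -F"," 'NR==FNR {array[$1]; next} $4 in array' file1 file2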
2,468
Posted By siquadri
This works: #!/bin/ksh ...
This works:

#!/bin/ksh
#_____________________________________
i=11
until (( i > 0 && i <= 9 ))
do
echo Please enter number between 1 to 9
read number
if (( number > 0 && number <= 9 ))...
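Since the excerpt is cut off, here is one complete version of that loop; the i=$number assignment inside the if is an assumption about how it finishes:

#!/bin/ksh
i=11                                  # start out of range so the loop runs at least once
until (( i > 0 && i <= 9 ))
do
    echo "Please enter a number between 1 and 9"
    read number
    if (( number > 0 && number <= 9 ))
    then
        i=$number                     # assumed: accept the value and end the loop
    fi
done
echo "You entered $i"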
1,729
Posted By siquadri
This should work
This works

nawk -F"," 'FILENAME=="File_1" {arr[$0]=$0} FILENAME=="File_2" {if (arr[$1] && $3 ~ /DATA=1/) {print $0}}' File_1 File_2
3,540
Posted By siquadri
Vidhyadhar, Can you please explain this...
Vidhyadhar,

Can you please explain this command?
I tried, but in vain.
6,962
Posted By siquadri
[quote=TonyFullerMalv;302303867]Try: for...
[quote=TonyFullerMalv;302303867]Try:

for SFILE in `ls *-*`; do
TFILE=`echo "${SFILE}" | sed '/-/_/g'`
echo mv ${SFILE} ${TFILE}
mv ${SFILE} ${TFILE}
done
[/quote]


Use this it...
14,238
Posted By siquadri
[quote=Dendany83;302303901]Dears, I'm new in...
[quote=Dendany83;302303901]Dears,

I'm new to shell scripting and I need your help. I would like to know how I can create a script to ftp to a certain unix/linux machine/server IP address and get a...
35,902
Posted By siquadri
In place of findstring, use I am the String
In place of findstring, use I am the String
35,902
Posted By siquadri
Check this nawk '$0 ~/findstring/ {print...
Check this
nawk '$0 ~/findstring/ {print $0;getline;gsub(/replace string/,"replaced string");print}' datafile > datafile1
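If the rest of the file is supposed to be kept as well (an assumption about the requirement), a variant that passes every other line through unchanged:

nawk '/findstring/ {print; if ((getline) <= 0) next; gsub(/replace string/,"replaced string")} {print}' datafile > datafile1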
117,900
Posted By siquadri
Explain cmd
Can someone please explain this command?
2,544
Posted By siquadri
chmod
chmod 755 dirname
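For reference, the octal and symbolic forms give the same result here:

chmod 755 dirname            # rwxr-xr-x
chmod u=rwx,go=rx dirname    # symbolic equivalent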
1,894
Posted By siquadri
Inode
ls -i: the first column is the inode number
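A quick illustration (the file and directory names are placeholders):

ls -i somefile        # prints the inode number, then the name
ls -id somedir        # -d shows the inode of the directory itself, not its contents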
2,868
Posted By siquadri
Try this grep -w 500 access
Try this

grep -w 500 access
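The -w flag matches 500 only as a whole word; without it the pattern also matches inside longer numbers:

grep -w 500 access    # matches the field 500, but not 1500 or 5001
grep 500 access       # would also match 1500, 5001, etc.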
13,242
Posted By siquadri
Try this: awk -F';' 'BEGIN {OFS=";"}...
Try this:


awk -F';' 'BEGIN {OFS=";"} {$4=($4*100)+$5; print $1,$2,$3,$4,$6}'
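A quick illustration with a made-up input line (the field layout is only assumed):

echo "a;b;c;2;34;z" | awk -F';' 'BEGIN {OFS=";"} {$4=($4*100)+$5; print $1,$2,$3,$4,$6}'
# prints: a;b;c;234;z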
2,069
Posted By siquadri
Try this: awk -F'/' '{print $2}'
Try this:
awk -F'/' '{print $2}'
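For example, to pull the second path component (the path here is just an illustration):

echo "/usr/local/bin" | awk -F'/' '{print $2}'    # prints: usr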
Forum: Fedora 04-02-2009
2,533
Posted By siquadri
You have to execute like this . scriptname
You have to execute it like this:
. scriptname
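The leading dot runs the script in the current shell, so anything it sets stays in your environment; a small sketch (scriptname and MYVAR are placeholders):

# scriptname contains, for example:  export MYVAR=hello
. scriptname          # sourced: runs in the current shell
echo $MYVAR           # prints hello
./scriptname          # run as a child process instead: MYVAR would not survive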
13,242
Posted By siquadri
Hey, can you explain your requirement clearly?...
Hey, can you explain your requirement clearly?
Will you have two different files, one comma-separated and the other semicolon-separated?

Or do you want to create a semicolon-separated file from the comma-separated one and then do...
8,651
Posted By siquadri
Try this some_command | sort -n +1 -3
Try this
some_command | sort -n +1 -3
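The +1 -3 form is the historical sort field syntax (the key starts after field 1 and runs through field 3); assuming the same intent, the POSIX -k equivalent is:

some_command | sort -n -k 2,3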
Showing results 1 to 25 of 44

 