Search Results

Search: Posts Made By: avis1981
36,046
Posted By avis1981
Can you try this one: #! /bin/ksh ...
Can you try this one:

#! /bin/ksh
IPLIST=`cat ./pinglist1.txt`
for ip in $IPLIST
do
    echo $ip
    ping -c 2 $ip >> log.txt
    if [[ $? -eq 0 ]]
    then
        print $ip "PINGS" >> pingresults.txt
    else...
12,801
Posted By avis1981
Well there are lot of options, trimming leading...
Well, there are a lot of options; one of them is trimming the leading zeroes and then comparing.
Below is one option (note it is printf, not print, that applies the format string):
chk_sum=`tail -1 $FILE_NAME | cut -d~ -f7 | cut -c2- | awk '{printf "%.5f\n", $0}'`
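A runnable sketch of that trim-and-format step, using a made-up sample line (the `~`-delimited layout and the one-character prefix on field 7 are assumptions taken from the post):

```shell
# Hypothetical record: 7 fields delimited by '~', field 7 = "X00012.3"
echo 'a~b~c~d~e~f~X00012.3' > /tmp/sample.txt

# Drop the 1-char prefix, then let printf normalize the leading zeroes
chk_sum=`tail -1 /tmp/sample.txt | cut -d~ -f7 | cut -c2- | awk '{printf "%.5f\n", $0}'`
echo "$chk_sum"    # prints 12.30000
```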
5,402
Posted By avis1981
I believe you are saying about the varying...
I believe you are referring to the varying pattern when you mention multiple files/columns.
Can you prepare a pattern file based on the number of columns just before doing the grep?

And use grep -v -f...
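A minimal sketch of that approach, with a hypothetical pattern file and input (the file names and contents are made up):

```shell
# Patterns to exclude, one per line ('|' is literal in basic regex)
printf '^foo|\n^bar|\n' > /tmp/patterns.txt
printf 'foo|1\nbaz|2\nbar|3\n' > /tmp/input.txt

# Keep only the lines that match none of the patterns
grep -v -f /tmp/patterns.txt /tmp/input.txt    # prints baz|2
```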
6,544
Posted By avis1981
For the point 1, since you are assuming that...
For point 1: since you are assuming there will always be a header and a footer, I am not sure why you need to read the whole file NAM2008101601.OUT to get the record count.

Can you try using file size...
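A sketch of the file-size route, assuming fixed-length records (the 10-byte record length is illustrative; subtract the header/footer bytes the same way if they differ):

```shell
# Three fixed-length records: 9 data chars + newline = 10 bytes each
printf 'AAAAAAAAA\nBBBBBBBBB\nCCCCCCCCC\n' > /tmp/fixed.dat

reclen=10
size=`wc -c < /tmp/fixed.dat`
echo $(( size / reclen ))    # prints 3
```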
36,046
Posted By avis1981
Looks more like ping syntax Are you trying...
That looks more like ping syntax.

Are you trying to send 2 echo requests to the IP? If so:

ping -c 2 $ip | awk '/100%/ {print "no"}' | read Pingable

(the trailing | read works in ksh, where the last pipeline stage runs in the current shell)
2,336
Posted By avis1981
How about first joining the file join -t"|"...
How about first joining the files:

join -t'|' 1.txt 2.txt | awk 'BEGIN{FS="|"} $2!=$9 {print $2,$9}'

The above handles field 2; you need to do the same for the other fields. This assumes there are 8 fields...
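The field-by-field compare can also be generalized with an awk loop instead of repeating the rule per field; a sketch with two made-up 3-field files (the post's files have 8 fields, so the loop bounds adapt via NF):

```shell
printf 'k|a|b\n' > /tmp/1.txt
printf 'k|a|c\n' > /tmp/2.txt

# After the join, fields 2..n+1 come from 1.txt and n+2..2n+1 from 2.txt
join -t'|' /tmp/1.txt /tmp/2.txt |
awk -F'|' '{ n = (NF - 1) / 2
             for (i = 2; i <= n + 1; i++)
                 if ($i != $(i + n)) print i, $i, $(i + n) }'
# prints: 3 b c
```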
6,435
Posted By avis1981
not working at all? Can you tell what is the...
Not working at all?
Can you tell us what the expected output is, and show what you tried?
5,402
Posted By avis1981
How about using grep and grep -v, might be...
How about using grep and grep -v? Something like the below might work:
grep "^[a-z0-9]*-[a-z0-9]*-[0-9]*$" <inputfile> >goodFile
grep -v "^[a-z0-9]*-[a-z0-9]*-[0-9]*$" <inputfile> >badFile
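A runnable sketch of the good/bad split, with a made-up input file:

```shell
printf 'ab-cd-12\nNOT VALID\nx1-y2-3\n' > /tmp/in.txt

# Lines matching the pattern go to goodFile, everything else to badFile
grep    '^[a-z0-9]*-[a-z0-9]*-[0-9]*$' /tmp/in.txt > /tmp/goodFile
grep -v '^[a-z0-9]*-[a-z0-9]*-[0-9]*$' /tmp/in.txt > /tmp/badFile

cat /tmp/badFile    # prints NOT VALID
```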
20,469
Posted By avis1981
or you can just use read sql script into a...
Or you can just read the sql script into a string variable, say sqltext, and run:
db2 -x +w "$sqltext"
20,469
Posted By avis1981
Assuming your sql scripts are @ delimited You...
Assuming your sql scripts are @ delimited
You might want to provide the full path for the db2 command:
#!/bin/ksh
db2 -td@ -vf script_1
db2 -td@ -vf script_2 &
db2 -td@ -vf script_3 &
db2 -td@ -vf...
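If the scripts must all finish before the shell proceeds, a `wait` after the background jobs is the usual companion to `&`. A sketch of that pattern, with the db2 calls stubbed by a function so it runs anywhere (the stub is an assumption, not part of the original post):

```shell
#!/bin/ksh
# Stub standing in for: db2 -td@ -vf script_N
run_script() { echo "done $1"; }

run_script 1 &
run_script 2 &
run_script 3 &
wait                       # block until all background jobs complete
echo "all scripts finished"
```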
6,435
Posted By avis1981
awk '{ print substr($0,1,120)"\n"substr($0,121)}'...
awk '{ print substr($0,1,120)"\n"substr($0,121)}' <filename> > outfilename
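A quick demonstration of the substr split on sample data (wrapping at column 10 instead of 120 so the line fits here):

```shell
# Print the first 10 chars, a newline, then the rest of the line
echo 'abcdefghijKLMNO' |
awk '{ print substr($0, 1, 10) "\n" substr($0, 11) }'
# prints:
# abcdefghij
# KLMNO
```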
2,071
Posted By avis1981
print $file1 and $UNIQDIR to make sure you have...
print $file1 and $UNIQDIR to make sure you have the full paths.
2,071
Posted By avis1981
Why do you need the redirection ">"?
Why do you need the redirection ">"?
14,743
Posted By avis1981
How about setting alias? alias dir1="cd...
How about setting an alias?

alias dir1="cd /home/username/directory1"
alias temp3="cd /home/username/directory1/temp1/temp2/temp3"

Then just enter dir1 to go to directory1, or temp3 to go to the temp3 folder.
3,923
Posted By avis1981
Can you provide a sample of the expected output?
Can you provide a sample of the expected output?
4,688
Posted By avis1981
Are you looking for something like below ? ...
Are you looking for something like below ?

cat `ls -l | sort -n -k5 | tail -1 | awk '{print $9}'`
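A runnable sketch of picking the largest file this way, over a throwaway directory (`-k5` is the POSIX spelling of the size field; this assumes file names without spaces):

```shell
cd "`mktemp -d`"
printf 'small\n' > a.txt
printf 'a much larger file content line\n' > b.txt

# Field 5 of `ls -l` is the byte size, field 9 the name
cat "`ls -l | sort -n -k5 | tail -1 | awk '{print $9}'`"
# prints: a much larger file content line
```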
5,744
Posted By avis1981
Can you provide the file content?
Can you provide the file content?
24,529
Posted By avis1981
One more option sed 's/[^|]//g'...
One more option

sed 's/[^|]//g' <filename>|wc -c

You need to add logic to loop through multiple files; if the above command gives 14 (13 pipes plus the trailing newline counted by wc -c), you can print the file name.
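A sketch of that loop over files, using 4 pipes per wanted line instead of 13 (so the target count is 5 once wc -c adds the newline; file names and contents are made up):

```shell
printf 'a|b|c\n'     > /tmp/f1.txt    # 2 pipes
printf 'v|w|x|y|z\n' > /tmp/f2.txt    # 4 pipes

for f in /tmp/f1.txt /tmp/f2.txt
do
    # Delete every non-pipe char; wc -c counts the pipes plus the newline
    n=`sed 's/[^|]//g' "$f" | wc -c`
    [ "$n" -eq 5 ] && echo "$f"
done
# prints: /tmp/f2.txt
```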
79,361
Posted By avis1981
How about using awk, printing it $1, $2.. ...
How about using awk and printing $1, $2, ...?

e.g.
trackingnum=`echo $line | awk '{print $1}'`
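For that to work, the awk output has to be captured with command substitution; a runnable sketch with a made-up $line (the field names and sample values are hypothetical):

```shell
line='TRK123 2008-10-16 shipped'

trackingnum=`echo "$line" | awk '{print $1}'`
shipdate=`echo "$line" | awk '{print $2}'`

echo "$trackingnum / $shipdate"    # prints: TRK123 / 2008-10-16
```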
4,291
Posted By avis1981
How about this, assuming you need to append stuff...
How about this, assuming you need to append the text at the end of the 2nd line.

sed '2 s|"$|test.ear&|' < file.txt
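A runnable sketch of that substitution, on a made-up file where line 2 ends in a double quote:

```shell
printf 'line one\nname="app"\nline three\n' > /tmp/file.txt

# On line 2 only, insert test.ear just before the closing quote (& = the match)
sed '2 s|"$|test.ear&|' /tmp/file.txt
# line 2 becomes: name="apptest.ear"
```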
Showing results 1 to 20 of 20

 
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.