Is there a more efficient way?


 
# 1  
Old 12-15-2005
Is there a more efficient way?

I'm using korn shell to connect to oracle, retrieve certain values, put them in a list, and iterate through them. While this method works, I can't help but think there is an easier method.

If you know of one, please suggest a shorter, more efficient method.


PHP Code:
###############  FUNCTIONS  ####################################################################
check_for_sqlplus_error()
{
    # $1 = $? (results of last command)
    # $2 = user defined error code
    if [ $1 -ne 0 ]; then
        write_log " ERROR - sqlplus execution failure at: "$2
        exit 1
    fi
    return 0
}

run_query()
{
    # $1 - db login
    # $2 - user defined error code
    # $3 - SQL to execute
    # $4 - logfile to re-direct to
$SQLPLUS -s $1<<EOF | sed '/^$/d' > $4
  set HEADING   OFF
  set FEED OFF
  set TRIMOUT   ON
  set TRIMSPOOL ON
  $3;
EOF

    check_for_sqlplus_error $? "code "$2
    return 0
}

run_proc()
{
    # $1 - db login
    # $2 - user defined error code
    # $3 - SQL to execute
    # $4 - logfile to re-direct to
$SQLPLUS -s $1<<EOF | sed '/^$/d' > $4
  set pagesize 0
  set AUTOT ON
  set serveroutput on SIZE 50000
  $3;
EOF

    check_for_sqlplus_error $? "code "$2
    return 0
}

write_log ()
{
    echo `date +"%m/%d/%Y %H:%M:%S"`" "$1
}

#################  MAIN  ########################################################################
# SQL*PLUS code:200 - Get the list of batches for this load
write_log " Get the Batch List..."
run_query $ODS 200 "select distinct batch_num from cr_batch where load_desc = '$SOURCE_SYSTEM_DESC'" $LOGPATH$SOURCE_SYSTEM_DESC.batch.lst

if [ -s $LOGPATH$SOURCE_SYSTEM_DESC.batch.lst ]; then
    write_log " Begin Processing Batches..."
    while read BATCH_NUM
    do
        if [ $BATCH_NUM ]; then
            # SQL*PLUS code:300 - Run the batch load stored procedure for this batch
            write_log " Calling p_stage_load with $SOURCE_SYSTEM_DESC , $LOAD_RUN_NUM$BATCH_NUM "
            run_proc $STAGING 300 "exec p_stage_load('$SOURCE_SYSTEM_DESC', $LOAD_RUN_NUM$BATCH_NUM )" $LOGPATH$SOURCE_SYSTEM_DESC.$BATCH_NUM.log &
        else
            write_log " No batches found for '$SOURCE_SYSTEM_DESC'"
        fi
    done < $LOGPATH$SOURCE_SYSTEM_DESC.batch.lst
else
    # if there are no batches, lets just exit
    write_log " No batches for "$SOURCE_SYSTEM_DESC" were found!...Exiting"
    exit
fi

# 2  
Old 12-15-2005
First, FYI: sqlplus will return exit code 0 when the database isn't up; you can try it yourself. Add "WHENEVER SQLERROR" to force a non-zero exit code.

Quote:
Originally Posted by SelectSplat
Code:
run_query()
{
# $1 - db login
# $2 - user defined error code
# $3 - SQL to execute
# $4 - logfile to re-direct to
$SQLPLUS -s $1<<EOF |  sed '/^$/d' > $4
  WHENEVER SQLERROR EXIT 1
  set HEADING   OFF  
  set FEED OFF
  set TRIMOUT   ON 
  set TRIMSPOOL ON 
  $3;
EOF

check_for_sqlplus_error $? "code "$2
return 0
}

Second: You certainly could combine your sqlplus calls into one.
Code:
run_query() 
{ 
    # $1 - db login 
    # $2 - user defined error code 
    # $3 - SQL to execute 
    # $4 - logfile to re-direct to 
$SQLPLUS -s $1<<EOF | sed '/^$/d' > $4 
  $3; 
EOF 

check_for_sqlplus_error $? "code "$2 
return 0 
}

run_query $ODS 200 "
  set HEADING OFF FEEDBACK OFF TRIMOUT ON TRIMSPOOL ON
  select distinct
	batch_num 
  from	cr_batch
  where	load_desc = '$SOURCE_SYSTEM_DESC';
"  $LOGPATH$SOURCE_SYSTEM_DESC.batch.lst 

...
run_query $STAGING 300 "
  set pagesize 0 AUTOT ON serveroutput on SIZE 50000
     exec p_stage_load('$SOURCE_SYSTEM_DESC', $LOAD_RUN_NUM, $BATCH_NUM );
"  $LOGPATH$SOURCE_SYSTEM_DESC.$BATCH_NUM.log & 
...

Personally, I prefer to use this kind of construct:
Code:
# Read whole lines
IFS='
'
set -A RESULTS $({
    sqlplus -s /nolog "
        connect un/pw
        select or exec ...;
"
    print RC=$?
} 2>&1)

if [[ "RC=0" != ${RESULST[(( ${#RESULTS[@]} - 1} ))] ]]
then
    write_log "sqlplus error"
    exit 1
fi

for i in ${RESULTS[@]}
do
    case $i in
        ORA*|SP2*|PLS*)
            write_log "sql error"
            exit 1
        ;;
        ... whatever else you want to look for ...
        ;;
    esac
done

If your results exceed 4098 elements, you won't want to use an array if your Korn shell is ksh88. I believe that the newer Korn shells allow more than 4098 elements.
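
If that limit is a concern, here is a rough, untested sketch of the same idea without an array, so the element limit never comes into play. It reuses the placeholder un/pw login and "select or exec ...;" SQL from above plus the write_log function from your script, and it feeds sqlplus through a here-document instead of the quoted string:
Code:
# Sketch only: pipe sqlplus straight into a read loop instead of an array.
{
    sqlplus -s /nolog <<EOF
WHENEVER SQLERROR EXIT 1
connect un/pw
select or exec ...;
EOF
    print RC=$?
} 2>&1 |
while read LINE
do
    case $LINE in
        ORA*|SP2*|PLS*) write_log "sql error";     exit 1 ;;  # Oracle error text
        RC=0)           ;;                                    # clean sqlplus exit
        RC=*)           write_log "sqlplus error"; exit 1 ;;  # non-zero sqlplus exit
        *)              : ;;                                  # process the row here
    esac
done

Since ksh runs the last part of a pipeline in the current shell, the exit statements end the script just as they would in the array version.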

# 3  
Old 12-16-2005
The first two tips are GREAT, thanks. SQL*Plus is new to me, as nearly all my experience has been with ISQL.

Your last suggestion is extremely interesting to me, as I can barely follow what's going on. In fact, I don't understand it at all.

Is 'RESULTS' an array of strings being assigned the result set from sqlplus?
What is IFS?
What is /nolog ?
I'm completely lost with the 'if'.
If I use this method, will I still have each 'word' in each row in its own variable?
Can you document it for me in pseudo-code?

Also, I'm not exactly sure what version of korn we're using, but I did notice that VAR=$(command) doesn't work as I'd expect. It looks like I have the same functionality with VAR=`command`, but I'm not 100% sure.

Sorry for all the questions. Thanks for your reply, and in advance for your further elaboration.


Quote:
Originally Posted by tmarikle

Personally, I prefer to use this kind of construct:
Code:
# Read whole lines
IFS='
'
set -A RESULTS $({
    sqlplus -s /nolog "
        connect un/pw
        select or exec ...;
"
    print RC=$?
} 2>&1)

if [[ "RC=0" != ${RESULST[(( ${#RESULTS[@]} - 1} ))] ]]
then
    write_log "sqlplus error"
    exit 1
fi

for i in ${RESULTS[@]}
do
    case $i in
        ORA*|SP2*|PLS*)
            write_log "sql error"
            exit 1
        ;;
        ... whatever else you want to look for ...
        ;;
    esac
done

If your results exceed 4098 elements, you won't want to use an array if your Korn shell is ksh88. I believe that the newer Korn shells allow more than 4098 elements.
# 4  
Old 12-16-2005
IFS defines what field separators consist of. Normally they are space, tab, newline, etc. I am redefining the field separators to newlines only so that for and while loops process lines instead of words.

Code:
# Read whole lines
IFS='
'

This is defining an array from sqlplus' output plus my print command. "/nolog" just keeps sqlplus from attempting to log in at any point except from a "connect" command in the SQL.

Code:
set -A RESULTS $({
    sqlplus -s /nolog "
        WHENEVER SQLERROR EXIT 1    <== Forgot to include my earlier recommendation
        connect un/pw
        select or exec ...;
"
    print RC=$?
} 2>&1)

My RESULTS array will consist of elements (whole lines of text) starting from 0. Remember that I manufactured my own message following the sqlplus call to show sqlplus' exit code (RC=$?). This sits in the array's last element. Since we don't typically know how many rows will return, I have to compute the size of the array before knowing which element is the last.

${#RESULTS[@]} gives me the array's size in terms of the number of elements, and $(( ${#RESULTS[@]} - 1 )) points me to the array's last element (remember that the array starts at element 0). Therefore, the "if" statement compares a known "good" exit code message of "RC=0" with the actual message returned following sqlplus' execution. If the message is not "RC=0" it must be bad and sqlplus exited with an error, so we will exit too.

Code:
if [[ "RC=0" != ${RESULTS[$(( ${#RESULTS[@]} - 1 ))]} ]]
then
    write_log "sqlplus error"
    exit 1
fi

This loop simply processes each element in the array and tests each element for whatever we want. "ORA*|SP2*|PLS*)" are known Oracle error message prefixes, so we typically want to handle them.

Code:
for i in ${RESULTS[@]}
do
    case $i in
        ORA*|SP2*|PLS*)
            write_log "sql error"
            exit 1
        ;;
        ... whatever else you want to look for ...
        ;;
    esac
done

# 5  
Old 12-16-2005
That's outstanding.

Will this still work in my version of ksh, even when it appears that $(command) isn't working? Or do I need to do the obvious, and replace the $(command) with `command` ?
# 6  
Old 12-16-2005
Also, 2 other important questions regarding this technique.

Using the 'while read', I'm able to capture each 'word' of a line in a separate variable, while processing all of the lines in the file. It appears that your technique is putting the whole line in an element of the array. If I do that, I'd need to use awk, or something similar, to address a particular 'word' in the line, correct?

Also, in this loop, I'm spawning batches off in the background. Each spawned batch loads a list of tables sequentially. If one of the tables in the batch fails to load, the desired effect is to continue on with the next table. So, from your explanation, I gather that this statement...

if [[ "RC=0" != ${RESULST[(( ${#RESULTS[@]} - 1} ))] ]]

Would not be relevant. Is that correct?
# 7  
Old 12-16-2005
Regarding KSH: I'm using KSH88, which is about as old as they get, I think. I do not know why $() doesn't work, but back ticks should work fine. Bourne and Bash use back ticks; are you certain that you aren't using one of these shells?
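
A quick, made-up check you can run to see what is actually interpreting the script (just a sketch; a syntax error on the $() line would point at an old Bourne shell):
Code:
# Sketch only: confirm the shell and compare both command substitution styles.
echo "shell process:"
ps -p $$                                       # the command column shows sh, ksh, bash, ...
NOW=`date`  && echo "backticks work: $NOW"
NOW=$(date) && echo "\$(...) works:   $NOW"    # a syntax error here suggests plain Bourne sh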

Regarding words vs lines: this is what happened when we changed IFS. You can reset IFS to use spaces at any time, letting you process words in a read loop.
Code:
IFS=' '
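
To get back to individual words from one of those whole-line elements, here is a sketch (the column names are made up; match them to your query) that leans on the fact that in ksh the last command of a pipeline runs in the current shell, so read can fill the variables directly:
Code:
# Sketch only: split each whole-line array element back into words
# (remember the last element is the manufactured RC=... line).
IFS=' '                                           # back to splitting on spaces
for LINE in "${RESULTS[@]}"
do
    print -- "$LINE" | read BATCH_NUM COL2 COL3   # made-up column names
    write_log " batch=$BATCH_NUM col2=$COL2 col3=$COL3"
done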

Regarding the if statement: mine is only testing whether sqlplus exited with a non-zero result, so I don't know whether that would be relevant in your case; perhaps.