I am unable to get beyond the exit function. The shell script looks for masked files and copies them to another location. Please refer to the code below for more information.
Thanks
Brinjit
Code:
#!/usr/bin/ksh
#########################################################################
# script name: cdg_file_load.ksh
# Created : March 29, 2013
# Author : David Ayotte
#
# Find new files for the file type and period that is passed in in the form
# of a lookup key that is used to determine what directory and file masks
# to look for. The lookup file is $FP_ENV/CDG_Lookup.txt
#
# Revision History: (include date, username, and brief description in
# descending order)
#########################################################################
#
# Function definitions
#
# Usage Function
#
function usage_msg {
echo "\n---> Usage: \n"
echo "---> cdg_file_load.ksh -k <key> "
echo "     -k is a key lookup value in the $FP_ENV/CDG_Lookup.txt file."
# grep -v "#" $lookupFile | cut -d"|" -f1 | sort | uniq
echo ""
rc=1 # non-zero so a usage error is not reported as success
exit_func
}
#
# Exit Function
#
function exit_func {
echo "---> Running exit function..."
exit $rc
# update_job_table only when $rc != 0
}
#
# Make sure the file is static
#
function is_file_changing {
baseName=`basename ${sourceFile}`
timeFile=${FP_LOGDIR}/${baseName}_tmstp
touch ${timeFile}
echo "Time Compare file is ${timeFile}"
echo "---> Sleeping to see if file changes in 20 seconds "
sleep 20
i=1 # attempt counter must be initialized before the loop tests it
while [ ${sourceFile} -nt ${timeFile} ]
do
touch ${timeFile}
echo "-> Sleeping 20 more seconds...attempt $i"
sleep 20
let i=$i+1
if [ $i -gt 60 ] # I am sick of waiting
then
rc=24
exit_func
fi
done
echo "---> File is found and not changing. "
}
#
# Execute Informatica workflow to load new file.
#
function load_file {
#
# Determine that file is not changing.
#
# TEST is_file_changing
#
# Create link to file, much quicker than copying to local name.
#
if [ -f ${FP_FTP_DOWN}/${staticFile} -o -h ${FP_FTP_DOWN}/${staticFile} ]
then
rm ${FP_FTP_DOWN}/${staticFile} # no need to check the return code; if this fails, the ln command will fail.
fi
echo "---> Creating link from static file name ${staticFile} "
ln -s ${sourceFile} ${FP_FTP_DOWN}/${staticFile}
rc=$?
if [ $rc -ne 0 ]
then
echo "---> ERROR: Unable to create link. "
rc=30
exit_func
fi
#
# Execute Informatica workflow
#
export parmFile=${FP_PARMFILES}/${workFlow}.par
echo "---> Executing Informatica workflow ${workFlow} "
cmd="pmcmd startworkflow -wait -sv ${FP_SERVICE} -d ${FP_DOMAIN} -uv FP_INF_USER -pv FP_INF_PSWD "
cmd=$cmd" -paramfile ${parmFile} -f ${FP_FOLDER} ${workFlow} "
echo "--> Command: $cmd"
$cmd
rc=$?
if [ $rc -ne 0 ]
then
echo "---> ERROR: Error executing Informatica Workflow."
rc=30
exit_func
fi
}
#
# Determine if this file has been loaded already.
#
function has_file_been_loaded {
alreadyLoaded=0 # False - File has not been loaded
[ -f ${cdgFileList} ] || touch ${cdgFileList} # first run: the list may not exist yet
cnt=`grep -cF "${sourceFile}" ${cdgFileList}` # -F: file names contain regex metacharacters (dots)
if [ $cnt -eq 0 ]
then
echo "---> This file has not been loaded "
else
# echo "---> This file has already been loaded "
alreadyLoaded=1
fi
}
#
# Check for files in a directory
#
# More like, process a file mask for a directory for a workflow. Several file types
# can be found in each directory, but must be processed by a different Informatica
# workflow.
#
function process_a_directory {
echo "Checking ${FP_CDG_CFR}${searchPath}/${searchMask}"
# for sourceFile in `ls -1 ${FP_CDG_CFR}${searchPath}/${searchMask}*`
for sourceFile in `find ${FP_CDG_CFR}${searchPath} -name "${searchMask}" -type f -mtime -5`
do
echo "---> Checking $sourceFile "
has_file_been_loaded
if [ ${alreadyLoaded} -eq 0 ]
then
# for Jeff to add later update_job_table
load_file
let fileCntr=$fileCntr+1
echo $sourceFile >> ${cdgFileList} # So we won't load it again
# for Jeff to add later update_job_table
fi
done
}
#
# extract values from lookup record
#
function get_values {
# Extract values
searchPath=`echo $rec | cut -d"|" -f2`
searchMask=`echo $rec | cut -d"|" -f3`
staticFile=`echo $rec | cut -d"|" -f4`
workFlow=`echo $rec | cut -d"|" -f5`
# Display values
echo "\n---> Search Path = $searchPath"
echo "---> Search Mask = $searchMask"
echo "---> Static File = $staticFile"
echo "---> Workflow = $workFlow"
}
#
# Find new files and load them
#
function find_and_load_new_files {
echo "---> attempting to process $keyValue "
lookupValuesFound=0
for rec in `grep "^${keyValue}|" $lookupFile`
do
get_values
let lookupValuesFound=$lookupValuesFound+1
process_a_directory
done
}
#
# Command Line Parsing
#
export fileCntr=0
export lookupFile=${FP_ENV}/CDG_Lookup.txt
export cdgFileList=$FP_FTP_DOWN/cdg_loaded_files.txt # list of files already loaded.
# remove this next line after testing is done
#
# export FP_CDG_CFR=/fpapps/op/qa/ftproot/cdgtest
#
if [ $# -eq 0 ] # note: "-k <key>" is two arguments, "-k<key>" is one
then
usage_msg
fi
while getopts "k:" option
do
case "$option" in
k) keyValue="$OPTARG"
;;
[?])
usage_msg
;;
esac
done
#
#
if [ "$keyValue" = "" ]
then
usage_msg
fi
#
find_and_load_new_files
#
if [ $lookupValuesFound -eq 0 ]
then
echo "\n---> ERROR: No values found in lookup file. "
rc=30
else
echo "\n---> $lookupValuesFound lookup values found"
fi
#
echo "\n---> Total Files processed is $fileCntr \n"
exit_func
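One detail worth checking in exit_func: if $rc happens to be unset when it runs, `exit $rc` expands to a bare `exit`, so the script ends with status 0 and the scheduler sees success instead of a failure. A small demonstration in plain sh:

```shell
# `exit` with an empty argument is a bare `exit`, which returns the
# status of the previous command -- here 0, i.e. "success".
sh -c 'unset rc; exit $rc'
echo "unset rc -> status $?"   # prints: unset rc -> status 0
sh -c 'rc=24; exit $rc'
echo "rc=24    -> status $?"   # prints: rc=24    -> status 24
```

Setting rc explicitly on every path that calls exit_func (or using `exit ${rc:-1}`) avoids this.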
The actual requirement is specified below:
Code:
CR3027 - OP - Operational Data Store - Sophia Decommission - CDG Processing notes
Create date: June 20, 2013
Author: David Ayotte

UNIX Systems:
Development: Host: nh1a7t05 - Unix ID: opdev
QA: Host: nh1a7t05 - Unix ID: opqa
Production: Host: nh1a7p09 - Unix ID: opuser

Informatica Environments:
Development: Repository: FP_DEV_REPO - Folder: OP
QA: Repository: FP_QA_REPO - Folder: OP
Production: Repository: FP_PRD_REPO_P04 - Folder: OP

Project Goal: Load files from CDG directories for multiple file types at different times of the month (cycles). These cycles are listed below:
CYCLE04
CYCLE06
CYCLE07
CYCLE10
CYCLE13
CYCLE19
CYCLE25
DAILY
EOM

Each cycle corresponds to a Control-M calendar with a similar name, as shown in the following diagram. The calendars determine when to check for files in the directories that correspond to the cycles defined above.

A lookup file called CDG_Lookup.txt drives the process. This file resides in /fpapps/op/qa/env on nh1a7t05. UNIX scripting and Control-M work is not done in the development directories because that environment is not static. Below are the entries in the lookup file for CYCLE04. This is a pipe ("|") delimited file with the fields, from left to right, being:

Key | Directory to check | File Mask to check | Static File | Informatica workflow | Calendar (for reference only)

Key: Passed to the script on the command line of the Control-M job. The script uses this to determine what and how to process this cycle.
Directory to check: The subdirectories under /fpapps/cfr/CDG that the script will check for files to load.
File Mask to check: Files found that match this format will be loaded.
Static File: Each file found to match the mask is copied, one at a time, to the static file name. After this copy, the Informatica workflow runs and loads the static file name.
Informatica Workflow: The name of the Informatica workflow that runs to load the static file.
Calendar: For reference only. When this has a value of "Multiple", you may find this directory and file mask under more than one cycle.

CYCLE04|/Reports/0570/04th|AC3.C*.AC0020.D*.T*.CSV|AC0020.CSV|WF_LOAD_FF_TEMP_AC0020|CDG_Cycle4|
CYCLE04|/Reports/0570/04th|AC3.C*.AC0025.D*.T*.CSV|AC0025.CSV|WF_LOAD_FF_TEMP_AC0025|CDG_Cycle4|
CYCLE04|/Reports/0570/04th|AC3.C*.AC0068.D*.T*.CSV|AC0068.CSV|WF_LOAD_FF_TEMP_AC0068|CDG_Cycle4|
CYCLE04|/Reports/0570/04th|AC3.C*.AC0730.D*.T*.CSV|AC0730.CSV|WF_LOAD_FF_TEMP_AC0730|CDG_Cycle4|
CYCLE04|/Reports/0570/04th|AC3.C*.AC0760.D*.T*.CSV|AC0760.CSV|WF_LOAD_FF_TEMP_AC0760|CDG_Cycle4|
CYCLE04|/Reports/0570/04th|AC3.C*.AC0780.D*.T*.CSV|AC0780.CSV|WF_LOAD_FF_TEMP_AC0780|CDG_Cycle4|
CYCLE04|/Reports/0571/04th|AC3.C*.AC0020.D*.T*.CSV|AC0020.CSV|WF_LOAD_FF_TEMP_AC0020|CDG_Cycle4|
CYCLE04|/Reports/0571/04th|AC3.C*.AC0025.D*.T*.CSV|AC0025.CSV|WF_LOAD_FF_TEMP_AC0025|CDG_Cycle4|
CYCLE04|/Reports/0571/04th|AC3.C*.AC0068.D*.T*.CSV|AC0068.CSV|WF_LOAD_FF_TEMP_AC0068|CDG_Cycle4|
CYCLE04|/Reports/0571/04th|AC3.C*.AC0730.D*.T*.CSV|AC0730.CSV|WF_LOAD_FF_TEMP_AC0730|CDG_Cycle4|
CYCLE04|/Reports/0571/04th|AC3.C*.AC0760.D*.T*.CSV|AC0760.CSV|WF_LOAD_FF_TEMP_AC0760|CDG_Cycle4|
CYCLE04|/Reports/0571/04th|AC3.C*.AC0780.D*.T*.CSV|AC0780.CSV|WF_LOAD_FF_TEMP_AC0780|CDG_Cycle4|
CYCLE04|/Reports/0572/04th|AC3.C*.AC0020.D*.T*.CSV|AC0020.CSV|WF_LOAD_FF_TEMP_AC0020|CDG_Cycle4|
CYCLE04|/Reports/0572/04th|AC3.C*.AC0025.D*.T*.CSV|AC0025.CSV|WF_LOAD_FF_TEMP_AC0025|CDG_Cycle4|
CYCLE04|/Reports/0572/04th|AC3.C*.AC0068.D*.T*.CSV|AC0068.CSV|WF_LOAD_FF_TEMP_AC0068|CDG_Cycle4|
CYCLE04|/Reports/0572/04th|AC3.C*.AC0730.D*.T*.CSV|AC0730.CSV|WF_LOAD_FF_TEMP_AC0730|CDG_Cycle4|
CYCLE04|/Reports/0572/04th|AC3.C*.AC0760.D*.T*.CSV|AC0760.CSV|WF_LOAD_FF_TEMP_AC0760|CDG_Cycle4|
CYCLE04|/Reports/0572/04th|AC3.C*.AC0780.D*.T*.CSV|AC0780.CSV|WF_LOAD_FF_TEMP_AC0780|CDG_Cycle4|
CYCLE04|/Reports/0581/04th|AC3.C*.AC0020.D*.T*.CSV|AC0020.CSV|WF_LOAD_FF_TEMP_AC0020|CDG_Cycle4|
CYCLE04|/Reports/0581/04th|AC3.C*.AC0025.D*.T*.CSV|AC0025.CSV|WF_LOAD_FF_TEMP_AC0025|CDG_Cycle4|
CYCLE04|/Reports/0581/04th|AC3.C*.AC0068.D*.T*.CSV|AC0068.CSV|WF_LOAD_FF_TEMP_AC0068|CDG_Cycle4|
CYCLE04|/Reports/0581/04th|AC3.C*.AC0730.D*.T*.CSV|AC0730.CSV|WF_LOAD_FF_TEMP_AC0730|CDG_Cycle4|
CYCLE04|/Reports/0581/04th|AC3.C*.AC0760.D*.T*.CSV|AC0760.CSV|WF_LOAD_FF_TEMP_AC0760|CDG_Cycle4|
CYCLE04|/Reports/0581/04th|AC3.C*.AC0780.D*.T*.CSV|AC0780.CSV|WF_LOAD_FF_TEMP_AC0780|CDG_Cycle4|
CYCLE04|/Reports/0582/04th|AC3.C*.AC0020.D*.T*.CSV|AC0020.CSV|WF_LOAD_FF_TEMP_AC0020|CDG_Cycle4|
CYCLE04|/Reports/0582/04th|AC3.C*.AC0025.D*.T*.CSV|AC0025.CSV|WF_LOAD_FF_TEMP_AC0025|CDG_Cycle4|
CYCLE04|/Reports/0582/04th|AC3.C*.AC0068.D*.T*.CSV|AC0068.CSV|WF_LOAD_FF_TEMP_AC0068|CDG_Cycle4|
CYCLE04|/Reports/0582/04th|AC3.C*.AC0730.D*.T*.CSV|AC0730.CSV|WF_LOAD_FF_TEMP_AC0730|CDG_Cycle4|
CYCLE04|/Reports/0582/04th|AC3.C*.AC0760.D*.T*.CSV|AC0760.CSV|WF_LOAD_FF_TEMP_AC0760|CDG_Cycle4|
CYCLE04|/Reports/0582/04th|AC3.C*.AC0780.D*.T*.CSV|AC0780.CSV|WF_LOAD_FF_TEMP_AC0780|CDG_Cycle4|
CYCLE04|/Reports/0583/04th|AC3.C*.AC0020.D*.T*.CSV|AC0020.CSV|WF_LOAD_FF_TEMP_AC0020|CDG_Cycle4|
CYCLE04|/Reports/0583/04th|AC3.C*.AC0025.D*.T*.CSV|AC0025.CSV|WF_LOAD_FF_TEMP_AC0025|CDG_Cycle4|
CYCLE04|/Reports/0583/04th|AC3.C*.AC0068.D*.T*.CSV|AC0068.CSV|WF_LOAD_FF_TEMP_AC0068|CDG_Cycle4|
CYCLE04|/Reports/0583/04th|AC3.C*.AC0730.D*.T*.CSV|AC0730.CSV|WF_LOAD_FF_TEMP_AC0730|CDG_Cycle4|
CYCLE04|/Reports/0583/04th|AC3.C*.AC0760.D*.T*.CSV|AC0760.CSV|WF_LOAD_FF_TEMP_AC0760|CDG_Cycle4|
CYCLE04|/Reports/0583/04th|AC3.C*.AC0780.D*.T*.CSV|AC0780.CSV|WF_LOAD_FF_TEMP_AC0780|CDG_Cycle4|
CYCLE04|/Reports|Processed_AC3.C*.AC0877J.D*.T*.CSV|AC0877.CSV|WF_LOAD_FF_TEMP_AC0877|Multiple|

The script to run the load is called cdg_file_load.ksh and resides in the following directory on nh1a7t05:
/fpapps/op/qa/prod/scripts/cdg
and in the following directory on nh1a8p04:
/fpapps/op/prod/scripts/cdg

The only command line option for the script is -k<key>, where key is the key value from the CDG_Lookup.txt file (such as CYCLE04). A sample command line for Control-M would look like this:

batchjob OP %%JOBNAME ${FP_SCRIPTS}/cdg/cdg_file_load.ksh -kCYCLE04

The above command line example works in QA and in production on nh1a8p04.

Misc:
A file called cdg_loaded_files.txt exists in the following directory on nh1a7t05:
/fpapps/op/qa/ftproot/ftpdown
The cdg_loaded_files.txt file contains a list of all files, with the full directory to each file, that have been loaded. It is updated after a file loads without errors; this way, if a load fails, the file is not marked as loaded. This file is needed because we do not have the ability to remove files from the cfr (Central File Repository). We could have built an entire archive directory structure matching what is on the cfr, but this file should work fine for a long time. Since the find command in the load script only goes back five days, old rows can eventually be removed from the beginning of this file. Much care should be taken before this is done, and a script should probably be written to do it: for example, find all files on the cfr older than 5 days and remove them from this lookup file, and remove any entries that do not exist on the cfr.
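The Misc note above suggests a cleanup script for cdg_loaded_files.txt but leaves it unwritten. A minimal sketch of the second idea (dropping entries whose files no longer exist on the cfr), assuming the file holds one full path per line; the function name prune_loaded_list is hypothetical:

```shell
# Hypothetical cleanup helper for cdg_loaded_files.txt. Rewrites the
# list in place, keeping only entries whose files still exist on disk;
# entries for files already purged from the cfr are dropped.
prune_loaded_list() {
    list=$1
    tmp=${list}.tmp.$$
    : > "$tmp"
    while IFS= read -r path
    do
        # Keep the entry only while the loaded file is still present.
        [ -f "$path" ] && printf '%s\n' "$path" >> "$tmp"
    done < "$list"
    mv "$tmp" "$list"
}
```

As the notes warn, run something like this against a copy of the list first, since a wrongly pruned entry would cause a file to be loaded twice.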