I have files Bank1, Bank2, Bank3 with high-level customer information in one folder.
The detail-level information for the customers of each bank is kept in separate directories.
Here Bank1 is a directory containing a set of files (details, details2, details3) that hold all the customer information.
Similarly, Bank2, Bank3, Bank4 are folders containing information on their customers.
How can I join the Bank1 file in the /home/Cust_information/ path with all the files in the Bank1 folder (/home/Bankdetails/Bank1/*) based on Cust_ID? The same should be done for every file in the /home/Cust_information/ folder, using a loop.
For example, the Bank1 file in /home/Cust_information/ contains the high-level details,
while the complete details of all the customers are present in the /home/Bankdetails/Bank1/ path.
PFA the folder structure for reference.
Here I need to join /home/Cust_information/Bank1 with the files /home/Bankdetails/Bank1/details, details1 to get a consolidated output for matching IDs.
It's easier in SQL than in the shell, and there are JDBC and ODBC tools that make flat files work like tables. You can 'sort' and 'join' files in the shell to get a similar effect, or join in awk (or bash) by putting one file into an associative array and then looking up the next file's key fields to merge the output.
Give this a shot and come back with detailed results/logs/errors...
Code:
set -vx
for BANK in /home/Cust_information/Bank*
do  awk '
    NR==1      {print FILENAME}      # print the bank file name once as a header
    NR==FNR    {DATA[$1]=$0; next}   # first file: store each high-level row by Cust_ID
    $1 in DATA {$0=DATA[$1]" "$0}    # detail rows: prepend the matching high-level row
    1                                # print every detail row, merged or not
    ' "$BANK" /home/Bankdetails/"${BANK##*/}"/details* >> consolidated.out
done
Last edited by RudiC; 11-08-2013 at 03:57 PM..
Reason: typo/ more typos / error correction
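The sort/join route mentioned earlier can be sketched like this for one bank; the paths come from the question, and join(1) compares field 1 of each input by default, so both inputs must be sorted on that field first:

```shell
# Hypothetical sketch: join the high-level Bank1 file with one detail file
# on Cust_ID (field 1). Both inputs must be sorted on the join field.
sort -k1,1 /home/Cust_information/Bank1    > /tmp/bank1.sorted
sort -k1,1 /home/Bankdetails/Bank1/details > /tmp/details.sorted
# Only lines whose Cust_ID appears in BOTH files are emitted.
join /tmp/bank1.sorted /tmp/details.sorted > /tmp/bank1.consolidated
```

Unlike the awk loop, plain join drops detail rows whose Cust_ID has no high-level match, which may be exactly the "matching IDs" behaviour asked for.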
I have two files with the below contents:
sampleoutput3.txt
20150202;hostname1
20150223;hostname2
20150716;hostname3
sampleoutput1.txt
hostname;packages_out_of_date;errata_out_of_date;
hostname1;11;0;
hostnamea;12;0;
hostnameb;11;0;
hostnamec;95;38;
hostnamed;440;358;... (2 Replies)
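For the two samples above, a small awk sketch (assuming ';' is the delimiter and that the goal is to attach each host's date to its detail row) could look like:

```shell
# Sketch: remember the date for each hostname from sampleoutput3.txt,
# then print date;detail-row for every hostname present in both files.
awk -F';' '
    NR==FNR    {date[$2] = $1; next}     # file 1: date keyed by hostname
    $1 in date {print date[$1] ";" $0}   # file 2: emit date + detail row
' sampleoutput3.txt sampleoutput1.txt
```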
Hi friends,
My requirement is: I have a zip file with a name like, e.g., test_ABC_UH_ccde2a_awdeaea_20150422.zip.
Within that there are subdirectories; in each directory we again have .zip files, and in those we have files like mama20150422.gz and so on.
I am in need of a bash script so that it unzips... (0 Replies)
How can I move "dataName".sql.gz into a folder called 'database', and then move "$fileName".tar.gz * .htaccess into a folder called 'www', with the entire gzipped file being "$fileName".tar.gz? Is this doable, or overly complex?
so
mydemo--2015-03-23-1500.tar.gz
> database
-... (5 Replies)
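If the names are known, the sorting itself is just mkdir plus mv; a sketch, assuming $dataName and $fileName already hold the base names (the values below are illustrative, one taken from the post's example):

```shell
# Hypothetical sketch: file the SQL dump under database/ and the site
# tarball plus .htaccess under www/. Both variable values are assumed.
dataName=mydb                     # assumed value
fileName=mydemo--2015-03-23-1500  # from the post's example
mkdir -p database www
mv "$dataName".sql.gz database/
mv "$fileName".tar.gz .htaccess www/
```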
Hi,
My requirement is: there is a directory location like
camp/current/
In this location there can be different flat files that are generated in a single day with the same header and different data, differentiated by timestamp, so I need to verify how many files are generated... (10 Replies)
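Counting the day's files can be sketched like this, assuming the timestamp embedded in each file name contains the date as YYYYMMDD (the naming pattern is an assumption, not stated in the post):

```shell
# Hypothetical sketch: count files in camp/current/ whose names contain
# today's date, e.g. feed_20150422_0930.txt.
today=$(date +%Y%m%d)
count=$(ls camp/current/*"$today"* 2>/dev/null | wc -l)
echo "files generated today: $count"
```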
Can anyone come up with a Unix command that lists
all the files, directories and sub-directories in the current directory
except a folder called log?
Thank you in advance. (7 Replies)
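find can do this with -prune, which keeps it from descending into (or printing) the log directory:

```shell
# List everything under the current directory except the folder named
# "log" and its contents. -prune stops descent into log; -o -print
# prints every other name.
find . -name log -prune -o -print
```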
Hi,
I need a script/command to list all the files in the current path and also the files in its folders and subfolders.
Ex: My files are like below
$ ls -lrt
total 8
-rw-r--r-- 1 abc users 419 May 25 10:27 abcd.xml
drwxr-xr-x 3 abc users 4096 May 25 10:28 TEST
$
Under TEST, there are... (2 Replies)
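A minimal way to list every file in the current path plus those inside folders and subfolders is find, or the recursive flag of ls:

```shell
# Every regular file under the current directory, at any depth:
find . -type f
# Or, keeping the long, time-sorted ls view from the post, per directory:
ls -lRrt
```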
Hi,
I have about 20 tab-delimited text files with non-sequential numbering, such as:
UCD2.summary.txt
UCD45.summary.txt
UCD56.summery.txt
The first column of each file has the same number of lines and content. The next 2 columns have data points:
i.e UCD2.summary.txt:
a 8.9 ... (8 Replies)
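Since the first column is identical across files, one way to consolidate the data columns side by side is paste plus a small awk filter; a sketch, assuming every file has exactly three tab-separated columns (note one posted name is spelled .summery.txt, so the glob may need widening):

```shell
# Paste all files side by side (columns: key v1 v2 key v1 v2 ...),
# then drop the repeated key columns at positions 4, 7, 10, ...
paste UCD*.summary.txt | awk -F'\t' -v OFS='\t' '{
    row = $1                   # shared row label from the first file
    for (i = 2; i <= NF; i++)
        if ((i - 1) % 3 != 0)  # skip every repeated label column
            row = row OFS $i
    print row
}'
```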
Hi,
Please help me: how can I get all the directories, their subdirectories, and those subdirectories' subdirectories recursively, excluding all files in the process?
I want to display, using a Unix command, all the directories recursively, excluding files.
I tried 'ls -FR' but that displays files as... (3 Replies)
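find does exactly this when told to match directories only:

```shell
# Print all directories under the current directory, recursively,
# and nothing else (no files).
find . -type d
```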