Dear All,
I have a requirement in which I have to load a file placed on an FTP server into my database. The process I'll follow is as below:
1) Get the files using FTP (see the first sketch after this list).
2) Create the load file, as I have to load only 19 of the 104 fields available in the file.
The fields I need can only be identified by character position: e.g. positions 1-3 hold one value, position 5 another, 16-33 another, and so on for all 19 fields.
I'm using the cut command to extract each field and appending "|" as a delimiter while building the new file (see the second sketch after this list).
3) Load the file created above into the database with the SQLLDR utility, using pipe "|" as the delimiter (see the third sketch after this list).
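
Step 1 is just a scripted ftp session, roughly like this (the host, login and file names below are placeholders):

# pull the raw file from the FTP server (host/user/paths are examples only)
ftp -i -n ftp.example.com <<EOF
user myuser mypassword
cd /remote/dir
get source_file.dat input.dat
bye
EOF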
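
Step 2, stripped down, looks roughly like this; the positions shown are only examples, and the real script cuts all 19 fields the same way:

# for each fixed-width line, cut out the wanted columns and glue with "|"
# (simplified: only 3 of the 19 fields shown, positions are illustrative)
> load.dat
while read line
do
    f1=`echo "$line" | cut -c1-3`
    f2=`echo "$line" | cut -c5`
    f3=`echo "$line" | cut -c16-33`
    # ...and so on for the remaining 16 fields...
    echo "${f1}|${f2}|${f3}" >> load.dat
done < input.dat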
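
And step 3 is a basic SQL*Loader run (the table, column names and login are placeholders):

# build a control file for the pipe-delimited data, then load it
cat > load.ctl <<EOF
LOAD DATA
INFILE 'load.dat'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY '|'
(col1, col2, col3)
EOF
sqlldr userid=myuser/mypassword control=load.ctl log=load.log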
My concern is step 2): even for a file of 10,000 lines (each line 3,159 characters long), it takes around 20 minutes, and I have to process more than 7,500,000 records per day.
Is there any other method to achieve what I want in a much shorter time?
I'm new to Unix, so please point out if you see any area for improvement.
Unix box is: SunOS 5.8
Thanks!!