Problem in processing a very large file
Post 302082325 by jim mcnamara on Tuesday 1st of August 2006 09:35:24 AM
Unless you expect to be dealing with thousands of large files (>2GB), just use split to whack the file into two or three pieces, then run sqlldr on the first piece, then the second, and so on. split should be able to read a file that large, as long as your filesystem correctly handles big files. One caution: split -b divides on byte boundaries and can cut a record in half, so for line-oriented data split -l (split by line count) is safer.

Code:
# split into ~270 MB pieces; -b splits on byte boundaries and can
# cut a record in half -- use split -l instead for line-oriented data
split -b 270000000 file.dat  split_file
for file in split_file*
do
   sqlldr data="$file" control=somefile.ctl
done
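
If you want the pieces roughly equal without guessing at a byte size, you can derive the line count from the file itself. A minimal sketch, assuming a three-way split of the same file.dat (the factor of three and the names are just placeholders):

Code:
# count the lines, then split into roughly three equal pieces
total=`wc -l < file.dat`
lines=`expr $total / 3 + 1`
split -l $lines file.dat split_file

It is also worth testing sqlldr's exit status inside the loop, so a piece that fails to load stops the run instead of quietly leaving you with partial data.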

 
