sorting huge file


 
# 1  
Old 05-13-2008
sorting huge file

Hi All
I am sorting a huge file

-rw-r--r-- 1 rama users 448156978 May 13 18:48 102384.temp

$ sort -k 1,40n 102384.temp > 102384.temp1

msgcnt 1468 vxfs: mesg 001: vx_nospace - /dev/vg00/var file system full (1 block extent)
sort: A write error occurred while sorting.

I thought of splitting the input file into separate files and then sorting each one individually.

Any idea how we can avoid these kinds of errors?

Regards
Dhana
# 2  
Old 05-13-2008
If you split them and sorted them... how would you join them together again such that they were sorted?

Check the sort man page; there should be an option that lets you specify an alternative temporary directory with more space.
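
To be fair, the merge step is doable: once the chunks are individually sorted, sort -m can merge them back in order. A rough sketch (the chunk size and the chunk_ prefix are just for illustration):

$ split -l 1000000 102384.temp chunk_
$ for f in chunk_*; do sort -k 1,40n "$f" > "$f.sorted"; done
$ sort -m -k 1,40n chunk_*.sorted > 102384.temp1

Note the per-chunk sorts still need temporary space somewhere, so splitting alone may not avoid the original error.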
# 3  
Old 05-13-2008
Use the -T option to specify a temp directory.
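
For example (/bigdisk/tmp here is just a stand-in for any directory with enough free space):

$ sort -T /bigdisk/tmp -k 1,40n 102384.temp > 102384.temp1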

Before that, please search the forums. :)
# 4  
Old 05-14-2008
sorting huge file

Hi
I was able to run the sort successfully once I set the TMPDIR variable to a location where enough space is available.
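
For the record, roughly like this (/bigdisk/tmp stands in for the actual location I used):

$ TMPDIR=/bigdisk/tmp; export TMPDIR   # sort falls back to TMPDIR when -T is not given
$ sort -k 1,40n 102384.temp > 102384.temp1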

Thanks for the information.

Regards
Dhana