Error: fatal: record too large


 
# 1
08-04-2008
Error: fatal: record too large

While running a script to process a file, it terminated unexpectedly. From the logs I could see that it ended due to the error: fatal: record too large 20480. I have tried many things to fix the issue, all in vain. Can anyone help me find the reason for it? If you can provide a solution, I would be extremely grateful.

Also, when I ran a command on the file YEARMONBUZ_08-12, it reported 0 records and took a long time to do so. That totally surprised me.
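
For what it is worth, I am thinking of a quick check along these lines to see whether the file has over-long records or no newlines at all. This is only a sketch, and it assumes the 20480 in the error is a per-record buffer limit in whatever tool my script calls (awk, sed, ...), which I have not confirmed:

#!/bin/sh
FILE=YEARMONBUZ_08-12

# Count newline-terminated lines; 0 here would explain why other
# commands report 0 records and run for a long time.
lines=`wc -l < "$FILE"`
echo "newline count: $lines"

# Find the longest record; anything at or above 20480 bytes would
# overflow a fixed record buffer. If the system awk itself has that
# limit, this check may fail the same way and a Perl one-liner would
# be needed instead.
awk '{ if (length($0) > max) { max = length($0); at = NR } }
     END { printf "longest line: %d bytes (line %d)\n", max, at }' "$FILE"

If that shows records above the limit, I suppose the fix is either to restore the missing record separators before processing or to use a tool without the fixed buffer, but I am not sure.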

~SAKTHIFIRE
# 2
08-04-2008
First off, this is the AIX part of the forum, not the Shell Programming and Scripting part. From what you have told us so far, this has nothing to do with AIX, does it?

Secondly, how are we supposed to find the cause of the error when we do not know your script? Please post it, along with a detailed explanation of what went wrong. Which program issued the error you described?

Thirdly, you mention some logs. Which logs? Where have you been looking, and what have you found out so far?

You say you have tried "lots to fix the issue and left in vain". What exactly did you try, and what did those attempts result in?

We can only help you if you give us enough information to do so. Right now the information is far from complete.

I hope this helps.

bakunin