This is probably nothing more than an array, or some other data or control structure, that is not declared large enough. This is common in Fortran because you have to pre-size everything and hope you leave enough space. Sometimes an allocation will be sufficient when a program is first developed, but it may need to be increased later. A common practice is to use a parameter.dat file where you hard code constants with the sizes of things that may need to change, and then use those constants as the array sizes. You keep them all in one place, well documented, so they are easy to change. When you have too many input files, the array boundaries are exceeded and you get a seg fault. You could also be flat running out of RAM, especially if this is run on an older machine with limited resources.
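As a minimal sketch of that practice (the module and constant names here are hypothetical, not taken from your program), you would declare the sizes once and dimension everything from them:

```fortran
! Hypothetical example of keeping all hard-coded sizes in one
! well-documented place and using the constants as array sizes.
module sizes
   implicit none
   integer, parameter :: max_files   = 1000   ! most input files allowed
   integer, parameter :: max_records = 50000  ! most records per file
end module sizes

program demo
   use sizes
   implicit none
   real :: values(max_records)                ! dimensioned from the constant
   character(len=256) :: fnames(max_files)    ! likewise for the file names
   print *, 'values can hold ', size(values), ' records'
end program demo
```

If the limit ever turns out to be too small, you change one parameter and recompile, instead of hunting for every literal 1000 in the code.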
There are lots of other possibilities, but those are the most likely since the program runs for smaller input files. Another thing to consider is that one of the input files is malformed. Fortran is unbelievably picky about file formats and such, so if one of your models had an issue and produced improperly formatted output, or no output at all, that could also cause an issue if that specific format problem wasn't trapped. The best way to evaluate that is to try to find where the process fails; if you can locate the place in the specific input file where the app crashes, you can look at it to see whether there is a visible issue or not.
It is still unclear to me what the final output of the app is supposed to be. If the input files are text files, you can try to compress them, or post them at another location on the web and post a link. If I have a sample of the two input files that you are trying to merge, and a sample of the expected output, it should not be too difficult to fix. The files do not need to be complete, or even actual data, just enough to show what the format should be.
A quick look shows that this seems to be set up to open at most 1000 files; beyond that you will exceed the upper bound of the do loop iterator, and likely the bounds of any arrays dimensioned to match it.
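One cheap safeguard (a sketch, with hypothetical names and a hard-coded count standing in for however your program actually counts its inputs) is to check the file count against the limit before the loop, so you fail with a clear message instead of silently walking past the bound:

```fortran
! Hypothetical guard: refuse to run past the fixed limit instead of
! overrunning the loop bound and whatever arrays are sized to match it.
program check_limit
   implicit none
   integer, parameter :: max_files = 1000
   integer :: nfiles, i
   nfiles = 1200                  ! stand-in for the real count of inputs
   if (nfiles > max_files) then
      write(*,'(a,i0,a,i0)') 'too many input files: ', nfiles, ' > ', max_files
      stop                        ! use stop 1 for a nonzero exit code
   end if
   do i = 1, nfiles
      ! open and process file i here
   end do
end program check_limit
```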
Quote:
Originally Posted by
Tytalus
are you opening all files concurrently or sequentially; are you closing each file before reading from the next ?
:-)
This is an important point. It looks like all of the files are closed at once, and if you are keeping all of those large files open, you may indeed be running out of memory. Have you checked your resources while the app is running to see how much available RAM you have?
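If the files really are independent, the usual pattern is to open, read, and close each one before touching the next, so only one unit is ever in use. A rough sketch (the file names here are made up):

```fortran
! Sketch of processing files strictly one at a time: each file is
! opened, read, and closed before the next one is opened.
program one_at_a_time
   implicit none
   integer :: i, ios, unit
   character(len=32) :: fname
   do i = 1, 3
      write(fname,'(a,i0,a)') 'input_', i, '.txt'
      open(newunit=unit, file=fname, status='old', iostat=ios)
      if (ios /= 0) cycle          ! skip files that are missing/unreadable
      ! ... read and process the file here ...
      close(unit)                  ! release the unit before the next open
   end do
   print *, 'all files processed'
end program one_at_a_time
```

Checking iostat on the open also gives you a place to report exactly which file failed, which would help with the crash-location hunting mentioned above.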
LMHmedchem