Quote:
Originally posted by RTM
I have never heard of a maximum capacity for cron - I doubt you could reach it since those of us who have mistakenly created incorrect entries in cron which have created multiple processes in seconds can tell you, it won't be the crontab that gets to a maximum but the server - with runaway processes in endless loops. The most entries I have seen in a crontab to date: 171 (a DBA who has so much running in cron, I don't think he even needs to come to work anymore).
Modern versions of cron do have a limit. At least both HP-UX and Solaris do, and I have bumped into that limit many times. I have never heard of a limit on the number of entries in a crontab or a collection of crontabs, but there is a limit on the maximum number of simultaneous cron jobs running. When you exceed it, you get a message like this:
! c queue max run limit reached Sat Jul 6 04:00:00 2002
! rescheduling a cron job Sat Jul 6 04:00:00 2002
And yes, I copied that message from the log file on one of our servers.
The message is a little cryptic. If you look at your queuedefs file, you will probably find two lines: one for "at" jobs (the "a" queue) and one for "batch" jobs (the "b" queue). A third line starting with "c" may also be present; that one governs cron jobs. In the absence of such a line, the default applies, which is 100, so no more than 100 cron jobs can run at once. You can override this by editing your queuedefs file. This is documented; see "man queuedefs".
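To make that concrete, here is a sketch of what a queuedefs file might look like with a "c" line added to raise the cron run limit. This is an illustration, not copied from a real server; the exact path varies by platform (for example /etc/cron.d/queuedefs on Solaris), and the specific numbers below are example values, not recommendations. Each line is queue letter, then max simultaneous jobs (j), nice increment (n), and retry wait in seconds (w):

```
a.4j1n          # "at" queue: at most 4 jobs at once, nice +1
b.2j2n90w       # "batch" queue: at most 2 jobs, nice +2, retry after 90s
c.200j2n60w     # cron queue: raise the run limit from the default 100 to 200
```

With no "c" line at all, cron falls back to its built-in defaults for that queue, which is where the 100-job ceiling comes from. After editing the file, check your platform's man page for whether cron picks up the change automatically or needs to be restarted.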