What Operating System and version are you running? This is very important. I haven't seen the "fork" error in many years.
We gather that you have ksh.
Quote:
#!/usr/bin/ksh
for num in `cat file1.txt`
do
find . -name "*processed" -print | xargs gunzip -c | grep -q $num || echo "$num not found" >> outputfile.txt &
done
The posted script makes no sense: `find . -name "*processed"` does not match gzipped files (e.g. "*.gz"), yet the output is piped through "gunzip -c".
If "file1.txt" contains 5,000,000 numbers, this level of brute-force processing is absurd in unix Shell when searching 5 GB of data.
You appear to want to search a specific field at a specific position within a record but have provided sample data in "file1.txt" which does not match the exact length of the highlighted field.
Do you have use of a professional Systems Analyst?
Do you have a database engine (e.g. Oracle) and use of professional Database Programmers?
IMHO you are way out of your depth. Hire a professional.
The background "&" inside a 5,000,000-iteration loop is why you are getting "fork" errors. It would take a seriously special kernel build to create a unix which could cope with 5,000,000 concurrent processes (hmm, tempted to try it). I am surprised that you did not crash the computer with this irresponsible, uninformed and ignorant code.
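For what it is worth, the whole loop can be replaced by one pass: decompress each processed file once and hand grep the entire number list via "-f", so there is exactly one pipeline and zero background jobs. This is only a sketch under assumptions — the file names ("file1.txt", "*processed", "outputfile.txt") come from the posted script, the demo data is invented, and "grep -o" is a GNU extension (not strict POSIX):

```shell
#!/bin/sh
# Sketch only. Demo fixture (invented data) so this runs standalone:
mkdir -p demo && cd demo
printf '12345\n67890\n11111\n' > file1.txt
printf 'rec 12345 ok\nrec 67890 ok\n' | gzip > data.processed.gz

# One pass: decompress every processed file once, then let grep check
# ALL numbers together (-F fixed strings, -f pattern file, -o print
# each match) instead of forking one pipeline per number.
find . -name "*processed*" -print | xargs gunzip -c \
    | grep -oFf file1.txt | sort -u > found.txt

# Numbers present in file1.txt but never matched anywhere:
sort -u file1.txt | comm -23 - found.txt > outputfile.txt
cat outputfile.txt
```

With the demo data above this reports only "11111". Note that a plain substring match like this can false-positive when one number is embedded in a longer one; anchoring on field position (e.g. with awk) would be the next refinement, and a real database would of course do this properly.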
PS: Given a decent commercial database engine and some top-class Database Programmers, this problem is solvable.