Ah, OK, I already got a little confused because I found no info about a subdir command.
find also doesn't seem to be the right tool for the job.
Let me explain in different words what I want to do:
I have one directory without subdirectories, containing roughly 100 files.
Each file is bigger than 1 GB, and the biggest file is about 100 GB. The whole directory holds
roughly 500 GB of data.
All these files shall be compressed. If I run 'gzip *' once, it will take quite some time to finish compressing all the files.
To speed up the process I want to use several gzip processes, maybe 2 or 3, to finish the job.
Now I am searching for an elegant and safe way to do this.
My first approach was to issue the command 'gzip * &' several times.
This seems to work, but I don't feel comfortable with this solution,
because I know it is not elegant and because I don't know if it is safe.
Safe in the sense of possible data loss, because two gzip processes might try to
compress the same file at the same time, or because something else goes wrong.
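To make the pattern I mean reproducible, here is a minimal sketch that runs it in a throwaway temp directory with small dummy files (all names here are made up for the demo), so no real data is at risk:

```shell
#!/bin/sh
# Reproduce the "fire off 'gzip *' several times" approach on dummy
# files in a scratch directory, instead of the real 500 GB.
set -u
dir=$(mktemp -d)
cd "$dir" || exit 1

# small stand-ins for the big files
for f in file_a file_b file_c file_d; do
    printf 'dummy payload for %s\n' "$f" > "$f"
done

# the approach in question: two concurrent 'gzip *' jobs over the
# same glob; the process that loses the race on a file prints errors
# like "already exists" or "No such file", silenced here
gzip * 2>/dev/null &
gzip * 2>/dev/null &
wait

ls
```

When I try this, each file does end up compressed exactly once, but only because gzip refuses to overwrite an existing .gz file, which is exactly the kind of behavior I don't want to rely on blindly.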
So there are two final questions:
1. Can the problem be solved using the approach I described or are there any dangers?
2. Does anyone know a more sophisticated solution to the problem?
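For concreteness, this is roughly the kind of split I imagine, sketched with xargs; I'm only guessing that this is the right tool. The -P 3 option caps it at three parallel gzip processes, and -n 1 hands each process exactly one file, so no two processes should ever touch the same file. Again the sketch runs on dummy files in a temp directory (names made up for the demo); on real data one would run just the xargs line in the actual directory:

```shell
#!/bin/sh
# Sketch: parallel gzip via xargs, demoed on dummy files in a
# scratch directory rather than the real data.
set -u
dir=$(mktemp -d)
cd "$dir" || exit 1

for f in big1 big2 big3 big4 big5; do
    printf 'dummy payload for %s\n' "$f" > "$f"
done

# NUL-separate the names so odd filenames survive, then let xargs
# run up to 3 gzip processes at once, one file per process
printf '%s\0' * | xargs -0 -n 1 -P 3 gzip

ls
```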
Thanks in advance,
Basch
***Edited on 31.08.09 to correct formatting***