Quote:
Originally Posted by aamir1234
Hi All,
I need guidance on this requirement.
We have a directory structure which holds approximately 100 GB of data.
We need to tar the structure, then zip it, and create separate files of no more than 10 GB each.
A separate tar file and then a .gz should not be created; what is needed is a script that tars and gzips on the fly and creates split, zipped files of no more than 10 GB each.
Please advise on this.
I would really appreciate it.
Thanks
Aamir
Are you getting any error messages when attempting this?
What you can do is go into the directory that holds the 100 GB of data. Once in the directory, issue du -sk * | sort -rn
That will list all the subdirectories in that directory and their sizes, from biggest to smallest.
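For example (a minimal sketch; /usr/impt is just the example path used further down, and du -sk reports sizes in kilobytes, so 10 GB works out to roughly 10485760 KB when you add them up):

cd /usr/impt          # assumed top-level directory holding the 100 GB
du -sk * | sort -rn   # sizes in KB, largest first; 10 GB ~= 10485760 KB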
Using that information, you add up the directory sizes. You can pick the first 2, 3, 4 or however many it takes till it adds up to 10GB. Then you create a tar archive for this first 10GB like this:
Create tar archive: tar cf /name-to-give-archive /directories-to-archive
Example: tar cf /var/tmp/jamesbond1.tar /usr/impt/dir1 /usr/impt/dir2 /usr/impt/dir3 ...etc
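The original request also calls for gzip compression. One way (a sketch, assuming you have enough free space for the compressed copy; the z flag is only available with GNU tar, not with some older system tars) is to gzip the archive after creating it, or do both in one step:

gzip /var/tmp/jamesbond1.tar    # produces /var/tmp/jamesbond1.tar.gz
# or, if GNU tar is available, in one step:
tar czf /var/tmp/jamesbond1.tar.gz /usr/impt/dir1 /usr/impt/dir2 /usr/impt/dir3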
For the second 10GB of data you do the same: add up the remaining directory sizes till they reach 10GB, then archive those as well.
Create tar archive: tar cf /name-to-give-archive /directories-to-archive
Example: tar cf /var/tmp/jamesbond2.tar /usr/impt/dir4 /usr/impt/dir5 /usr/impt/dir6 ...etc
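If you really need it done on the fly with no intermediate tar file, as the original post asks, another approach (a minimal sketch, not a tested script: it assumes a split that accepts -b with an m size suffix, and the /var/tmp output path and archive name are just placeholders) is to pipe tar straight through gzip into split:

cd /usr/impt
# stream the whole tree through gzip and cut the compressed stream into 10 GB pieces
tar cf - . | gzip -c | split -b 10240m - /var/tmp/backup.tar.gz.part_

# to restore, concatenate the pieces back into one stream and unpack
cat /var/tmp/backup.tar.gz.part_* | gunzip -c | tar xf -

Note the pieces are chunks of a single compressed stream, so none of them is usable on its own; they all have to be concatenated back together before gunzip.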