Archiving a directory that has data ~150GB
# 1  
Old 03-05-2012

I have a directory that I need to back up. It is about 150GB and consists of many files and subdirectories. I tried to compress it into a single archive file using these commands:
Code:
tar cjf this_archive.tar.bz2 this_archive/

or
Code:
tar cf - this_archive/ | 7z a -si -t7z -m0=lzma -mx=9 -mfb=64 -md=32m -ms=on this_archive.tar.7z

Both commands take a very long time to compress the directory, and even when they do finish, the resulting archive file is always corrupt.

Is there a more efficient compression method or technique for archiving a directory this large?
# 2  
Old 03-05-2012
What operating system and version do you have, and which shell do you use?

Are any of the individual files larger than 2GB? What is the size of the largest file?

What backup software and backup medium do you use for your normal day-to-day backups?
# 3  
Old 03-05-2012
- I'm using Red Hat Enterprise Linux 5.5, kernel 2.6.18-194.el5 (default), with 16GB RAM.
- No.
- The largest file is ~100KB, and the average size is ~50KB.
- I created a relatively simple shell script with those tar/7z commands, scheduled to run via crontab. If it successfully creates the archive file, it pushes the file to RAID5 storage on my NAS server.

Last edited by erlanq; 03-05-2012 at 08:24 PM..
# 4  
Old 03-05-2012
You should update the archive with only the files that have changed between runs, appending new files rather than remaking the entire archive every time. Read the man page for tar, and consider multiple volumes.
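The incremental approach above can be sketched with GNU tar's `--listed-incremental` option (the paths and file names here are placeholders, not the poster's actual setup):

```shell
#!/bin/sh
# Sketch of an incremental backup with GNU tar.
# SRC and the snapshot/archive names are placeholders.
SRC=this_archive
SNAPSHOT=backup.snar          # tar's state file: records what was already archived
STAMP=$(date +%Y%m%d)

# The first run (no snapshot file yet) creates a full level-0 archive;
# subsequent runs archive only files changed since the snapshot was updated.
tar --listed-incremental="$SNAPSHOT" -czf "backup-$STAMP.tar.gz" "$SRC"
```

With ~150GB of ~50KB files, the full run is slow once, but the daily incremental runs only touch what changed, which also keeps each archive small enough to verify and copy to the NAS quickly.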