03-06-2008
Compressing means keeping all the data as it was.
If you shrink a file dramatically simply by removing content, or by using a predetermined method to replace redundancy, you are in effect creating your own Huffman encoding, but without a table, so it can't be reversed unless a human knows the drill.
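The difference is easy to demonstrate: a real compressor round-trips bit-for-bit, so nothing is lost. A minimal sketch (the sample file name is made up):

```shell
# Lossless round trip: compress, decompress, and verify the checksums match.
printf 'hello hello hello\n' > sample.txt   # hypothetical sample file
sum_before=$(cksum < sample.txt)
gzip sample.txt                             # produces sample.txt.gz
gunzip sample.txt.gz                        # restores sample.txt
sum_after=$(cksum < sample.txt)
[ "$sum_before" = "$sum_after" ] && echo "lossless"
```

Stripping data without a recorded table, by contrast, has no such inverse.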
Why don't you just write these files off to tape and delete them from disk? That would give the ultimate space savings. It will always take human intervention to expand and then interpret your hashed files anyway, so why not add a little more time on the restore side and save time and lots of disk on the compression side? Or get really good archiving software --- :smile:
9 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
Is it possible to unzip / uncompress a file that was zipped using WinZip?
thanks,
kristy (2 Replies)
Discussion started by: kristy
2. UNIX for Dummies Questions & Answers
I've noticed bzip2 gives slightly better compression than gzip. So... I'm curious... what gives the best compression out of all the compression utilities?
Thanks! (6 Replies)
Discussion started by: jalburger
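The honest answer is "it depends on the input", so the quickest route is to race the candidates on your own data. A sketch with a made-up test file:

```shell
seq 1 100000 > data.txt                   # hypothetical test data
gzip  -9 -c data.txt > data.txt.gz        # -c keeps the original around
bzip2 -9 -c data.txt > data.txt.bz2
ls -l data.txt data.txt.gz data.txt.bz2   # compare the resulting sizes
```

On repetitive text like this, bzip2 usually produces the smaller file while gzip is faster; where xz/lzma is installed it typically compresses smaller still, at yet more CPU cost.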
3. UNIX for Dummies Questions & Answers
Hi Folks,
I am familiar with both compression formats: gunzip (.gz) and .rpm. My question is: how do I uncompress the .gz type? As for the .rpm, I can double-click and it will extract... Can someone shed some light on this? Thank you...
M (2 Replies)
Discussion started by: Mombo_Z
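Assuming the file really is a gzip archive, the command-line equivalent of the double-click is gunzip; file names here are stand-ins:

```shell
echo 'some data' > file   # hypothetical file
gzip file                 # -> file.gz  (what the poster has)
gunzip file.gz            # restores "file"
cat file                  # -> some data
```

For a gzip-compressed tar archive (file.tar.gz), `tar xzf file.tar.gz` unpacks it in one step on systems whose tar understands z.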
4. Shell Programming and Scripting
Hi all,
I have a few files which need to be concatenated into a single file, which is then compressed and FTPed from the UNIX server to the Windows server.
For this purpose I am using the gzip command to compress the files after concatenation.
And I am FTPing the compressed file in the... (3 Replies)
Discussion started by: Codesearcher
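The concatenate-then-compress part is two commands; a sketch with made-up file names (the ftp step is left as a comment, since host and credentials are site-specific):

```shell
printf 'a\n' > part1.dat; printf 'b\n' > part2.dat   # stand-in pieces
cat part1.dat part2.dat > combined.dat               # concatenate
gzip -9 combined.dat                                 # -> combined.dat.gz
# Then transfer in *binary* mode so the .gz survives the trip, e.g.:
#   ftp -n winhost   ->   user/pass, "binary", "put combined.dat.gz", "quit"
```

Forgetting binary mode is the classic failure here: ASCII-mode FTP corrupts compressed files.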
5. UNIX for Advanced & Expert Users
I currently have in root's crontab:
20 4 * * 0,3 /usr/cluster/lib/sc/newcleventlog /var/cluster/logs/eventlog
20 4 * * 0,3 /usr/cluster/lib/sc/newcleventlog /var/cluster/logs/DS
20 4 * * 0,3 /usr/cluster/lib/sc/newcleventlog /var/cluster/logs/commandlog
there is no man page on... (1 Reply)
Discussion started by: rkruck
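For reference, the schedule fields in those crontab entries decode as standard cron syntax (the command itself is Sun Cluster's log-rotation helper):

```
20 4 * * 0,3  /usr/cluster/lib/sc/newcleventlog <logfile>
|  | | | |
|  | | | +-- day of week: 0 = Sunday, 3 = Wednesday
|  | | +---- month: every month
|  | +------ day of month: every day
|  +-------- hour: 04
+----------- minute: 20
```

That is, each log is rotated at 04:20 every Sunday and Wednesday.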
6. Shell Programming and Scripting
I have a DB folder of approximately 60GB. It has logs which range from 500MB to 1GB. I have an installation which would update the DB. I need to back up this DB folder, just in case my installation FAILS. But I do not need the logs in my backup. How do I exclude them during compression (tar)?
... (2 Replies)
Discussion started by: DevendraG
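With GNU tar (the usual tar on Linux), --exclude takes a shell pattern. A sketch assuming the logs can be matched by name, with made-up paths:

```shell
mkdir -p db/logs                  # stand-in for the 60GB DB folder
echo data  > db/tables.dat
echo noise > db/logs/huge.log
tar --exclude='*.log' -czf db_backup.tar.gz db
tar tzf db_backup.tar.gz          # lists db/tables.dat, not the .log
```

Other tar implementations differ: Solaris tar has no --exclude but (if memory serves) accepts an exclude-file list via the X function modifier, so check your local man page.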
7. Shell Programming and Scripting
Hi @all!
In my MySQL backup script I back up and compress every single table with this command:
/usr/bin/mysqldump --opt database_x table_y | /usr/bin/pbzip2 -c > "/media/BackUpDrive/Backup/table_x.gz"
Unfortunately these files need modification - they have to start with the following line(s):... (7 Replies)
Discussion started by: gogo555
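One way to get extra lines in front of the dump without a temporary file is a brace group: everything written inside the braces travels through the same pipe into the compressor. The database, table, and header text below are taken from the post or invented; the pattern is the point:

```shell
{ printf '%s\n' '-- header line required on restore'   # hypothetical header
  /usr/bin/mysqldump --opt database_x table_y
} | /usr/bin/pbzip2 -c > /media/BackUpDrive/Backup/table_y.sql.bz2
```

Note that the original command names the output .gz even though pbzip2 writes bzip2 data; a .bz2 suffix avoids confusion at restore time.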
8. UNIX for Advanced & Expert Users
Hi ,
1- I need to know, please, whether it's possible to compress using openssl?
Here is the version used:
openssl version -a
OpenSSL 0.9.7d 17 Mar 2004 (+ security fixes for: CVE-2005-2969 CVE-2006-2937 CVE-2006-2940 CVE2006-3738 CVE-2006-4339 CVE-2006-4343 CVE-2007-5135 CVE-2008-5077... (3 Replies)
Discussion started by: Eman_in_forum
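openssl is not a compression tool in itself (some builds expose a zlib subcommand, but it is not portable, and a 0.9.7 build is unlikely to have it). The usual pattern is to compress first, then encrypt with openssl enc; a sketch with a hypothetical file name and passphrase:

```shell
printf 'top secret\n' > secret.txt                 # hypothetical data
gzip -c secret.txt > secret.txt.gz                 # 1. compress
openssl enc -aes-256-cbc -salt -pass pass:changeme \
        -in secret.txt.gz -out secret.txt.gz.enc   # 2. encrypt
# To reverse: openssl enc -d ... then gunzip
```

Doing it in this order matters: ciphertext looks random, so compressing after encryption gains essentially nothing.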
9. Shell Programming and Scripting
Redirecting log files to null writes junk into the log files.
I have log files which are created by the command below:
exec <processname> > $logfile
When a file reaches some size, I redirect it to null while the process is running, like
> $logfile
manually, but after that it writes some junk into... (7 Replies)
Discussion started by: greenworld123
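The "junk" is almost certainly a run of NUL bytes: a writer that opened its log without append mode keeps its old byte offset, so after the truncation it writes past the new end of file and the gap reads back as NULs. Opening the log in append mode (>>) makes every write land at the current end instead. A self-contained demonstration, using a numbered descriptor as a stand-in for the long-running process:

```shell
logfile=demo.log
exec 3>> "$logfile"       # append mode, as with: exec processname >> $logfile
echo "first line"  >&3
: > "$logfile"            # truncate the live log, as the poster does
echo "second line" >&3    # O_APPEND writes at the new EOF: no NUL junk
cat "$logfile"            # -> second line
```

Had the descriptor been opened with plain > instead, the second write would have landed at offset 11, leaving 11 NUL bytes in front of it, which is exactly the junk described.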