Best compression for log files?
Post 302173460 by jim mcnamara in UNIX for Advanced & Expert Users, Thursday 6th of March 2008, 05:43:12 PM
Compression means keeping all of the data exactly as it was; it is lossless.

If you decrease file size a lot simply by removing content or by using a predetermined method for replacing redundancy, you are in effect rolling your own Huffman-style encoding, but without a table, so it can't be reversed unless a human knows the drill.

Why don't you just write these files off to tape and delete them from disk? That gives you the ultimate space savings. It will always take human intervention to expand and then interpret your hashed files anyway, so why not spend a little more time on the restore side and save time and a lot of disk on the compression side? Or get really good archiving software --- :smile:
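
A minimal sketch of that archive-then-delete approach, assuming the logs live under /var/log/myapp, follow a monthly naming scheme, and the tape drive is /dev/rmt/0 (all hypothetical; substitute your own paths):

    # Write last month's logs to tape, then remove them only after tar exits successfully.
    cd /var/log/myapp || exit 1
    tar cvf /dev/rmt/0 ./*.log.2008-02* && rm -f ./*.log.2008-02*

    # No tape drive? A compressed archive on cheaper storage works the same way (GNU tar shown):
    tar czvf /archive/myapp-2008-02.tar.gz ./*.log.2008-02* && rm -f ./*.log.2008-02*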
 

9 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

file compression

Is it possible to unzip / compress a file that was zipped using WinZip? thanks, kristy (2 Replies)
Discussion started by: kristy
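
For what it's worth, the commonly available Info-ZIP unzip/zip tools handle WinZip-style .zip archives; a quick sketch with a hypothetical file name:

    unzip -l archive.zip          # list the contents first
    unzip archive.zip             # extract into the current directory
    zip -r archive.zip somedir    # going the other way: create a .zip that WinZip can open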

2. UNIX for Dummies Questions & Answers

compression utilities

I've noticed bzip2 gives a little bit better compression than gzip. So...I'm curious...what gives the best compression out of all the compression utilities? Thanks! (6 Replies)
Discussion started by: jalburger
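
One way to answer that for your own data is to compress a copy of the same file with each tool and compare the sizes; a sketch, assuming a sample file named messages.log and that the tools are installed:

    gzip  -9 -c messages.log > messages.log.gz
    bzip2 -9 -c messages.log > messages.log.bz2
    xz    -9 -c messages.log > messages.log.xz
    ls -l messages.log messages.log.gz messages.log.bz2 messages.log.xz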

3. UNIX for Dummies Questions & Answers

Un-compression types...

Hi Folks, I am familiar with both compression formats: gun-zip and .rpm. My question is how do I uncompress the gunz.zip type? As for the .rpm, I can double click it and it will extract... Can someone shed some light on this? Thank you... M (2 Replies)
Discussion started by: Mombo_Z
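
Assuming "gun-zip" refers to a gzip-compressed (.gz) file and the other format is an .rpm package, a hedged sketch of the usual command-line route:

    gunzip file.gz                       # or: gzip -d file.gz
    tar xzf file.tar.gz                  # GNU tar, for a gzip-compressed tarball
    rpm2cpio package.rpm | cpio -idmv    # unpack an .rpm without installing it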

4. Shell Programming and Scripting

How to avoid CR after compression

Hi all, I have a few files which need to be concatenated into a single file, which is then compressed and FTPed from the UNIX server to the Windows server. For this purpose I am using the gzip command to compress the files after concatenation, and I am FTPing the compressed file in the... (3 Replies)
Discussion started by: Codesearcher
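
The stray carriage returns usually come from transferring the .gz in ASCII mode, which rewrites line endings and corrupts the binary; a sketch of concatenating, compressing, and sending in binary mode (host, login, and file names are hypothetical):

    cat part1.log part2.log > all.log
    gzip -9 all.log                    # produces all.log.gz
    ftp -n windows-host <<'EOF'
    user ftpuser ftppassword
    binary
    put all.log.gz
    bye
    EOF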

5. UNIX for Advanced & Expert Users

Sun Cluster log rotation & compression

I currently have in root's crontab: 20 4 * * 0,3 /usr/cluster/lib/sc/newcleventlog /var/cluster/logs/eventlog 20 4 * * 0,3 /usr/cluster/lib/sc/newcleventlog /var/cluster/logs/DS 20 4 * * 0,3 /usr/cluster/lib/sc/newcleventlog /var/cluster/logs/commandlog there is no man page on... (1 Reply)
Discussion started by: rkruck

6. Shell Programming and Scripting

Compression - Exclude huge files

I have a DB folder of approximately 60GB. It contains logs which range from 500MB to 1GB. I have an installation which will update the DB. I need to back up this DB folder, just in case my installation FAILS, but I do not need the logs in my backup. How do I exclude them during compression (tar)? ... (2 Replies)
Discussion started by: DevendraG
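
With GNU tar you can exclude the logs by pattern or through an exclude file; a sketch with hypothetical paths:

    tar czf /backup/db-preinstall.tar.gz --exclude='*.log' /data/db

    # or keep the patterns in a file and pass it with -X:
    printf '%s\n' '*.log' > /tmp/tar-excludes
    tar czf /backup/db-preinstall.tar.gz -X /tmp/tar-excludes /data/db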

7. Shell Programming and Scripting

Modification of MySQLDump-files before compression needed

Hi @all! In my MySQL backup script I back up and compress every single table with this command: /usr/bin/mysqldump --opt database_x table_y | /usr/bin/pbzip2 -c > "/media/BackUpDrive/Backup/table_x.gz" Unfortunately these files need modification - they have to start with the following line(s):... (7 Replies)
Discussion started by: gogo555
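
One way to prepend the required line(s) without a temporary file is to group a printf with the dump inside the same pipeline; in this sketch HEADER is only a placeholder for whatever the files must actually start with (not specified here):

    HEADER='-- placeholder: the required first line goes here'
    { printf '%s\n' "$HEADER"
      /usr/bin/mysqldump --opt database_x table_y
    } | /usr/bin/pbzip2 -c > /media/BackUpDrive/Backup/table_y.gz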

8. UNIX for Advanced & Expert Users

Compression with openssl

Hi, 1 - I need to know, please, if it's possible to compress using openssl. Here is the version used: openssl version -a OpenSSL 0.9.7d 17 Mar 2004 (+ security fixes for: CVE-2005-2969 CVE-2006-2937 CVE-2006-2940 CVE2006-3738 CVE-2006-4339 CVE-2006-4343 CVE-2007-5135 CVE-2008-5077... (3 Replies)
Discussion started by: Eman_in_forum
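
Whether the openssl binary itself can compress depends on how it was built, so a portable alternative is to compress with gzip first and let openssl handle only the encryption; a sketch with illustrative file names (openssl prompts for the passphrase):

    gzip -9 -c data.log | openssl enc -aes-256-cbc -salt -out data.log.gz.enc

    # restore later with:
    openssl enc -d -aes-256-cbc -in data.log.gz.enc | gunzip > data.log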

9. Shell Programming and Scripting

Redirecting log files to null writing junk into log files

Redirecting log files to null writes junk into the log files. I have log files which are created by the command below: exec <processname> >$logfile. When a log reaches a certain size I manually redirect it to null while the process is running, like >$logfile, but after that it writes some junk into... (7 Replies)
Discussion started by: greenworld123
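
The "junk" is typically a run of NUL bytes: after the manual truncation the process keeps writing at its old file offset, leaving a gap. Opening the log in append mode avoids this, because every appended write lands at the current end of file; a sketch with a hypothetical process name and path:

    logfile=/var/tmp/myprocess.log       # hypothetical path and process name
    myprocess >> "$logfile" 2>&1 &       # '>>' opens the log with O_APPEND

    # Later, while the process is still running, this truncation is safe:
    : > "$logfile"                       # no NUL-filled gap; appended writes go to the new end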
SAVELOG(8)						      System Manager's Manual							SAVELOG(8)

NAME
       savelog - save a log file

SYNOPSIS
       savelog [-m mode] [-u user] [-g group] [-t] [-p] [-c cycle] [-l] [-j] [-J] [-1 .. -9] [-C] [-d] [-r rolldir] [-n] [-q] [-D dateformat] file ...

DESCRIPTION
       The savelog command saves and optionally compresses old copies of files.  Older versions of file are named:

              file.<number><compress_suffix>

       where <number> is the version number, 0 being the newest.  Version numbers > 0 are compressed unless -l prevents it.  Version number 0 is not compressed because a process might still have file opened for I/O.  Only cycle versions of the file are kept.

       If the file does not exist and -t was given, it will be created.

       For files that do exist and have lengths greater than zero, the following actions are performed:

       1)  Version numbered files are cycled.  Version file.2 is moved to version file.3, version file.1 is moved to version file.2, and so on.  Finally version file.0 is moved to version file.1, and version file is deleted.  Both compressed names and uncompressed names are cycled, regardless of -l.  Missing version files are ignored.

       2)  The new file.1 is compressed unless the -l flag was given.  It is changed subject to the -m, -u, and -g flags.

       3)  The main file is moved to file.0.

       4)  If the -m, -u, -g, -t, or -p flags are given, then an empty file is created subject to the given flags.  With the -p flag, the file is created with the same owner, group, and permissions as before.

       5)  The new file.0 is changed subject to the -m, -u, and -g flags.

OPTIONS
       -m mode       chmod the log files to mode, implies -t
       -u user       chown log files to user, implies -t
       -g group      chgrp log files to group, implies -t
       -c cycle      Save cycle versions of the logfile (default: 7).  The cycle count must be at least 2.
       -t            touch new logfile into existence
       -l            don't compress any log files (default: do compress)
       -p            preserve owner, group, and permissions of logfile
       -j            compress with bzip2 instead of gzip
       -J            compress with xz instead of gzip.  For xz no strength option is set, and xz decides on the default based on the total amount of physical RAM.  Note that xz can use a very large amount of memory for the higher compression levels.
       -1 .. -9      compression strength or memory usage (default: 9, except for xz)
       -C            force cleanup of cycled logfiles
       -d            use standard date for rolling
       -D dateformat override date format, in the syntax understood by the date(1) command
       -r rolldir    use rolldir instead of . to roll files
       -n            do not rotate empty files
       -q            be quiet

BUGS
       If a process is still writing to file.0, and savelog moves it to file.1 and compresses it, data could be lost.

SEE ALSO
       logrotate(8)

Debian                                  30 Dec 2017                                  SAVELOG(8)
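
A minimal usage sketch, assuming savelog is installed at /usr/bin/savelog and the application log lives at /var/log/myapp.log (both hypothetical paths):

    # Rotate weekly from cron: keep 14 cycles, compress with bzip2, create the new empty log mode 640
    20 4 * * 0 /usr/bin/savelog -c 14 -j -m 640 /var/log/myapp.log

    # After a few runs the directory holds something like:
    #   myapp.log   myapp.log.0   myapp.log.1.bz2   myapp.log.2.bz2   ...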