Full Discussion: Compressing files
Top Forums UNIX for Dummies Questions & Answers Compressing files Post 50312 by sh9492 on Wednesday 21st of April 2004 10:17:21 AM
Compressing files

I have never used this command before. I need to use the "compress" command to compress all files located in the subdirectories under the following directory:

/home/ftp/inbound/Fordin

Please advise; I appreciate your help.

Thanks,

Syed
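A minimal sketch of the usual approach: let find walk the directory tree and hand every regular file to compress, which replaces each file with a .Z version. The path below is the one from the question. Since the classic compress utility is not installed on every system, this sketch falls back to gzip (producing .gz files) when compress is missing; adjust to taste.

```shell
#!/bin/sh
# Pick a compressor: classic compress (.Z) if installed, otherwise gzip (.gz).
COMPRESS=$(command -v compress || command -v gzip)

# Compress every regular file under the given directory tree,
# skipping files that are already compressed.
compress_tree() {
    find "$1" -type f ! -name '*.Z' ! -name '*.gz' -exec "$COMPRESS" -f {} \;
}

# Usage (the directory from the question):
# compress_tree /home/ftp/inbound/Fordin
```

The -f flag forces compression even when the result would be larger than the original (compress normally refuses and leaves such files alone).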
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Compressing files from DOS to Unix

As the title suggests, I need to compress files from DOS to Unix. The files should be in .Z format, as created using 'compress' in a Unix environment. Ligs (2 Replies)
Discussion started by: ligs

2. UNIX for Dummies Questions & Answers

compressing two files in a script

Hi, I have written a script in Unix which produces two files (.csv) at the end. Now I want to add these two files to a zip file and send the zip file across the network by FTP. The problem is that I don't know how to make a single zip file containing the two files that have been created by the script.... (1 Reply)
Discussion started by: nimish

3. Shell Programming and Scripting

tarring/compressing files in Unix directory

Hi guys, I'm totally new to Unix scripting and have no idea how to do it at all. My problem is that my boss asked me to do this: 1.) create a script that will tar or gzip the files in a particular directory, e.g. I'm in my home directory and I need to tar/gzip the file in.. assuming... (1 Reply)
Discussion started by: montski

4. UNIX for Advanced & Expert Users

Compressing files on NAS mount

Hello, I am having difficulty compressing files using the compress or gzip utility on a NAS share NFS-mounted on my Linux server. Anyone have an idea how to do this? I get the following error when trying to compress the files: STRP2> compress STR_OUTBOUND_CDM_LOG_LOB.PRT2008_26.txt... (1 Reply)
Discussion started by: kamathg

5. UNIX for Dummies Questions & Answers

Compressing of log files

Hello all, my first post in the forum. :) I have huge log files of 20-30 GB on my Unix server. I want to analyse the log files for some error messages, but because of the enormous size of these files I'm not able to grep/search for the pattern in the file. Also, tried to gzip the... (1 Reply)
Discussion started by: sgbhat

6. Shell Programming and Scripting

What is the code for compressing files using pkzip ?

Hi everyone, can someone provide me a shell program to compress and decompress files using gzip? I don't know anything about shell programming; this code is part of my project. So can someone help me with the code? (2 Replies)
Discussion started by: mig23

7. Shell Programming and Scripting

Compressing files

I need help with a script that will compress a file bigger than 5000 octets without overwriting the previously compressed file. Let's say I have mylogfile.log; when I compress it, it would become mylogfile.1.log, and if I compressed mylogfile.log again it would be mylogfile.2.... (8 Replies)
Discussion started by: Froob

8. Shell Programming and Scripting

Getting latest files and compressing from a textfile

I'm writing a cleanup script for a directory using KSH, keeping the file name prefixes in a text file. In the script, I want to read each prefix from the file, match files against that pattern, and keep and compress (.Z) the latest 4 versions of the matched files in the directory. And I want to delete... (1 Reply)
Discussion started by: manchimahesh

9. Shell Programming and Scripting

Problem in compressing and moving files

Hi, I am writing a sample script (sample.ksh) to compress files in the source directory and move them to another directory. The script takes a config file (files.config) as the parameter, the contents of which are as given under: /abc/src ${TSTENV}-xxx-yyy~1.log /abc/src/dest /abc/src... (8 Replies)
Discussion started by: swasid

10. UNIX for Beginners Questions & Answers

Logrotate and Compressing only yesterdays files

Hello, I have a syslog server at home and am currently experiencing an issue where my logs rotate and compress; however, it rotates and compresses both yesterday's file and the newly created log file for the current day. When it does this, it will also create another new file for today... (9 Replies)
Discussion started by: MyUserName7000
ECM-COMPRESS(1)                General Commands Manual               ECM-COMPRESS(1)

NAME
       ecm-compress - selectively strips error correction codes from CD images

SYNOPSIS
       ecm-compress cdimagefile [ecmfile]

DESCRIPTION
       ecm-compress reduces the size of a CD image file (ISO, BIN, CDI, NRG, CCD or any other format that uses raw sectors) by eliminating the Error Correction Codes (ECC) and Error Detection Codes (EDC) from each sector when possible. Since the data ecm-compress strips is usually nearly impossible to compress with traditional tools, the resulting ECM file will compress far better than the raw CD image. Hard drives already have mechanisms to protect data integrity, so ECC/EDC data can be discarded safely. This data can then be regenerated using the command ecm-uncompress. Because ecm-compress only discards EDC/ECC data for sectors where it is verifiably possible to recreate that data, the process is lossless. In the case of copy protection, ecm-compress will preserve the bogus data.

EXAMPLES
       ecm-compress foo.bin
              strips the ECC/EDC data from the foo.bin CD image and saves the resulting file as foo.bin.ecm

       ecm-compress foo.img foobar
              strips the ECC/EDC data from the foo.img CD image and saves the resulting file as foobar

SEE ALSO
       ecm-uncompress(1).

AUTHOR
       ecm-compress was written by Neill Corlett. This manual page was written by Loic Martin <loic.martin3@gmail.com>, for the Debian project (but may be used by others), using the documentation written by ECM author Neill Corlett.

December 22, 2008                                                    ECM-COMPRESS(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.